UAS for Protecting the Historical Built Environment: Monitoring, Damage Detection, and Diagnostics of Heritage Infrastructure Supported by Aerial Systems

A special issue of Drones (ISSN 2504-446X).

Deadline for manuscript submissions: closed (31 October 2023)

Special Issue Editors


Guest Editor
Department of Architecture and Design, Polytechnic University of Turin, Viale Pier Andrea Mattioli 39, 10138 Torino, Italy
Interests: urban heritage; UAV and close-range photogrammetry; SLAM-based technologies and application in the field of historical structures and urban contexts

Guest Editor
Ephorate of Antiquities of Cyclades, Hellenic Ministry of Culture and Sports, 105 55 Athens, Greece
Interests: cultural heritage documentation; digital photogrammetry; LiDAR; multispectral sensors; data fusion; digital archaeology; drones; NDT for building monitoring; GIS

Special Issue Information

Dear Colleagues,

The preservation of cultural heritage, and particularly the conservation and maintenance of the historical built environment, poses multifarious challenges. Effective intervention requires thorough documentation and extensive knowledge of the altering factors that put cultural heritage at risk. Unmanned aerial systems (UASs) therefore have an important role to play: they support the comprehensive recording of these factors and facilitate the nondestructive diagnostic process that precedes any protective intervention, a process that would otherwise be difficult to carry out from a terrestrial viewpoint alone. In other words, UASs offer a new perspective for heritage protection by supporting large-scale mapping, close-range digitization, and, ultimately, inspection and monitoring.

Recent progress in unmanned platform manufacturing, the increasing miniaturization of sensing payloads, and the decreasing cost of integrated microelectronics have gradually enabled the use of UASs for aerial and aerial-supported integrated surveys of cultural heritage structures. A variety of on-platform sensors can provide rich data through multi-sensor recording and pre- and post-processing (e.g., computer vision and image processing techniques). The resulting data can be integrated with terrestrial acquisition technologies and with visualization and information modeling techniques.
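As a purely illustrative aside (not part of the original call), the short Python sketch below shows one such computer-vision pre-processing step: extracting and matching tie-point candidates between two overlapping UAV frames with OpenCV. The file names and parameter values are assumptions chosen only for the example.

# Minimal sketch: tie-point candidate matching between two overlapping UAV frames.
# File names are placeholders; any pair of overlapping survey images would do.
import cv2

img1 = cv2.imread("uav_frame_001.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("uav_frame_002.jpg", cv2.IMREAD_GRAYSCALE)

# Detect and describe local features on both frames.
orb = cv2.ORB_create(nfeatures=4000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Match descriptors and keep the strongest correspondences.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]

print(f"Retained {len(matches)} tie-point candidates between the two frames")

Steps of this kind typically feed a photogrammetric alignment or SLAM pipeline, which in turn produces the 3D products discussed in this Special Issue.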

In this context, this Special Issue invites contributions demonstrating innovative developments in applying UASs to survey, digitize, study, and monitor the historical built environment, including original research, reviews, short and technical notes, and significant case studies related to drone-based inspection of cultural heritage. Contributions may cover, but are not limited to, the following topics:

  • 3D documentation of historical buildings, building complexes, infrastructure, and urban centers using or supported by UAS devices for data collection;
  • Inspection, multitemporal monitoring, and damage detection of critical heritage infrastructures using UASs;
  • Multi-sensor and multiplatform recording and data integration using or supported by UASs;
  • Color, thermographic, multispectral, and hyperspectral cameras, LiDAR, and SLAM-enabled mobile scanners for condition assessment of cultural heritage by means of UASs;
  • UAS flight planning and control for built heritage mapping;
  • Intelligent UAS-based solutions for built heritage surveys.

Prof. Dr. Fulvio Rinaudo
Dr. Giulia Sammartano
Dr. Efstathios Adamopoulos
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Drones is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • built heritage
  • optical, infrared, multispectral, hyperspectral and geomatics sensors
  • data fusion
  • NDE of historical buildings and infrastructure
  • multi-temporal mapping and 3D modeling
  • damage localization and monitoring
  • intelligent drone-based and drone-supported solutions

Published Papers (1 paper)

Research

23 pages, 44241 KiB  
Article
Image-to-Image Translation-Based Structural Damage Data Augmentation for Infrastructure Inspection Using Unmanned Aerial Vehicle
by Gi-Hun Gwon, Jin-Hwan Lee, In-Ho Kim, Seung-Chan Baek and Hyung-Jo Jung
Drones 2023, 7(11), 666; https://doi.org/10.3390/drones7110666 - 08 Nov 2023
Abstract
As technology advances, the use of unmanned aerial vehicles (UAVs) and image sensors for structural monitoring and diagnostics is becoming increasingly critical. This approach enables the efficient inspection and assessment of structural conditions. Furthermore, the integration of deep learning techniques has been proven to be highly effective in detecting damage from structural images, as demonstrated in our study. To enable effective learning by deep learning models, a substantial volume of data is crucial, but collecting appropriate instances of structural damage from real-world scenarios poses challenges and demands specialized knowledge, as well as significant time and resources for labeling. In this study, we propose a methodology that utilizes a generative adversarial network (GAN) for image-to-image translation, with the objective of generating synthetic structural damage data to augment the dataset. Initially, a GAN-based image generation model was trained using paired datasets. When provided with a mask image, this model generated an RGB image based on the annotations. The subsequent step generated domain-specific mask images, a critical task that improved the data augmentation process. These mask images were designed based on prior knowledge to suit the specific characteristics and requirements of the structural damage dataset. These generated masks were then used by the GAN model to produce new RGB image data incorporating various types of damage. In the experimental validation conducted across the three datasets to assess the image generation for data augmentation, our results demonstrated that the generated images closely resembled actual images while effectively conveying information about the newly introduced damage. Furthermore, the experimental validation of damage detection with augmented data entailed a comparative analysis between the performance achieved solely with the original dataset and that attained with the incorporation of additional augmented data. The results for damage detection consistently demonstrated that the utilization of augmented data enhanced performance when compared to relying solely on the original images.
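For readers less familiar with the image-to-image translation pattern the abstract refers to, the following Python sketch outlines a pix2pix-style setup in which a generator maps a damage-annotation mask to an RGB image and a discriminator scores (mask, image) pairs. It is a minimal illustration under assumed tensor shapes and layer sizes, not the authors' implementation.

# Illustrative sketch only (not the authors' code): a pix2pix-style pairing of a
# mask-to-RGB generator with a pair discriminator in PyTorch. Shapes and layer
# sizes are assumptions chosen for brevity.
import torch
import torch.nn as nn

class MaskToRGBGenerator(nn.Module):
    """Encoder-decoder that translates a 1-channel mask into a 3-channel image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 64, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, mask):
        return self.net(mask)

class PairDiscriminator(nn.Module):
    """Patch-style critic applied to concatenated (mask, RGB) pairs."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1 + 3, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(64, 1, 4, stride=2, padding=1),
        )

    def forward(self, mask, rgb):
        return self.net(torch.cat([mask, rgb], dim=1))

# One toy training step on random tensors standing in for a paired sample.
G, D = MaskToRGBGenerator(), PairDiscriminator()
adv_loss, l1_loss = nn.BCEWithLogitsLoss(), nn.L1Loss()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)

mask = torch.rand(1, 1, 256, 256)   # stand-in for a damage-annotation mask
real = torch.rand(1, 3, 256, 256)   # stand-in for the corresponding real RGB crop

# Discriminator step: real pair should score high, generated pair low.
fake = G(mask).detach()
pred_real, pred_fake = D(mask, real), D(mask, fake)
d_loss = adv_loss(pred_real, torch.ones_like(pred_real)) + \
         adv_loss(pred_fake, torch.zeros_like(pred_fake))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# Generator step: fool the discriminator while staying close to the real image.
fake = G(mask)
pred_fake = D(mask, fake)
g_loss = adv_loss(pred_fake, torch.ones_like(pred_fake)) + 100.0 * l1_loss(fake, real)
opt_g.zero_grad(); g_loss.backward(); opt_g.step()

In the paper's workflow, the trained generator is then fed newly designed, domain-specific masks to synthesize additional damaged-structure images that augment the training data for damage detection.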