Open Access Article
J. Imaging 2018, 4(8), 102; https://doi.org/10.3390/jimaging4080102

Airborne Optical Sectioning

Institute of Computer Graphics, Johannes Kepler University Linz, 4040 Linz, Austria
* Author to whom correspondence should be addressed.
Received: 4 July 2018 / Revised: 2 August 2018 / Accepted: 11 August 2018 / Published: 13 August 2018
(This article belongs to the Special Issue New Trends in Image Processing for Cultural Heritage)

Abstract

Drones are becoming increasingly popular for remote sensing of landscapes in archeology, cultural heritage, forestry, and other disciplines. They are more efficient than airplanes for capturing small areas of up to several hundred square meters. LiDAR (light detection and ranging) and photogrammetry have been applied together with drones to achieve 3D reconstruction. With airborne optical sectioning (AOS), we present a radically different approach that is based on an old idea: synthetic aperture imaging. Rather than measuring, computing, and rendering 3D point clouds or triangulated 3D meshes, we apply image-based rendering for 3D visualization. In contrast to photogrammetry, AOS does not suffer from inaccurate correspondence matches and long processing times. It is cheaper than LiDAR, delivers surface color information, and has the potential to achieve high sampling resolutions. AOS samples the optical signal of wide synthetic apertures (30–100 m diameter) with unstructured video images recorded from a low-cost camera drone to support optical sectioning by image integration. The wide aperture signal results in a shallow depth of field and consequently in a strong blur of out-of-focus occluders, while images of points in focus remain clearly visible. Shifting focus computationally towards the ground allows optical slicing through dense occluder structures (such as leaves, tree branches, and coniferous trees), and discovery and inspection of concealed artifacts on the surface.
Keywords: computational imaging; image-based rendering; light fields; synthetic apertures
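The image-integration step summarized in the abstract can be illustrated with a short sketch. This is a minimal, illustrative implementation and not the authors' code: it assumes that each video frame has already been registered to a chosen ground (focal) plane by a 3x3 homography derived from the recorded drone pose, and it uses NumPy and OpenCV. The function and parameter names (integrate_synthetic_aperture, homographies, out_size) are hypothetical.

```python
# Minimal sketch of synthetic-aperture image integration (shift-and-add focusing).
# Assumption (not from the paper's released code): poses are provided as 3x3
# homographies H_i that map each drone frame onto a common virtual focal plane
# placed at the chosen ground depth.
import numpy as np
import cv2

def integrate_synthetic_aperture(images, homographies, out_size):
    """Warp every single-camera frame onto the virtual focal plane and average.

    images       : list of HxWx3 uint8 frames from the drone video
    homographies : list of 3x3 arrays mapping each frame onto the focal plane
    out_size     : (width, height) of the resulting integral image
    """
    accum = np.zeros((out_size[1], out_size[0], 3), dtype=np.float64)
    weight = np.zeros((out_size[1], out_size[0], 1), dtype=np.float64)
    for img, H in zip(images, homographies):
        # Project the frame onto the focal plane; track coverage with a mask.
        warped = cv2.warpPerspective(img, H, out_size)
        mask = cv2.warpPerspective(np.ones(img.shape[:2], np.uint8), H, out_size)
        accum += warped.astype(np.float64)
        weight += mask[..., None].astype(np.float64)
    # Points on the focal plane align across frames and stay sharp; occluders
    # off the plane land at different positions per frame and blur out.
    return (accum / np.maximum(weight, 1)).astype(np.uint8)
```

Sweeping the assumed ground depth (i.e., recomputing the homographies for different focal-plane heights) and repeating the integration yields a focal stack, which corresponds to the computational refocusing described in the abstract.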
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Supplementary materials

  • Supplementary File 1: ZIP-Document (ZIP, 10434 KB)

  • Externally hosted supplementary file 1
    DOI: doi.org/10.999/airborne.optical.sectioning/bushes
    Link: https://zenodo.org/record/1227183
    Description: The raw and processed (rectified and cropped) recordings and pose data for experiment 1.
  • Externally hosted supplementary file 2
    DOI: doi.org/10.999/airborne.optical.sectioning/tower16
    Link: https://zenodo.org/record/1227246
    Description: The raw and processed (rectified and cropped) recordings and pose data for experiment 2.

Cite This Article

MDPI and ACS Style

Kurmi, I.; Schedl, D.C.; Bimber, O. Airborne Optical Sectioning. J. Imaging 2018, 4, 102.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
