Article

The Potential of Drones and Sensors to Enhance Detection of Archaeological Cropmarks: A Comparative Study Between Multi-Spectral and Thermal Imagery

by Paula Uribe Agudo 1,*, Jorge Angás Pajas 2, Fernando Pérez-Cabello 3, Jaime Vicente Redón 4 and Beatriz Ezquerra Lebrón 4

1 Department of Prehistory, Ancient History and Archaeology, ATHAEMHIS Group, Faculty of Geography and History, University of Salamanca, c/Cervantes s/n, 37008 Salamanca, Spain
2 School of Land Surveying, Geodesy and Mapping Engineering, Polytechnic University of Madrid, Camino de la Arboleda s/n, 28031 Madrid, Spain
3 Department of Geography and Spatial Management, Geoforest-IUCA Group, University of Zaragoza, c/Pedro Cerbuna 12, 50009 Zaragoza, Spain
4 Museum of Teruel, Pza. Fray Anselmo Polanco nº3, 44001 Teruel, Spain
* Author to whom correspondence should be addressed.
Drones 2018, 2(3), 29; https://doi.org/10.3390/drones2030029
Submission received: 3 July 2018 / Revised: 2 August 2018 / Accepted: 12 August 2018 / Published: 29 August 2018
(This article belongs to the Special Issue (Re)Defining the Archaeological Use of UAVs)

Abstract

This paper presents experimentation carried out at the Roman Republican city of La Caridad (Teruel, Spain), where different tools have been applied to obtain multispectral and thermal aerial images to enhance the detection of archaeological cropmarks. Two different drone systems were used: a Mikrokopter designed by Tecnitop SA (Zaragoza, Spain) and an eBee produced by SenseFly Company (Cheseaux-sur-Lausanne, Switzerland). Thus, in this study, we have combined in-house manufacturing with commercial products. Five drone sensors, with different spectral ranges and spatial resolutions, were tested and compared in terms of their ability to identify buried remains in archaeological settlements by means of visual recognition. This paper compares the images captured with the different sensors to test the potential of this technology for archaeological purposes. The method used for the comparison of the tools is based on direct visual inspection, as in traditional aerial archaeology. Through interpretation of the resulting data, our aim has been to determine which drones and sensors obtained the best results in the visualization of archaeological cropmarks. The experiment at La Caridad therefore demonstrates the benefit of using drones with different sensors to monitor archaeological cropmarks for a more cost-effective assessment, better spatial resolution and digital recording of buried archaeological remains.

1. Introduction

In recent years, various new technologies have emerged in the field of archaeology, making it possible to obtain accurate data quickly and at lower cost. Firstly, the reliability of drones has rapidly improved, allowing archaeologists to obtain accurate images in variable weather conditions. The possibilities offered to archaeology by this emergent, cutting-edge technology are now generating great interest, as reflected in recent scientific publications on the topic [1,2,3,4,5,6,7,8,9,10,11], to cite some examples. Secondly, parallel to the drone market, various types of software, such as Agisoft PhotoScan [12,13] and Pix4D [14], have improved mosaicing, orthorectification, image georeferencing and multispectral post-processing, allowing archaeologists to obtain a wide range of results such as accurate maps, 3D models or digital terrain and elevation models [15].
Likewise, the evolution of commercial sensors (cameras) adapted for drones, above all those developed for precision agriculture, now allows the use of devices with high spatial resolution and wide spectral coverage. Thus, it is possible to mount multispectral cameras on these platforms, covering visible, near-infrared and thermal [6] wavelengths, similar to those provided by satellite imagery.
All these technological advances have been used in this project. Our aim has been to compare how well different drones and sensors can identify remains buried beneath flat cultivated fields. The purpose of this experiment was to determine which tools were most useful and obtained the best results in the visualization of buried archaeological remains.
These hidden archaeological remains can be revealed by means of so-called vegetation marks or crop marks, on which remote sensing in archaeological research is based [16]. As Verhoeven and Vermeulen emphasize [17], crop marks are often the main focus of both aerial observers and image interpreters because of their potential to disclose very detailed morphological information about hidden features and structures, particularly in cereals such as wheat and barley, as in this case study.
Near-infrared [18], multispectral and hyperspectral [19] images can highlight crop marks in vegetated areas by exploiting the spectral information conveyed in reflected solar radiation. In recent years, different vegetation indices and several other image features have been used, with varying success, to improve the interpretation of remotely sensed images for archaeological research [20,21,22]. In this study, we use some of these image features to highlight the crop marks at La Caridad (Spain). The aim of this paper is to examine the potential use of different drones with different sensors. The method used for the comparison of the tools is based on direct visual inspection, as in traditional aerial archaeology.

2. Study Area: La Caridad (Caminreal, Teruel, Spain)

The Roman Republican archaeological site of La Caridad lies on the left bank of the river Jiloca and has a trapezoidal shape covering 12.5 hectares (Figure 1). It is considered one of the most important Republican-era settlements in the Ebro valley. The original name of the city is still unknown, and scholars hold differing opinions on the matter [23,24,25].
The absence of previous settlements and its topography allowed orthogonal urban planning, with cardines and perpendicular decumani of uniform width [26]. The archaeological data confirm a single and very brief occupation between the 2nd century B.C. and the destruction of the city around 80–72 B.C. However, fragments of Hispanic terra sigillata found within the site seem to indicate that part of the settlement remained in use into the 1st century A.D. The core of the settlement was built ex novo. Thanks to the excavations by the Museum of Teruel, which began in 1984, much is now known about its households, which contributes valuable information on domestic architecture in the Ebro valley in the Republican era. However, the most striking fact about this city is that, although the architecture is distinctly Roman, its inhabitants were locals, as confirmed by the material culture and inscriptions found. According to Vicente and Ezquerra [25], analysis of the inscriptions denotes the presence of Celtiberians to a lesser extent than Iberians. They highlight the total absence of Latin inscriptions, with the exception of a potter's stamp on a ceramic mortar, and suggest that the city was created to house veteran Celtiberian auxiliaries.
This case study was chosen because La Caridad is one of the oldest Roman cities in northern Hispania. The Roman houses of La Caridad are well known thanks to the excavations, but the public part of the city, where the forum and temples should have been located, is still unknown.
A full study of the site has been carried out, in which different Roman houses have been documented (Figure 2). Figure 3 shows the only known remains of structures that are probably related to public architectural elements, perhaps the forum or the temples. This knowledge has been important for this investigation because domestic construction is the only type currently known from the different fieldwork seasons.
The cereal crop fields indicated in Figure 4 were chosen to facilitate experimentation with the different tools. These fields were chosen because they are located in the suburban zone (outside the walls) of the old Roman city. Moreover, we know that the same cereal (wheat) is grown every year.
The drones were used in different years and at different times of the year during successive archaeological aerial campaigns. In the first campaign, the crops were at the beginning of senescence, whereas in the second the wheat was at its maximum vigor. The method used for the comparison of the tools has been based on direct visual inspection, as in traditional aerial archaeology. We consider that, for this visual interpretation method, the fact that the flights were carried out on different dates does not alter the visual results. Indeed, multi-temporal data covering different phenological stages can improve the visual enhancement of the crop marks.

3. Materials

3.1. Drone Systems

As shown in Table 1, two different drone systems were used: a Mikrokopter designed by Tecnitop SA together with the University of Zaragoza (Spain) and an eBee produced by SenseFly Company (Cheseaux-sur-Lausanne, Switzerland). Thus, in this study we have combined in-house manufacturing with commercial equipment. The former allows greater versatility in adapting different sensors whereas the latter, which comes equipped with non-interchangeable cameras, provides a faster and more accurate workflow.
(a) In-house system equipment: The Tecnikopter
An eight-rotor UAV with a folding carbon structure—manufactured in-house—was used in our project, with a Mikrokopter v. 2.02 ME flight control board. This system integration was designed by Tecnitop SA together with the University of Zaragoza Department of Design and Manufacturing, enabled by the SME (Small and Medium Enterprise) Technology Transfer Program of the Aragonese Foundation for Research and Development. The project arose from the need for a platform capable of integrating particular sensors, in this case photogrammetric and multispectral cameras already available in the research group but which could not be used with commercial systems. The area covered per flight is approximately five hectares. The system is operated with a free software application called MKTools [27].
(b) Commercial system equipment: The eBee
The eBee is a professional mapping drone manufactured by SenseFly Company (Cheseaux-sur-Lausanne, Switzerland) https://www.sensefly.com/. It is a fully autonomous drone that can capture high-resolution aerial photos and generate accurate orthomosaics. This flying wing is a very light UAV (700 g with the camera) and its wingspan is 96 cm (Figure 5). It has an on-board artificial intelligence system, which analyzes data from an Inertial Measurement Unit (IMU) and an on-board GPS to optimize every aspect of its flight. The main advantage of this flying wing compared to the Tecnikopter is the size of the area covered per flight. However, the wind speed has to be lower than 40 km/h. A lithium polymer battery provides at least 50 min of continuous operation. The flight is operated with the eMotion® software, provided with the eBee. This software allows the flight to be planned before the mission and interaction with the drone during the flight. All the parameters, such as the height of the flight, the overlap between the images or the images’ spatial resolution, are user-specified before each flight [28].

3.2. Drone Sensors

Five drone sensors were tested (Figure 6) and compared in terms of their ability to identify buried remains in archaeological settlements. The sensors have different spectral ranges and spatial resolutions. Table 2 lists their relevant properties.
(a) E-PM2
To capture visible wavelengths with the TecniKopter, an OLYMPUS E-PM2—a micro four-thirds compact camera with a 16-megapixel Live MOS sensor (max. resolution 17.2 megapixels) and TruePic VI processor—was used. This camera has a primary color (RGB) filter known as a Bayer filter [29,30].
(b) MINI-MCA 6 Tetracam
The Tetracam MINI-MCA is a six-band multispectral camera whose performance, size and weight make it suitable for drone use. It has six channels corresponding to different filters with different wavelengths. The system works with a master camera that is responsible for synchronizing the other cameras (slaves), calculating exposure and handling GPS synchronization for georeferencing each image. Each camera has its own (Andover spectroscopic) filter with a bandwidth between 10 and 20 nm, and together they cover wavelengths from 490 to 940 nm.
(c) S110 RGB
The commercial eBee system uses the CANON PowerShot ELPH 110 HS RGB camera, with a resolution of 16.1 Mpixel, to acquire RGB imagery. Its focal length ranges from 4.3 mm to 21.5 mm which, at the shortest focal length and a flight height of 150 m, for example, yields a ground sampling distance of 4.69 cm (a sketch of this calculation is given at the end of this subsection).
(d) MultiSPEC 4C eBee
The multiSPEC 4C is a sensor unit developed by Airinov's agronomy specialists and customized for the eBee Ag. It contains four separate 1.2 MP sensors that are controlled by the eBee Ag's autopilot and acquire data across four discrete bands; each sensor features a global shutter for sharp, undistorted images.
The multiSPEC 4C provides image data across four bands—Green, Red, Red-edge and NIR—with no spectral overlap. In addition, its upward-facing irradiance sensor automatically compensates for sunlight variations, improving the accuracy of the reflectance measurements.
(e) thermoMAP
The thermoMAP is a professional camera payload that produces greyscale images and video with high pixel density and thermal resolution. Its temperature sensitivity range makes it ideal for a wide range of industrial applications such as solar panel inspection, site inspection, stockpile hotspot detection and agricultural applications such as irrigation and plant health monitoring. The thermoMAP automatically calibrates itself during the mission. It briefly closes its shutter at each waypoint and images the back of the shutter. A built-in temperature sensor accurately measures the temperature.
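As a rough check on the ground sampling distance quoted for the S110 above, the following is a minimal sketch of the standard GSD relation. Only the 4.3 mm focal length and the 150 m flight height come from the text; the 6.17 mm sensor width and the 4608-pixel image width are assumed values for a typical 1/2.3-inch sensor and are not given in the article.

```python
def ground_sampling_distance(sensor_width_mm, focal_length_mm,
                             flight_height_m, image_width_px):
    """Approximate ground sampling distance (cm/pixel) for a nadir image."""
    footprint_width_m = sensor_width_mm * flight_height_m / focal_length_mm
    return footprint_width_m / image_width_px * 100.0  # m -> cm

# Assumed sensor geometry (1/2.3" class); only the 4.3 mm focal length
# and the 150 m flight height come from the text.
print(ground_sampling_distance(6.17, 4.3, 150.0, 4608))  # ~4.7 cm/pixel
```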

4. Methods

The method used for the comparison of the tools is based on direct visual inspection, as in traditional aerial archaeology. As Figure 4 shows, the buried remains cannot be visualized in the visible spectrum; the structures are only identified by the sensors whose spectral ranges reach into and beyond the NIR. The visual comparison between these sensors is presented in the Results section.

4.1. Flight Planning and Data Acquisition Procedure

The flights were planned in different manners:
(a)
With the Tecnikopter we used MKTools, a free software application. The purpose of this planning was to define a route that would allow a longitudinal and transversal image overlap of 80%. The average flight speed was 10 m/s at an altitude of approximately 100 m. The RGB images were recorded in a raw format (the proprietary Olympus format being .orf) with the same fixed Zuiko Digital ED 12 mm f/2.0 lens, an ISO of 200 and manual focus set to infinity. For the Tetracam camera, images were triggered every two seconds while maintaining a constant speed of 4 m/s.
(b)
With the commercial eBee system we used the eMotion software to calculate the flight plan, which is based on automatic photogrammetric parameters. Once the configuration is set (GSD and lateral and longitudinal overlap percentages), the software automatically calculates the number of strips required to cover the archaeological area (a minimal sketch of this calculation is given below).
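For illustration only, here is a minimal sketch of how photo spacing and the number of strips follow from the image footprint and the requested overlaps; the footprint and survey-area values are placeholders, not the actual mission parameters computed by eMotion or MKTools.

```python
import math

def flight_plan(footprint_along_m, footprint_across_m,
                area_along_m, area_across_m,
                overlap_long=0.8, overlap_lat=0.8):
    """Photo spacing, strip spacing and counts needed to cover a rectangular
    area with the requested longitudinal and lateral overlap fractions."""
    photo_spacing = footprint_along_m * (1.0 - overlap_long)   # along each strip
    strip_spacing = footprint_across_m * (1.0 - overlap_lat)   # between strips
    n_strips = math.ceil(area_across_m / strip_spacing) + 1
    photos_per_strip = math.ceil(area_along_m / photo_spacing) + 1
    return photo_spacing, strip_spacing, n_strips, photos_per_strip

# Placeholder footprint (~110 m x 150 m) and a 400 m x 500 m survey block.
print(flight_plan(110.0, 150.0, 400.0, 500.0))
```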
For both systems, the acquisition of field data requires the determination of several control points on the ground, known as GCPs (Ground Control Points). We used the same GCPs for all the flights (Figure 7). This acquisition required a good geometric distribution within the defined flight area, with the objective of scaling the final model to a metric system and inserting it into the selected coordinate reference system. The geodetic reference system used was the official system in Spain—ETRS89 (Royal Decree 1071/2007 of 27 July)—in UTM zone 30. The GCPs were acquired using a Leica GS14 GNSS system. The connection between the mobile equipment and a fixed network was carried out via a GPRS connection to the ARAGEA active geodetic network of the Aragonese Government (http://gnss.aragon.es/). Subsequently, between 15 and 20 points were distributed within the defined area, and all the points were used for all the sensors in the field data collection. Landmarks ranged from easily identified points in the field, such as roads or stones, to targets and survey stakes in areas with vegetation. Consequently, a file with the format "x, y, z, cod" was obtained, to permit later photogrammetric support.

4.2. Image Processing

(a)
For the Tecnikopter images, the orthorectification and mosaicing processes were performed with Agisoft PhotoScan (2014) software [12,13] for all wavelengths. In this way, we obtained a structure-from-motion (SfM) elevation model using a radiometric adjustment by Adaptive Orthophoto. For the texture of the 3D model, the software performs a radiometric adjustment based on the average of each pixel in each image taken. The software calibrates the images assuming that every camera model used specifies the transformation from point coordinates in the local camera coordinate system to pixel coordinates in the image frame (Figure 8). The local camera coordinate system has its origin at the camera projection center, where the Z axis points towards the viewing direction, the X axis points to the right and the Y axis points down. The image coordinate system has its origin at the top left image pixel, with the center of the top left pixel having coordinates (0.5, 0.5). The X axis in the image coordinate system points to the right and the Y axis points down. Image coordinates are measured in pixels. The equations used to project a point in the local camera coordinate system onto the image plane are given in [31] for each supported camera model.
(b)
For the eBee we used Pix4D [14] for the initial processing, to match image pairs and generate a point cloud, mesh and orthomosaic. This software uses a radiometric calibration that allows the calibration and correction of the image reflectance, taking the illumination and sensor influence into consideration (Figure 8). The external camera parameters are different for each image. They are given by [32]:
  • T = (Tx, Ty, Tz) the position of the camera projection center in world coordinate system.
  • R, the rotation matrix that defines the camera orientation with angles ω, ϕ, κ (PATB convention):

$$R = R_x(\omega)\,R_y(\phi)\,R_z(\kappa) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\omega & -\sin\omega \\ 0 & \sin\omega & \cos\omega \end{pmatrix}\begin{pmatrix} \cos\phi & 0 & \sin\phi \\ 0 & 1 & 0 \\ -\sin\phi & 0 & \cos\phi \end{pmatrix}\begin{pmatrix} \cos\kappa & -\sin\kappa & 0 \\ \sin\kappa & \cos\kappa & 0 \\ 0 & 0 & 1 \end{pmatrix}$$

$$= \begin{pmatrix} \cos\phi\cos\kappa & -\cos\phi\sin\kappa & \sin\phi \\ \cos\omega\sin\kappa + \sin\omega\sin\phi\cos\kappa & \cos\omega\cos\kappa - \sin\omega\sin\phi\sin\kappa & -\sin\omega\cos\phi \\ \sin\omega\sin\kappa - \cos\omega\sin\phi\cos\kappa & \sin\omega\cos\kappa + \cos\omega\sin\phi\sin\kappa & \cos\omega\cos\phi \end{pmatrix}$$
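As a sanity check on the rotation convention above, a short sketch that builds R as Rx(ω)·Ry(ϕ)·Rz(κ) and verifies that it is orthogonal; this reproduces only the formula, not Pix4D's processing chain, and the angle values are placeholders.

```python
import numpy as np

def rotation_matrix(omega, phi, kappa):
    """R = Rx(omega) @ Ry(phi) @ Rz(kappa), angles in radians."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return Rx @ Ry @ Rz

R = rotation_matrix(0.1, -0.05, 1.2)   # placeholder angles
print(np.allclose(R @ R.T, np.eye(3)))  # True: R is orthogonal
```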
After obtaining the orthophotos and 3D models, the data were adapted for use with Erdas Imagine (2013) and ArcGIS (2013). This adaptation consisted of the creation of a layer stack with the VIS and NIR images in .tiff format, generating multilayer files with six bands of multispectral information in each case. For the Tecnikopter, the band order of the resulting images is: 1 = 490–500 nm; 2 = 550–560 nm; 3 = 680–690 nm; 4 = 720–730 nm; 5 = 800–810 nm; 6 = 920–940 nm. For the images obtained with the Multispec 4C on the eBee: 1 = 550 nm; 2 = 660 nm; 3 = 735 nm; 4 = 790 nm. The thermoMAP produces panchromatic greyscale images in a single band centered at 10,000 nm.
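Below is a minimal sketch of building such a multi-band layer stack from single-band orthophotos. It assumes the per-band GeoTIFFs are already co-registered and uses the rasterio library; the file names are hypothetical, and the original workflow used Erdas Imagine and ArcGIS rather than this code.

```python
import rasterio

# Hypothetical per-band orthophotos in the Tetracam band order (490-940 nm).
band_files = ["band_490.tif", "band_550.tif", "band_680.tif",
              "band_720.tif", "band_800.tif", "band_920.tif"]

with rasterio.open(band_files[0]) as src:
    profile = src.profile            # reuse georeferencing of the first band
profile.update(count=len(band_files))

with rasterio.open("stack_vis_nir.tif", "w", **profile) as dst:
    for i, path in enumerate(band_files, start=1):
        with rasterio.open(path) as src:
            dst.write(src.read(1), i)   # write each band into the stack
```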

4.3. Image Post-Processing

The principal objective of these different procedures has been to enhance visual detection of the cropmarks for the identification of buried archaeological remains.
The following techniques of digital image processing have been applied in combination for the multispectral images: (a) the generation of new bands of information (spectral indices and Principal Components PCs) from the original bands; and (b) the application of radiometric enhancements and the creation of compositions in false color (RGB), applied to both the original and the new bands.
(a) The generation of indices and PCs allows us to extract continuous information and enhance significant aspects of the images from the original bands. For this study, the mathematical relationship between bands has been employed to estimate biophysical parameters such as plant vigor, in the case of the vegetation indices, and the statistical relationship between bands has been used to create the PCs. For the thermal images we have only operated with the temperature index (°C), estimated directly with Pix4D. In this case, the software uses the pixel values in the band (10,000 nm) and calculates:
Thermal_Index = pixel value/100 − 100.
The vegetation indices consist of pixel-by-pixel ratios or quotients between the digital levels of two or more image bands [33]. The potential of this technique in the field of archaeology was raised by Verhoeven [18,34] and Bennett et al. [20]. Specifically, this project has used: NDVI—Normalized Difference Vegetation Index [35], Green NDVI—Green Normalized Difference Vegetation Index [36]—and SAVI—Soil Adjusted Vegetation Index [37]. These indices use three key spectral ranges (Green, Red and NIR) related to spatial differences in the density and physical state of the vegetation and in soil moisture that might be linked to the presence of archaeological remains [34].
$$NDVI = \frac{NIR - Red}{NIR + Red}$$

$$SAVI = (1 + L)\,\frac{NIR - Red}{NIR + Red + L}$$

$$GNDVI = \frac{NIR - Green}{NIR + Green}$$
where NIR = near-infrared reflectance, Red and Green are the corresponding band reflectances, and L = 0.5 (a minimal sketch of these computations follows).
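Here is a minimal sketch of these computations, assuming co-registered NumPy arrays of band reflectances; the small epsilon guarding against division by zero and the toy input values are our additions, not part of the cited definitions. The thermoMAP conversion given earlier (pixel value/100 − 100) is included for completeness.

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Normalized Difference Vegetation Index [35]."""
    return (nir - red) / (nir + red + eps)

def savi(nir, red, L=0.5, eps=1e-6):
    """Soil Adjusted Vegetation Index [37], with L = 0.5 as in the text."""
    return (1.0 + L) * (nir - red) / (nir + red + L + eps)

def gndvi(nir, green, eps=1e-6):
    """Green Normalized Difference Vegetation Index [36]."""
    return (nir - green) / (nir + green + eps)

def thermal_index(dn):
    """thermoMAP digital numbers to degrees Celsius: DN / 100 - 100."""
    return np.asarray(dn, dtype=float) / 100.0 - 100.0

# Toy values for a vegetated pixel and a bare-soil pixel (illustrative only).
nir, red, green = np.array([0.45, 0.30]), np.array([0.08, 0.20]), np.array([0.10, 0.18])
print(ndvi(nir, red), savi(nir, red), gndvi(nir, green))
print(thermal_index([11500, 12030]))   # -> [15.0, 20.3] degrees C
```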
NDVI has proved effective in the detection of crop marks and has been widely used to detect archaeological features [20]. GNDVI is more sensitive to variation in chlorophyll content [36], and SAVI is used to reduce the spectral contribution of the soil. Both aspects are especially important over cereal crops, where phenological dynamics and management interactions might affect the identification of signals related to buried structures.
Principal Components Analysis (PCA) is a statistical procedure that reduces redundant information by transforming several quantitative variables (common underlying dimensions) into new, uncorrelated variables. The first two variables or components retain the highest proportion of variation, and from the third component onwards the percentage of noise begins to increase. This statistical process is performed using a variance-covariance matrix without losing significant portions of the original information. This capacity for synthesis is of great interest in the field of remote sensing, as the acquisition of images in adjacent spectral bands often involves the detection of redundant information, since canopy cover often exhibits similar behavior at close wavelengths.
In the field of archaeology, PCA has been used in particular when working with hyperspectral sensors [21,22], contributing significantly to the identification of marks of archaeological structures. In this study, it has been applied to the VIS-NIR image (containing all bands) to create new bands for inclusion in false color compositions, as sketched below.
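A minimal PCA sketch for a stacked VIS-NIR image held as a (bands, rows, cols) NumPy array; it applies the variance-covariance formulation described above directly rather than any specific remote-sensing package, and the random input is only a placeholder.

```python
import numpy as np

def principal_components(stack):
    """stack: (bands, rows, cols) array. Returns the PCs in the same shape,
    ordered by decreasing explained variance, plus the variances."""
    b, r, c = stack.shape
    X = stack.reshape(b, -1).astype(float)
    X -= X.mean(axis=1, keepdims=True)      # center each band
    cov = np.cov(X)                         # band-to-band covariance matrix
    eigval, eigvec = np.linalg.eigh(cov)    # ascending eigenvalues
    order = np.argsort(eigval)[::-1]
    pcs = eigvec[:, order].T @ X            # project the data onto the components
    return pcs.reshape(b, r, c), eigval[order]

pcs, variance = principal_components(np.random.rand(6, 100, 100))
print(variance / variance.sum())            # proportion of variance per PC
```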
(b) With respect to the radiometric enhancements, histogram adjustments and false color compositions (RGB) have been applied. According to Jensen [38], linear contrast stretching tends to expand the original digital levels to the minimum and maximum of the display range (0–255) to exploit the sensitivity of the device. In this case, linear contrast adjustment and histogram equalization were used to increase the global contrast; they were defined statistically and interactively using breakpoints and percent clip. With regard to histogram equalization, the pixels of each image are divided into a number of bins such that each bin contains approximately the same number of pixels.
These settings have been applied to the following bands and new channels: the VIS, NIR and VIS-NIR images, NDVI, Green NDVI, SAVI, PC1, PC2, PC3 and the Thermal index. False color compositions are three-band combinations, to each of which one of the additive primary colors (Red, Green and Blue, RGB) is assigned. Here, in place of the original bands, the three NDVIs or the first three PCs have been used to reveal specific patterns and emphasize variations between the site and its surroundings that would not be visible without these kinds of techniques.
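The following is a minimal sketch of the two enhancements described above applied to single bands, together with the stacking of three enhanced channels into a false color composition; the bin count, the 0–255 output range and the placeholder input bands are our own assumptions rather than the exact Erdas Imagine/ArcGIS settings used by the authors.

```python
import numpy as np

def stddev_stretch(band, n_std=2.5):
    """Linear stretch to 0-255 around mean +/- n_std standard deviations."""
    lo, hi = band.mean() - n_std * band.std(), band.mean() + n_std * band.std()
    return np.clip((band - lo) / (hi - lo) * 255.0, 0, 255).astype(np.uint8)

def histogram_equalize(band, bins=256):
    """Histogram equalization via the cumulative distribution function."""
    hist, edges = np.histogram(band.ravel(), bins=bins)
    cdf = hist.cumsum() / hist.sum()
    eq = np.interp(band.ravel(), edges[:-1], cdf * 255.0)
    return eq.reshape(band.shape).astype(np.uint8)

# Placeholder bands standing in for, e.g., the three NDVI variants or the first three PCs.
b1, b2, b3 = (np.random.rand(100, 100) for _ in range(3))
# A false color (RGB) composition is a stack of three enhanced channels.
rgb = np.dstack([stddev_stretch(b1), stddev_stretch(b2), histogram_equalize(b3)])
print(rgb.shape)  # (100, 100, 3)
```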

5. Results

(a) Tecnikopter plus Tetracam
This system has allowed us to use the camera desired for the research, rather than one imposed by the manufacturer, in this case the Tetracam Mini MCA6. Although the workflow with this system is quite complex and requires laborious and time-consuming post-processing, the results allow different buried archaeological remains to be identified. Due to the complexity of the workflow and the deformations that the camera produces, we obtained better results for the crop marks working with individual images. The creation of orthophotos has not proved satisfactory in enhancing the crop marks, as the sensor has problems with the effective size of the camera lens aperture, resulting in what is known as vignetting [39], as well as a poor overlap between bands. In this case, the band 6 images (920–940 nm) with a standard deviation enhancement of 2.5 allow us to obtain the best results (Figure 9G).
Various image treatments, such as obtaining vegetation indices [20] (Figure 10) and PCs, have been applied. This image post-processing has allowed us to identify several anomalies in the ancient remains of the city of La Caridad. In the case of the Tetracam sensor, and taking into account the advanced phenological state of the vegetation at the time of the flights, the techniques that have given the best results are: (1) SAVI with any of the infrared bands (720, 800 or 920 nm) (Figure 11B–D, respectively), and (2) the false color composition of the PCs (components 123) with an equalized enhancement of the histogram (Figure 11G).
In the SAVI indices, a succession of quadrangular forms can be made out in the northern part of the field, with differing grey tones that represent different levels of greenness of the cereal, following a spatial pattern that is clearly orthogonal and non-natural.
These forms reveal the presence of buried archaeological remains. The different levels of greenness will have been determined by the varying soil conditions associated with the architectural elements documented here. In some cases, the presence of the remains at a sub-surface level will have contributed to the modification of the physical properties of the ground and either speeded up or slowed down the ageing process of the cereal. With the second technique—generating a false color composition from the first principal components and then applying an equalized enhancement of the histogram (Figure 11G)—we can observe how these quadrangular elements appear in purple and turquoise tones.
However, in the process of planting the cereal, no similar planting pattern is known to have been applied. Taking into account the usual physical interpretation of the PCs and the false color composition used, the turquoise tones are associated with very green, moist and shiny surfaces while the purple tones reveal a low degree of greenness and brightness.
(b) eBee plus Multispec 4C
This commercial system only allows us to work with the Multispec 4C multispectral camera. The process of data capture and post-processing is faster as the workflow is fully systematized. The results are satisfactory for the identification of archaeological remains, as can be seen in Figure 12 and Figure 13. In this case, the cropmarks are best visualized in the NIR orthomosaic (Figure 12E), in which we can identify, in lighter grayish tones, the succession of quadrangular structures distributed in the northern area of the field. These shades reveal the higher digital numbers of the vegetation at this wavelength. In contrast, the southernmost structure can be clearly identified using the 660 nm red band (Figure 12C). In this case, the lighter tone of this form signifies a lower presence of vegetation, since the pigments (chlorophylls and xanthophylls) absorb most of the radiation at this wavelength. Likewise, the vegetation indices collected in Figure 13C,D (GreenNDVI and SAVI, respectively) highlight the lower photosynthetic activity of the vegetation. With regard to the false color composition obtained with this sensor (Figure 13F), in which the first two components are assigned to the three color channels (RGB), component 1 is displayed simultaneously in red and blue. The southern structure is shown in green tones while the rectangular shapes in the northern part of the field appear in bright magenta shades.
The quadrangular structures do not appear in the same tones; consequently, we can affirm that each quadrangular structure behaves differently, without a systematic correspondence with the cover type.
(c) eBee plus ThermoMap
According to the results obtained in this experiment, basing our method on direct visual inspection, it seems that this is the best system for enhancing the archaeological crop marks. Simply by using the temperature index we can observe (Figure 14) that the sharpness of the display of the remains is increased.
In Figure 14E, the lower temperature recorded over the crop marks (bluish tones) stands out against the highest temperatures in the field (~20 °C, yellowish tones). The fact that the cultivation field had already been harvested favors the identification of the remains, due to the absence of the homogenizing effect of vegetation at its phenological maximum and its greater emissivity. In the field immediately to the west, whose cover is in a phase near this maximum, the masking effect that the vegetation has on the cropmarks can be verified. Therefore, we think that with this type of sensor, as in classical aerial archaeology, it would be necessary to carry out the flights when there is greater contrast within the same crop. Additionally, the spatial resolution is lower than in the previous cases.

6. Discussion

The purpose of this paper has been to determine which drones and sensors were most useful and obtained the best results in the visualization of buried archaeological remains by means of enhancing the archaeological crop marks. To achieve this objective, we have used two drone systems and five sensors. These systems have been applied to archaeological surveying of an area defined within the Republican Roman city of La Caridad (Teruel, Spain), a site that is partially excavated. The experiment was conducted with two drone systems: one in-house (the Tecnikopter) and a commercial system (the eBee). Flights with the Tecnikopter were performed using the Tetracam MiniMCA6 multispectral camera. With the commercial eBee, a multispectral camera (the MultiSpec 4C) and a thermal camera (the ThermoMap) were used. The method used for the comparison of the tools has been based on direct visual inspection, as in traditional aerial archaeology.
The rapid technical progress that this sector is experiencing means that we are often unable to discern which tool is the most useful, given the sheer range of options, or unable to obtain enough funding to use the tool in question. The ultimate goal of this work has been to find the best tool that would enable a standardized work process, applicable at all times and with which we knew that satisfactory results would be obtained.
Obviously, the in-house system—the Tecnikopter—allows us to work with the desired sensor, in this case the Tetracam Mini MCA6. This sensor provides good results but is difficult to work with and, above all, makes it hard to establish a standardized workflow. The first problem arises with the adaptation of the available camera to the drone system, and the difficulties increase considerably when downloading the data, dealing with the deformation of the images and carrying out their subsequent treatment. Yet the 920–940 nm wavelength filter is still an essential element for archaeological applications, since other sensors on the market do not reach this wavelength.
The commercial system constrains us to particular products, in this case the MultiSpec 4C and the ThermoMap, but facilitates the entire work process—both data collection in the field and the composition of the mosaics and subsequent image processing. We think the MultiSpec 4C could offer results similar to those of the Tetracam Mini MCA6 if it were fitted with an additional filter close to the Tetracam's 920–940 nm band. As it stands, the archaeological results are less sharp when it comes to displaying the remains, and the post-processing of the images with remote sensing techniques is more time-consuming when extracting the data.
As we have demonstrated, basing our method on direct visual inspection, the most effective tool (Figure 14) has been the combination of the eBee with the ThermoMap system. As other work has shown [18,21], the use of longer wavelengths enables enhanced visualization of archaeological cropmarks. Just as in classic aerial surveying using aircraft [40], we have observed how the phenological state influences the characterization of the crop marks. As shown in the results obtained, the archaeological remains are best visualized when the crop has begun its senescent phase. However, in employing a thermal camera we lose spatial resolution due to the characteristics of the camera.
Consequently, we believe that there is not yet a single product (drone plus sensor) on the market that provides optimum results in the field of Archeology, as is the case in other fields such as precision agriculture. Thus, the ideal solution is to combine the available tools to achieve the best topographic results using one method, and remote sensing results using another. In any case, we think that these new tools (drone plus different sensors) should be added to the modern archaeological praxis as a common and standardized procedure in all explorations prior to or instead of an archaeological excavation, especially at sites which are not threatened by looting and/or construction works.

7. Conclusions

We can affirm that the use of drones with multispectral or thermal cameras is a useful tool in the field of archeology, as shown by these results and other publications [1,2,3,4,5,6,7,8,9,10,11]. The problem is in choosing the sensor to use to obtain the best results.
With the in-house system (Tecnikopter and Tetracam), we have found that the workflow is quite complex and involves laborious post-processing, but the results allow us to identify buried archaeological remains. With the eBee, we have been limited to the cameras supplied for it (the Multispec 4C and the ThermoMap); the process of data capture and post-processing has been faster, as the workflow is fully systematized. According to the data obtained, and by means of purely visual comparison, the best results have been obtained with the combination of the eBee and the ThermoMap. Simply by using the temperature index we can observe that the sharpness of the display is greater than with the other tools, although it is not homogeneous across the rest of the cultivated fields that make up the ancient site. Additionally, although the results obtained with the latter system allow better identification of buried remains, spatial resolution is lost.
It is harder to speculate as to the nature of the archaeological structures but we can conclude that each documented structure behaves in a specific way. This means that the data obtained by these sensors, in this particular case, is not related to the vegetation cover but to what is happening in the subsoil.
As a consequence, we focused on the thermal data and posited that these could be positive crop marks [40,41], produced by the growth of more vegetation in a more humid zone, which would explain their lower temperature. In this way we could hypothesize that they correspond to deeper structures, such as tanks, sinks or foundation caissons, since their widths vary between 7 m and 21 m.
This paper has shown that the use of drones with multispectral or thermal cameras is a useful tool in the field of archaeology, providing data for archaeological surveying. As stated above, the use of these tools should become a common, standardized procedure before starting an excavation. However, until now, no comparison of data obtained with different drone systems and sensors, indicating which method obtained the best results, had been published. Even after our experiments, we conclude that there is not yet a single product (drone plus sensor) on the market that attains optimum results in the field of archaeology, as is the case in other fields such as precision agriculture. As explained by other authors [42], the character of archaeological features is so variable that it is not possible to establish a more automatic way of identifying possible examples. No single technique or tool was universally successful.

Author Contributions

P.U. and J.A. conceived and designed the research; P.U. and J.A. collected the data; P.U., J.A. and F.P. analyzed the data and wrote the study; J.V. and B.E. contributed the analysis of the fieldwork and the interpretation of the excavations.

Funding

This research was funded by the Museum of Teruel.

Acknowledgments

We would like to thank Joan Cano and Romel de Blas for all their help during this experimentation carried out in the Roman Republican city of La Caridad (Teruel, Spain).

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Asandulesei, A. Inside a Cucuteni Settlement: Remote Sensing Techniques for Documenting an Unexplored Eneolithic Site from Northeastern Romania. Remote Sens. 2017, 9, 41. [Google Scholar] [CrossRef]
  2. Campana, S. Drones in Archaeology. State-of-the-art and Future Perspectives. Archaeol. Prospect. 2017, 24, 275–296. [Google Scholar] [CrossRef]
  3. Sonnemann, T.F.; Herrera, H.; Hofman, C.L. Applying UAS Photogrammetry to Analyze Spatial Patterns of Indigenous Settlement Sites in the Northern Dominican Republic. In Digital Methods and Remote Sensing in Archaeology; Forte, M., Campana, S., Eds.; Springer: Basel, Switzerland, 2016; pp. 71–90. [Google Scholar]
  4. Sonnemann, T.F.; Ulloa, J.; Hofman, C.L. Mapping Indigenous Settlement Topography in the Caribbean Using Drones. Remote Sens. 2016, 8, 791. [Google Scholar] [CrossRef]
  5. Fernández-Hernandez, J.; González-Aguilera, D.; Rodríguez-Gonzálvez, P.; Mancera-Taboada, J. Image-Based Modelling from Unmanned Aerial Vehicle (UAV) Photogrammetry: An Effective, Low-Cost Tool for Archaeological Applications. Archaeometry 2015, 57, 128–145. [Google Scholar] [CrossRef]
  6. Casana, J.; Kantner, J.; Wiewel, A.; Cothren, J. Archaeological aerial thermography: A case study at the Chaco-era Blue J community, New Mexico. J. Archaeol. Sci. 2015, 45, 207–219. [Google Scholar] [CrossRef]
  7. Uribe, P.; Angás, J.; Pérez-Cabello, F.; de la Riva, F.; Bea, M.; Serreta, A.; Magallón, M.A.; Sáenz, C.; Martín-Bueno, M. Aerial mapping and multi-sensors approaches from remote sensing. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, XL-5/W4, 461–467. [Google Scholar]
  8. Nex, F.; Remondino, F. UAV: Platforms, regulations, data acquisition and processing. In 3D Re-Cording and Modelling in Archaeology and Cultural Heritage; Remondino, F., Campana, S., Eds.; BAR: Oxford, UK, 2014; pp. 73–88. [Google Scholar]
  9. Verhoeven, G.; Doneus, M.; Atzberger, C.; Wess, M.; Rus, M.; Pregesbauer, M.; Briese, C. New approaches for archaeological feature extraction of airborne imaging spectroscopy data. Archaeological prospection. In Proceedings of the 10th International Conference on Archaeological Prospection, Vienna, Austria, 29 May–2 June 2013; pp. 13–15. [Google Scholar]
  10. Hill, A.C. Israel: Low-cost high-tech tools for aerial photography and photogrammetry. Soc. Am. Arch. Rec. 2013, 13, 25–29. [Google Scholar]
  11. Eisenbeiss, H.; Zhang, L. Comparison of DSMs generated from mini UAV imagery and terrestrial laser scanner in a cultural heritage application. Int. Arch. Photogramm. Remote Sens. 2006, 36, 90–96. [Google Scholar]
  12. Verhoeven, G. Taking computer vision aloft—Archaeological three-dimensional reconstructions from aerialphotographs with PhotoScan. Archaeol. Prospect. 2011, 18, 67–73. [Google Scholar] [CrossRef]
  13. De Reu, J.; Plets, G.; Verhoeven, G.; De Smedt, P.; Bats, M.; Cherretté, B.; De Maeyer, W.; Deconynck, J.; Herremans, D.; Laloo, P.; et al. Towards a three-dimensional cost-effective registration of the archaeological heritage. J. Archaeol. Sci. 2013, 40, 1108–1121. [Google Scholar] [CrossRef]
  14. Daponte, P.; De Vito, L.; Mazzilli, G.; Picariello, F.; Rapuano, S. A height measurement uncertainty model for archaeological surveys by aerial photogrammetry. Measurement 2017, 98, 192–198. [Google Scholar] [CrossRef]
  15. Remondino, F.; Barazzetti, L.; Nex, F.; Scaioni, M.; Sarazzi, D. UAV photogrammetry for mapping and 3Dmodeling—Current status and future perspectives. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2011, 38, C22. [Google Scholar]
  16. Jones, R.J.; Evans, R. Soil and crop marks in the recognition of archaeological sites by airphotography. In Aerial Reconnaissance for Archaeology; Wilson, D.R., Ed.; The Council for British Archaeology: London, UK, 1975; Volume 12, pp. 1–11. [Google Scholar]
  17. Verhoeven, G.; Vermeulen, F. Engaging with the Canopy—Multi-Dimensional Vegetation Mark Visualisation Using Archived Aerial Images. Remote Sens. 2015, 8, 752. [Google Scholar] [CrossRef]
  18. Verhoeven, G. Near-infrared aerial crop mark archaeology: From its historical use to current digital implementations. J. Archaeol. Method Theory 2012, 19, 132–160. [Google Scholar] [CrossRef]
  19. Cerra, D.; Agapiou, A.; Cavalli, R.M.; Sarris, A. An Objective Assessment of Hyperspectral Indicators for the Detection of Buried Archaeological Relics. Remote Sens. 2018, 10, 500. [Google Scholar] [CrossRef]
  20. Bennett, R.; Welham, K.; Hill, R.A.; Ford, L.J. The application of Vegetation Indices for the prospection of Archaeological features in grass-dominated Environments. Archaeol. Prospect. 2010, 19, 209–218. [Google Scholar] [CrossRef]
  21. Traviglia, A. Archaeological usability of hyperspectral images: Successes and failures of images processing techniques. In From Space to Place, Proceedings of the 2nd International Conference on Remote Sensing in Archaeology, Roma, Italy, 4–7 December 2006; Forte, M., Campana, S., Eds.; Archaeopress: Oxford, UK, 2006; pp. 123–130. [Google Scholar]
  22. Aqdus, S.A.; Hanson, W.S.; Drummond, J. The potential of hyperspectral and multi-spectral imagery to enhance archaeological cropmark detection: A comparative study. J. Archaeol. Sci. 2012, 39, 1915–1924. [Google Scholar] [CrossRef]
  23. Burillo, F. Etnias, ciudades y estados en la Celtiberia. In Pueblos, Lenguas y Escrituras en la Hispania Prerromana. VII Coloquio Sobre Lenguas y Culturas Paleohispanica; Villar, F., Beltrán, F., Eds.; Universidad de Salamanca: Salamanca, Spain, 1999; pp. 109–140. [Google Scholar]
  24. Pérez, L. La Ubicación de Osicerda. El Miliario Extravagante 1990, 2, 8–9. [Google Scholar]
  25. Vicente, J.; Ezquerra, B. La tésera de Lazuro: Un nuevo documento celtibérico en La Caridad (Caminreal, Teruel). Paleohispánica 2003, 3, 251–269. [Google Scholar]
  26. Vicente, J.; Martín, R.; Herce, A.I.; Escriche, C.; Punter, P. La Caridad (Caminreal, Teruel). In La casa Urbana Hispanorromana; Beltran, M., Ed.; Institución Fernando el Católico: Zaragoza, Spain, 1991; pp. 81–130. [Google Scholar]
  27. Colomina, I.; Molina, P. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97. [Google Scholar] [CrossRef]
  28. Chiabrando, F.; D’andria, F.; Sammartano, G.; Spanò, A. UAV photogrammetry for archaeological site survey. 3D models at the Hierapolis in Phrygia (Turkey). Virtual Archaeol. Rev. 2018, 9, 28–43. [Google Scholar] [CrossRef]
  29. Bayer, B.E. Color Imaging Array. U.S. Patent 3971065 A, 20 July 1976. [Google Scholar]
  30. Hirakawa, K.; Wolfe, P.J. Spatio-spectral color filter array design for enhanced image fidelity. IEEE Trans. Image Process. 2008, 17, 1876–1890. [Google Scholar] [CrossRef] [PubMed]
  31. Agisoft Photoscan. Available online: http://www.agisoft.com/pdf/photoscan-pro_1_2_en.pdf (accessed on 1 August 2018).
  32. Pix4d. Available online: https://support.pix4d.com/hc/en-us/articles/202559089-How-are-the-Internal-and-External-Camera-Parameters-defined (accessed on 1 August 2018).
  33. Chuvieco, E. Teledetección Ambiental. La Observación de la Tierra Desde el Espacio; Ariel Ciencias: Barcelona, Spain, 2010. [Google Scholar]
  34. Verhoeven, G. Are We There Yet? A Review and Assessment of Archaeological Passive Airborne Optical Imaging Approaches in the Light of Landscape Archaeology. Geosciences 2017, 7, 86. [Google Scholar] [CrossRef]
  35. Rouse, J.W.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring vegetation systems in the Great Plains with ERTS. In Proceedings of the Third Earth Resources Technology Satellite-1 Symposium, Washington, DC, USA, 10–14 December 1973; Volume SP-351, pp. 309–317. [Google Scholar]
  36. Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298. [Google Scholar] [CrossRef]
  37. Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309. [Google Scholar] [CrossRef]
  38. Jensen, J. Introductory Digital Image Processing; Prentice Hall Series in Geographic Information Science; Prentice Hall: Upper Saddle River, NJ, USA, 1996. [Google Scholar]
  39. Del Pozo, S.; Rodríguez-Gonzálvez, P.; Hernández-López, D.; Felipe-García, B. Vicarious Radiometric Calibration of a Multispectral Camera on Board an unmanned Aerial System. Remote Sens. 2014, 6, 1918–1937. [Google Scholar] [CrossRef]
  40. Musson, C.; Palmer, R.; Campana, S. Flights into the Past. Aerial Photography, Photo Interpretation and Mapping for Archaeology; Edizioni all’ Insegna del Giglio: Firenze, Italy, 2013. [Google Scholar]
  41. Gojda, M.; Hejcman, M. Cropmarks in main field crops enable the identification of a wide spectrum of buried features on archaeological sites in Central Europe. J. Archaeol. Sci. 2012, 39, 1655–1664. [Google Scholar] [CrossRef]
  42. Powlesland, D.; Lyall, J.; Donoghue, D. The application and integration of multi-sensor, non-invasive remote sensing techniques for the enhancement of the sites and monuments record. Heslerton Parish Project, N. Yorkshire, England. Internet Archaeol. 1997, 2. [Google Scholar] [CrossRef]
Figure 1. The situation of the Roman republican archaeological site of La Caridad. The red line corresponds with the estimated extent of the archaeological site.
Figure 2. Different documented structures by means of crop marks and soil marks. The areas selected for this study are A1 and A2. From B to G: other areas with known cropmarks during this investigation.
Figure 3. Archaeological structures not related to Roman houses (Area E in Figure 2): (1) False color composition of PCA of Tetracam, bands 123 with an equalize enhancement of the histogram; (2) NDVI with band 920 and an equalize enhancement of the histogram.
Figure 4. The cereal crop fields chosen for the experimentation, within the red wide line (areas A1 and A2 in Figure 2).
Figure 5. Drone systems: (A) The Tecnikopter (B) eBee.
Figure 6. The five drone sensors used in this experimentation.
Figure 7. Distribution and accuracy of the GCP.
Figure 8. Internal calibration parameters of the sensors.
Figure 9. Tecnikopter plus Tetracam (bands): (A) Visible image obtained by the Canon S110 sensor. (B) Band 490–500 nm with a standard deviation enhancement of 2.5. (C) Band 550–560 nm with a standard deviation enhancement of 2.5. (D) Band 680–690 nm with a standard deviation enhancement of 2.5. (E) Band 720–730 nm with a standard deviation enhancement of 2.5. (F) Band 800–820 nm with a standard deviation enhancement of 2.5. (G) Band 920–940 nm with a standard deviation enhancement of 2.5. (H) Architectural plan produced by means of the interpretation of the images. No crop marks are observed in (B,C). (The extent shown differs between the Tetracam and the other cameras because, with this sensor, crop marks are observed in only one of the fields.)
Figure 10. Tecnikopter plus Tetracam (vegetation indices): (A) Visible image obtained by the Canon S110 sensor. (B) NDVI with band 720 and an equalized enhancement of the histogram. (C) NDVI with band 800 and an equalized enhancement of the histogram. (D) NDVI with band 920 and a standard deviation enhancement of 2.5. (E) GreenNDVI with band 720 and an equalized enhancement of the histogram. (F) GreenNDVI with band 800 and an equalized enhancement of the histogram. (G) GreenNDVI with band 920 and an equalized enhancement of the histogram. (H) Architectural plan produced by means of the interpretation of the images. (The extent shown differs between the Tetracam and the other cameras because, with this sensor, crop marks are observed in only one of the fields.)
Figure 11. Tecnikopter plus Tetracam (vegetation indices and PCs): (A) Visible image obtained by the Canon S110 sensor. (B) SAVI with band 720 and a standard deviation enhancement of 2.5. (C) SAVI with band 800 and a standard deviation enhancement of 2.5. (D) SAVI with band 920 and a standard deviation enhancement of 2.5. (E) Band 2 of the PCs with an equalized enhancement of the histogram. (F) False color composition of PC bands 121 with an equalized enhancement of the histogram. (G) False color composition of PC bands 123 with an equalized enhancement of the histogram. (H) Architectural plan produced by means of the interpretation of the images. (The extent shown differs between the Tetracam and the other cameras because, with this sensor, crop marks are observed in only one of the fields.)
Figure 12. eBee plus Multispec4C (Bands): (A) Visible image obtained by the sensor Canon S110. (B) Green band (550 nm) with a standard deviation enhancement of 2.5. (C) Red band (660 nm) with a standard deviation enhancement of 2.5. (D) Red edge band (735 nm) with a standard deviation enhancement of 2.5. (E) NIR band (790 nm) with a standard deviation enhancement of 2.5. (F) Architectural plan produced by means of the interpretation of the images.
Figure 13. eBee plus Multispec4C (Vegetation Index and PCs): (A) Visible image obtained by the sensor Canon S110. (B) NDVI with a linear contrast enhancement of the histogram and a standard deviation enhancement of 2.5. (C) GreenNDVI with a standard deviation enhancement of 2.5. (D) SAVI with a standard deviation enhancement of 2.5. (E) Band 2 of the PCs with a standard deviation enhancement of 2.5. (F) False color composition of the PCs bands 121 with a standard deviation enhancement of 2.5. (G) Architectural plan produced by means of the interpretation of the images.
Figure 14. eBee plus ThermoMap: (A) Visible image obtained by the sensor Canon S110. (B) Orthophoto with a standard deviation enhancement of 2.5. (C) Thermal index with a standard deviation enhancement of 2.5. (D) Thermal index with a linear contrast enhancement of the histogram with breakpoints (value pixel 15°–27°). (E) Thermal map. (F) Architectural plan produced by means of the interpretation of the images.
Table 1. The different drones used in this study.
Name | Tecnikopter | eBee
Manufacturer | MikroKopter (Tecnitop and UZ) | SenseFly Company
Weight | 5650 g | Approx. 710 g with camera
Max. payload | 1000 g | 710 g
Power source | LiPo 8000 20C | 2150 LiPo
Endurance | 18 min | 45 min
GNSS navigation | Ublox LEA | ---
Sensors | Olympus E-PM2, MCA 6 Tetracam | S110 NIR/RE/RGB, Multispec, thermoMAP
Table 2. Drone sensors used in this study.
Name | E-PM2 | Mini-MCA 6 | S110 | Multispec | ThermoMap
Company | Olympus, Tokyo, Japan | Tetracam, Chatsworth, USA | Canon, Tokyo, Japan | Airinov, Paris, France | senseFly, Cheseaux-sur-Lausanne, Switzerland
Type | RGB camera | Multispectral | RGB camera | Multispectral | Thermal
Spectral range | Blue, Green, Red | 490–940 nm | Blue, Green, Red | 530–810 nm | 7500–13,500 nm
Image format | JPG | RAW | TIFF | TIFF | TIFF
Handling | Wireless trigger | Interval mode | Interval mode | Interval mode | Interval mode
Date of data acquisition | 25 June 2015 | 25 June 2015 | 11 May 2016 | 11 May 2016 | 11 May 2016
Flying height above ground | 100 m | 103 m | 100 m | 102.7 m | 90 m
Ground sampling distance | 2.7 cm/pix | 10.9 cm/pix | 3.5 cm/pix | 10.7 cm/pix | 17 cm/pix
Type of drone | TecniKopter | TecniKopter | eBee | eBee | eBee
