Article

Optical Remote Sensing Potentials for Looting Detection

by
Athos Agapiou
*,
Vasiliki Lysandrou
and
Diofantos G. Hadjimitsis
Remote Sensing and Geo-Environment Laboratory, Eratosthenes Research Centre, Department of Civil Engineering and Geomatics, Cyprus University of Technology, Saripolou 2-8, 3603 Limassol, Cyprus
*
Author to whom correspondence should be addressed.
Geosciences 2017, 7(4), 98; https://doi.org/10.3390/geosciences7040098
Submission received: 31 July 2017 / Revised: 29 September 2017 / Accepted: 2 October 2017 / Published: 4 October 2017
(This article belongs to the Special Issue Remote Sensing and Geosciences for Archaeology)

Abstract:
Looting of archaeological sites is illegal and considered a major anthropogenic threat to cultural heritage, entailing undesirable and irreversible damage at several levels, such as landscape disturbance, heritage destruction, and adverse social impact. In recent years, the employment of remote sensing technologies using ground-based and/or space-based sensors has assisted in dealing with this issue. Novel remote sensing techniques have tackled heritage destruction occurring in war-conflicted areas, as well as illicit archaeological activity in vast areas of archaeological interest with limited surveillance. The damage caused by illegal activities, as well as the scarcity of reliable information, are some of the major concerns that local stakeholders are facing today. This study discusses the potential use of remote sensing technologies based on the results obtained for the archaeological landscape of Ayios Mnason in Politiko village, located in Nicosia district, Cyprus. In this area, more than ten looted tombs have been recorded in the last decade, indicating small-scale, but still systematic, looting. The image analysis, including vegetation indices, fusion, automatic extraction after object-oriented classification, etc., was based on high-resolution WorldView-2 multispectral satellite imagery and RGB high-resolution aerial orthorectified images. Google Earth© images were also used to map and diachronically observe the site. The current research also discusses the potential for wider application of the presented methodology, acting as an early warning system, in an effort to establish a systematic monitoring tool for archaeological areas in Cyprus facing similar threats.

1. Introduction

Looting is considered a major anthropogenic threat to cultural heritage due to the irreversible damage that is caused to the archaeological context and to the findings themselves, which are often diverted into the illicit market [1,2]. Several reports can be found from all over the world indicating the size and extent of this problem [3,4,5,6]. Recent examples from the war-conflicted areas in the Middle East showcase a part of this problem [7,8,9].
Due to the complexity of the problem, the scientific community and local stakeholders are seeking ways to minimize the degree and the extent of looting through the exploitation of innovative technologies [10]. In this context, Earth observation and aerial sensors are considered important components of a holistic approach to eventually constraining the problem. Recent examples using both active and passive remote sensing technologies can be found in the literature, indicating the advantages and the accuracy of the results for mapping archaeological areas that are under threat [11,12,13,14]. In some cases, Earth observation proved to be the only means of documenting the destruction caused by looters, since war conflicts prevented access to the sites [3]. In other cases, ground geophysical prospecting has also been applied, as in the case of [5] in Peru.
Even though these technologies are not capable of preventing looters, the image analysis results can be used by local stakeholders to take all the necessary measures for future restrictions, as well as to warn the scientific community of illegal excavations.
It should be stated that the existing literature is mainly focused on the exploitation of remote sensing technologies for extended looted areas, where hundreds of looting signs are visible from space and air [3,5,6]. In contrast, this paper aims to present small-scale looting attempts which seem to have been made in recent years in Cyprus. In addition, no scheduled flight or satellite overpass was performed to monitor the site under investigation. Therefore, the use of existing datasets captured by various sources and sensors was the only means of mapping the looting imprints. It is evident that the specific case study is limited in terms of the size of the threat, and is also bounded by the availability of existing images rarely captured from space and air. This restricted context provides a realistic case study which is appropriate for discussing the potential use of non-contact remote sensing technologies to map small-scale systematic attempts made by looters in recent years.

2. Methodology

Existing archive aerial images and satellite datasets have been exploited to meet the aims of this study. A complete list of all of the data used is provided in Table 1. The analysis covered the last nine years (i.e., from 2008 to 2017). Aerial images included the sub-meter-resolution red-green-blue (RGB) orthophoto color composite produced in 2008 (with a spatial resolution of 0.50 m) and the latest RGB orthophoto of 2014 (with a spatial resolution of 0.20 m). Both archives were produced by the Department of Land and Surveys of Cyprus. A greyscale aerial orthophoto with a spatial resolution of 1 m taken in 1993 (and therefore prior to any looting phenomena) was used as a reference. In addition, a very-high-resolution WorldView-2 multispectral satellite image taken on 20 June 2011 was also consulted. The WorldView-2 sensor provides a high-resolution panchromatic band with a ground sampling distance (GSD) of 0.46 m (at nadir view) and eight multispectral bands with a 1.84 m GSD at nadir view. These bands include the conventional blue, green, red, and near-infrared wavelengths, along with additional parts of the spectrum covering the coastal, yellow, red edge, and a second near-infrared wavelength.
To use all possible available sources to examine looting imprints in the area, Google Earth© images have also been extracted and analyzed. The Google Earth© 3D digital globe systematically releases satellite images at high spatial resolution, which can be used for various remote sensing applications (see [15,16,17]). The platform provides very high-resolution natural-color (i.e., RGB) images based on existing commercial spaceborne sensors, such as IKONOS, QuickBird, WorldView, etc. Despite the various limitations of Google Earth© images for scientific purposes, such as the compression of the original satellite images, the loss of image quality, and the limited spectral resolution (i.e., no near-infrared band is provided), recent research has demonstrated the great potential of such platforms for supporting research and providing updated information [18]. Indeed, Google Earth© images have already been used to investigate looting phenomena in the area of Palmyra [3,19] and the ancient city of Apamea, in Syria [20].
In the case study of Politiko, looted tombs were difficult to detect directly from the aerial and satellite datasets. This is due not only to the small scale and the depth of the looted tombs (i.e., more than 3 m), but also to the spatial resolution and the view geometry (i.e., the nadir view) of the aerial and satellite datasets. Therefore, the tombs' shadows could not be used as an interpretation key as in the case of [3]. Instead, soil disturbance due to these illegal activities was considered as a proxy for the looted tombs. The excavated soil was placed very close to the looted tombs, providing a spectrally homogeneous target compared to the surrounding non-excavated area.
The methodology followed in this study is presented in Figure 1. Nine RGB images from the Google Earth© platform between 2008 and 2017 have been extracted and interpreted in a geographical information system (GIS). To improve the photo-interpretation of these images, various histogram enhancements were applied. These included brightness and contrast adjustments for each image to enhance the looting soil disturbance against the surrounding area, which was intact and partially vegetated (see examples in Figure 3). In addition, other linear (linear percent stretch) or non-linear histogram stretches (histogram equalization) were applied for enhancement of the spectral properties of the soil disturbance, which was considered as a proxy for the looted tombs.
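The two families of histogram stretches mentioned above (linear percent stretch and histogram equalization) can be sketched with NumPy as follows; the percentile value and grey-level count are illustrative choices, not the exact settings used in the study.

```python
import numpy as np

def linear_percent_stretch(band, percent=2.0):
    """Linear stretch: clip the band at the given lower/upper
    percentiles and rescale linearly to the 0-255 (uint8) range."""
    lo, hi = np.percentile(band, [percent, 100.0 - percent])
    scale = max(hi - lo, 1e-10)  # guard against a flat band
    stretched = np.clip((band - lo) / scale, 0.0, 1.0)
    return (stretched * 255).astype(np.uint8)

def histogram_equalization(band, levels=256):
    """Non-linear stretch: map integer grey levels in [0, levels)
    through the band's normalized cumulative distribution function."""
    hist, _ = np.histogram(band, bins=levels, range=(0, levels))
    cdf = hist.cumsum().astype(np.float64)
    cdf /= cdf[-1]  # normalize the CDF to [0, 1]
    return (cdf[band.astype(np.int64)] * (levels - 1)).astype(np.uint8)
```

Both functions operate on a single band; applying them per channel of an RGB image reproduces the kind of brightness/contrast enhancement described above.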
For the enhancement of the archive aerial orthorectified images, similar histogram adjustments were also applied. The WorldView-2 multispectral image was spatially improved using both the Gram–Schmidt and NNDiffuse pan-sharpening algorithms. The image was then processed at various levels, including vegetation indices, vegetation suppression, orthogonal equations for the detection of crop marks [21,22], principal component analysis (PCA), and color transformations such as HSL (hue, saturation, and lightness) and HSV (hue, saturation, and value). The latter are transformations of the Cartesian (cube) RGB representation. Finally, the WorldView-2 image was classified using object-oriented segmentation, adjusting the edge and full lambda parameters and also considering texture metrics. All of the above-mentioned image processing techniques were implemented in ENVI 5.3 (Environment for Visualizing Images, Harris Geospatial Solutions).
In addition, an in situ inspection of the site was carried out, during which the looting imprints detected through the image processing were mapped using a double differencing Global Navigation Satellite System (GNSS) and a real-time kinematic positioning technique. The vertical/horizontal combined accuracy of the in situ GNSS campaign was set to be less than 3 cm. Finally, the overall satellite and aerial image processing outputs were evaluated and cross-compared with the ground truthing investigation of the site.

3. Case Study Area

The area under investigation is in the southwestern part of the modern village of Politiko, in Nicosia District (Figure 2). In this area, looted tombs have been identified in the past, as well as in more recent years. The tombs are hewn out of the natural bedrock. Undisturbed tombs are not easily detected through aerial and/or satellite datasets since they are underground at an approximate depth of 3 m below the surface. In contrast, signs of looted tombs are more likely to be observed and identified in this manner (Figure 2 and Figure 3).
The wider area of Politiko village constitutes a dense archaeological territory of great importance for the history of Cyprus, linked to the ancient city-kingdom of Tamassos. While several archaeological missions have excavated in the past or are still excavating in the area of Politiko (Politiko-Kokkinorotsos 2007: La Trobe University, Melbourne, under Dr. David Frankel and Dr. Jenny Webb; Politiko-Troullia 2016: University of North Carolina at Charlotte, USA, under Dr. Steven Falconer and Dr. Patricia Fall; see for example [23,24]), the necropolis under investigation here has never been studied. Even though this area has been declared an ancient monument (Scheduled B' monument) and is protected by law, the looting has not only continued but, as will be seen later, has intensified throughout the years.

4. Results

4.1. Aerial Orthophotos and Google Earth© Images

The investigation of the site started with the visual inspection of the Google Earth© images. Brightness and contrast adjustments were applied to support the visual interpretation. Historical records of high-spatial-resolution images over the area of interest were examined, as shown in Figure 4. The images were imported and sorted in chronological order in a GIS environment. More specifically, the following images were extracted from the Google Earth© platform: 9 July 2008, 13 July 2010, 20 June 2011, 29 July 2012, 10 November 2013, 13 July 2014, 16 February 2015, 5 April 2015, and 27 April 2016. Even though the looted tombs themselves were not visible in the images, as mentioned earlier, looted areas were spotted based on the looting soil disturbance (in some instances achieved using mechanical means). Recently disturbed terrain was clearly visible in the Google Earth© images.
Looted imprints are shown in Figure 4, in the yellow square. It is interesting to note the size, as well as the systematic nature, of the attempts made by the looters. The first looting activity is recorded to have taken place between 9 July 2008 and 13 July 2010 (Figure 4a,b), affecting three different areas, each including more than one tomb. In less than a year (20 June 2011; Figure 4c), a new attempt was made a few meters to the west of the previously affected northern area. The older looted areas shown in Figure 4b are only partially visible in Figure 4c due to vegetation growth. New looting activity was captured between 29 July 2012 and 10 November 2013 (see Figure 4d,e), further to the east. Terrain disturbance was visually detected due to the characteristic white tone of the excavated soil, in contrast to the dark tone of the vegetated area. The older looted areas are now difficult to spot, especially in Figure 4e. Most probably, vegetation grew around and on top of the excavated soil, hiding its white tone. It seems that the same areas were re-visited after a very short time (13 July 2014), since a much larger disturbance was documented at the same spots (Figure 4f). No new looting attempt was evidenced for some time (Figure 4g,h), until a new looted imprint became visible in the image taken on 27 April 2016 (Figure 4i).
Apart from one looted tomb in the western part of the area presented in Figure 4i, the rest of the looting marks detected in the aerial and satellite analysis were successfully identified during the in situ investigation carried out in February 2016, during which the looted areas were accurately mapped. Due to the small scale of the individual looting areas (i.e., clusters of one to three tombs each time), as well as the small size of each excavation (an approximately 1.5 m square or circle-like trench), the detection of the looting marks is extremely difficult without a priori knowledge of the area. The automatic detection of looting marks is further hampered by the topography of the area, with scattered vegetation and bare bedrock. This will be further discussed in the following section using segmentation and object-oriented analysis of the multi-spectral WorldView-2 image.
Following a similar approach, photo-interpretation was carried out using the two aerial images taken in 2008 and 2014. These images were also improved using the linear percent stretch (5%) histogram enhancement technique. The earliest aerial image confirmed the results obtained from the satellite products of Google Earth©, indicating no looting attempts in the wider area of Politiko (Figure 5, bottom). In contrast, at least four looting marks were spotted in the aerial image of 2014 (Figure 5, top). The looting traces indicated as b–d in Figure 5 were also recorded in the Google Earth© image (see 13 July 2014 in Figure 4f) and confirmed by the in situ inspection in February 2016. Apart from verifying the results of the previously-elaborated images, the aerial datasets revealed a new looted tomb (see Figure 5 top, a) at the northern part of the site, approximately 100 m from the other looted areas and not seen before. The interpretation of the aerial images was more efficient mainly due to the improved quality of the archive aerial datasets and their better spatial resolution. In all four cases, it was possible to identify the soil extracted from the tombs, but not the looted tombs themselves.
To proceed beyond photo-interpretation (hence, to try to detect possible changes in the funerary landscape of Politiko in a semi-automatic way), the aerial orthophotos of 1993 (single band, 1 m resolution), 2008 (RGB bands, 0.5 m resolution), and 2014 (RGB bands, 0.2 m resolution) were merged into a seven-band pseudo-color composite. PCA was then applied to this multi-temporal image. PCA is a well-established approach for detecting significant changes. It is a statistical tool that decomposes multiple variables (in this case study, the seven-band pseudo-color composite) into orthogonal principal components, ranked by their contribution to the total variance of the seven-band image. Therefore, PCA transforms high-dimensional data into linearly-uncorrelated variables (i.e., principal components).
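As a sketch of this step, the following NumPy code applies PCA to a stacked (bands, rows, cols) image via the eigen-decomposition of the band covariance matrix; function and variable names are illustrative, not the study's actual implementation.

```python
import numpy as np

def pca_components(stack):
    """PCA on a (bands, rows, cols) image stack.

    Returns the principal components as a (bands, rows, cols) array,
    ranked by explained variance, plus each component's variance ratio.
    """
    bands, rows, cols = stack.shape
    X = stack.reshape(bands, -1).astype(np.float64)  # one row per band
    X -= X.mean(axis=1, keepdims=True)               # center each band
    cov = np.cov(X)                                  # (bands, bands) covariance
    eigval, eigvec = np.linalg.eigh(cov)             # eigenvalues in ascending order
    order = np.argsort(eigval)[::-1]                 # rank by explained variance
    pcs = eigvec[:, order].T @ X                     # project every pixel
    ratio = eigval[order] / eigval.sum()
    return pcs.reshape(bands, rows, cols), ratio
```

Displaying the first three returned components as an RGB composite reproduces the kind of pseudo-color change product described in the text.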
The first two principal components (PC1 and PC2) are shown in Figure 6a,b, while a pseudo-color composite of the first three principal components (PC1–PC3) is shown in Figure 6c. The latter image (i.e., Figure 6c) was generated by displaying PC1, PC2, and PC3 as the red, green, and blue bands (RGB). Looted tombs are visible in the pseudo-color composite (see the arrows in Figure 6c) because of landscape alterations.

4.2. Satellite Image Processing

PCA was also applied to the multi-spectral WorldView-2 satellite image. The result is shown in Figure 7 (right), where a three-band pseudo-color composite was created from the first three PCs, again by displaying PC1, PC2, and PC3 as the red, green, and blue bands (RGB). The looted tomb, indicated by the yellow square in this figure, was detectable after interpretation. However, it should be stressed that the identification of looting imprints in this pseudo-color composite was not a straightforward procedure, mainly due to the spatial resolution of the multi-spectral bands (i.e., 1.84 m at nadir view). In addition, a vegetation suppression algorithm was applied to the image. The algorithm models the amount of vegetation per pixel, applying an extended version of Crippen and Blom's algorithm for vegetation transformation based on a forced invariance approach, as proposed by [25,26].
The model follows five steps to de-vegetate the bands of the satellite image. First, an atmospheric correction is applied (digital number (DN) subtraction); then, a vegetation index is calculated as the simple ratio index. Following this, statistics between the DN values and the vegetation index are gathered for each band, and a smooth best-fit curve to the plot is estimated. Finally, all pixels at each vegetation index level are scaled using the smooth best-fit curves.
The model calculates the relationship of each input band with vegetation, then it decorrelates the vegetative component of the total signal on a pixel-by-pixel basis for each band. The result of the application of the vegetation suppression is shown in Figure 7 (middle). It seems that the visibility of the looted area was enhanced by this transformation compared to the initial WorldView image (Figure 7, left), since the mark is mostly surrounded by bushes and low vegetation. In addition, the specific algorithm seems to be very promising in looted areas which are fully cultivated and vegetated.
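A minimal sketch of the forced-invariance idea described above follows, assuming a simple-ratio vegetation index and a polynomial best-fit curve; the level count, polynomial degree, and dark-object subtraction are illustrative assumptions, not ENVI's exact implementation.

```python
import numpy as np

def suppress_vegetation(band, red, nir, n_levels=64, poly_deg=3):
    """Forced-invariance vegetation suppression (sketch).

    Steps: (1) dark-object subtraction, (2) simple-ratio vegetation
    index, (3) band statistics per vegetation-index level, (4) smooth
    best-fit curve, (5) per-pixel scaling so the band's mean no longer
    varies with the vegetation index.
    """
    band = band.astype(np.float64) - band.min()          # 1. crude atmospheric offset
    vi = nir.astype(np.float64) / np.maximum(red, 1e-6)  # 2. simple ratio index
    edges = np.quantile(vi, np.linspace(0.0, 1.0, n_levels)[1:-1])
    levels = np.digitize(vi, edges)                      # 3. group pixels by VI level
    means, centers = [], []
    for l in range(n_levels - 1):
        mask = levels == l
        if mask.any():
            means.append(band[mask].mean())
            centers.append(vi[mask].mean())
    fit = np.poly1d(np.polyfit(centers, means, poly_deg))  # 4. smooth best-fit curve
    target = np.mean(means)
    # 5. scale each pixel so the band is decorrelated from the VI
    return band * (target / np.maximum(fit(vi), 0.01 * target))
```

Applied band by band, this kind of scaling flattens the band's dependence on vegetation density, which is the effect visible in Figure 7 (middle).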
The results of the HSV and HSL color transformations are shown in Figure 8. The color transformations were applied to the pan-sharpened WorldView-2 image after the implementation of the Gram–Schmidt and NNDiffuse pan-sharpening algorithms. Both color transformations were performed in ENVI 5.3. These two color transformations are widely used cylindrical-coordinate representations of points in an RGB color model, whereby the initial red, green, and blue values are transformed into new color components. In the HSV model, hue (H) defines pure color in terms such as "green", "red", or "magenta", while saturation (S) defines a range from pure color (100%) to gray (0%) at a constant lightness level. Finally, value (V) refers to the brightness of the color. Similarly, the HSL color transformation refers to the hue, saturation, and lightness (L) of the color.
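For illustration, the RGB-to-HSV decomposition can be computed per pixel with Python's standard colorsys module; this is a slow but dependency-free sketch, and production code would use a vectorized routine instead.

```python
import colorsys
import numpy as np

def rgb_to_hsv_bands(rgb):
    """Convert an (rows, cols, 3) RGB array, scaled to [0, 1],
    into separate hue, saturation, and value bands."""
    flat = rgb.reshape(-1, 3)
    hsv = np.array([colorsys.rgb_to_hsv(r, g, b) for r, g, b in flat])
    h, s, v = hsv.T
    shape = rgb.shape[:2]
    return h.reshape(shape), s.reshape(shape), v.reshape(shape)
```

Each returned band can then be stretched and interpreted on its own, which is how the hue band discussed below was examined.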
Higher hue values make it easier to distinguish the looted tomb from the surrounding area, even though the overall results are not encouraging. In contrast, both pan-sharpening algorithms applied to the multi-spectral image improved the overall quality and spatial resolution of the satellite image. The looted area became more visible, even through simple photo-interpretation.
More sophisticated algorithms have also been tested and evaluated using the WorldView-2 multispectral image. At first, almost 40 different indices (mostly vegetation indices) were applied and interpreted. Table 2 lists the indices applied, while the corresponding results are shown in Figure 9.
The four most promising indices are the Sum Green Index (Figure 9, b-VII), the Transformed Chlorophyll Absorption Reflectance Index (Figure 9, c-VII), the WorldView Built-Up Index (Figure 9, a-VIII), and the WorldView New Iron Index (Figure 9, c-VIII). Of these, the WorldView Built-Up Index seems to be the most promising as far as the interpretation of looting marks is concerned. This index is based on the spectral properties of the objects as recorded in the coastal and red-edge parts of the spectrum (i.e., bands 1 and 6, respectively). The index enhanced soil areas, such as the looted tomb and the earthen road in the western part of the area under investigation. The use of "non-ordinary" indices for archaeological purposes, such as the WorldView Built-Up Index, which was initially designed to distinguish built-up areas, has also been reported in previous studies [57].
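Both the WorldView Built-Up Index and NDVI are normalized differences of two bands. The sketch below assumes the standard WorldView-2 band order and a built-up index formed from the coastal and red-edge bands, consistent with the description above; verify the exact formula against [57] before use.

```python
import numpy as np

def normalized_difference(a, b):
    """Generic normalized-difference index, (a - b) / (a + b),
    guarding against division by zero."""
    a = a.astype(np.float64)
    b = b.astype(np.float64)
    return (a - b) / np.maximum(a + b, 1e-10)

# Assumed WorldView-2 band order (0-based indices below):
# 0 coastal, 1 blue, 2 green, 3 yellow, 4 red, 5 red edge, 6 NIR1, 7 NIR2.
def wv_built_up_index(bands):
    """Built-up index from the coastal and red-edge bands (bands 1 and 6)."""
    return normalized_difference(bands[0], bands[5])

def ndvi(bands):
    """Classic NDVI from the NIR1 and red bands."""
    return normalized_difference(bands[6], bands[4])
```

Any of the two-band indices in Table 2 that follow this normalized-difference form can be computed with the same helper by swapping the band indices.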
Other indices, including traditional widely-applied ones such as the Normalized Difference Vegetation Index (NDVI), produced less encouraging results, as shown in Figure 9, a-VI. Some proved inappropriate for detecting looting imprints (e.g., the Enhanced Vegetation Index in Figure 9, c-II, and the Leaf Area Index in Figure 9, a-IV). It should be mentioned that similar histogram enhancements were applied equally to all indices.
Based upon the results from the various indices shown in Figure 9, a recent re-projection of the WorldView-2 spectral space has been implemented. This reprojection was initially developed to enhance buried archaeological remains through crop marks [21]. The WorldView-2 bands were re-projected into a new 3D orthogonal spectral space with three new axes—namely the soil component, the vegetation component, and the crop mark component, as shown in the following equations:
Crop mark_(WorldView-2) = 0.38 ρ_blue − 0.71 ρ_green + 0.20 ρ_red − 0.56 ρ_NIR,
Soil_(WorldView-2) = 0.09 ρ_blue + 0.27 ρ_green − 0.71 ρ_red − 0.65 ρ_NIR,
The results of this application are shown in Figure 10a–c, as well as the RGB pseudo-color composite (Figure 10d). The soil component (Figure 10a) enabled the enhancement of one of the looted tombs, while the vegetation component (Figure 10b) shows the three looting marks of the area. The crop mark component (Figure 10c) was less efficient, while the overall RGB pseudo-color composite (Figure 10d) improved the interpretation of the looted areas. Looted tombs in the vegetation component are detectable due to the small values (i.e., black tones of gray in Figure 10b, vegetation component) compared to the enhanced vegetated areas (i.e., white tones of gray in Figure 10b, vegetation component).
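A sketch of the re-projection follows, using the crop mark and soil coefficients from the equations above; the signs follow the reconstruction given here and should be verified against [21], and the dictionary-based interface is an illustrative choice.

```python
import numpy as np

# Coefficients from the orthogonal equations above (signs reconstructed;
# verify against [21] before use).
CROP_MARK = {"blue": 0.38, "green": -0.71, "red": 0.20, "nir": -0.56}
SOIL = {"blue": 0.09, "green": 0.27, "red": -0.71, "nir": -0.65}

def orthogonal_component(refl, coeffs):
    """Project reflectance bands onto one axis of the re-projected
    spectral space; refl maps a band name to its 2D reflectance array."""
    return sum(coeffs[name] * refl[name].astype(np.float64)
               for name in coeffs)
```

Computing each component this way and displaying them together reproduces the RGB pseudo-color composite of Figure 10d.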
Furthermore, object-oriented classification was applied to the WorldView-2 image. An "optimum" segmentation of the image was achieved after several iterations and changes of the scale level and merge algorithms. Finally, the scale level was set to a value of 65.0, while the merge level, applying the full lambda algorithm, was set to 90.0, using the following equation (see the ENVI Handbook for more details):
t_(i,j) = [|O_i| · |O_j| / (|O_i| + |O_j|)] · ‖u_i − u_j‖² / length(∂(O_i, O_j))
where:
  • |O_i| is the area of region i,
  • |O_j| is the area of region j,
  • u_i is the average spectral value in region i,
  • u_j is the average spectral value in region j,
  • ‖u_i − u_j‖² is the squared Euclidean distance between the spectral values of regions i and j,
  • length(∂(O_i, O_j)) is the length of the common boundary between regions i and j.
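As an illustration of the merge criterion, the cost t_(i,j) for a single pair of regions can be computed directly from the quantities defined above; the function name and scalar interface are illustrative, not ENVI's API.

```python
import numpy as np

def full_lambda_cost(area_i, area_j, mean_i, mean_j, boundary_length):
    """Merge cost t_ij of the full lambda schedule: small, spectrally
    similar regions sharing a long boundary get a low cost and are
    merged first."""
    mean_i = np.asarray(mean_i, dtype=np.float64)
    mean_j = np.asarray(mean_j, dtype=np.float64)
    spectral = np.sum((mean_i - mean_j) ** 2)  # ||u_i - u_j||^2
    return ((area_i * area_j) / (area_i + area_j)
            * spectral / boundary_length)
```

In a full lambda schedule, adjacent region pairs whose cost falls below the chosen merge level (here 90.0) are merged iteratively.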
The full lambda algorithm was applied to merge small segments into larger ones based on a combination of spectral and spatial information, while the scale level is based on the normalized cumulative distribution function (CDF) of the pixel values in the image (see more in [58,59]); the units refer to greyscale tones. In addition, a 3 × 3 texture kernel was employed, which captures the spatial variation of image greyscale levels (tone) within a moving window of 3 × 3 pixels. After the segmentation of the image, rules were set for its classification. These rules included spatial attributes (areas less than 25.0), spectral properties (thresholds in the coastal and red edge bands, as in the WorldView Built-Up Index), and roundness parameters. The results of the object segmentation and classification are shown in Figure 11. Distinguished segments characterized as objects, such as those recognized as looting marks, are shown in red, while the confirmed looted tomb is indicated by a yellow square. Through this analysis, new risk-sensitive areas were spotted in the wider area of Politiko village, while the already-known looted tomb (Figure 11, within the yellow square) was successfully detected. The false positives observed in the rest of the area should be linked to the similar spectral characteristics of the soil, as well as to other cultivation practices and land use properties. It is therefore evident that the automatic object-oriented approach for the extraction of looted areas is only valid to some degree within small, specific archaeological zones, and not beyond them. Therefore, a priori knowledge of the area under investigation is essential.

5. Conclusions

This paper aims to demonstrate the potential use of various remote sensing datasets for the detection of looting signs. Though the use of such datasets has been presented in the past by other researchers, in this example the looting signs were of a small scale (i.e., 1–3 looting attempts per year) and no scheduled image acquisition was available. Therefore, the question here was whether existing datasets can be used to support local stakeholders in monitoring these threats.
Various image processing techniques have been applied to investigate the detection of small-scale looting attempts (i.e., 1–3 per year) in the wider area of Politiko village. Both archive and satellite images have been used to detect these systematic and organized events. The image analysis included archival data from the Department of Land and Surveys of Cyprus, Google Earth© images, and a very high-resolution WorldView-2 image. It should be stressed that no scheduled satellite overpass was programmed, and hence the analysis was based upon existing and available data.
The overall results demonstrated that Earth observation datasets and aerial imagery can be used effectively to detect looting marks over wide areas and to track illegal excavations with high precision. The RGB-compressed images of Google Earth© are considered a very good starting point for the interpretation of the area. These images underwent some image histogram enhancement, namely changes in brightness and contrast, as well as other linear histogram enhancements. Image processing such as the vegetation indices indicated in Table 2, and spectral transformations such as PCA, orthogonal equations, HSV, etc., applied to multi-spectral images, can further improve the final results. Automatic extraction based on object-oriented classification was also attempted in this case study, providing some interesting results. It is highly important that the overall interpretation of the results from the image analyses is verified with in situ inspections and ground truthing. A quantitative assessment of the overall results was not carried out due to the temporal changes of the phenomenon, as well as the different datasets (with different spectral and spatial characteristics) used in this case study.
Areas of archaeological interest endangered by looting, such as the case study of Ayios Mnason-Politiko village, can be systematically monitored by space and aerial sensors. The establishment of such a reliable monitoring tool for local stakeholders could further act as a deterrent to looters.

Acknowledgments

The present communication was produced under the "ATHENA" project H2020-TWINN-2015 of the European Commission. This project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No. 691936.

Author Contributions

All authors have contributed equally to the results.

Conflicts of Interest

The authors declare no conflict of interest. The funding sponsors had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Convention on the Means of Prohibiting and Preventing the Illicit Import, Export and Transfer of Ownership of Cultural Property; UNESCO: Paris, France, 1970.
  2. UNIDROIT. Convention on Stolen or Illegally Exported Cultural Objects; UNIDROIT: Rome, Italy, 1995. [Google Scholar]
  3. Tapete, D.; Cigna, F.; Donoghue, N.M.D. ‘Looting marks’ in space-borne SAR imagery: Measuring rates of archaeological looting in Apamea (Syria) with TerraSAR-X Staring Spotlight. Remote Sens. Environ. 2016, 178, 42–58. [Google Scholar] [CrossRef]
  4. Chase, F.A.; Chase, Z.D.; Weishampel, F.J.; Drake, B.J.; Shrestha, L.R.; Slatton, L.C.; Awe, J.J.; Carter, E.W. Airborne LiDAR, archaeology, and the ancient Maya landscape at Caracol, Belize. J. Archaeol. Sci. 2011, 38, 387–398. [Google Scholar] [CrossRef]
  5. Lasaponara, R.; Leucci, G.; Masini, N.; Persico, R. Investigating archaeological looting using satellite images and GEORADAR: The experience in Lambayeque in North Peru. J. Archaeol. Sci. 2014, 42, 216–230. [Google Scholar] [CrossRef]
  6. Contreras, A.D.; Brodie, N. The utility of publicly-available satellite imagery for investigating looting of archaeological sites in Jordan. J. Field Archaeol. 2010, 35, 101–114. [Google Scholar] [CrossRef]
  7. Cerra, D.; Plank, S.; Lysandrou, V.; Tian, J. Cultural heritage sites in danger—Towards automatic damage detection from space. Remote Sens. 2016, 8, 781. [Google Scholar] [CrossRef]
  8. Tapete, D.; Cigna, F.; Donoghue, D.N.M.; Philip, G. Mapping changes and damages in areas of conflict: From archive C-band SAR data to new HR X-band imagery, towards the Sentinels. In Proceedings of the FRINGE Workshop 2015, European Space Agency Special Publication ESA SP-731, Frascati, Italy, 23–27 March 2015; European Space Agency: Rome, Italy, 2015; pp. 1–4. [Google Scholar]
  9. Stone, E. Patterns of looting in southern Iraq. Antiquity 2008, 82, 125–138. [Google Scholar] [CrossRef]
  10. Parcak, S. Archaeological looting in Egypt: A geospatial view (Case Studies from Saqqara, Lisht, and el-Hibeh). Near East. Archaeol. 2015, 78, 196–203. [Google Scholar] [CrossRef]
  11. Agapiou, A.; Lysandrou, V. Remote sensing archaeology: Tracking and mapping evolution in European scientific literature from 1999 to 2015. J. Archaeol. Sci. Rep. 2015, 4, 192–200. [Google Scholar] [CrossRef]
  12. Agapiou, A.; Lysandrou, V.; Alexakis, D.D.; Themistocleous, K.; Cuca, B.; Argyriou, A.; Sarris, A.; Hadjimitsis, D.G. Cultural heritage management and monitoring using remote sensing data and GIS: The case study of Paphos area, Cyprus. Comput. Environ. Urban Syst. 2015, 54, 230–239. [Google Scholar] [CrossRef]
  13. Deroin, J.-P.; Kheir, B.R.; Abdallah, C. Geoarchaeological remote sensing survey for cultural heritage management. Case study from Byblos (Jbail, Lebanon). J. Cult. Herit. 2017, 23, 37–43. [Google Scholar] [CrossRef]
  14. Negula, D.I.; Sofronie, R.; Virsta, A.; Badea, A. Earth observation for the world cultural and natural heritage. Agric. Agric. Sci. Procedia 2015, 6, 438–445. [Google Scholar] [CrossRef]
  15. Xiong, J.; Thenkabail, S.P.; Gumma, K.M.; Teluguntla, P.; Poehnelt, J.; Congalton, G.R.; Yadav, K.; Thau, D. Automated cropland mapping of continental Africa using Google Earth Engine cloud computing. ISPRS J. Photogramm. Remote Sens. 2017, 126, 225–244. [Google Scholar] [CrossRef]
  16. Boardman, J. The value of Google Earth™ for erosion mapping. Catena 2016, 143, 123–127. [Google Scholar] [CrossRef]
  17. Gorelick, N.; Hancher, M.; Dixon, M.; Ilyushchenko, S.; Thau, D.; Moore, R. Google Earth Engine: Planetary-scale geospatial analysis for everyone. Remote Sens. Environ. 2017. [Google Scholar] [CrossRef]
  18. Agapiou, A.; Papadopoulos, N.; Sarris, A. Detection of olive oil mill waste (OOMW) disposal areas in the island of Crete using freely distributed high resolution GeoEye’s OrbView-3 and Google Earth images. Open Geosci. 2016, 8, 700–710. [Google Scholar] [CrossRef]
  19. Contreras, D. Using Google Earth to Identify Site Looting in Peru: Images, Trafficking Culture. Available online: http://traffickingculture.org/data/data-google-earth/using-google-earth-to-identify-site-looting-in-peru-images-dan-contreras/ (accessed on 27 July 2017).
  20. Contreras, D.; Brodie, N. Looting at Apamea Recorded via Google Earth, Trafficking Culture. Available online: http://traffickingculture.org/data/data-google-earth/looting-at-apamea-recorded-via-google-earth/ (accessed on 27 July 2017).
  21. Agapiou, A. Orthogonal equations for the detection of archaeological traces de-mystified. J. Archaeol. Sci. Rep. 2016. [Google Scholar] [CrossRef]
  22. Agapiou, A.; Alexakis, D.D.; Sarris, A.; Hadjimitsis, D.G. Linear 3-D transformations of Landsat 5 TM satellite images for the enhancement of archaeological signatures during the phenological cycle of crops. Int. J. Remote Sens. 2015, 36, 20–35. [Google Scholar] [CrossRef]
  23. RDAC 2010, Annual Report of the Department of Antiquities for the Year 2008, “Excavations at Politiko-Troullia”; Department of Antiquities: Nicosia, Cyprus, 2010; p. 50.
  24. RDAC 2013, Annual Report of the Department of Antiquities for the Year 2009, “Excavations at Politiko-Troullia”; Department of Antiquities: Nicosia, Cyprus, 2013; pp. 57–58.
  25. Yu, L.; Porwal, A.; Holden, E.-J.; Dentith, C.M. Suppression of vegetation in multispectral remote sensing images. Int. J. Remote Sens. 2011, 32, 7343–7357. [Google Scholar] [CrossRef]
  26. Crippen, R.E.; Blom, R.G. Unveiling the lithology of vegetated terrains in remotely sensed imagery. Photogramm. Eng. Remote Sens. 2001, 67, 935–943. [Google Scholar]
  27. Gitelson, A.A.; Merzlyak, M.N.; Chivkunova, O.B. Optical properties and nondestructive estimation of anthocyanin content in plant leaves. Photochem. Photobiol. 2001, 74, 38–45. [Google Scholar] [CrossRef]
  28. Kaufman, Y.J.; Tanré, D. Atmospherically resistant vegetation index (ARVI) for EOS-MODIS. IEEE Trans. Geosci. Remote Sens. 1992, 30, 261–270. [Google Scholar] [CrossRef]
  29. Chuvieco, E.; Martin, P.M.; Palacios, A. Assessment of different spectral indices in the red-near-infrared spectral domain for burned land discrimination. Remote Sens. Environ. 2002, 112, 2381–2396. [Google Scholar] [CrossRef]
  30. Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 1979, 8, 127–150. [Google Scholar] [CrossRef]
  31. Huete, A.R.; Liu, H.Q.; Batchily, K.; van Leeuwen, W. A comparison of vegetation indices over a global set of TM images for EOS-MODIS. Remote Sens. Environ. 1997, 59, 440–451. [Google Scholar] [CrossRef]
  32. Pinty, B.; Verstraete, M.M. GEMI: A non-linear index to monitor global vegetation from satellites. Plant Ecol. 1992, 101, 15–20. [Google Scholar] [CrossRef]
  33. Gitelson, A.; Kaufman, Y.; Merzlyak, M. Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298. [Google Scholar] [CrossRef]
  34. Sripada, R.P.; Heiniger, R.W.; White, J.G.; Meijer, A.D. Aerial color infrared photography for determining early in-season nitrogen requirements in corn. Agron. J. 2006, 98, 968–977. [Google Scholar] [CrossRef]
  35. Gitelson, A.A.; Merzlyak, M.N. Remote Sensing of Chlorophyll Concentration in Higher Plant Leaves. Adv. Space Res. 1998, 22, 689–692. [Google Scholar] [CrossRef]
  36. Crippen, R. Calculating the vegetation index faster. Remote Sens. Environ. 1990, 34, 71–73. [Google Scholar] [CrossRef]
  37. Segal, D. Theoretical basis for differentiation of ferric-iron bearing minerals, using Landsat MSS Data. In Proceedings of the 2nd Thematic Conference on Remote Sensing for Exploratory Geology, Symposium for Remote Sensing of Environment, Fort Worth, TX, USA, 6–10 December 1982; pp. 949–951. [Google Scholar]
  38. Boegh, E.; Soegaard, H.; Broge, N.; Hasager, C.; Jensen, N.; Schelde, K.; Thomsen, A. Airborne multi-spectral data for quantifying leaf area index, nitrogen concentration and photosynthetic efficiency in agriculture. Remote Sens. Environ. 2002, 81, 179–193. [Google Scholar] [CrossRef]
  39. Daughtry, C.S.T.; Walthall, C.L.; Kim, M.S.; de Colstoun, E.B.; McMurtrey, J.E. Estimating corn leaf chlorophyll concentration from leaf and canopy reflectance. Remote Sens. Environ. 2000, 74, 229–239. [Google Scholar] [CrossRef]
  40. Haboudane, D.; Miller, J.R.; Pattey, E.; Zarco-Tejada, P.J.; Strachan, I. Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: Modeling and validation in the context of precision agriculture. Remote Sens. Environ. 2004, 90, 337–352. [Google Scholar] [CrossRef]
  41. Yang, Z.; Willis, P.; Mueller, R. Impact of band-ratio enhanced AWIFS image to crop classification accuracy. In Proceedings of the Pecora 17, Remote Sensing Symposium, Denver, CO, USA, 18–20 November 2008. [Google Scholar]
  42. Sims, D.A.; Gamon, J.A. Relationships between leaf pigment content and spectral reflectance across a wide range of species, leaf structures and developmental stages. Remote Sens. Environ. 2002, 81, 337–354. [Google Scholar] [CrossRef]
  43. Goel, N.; Qin, W. Influences of canopy architecture on relationships between various vegetation indices and LAI and Fpar: A computer simulation. Remote Sens. Rev. 1994, 10, 309–347. [Google Scholar] [CrossRef]
  44. Bernstein, L.S.; Jin, X.; Gregor, B.; Adler-Golden, S. Quick atmospheric correction code: Algorithm description and recent upgrades. Opt. Eng. 2012, 51, 111719-1–111719-11. [Google Scholar] [CrossRef]
  45. Hall, D.; Riggs, G.; Salomonson, V. Development of methods for mapping global snow cover using moderate resolution imaging spectroradiometer data. Remote Sens. Environ. 1995, 54, 127–140. [Google Scholar] [CrossRef]
  46. Rouse, J.W.; Haas, R.H.; Schell, J.A.; Deering, D.W.; Harlan, J.C. Monitoring the Vernal Advancement and Retrogradation (Green Wave Effect) of Natural Vegetation; NASA/GSFC Final Report; NASA: Greenbelt, MD, USA, 1974.
  47. Rondeaux, G.; Steven, M.; Baret, F. Optimization of soil-adjusted vegetation indices. Remote Sens. Environ. 1996, 55, 95–107. [Google Scholar] [CrossRef]
  48. Curran, P.; Windham, W.; Gholz, H. Exploring the relationship between reflectance red edge and chlorophyll concentration in slash pine leaves. Tree Physiol. 1995, 15, 203–206. [Google Scholar] [CrossRef] [PubMed]
  49. Roujean, J.L.; Breon, F.M. Estimating PAR absorbed by vegetation from bidirectional reflectance measurements. Remote Sens. Environ. 1995, 51, 375–384. [Google Scholar] [CrossRef]
  50. Jordan, C.F. Derivation of leaf area index from quality of light on the forest floor. Ecology 1969, 50, 663–666. [Google Scholar] [CrossRef]
  51. Huete, A. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309. [Google Scholar] [CrossRef]
  52. Gamon, J.A.; Surfus, J.S. Assessing leaf pigment content and activity with a reflectometer. New Phytol. 1999, 143, 105–117. [Google Scholar] [CrossRef]
  53. Haboudane, D.; Miller, J.R.; Tremblay, N.; Zarco-Tejada, P.J.; Dextraze, L. Integrated narrow-band vegetation indices for prediction of crop chlorophyll content for application to precision agriculture. Remote Sens. Environ. 2002, 81, 416–426. [Google Scholar] [CrossRef]
  54. Bannari, A.; Asalhi, H.; Teillet, P. Transformed difference vegetation index (TDVI) for vegetation cover mapping. In Proceedings of the IEEE International Geoscience and Remote Sensing Symposium (IGARSS ’02), Toronto, ON, Canada, 24–28 June 2002; Volume 5. [Google Scholar]
  55. Gitelson, A.A.; Stark, R.; Grits, U.; Rundquist, D.; Kaufman, Y.; Derry, D. Vegetation and soil lines in visible spectral space: A concept and technique for remote estimation of vegetation fraction. Int. J. Remote Sens. 2002, 23, 2537–2562. [Google Scholar] [CrossRef]
  56. Wolf, A. Using WorldView 2 Vis-NIR MSI Imagery to Support Land Mapping and Feature Extraction Using Normalized Difference Index Ratios; DigitalGlobe: Longmont, CO, USA, 2010. [Google Scholar]
  57. Agapiou, A.; Hadjimitsis, D.G.; Alexakis, D.D. Evaluation of broadband and narrowband vegetation indices for the identification of archaeological crop marks. Remote Sens. 2012, 4, 3892–3919. [Google Scholar] [CrossRef]
  58. Roerdink, J.B.T.M.; Meijster, A. The watershed transform: Definitions, algorithms, and parallelization strategies. Fundam. Inf. 2001, 41, 187–228. [Google Scholar]
  59. Robinson, D.J.; Redding, N.J.; Crisp, D.J. Implementation of a Fast Algorithm for Segmenting SAR Imagery; Scientific and Technical Report; Defense Science and Technology Organization: Victoria, Australia, 2002.
Figure 1. Overall methodology and resources used for the current study.
Figure 2. Map indicating the case study area in the southwestern part of the modern village of Politiko, Nicosia District. Red dots indicate looted tombs detected during the in situ investigation and mapped with GNSS (February 2016).
Figure 3. Looted tombs (February 2016).
Figure 4. RGB Google Earth© images over the area of interest between the years 2008 and 2016, as follows: (a) 9 July 2008, (b) 13 July 2010, (c) 20 June 2011, (d) 29 July 2012, (e) 10 November 2013, (f) 13 July 2014, (g) 16 February 2015, (h) 5 April 2015, and (i) 27 April 2016. Looted tombs are indicated by the yellow squares.
Figure 5. Aerial RGB orthophotos taken in 2008 (bottom) and 2014 (top). Looted imprints (a–d) are traceable only in the more recent aerial image.
Figure 6. Principal component analysis results of the seven-band multi-temporal aerial datasets of 1993, 2008, and 2014: (a) Principal Component 1 (PC1); (b) Principal Component 2 (PC2); and (c) the pseudo-color composite of the first three principal components (PC1–PC3). Looted marks are indicated with arrows.
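The multi-temporal principal component analysis behind Figure 6 can be sketched as a standard eigen-decomposition of the band-to-band covariance. The function below is a generic illustration, not the authors' exact processing chain, and it assumes the 1993, 2008, and 2014 aerial images have already been co-registered into a single (bands, rows, cols) array:

```python
import numpy as np

def pca_composite(stack):
    """PCA on a co-registered multi-temporal band stack.

    stack: array of shape (bands, rows, cols), e.g. the seven-band
    1993/2008/2014 aerial stack of Figure 6 (hypothetical input layout).
    Returns component images of the same spatial shape, ordered by
    explained variance (PC1 first).
    """
    bands, rows, cols = stack.shape
    X = stack.reshape(bands, -1).astype(float)   # pixels become observations
    X -= X.mean(axis=1, keepdims=True)           # centre each band
    cov = np.cov(X)                              # band-to-band covariance
    eigvals, eigvecs = np.linalg.eigh(cov)       # symmetric matrix -> eigh
    order = np.argsort(eigvals)[::-1]            # sort by variance, descending
    pcs = eigvecs[:, order].T @ X                # project pixels onto the PCs
    return pcs.reshape(bands, rows, cols)
```

A pseudo-color composite such as Figure 6c would then simply display the first three components as the red, green, and blue channels.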
Figure 7. WorldView-2 image 5-3-2 pseudo-color composite (a); vegetation suppression result applied to the WorldView-2 image (b); and pseudo-color composite of the first three principal components (PC1–PC3) of the WorldView-2 image (c). The looted tomb is identified by the yellow square.
Figure 8. HSV (hue, saturation, and value) and HSL (hue, saturation, and lightness) color transformations of the WorldView-2 image after applying the Gram–Schmidt and NNDiffuse pan-sharpening algorithms. Yellow squares show the looting imprint.
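The per-pixel color transformations shown in Figure 8 can be sketched with Python's standard colorsys module; the pan-sharpening step itself (Gram–Schmidt or NNDiffuse) is not reproduced here, so the RGB triplet below stands in for one pan-sharpened pixel with values scaled to [0, 1]:

```python
import colorsys

def to_hsv(r, g, b):
    """RGB -> HSV (hue, saturation, value), all channels in [0, 1]."""
    return colorsys.rgb_to_hsv(r, g, b)

def to_hsl(r, g, b):
    """RGB -> HSL (hue, saturation, lightness).

    colorsys calls this model 'HLS' and returns (h, l, s),
    so the tuple is reordered to the (h, s, l) convention used in Figure 8.
    """
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    return h, s, l
```

For example, a pure red pixel (1, 0, 0) maps to hue 0 with full saturation in both spaces; applying the transform band-wise over the whole pan-sharpened image yields the hue/saturation layers displayed in the figure.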
Figure 9. RGB pseudo-color composite 5-3-2 and NIR-R-G pseudo-color composite 6-5-3 are shown in a-I and b-I. The remaining sub-figures (c-I; d-I; …; d-VIII and e-VIII) correspond to the greyscale results of the indices (see Table 2; 38 indices in total) applied to the WorldView-2 image. The looted area is shown in the yellow square. Images with a red outline show promising vegetation indices (i.e., b-VII; c-VII; a-VIII; and c-VIII).
Figure 10. WorldView-2 spectral reprojection in a new 3D space: (a) soil component, (b) vegetation component, and (c) the crop mark component are shown in greyscale. The RGB pseudo-composite (d) of these three components is also shown in the lower right (soil component, vegetation component, and crop mark component reflect the red, green, and blue bands, respectively).
Figure 11. Object-oriented segmentation and classification of the WorldView-2 satellite image. The white rectangle shows the area of the necropolis under examination, while the yellow square shows the looting mark.
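The object-extraction idea in Figure 11 can be illustrated with a deliberately simplified sketch: threshold a low-NDVI (exposed soil) mask, label connected regions, and keep only patches whose size is plausible for a looting pit. This is not the authors' segmentation workflow, and every threshold below is an illustrative assumption:

```python
import numpy as np
from scipy import ndimage

def extract_soil_patches(ndvi, soil_thresh=0.2, min_pixels=5, max_pixels=500):
    """Toy looting-mark candidate extraction (illustrative parameters only).

    ndvi: 2-D NDVI image. Pixels below soil_thresh are treated as exposed
    soil; connected soil regions are labelled, and only regions whose pixel
    count falls inside [min_pixels, max_pixels] are kept.
    Returns a label image with non-candidate pixels set to 0.
    """
    soil = ndvi < soil_thresh                          # candidate bare-soil mask
    labels, n = ndimage.label(soil)                    # 4-connected components
    sizes = ndimage.sum(soil, labels, range(1, n + 1)) # pixel count per region
    keep = {i + 1 for i, s in enumerate(sizes) if min_pixels <= s <= max_pixels}
    return np.where(np.isin(labels, list(keep)), labels, 0)
```

The size filter is what separates pit-scale disturbances from large agricultural fields that are equally bare in the imagery; a real object-oriented classification would add spectral and shape criteria on top of this.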
Table 1. Datasets used for the current study. GSD: ground sampling distance; RGB: red-green-blue.
No. | Image | Date of Acquisition | Type
1 | Aerial image | 1993 | Greyscale (1 m pixel resolution)
2 | Aerial image | 2008 | RGB orthophoto (50 cm pixel resolution)
3 | Aerial image | 2014 | RGB orthophoto (20 cm pixel resolution)
4 | WorldView-2 | 20 June 2011 | Multispectral (1.84 m GSD for the multispectral bands; 0.46 m GSD at nadir for the panchromatic band)
5 | Google Earth | 9 June 2008 | RGB
6 | Google Earth | 13 July 2010 | RGB
7 | Google Earth | 20 June 2011 | RGB
8 | Google Earth | 29 July 2012 | RGB
9 | Google Earth | 10 November 2013 | RGB
10 | Google Earth | 13 July 2014 | RGB
11 | Google Earth | 16 February 2015 | RGB
12 | Google Earth | 5 April 2015 | RGB
13 | Google Earth | 27 April 2016 | RGB
Table 2. Indices applied for the detection of looted marks in the WorldView-2 image. Promising indices are highlighted.
No. | Index | Equation | Result in Figure 9 | Reference
1 | Anthocyanin Reflectance Index 1 | ARI1 = (1/ρ550) − (1/ρ700) | c-I | [27]
2 | Anthocyanin Reflectance Index 2 | ARI2 = ρ800 × [(1/ρ550) − (1/ρ700)] | d-I | [27]
3 | Atmospherically Resistant Vegetation Index | ARVI = (NIR − [Red − γ(Blue − Red)]) / (NIR + [Red − γ(Blue − Red)]) | e-I | [28]
4 | Burn Area Index | BAI = 1 / [(0.1 − Red)² + (0.06 − NIR)²] | a-II | [29]
5 | Difference Vegetation Index | DVI = NIR − Red | b-II | [30]
6 | Enhanced Vegetation Index | EVI = 2.5 × (NIR − Red) / (NIR + 6 × Red − 7.5 × Blue + 1) | c-II | [31]
7 | Global Environmental Monitoring Index | GEMI = eta × (1 − 0.25 × eta) − (Red − 0.125)/(1 − Red), where eta = [2(NIR² − Red²) + 1.5 × NIR + 0.5 × Red] / (NIR + Red + 0.5) | d-II | [32]
8 | Green Atmospherically-Resistant Index | GARI = (NIR − [Green − γ(Blue − Red)]) / (NIR + [Green − γ(Blue − Red)]) | e-II | [33]
9 | Green Difference Vegetation Index | GDVI = NIR − Green | a-III | [34]
10 | Green Normalized Difference Vegetation Index | GNDVI = (NIR − Green)/(NIR + Green) | b-III | [35]
11 | Green Ratio Vegetation Index | GRVI = NIR/Green | c-III | [34]
12 | Infrared Percentage Vegetation Index | IPVI = NIR/(NIR + Red) | d-III | [36]
13 | Iron Oxide | Iron Oxide Ratio = Red/Blue | e-III | [37]
14 | Leaf Area Index | LAI = 3.618 × EVI − 0.118 | a-IV | [38]
15 | Modified Chlorophyll Absorption Ratio Index | MCARI = [(ρ700 − ρ670) − 0.2(ρ700 − ρ550)] × (ρ700/ρ670) | b-IV | [39]
16 | Modified Chlorophyll Absorption Ratio Index-Improved | MCARI2 = 1.5 × [2.5(ρ800 − ρ670) − 1.3(ρ800 − ρ550)] / √[(2ρ800 + 1)² − (6ρ800 − 5√ρ670) − 0.5] | c-IV | [40]
17 | Modified Non-Linear Index | MNLI = (NIR² − Red)(1 + L) / (NIR² + Red + L) | d-IV | [41]
18 | Modified Simple Ratio | MSR = (NIR/Red − 1) / (√(NIR/Red) + 1) | e-IV | [42]
19 | Modified Triangular Vegetation Index | MTVI = 1.2 × [1.2(ρ800 − ρ550) − 2.5(ρ670 − ρ550)] | a-V | [38]
20 | Modified Triangular Vegetation Index-Improved | MTVI2 = 1.5 × [1.2(ρ800 − ρ550) − 2.5(ρ670 − ρ550)] / √[(2ρ800 + 1)² − (6ρ800 − 5√ρ670) − 0.5] | b-V | [40]
21 | Non-Linear Index | NLI = (NIR² − Red)/(NIR² + Red) | c-V | [43]
22 | Normalized Difference Mud Index | NDMI = (ρ795 − ρ990)/(ρ795 + ρ990) | d-V | [44]
23 | Normalized Difference Snow Index | NDSI = (Green − SWIR1)/(Green + SWIR1) | e-V | [45]
24 | Normalized Difference Vegetation Index | NDVI = (NIR − Red)/(NIR + Red) | a-VI | [46]
25 | Optimized Soil Adjusted Vegetation Index | OSAVI = 1.5 × (NIR − Red)/(NIR + Red + 0.16) | b-VI | [47]
26 | Red Edge Position Index | Maximum derivative of reflectance in the vegetation red edge region of the spectrum, from 690 nm to 740 nm | c-VI | [48]
27 | Renormalized Difference Vegetation Index | RDVI = (NIR − Red)/√(NIR + Red) | d-VI | [49]
28 | Simple Ratio | SR = NIR/Red | e-VI | [50]
29 | Soil Adjusted Vegetation Index | SAVI = 1.5 × (NIR − Red)/(NIR + Red + 0.5) | a-VII | [51]
30 | Sum Green Index | Mean of reflectance across the 500 nm to 600 nm portion of the spectrum | b-VII | [52]
31 | Transformed Chlorophyll Absorption Reflectance Index | TCARI = 3 × [(ρ700 − ρ670) − 0.2(ρ700 − ρ550)(ρ700/ρ670)] | c-VII | [53]
32 | Transformed Difference Vegetation Index | TDVI = √[0.5 + (NIR − Red)/(NIR + Red)] | d-VII | [54]
33 | Visible Atmospherically Resistant Index | VARI = (Green − Red)/(Green + Red − Blue) | e-VII | [55]
34 | WorldView Built-Up Index | WVBI = (Coastal − Red Edge)/(Coastal + Red Edge) | a-VIII | [56]
35 | WorldView Improved Vegetative Index | WVVI = (NIR2 − Red)/(NIR2 + Red) | b-VIII | [56]
36 | WorldView New Iron Index | WVII = (Green × Yellow)/(Blue × 1000) | c-VIII | [56]
37 | WorldView Non-Homogeneous Feature Difference | WVNHFD = (Red Edge − Coastal)/(Red Edge + Coastal) | d-VIII | [56]
38 | WorldView Soil Index | WVSI = (Green − Yellow)/(Green + Yellow) | e-VIII | [56]
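A few of the Table 2 indices can be written directly as band arithmetic. The sketch below assumes reflectance-calibrated band arrays in [0, 1]; the band-to-variable mapping (nir, red, blue, green, yellow) is the only assumption, since the equations themselves come from the table:

```python
import numpy as np

def ndvi(nir, red):
    """NDVI (Table 2, no. 24): (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

def savi(nir, red, L=0.5):
    """SAVI (no. 29): (1 + L)(NIR - Red) / (NIR + Red + L); L = 0.5 gives the 1.5 factor."""
    return (1 + L) * (nir - red) / (nir + red + L)

def arvi(nir, red, blue, gamma=1.0):
    """ARVI (no. 3): replaces Red with Red - gamma * (Blue - Red) inside NDVI."""
    rb = red - gamma * (blue - red)
    return (nir - rb) / (nir + rb)

def wvsi(green, yellow):
    """WorldView Soil Index (no. 38): (Green - Yellow) / (Green + Yellow)."""
    return (green - yellow) / (green + yellow)
```

Applied to the whole WorldView-2 scene, each function yields one of the greyscale sub-figures of Figure 9.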

Agapiou, A.; Lysandrou, V.; Hadjimitsis, D.G. Optical Remote Sensing Potentials for Looting Detection. Geosciences 2017, 7, 98. https://doi.org/10.3390/geosciences7040098