Article

A Comparison of the Influence of Vegetation Cover on the Precision of an UAV 3D Model and Ground Measurement Data for Archaeological Investigations: A Case Study of the Lepelionys Mound, Middle Lithuania

by
Algimantas Česnulevičius
*,
Artūras Bautrėnas
,
Linas Bevainis
and
Donatas Ovodas
Department of Cartography and Geoinformatics, Institute of Geosciences, Faculty of Chemistry and Geosciences, Vilnius University, LT-03101 Vilnius, Lithuania
*
Author to whom correspondence should be addressed.
Sensors 2019, 19(23), 5303; https://doi.org/10.3390/s19235303
Submission received: 10 October 2019 / Revised: 27 November 2019 / Accepted: 28 November 2019 / Published: 2 December 2019
(This article belongs to the Special Issue Optimization and Communication in UAV Networks)

Abstract
The aim of this research was to conduct a comparative analysis of the precision of ground geodetic data versus three-dimensional (3D) measurements from an unmanned aerial vehicle (UAV), while establishing the impact of herbaceous vegetation on the UAV 3D model. Low (up to 0.5 m high) herbaceous vegetation can impede the detection of the anthropogenic roughness of the surface. The identification of minor surface alterations, which enables the determination of their anthropogenic origin, is of utmost importance in archaeological investigations. Vegetation cover is regarded as one of the factors influencing the identification of such minor forms of relief. The research was conducted on the Lepelionys Mound (Prienai District Municipality, Lithuania). Ground measurements were obtained using a Trimble GPS, and the UAV "Inspire 1" was used to take aerial photographs. Based on the data from the ground measurements and aerial photographs, large-scale surface maps were drawn and the errors in the position of the isolines were compared. The results showed that the largest errors in the positional measurements of fixed objects were caused by the height of the grass. Grass up to 0.1 m high resulted in discrepancies of up to 0.5 m, whereas grass up to 0.5 m high led to discrepancies of up to 1.3 m.

1. Introduction

During the initial stage of an archaeological investigation, one of the most important tasks is to identify a potential object and to determine its boundaries and area. Traditionally, large-scale topographic maps and geodetic measurements are widely used during this initial stage of reconstruction.
Aerial photographs were previously used mainly where archaeological sites coincided with areas already covered by aerial surveys, and only on a fragmentary basis due to their high cost. High-resolution satellite images have only become available within the last decade, but they do not cover continuous areas. Moreover, high-resolution images are not always available for academic research or studies. At the beginning of the 21st century, unmanned aerial vehicles, better known as drones, were employed to identify and map potential archaeological objects. They have a number of advantages, including low cost, high resolution, large scale, and multispectral capability. A very important advantage of unmanned aerial vehicles is the possibility of creating 3D models using photogrammetric techniques. These 3D models reveal small irregularities of the surface. Such surface alterations serve as identifiers when searching for potential archaeological sites.
The issues of reliability and accuracy of aerial photographs obtained using unmanned aerial vehicles have already been addressed in studies by many researchers [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21]. The use of unmanned aerial vehicles provides a fast and inexpensive way to explore the ground surface and to identify objects of interest [22]; however, research on assessing the precision of aerial images from unmanned aerial vehicles is scarce [23,24,25,26]. The accuracy of aerial images produced with the help of unmanned aerial vehicles (UAVs) can be affected by a number of factors, for example, the flight altitude, the image quality of the camera, the design of the UAV route, the methods of georeferencing, and others. An appropriate design of the UAV route ensures a constant cruise altitude and complete aerial image coverage of the whole territory. An appropriate flight plan and a high-quality camera affect the efficiency of photogrammetric processing of the images obtained. Further investigations are simplified by using well-tested and broadly applied mathematical and photogrammetric algorithms for image processing. Problems occur when designing a 3D model of the territory captured in the aerial images. The initial 3D model is created in a conditional coordinate system, which is later linked to the officially used coordinate system. The coordinates can be connected in one of the following two ways: by direct graphical connection of the position of an object in the aerial image to the coordinate system (less precise) or by linking GPS measurements of fixed objects to the coordinates of the aerial images. The accuracy of the vertical positioning of given points is a highly important factor in designing the 3D relief models that are used for the identification, analysis, and mapping of archaeological objects.
Recent research [18,19,20,21,22,23,24,25,26,27,28,29,30,31,32] has shown that while aiming for high accuracy of the vertical positioning of the objects, it is not enough to use a global navigation satellite system (GNSS); ground control points (GCPs) have to be applied as well. Such a combined technique allows for the design of a more accurate digital relief model (DRM), where the precision of vertical positioning of points equals 0.7 cm.
The aim of this research is to conduct a comparative analysis of the precision of ground geodetic measurements and aerial photographs from an unmanned aerial vehicle, while establishing the positional accuracy of the identified objects. Archaeological objects of the Middle Ages on the eastern coast of the Baltic Sea are often related to natural relief forms, which were modified by people while building fortifications and settlements around them [33,34,35,36,37]. These archaeological objects are now located in forests, agricultural lands, and urbanized territories. The surface of archaeological objects in urbanized territories has been exposed to significant changes or has been fully destroyed, so the use of aerial images from unmanned aerial vehicles for the positional identification of such objects is highly limited. Due to dense vegetation and the foliage of tall trees, the application of aerial imaging in wooded territories is also restricted. The surface of archaeological objects in agricultural territories is partially extant. Therefore, aerial images can be rather efficient for identifying the positions of archaeological objects in meadows and woodless territories.
Narrow-band spectral and surface thermal analysis methods are applied for the investigation of the structural diversity of vegetation cover on the basis of UAV aerial images [38,39,40,41,42]. Studies have mainly focused on the influence of large ligneous plants on the mapping of surface elements, whereas the impact of low herbaceous vegetation on low forms of archaeological relief has, so far, not been exhaustively researched [43]. Our research aims to assess the quality of aerial images, ultimately seeking to design accurate digital 3D relief models for the identification of archaeological objects [44,45,46,47,48,49,50].
For the identification of small surface irregularities (small archaeological objects), we applied the computer program "Circle_3p", developed at the Department of Cartography and Geoinformatics, Vilnius University (author Artūras Bautrėnas), which applies the classical Delaunay method. The results of the study showed that this method is effective on grassy mounds.

2. Research Object, Materials, and Methods

The object of the research is the Lepelionys Mound, which is located in the Prienai Administrative Region of Kaunas County (Figure 1). It dates back to the second half of the first millennium. At the beginning of the second millennium, a settlement was established there, covering an area of 9 hectares around the mound. The Lepelionys Mound is on the left side of the road from Vilnius to Prienai (60 km to the west of Vilnius). The territory of the ancient settlement lies on both sides of the road, but its bigger part is located on the left side. The Vilnius–Prienai road was built in the second half of the 20th century. The construction of the road affected the relief of the former ancient settlement, but some small and low relief forms of anthropogenic origin, dating back to between the 9th and 12th centuries, still remain [51,52]. The main archaeological object, the Lepelionys Mound, was investigated by archaeologists in the second half of the 20th century. During these archaeological investigations, the territory boundaries and the protection zone of the ancient settlement were delineated (Figure 1).
Ground geodetic measurements and photos taken by the camera on the unmanned aerial vehicles were applied while designing the three-dimensional relief models. Comparisons of accuracy between the UAV 3D model and the ground measurements of the Lepelionys mound were carried out twice, in August 2018 and June 2019.
Ground geodetic measurements were carried out with a Trimble R4 GPS device (measurable accuracy in favorable conditions: X, Y is set to ±8 mm and Z to ±15 mm) on 9 August 2018. Since the mound is in a fully open area and not covered by buildings or greenery (Figure 1), the measurements were collected with maximum accuracy. During the collection of these measurements, the coordinates of 212 characteristic ground-surface points were recorded. After analyzing the accuracy of the measured point coordinates, 179 points were mapped to the LKS-94 coordinate system (Figure 2).
Since the ground topographic survey can be used to estimate the accuracy of the aerial photographs, 10 ground control points (GCPs) were measured in parallel with the ground points (Figure 3).
Figure 4 shows two objects, the coordinates of which were used for creating the aerial photograph model.
The vegetation is one of the most important indicators of archaeological objects. Information on human activities is reflected in the variation of the lushness of vegetation. Homogeneous vegetation is characteristic of the investigated territory, since for several decades most of the surroundings of the mound have been used as pasture. Local differences in herbaceous vegetation in the mound surroundings over a long period of time have been predetermined by changes in the surface relief layer caused by the following human activities:
(i)
Organic waste was thrown at the foot of the mound;
(ii)
In the territory of the ancient settlement the ground was excavated for substructures of buildings and the soil (sediment) was poured beside the walls of the building;
(iii)
The ancient settlement was surrounded by palisades, the stakes of which were driven into the ground and the excavated soil fortified the foundation of the fence;
(iv)
Organic and mineral waste (ceramic fragments, bones of the animals used for food, worn out shoes, and clothes) was thrown over the palisade of the settlement.
All the aforesaid factors resulted in physical differences in the present vegetation, i.e., lusher or sparser growth. It is important to point out that there is currently a pasture in the former territory of the settlement, where grazing starts at the end of April and lasts until October. The whole area is grazed during this period, so the anthropogenic impact on the surface was uniform at the time of the photofixation in August.
Figure 5 provides a visual representation of the camera sensor and the field of view. Using the width of the camera sensor, the focal length, and the drone altitude, the ground sample distance (GSD) can be calculated (Figure 5).
The equation we use to calculate the GSD is:
$$GSD = \frac{\text{sensor width} \times \text{altitude} \times 100}{\text{focal length} \times \text{image width}}$$
where the sensor width and focal length are given in millimetres, the flight altitude in metres, the image width in pixels, and the GSD in centimetres per pixel.
Photofixation of aerial images was conducted using the unmanned aerial vehicle (UAV) INSPIRE 1. Its technical parameters are presented in Table 1.
The front overlap of the pictures taken is 80% and the side overlap is 70%. A "double grid" mission flight plan was used for a more detailed and accurate 3D model. The flight was made at a height of 50 m; accordingly, the GSD equals 1.7 cm.
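As an illustration, the following is a minimal Python sketch of the GSD calculation above, using the Zenmuse X3 parameters listed in Table 1 (the function and variable names are ours, not part of the original workflow):

```python
def ground_sample_distance(sensor_width_mm, altitude_m, focal_length_mm, image_width_px):
    """Ground sample distance in cm/pixel for a nadir-facing camera."""
    return (sensor_width_mm * altitude_m * 100) / (focal_length_mm * image_width_px)

# Zenmuse X3 parameters from Table 1: 6.17 mm sensor width, 4.55 mm focal length,
# 4000 px image width; flight altitude 50 m.
print(ground_sample_distance(6.17, 50, 4.55, 4000))  # ~1.695 cm/pixel
```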
There were 199 photos that were processed with special photogrammetric “Pixoprocessing” software. The point cloud, the digital surface model (DSM), and the orthomosaic were obtained during this process.
The study included an assessment of the mismatches between the elevation isoline positions acquired from the ground geodetic measurements and from the UAV aerial images. An associate professor of the Department of Cartography and Geoinformatics, Artūras Bautrėnas, designed the computer program "Circle_3p", which employs the classical Delaunay method and ensures a consistent, systematic selection of points. Using the Delaunay triangulation method, altitude interpolation of the ground measurement points was performed and an isoline view was generated. An analogous method was used for the interpolation of the elevation of surface points and the generation of isolines from the images taken by the camera on the UAV (Figure 6). The following indicators were calculated: ±Ni, the sequence number of the analyzed point on the positive or the negative side of the base (ground geodetic measurement) isoline; ±ΔSi, the length of the perpendicular from the analyzed point to the positive or the negative side; ±ΔZi, the calculated height correction to the positive or the negative side; and ±D, the distance between the base (ground geodetic measurement) isoline point and the UAV isoline point.
For the calculation of the deviation of the target position the following formula was used:
$$\pm\Delta S_i = (y_{N_i} - y_{T_i})\cos\alpha - (x_{N_i} - x_{T_i})\sin\alpha$$
where α is the directional angle of the segment Ni–Ni + 1, Ti is the number of the interpolated UAV measurement point, and i is the number of the point for each fragment of the ground geodetic and UAV isolines.
Since the isoline step is known (0.5 m), the distance between the isolines (±D) at each Ni point can be calculated by geometric interpolation. The difference in height ±ΔZi is calculated using the formula:
$$\pm\Delta Z_i = \frac{\pm h}{\pm D_i} \times (\pm\Delta S_i)$$
where h is the isoline step, Di is the distance between the isolines at the calculated point, and the ± sign depends on the direction of the deviation from the isoline.
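For illustration, here is a minimal Python sketch of the two deviation formulas above (this is our reading of the formulas, not the Circle_3p implementation; function and variable names are ours):

```python
import math

def plane_deviation(x_n, y_n, x_t, y_t, alpha):
    """Perpendicular offset dS of the interpolated UAV isoline point (x_t, y_t)
    from the base ground-measurement isoline point (x_n, y_n); alpha is the
    directional angle of the base segment Ni-Ni+1, in radians."""
    return (y_n - y_t) * math.cos(alpha) - (x_n - x_t) * math.sin(alpha)

def height_correction(h, d, d_s):
    """Height difference dZ from the isoline step h (0.5 m in this study), the
    distance d between neighbouring isolines at the point, and the plane offset d_s."""
    return (h / d) * d_s

# Example: a 0.30 m plane offset where neighbouring 0.5 m isolines are 2.0 m apart
print(height_correction(0.5, 2.0, 0.30))  # 0.075 m
```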

3. Results

3.1. Creating a Two-Dimensional (2D) Relief Model

In order to confidently state that the 2D relief model is sufficiently precise and can be used as a benchmark for evaluating the models produced by aerial photogrammetric methods, the horizontals were drawn automatically in accordance with strict interpolation rules.
In order to perform the automated relief modelling, it was necessary to select pairs of measured points between which the exact horizontals of the relief surface could be calculated, i.e., the heights interpolated. Therefore, the Delaunay triangulation method was chosen to interpolate the heights [53].
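As a rough illustration of this step, the sketch below triangulates ground points and interpolates heights with standard SciPy tools (a minimal sketch under our own assumptions about the data layout; it is not the Circle_3p code, and the file name is illustrative):

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.interpolate import LinearNDInterpolator

# Assumed layout: one X, Y, Z row per measured ground point.
points = np.loadtxt("ground_points.csv", delimiter=",")

tri = Delaunay(points[:, :2])                      # triangulate in the X-Y plane
interp = LinearNDInterpolator(tri, points[:, 2])   # linear height interpolation within each triangle

# Sample the interpolated surface on a regular grid for later contouring.
xi = np.linspace(points[:, 0].min(), points[:, 0].max(), 500)
yi = np.linspace(points[:, 1].min(), points[:, 1].max(), 500)
XI, YI = np.meshgrid(xi, yi)
ZI = interp(XI, YI)                                # NaN outside the convex hull
```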

3.2. Drawing of a Topographic Plan

First, the Delaunay triangulation (Figure 7) was completed among 179 selected topographic points using the program “Circle_3p”. This allowed 321 triangles to be selected, among which the interpolation of the triangle vertices was performed.
Among these triangle vertices, horizontal interpolation was performed in the LAS07 height system using a selected step of 0.5 m (Figure 8). The coordinates of 1676 extra points, plotted as horizontals, were calculated during the interpolation.
The interpolation points were uploaded to TopoPlan (AutoCAD 2016). The horizontals were plotted using the “Spline” function (Figure 9 and Figure 10).
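In the same spirit, isolines with a 0.5 m step can be drawn from the interpolated grid with matplotlib (a hedged alternative sketch to the TopoPlan/AutoCAD "Spline" workflow described above; the grid variables come from the previous sketch and the output file name is illustrative):

```python
import numpy as np
import matplotlib.pyplot as plt

# XI, YI, ZI: the interpolated grid from the previous sketch; 0.5 m isoline step as in the paper.
levels = np.arange(np.floor(np.nanmin(ZI)), np.ceil(np.nanmax(ZI)) + 0.5, 0.5)
cs = plt.contour(XI, YI, ZI, levels=levels, colors="black", linewidths=0.6)
plt.clabel(cs, fmt="%.1f")            # label isolines with their heights
plt.gca().set_aspect("equal")
plt.savefig("lepelionys_isolines.png", dpi=300)
```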
The cross-sections of the Lepelionys Mound were created with the help of aerial images taken by the camera on the UAV, which highlighted the minor anthropogenic relief forms on the slope of the mound, i.e., the remains of the former tree trunk wall (Figure 11).

4. Discussion

4.1. Evaluation of the Precision of the Aerial Images

One of the most time-consuming tasks in aerial photography is to set out the GCPs and to coordinate them. Therefore, it is necessary to find the optimal number of GCPs in order to minimize the preparatory work. It should also be possible to assess the feasibility of using coordinated stable land objects (Figure 4) instead of bearing marks, which would further simplify the preparatory work. Therefore, ten ground control point marks were used to estimate the accuracy of the coordinates of 10 objects in the study area (Figure 3).
In order to evaluate the accuracy of the 3D model, it was created incorporating all ten marks, and the coordinates of all the objects were measured in this model. For the objects that are clearly visible in the 3D model (np-508, -515, -582, and -587), the differences between the coordinates of the objects in the 3D model and the coordinates measured from the topographic survey do not exceed double the (Trimble GPS) measurement accuracy (Table 2 and Table 3). The accuracy of the other objects is poorer due to the vegetation (grass), which complicates their identification.
The random error distribution depends on the accuracy of object identification; therefore, the graph presents the errors as absolute values in mm.
The error analysis shows that the errors increase significantly when fewer than five orientation marks are used (Table 5, Figure 5), even for those objects that are clearly visible in the 3D model (np-508, -515, -582, and -587). Therefore, it can be argued that in order to maintain the accuracy of the measurements, there should be at least five orientation marks. It has been noticed that the error rate is influenced not only by the vegetation but also by the experience and thoroughness of the operator measuring the 3D model. As the precision of the well-identified objects is practically unchanged (from ten to five marks), it can be argued that the 3D model can be operated with maximum accuracy using five GCPs and easily visible coordinated objects (Figure 12, Table 2, Table 3, Table 4 and Table 5).
As seen in Table 3, the random error distribution depends on the accuracy of object identification. Similarly, the absolute errors of the objects were calculated for the 3D models with different numbers of ground control points (Table 4 and Table 5).
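A minimal sketch of how such absolute errors can be computed from paired coordinates (our own illustration; the example values are those of point np-507 from Table 3):

```python
import numpy as np

def absolute_errors_mm(gps_xyz, model_xyz):
    """Per-axis absolute differences, in mm, between GPS-measured and 3D-model
    coordinates given in metres; both inputs are N x 3 arrays of X, Y, Z."""
    return np.abs(np.asarray(gps_xyz) - np.asarray(model_xyz)) * 1000.0

gps   = [[6048772.162, 524809.995, 112.138]]   # point np-507, Trimble GPS
model = [[6048772.216, 524810.072, 112.212]]   # point np-507, UAV 3D model
print(absolute_errors_mm(gps, model))          # approx. [54, 77, 74] mm
```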

4.2. Evaluation of the Influence of Vegetation Covers

In 2019, a comparison of the Lepelionys mound surface isolines obtained from the UAV 3D model and from the ground measurements showed that there are significant deviations between the two in both the plane and the height positions. The comparison was carried out in zones of different vegetation height (Figure 13). At the top of the mound, where the grass was mown and its height was only 1 to 2 cm, the maximum discrepancies between the UAV 3D model and the ground measurement isolines were 0.75 m for the plane position and 0.42 m for the height. On the slopes of the mound, where the height of the grass was between 5 and 10 cm, the maximum discrepancies were up to 0.41 m for the plane position of the isolines and up to 0.42 m for the height. At the foot of the mound, where the height of the uncut grass was 60 to 100 cm, the maximum discrepancies reached 6.63 m for the plane position of the isolines and up to 0.77 m for the height. The discrepancies between the plane positions of the isolines and their heights are presented in Table 6.
The sharpness and contrast of aerial images are both becoming important issues for the use of UAV aerial imagery. Aerial image contrast problems occur in areas that fall under the shadow of trees or rough terrain on a sunny day. In this study, the contrast of the aerial photographs was adjusted and, where necessary, increased. During the 2018 photofixation, the western and southwestern parts of the mound slope were in shadow. To highlight the terrain microforms in the parts of the image on the southwestern slope, we used the brightness/contrast, shadows/highlights, color balance, hue/saturation, and photo filter tools in Adobe Photoshop software.
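As a hedged, scriptable alternative to the Adobe Photoshop adjustments described above, contrast in shadowed image regions can be lifted with contrast-limited adaptive histogram equalisation (CLAHE) in OpenCV; this is our own sketch, not the workflow used in the study, and the file names are illustrative:

```python
import cv2

img = cv2.imread("southwest_slope.jpg")          # shadowed aerial image (illustrative name)
lab = cv2.cvtColor(img, cv2.COLOR_BGR2LAB)       # work on the lightness channel only
l, a, b = cv2.split(lab)
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
l_eq = clahe.apply(l)                            # local contrast enhancement
out = cv2.cvtColor(cv2.merge((l_eq, a, b)), cv2.COLOR_LAB2BGR)
cv2.imwrite("southwest_slope_enhanced.jpg", out)
```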
The comparison of the large-scale maps of the Lepelionys mound surface created from the UAV 3D model and from the ground topographic measurements shows that the plane position of the isolines in the 3D model is highly micro-sinuous. This is due to the isoline interpolation method applied in the UAV 3D model, i.e., the calculation of the interfaces between multiple point pairs (about seven million pixel pairs) creates non-continuous isolines.
Three-dimensional terrain modelling using UAV aerial imagery is currently expanding. The wider application of collaborative mapping initiatives in archaeology [54,55,56] will lead to an increasing use of nonprofessional UAV aerial imagery to identify undefined and unexplored archaeological sites from the 19th to the early 20th century.

5. Conclusions

  • The "Circle_3p" computer program designed by Artūras Bautrėnas, an associate professor of the Department of Cartography and Geoinformatics, employs the classical Delaunay method and ensures a consistent, systematic selection of points.
  • The use of “Circle 3p” for the analysis of aerial photographs of the Lepelionys Mound has shown that the program needs to be improved by adding elements for the correction of the isolines.
  • A comparison of the results of the geodetic measurements and the UAV images showed that the best overlaps of surface microform isolines are on steep slopes. On the flat top of the mound, the surface microform variance makes up 0.7 to 1.0 m. This is due to the low density of the interpolation points calculated by the program "Circle_3p". The variance of the isolines at the foot of the mound reaches 0.45 to 0.7 m (medium-height grass) and 0.95 to 1.35 m (high grass). The study has shown that external factors have a significant influence on the identification of the mound relief microforms.
  • The research on GCP positioning shows that five GCPs arranged at the edges and the center of the object give the best accuracy compared with the other variants tested (three at the top, four at the edges, and ten GCPs).
  • The unnatural curvature of isolines in the UAV 3D model, resulting in the abundance of unnatural surface microforms, is due to the interpolation techniques used to determine the isoline position, specifically the calculation of the interfaces between numerous point pairs.

Author Contributions

A.Č.: conceptualization, methodology, visualization, writing—original draft, writing—review & editing. A.B.: data curation, formal analysis, investigation, validation. L.B.: investigation, software, visualization. D.O.: investigation, software, validation.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Jones, R.J.A.; Evans, R. Soil and crop marks in the recognition of archaeological sites by air photography. In Aerial Reconnaissance for Archaeology; Wilson, D.R., Ed.; The Council for British Archaeology: London, UK, 1975; pp. 1–11. [Google Scholar]
  2. Remondino, F.; El-Hakim, S. Image-based 3D modelling, a review. Photogramm. Rec. 2006, 21, 269–291. [Google Scholar] [CrossRef]
  3. Parcak, S.H. Satellite Remote Sensing for Archaeology; Routledge: London, UK; New York, NY, USA, 2009; p. 312. [Google Scholar]
  4. Bäumker, M.; Przybilla, H.-J. Investigations on the accuracy of the navigation data of unmanned aerial vehicles using the example of the system microcopter. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2011, 38, 113–118. [Google Scholar]
  5. Deseilligny, M.; De Luca, L.; Remondino, F. Automated image-based procedures for accurate artifacts 3D modeling and orthoimages. In Proceedings of the XXIIIth International CIPA Symposium, Prague, Czech Republic, 12–16 September 2011; pp. 291–299. [Google Scholar]
  6. Chiabrando, F.; Nex, F.; Piatti, D.; Rinaudo, F. UAV and RPV systems for photogrammetric surveys in archaeological areas, two tests in the Piedmont Region (Italy). J. Archaeol. Sci. 2011, 38, 697–710. [Google Scholar] [CrossRef]
  7. Lasaponara, R.; Masini, N. Satellite remote sensing in archaeology, past, present and future perspectives. J. Archaeol Sci. 2011, 38, 1995–2002. [Google Scholar] [CrossRef]
  8. Lasaponara, R.; Masini, N. (Eds.) Satellite Remote Sensing: A New Tool for Archaeology; Springer: Heidelberg, Germany, 2012; p. 364. [Google Scholar]
  9. Lasaponara, R.; Masini, N. Satellite synthetic aperture radar in archaeology and cultural landscape: An overview. Archaeol. Prospect. 2013, 20, 71–78. [Google Scholar] [CrossRef]
  10. Nex, F.; Remondino, F. UAV for 3D mapping applications, a review. Appl. Geomat. 2013, 6, 1–15. [Google Scholar] [CrossRef]
  11. Hugenholtz, C.H.; Whitehead, K.; Brown, O.W.; Barchyn, T.E.; Moorman, B.J.; LeClair, A.J.; Riddell, K.; Hamilton, T. Geomorphological mapping with a small unmanned aircraft system (SUAS), feature detection and accuracy assessment of a photogrammetrically-derived digital terrain model. Geomorphology 2013, 194, 16–24. [Google Scholar] [CrossRef]
  12. Whitehead, K.; Hugenholtz, C.H. Remote sensing of the environment with small unmanned aircraft systems (UAS): A review of progress and challenges. J. Unmanned Veh. Syst. 2014, 2, 69–85. [Google Scholar] [CrossRef]
  13. Whitehead, K.; Hugenholtz, C.H.; Myshak, S.; Brown, O.W.; LeClair, A.J.; Tamminga, A.; Barchyn, T.E.; Moorman, B.J.; Eaton, B. Remote sensing of the environment with small unmanned aircraft systems (UAS): Scientific and commercial applications. J. Unmanned Veh. Syst. 2014, 2, 86–102. [Google Scholar] [CrossRef]
  14. Themistocleous, K.; Agapiou, A.; Cuca, B.; Hadjimitsis, D.G. Unmanned aerial systems and spectroscopy for remote sensing applications in archaeology. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, XL-7/W3, 1419–1423. [Google Scholar] [CrossRef]
  15. Hadjimitsis, D.G.; Themistocleous, K.; Agapiou, A. Monitoring Archaeological Site Landscapes in Cyprus using multi-temporal atmospheric corrected image data. Int. J. Archit. Comput. 2009, 7, 121–138. [Google Scholar] [CrossRef]
  16. Agapiou, A.; Hadjimitsis, D.G.; Themistocleous, K.; Papadavid, G.; Toulios, L. Detection of archaeological crop marks in Cyprus using vegetation indices from Landsat TM/ETM+ satellite images and field spectroscopy measurements. Proc. SPIE 2010, 7831, 78310V. [Google Scholar]
  17. Ruzgienė, B.; Berteška, T.; Gečytė, S.; Jakubauskienė, E.; Aksamitauskas, V.Č. The surface modelling based on UAV photogrammetry and qualitative estimation. Measurement 2015, 73, 619–627. [Google Scholar] [CrossRef]
  18. Sužiedelytė-Visockienė, J.; Bručas, D.; Bagdžiūnaitė, R.; Puzienė, R.; Stanionis, A.; Ragauskas, U. Remotely-piloted aerial system for photogrammetry: Orthoimage generation for mapping applications. Geografie 2016, 121, 349–367. [Google Scholar]
  19. Campana, S. Drones in archaeology. State-of-the-art and future perspectives. Archaeol. Prospect. 2017, 24, 275–296. [Google Scholar] [CrossRef]
  20. Traviglia, A.; Torsello, A. Landscape pattern detection in archaeological remote sensing. Geosciences 2017, 7, 128. [Google Scholar] [CrossRef]
  21. Masini, N.; Marzo, C.; Manzari, P.; Belmonte, A.; Sabia, C.; Lasaponara, R. On the characterization of temporal and spatial patterns of archaeological crop-marks. J. Cult. Herit. 2018, 32, 124–132. [Google Scholar] [CrossRef]
  22. Cowley, D.C.; Moriarty, C.H.; Geddes, G.; Brown, G.L.; Wade, T.; Nichol, C.J. UAVs in context: Archaeological airborne recording in a national body of survey and record. Drones 2018, 56, 2. [Google Scholar] [CrossRef]
  23. Tapete, D. Remote sensing and geosciences for archaeology. Geosciences 2018, 7, 41. [Google Scholar] [CrossRef]
  24. Konstantinos, K.G.; Soura, K.; Koukouvelas, I.K.; Argyropoulos, N.G. UAV vs classical aerial photogrammetry for archaeological studies. J. Archaeol. Sci. Rep. 2017, 14, 758–773. [Google Scholar]
  25. Sanz-Ablanedo, E.; Chandler, J.H.; Rodríguez-Pérez, J.R.; Ordóñez, C. Accuracy of unmanned aerial vehicle (UAV) and SfM photogrammetry survey as a function of the number and location of ground control points used. Remote Sens. 2018, 10, 1606. [Google Scholar] [CrossRef]
  26. Shahbazi, M.; Sohn, G.; Théau, J.; Menard, P. Development and evaluation of a UAV-photogrammetry system for precise 3D environmental modeling. Sensors 2015, 15, 27493–27524. [Google Scholar] [CrossRef] [PubMed]
  27. Turner, D.; Lucieer, A.; Watson, C. An automated technique for generating georectified mosaics from ultra-high resolution unmanned aerial vehicle (UAV) imagery, based on structure from motion (SFM) point clouds. Remote Sens. 2012, 4, 1392–1410. [Google Scholar] [CrossRef]
  28. Shahbazi, M.; Sohn, G.; Théau, J.; Menard, P. Robust structure-from-motion computation: Application to open-pit mine surveying from unmanned aerial images. J. Unmanned Veh. Syst. 2017, 5, 126–145. [Google Scholar] [CrossRef]
  29. Mian, O.; Lutes, J.; Lipa, G.; Hutton, J.J.; Gavelle, E.; Borghini, S. Accuracy assessment of direct georeferencing for photogrammetric applications on small unmanned aerial platforms. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 40, 77–83. [Google Scholar] [CrossRef]
  30. Hugenholtz, C.; Brown, O.; Walker, J.; Barchyn, T.; Nesbit, P.; Kucharczyk, M.; Myshak, S. Spatial accuracy of UAV-derived orthoimagery and topography: Comparing photogrammetric models processed with direct geo-referencing and ground control points. Geomatica 2016, 70, 21–30. [Google Scholar] [CrossRef]
  31. Benassi, F.; Dall’Asta, E.; Diotri, F.; Forlani, G.; Cella, U.M.; Roncella, R.; Santise, M. Testing the accuracy and repeatability of UAV blocks oriented with GNSS-supported aerial triangulation. Remote Sens. 2017, 9, 172. [Google Scholar] [CrossRef]
  32. Forlani, G.; Dall’Asta, E.; Diotri, F.; Cella, U.M.; di Roncella, R.; Santise, M. Quality assessment of DSMs produced from UAV flights georeferenced with on-board RTK positioning. Remote Sens. 2018, 10, 311. [Google Scholar] [CrossRef]
  33. Buivydas, U. The Settlement at the foot of Lepelionys Hill Fort. In Archaeological Research in Lithuania; Lithuanian Archaeological Society: Vilnius, Lithuania, 2006; pp. 65–66. [Google Scholar]
  34. Banytė-Rowell, R.; Baronas, D.; Kazakevičius, V.; Vaškevičiūtė, I.; Zabiela, G. History of Lithuania; Zabiela, G., Ed.; The Lithuanian Institute of History: Vilnius, Lithuania, 2007; Volume 2, p. 518. [Google Scholar]
  35. Jovaiša, E. The Aestii: Genesis; Lithuanian University of Educational Sciences: Vilnius, Lithuania, 2013; Volume 1, p. 379. [Google Scholar]
  36. Jovaiša, E. The Aestii: Evolution; Lithuanian University of Educational Sciences: Vilnius, Lithuania, 2014; Volume 2, p. 351. [Google Scholar]
  37. Viršilienė, J.; Zabiela, G. Revived Mounds; The Lithuanian Institute of History: Vilnius, Lithuania, 2018; p. 219. [Google Scholar]
  38. Berni, J.; Zarco-Tejada, P.J.; Suarez, L.; Fereres, E. Thermal and narrowband multispectral remote sensing for vegetation monitoring from an unmanned aerial vehicle. IEEE Trans. Geosci. Remote Sens. 2009, 47, 722–738. [Google Scholar] [CrossRef]
  39. Baluja, J.; Diago, M.P.; Balda, P.; Zorer, R.; Meggio, F.; Morales, F.; Tardaguila, J. Assessment of vineyard water status variability by thermal and multispectral imagery using an unmanned aerial vehicle (UAV). Irrig. Sci. 2012, 30, 511–522. [Google Scholar] [CrossRef]
  40. Bellvert, J.; Zarco-Tejada, P.J.; Girona, J.; Fereres, E. Mapping crop water stress index in a Pinot-Noir vineyard: Comparing ground measurements with thermal remote sensing imagery from an unmanned aerial vehicle. Precis. Agric. 2013, 15, 361–376. [Google Scholar] [CrossRef]
  41. Chisholm, R.A.; Cui, J.; Lum, S.K.Y.; Chen, B.M. UAV LiDAR for below-canopy forest surveys. J. Unmanned Veh. Syst. 2013, 1, 61–68. [Google Scholar] [CrossRef]
  42. Huesca, M.; Merino-de-Miguel, S.; González-Alonso, F.; Martinez, S.; Miguel Cuevas, J.; Calle, A. Using AHS hyper-spectral images to study forest vegetation recovery after a fire. Int. J. Remote Sens. 2013, 34, 4025–4048. [Google Scholar] [CrossRef]
  43. Knoth, C.; Klein, B.; Prinz, T.; Kleinebecker, T. Unmanned aerial vehicles as innovative remote sensing platforms for high-resolution infrared imagery to support restoration monitoring in cut-over bogs. Appl. Veg. Sci. 2013, 16, 509–517. [Google Scholar] [CrossRef]
  44. Mozas-Calvache, A.T.; Pérez-García, J.L.; Cardernal-Escarcena, F.J.; Delgado, J.; Mata-de-Castro, E. Comparison of low altitude photogrammetric methods for obtaining DEMs and orthoimages of archaeological sites. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, 39, 577–581. [Google Scholar] [CrossRef] [Green Version]
  45. Erny, G.; Gutierrez, G.; Friedman, A.; Godsey, M.; Gradoz, M. Archaeological topography: Comparing digital photogrammetry taken with unmanned aerial vehicles (UAVs) versus standard surveys with total stations. In Proceedings of the 80th Annual Meeting of the Society for American Archaeology, San Francisco, CA, USA, 15–19 April 2015; p. 409. [Google Scholar]
  46. Mesas-Carracosa, F.-J.; Notario-Garcia, M.D.; Meroño de Larriva, J.E.; García-Ferre, A. An analysis of the influence of flight parameters in the generation of unmanned aerial vehicle (UAV) orthomosaics to survey archaeological areas. Sensors 2016, 16, 1838. [Google Scholar] [CrossRef] [Green Version]
  47. Mian, O.; Lutes, J.; Lipa, G.; Hutton, J.J.; Gavelle, E.; Borghini, S. Direct georeferencing on small unmanned aerial platforms for improved reliability and accuracy of mapping without the need for ground control points. In Proceedings of the International Conference on Unmanned Aerial Vehicles in Geomatics, Toronto, ON, Canada, 30 August–2 September 2015; Volume XL-1/W4, pp. 397–402. [Google Scholar]
  48. Agüera–Vega, F.; Carvajal–Ramirez, F.; Martínez–Carricondo, P. Assessment of photogrammetric mapping accuracy based on variation ground control points number using unmanned aerial vehicle. Measurement 2017, 98, 221–227. [Google Scholar] [CrossRef]
  49. Fryskowska, A.; Kedzierski, M.; Walczykowski, P.; Wierzbicki, D.; Delis, P.; Lada, A. Effective detection of sub-surface archaeological features from laser scanning point clouds and imagery data. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 245–251. [Google Scholar] [CrossRef] [Green Version]
  50. Carvajal-Ramírez, F.; Navarro-Ortega, A.D.; Agüera-Vega, F.; Martínez-Carricondo, P.; Mancini, F. Virtual reconstruction of damaged archaeological sites based on unmanned aerial vehicle photogrammetry and 3D modelling: Study case of a southeastern Iberia production area in the Bronze Age. J. Int. Meas. Conf. 2019, 136, 225–236. [Google Scholar] [CrossRef]
  51. Juškaitis, V. Report of Archaeological Research in 2007 of the Ancient Settlement of the Lepelionys Mound (22593, Prienai District); Trakai Historical Museum: Trakai, Lithuania, 2009. [Google Scholar]
  52. Juškaitis, V. The Settlement at the Foot of the Lepelionys Hill Fort Site. In Archaeological Research in Lithuania; The Lithuanian Institute of History: Vilnius, Lithuania, 2007; pp. 73–74. [Google Scholar]
  53. Fang, T.; Piegl, L. Algorithm for Delaunay triangulation and convex hull computation using a sparse matrix. Comput. Aided Des. 1992, 24, 425–436. [Google Scholar] [CrossRef]
  54. Hacigüzeller, P. Collaborative mapping in the age of ubiquitous internet: An archaeological perspective. Digit. Class. Online 2017, 3, 5–16. [Google Scholar]
  55. Perkins, C. Community Mapping. Cartogr. J. 2007, 44, 127–137. [Google Scholar] [CrossRef]
  56. Lee, D. Map Orkney Month: Imagining Archaeological Mappings. Livingmaps Rev. 2016, 1, 1–25. [Google Scholar]
Figure 1. The location of Lepelionys Mound and the ancient settlement: (1) mound boundary and (2) ancient settlement territory boundary (according to V. Juškaitis [51]).
Figure 2. The selected points that were mapped to the LKS-94 coordinate system.
Figure 3. The ground control point marks (A) and the diagram of the ground control point (GCP) arrangement (B). The red circle defines the location of the ground mark.
Figure 4. Examples of identified objects. The red circles mark the locations of the small objects whose measured coordinates were used to adjust the 3D model.
Figure 5. Visual representation of the nadir facing camera on the drone.
Figure 6. The scheme for assessing the mismatch between the elevation isoline positions derived from the ground geodetic measurements and from the UAV aerial images.
Figure 7. The Delaunay triangulation (A) among the 179 selected topographic points (B) using the program “Circle_3p”.
Figure 8. The interpolation among the vertices of selected triangles.
Figure 9. The points of interpolation (A), an interpolation sequence (B) and the relief isolines (C).
Figure 10. A cross-section of the Lepelionys Mound. The roughness in the red line indicates the remains of the former tree trunk fencing.
Figure 11. The variance of the equal height isolines in plane position.
Figure 12. A diagram illustrating the parameters of variance of equal height isolines in plane position.
Figure 13. The zones of different herbaceous height on the Lepelionys mound.
Table 1. The specifications of the unmanned aerial vehicle (UAV) DJI Inspire 1 and the RGB Zenmuse X3 camera.

Parameter | Value
Hovering accuracy (GPS mode) | Vertical: 0.5 m; horizontal: 2.5 m
Max angular velocity | Pitch: 300°/s; yaw: 150°/s
Max tilt angle | 35°
Max ascent and descent speed | 5 m/s; 4 m/s
Max speed | 22 m/s
Max wind speed resistance | 10 m/s
Max service ceiling above take-off point | 120 m
Camera type and model | X3; FC350
Total and effective pixels | 12.76 M; 12.4 M
Max capacity | 64 GB
Maximal image size | 4000 × 3000
ISO range | Photo: 100–1600; video: 100–3200
Electronic shutter speed | 8 s–1/8000 s
Field of view (FOV) | 94°
Supported file formats | Photo: JPEG, DNG; video: MP4/MOV (MPEG-4 AVC/H.264)
Types of electronic media | Micro SD; maximal capacity 64 GB; Class 10 or UHS-1
Sensor width (mm) | 6.17
Focal length (mm) | 4.55
Altitude (m) | 50
Image width (px) | 4000
Image height (px) | 3000
GSD (cm/pixel) | 1.695054945
Width (m) | 67.8021978
Height (m) | 50.85164835
Table 2. The parameters of variance of the equal height isolines in plane position.

Line No. | Variance of equal height isolines in plane position, m | Slope inclination, degrees (mean values) | Remarks
1 | 0.91 | 0 | The top of the mound, mown grass
2 | 0.71 | 0 | The top of the mound, mown grass
3 | 0.86 | 0 | The top of the mound, mown grass
4 | 0.94 | 0 | The top of the mound, mown grass
5 | 0.89 | 0 | The top of the mound, mown grass
6 | 0.60 | 30–40 | The slope of the mound, short grass (0.1 m)
7 | 0.36 | 30–40 | The slope of the mound, short grass (0.1 m)
8 | 0.54 | 30–40 | The slope of the mound, short grass (0.1 m)
9 | 0.36 | 30–40 | The slope of the mound, short grass (0.1 m)
10 | 0.34 | 30–40 | The slope of the mound, short grass (0.1 m)
11 | 0.36 | 30–40 | The slope of the mound, short grass (0.1 m)
12 | 0.96 | 21–26 | The foot of the mound, high grass (0.5 m)
13 | 1.26 | 21–26 | The foot of the mound, high grass (0.5 m)
14 | 1.09 | 21–26 | The foot of the mound, high grass (0.5 m)
15 | 1.32 | 21–26 | The foot of the mound, high grass (0.5 m)
16 | 0.91 | 21–26 | The foot of the mound, high grass (0.5 m)
17 | 0.45 | 14–19 | The foot of the mound, medium-high grass (0.2 m)
18 | 0.67 | 14–19 | The foot of the mound, medium-high grass (0.2 m)
19 | 0.71 | 14–19 | The foot of the mound, medium-high grass (0.2 m)
20 | 0.54 | 14–19 | The foot of the mound, medium-high grass (0.2 m)
21 | 0.63 | 14–19 | The foot of the mound, medium-high grass (0.2 m)
Table 3. The errors of the measurements of the objects when using ten marks.

Point No. | Axis | GPS (m) | DEM (m) | Error (m) | Absolute error (mm)
np-507 | X | 6048772.162 | 6048772.216 | −0.054 | 54
np-507 | Y | 524809.995 | 524810.072 | −0.077 | 77
np-507 | Z | 112.138 | 112.212 | −0.074 | 74
np-508 | X | 6048771.963 | 6048771.971 | −0.008 | 8
np-508 | Y | 524810.078 | 524810.066 | 0.012 | 12
np-508 | Z | 112.757 | 112.785 | −0.028 | 28
np-515 | X | 6048774.378 | 6048774.393 | −0.015 | 15
np-515 | Y | 524804.426 | 524804.440 | −0.014 | 14
np-515 | Z | 111.620 | 111.644 | −0.024 | 24
np-522 | X | 6048769.132 | 6048769.230 | −0.098 | 98
np-522 | Y | 524803.655 | 524803.700 | −0.045 | 45
np-522 | Z | 111.896 | 111.940 | −0.044 | 44
np-530 | X | 6048769.708 | 6048769.740 | −0.032 | 32
np-530 | Y | 524793.519 | 524793.562 | −0.043 | 43
np-530 | Z | 110.071 | 110.140 | −0.069 | 69
np-560 | X | 6048716.410 | 6048716.452 | −0.042 | 42
np-560 | Y | 524751.812 | 524751.892 | −0.080 | 80
np-560 | Z | 105.409 | 105.368 | 0.041 | 41
np-582 | X | 6048712.964 | 6048712.978 | −0.014 | 14
np-582 | Y | 524768.387 | 524768.400 | −0.013 | 13
np-582 | Z | 106.038 | 106.006 | 0.032 | 32
np-586 | X | 6048692.869 | 6048692.835 | 0.034 | 34
np-586 | Y | 524769.123 | 524769.100 | 0.023 | 23
np-586 | Z | 103.427 | 103.394 | 0.033 | 33
np-587 | X | 6048692.729 | 6048692.734 | −0.005 | 5
np-587 | Y | 524769.143 | 524769.154 | −0.011 | 11
np-587 | Z | 104.131 | 104.107 | 0.024 | 24
np-596 | X | 6048715.725 | 6048715.685 | 0.040 | 40
np-596 | Y | 524781.260 | 524781.290 | −0.030 | 30
np-596 | Z | 107.443 | 107.399 | 0.044 | 44
Table 4. The absolute errors of the objects when the 3D model is created using ten ground control point marks.

Point No. | X (mm) | Y (mm) | Z (mm)
np-507 | 54 | 77 | 74
np-508 | 8 | 12 | 28
np-515 | 15 | 14 | 24
np-522 | 98 | 45 | 44
np-530 | 32 | 43 | 69
np-560 | 42 | 80 | 41
np-582 | 14 | 13 | 32
np-586 | 34 | 23 | 33
np-587 | 5 | 11 | 24
np-596 | 40 | 30 | 44
Table 5. A comparison of the absolute errors when different numbers of ground control points are used. Values are in mm.

Point No. | 3 marks (x, y, z) | 4 marks (x, y, z) | 5 marks (x, y, z) | 7 marks (x, y, z)
np-507402608514309194263568051487073
np-508861055636424211122610919
np-5151751936478141015825131626
np-522951455744833254955657883949
np-5309546571419692170164484652373855
np-560400492194387308406458448437847
np-58210226688563239161128111529
np-586729371803524498527352531302827
np-58710914518864758417142691622
np-596701398239471480341423246453343
Table 6. The deviation between the ground measurement and the UAV isoline plane and height positions (data from June 2019).

Isoline height, m | Number of deviations (−, +) a | Maximum plane deviation, m (−, +) b | Average plane deviation, m (−, +) c | Maximum height deviation, m (−, +) d | Average height deviation, m (−, +) e | Proportional deviation f
The foot of the mound. Height of the grass was 60–100 cm
102.001382320.6191.5690.2690.556−0.109+0.117−0.051+0.038+0.005
102.507413996.6532.5271.5390.834−0.338+0.214−0.091+0.071−0.034
103.004496325.7912.1160.9610.864−0.778+0.216−0.110+0.085+0.004
103.503774313.9501.6630.5690.698−0.480+0.186−0.064+0.079+0.012
104.007326351.3552.1620.4970.765−0.167+0.213−0.064+0.081+0.003
104.508828632.3011.8220.7870.569−0.277+0.293−0.086+0.080−0.004
105.008776081.7201.7260.5770.566−0.190+0.245−0.063+0.0910.000
105.507446311.6931.4630.5390.446−0.199+0.271−0.074+0.075−0.005
106.005363561.6290.9020.4490.276−0.233+0.139−0.066+0.048−0.021
106.505322221.5340.5130.4200.197−0.282+0.120−0.077+0.043−0.042
107.003182510.7710.9590.3100.293−0.197+0.179−0.074+0.059−0.016
107.502582540.5180.8180.2140.225−0.126+0.176−0.047+0.048+0.000
108.002033030.6810.7990.2650.253−0.182+0.183−0.066+0.060+0.009
108.50703820.3550.9690.1270.330−0.109+0.174−0.037+0.062+0.047
109.004162421.0210.5080.4010.135−0.174+0.147−0.075+0.040−0.033
109.503743191.7360.7000.4690.254−0.406+0.163−0.115+0.059−0.035
110.002123050.7760.4150.2500.166−0.167+0.090−0.064+0.036−0.005
110.503441891.2510.4490.3220.132−0.238+0.103−0.066+0.030−0.032
111.004493361.4010.5750.4120.211−0.298+0.199−0.093+0.064−0.026
111.504513611.0551.0800.4380.332−0.251+0.289−0.103+0.081−0.022
112.005372061.8430.7790.5850.270−0.341+0.141−0.113+0.048−0.069
The slopes of the mound. Height of the grass was 5 to 10 cm
110.001381370.4220.6950.2050.3250.197+0.424−0.096+0.148+0.027
110.501121580.3490.4770.1590.1950.197+0.272−0.086+0.107+0.027
111.001621240.3260.30.1140.1090.184+0.184−0.064+0.064−0.006
111.501691180.3550.3580.1540.133−0.21+0.234−0.089+0.088−0.014
112.001441440.3840.4060.1590.139−0.247+0.255−0.01+0.091−0.001
112.501461370.3470.3810.1380.137−0.256+0.246−0.098+0.096−0.004
113.001251220.2970.2910.120.12−0.227+0.196−0.09+0.082−0.002
113.501151130.2720.2650.1240.116−0.182+0.189−0.083+0.081+0.001
114.001101000.2510.2130.120.105−0.161+0.161−0.08+0.073−0.007
114.501031080.2770.2880.1230.132−0.202+0.207−0.008+0.01+0.005
115.001201250.3390.3620.1410.171−0.199+0.248−0.092+0.116+0.013
115.501151390.3780.3290.1810.137−0.249+0.219−0.124+0.099−0.001
116.001201110.340.2450.1510.109−0.257+0.17−0.108+0.078−0.018
116.50951240.2060.2370.0950.097−0.159+0.1550.072+0.066+0.007
117.00119880.2670.2220.1190.108−0.185+0.16−0.082+0.078−0.02
117.50951070.2680.2850.1370.134−0.168+0.207−0.085+0.096+0.009
118.00911000.2710.2740.1320.131−0.185+0.176−0.096+0.082−0.002
118.5090990.2590.2780.1150.117−0.193+0.183−0.009+0.075−0.002
119.00110680.270.2420.1220.112−0.184+0.158−0.087+0.075−0.026
119.50612730.2360.2220.1030.088−0.159+0.123−0.079+0.049−0.009
120.0087920.2540.4050.0940.123−0.217+0.177−0.072+0.063−0.003
120.50701320.2470.2530.050.081−0.245+0.114−0.041+0.044+0.014
121.0057500.1160.1590.0630.069−0.127+0.127−0.041+0.071+0.043
The top of the mound. Height of the grass was 1 to 2 cm
121.00711510.4680.4430.2600.240−0.297+0.055−0.107+0.031−0.013
120.501531830.3960.7540.1110.310−0.070+0.101−0.021+0.048+0.017
120.00862510.3730.4660.1480.269−0.042+0.091−0.018+0.051+0.033
119.501512250.2740.6940.1140.238−0.048+0.125−0.018+0.044+0.019
119.002053950.7360.5850.2561.228−0.197+0.419−0.033+0.114+0.064
118.501074900.6472.3730.3020.804−0.235+0.196−0.106+0.074+0.042
* Explanation of the superscripts in the table: a number of negative (−N) and positive (+N) deviations; b maximum negative (−Δ) and positive (+Δ) plane deviation; c average negative (∑−Δ/−N) and positive (∑+Δ/+N) plane deviation; d maximum negative (−Δz max) and positive (+Δz max) height deviation; e average negative (∑−Δz/−Nz) and positive (∑+Δz/+Nz) height deviation; and f proportional deviation, i.e., the ratio of the summed positive and negative deviations to the total number of deviations, $(\sum(+\Delta) + \sum(-\Delta))/((+N) + (-N))$.
