Author Contributions
Conceptualization, T.B., L.S. and K.P.; methodology, T.B., L.S. and K.P.; software, T.B., L.S. and K.P.J.; validation, T.B. and L.S.; formal analysis, T.B. and L.S.; investigation, K.P.; resources, K.P. and K.P.J.; data curation, T.B., L.S. and K.P.J.; writing—original draft preparation, T.B., L.S. and K.P.; writing—review and editing, T.B. and L.S.; visualization, T.B., L.S. and K.P.J.; supervision, K.P.; project administration, K.P.; funding acquisition, K.P. All authors have read and agreed to the published version of the manuscript.
Figure 1.
Former US air bases in Greenland during WWII; the air base of interest is circled (adapted from Google Maps).
Figure 2.
Left—Bluie East Two location (Google Maps); right—aerial view of the base (GeoEye1). This and the following figures with coordinates shown use the coordinate system WGS 84/UTM zone 24N (EPSG:32624).
Figure 3.
Several vehicles and construction remains were documented in 3D using close-range photogrammetry; for example, 80-year-old trucks with inflated tires and fitted snow chains seem very mysterious (photos K. Pavelka).
Figure 4.
Example of a 3D model created using close-range photogrammetry ((left)—photograph of the truck for which the 3D model was created; (right)—sample of the resulting 3D model).
Figure 5.
Image overlapping statistics (adapted from Pix4D software project report).
Figure 6.
Canon PowerShot ELPH 110 HS NIR spectral response (SenseFly material).
Figure 7.
Orthophoto plan, GSD 5 cm (left); DSM, GSD 10 cm (right).
Figure 8.
Selected areas of interest.
Figure 9.
A more suitable way to distinguish barrels from vegetation is to use a combination of NIR-R-G (right) rather than R-G-B (left). The white circle marks the site of a pile of barrels.
Figure 10.
Satellite data preprocessing flowchart.
Figure 11.
Post-processing methodology flowchart.
Figure 12.
Result of pixel-based classification of UAV data ((left)—false color composite of UAV data, (right)—classification result).
Figure 13.
Examples of barrel locations, year 2019 ((upper row)—false color satellite composite, (lower row)—classification results, (left column)—main pile of barrels in close proximity to the base, (middle column)—piles of barrels between the landing area and the sea, (right column)—a pile of barrels next to the road).
Figure 14.
Feature importance of individual bands used for classification for each year ((top)—with NDVI, (bottom)—without NDVI).
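Figure 14 compares band importance for classification with and without NDVI. As a reminder, NDVI is computed per pixel from the NIR and red reflectances; a minimal sketch (function name and the zero-denominator convention are ours):

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index for one pixel.

    NDVI = (NIR - R) / (NIR + R). Dense vegetation yields values near +1,
    while bare soil and man-made objects such as barrels stay near 0 or below.
    """
    denom = nir + red
    return 0.0 if denom == 0 else (nir - red) / denom
```

Healthy vegetation reflects strongly in the NIR band, which is also why the NIR-R-G composite in Figure 9 separates barrels from vegetation better than R-G-B.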
Figure 15.
Comparison of satellite data classification results for 2019 using the same training areas ((left)—with R band and with NDVI, (right)—without R band and without NDVI).
Figure 16.
Comparison of spectral curves of barrels (red line) and sites misclassified as barrels (blue line).
Figure 17.
Total area occupied by barrels by year and AOI.
Figure 18.
Removal of the biggest pile of barrels: 2019 (left), 2021 (middle), change (right).
Figure 19.
Change detection in areas occupied by the barrels (interannual comparison).
Figure 20.
Removed barrels along the roads in AOI 1.
Figure 21.
Transfer of barrels in AOI 2.
Figure 22.
Status of barrel sanitation in 2022 for part of AOI 1 ((left)—false color composite, (right)—classification).
Figure 23.
Barrels lying in layers on top of each other (photo K. Pavelka, 2019).
Table 1.
Absolute camera position and orientation uncertainties (adapted from Pix4D software project report).
| | X (m) | Y (m) | Z (m) | Omega (Degree) | Phi (Degree) | Kappa (Degree) |
|---|---|---|---|---|---|---|
| Mean | 0.043 | 0.042 | 0.053 | 0.012 | 0.011 | 0.005 |
| Sigma | 0.012 | 0.012 | 0.014 | 0.003 | 0.003 | 0.002 |
Table 2.
Absolute geolocation variance. The geolocation error is the difference between the initial and computed image positions. Note that the image geolocation errors do not correspond to the accuracy of the observed 3D points (adapted from Pix4D software project report).
| | Geolocation Error X | Geolocation Error Y | Geolocation Error Z |
|---|---|---|---|
| Mean (m) | 0.03 | 0.04 | −0.03 |
| Sigma (m) | 0.60 | 0.48 | 0.75 |
| RMS error (m) | 0.60 | 0.48 | 0.75 |
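The Mean, Sigma, and RMS rows in Table 2 are related: the RMS error combines the systematic offset (mean) with the random spread (sigma). A quick consistency check of the reported values (the function name is ours):

```python
import math


def rms_from_bias_and_sigma(mean: float, sigma: float) -> float:
    # For a biased error distribution: RMS^2 = mean^2 + sigma^2.
    return math.sqrt(mean ** 2 + sigma ** 2)
```

For the X column, `rms_from_bias_and_sigma(0.03, 0.60)` rounds to 0.60 m, matching the reported RMS; the small bias contributes almost nothing compared to the spread.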
Table 3.
Relative geolocation variance. The percentage of images with a relative geolocation error in X, Y, Z (adapted from Pix4D software project report).
| Relative Geolocation Error | Images X (%) | Images Y (%) | Images Z (%) |
|---|---|---|---|
| Point (−1.00, 1.00) | 98.77 | 98.03 | 95.09 |
| Point (−2.00, 2.00) | 99.75 | 99.75 | 100.00 |
| Point (−3.00, 3.00) | 100.00 | 100.00 | 100.00 |
| Mean of geolocation accuracy (m) | 1.11 | 1.11 | 1.49 |
| Sigma of geolocation accuracy (m) | 0.15 | 0.15 | 0.12 |
Table 4.
Comparison of individual scenes.
| Sensing Date | Satellite | Spatial Resolution (Panchromatic Band) |
|---|---|---|
| 31 May 2019 | WorldView3 | 0.3 m |
| 1 August 2020 | GeoEye1 | 0.4 m |
| 30 August 2021 | GeoEye1 | 0.5 m |
| 18 July 2022 | SPOT-6 | 1.6 m |
Table 5.
Classification accuracy of UAV data (%).
| Classification \ Reference | Barrels | Water | Bare Land | Vegetation | Precision | Recall |
|---|---|---|---|---|---|---|
| Barrels | 88.21 | 0.00 | 1.14 | 7.98 | 85.08 | 88.21 |
| Water | 0.00 | 100.00 | 0.00 | 0.00 | 100.00 | 100.00 |
| Bare land | 2.28 | 0.00 | 98.17 | 1.48 | 98.20 | 98.17 |
| Vegetation | 9.51 | 0.00 | 0.69 | 90.54 | 92.66 | 90.54 |
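Tables 5–8 report per-class precision and recall derived from a confusion matrix. Starting from raw pixel counts (the tables above show column-normalized percentages), the two metrics can be sketched as follows; the counts in the example are illustrative, not data from the paper:

```python
def precision_recall(cm, i):
    """Precision and recall for class i of a confusion matrix.

    cm[r][c] = number of pixels of reference class c predicted as class r.
    """
    tp = cm[i][i]
    predicted = sum(cm[i])               # row sum: all pixels predicted as class i
    actual = sum(row[i] for row in cm)   # column sum: all pixels truly of class i
    precision = tp / predicted if predicted else 0.0
    recall = tp / actual if actual else 0.0
    return precision, recall


# Illustrative 2-class example (barrels vs. background):
cm = [[80, 20],    # predicted barrels
      [20, 380]]   # predicted background
print(precision_recall(cm, 0))  # -> (0.8, 0.8)
```

Precision penalizes false positives (vegetation misclassified as barrels), recall penalizes false negatives (barrels missed by the classifier); Table 5 shows both effects for the barrel class.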
Table 6.
Mean classification accuracy, year 2019 (%).
| Classification \ Reference | Barrels | Water | Bare Land | Vegetation | Precision | Recall |
|---|---|---|---|---|---|---|
| Barrels | 96.13 | 0.01 | 0.33 | 0.00 | 95.06 | 96.13 |
| Water | 0.05 | 99.70 | 0.07 | 0.00 | 99.79 | 99.70 |
| Bare land | 3.82 | 0.29 | 99.59 | 0.09 | 99.62 | 99.59 |
| Vegetation | 0.00 | 0.00 | 0.01 | 99.91 | 99.98 | 99.91 |
Table 7.
Mean classification accuracy, year 2020 (%).
| Classification \ Reference | Barrels | Water | Bare Land | Vegetation | Precision | Recall |
|---|---|---|---|---|---|---|
| Barrels | 95.99 | 0.00 | 0.25 | 0.00 | 95.29 | 95.99 |
| Water | 0.11 | 99.72 | 0.05 | 0.00 | 99.84 | 99.72 |
| Bare land | 3.89 | 0.27 | 99.69 | 0.06 | 99.68 | 99.69 |
| Vegetation | 0.00 | 0.00 | 0.00 | 99.94 | 99.98 | 99.94 |
Table 8.
Mean classification accuracy, year 2021 (%).
| Classification \ Reference | Barrels | Water | Bare Land | Vegetation | Precision | Recall |
|---|---|---|---|---|---|---|
| Barrels | 90.08 | 0.01 | 0.15 | 0.00 | 94.76 | 90.08 |
| Water | 0.69 | 99.44 | 0.09 | 0.00 | 99.70 | 99.44 |
| Bare land | 9.16 | 0.49 | 99.63 | 0.73 | 99.30 | 99.63 |
| Vegetation | 0.08 | 0.06 | 0.13 | 99.27 | 99.54 | 99.27 |
Table 9.
Number of barrels in each year.
| Classification | 2019 | 2020 | 2021 |
|---|---|---|---|
| Manual (UAV data) | 30,162 | — | — |
| Automatic (satellite data) | 33,786 | 20,278 | 15,225 |
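Table 9 compares a manual count from the UAV data with a count derived from the satellite classification. One plausible way to go from classified area to a barrel count is to divide by the footprint of a single upright barrel; the function and default diameter below are purely illustrative (a standard 55-gallon drum is roughly 0.6 m across), and barrels stacked in layers (Figure 23) would make any area-based count an underestimate:

```python
import math


def estimate_barrel_count(classified_area_m2: float,
                          barrel_diameter_m: float = 0.6) -> int:
    # Approximate count = classified area / footprint of one upright barrel.
    footprint = math.pi * (barrel_diameter_m / 2) ** 2
    return round(classified_area_m2 / footprint)
```

Such an area-to-count conversion would also explain why the automatic 2019 figure (33,786) exceeds the manual UAV count (30,162): misclassified pixels and shadowed margins inflate the classified area.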