Communication

Offline Imagery Checks for Remote Drone Usage

by Roxane J. Francis 1,*, Kate J. Brandis 1 and Justin A. McCann 2
1 Centre for Ecosystem Science, School of Biological, Earth and Environmental Sciences, University of New South Wales, Sydney, NSW 2052, Australia
2 Bush Heritage Australia, P.O. Box 329, Flinders Lane, Melbourne, VIC 8009, Australia
* Author to whom correspondence should be addressed.
Drones 2022, 6(12), 395; https://doi.org/10.3390/drones6120395
Submission received: 19 October 2022 / Revised: 17 November 2022 / Accepted: 26 November 2022 / Published: 3 December 2022
(This article belongs to the Section Drone Communications)

Abstract

Drones are increasingly used for a wide range of applications including mapping, monitoring, detection, tracking and videography. Drone software and flight mission programs are, however, still largely marketed for “urban” use such as property photography, roof inspections or 3D mapping. As a result, much of the flight mission software is reliant upon an internet connection and has built-in cloud-based services to allow for the mosaicking of imagery as a direct part of the image collection process. Another growing use for drones is in conservation, where drones are monitoring species and habitat change. Naturally, much of this work is undertaken in areas without internet connection. Working remotely increases field costs, and time in the field is often aligned with specific ecological seasons. As a result, pilots in these scenarios often have only one chance to collect appropriate data, and an opportunity missed can mean failure to meet research aims and contract deliverables. We provide a simple but highly practical piece of code allowing drone pilots to quickly plot the geographical position of captured photographs and assess the likelihood of the successful production of an orthomosaic. Most importantly, this process can be performed in the field with no reliance on an internet connection, and as a result can highlight any missing sections of imagery that may need recollecting, before the opportunity is missed. Code is written in R, software familiar to many ecologists, and provided on a GitHub repository for download. We recommend this data quality check be integrated into a pilot’s standard image capture process for the dependable production of mosaics and general quality assurance of drone-collected imagery.

1. Introduction

The utilisation of drones in a vast range of industries is rapidly increasing, alongside constant developments in drone technology. Many of these uses remain marketed for the “urban” world, where drones serve industries such as urban design, insurance, forensics and real estate, alongside a huge market for hobbyist drone pilots [1,2,3,4,5]. These predominant uses of drones mean much of the accompanying flight mission software is reliant upon an internet connection to facilitate data collection and processing [6,7]. However, digital disadvantage in rural and remote areas means internet connectivity is limited across many continents, including Australia, Africa and South America [8,9,10].
Another increasing use of drones is their application in ecological monitoring. These applications include the monitoring of vegetation, locating of animals, or counting of large species aggregations [11,12,13]. The use of drones in natural environments presents unique difficulties and requirements for the drone user. In undertaking an ecological monitoring survey, there are often no clear pre-existing boundaries defining the area a pilot must cover in a drone flight. Further, boundaries can change or move quickly when surveying animals, so survey areas are often estimated on the fly. Estimating aerial coverage from on-ground distance calculations is difficult and, as a result, complete coverage of a survey area in drone imagery may not be obtained. Currently, there is no simple, free and/or open source way of evaluating flight quality and determining the geographical position of collected photographs while in the field.
Missing or wayward images can be a significant problem when monitoring requires the creation of an orthomosaic (a georeferenced image derived from the mosaicking of many individual images) [14]. Orthomosaics are often crucial inputs into ecological monitoring as they allow for high-resolution mapping of study sites [15]. The creation of a successful orthomosaic (a complete and clear orthomosaic without gaps) requires the collection of consistently spaced images, captured along transect lines at a very high overlap (upwards of 75%). The collection of this imagery can take many hours in the field and can be prone to error due to software failures, incorrect camera settings, poor lighting, battery changes, GPS failures and weather conditions such as wind [16,17]. Where sections of flight paths are missed, the processing software will not correctly align neighbouring images, resulting in a highly blurred or patchy final mosaic. Even if images are obtained, if they are not of sufficient quality (e.g., highly blurred), processing software is unable to locate tie points in neighbouring images, also resulting in blurred or missing portions of mosaics. Some drone flight mission programs such as Drone Deploy [18] and Pix4Dcapture [19] have automated the process of developing orthomosaics by directly uploading imagery to the cloud for instant processing. This allows for reasonably quick viewing of the placement of images in geographic space, but requires an internet connection, and processing of the final mosaic can take several hours or days, making it difficult to evaluate collection quality. Other open source software such as WEB Open Drone Map is available at a cost and works offline [20], but as mosaicking software it requires large amounts of random access memory (RAM), a limitation when working on field-based computers or laptops.
Further, many processing software licences are locked to a single computer, so the user must wait until returning to the office, or attempt a remote connection, only to then realise some images were not captured. As such, when working in the field without internet, quick viewing of the geographic placement of collected drone imagery is currently not possible, making it difficult to predict the success of later orthomosaic production in a disconnected environment.
Importantly, for many ecological drone surveys there is a very limited window in which surveys can be conducted, as they are focused around a breeding, climatic or migration event, or season. The temporal aspect of the data collection is therefore highly important, and repeat surveys are not possible once the opportunity has passed [21,22,23,24]. Where repeat surveys are possible, failure to collect enough suitable images for a complete orthomosaic can result in more days spent in the field or the rescheduling of field trips, increasing field costs. Failure to deliver an orthomosaic can result in failure to provide accurate, or any, monitoring data, making in-field checks highly useful.
We aimed to develop a totally offline, field-friendly system for plotting drone image locations in geographical space and quantifying potential image blur. This system works for most brands and models of drone, regardless of the flight mission software used. It allows the user to identify missing or blurry images in a short time, providing the opportunity to re-fly a mission while still on site, ensuring total coverage of an area of interest and the successful future production of an orthomosaic. This quality testing can prevent failed data collection, save the user much time and money by avoiding the need to re-visit field sites, and ensure delivery on ecological research contracts.

2. Materials and Methods

We collected drone imagery to build high-resolution maps to track changes in vegetation and flooding in the Chobe Region, northern Botswana. Images were collected using Pix4Dcapture [19] with a DJI Phantom Advanced quadcopter, flying at a height of ~100 m, at a speed of ~5 m/s, along transects with a front and side overlap of 75%. Images were transferred from the drone secure digital (SD) card onto a laptop.
The complete image set was used to represent a successful drone flight for the purpose of creating an orthomosaic, labelled “complete”. We duplicated this set of imagery and removed 25% of images by deleting files, and used this image set to represent an unsuccessful drone flight, assigning it to a separate folder hereafter referred to as “incomplete”.
In the software package R [25], we used the exifr package [26] to extract the exchangeable image file format (exif) data of each image allowing for the geographic coordinates hard coded into each image to be extracted. We then plotted the image coordinates, allowing the user to ensure images were collected at even spaces across the area. Further, image coordinates were exported as a .csv file, allowing for the plotting of image location coordinates in a geographic information system over satellite imagery or other layers. This is an additional step that allows the user to easily visualise coverage of the area of interest.
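As a minimal sketch of this step (file paths and tag selections here are illustrative assumptions, not the repository's exact code), the coordinate extraction, plotting and export can be written in a few lines of R with the exifr package:

```r
# Sketch of the EXIF extraction and coverage-plotting step.
# "image_dir" is a hypothetical folder of drone images.
library(exifr)

image_dir <- "flight_01_images"
files <- list.files(image_dir, pattern = "\\.JPG$", full.names = TRUE)

# Read only the tags needed for a coverage check
exif <- read_exif(files, tags = c("SourceFile", "GPSLongitude", "GPSLatitude"))

# Quick in-field plot of image positions: gaps or clusters flag problems
plot(exif$GPSLongitude, exif$GPSLatitude,
     xlab = "Longitude", ylab = "Latitude", pch = 20,
     main = "Drone image capture locations")

# Export for overlaying on satellite imagery in a GIS
write.csv(exif, file.path(image_dir, "image_locations.csv"), row.names = FALSE)
```

Because only a handful of EXIF tags are read per image, this check runs in minutes even on a field laptop.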
As a further check of dataset quality, we called on the package magick [27] to rank the images (or a subset of images) based on intensity change among pixels of the greyscale image. Images with fewer edges, such as blurry images or smooth surfaces, have low image variance, while sharp images with many detailed edges return higher image variance [27], a useful technique for many advanced image processing purposes. Rather than define an arbitrary cut-off, the code outputs the 10 images with the lowest variance for the pilot to review and assess their suitability for generating a mosaic. As the example dataset had minimal blurring, we used the magick package [27] to introduce blur into 7% of the images, mimicking potential blur in drone datasets. We then re-ran the algorithm to test the code’s ability to detect blurred images in a dataset.
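The variance-ranking idea can be sketched as follows; the specific magick functions chosen here (edge detection before computing variance) are an assumption about the approach, and the repository code should be consulted for the authors' exact implementation:

```r
# Sketch of the blur-ranking step using magick (assumed implementation).
library(magick)

files <- list.files("flight_01_images", pattern = "\\.JPG$", full.names = TRUE)

# Variance of pixel intensities in the edge-highlighted greyscale image:
# blurry or featureless images score low, sharp detailed images score high
edge_variance <- function(path) {
  img   <- image_read(path)
  grey  <- image_convert(img, colorspace = "gray")
  edges <- image_edge(grey)            # emphasise intensity changes
  var(as.integer(image_data(edges)))   # variance of raw pixel values
}

variances <- vapply(files, edge_variance, numeric(1))

# The 10 lowest-variance images, for the pilot to review manually
head(sort(variances), 10)
```

Artificial blur for testing can be introduced with magick's `image_blur()` on copies of selected images.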
We provide the code on GitHub https://github.com/RoxFrancis/Offline_drone_imagery_assessment (accessed on 3 November 2022) and encourage users to make this quality check a standard procedure for the collection of drone imagery in remote locations.
Orthomosaics presented for visual purposes in the results were produced using Pix4DMapper software [28].

3. Results

When plotted, the full set of imagery collected (432 images) showed largely consistent, balanced spacing between captured images, and covered the whole area of interest (Figure 1).
There were a few points where an image was missed or captured late, shown by the grouped points, but the image spacing was largely uniform. As a result, a high-resolution orthomosaic could be produced covering the intended area (noting large expanses of water will normally drop from the orthomosaic process due to few tie points) (Figure 2).
In contrast, the mosaic created from the “incomplete” imagery set (322 images) did not cover the entire area of interest, with inconsistent spacing between images (Figure 3).
This missing data could occur due to software or hardware malfunction, or human error. As a result of the missing images, the incomplete set did not produce an acceptable mosaic (Figure 4) and would not be useful for further analysis.
The image dataset had minimal blurring, so instead of blurred images, the output of the variance analysis consisted of images of large smooth surfaces, such as water (Figure 5). All artificially blurred images introduced to test the effectiveness of the algorithm were detected among the top 10 lowest-variance images (Figure 5).

4. Discussion and Conclusions

Flying drones in remote locations poses a unique set of challenges for successful image collection, particularly locations without internet connectivity. For many remote locations, the uploading of imagery to a cloud or server for orthomosaic production is not possible. Pilots therefore risk returning from the field only to find their imagery was not successfully or entirely collected. Our methods provide a completely offline, open source and free solution for the checking of drone imagery while in the field, that works for most drone brands, models and flight mission software, preventing missed data collection opportunities.
The collection of suitable imagery for the creation of a mosaic is dependent on many factors, such as height, speed and, most importantly, transect overlap. This work does not intend to describe ideal drone survey techniques, which have been described elsewhere [29,30,31], but rather to find any mishaps in the collection of the imagery that may have occurred outside of the user's control, and provide the opportunity for repeated data collection when necessary. Our method provides a peace-of-mind check on the imagery, and can guide a user as to whether they will have a complete and useable dataset.
It is important to note that a complete dataset, which plots with even spaces between images and covers the entire area of interest, will still not produce a successful orthomosaic if the overlap between transects is not sufficient. Importantly, “sufficient” overlap is project dependent, and will depend greatly on camera and flight statistics, research goals and image background [32]. Further, a complete dataset with sufficient overlap can be useless if each image collected is highly blurred, too light or dark, or the camera gimbal is not correctly calibrated and orientated [17]. Visual inspection of individual images will quickly identify such problems [33], and our code can help to prioritise images that might need attention due to blurring. Investigating images by rank of lowest variance is more time efficient than randomly checking images, which may be unlikely to chance upon potential blur.
Similar industry-specific advances in the processing of drone imagery are arising, such as the repeated temporal alignment of agricultural plots to explore plant development [34] and the development of end-to-end software packages with multi-sensor functions [35]. Such works highlight the constant improvements in not only drone technology but also pre- and post-processing software, which will continue to facilitate the use of drones in more and more industries.
Within this work, we provide two example datasets to highlight to a user the visual differences between a successful and unsuccessful flight mission, and the resultant orthomosaics (https://figshare.com/articles/media/Complete_and_Incomplete_Image_Collections/20479215, accessed on 3 November 2022). The “complete” dataset produced the best orthomosaic (Figure 2), and any errors in the orthomosaic produced from the “incomplete” dataset (Figure 4) are due to the missing images, as all other processing options remained the same. The image blur quantification and ranking returned crisp images of smooth surfaces (Figure 5), so blur is not a concern for production of an orthomosaic from this dataset. The code was effective at finding the artificially blurred images, demonstrating its usefulness on other users’ datasets.
While our code does require the movement of images from the drone to a laptop computer, the benefit of this work is that it is applicable to most brands and models of drones, controllers, flight mission software, photogrammetry software, storage media and computers. It requires only a few minutes to extract the exif data (depending on the number of images) using exifr [26], making it suitable for checking in the field. The few cases where this code will not work are when exif data are not saved within the collected images, but are instead saved in an external image geolocation text file.
We have found that time spent checking data quality immediately after the mission is time well spent. A particularly useful application of this code is when an area of interest must be estimated in the field. Many ecological applications require the in-field determination of a boundary, such as covering the extent of bird colonies, which have regularly changing boundaries [11,36].
Future improvements on this work could include automated detection of image brightness outside acceptable limits. Such work would be largely project specific and dependent on the area and object of interest. For example, when working with a dark-coloured species on a dark background such as water, it might be beneficial to set lower limits on image brightness. The inverse of this scenario could be counting white birds on sand or ice [37], where over-exposure is more likely to be an issue, and upper limits on brightness could be specified. Further, automated detection of missing images is a potential addition to the code. Such work would require distance calculations between neighbouring image locations, but these could be inflated by edge photographs, which have no neighbouring image. Such edge images could be removed from the calculations when working with square or rectangular flight missions, but irregular areas (such as those often covered in ecological surveys) would complicate the identification of “edge” images.
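As a hypothetical sketch of such an extension (not part of the published code), a simple nearest-neighbour distance check in base R could flag suspect gaps, subject to the edge-image caveat noted in the text:

```r
# Hypothetical missing-image flag: images whose nearest neighbour is
# unusually far away may sit beside a gap in the capture sequence.
# Distances are computed naively in degrees; edge images will inflate them.
nearest_neighbour_gaps <- function(lon, lat, factor = 2) {
  d <- as.matrix(dist(cbind(lon, lat)))   # pairwise distances between images
  diag(d) <- Inf                          # ignore self-distances
  nn <- apply(d, 1, min)                  # distance to each image's nearest neighbour
  which(nn > factor * median(nn))         # indices of images beside suspected gaps
}
```

Applied to the longitude and latitude columns from the exif extraction, this would return candidate images for manual review rather than a definitive list of gaps.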
The successful creation of orthomosaics is important for the monitoring and analysis of ecological trends [38]. This simple in-field quality assurance check of collected drone imagery can help to ensure successful mosaic creation, contributing to the success of research projects, and delivery on ecological contracts.

Author Contributions

Conceptualization, R.J.F., J.A.M.; methodology, R.J.F., J.A.M.; software, R.J.F., J.A.M.; validation, K.J.B.; resources, K.J.B.; writing—original draft preparation, R.J.F., J.A.M.; writing—review and editing, R.J.F., J.A.M., K.J.B.; visualization, R.J.F.; supervision, K.J.B.; project administration, R.J.F.; funding acquisition, K.J.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research received financial support from Taronga Conservation Society, the Australian Commonwealth Government Research Training Program and the Centre for Ecosystem Science, University of New South Wales Sydney.

Data Availability Statement

Code to run imagery checks can be found on GitHub https://github.com/RoxFrancis/Offline_drone_imagery_assessment (accessed on 3 November 2022) and we provide a practice imagery data set on Figshare https://figshare.com/articles/media/Complete_and_Incomplete_Image_Collections/20479215 (accessed on 3 November 2022).

Acknowledgments

This research received financial support from Taronga Conservation Society, the Australian Commonwealth Government Research Training Program and the Centre for Ecosystem Science, University of New South Wales Sydney. This study was conducted under the guidelines of the UNSW Animal Care and Ethics, permit 13/3B. We also thank the Government of Botswana for access to research permits EWT 8/36/4 XXIV (179), and drone permit RPA (H) 211.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Rocke, B.; Ruffell, A.; Donnelly, L. Drone aerial imagery for the simulation of a neonate burial based on the geoforensic search strategy (GSS). J. Forensic Sci. 2021, 66, 1506–1519.
  2. Kugler, L. Real-world applications for drones. Commun. ACM 2019, 62, 19–21.
  3. Ullah, F.; Sepasgozar, S.M.; Wang, C. A systematic review of smart real estate technology: Drivers of, and barriers to, the use of digital disruptive technologies and online platforms. Sustainability 2018, 10, 3142.
  4. Appelbaum, D.; Nehmer, R.A. Using drones in internal and external audits: An exploratory framework. J. Emerg. Technol. Account. 2017, 14, 99–113.
  5. la Cour-Harbo, A. Mass threshold for ‘harmless’ drones. Int. J. Micro Air Veh. 2017, 9, 77–92.
  6. Ihsan, M.; Somantri, L.; Sugito, N.; Himayah, S.; Affriani, A. The Comparison of Stage and Result Processing of Photogrammetric Data Based on Online Cloud Processing. In IOP Conference Series: Earth and Environmental Science; IOP Publishing: Bristol, UK, 2019.
  7. Hinge, L.; Gundorph, J.; Ujang, U.; Azri, S.; Anton, F.; Rahman, A.A. Comparative analysis of 3D photogrammetry modeling software packages for drones survey. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 42, 95–100.
  8. Park, S. Digital inequalities in rural Australia: A double jeopardy of remoteness and social exclusion. J. Rural Stud. 2017, 54, 399–407.
  9. Adeleye, N.; Eboagu, C. Evaluation of ICT development and economic growth in Africa. Netnomics Econ. Res. Electron. Netw. 2019, 20, 31–53.
  10. Bánhidi, Z. The impact of broadband networks on growth and development in South America. Period. Polytech. Soc. Manag. Sci. 2021, 29, 33–39.
  11. Lyons, M.B.; Brandis, K.J.; Murray, N.J.; Wilshire, J.H.; McCann, J.A.; Kingsford, R.T.; Callaghan, C.T. Monitoring large and complex wildlife aggregations with drones. Methods Ecol. Evol. 2019, 10, 1024–1035.
  12. Vujičić, M.D.; Kennell, J.; Stankov, U.; Gretzel, U.; Vasiljević, Đ.A.; Morrison, A.M. Keeping up with the drones! Techno-social dimensions of tourist drone videography. Technol. Soc. 2022, 68, 101838.
  13. Jiménez López, J.; Mulero-Pázmány, M. Drones for conservation in protected areas: Present and future. Drones 2019, 3, 10.
  14. Hemerly, E.M. Automatic georeferencing of images acquired by UAV’s. Int. J. Autom. Comput. 2014, 11, 347–352.
  15. Ancin-Murguzur, F.J.; Munoz, L.; Monz, C.; Hausner, V.H. Drones as a tool to monitor human impacts and vegetation changes in parks and protected areas. Remote Sens. Ecol. Conserv. 2020, 6, 105–113.
  16. Hung, I.-K.; Unger, D.; Kulhavy, D.; Zhang, Y. Positional precision analysis of orthomosaics derived from drone captured aerial imagery. Drones 2019, 3, 46.
  17. PIX4D. Troubleshooting—PIX4Dcapture. 2021. Available online: https://support.pix4d.com/hc/en-us/articles/115004119343-Troubleshooting-PIX4Dcapture (accessed on 8 September 2022).
  18. Drone Deploy; Drone Deploy: California, United States, 2022.
  19. PIX4D SA. PIX4Dcapture. 2019. Available online: https://www.pix4d.com/product/pix4dcapture (accessed on 3 November 2022).
  20. Vacca, G. WEB Open Drone Map (WebODM) a Software Open Source to Photogrammetry Process. In Proceedings of the FIG Working Week 2020—Smart Surveyors for Land and Water Management, Amsterdam, The Netherlands, 10–14 May 2020.
  21. Schofield, G.; Katselidis, K.A.; Lilley, M.K.; Reina, R.D.; Hays, G.C. Detecting elusive aspects of wildlife ecology using drones: New insights on the mating dynamics and operational sex ratios of sea turtles. Funct. Ecol. 2017, 31, 2310–2319.
  22. Rush, G.P.; Clarke, L.E.; Stone, M.; Wood, M.J. Can drones count gulls? Minimal disturbance and semiautomated image processing with an unmanned aerial vehicle for colony-nesting seabirds. Ecol. Evol. 2018, 8, 12322–12334.
  23. Francis, R.; Kingsford, R.; Brandis, K. Using drones and citizen science counts to track colonial waterbird breeding, an indicator for ecosystem health on the Chobe River, Botswana. Glob. Ecol. Conserv. 2022, 38, e02231.
  24. Evans, L.J.; Jones, T.H.; Pang, K.; Saimin, S.; Goossens, B. Spatial ecology of estuarine crocodile (Crocodylus porosus) nesting in a fragmented landscape. Sensors 2016, 16, 1527.
  25. R Core Team. R: A Language and Environment for Statistical Computing; Version 4.1.0; R Foundation for Statistical Computing: Vienna, Austria, 2022.
  26. Dunnington, D.H.; Harvey, P. exifr: EXIF Image Data in R; R Package Version 0.3.2; 2021. Available online: https://cran.r-project.org/web/packages/exifr/exifr.pdf (accessed on 3 November 2022).
  27. Ooms, J. magick: Advanced Graphics and Image-Processing in R; R Package Version 2.7.3; 2021. Available online: https://ropensci.org/blog/2017/08/15/magick-10/ (accessed on 3 November 2022).
  28. PIX4D SA. PIX4Dmapper. 2022. Available online: https://www.pix4d.com/product/pix4dmapper-photogrammetry-software (accessed on 3 November 2022).
  29. Junda, J.; Greene, E.; Bird, D.M. Proper flight technique for using a small rotary-winged drone aircraft to safely, quickly, and accurately survey raptor nests. J. Unmanned Veh. Syst. 2015, 3, 222–236.
  30. Kannan, R.J.; Yadav, K. Drone Routing Techniques for Surveying in Urban Areas. Rev. Int. Geogr. Educ. Online 2021, 11, 4157–4167.
  31. Lyons, M.; Brandis, K.; Wilshire, J.; Murray, N.; McCann, J.; Kingsford, R.; Callaghan, C. A protocol for using drones to assist monitoring of large breeding bird colonies. EcoEvoRxiv 2019.
  32. Jiménez-Jiménez, S.I.; Ojeda-Bustamante, W.; Marcial-Pablo, M.d.J.; Enciso, J. Digital terrain models generated with low-cost UAV photogrammetry: Methodology and accuracy. ISPRS Int. J. Geo-Inf. 2021, 10, 285.
  33. Sieberth, T.; Wackrow, R.; Chandler, J. Motion blur disturbs—The influence of motion-blurred images in photogrammetry. Photogramm. Rec. 2014, 29, 434–453.
  34. Khan, Z.; Miklavcic, S.J. An automatic field plot extraction method from aerial orthomosaic images. Front. Plant Sci. 2019, 10, 683.
  35. Hinzmann, T.; Schönberger, J.L.; Pollefeys, M.; Siegwart, R. Mapping on the fly: Real-time 3D dense reconstruction, digital surface map and incremental orthomosaic generation for unmanned aerial vehicles. In Field and Service Robotics; Springer: Berlin/Heidelberg, Germany, 2018.
  36. Francis, R.J.; Lyons, M.B.; Kingsford, R.T.; Brandis, K.J. Counting mixed breeding aggregations of animal species using drones: Lessons from waterbirds on semi-automation. Remote Sens. 2020, 12, 1185.
  37. Chabot, D.; Francis, C.M. Computer-automated bird detection and counts in high-resolution aerial images: A review. J. Field Ornithol. 2016, 87, 343–359.
  38. Ferrari, R.; Lachs, L.; Pygas, D.R.; Humanes, A.; Sommer, B.; Figueira, W.F.; Edwards, A.J.; Bythell, J.C.; Guest, J.R. Photogrammetry as a tool to improve ecosystem restoration. Trends Ecol. Evol. 2021, 36, 1093–1101.
Figure 1. GPS locations of drone imagery collected adjacent to the Chobe River, Botswana were extracted and plotted in R for easy inspection of approximate spacing between images, before being exported as a .csv file and plotted over satellite imagery, ensuring total coverage of the area of interest in the complete dataset.
Figure 2. The orthomosaic developed from the complete set of images collected alongside the Chobe River, Botswana, showing the high clarity of the orthomosaic when zoomed in (inset).
Figure 3. GPS locations of a subset of drone imagery collected adjacent to the Chobe River, Botswana, were extracted and plotted in R for easy inspection of approximate spacing between images (left), before being exported as a .csv file and plotted over satellite imagery (right), highlighting gaps in the coverage of the area of interest in the “incomplete” dataset.
Figure 4. The orthomosaic developed from the incomplete set of images collected alongside the Chobe River, Botswana, showing the distortion and gaps in the orthomosaic.
Figure 5. Images ranked with the lowest variance from a sample of 10% of the incomplete dataset. Image (a) represents the image with the lowest variance of the sample (variance = 14) showing open water, next to its artificially blurred counterpart (b), (c) represents the second ranked (variance = 26.5) showing texture on the water surface, next to its artificially blurred counterpart (d), (e) is the ninth ranked (variance = 154) showing water and land, next to its artificially blurred counterpart (f) and finally (g) is the 10th ranked (variance = 161) showing the smooth ground surface but more texture in the trees, next to its artificially blurred counterpart (h).
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
