Review

Surveying Wild Animals from Satellites, Manned Aircraft and Unmanned Aerial Systems (UASs): A Review

1 Key Laboratory of Land Surface Pattern and Simulation, Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences, Beijing 100101, China
2 National Hulunber Grassland Ecosystem Observation and Research Station, Institute of Agricultural Resources and Regional Planning, Chinese Academy of Agricultural Sciences, Beijing 100081, China
3 State Key Laboratory of Resources and Environmental Information System, Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences, Beijing 100101, China
* Author to whom correspondence should be addressed.
Remote Sens. 2019, 11(11), 1308; https://doi.org/10.3390/rs11111308
Submission received: 22 April 2019 / Revised: 20 May 2019 / Accepted: 22 May 2019 / Published: 1 June 2019
(This article belongs to the Special Issue Trends in UAV Remote Sensing Applications)

Abstract:
This article reviews studies regarding wild animal surveys based on multiple platforms, including satellites, manned aircraft, and unmanned aircraft systems (UASs), and focuses on the data used, animal detection methods, and their accuracies. We also discuss the advantages and limitations of each type of remote sensing data and highlight some new research opportunities and challenges. Submeter very-high-resolution (VHR) spaceborne imagery has potential in modeling the population dynamics of large (>0.6 m) wild animals at large spatial and temporal scales, but has difficulty discerning small (<0.6 m) animals at the species level, although high-resolution commercial satellites, such as WorldView-3 and -4, have been able to collect images with a ground resolution of up to 0.31 m in panchromatic mode. This situation will not change unless the satellite image resolution is greatly improved in the future. Manned aerial surveys have long been employed to capture the centimeter-scale images required for animal censuses over large areas. However, such aerial surveys are costly to implement in small areas and can cause significant disturbances to wild animals because of their noise. In contrast, UAS surveys are seen as a safe, convenient and less expensive alternative to ground-based and conventional manned aerial surveys, but most UASs can cover only small areas. The proposed use of UAS imagery in combination with VHR satellite imagery would produce critical population data for large wild animal species and colonies over large areas. The development of software systems for automatically producing image mosaics and recognizing wild animals will further improve survey efficiency.


1. Introduction

Poaching activities, climate change, rapid habitat loss and environmental degradation have led to massive population decline and even extinction for many types of wild animals in recent decades [1]. The regular monitoring of wild animals is thus essential for estimating population changes and modeling the possible consequences of changes in human activities, the climate, and many other possible stressors on wild animal populations [2]. However, monitoring animal populations over large geographical areas is extremely challenging because wild animals are often located far from human settlements and are commonly concealed [3].
The traditional methods for monitoring aspects of wild animals, such as their population size and structure, mainly rely on ground-based surveys [4,5,6,7]. Although traditional field-based data collected from thousands of sampling quadrats or transects are accurate, the collection of such data is laborious, time-intensive, and costly [7,8,9], and survey regions are often difficult to access on the ground. For example, Sibanda and Murwira [4] randomly generated one hundred transects to collect elephant data, but only 36% of the transects could be accessed. The limited sample number of field surveys also restricts the monitoring accuracy of large-scale animal surveys. To ensure a sufficient sample number and geographic coverage, some field-based surveys, such as the North American Breeding Bird Survey (1966–2011) [9], have been conducted with the help of thousands of volunteers along roadsides. However, such data are not representative of either the amount of habitat or the rate of change at a large spatial scale [10]. Roadside methods also introduce significant bias because only disturbed landscapes are examined. In addition, collecting data for animals sensitive to humans, such as waterbirds, is difficult [11]. To overcome these limitations, researchers have developed various methods to survey wild animals from satellites, manned aircraft, and unmanned aircraft systems (UASs) [12,13,14].
Although multiple reviews have been published on the use of satellite imagery [12,13,15], manned aerial data [16,17,18], and UAS data [1,19,20,21,22] individually for wild animal surveys, a single review that compares and contrasts these methods, and thus helps readers make an informed decision about which monitoring approach to use, has been lacking. The spatial resolution of satellite imagery is low relative to that of aerial imagery and thus requires that the surveyed animals or clusters of animals be large enough and have high contrast with the landscape [12]. Manned aerial systems have a relatively longer endurance time than UASs, but manned aerial surveys are typically much more expensive to implement and can be risky for survey personnel. The shortcomings of each type of data will not be overcome in the foreseeable future. Consequently, it is sometimes necessary to fuse these data types to regularly monitor wild animals over vast areas with high accuracy. This information is essential for users to establish appropriate uses of different remote sensing data, or combinations thereof, in the planning phases of wild animal surveys [23].
In this article, we review recent studies that used remote sensing data collected from satellites, manned aircraft, and UASs to estimate the abundance of wild animals by directly detecting and counting individuals. We describe the sensors, animal detection methods, and capabilities and limitations of each type of data and highlight some new research opportunities and technological developments for wild animal surveys using remote sensing data.

2. Methods

A comprehensive literature search was performed using the Web of Science and Google Scholar databases to collect studies related to remote sensing surveys of wild animal populations. The literature published before 26 December 2018 was reviewed. Keywords describing the remote sensing platforms (‘remote sensing’, ‘spaceborne’, ‘space’, ‘satellite’, ‘aircraft’, ‘airborne’, ‘unmanned aerial system’, ‘unmanned aerial vehicle’, ‘UAS’, ‘UAV’, or ‘drone’), the surveyed species (‘wildlife’, ‘animal’, ‘wild animal’, ‘mammal’, ‘bird’, or ‘herbivorous’), and the investigation content (‘survey’, ‘population’, ‘census’, or ‘monitoring’) were used in combination in the search. Studies found in the citations of the search results or recommended by anonymous reviewers were also added to the literature. According to the data acquisition platform, the surveys were classified as spaceborne surveys, manned aerial surveys, and UAS surveys. The platforms, sensors, image resolution, availability, data cost, coverage, surveyed species, animal detection methods, pixel number of the surveyed species in the imagery, and the accuracy of the wild animal surveys in each paper were determined. Some studies did not report the data cost, coverage, or pixel number of the surveyed species, and these figures were estimated from the information provided.
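As an illustration of how the three keyword groups above can be combined into a single search, the short sketch below builds one boolean topic query. The exact syntax submitted to Web of Science or Google Scholar is not reported in the text, so the `TS=`-style formatting here is only an assumption.

```python
# Illustrative only: combine the platform, species, and content keyword groups
# from the Methods into one boolean topic query (Web of Science-style syntax assumed).
platforms = ["remote sensing", "spaceborne", "space", "satellite", "aircraft", "airborne",
             "unmanned aerial system", "unmanned aerial vehicle", "UAS", "UAV", "drone"]
species = ["wildlife", "animal", "wild animal", "mammal", "bird", "herbivorous"]
content = ["survey", "population", "census", "monitoring"]

def or_group(terms):
    # Wrap each term in quotes and join the group with OR.
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

query = "TS=" + " AND ".join(or_group(group) for group in (platforms, species, content))
print(query)
```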

3. Results

3.1. Platforms

3.1.1. Satellites

Since the early 1980s, medium- to low-spatial-resolution (1–60 m) spaceborne remote sensing data have been used for indirect animal surveys that identify some form of ‘sign’ indicating that animals have been in the area, such as fecal counts [24,25,26], food removal, and burrow counts [27,28]. Schwaller et al. [26] delineated the physical extent of the Adélie penguin rookeries on Ross and Beaufort Islands, Antarctica, by analyzing the penguin nesting sites based on guano and other debris in 30 m-resolution Landsat TM imagery. Fretwell and Trathan [24] revealed 38 emperor penguin colonies scattered along the shore of Antarctica according to the characteristics of fecal stains in Landsat data. The breeding population of the emperor penguins at each colony was further estimated from QuickBird satellite images, with a resolution of 0.61 m in the panchromatic band and 2.44 m in the four multispectral bands (blue, green, red, and infrared), using a robust regression algorithm between the classified colony areas and the number of penguins [25]. In recent years, submeter very-high-resolution (VHR, ≤1 m panchromatic resolution) imagery from commercial satellites, such as the GeoEye-1, WorldView-2, WorldView-3, QuickBird-2, and IKONOS satellites, with resolutions ranging from 0.3–1 m in the panchromatic band to 1.2–4 m in the multispectral bands, has been used to directly identify large-sized (≥0.6 m) individual animals, such as wildebeests (Connochaetes gnou), zebras (Equus quagga) [29], polar bears (Ursus maritimus) [30,31], albatrosses [32], southern right whales (Eubalaena australis) [33,34], and Weddell seals (Leptonychotes weddellii) [35]. Among the fifteen satellite remote sensing studies, three used GeoEye-1 imagery, seven used WorldView-1/2/3 imagery, three used QuickBird-2 imagery, and one used IKONOS imagery (Table 1). At present, optical satellites can capture imagery with resolutions as fine as 31 cm (WorldView-3/4) with global coverage, and radar satellites offer resolutions as fine as 1 m (TerraSAR-X). Table 2 summarizes the submeter commercial satellites. The submeter radar satellites include TerraSAR-X and COSMO-SkyMed, but their use for wild animal population surveys has not been reported, which may be due to the low sensitivity of their radar signals to animals [15].

3.1.2. Light Manned Aircraft

Using light manned helicopters or fixed-wing aircraft for wild animal surveys has obvious advantages over using satellite-derived imagery [16]. The flight altitudes, times and sensors can be customized for each mission. Furthermore, aerial imagery can be collected with significantly higher spatial resolutions, up to 2.5 cm [23]. Given flight safety and aircraft endurance considerations, studies of terrestrial mammals distributed over smaller areas tend to use helicopters [18,36,37], while studies of plains, remote locations and marine environments tend to use fixed-wing aircraft [2,17,38,39,40,41]. One long-term study used a combination of helicopters and fixed-wing airplanes [42]. Among the thirteen manned aerial surveys of wild animals, three studies used manned helicopters and eleven used fixed-wing aircraft (one study used both helicopters and fixed-wing aircraft; Table 1).

3.1.3. UASs

In recent years, UASs have also been employed to count and track wild animals [43,44,45] and to detect their body sizes [46] and behaviors [11,47,48]. UASs typically include an unmanned aerial vehicle (UAV, commonly known as a drone), a ground-based controller, and a system of communication between the two. UAVs may operate with various degrees of autonomy, either under remote control by a human operator or autonomously via onboard computers. The UASs employed for wild animal surveys in the literature are mainly small fixed-wing UAVs and multicopters that can be remotely controlled or flown autonomously by onboard computers [19] (Table 1). Most of these UAS surveys were conducted with small UASs within the line of sight, because in many countries UASs are not permitted to fly out of sight of the operator or close to densely populated areas unless permission has been given [49]. In addition, many authors expressed understandable concerns over legislation regulating the use of UASs. Some surveys used large UASs, such as the ScanEagle, to obtain relatively long-range (up to 100 km) and long-endurance (24+ h) capabilities [50,51]. Among these studies, roughly half used multicopters (9/18) and roughly half used fixed-wing UAVs (11/18), with some studies using both types. Because of their poor flexibility, airships and aerostats have not been used in wild animal surveys. Figure 1 shows photos of a fixed-wing UAS, a quadcopter UAS, and a hexacopter UAS.
Given their safe flight heights and long endurance, fixed-wing UASs are typically used to survey large animals, such as roe deer, red deer, wild boar [52], elephants [43], dugongs [51], and humpback whales [50], over large areas by operating at high flight altitudes (typically >100 m). In the marine environment, fixed-wing UASs are much less restricted in terms of the safe flight height, and the capacity of the sensor dictates what species can be detected at a set altitude [50]. However, one disadvantage of large fixed-wing UASs is that they typically require a launch ramp or runway for takeoff (Figure 1). Although some small and light fixed-wing UASs, such as the eBee [53], can take off by being thrown into the air, open and flat areas are required for a safe landing; thus, their use over uneven terrain and in high-vegetation areas has been largely restricted. The predominance of multicopters can be explained by their low cost and superior vertical takeoff and landing performance on small flat areas surrounded by uneven terrain. Additionally, multicopters are readily available and can easily maintain a stationary position in flight to take pictures at any orientation. Among multicopters, quadcopters and hexacopters are the main types employed, and only one study used an octocopter [54]. Typically, quadcopters are less expensive and easier to carry than other multicopters. Quadcopters fitted with light and small cameras can fly at low altitudes to detect small animals, such as blacktip reef sharks, pink whiprays [55], and butterflies [56], which are not overly sensitive to UAS disturbances. Hexacopters and octocopters are able to carry telephoto or high-imaging-quality cameras, which are heavier but can capture higher-resolution imagery than the cameras carried by quadcopters at the same altitude. Thus, they are widely used to survey birds, such as canvasbacks, western/Clark’s grebes, double-crested cormorants [11], gentoo penguins, chinstrap penguins [57], frigatebirds, crested terns, and royal penguins [54], which are sensitive to UAS and human disturbances and must be monitored from a greater height.
Regarding power sources, most UASs are driven by electricity. Although fuel-based power systems generally allow for longer endurance, combustion engines are more difficult to handle than electric engines for people without expertise; they also produce more vibration and noise and cannot easily provide the flexible power required by multicopters [1]. In addition, combustion engines present risks associated with engine ignition. Multiple studies considered electric UASs to be an excellent tool for approaching and surveying birds because of their low noise [57,58,59,60,61].
Table 1 shows a comparison of spaceborne, manned aerial, and UAS surveys of animals, including the platforms used, sensors, image resolution, data cost, coverage, availability, surveyed species, animal detection methods, pixel number of surveyed species in the imagery, and detection accuracy.

3.2. Sensors

Depending upon the wavelengths collected, remote sensing instruments can be classified into optical (0.4–14 μm) or microwave (1 mm–1 m) sensors [23]. Optical sensors can capture panchromatic (black and white), multispectral (several bands) and hyperspectral (hundreds of bands) imagery. In addition to the visible red, green and blue (RGB) range, multispectral and hyperspectral imagery also includes the far blue, near-infrared, and thermal infrared bands. Some satellites are able to collect superspectral (10–100 bands) imagery; for example, WorldView-3 can capture up to 17 bands. The data used for satellite, manned aerial, and UAS surveys include panchromatic [58], RGB [11,43], multispectral [29,30,31,32,33,35], and thermal infrared imagery [53,62,63], as well as radio-tracking data [23,48,64,65].

3.2.1. Spaceborne Surveys

For spaceborne wild animal surveys, panchromatic imagery [31,66] and multispectral imagery [29,67,68] are the most widely used types of data. Among the fifteen satellite remote sensing studies, two studies used panchromatic imagery, and the other thirteen used multispectral imagery (Table 1). Multispectral imagery provides more spectral information for distinguishing target objects from the background than does panchromatic imagery [32]. However, the low resolution of multispectral imagery makes it effective only for recognizing very large animals (>2.5 m, twice the ground resolution of WorldView-3 in the multispectral bands), such as southern right whales [33]. Researchers have also used pansharpening techniques (merging high-resolution panchromatic and lower-resolution multispectral imagery to create a single high-resolution color image) to increase the differentiation between the target objects and the background [29,67,69]. Although the resolution of microwave spaceborne imagery has greatly improved (up to 1 m, Table 2), its use for wild animal recognition has not been reported because of its low sensitivity to animals [15].

3.2.2. Manned Aerial Surveys

Aerial surveys of wild animals fall into two main categories: (a) Real-time surveys, in which the wild animals are counted in situ by trained observers, i.e., no imaging sensors are used during the surveys [2,17,37,38,39,41], and (b) photographic surveys, in which wild animals are counted from still RGB images or video [36,42,70,71,72,73]. Infrared thermography has also been tested for surveying wild animals with a significant temperature difference from the background environment, such as the Pacific walrus (Odobenus rosmarus) [63,74,75], and red deer (Cervus elaphus), fallow deer (Dama dama), roe deer (Capreolus capreolus), wild boar (Sus scrofa), foxes, wolves and badgers [76].

3.2.3. UAS Surveys

Similar to manned aerial surveys, UAS surveys have also widely used RGB cameras (15/18) and thermal infrared cameras (5/18) to capture data [1,20,21]. The use of multispectral, hyperspectral, radar and other sensors sensitive to vegetation information has not been reported, most likely for cost and efficiency reasons. Medium-format RGB cameras are smaller and cheaper than thermal infrared cameras and have a much higher resolution (8 to 24 megapixels vs. <0.21 megapixels). Furthermore, RGB images are closer to human vision and are suitable for detecting wild animals, such as elephants [43], dugongs [51], and turtles [42], which live in open lands or marine environments. Thermal infrared cameras are primarily used for detecting wild animals that live in forests and other high-vegetation areas, such as roe deer, rabbits, foxes [77], koalas [62] and gray seals [53]. Animals can be detected by the temperature differences between their bodies and the environment, which is a helpful feature for detecting nocturnal animals in low-light conditions [52,63]. However, thermal infrared cameras are less commonly used because of their high price and the coarse resolution of the sensors [1]. These issues may become less important with continued technological developments and the increased availability of thermal infrared cameras. In addition, radio-tracking devices have been used on UASs in recent years to study the behaviors of small animals, such as Bicknell’s and Swainson’s thrushes (Catharus bicknelli, C. ustulatus) [64], noisy miners (Manorina melanocephala) [48], and iguanas [65], which used to be monitored from a ground-based vehicle or on foot by following radio signals [5,78]. However, radio tags must first be attached to the animals, so radio-tracking data cannot be used to properly assess the populations of untagged animals. Still images are the main sensor output in UAS surveys. Videos typically produce images of lower quality than still images and are primarily used for the real-time monitoring and tracking of wild animals and the surrounding environment (see the list in Table 5). Overall, the types of UASs and sensors used are likely to be largely governed by their availability, price and ease of use. As more and cheaper UASs are developed and become available, the types and percentages of the UASs and sensors used are likely to change.

3.3. Data Availability, Resolution and Cost

3.3.1. Satellite Data

Satellite data can provide imagery with resolutions of up to 0.31 m (WorldView-3 and -4) with global coverage and are available from commercial satellites. The price of 0.5 m spatial resolution satellite imagery ranges from USD 14–27.5 per km2, depending on the spectral resolution, order area and data age [79]. A guarantee of a low degree of cloud cover increases costs by 25–50%. These data are relatively expensive, although many US and European data providers offer sample data consisting of a few scenes or image subsets. Researchers with limited budgets will likely not be able to obtain data over large areas. As stated by Kuenzer et al. [15], free and open data will be critical to encouraging studies and applications of VHR satellite imagery for wild animal surveys. Although this submeter satellite imagery is largely in the commercial domain, companies are being encouraged to allow researchers to access the archived data for free or at an affordable cost.

3.3.2. Aerial Data

Manned helicopters and fixed-wing aircraft have a relatively long endurance time and have been used for regular and geographically comprehensive animal monitoring by directly observing animals from the air [17,23,37,39,41] or reviewing centimeter-scale resolution imagery [42,70,71,72,73]. Stoner et al. [41] assessed the effectiveness of protection strategies in Tanzania based on a decade (1988–1998) of aerial survey data collected for 23 types of large herbivores, including buffalos (Syncerus caffer), elands (Taurotragus oryx), elephants (Loxodonta africana), and giraffes (Giraffa camelopardalis). Ottichilo et al. [17] analyzed the population trends of large nonmigratory wild herbivores and livestock in the Masai Mara ecosystem, Kenya, based on 20 years (1977–1997) of aerial survey data; the surveyed herbivores included elephants, impalas, ostriches, gazelles, warthogs, and cattle. Stapleton et al. [18] used a helicopter (Bell 206 LongRanger) to conduct a comprehensive aerial survey of Foxe Basin polar bears (Ursus maritimus) during the 2009 and 2010 ice-free seasons; the sampling transects totaled over 12,800 km, covering an area of approximately 6000 km2. Martin et al. [42] used five decades (1963–2012) of aerial survey data from Guam (Marianas Archipelago in Micronesia) to systematically estimate the changes in the abundances, trends, and geographic distributions of sea turtles, elasmobranchs, and cetaceans. The costs of aerial surveys are difficult to quantify because they vary greatly with the location, access to aircraft and a number of other factors. Manned aerial surveys are expensive to implement for small study areas because of the cost of the aircraft, operator, and fuel, especially in developing countries. Furthermore, strict flight restrictions and crash risks must be considered [1,21]. Although the unit cost decreases as the surveyed area increases, aerial imagery remains much more expensive than satellite imagery.

3.3.3. UAS Data

UASs can fly at altitudes of several meters to hundreds of meters (Table 5) to capture imagery with resolutions as fine as 2 mm [56] and have been seen as a safer, lower-cost alternative to manned aircraft for wild animal surveys [1,20,21]. UASs have become accessible to common consumers primarily because of the miniaturization of electronic components; the increase in computational power of onboard central processing units (CPUs); the development of intelligent flight control subsystems for takeoff, landing, and flight control; the improvement of payloads; and the reduction in costs [44]. The UASs used for wild animal surveys encompass a broad range of sizes, from quadcopters (e.g., the DJI Phantom 2, an electric quadcopter that costs approximately $1000 [55]) to full-sized planes (e.g., the ScanEagle, providing a range of up to 100 km and 24+ h endurance [50,51]), and therefore vary in cost [23]. UASs have cost and image resolution advantages over manned aircraft.
However, most surveys were applied to small areas, as shown in Table 5. Indeed, only two studies surveyed geographic areas larger than 10.0 km2: Vermeulen et al. [43] surveyed African elephants over an area of 13.79 km2, and Hodgson et al. [50] surveyed humpback whales over an area of 35.2 km2. The remaining studies surveyed geographic areas smaller than 2.0 km2, much smaller than those covered by manned aerial surveys [17,18,42], and the minimum survey area was only 4 × 4 m [56]. This difference is mainly because the endurance times and operating distances of UASs are typically very short. Most multirotor UASs on the market have an endurance of only 10–30 min, and even fixed-wing UASs mostly have endurance times shorter than 2 h. The operating distance of UASs may range from only several to tens of kilometers [1,23,50,51] (a rough coverage calculation under these constraints is sketched at the end of this subsection). UASs also tend to be affected by weather, particularly rain and wind [1]; most UASs cannot fly in rain or moderately high winds [23]. Hostility from local people and poachers must also be considered, as UASs may be shot down [80].
Although all reported species could be successfully surveyed using appropriate UAS models at appropriate flight heights, wild animals typically react more strongly to larger UASs, combustion (noisier) engines, and low flight altitudes [81]. Animals during the nonbreeding period and in large groups were more likely to show behavioral reactions to a UAS [82], and birds were more prone to react than other taxa [81]. Thus, most studies focused on surveying bird reactions to UASs and considered electric UASs an excellent tool for approaching them [57,58,59,60,61]. In addition to behavioral responses, the possible social aspects of using drones for wild animal surveys, including safety, privacy, psychological responses, data security and the wider understanding of conservation problems, should also be considered [47,80,83]. UAS users should also follow local UAS operation legislation and ethics to prevent undesirable consequences for wild animals and human beings [84], as previous researchers did by obtaining permits from the relevant local authorities for entering the study areas and operating UASs [11,43,50,59,60,85]. The comprehensive work of Cracknell [86] reviews UAS operation legislation in a number of countries.
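The following back-of-the-envelope sketch illustrates why single-flight UAS coverage is typically limited to a few square kilometers. All parameter values (altitude, field of view, cruise speed, endurance, sidelap) are assumptions chosen for illustration, not figures from any cited study.

```python
import math

# Rough, illustrative coverage estimate for a single UAS flight.
# All parameter values below are assumptions, not figures from the reviewed studies.
altitude_m = 100.0          # flight altitude
fov_deg = 60.0              # camera horizontal field of view
cruise_speed_ms = 15.0      # fixed-wing cruise speed
endurance_min = 30.0        # usable flight time on one battery
sidelap = 0.3               # fraction of swath overlapped between adjacent flight lines

swath_m = 2 * altitude_m * math.tan(math.radians(fov_deg / 2))   # ground swath width
effective_swath_m = swath_m * (1 - sidelap)
distance_m = cruise_speed_ms * endurance_min * 60                # total flight-line length

area_km2 = distance_m * effective_swath_m / 1e6
print(f"swath = {swath_m:.0f} m, area per flight = {area_km2:.2f} km2")
# With these assumptions, one flight covers roughly 2 km2, consistent with the
# <2 km2 survey areas reported for most UAS studies in Table 5.
```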

3.4. Surveyed Species and Methodology

3.4.1. Spaceborne Surveys

The early spaceborne surveys of wild animals focused on using medium/low (1–60 m) spatial resolution spaceborne data to find wild animals by identifying some form of sign indicating that the animals have been in the area, such as fecal counts [24,25,26], food removal, and burrow counts [27,28], rather than performing direct observations of the animals themselves. Examples of such indirect surveys include monitoring the population increase of the king penguin (Aptenodytes patagonicus) using 10 m-resolution SPOT images in the southern Indian Ocean [87]; developing a supervised classification algorithm for locating seabird nesting habitats from Landsat TM images in the Russian High Arctic archipelago of Franz Josef Land [88]; surveying the distribution of the Adélie penguin (Pygoscelis adeliae) [25,26] and emperor penguin (Aptenodytes forsteri) [24] in Antarctica by analyzing guano and other debris in Landsat TM imagery; and detecting hairy-nosed wombats (Lasiorhinus latifrons) in the Nullarbor Plain of southern Australia from 60 m-resolution Landsat imagery by analyzing the degraded vegetation and bare ground caused by the animals’ burrowing and mound-building behaviors. VHR imagery, such as 1.8 m-resolution WorldView-2 imagery, has also been applied to detect gerbil burrows using a similar NDVI-based technique [28]. Although this indirect approach is valuable for detecting mammals and other animals with clear habitat modification signals, other species might not directly generate a detectable signal of habitat modification [15]. Recent studies have focused on using VHR (0.31–1 m panchromatic resolution) imagery from satellites, such as the GeoEye-1, WorldView-2, WorldView-3, and QuickBird-2 satellites, for directly identifying large-sized (≥0.6 m) individual animals, such as wildebeests (Connochaetes gnou), zebras (Equus quagga) [29,66], polar bears (Ursus maritimus) [31], albatrosses [32], southern right whales (Eubalaena australis) [33], and Weddell seals (Leptonychotes weddellii) [35]. Multitemporal image differencing and change detection methods have also been used to detect animal movements, such as those of polar bears [30]. Other efforts include assessing the possibility of detecting marine mammals (polar bears, walruses, and bowhead whales) in GeoEye-1 images [31]; monitoring marine mammals (e.g., humpback whales) using IKONOS imagery [68]; manually detecting, describing, and counting four different mysticete species (fin whales in the Ligurian Sea, humpback whales off Hawaii, southern right whales off Península Valdés, and gray whales in Laguna San Ignacio) in WorldView-3 imagery [34]; manually recognizing elephant seals (Mirounga leonina) in GeoEye-1 images of Macquarie Island in the southern Pacific Ocean [69]; and monitoring the population changes of Weddell seals along the Victoria Land coast based on an analysis of high-resolution satellite images captured by the DigitalGlobe and GeoEye platforms [89]. Leblanc et al. [90] analyzed the potential to detect and differentiate large Arctic mammals using spaceborne optical satellites (e.g., Pleiades and WorldView-2 and -3) by using a ground-based portable ASD FieldSpec® 3 spectroradiometer to measure and analyze the spectral reflectance (within the 350–2500 nm range) of snow and several Arctic mammal pelts (polar bear, caribou, muskox, and ringed, harp and bearded seals) under winter conditions.
Data fusion of multiresolution spaceborne imagery was also used to produce large-scale and accurate wild animal population data. Fretwell et al. reported a global and synoptic survey of emperor penguins by integrating 30 m-resolution Landsat TM and 0.6 m-resolution QuickBird imagery [67]. LaRue et al. [12] summarized several criteria, including the minimum animal size, open habitat landscapes, and high color contrasts between the target organisms and the landscape, that must be met in order to use VHR imagery to detect wild animals. Figure 2 shows two VHR satellite images of wild animals. Table 3 summarizes the animal species detected using satellite imagery.

3.4.2. Manned Aerial Surveys

Manned aerial technology has been used for monitoring wild animals since the mid-1930s, when the US Bureau of Biological Survey used airplanes and dirigibles to survey waterfowl [70]. The early surveys were mainly real-time surveys that employed trained observers to survey terrestrial and marine animals with potentially low abundance in remote or large areas [38]. The terrestrial animals included polar bears in the Arctic [37]; red kangaroos (Megaleia rufa) and sheep [91]; buffalos (Syncerus caffer), elands (Taurotragus oryx), elephants (Loxodonta africana), and giraffes (Giraffa camelopardalis) in the African grasslands [41]; and pronghorns (Antilocapra americana) in two pronghorn pastures in Wyoming, USA, with vegetation dominated by mountain big sagebrush communities [2]. In marine environments, aerial surveys have typically focused on marine megafauna, such as cetaceans and elasmobranchs [38,39], because these taxa are regularly visible at the surface and their large sizes make them highly visible. Aerial surveys have been widely used for decades to estimate the local densities and population sizes of walruses and other marine mammals [92]. The National Oceanic and Atmospheric Administration (NOAA) has conducted many aerial surveys of marine mammals using manned aircraft with observers and photographs since 1976 (https://www.afsc.noaa.gov/nmml/software/bwasp-comida.php). However, real-time aerial surveys often underestimate animal numbers [91]. Marsh and Sinclair [40] reported that, in an aerial survey of dugongs (Dugong dugon) and sea turtles (Chelonia mydas) in Moreton Bay, Australia, in 1985, observers missed over 40% of the dugong groups and over 80% of the turtles visible within the transect, including groups of more than 10 dugongs. The systematic reconnaissance flight method [93] is widely used in Africa, Australia and North America for the assessment of plains and woodland wildlife [38,41]. The method involves flying systematic or random transects over the target area at a constant height above ground, with at least one observer recording wild animals in a calibrated strip on at least one side of the aircraft. A double-observer survey configuration has proven necessary to quantify and correct for the bias caused by observers failing to detect animals within the innermost distance band [2]. The method has often been criticized for its low accuracy and precision, even though an advanced version using dual experienced observers can partly compensate for visual obstruction. Despite this criticism, the method is considered the best option for obtaining relatively inexpensive coverage of large areas [38,40]. With the development of digital imaging and computer technologies, still images or videos captured from manned aircraft have been used to count animals [36,71]. Separating animal counting from the aerial flight mission gives the photography-based method several benefits over the real-time survey method, including allowing aircraft to fly at a higher altitude (high-resolution or telephoto cameras are required), revisiting the imagery or video after flights, and developing automatic counting algorithms [71,72,73]. Furthermore, aerial imagery can be collected with RGB cameras and infrared thermography at significantly higher spatial resolutions of up to 2.5 cm [23].
Therefore, aerial imagery allows the direct discernment of more small (<0.6 m) animals, such as birds [70,71,72,73], sea turtles, and fish [42]; large animals that are difficult to distinguish from the background at the species level, such as roe deer (Capreolus pygargus) and red deer (Cervus elaphus) [36]; and some animals in varying habitats and at night, such as Pacific walruses (Odobenus rosmarus) [74], red deer (Cervus elaphus), fallow deer (Dama dama), roe deer (Capreolus capreolus), wild boar (Sus scrofa), foxes, wolves and badgers [76]. Table 4 summarizes the animal species surveyed using manned aircraft.

3.4.3. UAS Surveys

Since Anderson and Gaston [19] published a review on the ecological applications of drones, several broad-ranging reviews of the wild animal applications of drones have been published [1,20,21]. More recent reviews have focused on specific niches, such as marine mammals [22]. UASs have been seen as a safer, lower-budget alternative to manned aircraft for surveying terrestrial mammals. Most UAS studies have focused on assessing the feasibility of species detection, estimating detection probabilities, and analyzing influencing factors [50] rather than monitoring large areas [1], likely because the UAS application field is still in its infancy. In many countries, aviation regulations inhibit many high-altitude, long-range, and beyond-line-of-sight (BLOS) UAS operations [49], which has stunted progress in larger fixed-wing applications. Additionally, the lack of de-icing technology has limited UAS operations in polar regions and at high altitudes, where icing is a major issue. Other technical limitations include the short endurance time of batteries and the low flight reliability of UASs.
The surveyed terrestrial mammals include roe deer, foxes [77], orangutans [45], African elephants [43], wild rabbits [94], wildebeests [95], elephants, poachers [44], koalas, deer, kangaroos [62], red deer, roe deer, and wild boar [52]. UASs have also been widely used to survey aquatic and amphibious animals because their hard-to-access habitats are hazardous to manned aircraft and pilots [20]. Early detections of American alligators (Alligator mississippiensis), white ibises (Eudocimus albus), other white wading birds, and Florida manatees (Trichechus manatus) were reported by Jones et al. [58]. In addition to live animals, four life-size foam alligator decoys were used to test the performance of a small UAS for recognizing large to midsize vertebrates. Because these species typically live in or near water, it is almost impossible to detect all individuals at one time. Recently published studies have also assessed the sighting rates of UAS surveys for species such as dugongs [51], turtles [42], and humpback whales [50]. Owing to their ability to noninvasively collect very-high-resolution aerial data, UASs have also been used to investigate the quantity, colony area, and density of species (including gentoo penguins, chinstrap penguins [57], sharks and rays [55]), assess the body size and condition of humpback whales [46], and study the movement of iguanas [65]. Birds and insects have also been subjects of UAS surveys. Because flying organisms tend to be highly sensitive to investigators, the use of UASs is considered a convenient method to access their habitats (e.g., dense forests, wetlands, and islands). The early assessment of the possibility of surveying various bird species, including white ibises and other white wading birds, using UAS imagery was reported by Jones et al. [58]. Light and small UASs also have advantages over ground surveys or conventional aerial surveys in monitoring bird colonies and locating nests. Sardà-Palomera et al. [82] reported the use of a Multiplex Twin Star II, a small electric fixed-wing UAS, to monitor temporal changes in the breeding population size of a black-headed gull colony in Catalonia, northeastern Spain, and obtained disturbance-free georeferenced data on nest locations. Hodgson et al. [54] used a 3D Robotics X8 electric octocopter to monitor the breeding of lesser frigatebirds and crested terns and the molting of royal penguins in tropical and polar environments. Another study focused on the use of UASs to detect insects, i.e., the butterfly species Libythea celtis [56].
Figure 3 shows UAS photos of four species: Elephants, lesser frigatebirds, fur seals, and royal penguins. Table 5 summarizes the animal species surveyed using UASs.

3.5. Pixel Number of Target Species in Imagery

The determination of the optimal pixel number of each animal in remote sensing imagery for wild animal surveys is difficult and becomes more complex when several species of different sizes are of interest. The optimal pixel number for species detection is mainly dependent on the body size and the contrast between the target organisms and the landscapes. The other influencing factors include image quality, possible confounding features, safe flight altitude, and cloud height (for aerial imagery). According to the Nyquist-Shannon sampling theorem [97], an object must occupy two or more pixels to be detected in the imagery. Nevertheless, more pixels raise the detection certainty, especially when several species need to be distinguished and the objects have a low color contrast with the landscape.

3.5.1. Spaceborne Surveys

Large-sized (≥0.6 m) individual animals typically cover one or more pixels in VHR satellite imagery (0.31–1 m resolution), depending on the animal size and image resolution. For example, adult wildebeests and zebras have head-and-body lengths of 1.5 to 2.5 m and 2.2 to 2.5 m, respectively, which results in image objects 3 to 4 pixels long and 1 to 2 pixels wide in a pansharpened GeoEye-1 image [29]. The average length and width of an adult female seal are 2.4 m and 1.4 m, respectively, so seals occupy 4–6 pixels in a pansharpened GeoEye-1 image [69,89]. An adult polar bear with a body length of 2–2.5 m occupies 4–5 pixels in a pansharpened GeoEye-1 image [31]. Submeter VHR satellite imagery has also been used to estimate the populations of animals that are colonial or congregate in herds, such as penguins [67]. Although emperor penguins appear as one or more pixels (when they group into close clusters) in the 0.6 m-resolution QuickBird imagery, the population of each colony can be estimated by constructing a regression between the area of these animals in the VHR images and the field-collected population numbers for each site [67].
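These pixel footprints follow directly from dividing an animal's body dimensions by the ground sample distance (GSD). The short sketch below reproduces that arithmetic; the 0.5 m GSD assumed for a pansharpened GeoEye-1 product is an approximation, and the body dimensions are those quoted above.

```python
# Approximate pixel footprint of an animal in VHR satellite imagery.
# A GSD of 0.5 m is assumed here for a pansharpened GeoEye-1 product (approximate).
def pixels_along(body_dimension_m, gsd_m):
    """Number of pixels spanned by a body dimension at a given ground sample distance."""
    return body_dimension_m / gsd_m

gsd = 0.5  # m, assumed
for name, dimension_m in [("wildebeest (length)", 2.0),
                          ("Weddell seal (length)", 2.4),
                          ("Weddell seal (width)", 1.4),
                          ("polar bear (length)", 2.5)]:
    print(f"{name}: ~{pixels_along(dimension_m, gsd):.0f} pixels")
# Output: roughly 3-5 pixels along each dimension, consistent with the
# 3-6 pixel footprints reported above for these species.
```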

3.5.2. UAS Surveys

We also investigated the pixel numbers of wild animals in UAS imagery and found that most animals cover 22–79 pixels (Figure 4). For example, Israel [77] found that, in theory, a fawn (roe deer) in a meadow can be detected with the FLIR Tau640 thermal camera from a maximum flying altitude of 166 m (two pixels per fawn), but that a flight altitude of 30 m (at which a fawn is approximately 22 pixels in length) should be used to avoid missing fawns, to ensure satisfactory illumination, and to minimize the impact of weather conditions. Some very large animals, such as elephants and leopard seals, cover over 100 pixels. This variation is partly because UAS surveys must strike a balance between image resolution and other factors, such as a safe flight altitude, cloud height, and coverage efficiency. Generally, flying at higher altitudes increases the transect strip width but also increases the risk of encountering clouds and reduces the image resolution and detection certainty [50,51]. Many studies have used UASs to detect animals, but only a few have discriminated between several species, which requires more pixels [54,57]. Dulava et al. [11] reported that a minimum pixel resolution of approximately 5 mm was necessary to identify most waterbird species, such as the canvasback, meaning that each bird covers approximately 96 pixels in length in UAS images.
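The relation between flight altitude and pixels per animal follows from the camera geometry: GSD = altitude × pixel pitch / focal length. The sketch below illustrates this scaling; the 17 μm pixel pitch matches the FLIR Tau 640 specification, whereas the 19 mm focal length and 0.6 m fawn body length are assumptions for illustration, not values taken from the cited study.

```python
# Ground sample distance (GSD) of a nadir-pointing camera and the resulting
# pixel count along an animal's body, as a function of flight altitude.
PIXEL_PITCH_M = 17e-6    # FLIR Tau 640 detector pitch (specification)
FOCAL_LENGTH_M = 19e-3   # assumed lens focal length (illustrative)
FAWN_LENGTH_M = 0.6      # assumed roe deer fawn body length (illustrative)

def gsd(altitude_m):
    """Ground footprint of one pixel at the given altitude (nadir view)."""
    return altitude_m * PIXEL_PITCH_M / FOCAL_LENGTH_M

for altitude in (30, 80, 166):
    g = gsd(altitude)
    print(f"{altitude:>3} m: GSD = {100 * g:.1f} cm, fawn spans ~{FAWN_LENGTH_M / g:.0f} pixels")
# With these assumed parameters, the fawn spans roughly 22 pixels at 30 m but only
# a few pixels at 166 m, illustrating the altitude/resolution trade-off.
```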
Manned aerial imagery and UAS imagery are similar in terms of their image resolution and flight height. The pixel number of wild animals in manned aerial imagery was not investigated.

3.6. Automatic and Semiautomatic Algorithms for Wild Animal Surveys

Manually counting animals from remote sensing imagery collected from satellites [30,34,68,69,89], manned aircraft [17,36,37,39,92], and UASs [43,58] has been performed with a high degree of accuracy and reliability for over half a century [35,69]. However, manual reviews of remote sensing imagery by inspectors and the direct observation of animals from aircraft are time-consuming and subjective [1,2]. With the improvement of remote sensing image resolution and advancements in computer technologies, researchers have developed various automatic and semiautomated counting algorithms for detecting and counting wild animals in remote sensing imagery [3,71,72,73]. Although automatic and semiautomated counting algorithms have remarkable advantages in terms of dataset processing speed, most such studies have focused on only a few remote sensing images (usually covering only a few square kilometers) in relatively homogenous environments [23,32]. In general, pixel-based image analysis and object-based image analysis are the two main image classification techniques used in remote sensing. Supervised and unsupervised classification methods are pixel-based and create a map in which each pixel is assigned to a class based on spectral information. Supervised classification uses known objects to train the algorithms and may return many errors of commission. Unsupervised classification uses statistical algorithms to group pixels based on spectral information, identifying objects with limited user input, and thus may yield classes that are not meaningful [32]. Object-based image classification has been used more recently for high-resolution data in which each target covers multiple pixels [72]. It uses multiresolution segmentation or segment mean shifts to produce homogenous image objects at different scales by grouping pixels. These objects can then be analyzed in elaborate ways based on a large variety of spatial, spectral, and textural attributes.

3.6.1. Pixel-Based Methods

Pixel-based methods, such as supervised classification, unsupervised classification, and thresholding, are the simplest and most common methods for detecting animals in spaceborne imagery, because even large animals cover only one to several pixels in VHR spaceborne imagery and object-based image classification is less useful for producing homogenous image objects in such data [32,53,71]. Fretwell et al. [32] reported that a simple thresholding technique applied to the panchromatic and coastal (400–450 nm) bands of the WorldView-2 satellite produced better results for counting southern right whales than three other supervised and unsupervised classification methods, finding 84.6% of all manually digitized whales and 89% of the objects manually classed as probable whales, with 23.7% false positives. Descamps et al. [71] developed an unsupervised model for detecting greater flamingos in aerial images by fitting birds as bright ellipses surrounded by a darker background and delivered counts with good precision compared with manual counts by an expert (5% difference). Some studies also found that simple threshold-based methods remain adequate for detecting animals in low-resolution UAS thermal and RGB imagery. Seymour et al. [53] constructed a threshold-based seal detection model in the ArcGIS ModelBuilder programming environment to automatically detect and count gray seals in UAS thermal imagery, and the automated counts were 95–98% similar to the human estimates. Gonzalez et al. [62] developed an algorithm that used threshold segmentation and template matching to count and track koalas and deer in UAS RGB and thermal videos. Thresholding and unsupervised classification algorithms have also been developed to detect and count chickens in agricultural fields from thermal imagery to reduce the animal mortality caused by agricultural machinery [94] and to identify black-faced spoonbills (Platalea minor) in UAS RGB imagery to support an upcoming annual international black-faced spoonbill census [98]. Although the thresholding method performs adequately when the targets have similar gray values and differ markedly from their background, it is less accurate in more complex areas. Thresholding also requires users to specify thresholds manually. Yang et al. [29] used a hybrid image classification method that combines pixel-based and object-based approaches to detect and count wildebeests and zebras in a GeoEye-1 satellite image of open savanna and generated good results, with an average count error of 8.2%, an omission error of 6.6% and a commission error of 13.7%. The hybrid method first applied a pixel-based image classification method, i.e., an artificial neural network (ANN), to classify potential targets with similar spectral reflectance values at the pixel level. An object-based image classification method was then used to further differentiate animal targets from the surrounding landscapes. However, the method requires the input of a number of parameters and is therefore subjective and labor-intensive. Xue et al. [66] developed a semisupervised object-based method that combines a wavelet algorithm and an adaptive-network-based fuzzy inference system (ANFIS) to detect and count wildebeests and zebras in a single VHR GeoEye-1 panchromatic image of open savanna. The ANFIS, which combines machine learning and a fuzzy system, can not only use existing expert knowledge but also learn from data and is thus efficient and stable.
The accuracy of the method is significantly higher than that of the traditional threshold-based method (0.79 vs. 0.58), which uses only gray-value thresholding as a simple image segmentation method to divide an image into objects and background.
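To make the threshold-plus-segmentation idea concrete, here is a minimal sketch that thresholds a single-band (e.g., thermal) image and counts connected components as candidate animals. The threshold and blob-size limits are placeholders that would need tuning per sensor, altitude, and species; this is not the pipeline of any cited study.

```python
import numpy as np
from scipy import ndimage

def count_warm_targets(image, threshold, min_pixels=4, max_pixels=400):
    """Count blobs of above-threshold pixels as candidate animals.

    image: 2-D array of brightness temperatures or grey values.
    threshold, min_pixels, max_pixels: placeholder parameters that would need
    tuning for a specific sensor, flight altitude, and species.
    """
    mask = image > threshold                                 # simple global threshold
    labels, n = ndimage.label(mask)                          # connected-component labelling
    sizes = ndimage.sum(mask, labels, range(1, n + 1))       # blob sizes in pixels
    keep = (sizes >= min_pixels) & (sizes <= max_pixels)     # reject noise and large warm areas
    return int(keep.sum())

# Toy example: a synthetic 'thermal' scene with three warm blobs on a cool background.
rng = np.random.default_rng(0)
scene = rng.normal(20.0, 0.5, size=(200, 200))
for r, c in [(50, 50), (120, 80), (160, 170)]:
    scene[r:r + 4, c:c + 3] += 15.0
print(count_warm_targets(scene, threshold=30.0))             # -> 3
```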

3.6.2. Object-Based Methods and Machine Learning

Aerial imagery collected from manned aircraft and UASs typically has a higher resolution than satellite imagery, and each target animal covers more pixels (most animals cover 22–79 pixels in UAS imagery). Therefore, it is possible to use more features, including texture and shape features, to develop more complex and robust automatic counting algorithms, such as object-based algorithms [72] and machine learning [44,95,96], to detect animals in more complex environments. Chabot et al. [72] developed a systematic, repeatable approach using a commercial remote sensing image analysis program, ENVI 5.3 (Exelis Visual Information Solutions, Boulder, CO, USA), bundled with the OBIA-capable Feature Extraction module and the Interactive Data Language (IDL) programming application, to detect and count lesser snow geese (Chen caerulescens) in the Canadian Arctic in large numbers of manned aerial images at 4- and 5-cm resolutions (totaling 290 files and 234 GB) covering a variety of landscapes, numerous confounding features, and varying illumination conditions and exposure levels. Compared with manual counts, the object-based approach produced overall accurate estimates of goose numbers (R2 = 0.998, regression coefficient = 0.974) in 41 test images drawn from several breeding colonies. Torney et al. [95] developed an automatic counting method that uses rotation-invariant object descriptors combined with machine learning algorithms to detect and count wildebeests in aerial images collected in the Serengeti National Park. The per-image error rates were greater than, but comparable to, those of two separate human counts; for the total count, the algorithm was more accurate than both manual counts, suggesting that human counters tend to systematically over- or undercount images, and the recognition accuracy was 74.15%. Olivares-Mendez et al. [44] developed a UAS that uses vision sensors to detect and track elephants and poachers automatically. In that work, a low-dimensional subspace representation scheme was applied to model 3D animals, an online incremental learning approach was used to learn and update the appearance of a 3D animal, and a particle filter with a hierarchical tracking strategy was employed to estimate the motion model of a 3D animal. Christiansen et al. [94] developed a thermal feature extraction algorithm that used the discrete cosine transform to parameterize the thermal signature and a k-nearest-neighbor (kNN) classifier to automatically discriminate wild rabbits and chickens from nonanimal objects; the classification accuracy was 93.3% at an altitude range of 3–10 m. Longmore et al. [96] developed a UAS that could detect cows in thermal images based on astronomical detection software with machine learning algorithms. The average detection accuracy at low altitudes (<80 m) was 70%, with a scatter of approximately 10%, depending on variations in the background levels. Rey et al. [3] proposed a semiautomated, data-driven active learning system based on an object proposal strategy combined with an ensemble of exemplar support vector machine (EESVM) models to detect large mammals, including common elands (Taurotragus oryx), greater kudus (Tragelaphus strepsiceros), and gemsboks (Oryx gazella), in the semiarid African savanna from 6500 RGB UAS images, achieving a recall rate of 75% at a precision of 10%. The data-driven system was trained with crowd-sourced annotations provided by volunteers.
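As a simplified illustration of the feature-plus-classifier pattern underlying several of these studies, the sketch below extracts a few hand-crafted statistics from candidate image patches and classifies them with a k-nearest-neighbor model. The features, patch size, and synthetic training data are all illustrative assumptions; the cited studies use far richer descriptors (DCT coefficients, rotation-invariant descriptors, exemplar SVMs) and real annotated imagery.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def patch_features(patch):
    """Simple hand-crafted features for a candidate patch: intensity and contrast
    statistics. Purely illustrative; not the descriptors used in the cited studies."""
    p = patch.astype(float)
    return np.array([p.mean(), p.std(), p.max() - p.min(),
                     (p > p.mean()).mean()])        # fraction of bright pixels

# Training patches would come from manually annotated imagery; synthetic here.
rng = np.random.default_rng(1)
animal  = [rng.normal(35, 2, (8, 8)) for _ in range(50)]   # warm, compact targets
clutter = [rng.normal(22, 4, (8, 8)) for _ in range(50)]   # cooler, noisier background
X = np.array([patch_features(p) for p in animal + clutter])
y = np.array([1] * 50 + [0] * 50)

clf = KNeighborsClassifier(n_neighbors=5).fit(X, y)
print(clf.predict([patch_features(rng.normal(34, 2, (8, 8)))]))   # -> [1] (animal)
```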

3.6.3. Deep Learning

Over the last several years, the machine learning community has realized that deep learning can learn functions that traditional, shallower machine learning models, such as the ANN [29], are often unable to learn. Major breakthroughs in deep learning have been made, and this technique has become an extremely powerful tool for processing large remote sensing data sets [99]. The significant difference between deep learning and classic visual recognition methods is that deep learning methods automatically learn hierarchical features from a huge amount of data rather than requiring features to be engineered by hand. Convolutional neural networks (CNNs), a recent family of deep learning algorithms, normally have multiple hidden layers; thus, they are able to extract more useful feature representations from a large number of input images for object detection and have been used to detect large mammals in large datasets with a higher accuracy than that of traditional machine learning algorithms, such as the EESVM (80% correct detections at a precision of 30% [100] vs. 75% correct detections at a precision of 10% [3]). However, extremely large training datasets are needed when applying deep learning, and it is a black-box solution, so the trained models are difficult to interpret [99]. These issues are controversial within the remote sensing community. Furthermore, it is difficult to obtain a practical balance between recall rates and false positives when using deep learning [100].
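For illustration, a minimal CNN patch classifier of the kind that underlies such detection pipelines is sketched below in PyTorch. The architecture, patch size, and class count are arbitrary assumptions; the cited studies use substantially larger networks together with region-proposal or sliding-window stages and large annotated training sets.

```python
import torch
import torch.nn as nn

class PatchCNN(nn.Module):
    """Tiny CNN that scores 64x64 RGB patches as animal vs. background.
    Layer sizes are arbitrary; real detection pipelines use much larger networks."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16 -> 8
        )
        self.classifier = nn.Linear(64 * 8 * 8, 2)   # logits: background / animal

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = PatchCNN()
dummy = torch.randn(4, 3, 64, 64)     # a batch of four image patches
print(model(dummy).shape)             # torch.Size([4, 2])
```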
Overall, the current automatic detection algorithms remain rudimentary and cannot easily outperform human counts in most cases. Automated counting can be used as a first-pass count [100] or as a means of assessing the performance of amateur counters [95]. Semiautomated counting, i.e., automated counting combined with manual counting, would therefore be an ideal way to apply these methods.

3.7. Accuracy of Remote Sensing-Based Counts

As mentioned above, automated and semiautomated counts of animals in remote sensing imagery are usually highly correlated with manual counts when these algorithms are applied to small areas in relatively homogenous environments [25,34], achieving overall accuracies of >70% [32,44,72,95,96]. However, the recognition accuracy decreases dramatically as the survey area, number of species and environmental complexity increase [95,96]. This trend holds even when using state-of-the-art models, such as CNNs. Recent work on the detection of large mammals in a large mammal reserve shows that, at a recall rate of 80%, between 3 and 20 false positives should be expected for each true positive [3,100].
Manual counts of animals derived from different remote sensing imagery and ground-based counts collected within a short time interval are also reported to be highly correlated. The numbers of seals manually recognized from pansharpened GeoEye-1 images and ground counts show a significant relationship (R2 = 0.9062) [69]. The estimate of polar bears counted from WorldView-1/2 and Quickbird imagery (N: 94; 95% confidence interval: 92–105) was remarkably similar to an abundance estimate derived from a line transect aerial survey conducted a few days earlier (N: 102; 95% confidence interval: 69–152) [30]. LaRue et al. [35] reported a strong, positive correlation (r = 0.98, df = 3, P < 0.003) between the ground counts of seals and counts derived from the Quickbird-2 and WorldView-1 images at overlapping locations within Erebus Bay at the same time. However, intrinsic (e.g., group size and animal activity) and extrinsic factors (e.g., lighting conditions, occlusion, flight speeds, image quality, vegetation composition and structure) are likely to contribute to the failure to detect all animals within the surveyed areas [2]. Therefore, remote sensing-based counts, including those derived from real-time aerial survey technology [40,91], often underestimate animal populations, especially for animals living in high-vegetation areas [36,62] and aquatic environments [42,50,51], but employing high-resolution imagery and additional observers [2] increases the detection probability [11].
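The agreement statistics quoted in this subsection (R2, regression coefficients, Pearson's r) are computed from paired counts of the same sites or images. The sketch below shows that computation on made-up paired counts; the numbers are not data from any cited study.

```python
import numpy as np

# Made-up paired counts, purely to show how the agreement statistics reported
# in the text (regression slope, R^2, Pearson's r) are computed from paired data.
manual    = np.array([12, 45, 103, 230, 310, 498])   # e.g., ground or expert counts
automated = np.array([11, 47,  99, 221, 301, 480])   # e.g., algorithm or image counts

slope, intercept = np.polyfit(manual, automated, 1)  # least-squares regression line
r = np.corrcoef(manual, automated)[0, 1]             # Pearson correlation
print(f"slope = {slope:.3f}, intercept = {intercept:.2f}, r = {r:.4f}, R^2 = {r**2:.4f}")
```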

4. Discussion

This paper has provided a brief overview of wild animal surveys based on spaceborne, manned aerial, and UAS data to date, and it lists the platforms, sensors, data, surveyed species, animal detection methods, and detection accuracies. The capabilities and limitations of each type of data are also discussed.

4.1. Spaceborne Surveys

Low-spatial-resolution spaceborne imagery has primarily been used for characterizing and assessing changes in wild animal habitats, whereas VHR satellite imagery has so far been used only to directly monitor large (>0.6 m) individual animals, such as wildebeests, zebras [29], and southern right whales [33], or to estimate the populations of animals that are colonial or congregate in groups, such as penguins [67]. Notably, most previous VHR surveys have used imagery with a resolution of 0.4–1 m (e.g., GeoEye-1, WorldView-2, Quickbird-2, and IKONOS). To date, only a limited number of studies have used higher-resolution WorldView-3 data (e.g., Fretwell et al. [32] and Cubaynes et al. [34]). Satellite imagery offers several potential advantages over aerial imagery collected from manned aircraft or UASs, including larger geographic coverage (up to global) and regular data collection, and it has the potential to support models of the past, present, and future populations of large wild animals. Satellite surveys require little regulatory or logistical effort, are safe, and do not disturb the target animals. As satellite image resolution continues to improve, it will become possible to monitor more species from space. Nevertheless, although revisit times may improve with additional satellite launches, satellite imagery will not entirely replace conventional aerial surveys in the near future because of its significantly lower resolution [68]. Even the highest-resolution satellite imagery, such as that from WorldView-3 and -4 (0.31 m), is still not sufficient to discern small (<0.6 m) animals [29] at the species level. According to the Nyquist–Shannon sampling theorem [97], an animal must occupy two or more pixels in the imagery to avoid information loss; target animals that occupy only 1–2 pixels in VHR imagery cannot be distinguished as different species, especially when the target species have similar colors and body sizes. In addition, open habitat landscapes and a high color contrast between the target organisms and the landscape are necessary for using VHR imagery to estimate animal abundance [12].
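Under this two-pixel criterion, the minimum body length that can be resolved is roughly twice the ground sample distance (GSD), i.e., Lmin ≈ 2 × GSD. For WorldView-3/4 panchromatic imagery with a GSD of 0.31 m, this gives Lmin ≈ 0.62 m, consistent with the approximately 0.6 m size threshold cited above. This is only a back-of-the-envelope bound; contrast, atmospheric effects and off-nadir viewing push the practical limit higher.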

4.2. Manned Aerial Surveys

Manned aerial surveys, including real-time (observer-based) and photography-based methods, have been used routinely to study wild animals for decades [23,37,39,70,73]. Aerial imagery can be collected at significantly higher spatial resolutions (up to 2.5 cm) than satellite imagery [23], which allows smaller (<0.6 m) animals, such as birds [70], sea turtles, and fish [42], to be surveyed directly. Manned aerial vehicles typically have longer endurance than UASs, allowing large-scale wild animal surveys. However, although manned aerial survey protocols are mature and have been refined over decades of continual use, they are expensive and risky to implement, and large manned aircraft can cause significant disturbance to wild animal behavior because of their noise. Manned aerial surveys are therefore increasingly being replaced by UAS surveys [1,43].
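The achievable image resolution follows from the standard photogrammetric relation between camera and flight parameters: GSD ≈ (pixel pitch × flight height) / focal length. As a purely illustrative example with assumed values, a camera with a 5 µm pixel pitch and a 50 mm lens flown at 250 m yields a GSD of about (5 × 10⁻⁶ m × 250 m) / 0.05 m = 2.5 cm, on the order of the finest resolutions reported for manned photographic surveys; the actual sensors and altitudes used in the reviewed studies vary (Table 4).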

4.3. UAS Surveys

As a newer remote sensing approach, UASs can easily access areas that are impossible or very dangerous for humans or manned aircraft to enter; transect placement is therefore less restricted by natural barriers such as rivers and steep mountains. The use of UASs not only improves researcher safety and the reliability of results but also has the potential to reduce disturbance of the target species [56]. As UAS use increases, costs will decrease further, and advances in platforms and sensors may yield richer wild animal data and results that are more accurate than those of human surveys, increasing the potential for collecting unprecedented amounts of data on wild animal population distributions, abundances, behavior and habitat use [50]. However, UAS data acquisition efficiency is severely limited by short endurance and sensor resolution, both of which constrain the area that can be covered (a rough coverage calculation is sketched at the end of this subsection). Among the reviewed studies, only two surveyed geographic areas larger than 2.0 km2, with a maximum of 35.2 km2 (a humpback whale survey).
In addition, many authors expressed understandable concern over legislation regulating the use of UASs. In most countries, UASs are not permitted to fly out of sight of the operator or close to densely populated areas unless specific permission has been granted [49]. Existing UAS operation regulations and national privacy and data protection rules are applied rigidly to data captured by both civil and military UASs, which has hindered the application of civilian UASs to wild animal surveys. Fortunately, some countries have recently made positive legislative efforts. The South African Civil Aviation Authority (SACAA) has drafted a regulation indicating that flights can be performed as long as they are conducted over wild animal areas with low manned aircraft activity and are not close to registered active airfields [101], which will allow UAS-based wild animal surveys to be carried out more quickly and with less bureaucracy. The US Federal Aviation Administration (FAA) passed the small unmanned aircraft rule (Part 107) in 2016, which raised the “blanket” altitude authorization under the Section 333 exemption for aircraft weighing less than 55 pounds and for government aircraft operators from 200 feet to 400 feet [49]; this has significantly reduced the workload of airspace applications and made UAS surveys of wild animals easier. The USA has also created six test sites to begin integrating UASs into civil airspace, and civil safety agencies are allowed to use these sites [1]. China has opened civil airspace at thirteen sites located across the country where researchers can use civilian UASs without permits [102], and a safe operation manual for lightweight UASs has been provided to support scientific research in the UK [103].
Nevertheless, UAS users in many countries may not know where or how to apply for permits, and applying for them can be costly and time consuming; many UAS experiments have thus been delayed or carried out without permission. Recently, government apps (e.g., “Can I fly there?” in Australia) have been developed to provide readily accessible airspace information to everyone.
Although such flexibility significantly facilitates researchers’ activities and pushes legislation in a positive direction, airspace application procedures should be simplified, or exemptions developed, for applications that pose no obvious risk to national security or public safety, such as wild animal surveys in remote or sparsely inhabited regions. The authors believe that these problems can be overcome in the near future. UAS remote sensing will then serve as an effective alternative to manned aerial and ground-based wild animal surveys and will provide technical support for the dynamic monitoring of wild animal populations at small scales.
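The coverage limitation noted above can be made concrete with a simple strip-mapping calculation. The sketch below is a rough illustration only: all parameter values (endurance, cruise speed, image swath, side overlap) are hypothetical assumptions rather than figures taken from the reviewed studies, and turns, climbs and battery reserves are ignored.

```python
def uas_coverage_km2(endurance_min: float, speed_m_s: float,
                     swath_m: float, side_overlap: float = 0.7) -> float:
    """Rough strip-mapping coverage: total flight-line length times effective swath width."""
    effective_swath_m = swath_m * (1.0 - side_overlap)  # spacing between adjacent flight lines
    line_length_m = endurance_min * 60.0 * speed_m_s    # distance flown, ignoring turns
    return line_length_m * effective_swath_m / 1e6      # m^2 -> km^2

# Hypothetical small electric fixed-wing UAS: 45 min endurance, 15 m/s cruise,
# 200 m image swath and 70% side overlap -> roughly 2.4 km^2 per flight.
print(round(uas_coverage_km2(45.0, 15.0, 200.0), 1))
```

Even with optimistic assumptions, a single flight of a small UAS covers only a few square kilometres, which is consistent with the survey areas summarized in Table 5.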

5. Perspectives

The use of multiresolution spaceborne imagery has shown potential for global, synoptic surveys of a species, as Fretwell et al. demonstrated with Landsat TM and QuickBird satellite images for surveying emperor penguins [67]. Similarly, fusing the fine spatial resolution of UAS data, the broad spatial coverage of VHR spaceborne imagery and the high temporal resolution of animal movement data collected via GPS tracking technologies [104] will provide critical information for monitoring wild animals over large areas. Such data fusion is particularly attractive for understanding the dynamics of large wild animal species, such as elephants and wildebeests, especially when multiple target species are surveyed. Spaceborne remote sensing has the unique advantage of assessing the dynamics of wild animals and their habitats together, and the numbers of large animals or colonies can be roughly estimated from VHR satellite imagery such as QuickBird and WorldView. Although imagery with higher spectral, temporal and spatial resolution from forthcoming satellites (e.g., the DigitalGlobe Legion constellation) will improve our ability to detect small animals and should continue to receive attention, it will remain almost impossible to detect juvenile individuals or capture the details of small species in satellite imagery, even if the resolution improves greatly in the future. UASs can offer scientists fine-resolution data for colony-scale wild animal surveys at user-controlled revisit periods. Combining rough counts of large animals or colonies estimated from spaceborne images with the population structure precisely extracted from UAS images at each colony and with animal movement data would not only allow animal populations to be assessed more accurately but could also be used to model past, present, and future resource suitability. For example, van Toor et al. [104] built a data-driven framework to map resource suitability related to the foraging of white storks (Ciconia ciconia) by combining multitemporal Landsat data with white stork movement data collected using GPS tracking devices. This fusion approach will also be helpful for estimating the populations of free-grazing livestock and other large wild herbivores, which is essential for many applications, such as assessing the forage–livestock balance.
Manual review of the remote sensing imagery collected from satellites, manned aircraft, and UASs is costly, time consuming and subjective [1,2]. Previous studies have therefore attempted to develop reliable algorithms to automatically detect, count and track animals in place of human observers in the field [3,44,62]. However, most of these algorithms are pixel-based or object-based and have been applied only to small areas or a few images. Deep learning algorithms can automatically learn hierarchical features from large amounts of data and have shown greater reliability and better performance in detecting animals than traditional machine learning algorithms [100]. The models used to date are mainly common CNNs [7,100], and the accuracy of animal recognition and classification is still low when these models are applied to large datasets [100,105]. Active learning and transfer learning can be used to enhance a CNN’s performance incrementally, as proposed by Norouzzadeh et al. [7]. Detection speed can be improved with recently developed deep learning architectures, such as Faster R-CNN [105,106], and the body sizes of target animals can be extracted from the mask polygons generated by Mask R-CNN [107]. An alternative approach for quickly counting animals in remote sensing imagery is to recruit volunteers through crowdsourcing platforms to tag animals in the imagery; these human classifications can then be used to train deep learning models to perform better in the future [3,23]. Existing open crowdsourcing platforms include Geo-Wiki (which allows citizens to provide feedback on existing information overlaid on satellite imagery or to contribute entirely new data, http://www.geo-wiki.org/), Penguin Watch (which enables citizens to tag penguins in aerial or camera-trap imagery, https://www.zooniverse.org), MicroMappers (https://micromappers.wordpress.com/) [3], and Amazon Mechanical Turk (which enables individuals and companies to outsource processes and jobs to a distributed workforce that can perform these tasks virtually, https://www.mturk.com/). Using automated methods in combination with manual methods would improve the efficiency of wild animal surveys.
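As an illustration of how an off-the-shelf detection architecture of this kind can be applied to image tiles, the following minimal sketch uses the Faster R-CNN implementation shipped with torchvision (assuming a recent torchvision version). It is not the pipeline of any of the cited studies: the COCO-pretrained weights loaded here know nothing about wildlife seen from above and would have to be fine-tuned on annotated aerial chips of the target species, and the tile file name is hypothetical.

```python
# Illustrative use of torchvision's Faster R-CNN to count detections in one image tile.
# COCO-pretrained weights are a placeholder; real surveys require fine-tuning on
# annotated aerial imagery of the target species.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def count_detections(tile_path: str, score_threshold: float = 0.5) -> int:
    """Return the number of detected boxes scoring above the threshold in one tile."""
    image = to_tensor(Image.open(tile_path).convert("RGB"))
    with torch.no_grad():
        output = model([image])[0]   # dict with 'boxes', 'labels' and 'scores'
    return int((output["scores"] > score_threshold).sum())

# e.g. count_detections("uas_tile_0001.jpg")  # hypothetical tile exported from an orthomosaic
```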
Access to various remote sensing data is becoming cheaper and easier, but limited expertise in data collection and analysis has restricted the wider use of these data in animal surveys [108]. Future ecologists are advised to develop expertise in data collection and processing to avoid costly delays and inaccurate data collection and processing. Data collection and processing manuals, covering the choice of platforms, sensors, flight altitudes and transects, should also be developed specifically for different species so that target species can be surveyed with the desired accuracy while maximizing the sampling area. Existing ground-based technical manuals for terrestrial wild animal surveys [109] provide a valuable reference for UAS and VHR data collection, including transect design and population estimation theory. However, detecting species of different sizes requires different imaging systems and image resolutions.

Author Contributions

D.W. conceived the original study, processed the data, and wrote the manuscript. Q.S. and H.Y. contributed to the editing of the manuscript.

Funding

This work was performed with financial support from the National Key R&D Program of China (2017YFB0503005; 2017YFC0506505), the National Natural Science Foundation of China (41771388; 41501416) and Tianjin Intelligent Manufacturing Project (Autonomous control UAS network remote sensing observation technology and application).

Acknowledgments

The authors thank the DigitalGlobe Foundation, USA, for granting permission to use the QuickBird and GeoEye-1 images shown in Figure 2.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Linchant, J.; Lisein, J.; Semeki, J.; Lejeune, P.; Vermeulen, C. Are unmanned aircraft systems (uass) the future of wildlife monitoring? A review of accomplishments and challenges. Mamm. Rev. 2015, 45, 239–252. [Google Scholar] [CrossRef]
  2. Smyser, T.J.; Guenzel, R.J.; Jacques, C.N.; Garton, E.O. Double-observer evaluation of pronghorn aerial line-transect surveys. Wildl. Res. 2016, 43, 474–481. [Google Scholar] [CrossRef]
  3. Rey, N.; Volpi, M.; Joost, S.; Tuia, D. Detecting animals in african savanna with uavs and the crowds. Remote Sens. Environ. 2017, 200, 341–351. [Google Scholar] [CrossRef]
  4. Sibanda, M.; Murwira, A. Cotton fields drive elephant habitat fragmentation in the mid zambezi valley, zimbabwe. Int. J. Appl. Earth Obs. Geoinf. 2012, 19, 286–297. [Google Scholar] [CrossRef]
  5. Crewe, T.; Taylor, P.; Mackenzie, S.; Lepage, D.; Aubry, Y.; Crysler, Z.; Finney, G.; Francis, C.; Guglielmo, C.; Hamilton, D. The motus wildlife tracking system: A collaborative research network to enhance the understanding of wildlife movement. Avian Conserv. Ecol. 2017, 12, 8. [Google Scholar]
  6. Borchers, D.L.; Buckland, S.T.; Zucchini, W. Estimating Animal Abundance; Springer: London, UK, 2002. [Google Scholar]
  7. Norouzzadeh, M.S.; Nguyen, A.; Kosmala, M.; Swanson, A.; Palmer, M.S.; Packer, C.; Clune, J. Automatically identifying, counting, and describing wild animals in camera-trap images with deep learning. Proc. Natl. Acad. Sci. USA 2018, 115, 5716–5725. [Google Scholar] [CrossRef]
  8. Witmer, G.W. Wildlife population monitoring: Some practical considerations. Wildl. Res. 2005, 32, 259–263. [Google Scholar] [CrossRef]
  9. Sauer, J.R.; Link, W.A.; Fallon, J.E.; Pardieck, K.L.; Ziolkowski Jr, D.J. The north american breeding bird survey 1966–2011: Summary analysis and species accounts. N. Am. Fauna 2013, 79, 1–32. [Google Scholar] [CrossRef]
  10. Betts, M.G.; Mitchell, D.; Dlamond, A.W.; Bety, J. Uneven rates of landscape change as a source of bias in roadside wildlife surveys. J. Wildl. Manag. 2007, 71, 2266–2273. [Google Scholar] [CrossRef]
  11. Dulava, S.; Bean, W.T.; Richmond, O.M.W. Applications of unmanned aircraft systems (uas) for waterbird surveys. Environ. Pract. 2015, 17, 201–210. [Google Scholar] [CrossRef]
  12. LaRue, M.A.; Stapleton, S.; Anderson, M. Feasibility of using high-resolution satellite imagery to assess vertebrate wildlife populations. Conserv. Biol. 2017, 31, 213–220. [Google Scholar] [CrossRef] [PubMed]
  13. Pettorelli, N.; Laurance, W.F.; O’Brien, T.G.; Wegmann, M.; Nagendra, H.; Turner, W. Satellite remote sensing for applied ecologists: Opportunities and challenges. J. Appl. Ecol. 2014, 51, 839–848. [Google Scholar] [CrossRef]
  14. Turner, W. Sensing biodiversity. Science 2014, 346, 301–302. [Google Scholar] [CrossRef]
  15. Kuenzer, C.; Ottinger, M.; Wegmann, M.; Guo, H.; Wang, C.; Zhang, J.; Dech, S.; Wikelski, M. Earth observation satellite sensors for biodiversity monitoring: Potentials and bottlenecks. Int. J. Remote Sens. 2014, 35, 6599–6647. [Google Scholar] [CrossRef]
  16. Michaud, J.S.; Coops, N.C.; Andrew, M.E.; Wulder, M.A.; Brown, G.S.; Rickbeil, G.J.M. Estimating moose (alces alces) occurrence and abundance from remotely derived environmental indicators. Remote Sens. Environ. 2014, 152, 190–201. [Google Scholar] [CrossRef]
  17. Ottichilo, W.K.; De Leeuw, J.; Skidmore, A.K.; Prins, H.H.T.; Said, M.Y. Population trends of large non-migratory wild herbivores and livestock in the masai mara ecosystem, kenya, between 1977 and 1997. Afr. J. Ecol. 2000, 38, 202–216. [Google Scholar] [CrossRef]
  18. Stapleton, S.; Peacock, E.; Garshelis, D. Aerial surveys suggest long-term stability in the seasonally ice-free foxe basin (nunavut) polar bear population. Mar. Mammal Sci. 2016, 32, 181–201. [Google Scholar] [CrossRef]
  19. Anderson, K.; Gaston, K.J. Lightweight unmanned aerial vehicles will revolutionize spatial ecology. Front. Ecol. Environ. 2013, 11, 138–146. [Google Scholar] [CrossRef] [Green Version]
  20. Chabot, D.; Bird, D.M. Wildlife research and management methods in the 21st century: Where do unmanned aircraft fit in? J. Unmanned Veh. Syst. 2015, 3, 137–155. [Google Scholar] [CrossRef] [Green Version]
  21. Christie, K.S.; Gilbert, S.L.; Brown, C.L.; Hatfield, M.; Hanson, L. Unmanned aircraft systems in wildlife research: Current and future applications of a transformative technology. Front. Ecol. Environ. 2016, 14, 241–251. [Google Scholar] [CrossRef]
  22. Fiori, L.; Doshi, A.; Martinez, E.; Orams, M.B.; Bollardbreen, B. The use of unmanned aerial systems in marine mammal research. Remote Sens. 2017, 9, 543. [Google Scholar] [CrossRef]
  23. Hollings, T.; Burgman, M.; van Andel, M.; Gilbert, M.; Robinson, T.; Robinson, A. How do you find the green sheep? A critical review of the use of remotely sensed imagery to detect and count animals. Methods Ecol. Evol. 2018, 9, 881–892. [Google Scholar] [CrossRef]
  24. Fretwell, P.T.; Trathan, P.N. Penguins from space: Faecal stains reveal the location of emperor penguin colonies. Glob. Ecol. Biogeogr. 2009, 18, 543–552. [Google Scholar] [CrossRef]
  25. Schwaller, M.R.; Southwell, C.J.; Emmerson, L.M. Continental-scale mapping of adélie penguin colonies from landsat imagery. Remote Sens. Environ. 2013, 139, 353–364. [Google Scholar] [CrossRef]
  26. Schwaller, M.R.; Olson, C.E.; Ma, Z.Q.; Zhu, Z.L.; Dahmer, P. A remote-sensing analysis of adelie penguin rookeries. Remote Sens. Environ. 1989, 28, 199–206. [Google Scholar] [CrossRef]
  27. Loffler, E.; Margules, C. Wombats detected from space. Remote Sens. Environ. 1980, 9, 47–56. [Google Scholar] [CrossRef]
  28. Wilschut, L.I.; Heesterbeek, J.A.P.; Begon, M.; de Jong, S.M.; Ageyev, V.; Laudisoit, A.; Addink, E.A. Detecting plague-host abundance from space: Using a spectral vegetation index to identify occupancy of great gerbil burrows. Int. J. Appl. Earth Obs. Geoinf. 2018, 64, 249–255. [Google Scholar] [CrossRef]
  29. Yang, Z.; Wang, T.; Skidmore, A.K.; De, L.J.; Said, M.Y.; Freer, J. Spotting east african mammals in open savannah from space. PLoS ONE 2014, 9, e115989. [Google Scholar] [CrossRef]
  30. Stapleton, S.; Larue, M.; Lecomte, N.; Atkinson, S.; Garshelis, D.; Porter, C.; Atwood, T. Polar bears from space: Assessing satellite imagery as a tool to track arctic wildlife. PLoS ONE 2014, 9, e101513. [Google Scholar] [CrossRef]
  31. Platonov, N.G.; Mordvintsev, I.N.; Rozhnov, V.V. The possibility of using high resolution satellite images for detection of marine mammals. Biol. Bull. 2013, 40, 197–205. [Google Scholar] [CrossRef]
  32. Fretwell, P.T.; Scofield, P.; Phillips, R.A. Using super-high resolution satellite imagery to census threatened albatrosses. Ibis 2017, 159, 481–490. [Google Scholar] [CrossRef]
  33. Fretwell, P.T.; Staniland, I.J.; Forcada, J. Whales from space: Counting southern right whales by satellite. PLoS ONE 2014, 9, e88655. [Google Scholar] [CrossRef]
  34. Cubaynes, H.C.; Fretwell, P.T.; Bamford, C.; Gerrish, L.; Jackson, J.A. Whales from space: Four mysticete species described using new vhr satellite imagery. Mar. Mamm. Sci. 2018, 1, 1–26. [Google Scholar] [CrossRef]
  35. LaRue, M.A.; Rotella, J.J.; Garrott, R.A.; Siniff, D.B.; Ainley, D.G.; Stauffer, G.E.; Porter, C.C.; Morin, P.J. Satellite imagery can be used to detect variation in abundance of weddell seals (leptonychotes weddellii) in erebus bay, antarctica. Polar Biol. 2011, 34, 1727–1737. [Google Scholar] [CrossRef]
  36. Li, S. Aerial surveys of wildlife resources. Jilin For. Sci. Tech. 1985, 1, 50–51. [Google Scholar]
  37. Wiig, Ø.; Bakken, V. Aerial strip surveys of polar bears in the barents sea. Polar Res. 1990, 8, 309–311. [Google Scholar] [CrossRef]
  38. Kessel, S.T.; Gruber, S.H.; Gledhill, K.S.; Bond, M.E.; Perkins, R.G. Aerial survey as a tool to estimate abundance and describe distribution of a carcharhinid species, the lemon shark, negaprion brevirostris. J. Mar. Biol. 2013, 2013, 597383. [Google Scholar] [CrossRef]
  39. Andriolo, A.; Martins, C.; Engel, M.H.; Pizzorno, J.L.; Más-Rosa, S.; Freitas, A.C.; Morete, M.E.; Kinas, P.G. The first aerial survey to estimate abundance of humpback whales (megaptera novaeangliae) in the breeding ground off brazil (breeding stock a). J. Cetacean Res. Manag. 2006, 8, 307–311. [Google Scholar]
  40. Marsh, H.; Sinclair, D.F. An experimental evaluation of dugong and sea turtle aerial survey techniques. Aust. Wildl. Res. 1989, 16, 639–650. [Google Scholar] [CrossRef]
  41. Stoner, C.; Caro, T.; Mduma, S.; Mlingwa, C.; Sabuni, G.; Borner, M. Assessment of effectiveness of protection strategies in tanzania based on a decade of survey data for large herbivores. Conserv. Biol. 2007, 21, 635–646. [Google Scholar] [CrossRef]
  42. Martin, S.L.; Van Houtan, K.S.; Jones, T.T.; Aguon, C.F.; Gutierrez, J.T.; Tibbatts, R.B.; Wusstig, S.B.; Bass, J.D. Five decades of marine megafauna surveys from micronesia. Front. Mar. Sci. 2016, 2, 1–13. [Google Scholar] [CrossRef]
  43. Vermeulen, C.; Lejeune, P.; Lisein, J.; Sawadogo, P.; Bouche, P. Unmanned aerial survey of elephants. PLoS ONE 2013, 8, e54700. [Google Scholar] [CrossRef] [PubMed]
  44. Olivares-Mendez, M.A.; Fu, C.H.; Ludivig, P.; Bissyande, T.F.; Kannan, S.; Zurad, M.; Annaiyan, A.; Voos, H.; Campoy, P. Towards an autonomous vision-based unmanned aerial system against wildlife poachers. Sensors 2015, 15, 31362–31391. [Google Scholar] [CrossRef]
  45. Koh, L.; Wich, S. Dawn of drone ecology: Low-cost autonomous aerial vehicles for conservation. Trop. Conserv. Sci. 2012, 5, 121–132. [Google Scholar] [CrossRef]
  46. Christiansen, F.; Dujon, A.M.; Sprogis, K.R.; Arnould, J.P.Y.; Bejder, L. Noninvasive unmanned aerial vehicle provides estimates of the energetic cost of reproduction in humpback whales. Ecosphere 2016, 7, 18. [Google Scholar] [CrossRef]
  47. Ditmer, M.A.; Vincent, J.B.; Werden, L.K.; Tanner, J.C.; Laske, T.G.; Iaizzo, P.A.; Garshelis, D.L.; Fieberg, J.R. Bears show a physiological but limited behavioral response to unmanned aerial vehicles. Curr. Biol. 2015, 25, 2278–2283. [Google Scholar] [CrossRef]
  48. Cliff, O.M.; Fitch, R.; Sukkarieh, S.; Saunders, D.L.; Heinsohn, R. Online Localization of Radio-Tagged Wildlife with An Autonomous Aerial Robot System. In Proceedings of the Robotics: Science and Systems, Rome, Italy, 13–17 July 2015. [Google Scholar]
  49. U.S. Federal Aviation Administration. Faa Doubles ‘blanket’ Altitude for Many Uas Flights. Available online: https://www.faa.gov/uas/media/Part_107_Summary.pdf (accessed on 15 December 2017).
  50. Hodgson, A.; Peel, D.; Kelly, N. Unmanned aerial vehicles for surveying marine fauna: Assessing detection probability. Ecol. Appl. 2017, 27, 1253–1267. [Google Scholar] [CrossRef]
  51. Hodgson, A.; Kelly, N.; Peel, D. Unmanned aerial vehicles (uavs) for surveying marine fauna: A dugong case study. PLoS ONE 2013, 8, e79556. [Google Scholar] [CrossRef]
  52. Witczuk, J.; Pagacz, S.; Zmarz, A.; Cypel, M. Exploring the feasibility of unmanned aerial vehicles and thermal imaging for ungulate surveys in forests - preliminary results. Int. J. Remote Sens. 2017, 1–18. [Google Scholar] [CrossRef]
  53. Seymour, A.C.; Dale, J.; Hammill, M.; Halpin, P.N.; Johnston, D.W. Automated detection and enumeration of marine wildlife using unmanned aircraft systems (uas) and thermal imagery. Sci. Rep. 2017, 7, 10. [Google Scholar] [CrossRef]
  54. Hodgson, J.C.; Baylis, S.M.; Mott, R.; Herrod, A.; Clarke, R.H. Precision wildlife monitoring using unmanned aerial vehicles. Sci. Rep. 2016, 6, 22574. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  55. Kiszka, J.J.; Mourier, J.; Gastrich, K.; Heithaus, M.R. Using unmanned aerial vehicles (uavs) to investigate shark and ray densities in a shallow coral lagoon. Mar. Ecol. Prog. Ser. 2016, 560, 237–242. [Google Scholar] [CrossRef]
  56. Ivosevic, B. Monitoring butterflies with an unmanned aerial vehicle: Current possibilities and future potentials. J. Ecol. Environ. 2017, 41, 72–77. [Google Scholar] [CrossRef]
  57. Goebel, M.E.; Perryman, W.L.; Hinke, J.T.; Krause, D.J.; Hann, N.A.; Gardner, S.; Leroi, D.J. A small unmanned aerial system for estimating abundance and size of antarctic predators. Polar Biol. 2015, 38, 619–630. [Google Scholar] [CrossRef]
  58. Jones, G.P.I.; Pearlstine, L.G.; Percival, H.F. An assessment of small unmanned aerial vehicles for wildlife research. Wildl. Soc. Bull. 2006, 34, 750–758. [Google Scholar] [CrossRef]
  59. Vas, E.; Lescroël, A.; Duriez, O.; Boguszewski, G.; Grémillet, D. Approaching birds with drones: First experiments and ethical guidelines. Biol. Lett. 2015, 11, 20140754. [Google Scholar] [CrossRef]
  60. McEvoy, J.F.; Hall, G.P.; McDonald, P.G. Evaluation of unmanned aerial vehicle shape, flight path and camera type for waterfowl surveys: Disturbance effects and species recognition. PeerJ 2016, 4, e1831. [Google Scholar] [CrossRef] [PubMed]
  61. Wilson, A.M.; Barr, J.; Zagorski, M. The feasibility of counting songbirds using unmanned aerial vehicles. Auk 2017, 134, 350–362. [Google Scholar] [CrossRef]
  62. Gonzalez, L.F.; Montes, G.A.; Puig, E.; Johnson, S.; Mengersen, K.; Gaston, K.J. Unmanned aerial vehicles (uavs) and artificial intelligence revolutionizing wildlife monitoring and conservation. Sensors 2016, 16, 97. [Google Scholar] [CrossRef]
  63. Goodenough, A.E.; Carpenter, W.S.; MacTavish, L.; MacTavish, D.; Theron, C.; Hart, A.G. Empirically testing the effectiveness of thermal imaging as a tool for identification of large mammals in the african bushveldt. Afr. J. Ecol. 2018, 56, 51–62. [Google Scholar] [CrossRef]
  64. Tremblay, J.A.; Desrochers, A.; Aubry, M.Y.; Pace, P.; Bird, D.M. A low-cost technique for radio-tracking wildlife using a small standard unmanned aerial vehicle. J. Unmanned Veh. Syst. 2017, 5, 102–108. [Google Scholar]
  65. Webber, D.; Hui, N.; Kastner, R.; Schurgers, C. Radio receiver design for unmanned aerial wildlife tracking. In Proceedings of the International Conference on Computing, Networking and Communications (ICNC), Santa Clara, CA, USA, 26–29 January 2017; pp. 942–946. [Google Scholar]
  66. Xue, Y.; Wang, T.; Skidmore, A.K. Automatic counting of large mammals from very high resolution panchromatic satellite imagery. Remote Sens. 2017, 9, 878. [Google Scholar] [CrossRef]
  67. Fretwell, P.T.; LaRue, M.A.; Morin, P.; Kooyman, G.L.; Wienecke, B.; Ratcliffe, N.; Fox, A.J.; Fleming, A.H.; Porter, C.; Trathan, P.N. An emperor penguin population estimate: The first global, synoptic survey of a species from space. PLoS ONE 2012, 7, e33751. [Google Scholar] [CrossRef]
  68. Abileah, R. Use of high resolution space imagery to monitor the abundance, distribution, and migration patterns of marine mammal populations. In Proceedings of the Annual Conference of the Marine Technology Society, Honolulu, HI, USA, 5–8 November 2001. [Google Scholar]
  69. McMahon, C.R.; Howe, H.; van den Hoff, J.; Alderman, R.; Brolsma, H.; Hindell, M.A. Satellites, the all-seeing eyes in the sky: Counting elephant seals from space. PLoS ONE 2014, 9, e92613. [Google Scholar] [CrossRef]
  70. Hawkins, A.S. Flyways: Pioneering waterfowl management in north america. Indian J. Med. Res. 1984, 49, 832. [Google Scholar]
  71. Descamps, S.; Béchet, A.; Descombes, X.; Arnaud, A.; Zerubia, J. An automatic counter for aerial images of aggregations of large birds. Bird Study 2011, 58, 302–308. [Google Scholar] [CrossRef]
  72. Chabot, D.; Dillon, C.; Francis, C.M. An approach for using off-the-shelf object-based image analysis software to detect and count birds in large volumes of aerial imagery. Avian Conserv. Ecol. 2018, 13, 15. [Google Scholar] [CrossRef]
  73. Groom, G.; Stjernholm, M.; Nielsen, R.D.; Fleetwood, A.; Petersen, I.K. Remote sensing image data and automated analysis to describe marine bird distributions and abundances. Ecol. Inform. 2013, 14, 2–8. [Google Scholar] [CrossRef]
  74. Burn, D.M.; Webber, M.A.; Udevitz, M.S. Application of airborne thermal imagery to surveys of pacific walrus. Wildl. Soc. Bull. 2006, 34, 51–58. [Google Scholar] [CrossRef]
  75. Garner, D.L.; Underwood, H.B.; Porter, W.F. Use of modern infrared thermography for wildlife population surveys. Environ. Manag. 1995, 19, 233–238. [Google Scholar] [CrossRef]
  76. Franke, U.; Goll, B.; Hohmann, U.; Heurich, M. Aerial ungulate surveys with a combination of infrared and high–resolution natural colour images. Anim. Biodivers. Conserv. 2012, 35, 285–293. [Google Scholar]
  77. Israel, M. A uav-based roe deer fawn detection system. In Proceedings of the International Conference on Unmanned Aerial Vehicle in Geomatics, Gottingen, Germany, 14–16 September 2011; pp. 51–55. [Google Scholar]
  78. Hayford, H.A.; O’Donnell, M.J.; Carrington, E. Radio tracking detects behavioral thermoregulation at a snail’s pace. J. Exp. Mar. Biol. Ecol. 2018, 499, 17–25. [Google Scholar] [CrossRef]
  79. Landinfo Worldwide Mapping LLC. Buying Satellite Imagery: Pricing Information for High Resolution Satellite Imagery. Available online: http://www.landinfo.com/satellite-imagery-pricing.html (accessed on 6 January 2019).
  80. Sandbrook, C. The social implications of using drones for biodiversity conservation. Ambio 2015, 44, 636–647. [Google Scholar] [CrossRef] [Green Version]
  81. Mulero-Pazmany, M.; Jenni-Eiermann, S.; Strebel, N.; Sattler, T.; Negro, J.J.; Tablado, Z. Unmanned aircraft systems as a new source of disturbance for wildlife: A systematic review. PLoS ONE 2017, 12, e0178448. [Google Scholar] [CrossRef] [PubMed]
  82. Sarda-palomera, F.; Bota, G.; Vinolo, C.; Pallares, O.; Sazatornil, V.; Brotons, L.; Gomariz, S.; Sarda, F. Fine-scale bird monitoring from light unmanned aircraft systems. IBIS 2012, 154, 177–183. [Google Scholar] [CrossRef]
  83. Hodgson, J.C.; Koh, L.P. Best practice for minimising unmanned aerial vehicle disturbance to wildlife in biological field research. Curr. Biol. 2016, 26, 404–405. [Google Scholar] [CrossRef] [PubMed]
  84. Resnik, D.B.; Elliott, K.C. Using drones to study human beings: Ethical and regulatory issues. Sci. Eng. Ethics 2018, 1–12. [Google Scholar] [CrossRef]
  85. Rümmler, M.C.; Mustafa, O.; Maercker, J.; Peter, H.U.; Esefeld, J. Measuring the influence of unmanned aerial vehicles on adélie penguins. Polar Biol. 2016, 39, 1329–1334. [Google Scholar] [CrossRef]
  86. Cracknell, A.P. Uavs: Regulations and law enforcement. Int. J. Remote Sens. 2017, 38, 3054–3067. [Google Scholar] [CrossRef]
  87. Guinet, C.; Jouventin, P.; Malacamp, J. Satellite remote-sensing in monitoring change of seabirds - use of spot image in king penguin population increase at ile aux cochons, crozet archipelago. Polar Biol. 1995, 15, 511–515. [Google Scholar] [CrossRef]
  88. Williams, M.; Dowdeswell, J.A. Mapping seabird nesting habitats in franz josef land, russian high arctic, using digital landsat thematic mapper imagery. Polar Res. 1998, 17, 15–30. [Google Scholar] [CrossRef]
  89. Ainley, D.G.; Larue, M.A.; Stirling, I.; Stammerjohn, S.; Siniff, D.B. An apparent population decrease, or change in distribution, of weddell seals along the victoria land coast. Mar. Mammal Sci. 2015, 31, 1338–1361. [Google Scholar] [CrossRef]
  90. Leblanc, G.; Francis, C.; Soffer, R.; Kalacska, M.; De Gea, J. Spectral reflectance of polar bear and other large arctic mammal pelts: Potential applications to remote sensing surveys. Remote Sens. 2016, 8, 273. [Google Scholar] [CrossRef]
  91. Caughley, G. Experiments in aerial survey. J. Wildl. Manag. 1976, 40, 290–300. [Google Scholar] [CrossRef]
  92. International Whaling Commission. Report of the Scientific Committee. Available online: https://iwc.int/scientifc-committee-reports (accessed on 21 May 2019).
  93. Norton-Griffiths, M. Counting Animals. Handbook No.1; African Wildlife Leadership Foundation: Nairobi, Kenya, 1978. [Google Scholar]
  94. Christiansen, P.; Steen, K.; Jørgensen, R.; Karstoft, H. Automated detection and recognition of wildlife using thermal cameras. Sensors 2014, 14, 13778. [Google Scholar] [CrossRef]
  95. Torney, C.J.; Dobson, A.P.; Borner, F.; Lloydjones, D.J.; Moyer, D.; Maliti, H.T.; Mwita, M.; Fredrick, H.; Borner, M.; Hopcraft, J.G.C. Assessing rotation-invariant feature classification for automated wildebeest population counts. PLoS ONE 2016, 11, e0156342. [Google Scholar] [CrossRef] [PubMed]
  96. Longmore, S.N.; Collins, R.P.; Pfeifer, S.; Fox, S.E.; Mulero-Pazmany, M.; Bezombes, F.; Goodwin, A.; Ovelar, M.D.; Knapen, J.H.; Wich, S.A. Adapting astronomical source detection software to help detect animals in thermal images obtained by unmanned aerial systems. Int. J. Remote Sens. 2017, 38, 2623–2638. [Google Scholar] [CrossRef]
  97. Shannon, C.E. Communication in the presence of noise. Proc. Inst. Radio Eng. 1949, 37, 10–21. [Google Scholar] [CrossRef]
  98. Liu, C.-C.; Chen, Y.-H.; Wen, H.-L. Supporting the annual international black-faced spoonbill census with a low-cost unmanned aerial vehicle. Ecol. Inform. 2015, 30, 170–178. [Google Scholar] [CrossRef]
  99. Zhu, X.X.; Tuia, D.; Mou, L.; Xia, G.S.; Fraundorfer, F. Deep learning in remote sensing: A comprehensive review and list of resources. IEEE Geosci. Remote Sens. Mag. 2018, 5, 8–36. [Google Scholar] [CrossRef]
  100. Kellenberger, B.; Marcos, D.; Tuia, D. Detecting mammals in uav images: Best practices to address a substantially imbalanced dataset with deep learning. Remote Sens. Environ. 2018, 216, 139–153. [Google Scholar] [CrossRef]
  101. Mulero-Pazmany, M.; Stolper, R.; van Essen, L.D.; Negro, J.J.; Sassen, T. Remotely piloted aircraft systems as a rhinoceros anti-poaching tool in africa. PLoS ONE 2014, 9, e83873. [Google Scholar] [CrossRef]
  102. Liao, X. The Second Chief Meeting of the Research Center of Uav Application and Regulation, Cas. Available online: http://www.igsnrr.ac.cn/xwzx/zhxw/201709/t20170901_4853837.html (accessed on 19 January 2018).
  103. Cunliffe, A.M.; Anderson, K.; DeBell, L.; Duffy, J.P. A uk civil aviation authority (caa)-approved operations manual for safe deployment of lightweight drones in research. Int. J. Remote Sens. 2017, 38, 2737–2744. [Google Scholar] [CrossRef]
  104. Toor, M.L.v.; Kranstauber, B.; Newman, S.H.; Prosser, D.J.; Takekawa, J.Y.; Technitis, G.; Weibel, R.; Wikelski, M.; Safi, K. Integrating animal movement with habitat suitability for estimating dynamic migratory connectivity. Landsc. Ecol. 2018, 33, 879–893. [Google Scholar] [CrossRef] [Green Version]
  105. Madec, S.; Jin, X.; Lu, H.; De Solan, B.; Liu, S.; Duyme, F.; Heritier, E.; Baret, F. Ear density estimation from high resolution rgb imagery using deep learning technique. Agric. For. Meteorol. 2019, 264, 225–234. [Google Scholar] [CrossRef]
  106. Ren, S.; He, K.; Girshick, R.; Sun, J. Faster r-cnn: Towards real-time object detection with region proposal networks. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 39, 1137–1149. [Google Scholar] [CrossRef]
  107. He, K.; Gkioxari, G.; Dollár, P.; Girshick, R. Mask R-CNN. IEEE Trans. Pattern Anal. Mach. Intell. 2018, 2961–2969. [Google Scholar] [CrossRef]
  108. Clark, B.L.; Bevanda, M.; Aspillaga, E.; Jørgensen, N.H. Bridging disciplines with training in remote sensing for animal movement: An attendee perspective. Remote Sens. Ecol. Conserv. 2016, 3, 30–37. [Google Scholar] [CrossRef]
  109. China State Administration of Forestry. Notifications of the Relevant Work of the State Forestry Administration on the Full Start of the Second Investigation of Terrestrial Wildlife Resources. Available online: http://www.forestry.gov.cn/portal/main/govfile/13/govfile_1817.htm (accessed on 7 December 2017).
Figure 1. Photographs of representative UASs: (a) Gatewing X100, an electric fixed-wing UAS [43]; (b) DJI Phantom 2, an electric quadcopter [55]; and (c) APH-22, an electric hexacopter [57].
Figure 2. Two VHR satellite images of wild animals. (a) Emperor penguins (Aptenodytes forsteri) are shown as black/gray pixels, and guano is shown as brown pixels in a pansharpened QuickBird image from Fretwell et al. [67]. (b) Migration season of wildebeests (Connochaetes gnou) and zebras (Equus quagga) in a GeoEye-1 satellite image from Yang et al. [29]. Note: The QuickBird and GeoEye-1 images have resolutions of 0.61 m and 0.5 m (at nadir), respectively, in the corresponding panchromatic bands. The use of these two images was authorized by the DigitalGlobe Foundation.
Figure 3. UAS imagery of wild animals: (a) Elephants in the Nazinga Game Ranch in the south of Burkina Faso [43], (b) breeding lesser frigatebirds (Fregata ariel) in the tropical Ashmore Reef Commonwealth Marine Reserve and on the nearby Adele Island, Western Australia [54], (c) fur seals [57], and (d) molting royal penguins (Eudyptes schlegeli) on the subantarctic Macquarie Island, Australia [54]. These studies demonstrate that UAS technology can significantly improve the efficiency and precision of surveying wild animal populations in hard-to-reach places compared with traditional ground-based methods.
Figure 4. Pixel number of target species in UAS images. The pixel number, N, is estimated by N = Length/GSD, where Length represents the body length of the target adult animal and GSD represents the ground sample distance of the UAS images. The body lengths of the target adult animals were collected or estimated from the corresponding references.
Table 1. Comparison of spaceborne, manned aerial, and unmanned aircraft system (UAS) surveys of wild animals.

Platforms
- Spaceborne surveys: Satellite images are from the GeoEye-1 (3/15), WorldView-1/2/3/4 (7/15), Quickbird-2 (3/15), and IKONOS (1/15) satellites.
- Manned aerial surveys: Aircraft used were mainly light manned helicopters (3/12) and fixed-wing aircraft (11/12). Surveyors of terrestrial mammals prefer helicopters, whereas surveyors of animals in plains or marine environments prefer fixed-wing aircraft. A long-period study used a combination of helicopters and fixed-wing airplanes.
- UAS surveys: UASs used include small fixed-wing UASs (11/18) and multicopters (9/18). Fixed-wing UASs are typically used to survey large or marine animals. Multicopter UASs are typically used to survey animals in uneven terrain and high-vegetation areas, as well as birds, because of their superior vertical takeoff and landing capabilities and low noise.

Sensors
- Spaceborne surveys: Panchromatic and multispectral images are the most widely used data (two satellite remote sensing studies used panchromatic imagery, and the other thirteen used multispectral imagery). Pansharpening techniques were used to merge high-resolution panchromatic and lower-resolution multispectral imagery into a single high-resolution color image to increase the differentiation between target objects and the background.
- Manned aerial surveys: Real-time surveys do not need imaging sensors. Photographic surveys used still RGB images, video, and infrared thermography to detect wild animals.
- UAS surveys: RGB images are suitable for detecting wild animals living in open land or marine environments. Thermal infrared cameras are primarily used for detecting wild animals living in forests and other high-vegetation areas. Radio-tracking devices have been mounted on UASs in recent years to study the behavior of small animals.

Resolution
- Spaceborne surveys: Up to 0.31 m in the panchromatic band and up to 1.2 m in the multispectral bands (WorldView-3 and -4).
- Manned aerial surveys: Up to 2.5 cm (RGB imagery).
- UAS surveys: Up to 2 mm (RGB imagery).

Coverage
- Spaceborne surveys: Regional to global scales.
- Manned aerial surveys: Used for regular and geographically comprehensive animal monitoring at regional scales; sampling distances were up to 12,800 km, with an area of approximately 6000 km2.
- UAS surveys: No more than 50 km2; most survey areas were <2 km2, and the minimum survey area was only 4 × 4 m.

Cost
- Spaceborne surveys: Relatively low (the price of 0.5 m spatial resolution satellite imagery ranges from USD $14–27.5 per km2 depending on the spectral resolution, order area and data age).
- Manned aerial surveys: Expensive to implement for small study areas because of the cost of the aircraft, operator, and fuel.
- UAS surveys: Medium; seen as a safer and lower-budget alternative to manned aircraft.

Surveyed species
- Spaceborne surveys: Only large (≥0.6 m) individual animals, such as wildebeests, zebras, polar bears, albatrosses, southern right whales, and Weddell seals, can be directly identified from existing VHR commercial satellite imagery.
- Manned aerial surveys: The real-time survey method has long been used to survey terrestrial and marine animals with potentially low abundances in remote or large areas. Manned aerial imagery allows the direct discernment of smaller (<0.6 m) animals, such as birds, sea turtles, and fish; large animals that are difficult to distinguish from the background at the species level, such as roe deer and red deer; and some animals with a significant temperature difference from the background environment, such as Pacific walruses.
- UAS surveys: UASs allow the surveying of smaller animals, as well as their behaviors, such as butterfly species, Bicknell’s and Swainson’s thrushes, noisy miners, and iguanas. Most applications of UASs focus on assessing the possibility of species detection in a small geographic area.

Methodology
- Spaceborne surveys: Direct visual recognition; automatic and semiautomatic detection using pixel-based and object-based methods.
- Manned aerial surveys: Direct visual recognition; automatic and semiautomatic detection using pixel-based and object-based methods and traditional machine learning.
- UAS surveys: Direct visual recognition; automatic and semiautomatic detection using pixel-based and object-based methods, traditional machine learning, and deep learning.

Pixel number of target species in imagery
- Spaceborne surveys: 2–6 pixels.
- Manned aerial surveys: Not investigated, but similar to that for UAS imagery.
- UAS surveys: Most animals cover 22–79 pixels.

Accuracy (all three survey types)
- Automated and semiautomatic counts of animals from remote sensing imagery are usually highly correlated with manual counts when the algorithms are applied to small areas in relatively homogenous environments.
- Manual counts of animals derived from different remote sensing imagery and ground-based counts collected within a short time interval are also highly correlated.
- Remote sensing-based counts often underestimate populations because some animals are invisible in remote sensing imagery, especially those living in high-vegetation areas and aquatic environments, but high-resolution imagery increases the detection probability.

Note: The accuracy was determined through comparison with ground-based counts or manual counts.
Table 2. Submeter commercial satellites (1 m or higher resolution).

| No. | Sensor/Instrument | Sensor Type | Spatial Resolution (Nadir) | Agency | Launch Year |
|---|---|---|---|---|---|
| 1 | IKONOS | Optical | Panchromatic 1 m, multispectral 4 m | DigitalGlobe, USA | 1999 |
| 1 | QuickBird-2 | Optical | Panchromatic 0.61 m, multispectral 2.62 m | DigitalGlobe, USA | 2001 |
| 1 | GeoEye-1 | Optical | Panchromatic 0.41 m, multispectral 1.65 m | DigitalGlobe, USA | 2008 |
| 1 | WorldView-1 | Optical | Panchromatic 0.46 m | DigitalGlobe, USA | 2007 |
| 1 | WorldView-2 | Optical | Panchromatic 0.46 m, multispectral 1.85 m | DigitalGlobe, USA | 2009 |
| 1 | WorldView-3/4 | Optical | Panchromatic 0.31 m, multispectral 1.24 m | DigitalGlobe, USA | 2014/2016 |
| 2 | COSMO-SkyMed 1/2/3/4/5/6 | SAR | X-band up to 1 m | Italian Space Agency | 2007–2018 |
| 3 | Pleiades-1/2 | Optical | Panchromatic 0.5 m, multispectral 2 m | French space agency and EADS Astrium | 2011/2012 |
| 4 | TerraSAR-X / TanDEM-X | SAR | X-band up to 1 m | German Aerospace Center and EADS Astrium | 2007 |
| 5 | Resurs-DK1 | Optical | Panchromatic 1 m, multispectral 2–3 m | Russian Space Agency | 2006 |
| 6 | Kompsat-2 | Optical | Panchromatic 1 m, multispectral 4 m | Korean Academy of Aeronautics and Astronautics | 2006 |
| 6 | Kompsat-3 | Optical | Panchromatic 0.7 m, multispectral 2.8 m | Korean Academy of Aeronautics and Astronautics | 2012 |
| 7 | CartoSat-2/2A/2B | Optical | Panchromatic 1 m | Indian Space Research Organization | 2007 |
| 8 | EROS-B | Optical | Panchromatic 0.7 m | Israeli Aircraft Industries Ltd. (manufacturer) and ImageSat International N.V. (owner) | 2006 |
| 9 | GF-2 | Optical | Panchromatic 0.8 m, multispectral 3.2 m | State Administration of Science, Technology and Industry for National Defense, China | 2014 |
| 10 | Beijing 2 | Optical | Panchromatic 0.8 m, multispectral 3.2 m | Twenty First Century Aerospace Technology Co., Ltd., China | 2015 |
| 11 | SuperView-1 | Optical | Panchromatic 0.5 m, multispectral 2 m | China Aerospace Science and Technology Corporation | 2016 |
Table 3. Animal species detected using spaceborne imagery.

| Group | Species or Items Detected | Satellites | Resolution (Panchromatic Band) | Data Type | Study |
|---|---|---|---|---|---|
| Terrestrial mammals | wildebeests (Connochaetes gnou), zebras (Equus quagga) | GeoEye-1 | 0.5 m | Multispectral imagery | [29] |
| Terrestrial mammals | wildebeests (Connochaetes gnou), zebras (Equus quagga) | GeoEye-1 | 0.5 m | Panchromatic imagery | [66] |
| Terrestrial mammals | polar bears (Ursus maritimus) | WorldView-2 and Quickbird | 0.5 and 0.6 m, respectively | Multispectral imagery | [30] |
| Terrestrial mammals | polar bears (Ursus maritimus) | GeoEye-1 | 0.5 m | Panchromatic imagery | [31] |
| Terrestrial mammals | muskoxen (Ovibos moschatus) | WorldView-1 and WorldView-2 | 0.5 m | Multispectral imagery | [12] |
| Aquatic and amphibious animals | walruses and bowhead whales | GeoEye-1 | 0.5 m | Panchromatic imagery | [31] |
| Aquatic and amphibious animals | southern right whales (Eubalaena australis) | WorldView-2 | 0.5 m | Multispectral imagery | [33] |
| Aquatic and amphibious animals | fin whales, southern right whales, and gray whales | WorldView-3 | 0.31 m | Multispectral imagery (pansharpened) | [34] |
| Aquatic and amphibious animals | Weddell seals (Leptonychotes weddellii) | Quickbird-2 and WorldView-1 | 0.6 m | Multispectral imagery (pansharpened) | [35] |
| Aquatic and amphibious animals | emperor penguins (Aptenodytes forsteri) | QuickBird | 0.6 m | Multispectral imagery (pansharpened) | [67] |
| Aquatic and amphibious animals | humpback whales (up to 10 m in length) | IKONOS | 1 m | Multispectral imagery (pansharpened) | [68] |
| Aquatic and amphibious animals | elephant seals (Mirounga leonina) | GeoEye-1 | 0.5 m | Multispectral imagery (pansharpened) | [69] |
| Aquatic and amphibious animals | Weddell seals (Leptonychotes weddellii) | DigitalGlobe and GeoEye (specific satellites not given) | 0.6 m | Multispectral imagery (pansharpened) | [89] |
| Flying organisms and insects | wandering albatross (Diomedea exulans) and northern royal albatross (Diomedea sanfordi) | WorldView-3 | 0.3 m | Multispectral imagery | [32] |
Table 4. Manned aerial surveys of wild animals.

| Group | Species or Items Detected | Platforms | Sensors | Data Type | Surveyed Area (km2) | Flight Height (m) | Study |
|---|---|---|---|---|---|---|---|
| Terrestrial mammals | polar bears (Ursus maritimus) | Helicopter | Real-time surveys, no sensors | No imagery | 263 | 100 | [37] |
| Terrestrial mammals | polar bears (Ursus maritimus) | Bell 206 LongRanger (helicopter) | Real-time surveys, no sensors | No imagery | ~6000 | ~120 | [18] |
| Terrestrial mammals | buffalos (Syncerus caffer), elands (Taurotragus oryx), elephants (Loxodonta africana), and giraffes (Giraffa camelopardalis) | Cessna 182 or 185 aircraft (fixed-wing aircraft) | Real-time surveys, no sensors | No imagery | <10,000 | Not mentioned | [41] |
| Terrestrial mammals | pronghorns (Antilocapra americana) | Maule 5 (fixed-wing aircraft) | Real-time surveys, no sensors | No imagery | ~60 | 91.4 | [2] |
| Terrestrial mammals | red kangaroos (Megaleia rufa), grey kangaroos (Macropus giganteus) and sheep | Cessna 182 (fixed-wing aircraft) | Real-time surveys, no sensors | No imagery | 136 | 46–183 | [91] |
| Terrestrial mammals | buffalos, giraffes, elands and waterbucks, elephants, impalas, ostriches, cattle, goats and sheep | Cessna 185 or Partenavia (fixed-wing, high-wing aircraft) | Real-time surveys, no sensors | No imagery | 6000 | 90–120 | [17] |
| Terrestrial mammals | red deer (Cervus elaphus), fallow deer (Dama dama), roe deer (Capreolus capreolus), wild boar (Sus scrofa), foxes, wolves and badgers | Microlight S–Stol (fixed-wing, electric) | Jenoptik infrared camera and Canon 5D Mark 2 | Infrared videos and RGB images | 44 | 50 | [76] |
| Aquatic and amphibious animals | lemon sharks (Negaprion brevirostris) | Cessna 172, Beechcraft 35 Bonanza, Piper PA-28 Archer, and Piper PA-31-350 Navajo Chieftain (fixed-wing, low-wing aircraft) | Real-time surveys, no sensors | No imagery | ~100 | 100 | [38] |
| Aquatic and amphibious animals | humpback whales (Megaptera novaeangliae) | Mitsubishi Marquese (fixed-wing, flat-window aircraft) | Real-time surveys, no sensors | No imagery | 1180 | 152.4 | [39] |
| Aquatic and amphibious animals | dugongs (Dugong dugon), dolphins, and sea turtles (Chelonia mydas) | Partenavia 68B (fixed-wing, high-wing aircraft) | Real-time surveys, no sensors | No imagery | ~120 | 137–274 | [40] |
| Aquatic and amphibious animals | sea turtles, sharks, manta rays, small delphinids, and large delphinids | Early surveys (1963–1965) used helicopters (e.g., Sikorsky SH-3 Sea King); later surveys (1975–2012) used 4-seat single-engine fixed-wing airplanes (e.g., Cessna 172 Skyhawk) | Real-time surveys, no sensors | No imagery | 70.16 | 92–200 | [42] |
| Aquatic and amphibious animals | Pacific walrus (Odobenus rosmarus divergens) | Aero Commander 690B (fixed-wing, high-wing aircraft) | Daedalus Airborne Multispectral Scanner (AMS) and Nikon D1X digital camera | Thermal infrared images and RGB images | ~11,398.5 | 457–3200 | [74] |
| Flying organisms and insects | greater flamingo (Phoenicopterus roseus) | Not mentioned | 35 mm film or digital (5 Mpixel) reflex cameras | RGB images | Not mentioned | 300 | [71] |
| Flying organisms and insects | lesser snow geese (Chen caerulescens) | Not mentioned | DSS 439 39-megapixel aerial camera | RGB images | Not mentioned | Not mentioned | [72] |
| Flying organisms and insects | common scoter (Melanitta nigra), great cormorant (Phalacrocorax carbo), diver species group (Gavia sp.), Sandwich tern (Sterna sandvicensis), Manx shearwater (Puffinus puffinus) | Twin-engine Cessna 402B and Cessna 404 (fixed-wing aircraft) | Vexcel UltraCAM-D and UltraCAM-XP | RGB images | 670 | 475 | [73] |
Table 5. Detected animal species and employed unmanned aerial systems (UASs) determined via a literature review.

| Group | Species or Items Detected | UAS Model (Type of UAS) | Sensor | Data Type | Surveyed Area (km2) | Flight Height (m) | Study |
|---|---|---|---|---|---|---|---|
| Terrestrial mammals | roe deer (Capreolus pygargus) | Falcon-8 (fixed-wing, electric) | FLIR Tau640 thermal imaging camera | Thermal image | 0.71 | 30–50 | [77] |
| Terrestrial mammals | elephants (Loxodonta africana) | Gatewing 100 (fixed-wing, electric) | Ricoh GR3 still camera | RGB image | 13.79 | 100–600 | [43] |
| Terrestrial mammals | cows (Bos taurus) | Custom-made 750 mm carbon-folding Y6 multirotor (hexacopter, electric) | FLIR Tau 2 LWIR thermal imaging camera | Thermal image | <1.0 * | 80–120 | [96] |
| Terrestrial mammals | koalas (Phascolarctos cinereus) | S800 EVO (hexacopter, electric) | Mobius RGB camera + FLIR Tau 2-640 thermal imaging camera | RGB video + thermal video | 0.01 * | 20–60 | [62] |
| Terrestrial mammals | red deer (Cervus elaphus), roe deer (Capreolus capreolus), and wild boar (Sus scrofa) | Skywalker X8 (fixed-wing, electric) | IRMOD v640 thermal imaging camera | Thermal video | ~1.0 * | 149–150 | [52] |
| Aquatic and amphibious animals | dugongs (Dugong dugon) | ScanEagle (fixed-wing, fuel) | Nikon D90 SLR camera + fixed video camera | RGB image + RGB video | 1.3 | 152–304 | [51] |
| Aquatic and amphibious animals | American alligators (Alligator mississippiensis) and Florida manatees (Trichechus manatus) | 1.5-m wingspan MLB FoldBat (fixed-wing, fuel) | Canon Elura 2 | RGB video | 1.3 | 100–150 | [58] |
| Aquatic and amphibious animals | leopard seals (Hydrurga leptonyx) | APH-22 (hexacopter, electric) | Olympus E-P1 | RGB image | <1.0 * | 45 | [57] |
| Aquatic and amphibious animals | humpback whales (Megaptera novaeangliae) | ScanEagle (fixed-wing, fuel) | Nikon D90 12 megapixel digital SLR camera | RGB image | 35.2 * | 732 | [50] |
| Aquatic and amphibious animals | blacktip reef sharks (Carcharhinus melanopterus) and pink whiprays (Himantura fai) | DJI Phantom 2 (quadcopter, electric) | GoPro Hero 3 | RGB video | 0.0288 | 12 | [55] |
| Aquatic and amphibious animals | gray seals (Halichoerus grypus) | senseFly eBee (fixed-wing, electric) | Canon S110 + FLIR Tau 2-640 thermal imaging camera | RGB image + thermal image | 0.16 * | 250 | [53] |
| Flying organisms and insects | white ibises (Eudocimus albus) | 1.5-m wingspan MLB FoldBat (fixed-wing, fuel) | Canon Elura 2 | RGB video | 1.3 | 100–150 | [58] |
| Flying organisms and insects | black-headed gulls (Chroicocephalus ridibundus) | Multiplex Twin Star II model (fixed-wing, electric) | Panasonic Lumix FT-1 | RGB image | 0.0558 | 30–40 | [82] |
| Flying organisms and insects | frigatebirds (Fregata ariel), crested terns (Thalasseus bergii), and royal penguins (Eudyptes schlegeli) | 3D Robotics (octocopter, electric) | Canon EOS M | RGB image | <1.0 * | 75 | [54] |
| Flying organisms and insects | gentoo penguins (Pygoscelis papua) and chinstrap penguins (Pygoscelis antarctica) | APH-22 (hexacopter, electric) | Olympus E-P1 | RGB image | <1.0 * | 45 | [57] |
| Flying organisms and insects | canvasbacks (Aythya valisineria), western/Clark’s grebes (Aechmophorus occidentalis/clarkii), and double-crested cormorants (Phalacrocorax auritus) | Honeywell RQ-16 T-Hawk (hexacopter, fuel) and AeroVironment RQ-11A (fixed-wing, electric) | Canon PowerShot SX230, SX260, GoPro Hero3, and Canon PowerShot S100 | RGB image | <1.0 * | 45–76 | [11] |
| Flying organisms and insects | butterflies (Libythea celtis) | Phantom 2 Vision+ (quadcopter, electric) | GoPro Hero3 | RGB image | 0.000016 | 4 | [56] |
| Flying organisms and insects | Bicknell’s and Swainson’s thrushes (C. ustulatus) | Sky Hero Spyder X8 (octocopter, electric) | Radio transmitter (Avian NanoTag model NTQB-4-2, Lotek Wireless Inc., Newmarket, Ont., Canada) | Radio-tracking data | <1.0 * | 50 | [64] |
| Flying organisms and insects | noisy miners (Manorina melanocephala) | Not mentioned (hexacopter, electric) | Radio transmitter (Avian NanoTag model NTQB-4-2, Lotek Wireless Inc., Newmarket, Ont., Canada) | Radio-tracking data | <1.0 * | 50 | [48] |

* indicates values estimated from the studies.
