Review

Drones and AI-Driven Solutions for Wildlife Monitoring

Department of Engineering, Universidad Europea de Madrid, Villaviciosa de Odón, 28670 Madrid, Spain
Drones 2025, 9(7), 455; https://doi.org/10.3390/drones9070455
Submission received: 10 May 2025 / Revised: 13 June 2025 / Accepted: 23 June 2025 / Published: 24 June 2025

Abstract

Wildlife monitoring has entered a transformative era with the convergence of drone technology and artificial intelligence (AI). Drones provide access to remote and dangerous habitats, while AI unlocks the potential to process vast amounts of wildlife data. This synergy is reshaping wildlife monitoring, offering novel solutions to tackle challenges in species identification, animal tracking, anti-poaching, population estimation, and habitat analysis. This paper conducts a comprehensive literature review to examine the recent advancements in drone and AI systems for wildlife monitoring, focusing on two critical dimensions: (1) Methodologies, algorithms, and applications, analyzing the AI techniques employed in wildlife monitoring, including their operational frameworks and real-world implementations. (2) Challenges and opportunities, identifying current limitations, including technical hurdles and regulatory constraints, as well as exploring the untapped potential in drone and AI integration to enhance wildlife monitoring and conservation efforts. By synthesizing these insights, this paper will provide researchers with a structured framework for leveraging drone and AI systems in wildlife monitoring, identifying best practices and outlining actionable pathways for future innovation in the field.

1. Introduction

Climate change, habitat loss, and poaching are critical threats to wildlife conservation [1]. Traditional wildlife monitoring methods, which rely on manual tracking and camera traps, are often labor-intensive and limited in scope [2]. In remote or dangerous environments, targeted monitoring struggles to provide timely and accurate data. Thus, innovative tools and approaches are needed to expand conservation efforts while minimizing human interference in fragile habitats [3].
The advancements in drone and AI technologies have created a paradigm shift in wildlife monitoring, enabling unprecedented precision, efficiency, and scalability in wildlife research [4]. Drones provide high-resolution aerial imagery, thermal sensing, and real-time data collection across vast and inaccessible terrains. However, the sheer volume of data poses a new challenge: extracting actionable insights efficiently. Effectively, wildlife research has entered the age of big data and is increasingly reliant on computational resources [5]. This is where AI becomes indispensable, transforming raw data into a deeper understanding of wildlife [6]. This powerful synergy allows for automated animal detection, behavioral analysis, population estimation, and habitat assessment with minimal human intervention. By integrating these technologies, researchers can monitor endangered species, track poaching activities, and assess ecosystem health with greater accuracy and reduced costs.
This paper reviews the recent advancements in drone and AI applications for wildlife monitoring, focusing on exploring operational frameworks, AI techniques, and practical implementations across diverse wildlife research domains as well as highlighting current limitations and potential improvements in drone and AI synergy.

1.1. Comparison with Existing Reviews

The current literature highlights significant advancements in integrating Unmanned Aerial Vehicles (UAVs) and AI-driven solutions for wildlife monitoring, with numerous surveys and reviews addressing specific research areas. For instance, a study [7] reviewed automated wildlife detection via drones and machine learning (ML) methods for the 2015–2020 period, focusing on sensor types, drone platforms, and environmental conditions. Additionally, Ref. [8] explored AI’s transformative potential for species tracking, health assessment, and anti-poaching operations, while [9] focused on deep learning (DL) techniques applied to behavioral research capabilities, showcasing innovative approaches like markerless pose tracking and multi-animal classification. Furthermore, the study [10] presented a systematic review of DL applications in livestock farming, focusing on cattle detection, classification, and localization. Similarly, Ref. [11] reviewed the DL challenges in wildlife detection, emphasizing the use of standardized datasets and metrics across species/environments.
Recent reviews have also examined both the potential and limitations of UAV applications in ecological studies. The review [12] provided comprehensive analyses of drone-induced wildlife disturbance across diverse ecosystems, while [13] underscored the role of AI-enhanced drones in forest ecology and real-time threat detection. In animal movement analysis, Ref. [14] highlights how drones facilitate the observation of rare behaviors in inaccessible environments, whereas [15] reveals that most drone-based behavioral studies concentrate on aquatic mammals and birds, with the geographical focus limited primarily to North America and Australia, highlighting significant gaps in understanding taxon-specific and location-dependent responses to drone surveillance.

1.2. Motivation and Contribution

Despite the extensive survey literature on wildlife, existing reviews reveal gaps in comprehensively assessing the applicability of drones and AI technologies. Key limitations include the methodological approaches for monitoring small, cryptic, or nocturnal species; the robustness of AI-driven solutions in overcoming real-world challenges like occlusion in dense habitats and real-time processing in resource-constrained environments; and the insufficient discussion on integrating these technologies with complementary tools.
Therefore, the objective of this paper is to review the recent advancements in drones and AI applications for wildlife monitoring, focusing on two critical dimensions:
  • Methodologies and applications, exploring operational frameworks, AI techniques, and practical implementations across diverse wildlife research domains, including species identification, animal tracking, movement analysis, anti-poaching, population estimation, and habitat assessment.
  • Challenges and opportunities, highlighting current limitations and the unexploited potential in the synergy between drones and AI in wildlife monitoring as well as outlining promising research directions.
By addressing these aspects, this paper aims to provide researchers with a comprehensive overview of drone–AI best practices in wildlife monitoring, highlighting existing limitations and revealing future research opportunities.

1.3. Review Organization

This review is structured as follows: Section 2 details the methodologies and materials used in this review. Section 3 provides a brief overview covering drone platforms, their instrumentation, computing systems, and relevant machine and deep learning models. Section 4 presents a comprehensive analysis, through a scientific literature review, of drone and AI applications across key wildlife research domains. Section 5 synthesizes the principal findings from the examined literature. Section 6 identifies critical gaps and current limitations in drone and AI implementations while highlighting promising future research directions. Finally, Section 7 concludes the paper by integrating key insights.

2. Materials and Methods

This review investigates the current applications, challenges, and future potential of drones and AI-driven solutions in wildlife monitoring.

2.1. Literature Search Process

The literature search was conducted across publicly available multidisciplinary databases, including IEEE Xplore, Springer Link (Springer & Nature Portfolio), Wiley Online Library, MDPI, and ScienceDirect, using their respective search tools. To ensure relevance and focus, only peer-reviewed articles, reviews, and surveys published in journals between 2018 and 2025 were considered, targeting papers specifically addressing the application of drones and AI in wildlife monitoring. The search strategy utilized predefined inclusion criteria, combining keywords and Boolean operators to search in articles’ abstracts. The query structure was as follows:
(Wildlife OR Animal) AND (UAV OR Drone OR “Unmanned Aerial Vehicle”) AND
(“Artificial Intelligence” OR AI OR Dataset OR “Data set”)
From an initial pool of 167 publications, 42% were excluded during the screening process. Most exclusions involved studies on AI applications for drone control (e.g., UAV navigation, bio-inspired techniques, or swarm algorithms), as they were not directly relevant to wildlife monitoring. The final analysis incorporates a total of 97 papers, and the search process is summarized in Figure 1.
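As an illustration, the AND/OR structure of the inclusion query above maps directly onto code. The following sketch is a simplified stand-in for the databases' own search tools (it matches whole words in lowercased abstracts only), not the actual screening software used in this review:

```python
import re

# Illustrative encoding of the review's Boolean inclusion query:
# (Wildlife OR Animal) AND (UAV OR Drone OR "Unmanned Aerial Vehicle")
#   AND ("Artificial Intelligence" OR AI OR Dataset OR "Data set")
QUERY = [
    ["wildlife", "animal"],
    ["uav", "drone", "unmanned aerial vehicle"],
    ["artificial intelligence", "ai", "dataset", "data set"],
]

def matches_query(abstract: str) -> bool:
    """True if the abstract satisfies every AND group, i.e., contains
    at least one OR term per group (whole-word, case-insensitive)."""
    text = abstract.lower()
    return all(
        any(re.search(r"\b" + re.escape(term) + r"\b", text) for term in group)
        for group in QUERY
    )
```

For example, an abstract mentioning "drone", "wildlife", and "AI" passes all three groups, whereas a paper on UAV swarm navigation with no animal-related terms fails the first group, mirroring the exclusions described above.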

2.2. Topic Categorization

Given the broad scope of wildlife monitoring, the collected literature was categorized into thematic domains to enable adequate analysis. In this regard, the current applications of drones and AI-driven solutions span diverse areas, each requiring specialized approaches. These domains were inspired by papers such as [16,17,18], and the classification process was carried out after reviewing the paper abstracts. The final analysis incorporates 97 papers, which are categorized into five key wildlife research domains:
  • W1: Automatic species identification.
  • W2: Tracking and movement analysis.
  • W3: Anti-poaching and surveillance.
  • W4: Population estimation.
  • W5: Habitat analysis.
By organizing the literature into focused topics, this study enables targeted discussions of drones and AI-driven solutions, challenges, and innovations within each subfield. Table 1 summarizes the survey results, while Figure 2 displays the distribution of identified publications across academic databases and wildlife research domains.
The subsequent literature analysis, identifying the impact of drones and AI in the wildlife research domains, is addressed in Section 4.

3. Background

This section outlines the fundamental components enabling drones and AI-driven solutions for wildlife monitoring, namely the UAV platform for operational capabilities, onboard instrumentation for data and image acquisition, and companion computing units to facilitate real-time ML/DL algorithm inference.

3.1. UAV Platforms

Modern wildlife monitoring employs diverse drone types, each tailored to specific study requirements such as range, payload capacity, and operational environment [19]. Multi-copters (quadcopters/hexacopters) excel in precision tasks with their Vertical Take-Off and Landing (VTOL) capabilities, stable hovering (±0.5 m accuracy), and payload flexibility (0.5–6 kg), making them ideal for low-altitude surveys in dense habitats using high-resolution RGB or thermal cameras (±2 °C accuracy). Fixed-wing drones, with extended endurance (2–5 h) and large coverage (50–150 km per flight), are optimized for tracking migratory species or mapping vast savannas/coastlines. Hybrid VTOL models combine the advantages of multi-copters and fixed-wing drones, transitioning seamlessly between vertical lift and efficient fixed-wing flight to access remote areas like rainforests or mountains. Autonomous drones are advanced UAVs equipped with Real-Time Kinematic Global Navigation Satellite System (RTK-GNSS) technology, delivering exceptional precision (1–3 cm). Featuring advanced navigation systems, real-time data processing, and intelligent path planning capabilities, these drones automate anti-poaching patrols, species identification, and habitat assessment. Remotely operated vehicles (ROVs) are used for marine ecosystem studies, such as the underwater BlueROV2 (300 m depth rating), which documents coral reef health and fish behavior with minimal disturbance.
Operating drones for wildlife monitoring presents significant challenges when flying over diverse landscapes such as mountains, dense forests, and deserts, each imposing unique constraints [20,21]. In mountainous terrain, high winds and rapid elevation changes reduce flight stability, limiting operational windows and forcing shorter flight times to maintain control. Dense forests obstruct GPS signals and visual line-of-sight, requiring lower altitudes and slower speeds, which increases survey effort and reduces coverage per battery cycle. In arid deserts, extreme heat can overheat batteries, cutting flight durations, while vast, featureless landscapes complicate navigation, demanding higher sampling intensity to ensure data accuracy. Environmental variables like wind, temperature, and visibility directly dictate UAV flight time, survey efficiency, and the necessary sampling density, often requiring adaptive mission planning to balance data quality with operational feasibility.
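The environmental derating described above can be made concrete with a back-of-the-envelope endurance model. The derating factors below (endurance loss per m/s of wind, per degree of battery-temperature deviation) are illustrative assumptions for the sketch, not measured field data:

```python
def effective_flight_minutes(nominal_minutes: float,
                             wind_ms: float,
                             temp_c: float) -> float:
    """Rough endurance estimate under wind and temperature derating.
    All derating coefficients are illustrative assumptions."""
    # Assume roughly 3% endurance loss per m/s of sustained wind.
    wind_factor = max(0.0, 1.0 - 0.03 * wind_ms)
    # Assume batteries perform nominally in a 15-25 C band and lose
    # ~1% endurance per degree C of deviation outside it.
    deviation = max(0.0, abs(temp_c - 20.0) - 5.0)
    temp_factor = max(0.0, 1.0 - 0.01 * deviation)
    return nominal_minutes * wind_factor * temp_factor
```

Under these assumptions, a 30 min multi-copter flying in a 10 m/s wind retains only about 21 min of endurance, which is why mission planners shrink survey blocks or raise sampling intensity under adverse conditions.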

3.2. Onboard Instrumentation

Drones used in wildlife missions are typically equipped with specialized instruments tailored to ecological research and conservation efforts. Key sensors include visible cameras to capture visual aerial imagery for counting animals, observing nesting birds, detecting poaching activities, tracking movement patterns, and assessing habitat [22]. Thermal cameras are used to detect heat signatures, making them ideal for locating nocturnal, camouflaged, or hidden wildlife, identifying injured animals, and monitoring underground or burrowing species that are difficult to observe visually [23]. Multispectral and hyperspectral cameras are used to capture data across various light wavelengths, including infrared and ultraviolet, to analyze vegetation health and habitat quality [24]. LiDAR is used to analyze canopy density, measure tree height, and identify nesting sites for birds and arboreal mammals like primates. It can also penetrate dense foliage and capture ground-level detail, revealing otherwise hidden ecosystems [25]. Acoustic sensors, mounted on drones, help to record animal vocalizations, such as bird songs, frog calls, and bat echolocation. They are also used to detect illegal activities, such as gunshots or chainsaw noise in protected forests [26]. Radio telemetry receivers are used to track animals fitted with GPS or VHF collars, which is especially useful for studying migratory species and monitoring endangered species in remote areas [27]. Gas sensors on drones measure atmospheric conditions such as CO2, methane, temperature, and humidity, and monitor air quality near sensitive ecosystems, offering insights into wildlife habitats [28].

3.3. Drones’ Companion Computing Support

Modern drones leverage companion computers (CCs) to enhance their flight controllers, enabling advanced computational tasks such as AI-driven object recognition, real-time image processing, and autonomous navigation. These CCs are deployed in three primary configurations (onboard, offboard, and swappable), each offering unique advantages for ecological research and conservation.
Onboard CCs (e.g., NVIDIA Jetson, Raspberry Pi, or Intel NUC) are embedded within the drone’s airframe and interface directly with the autopilot via UART, USB, or Ethernet. They enable real-time processing for critical functions like navigation and environmental monitoring with minimal latency. This setup is ideal for remote fieldwork where external communication is unreliable. Offboard CCs offload computation to ground stations, servers, or cloud platforms. Data (e.g., high-resolution imagery and LiDAR scans) is transmitted via Wi-Fi, 4G/5G, or RF links for post processing. While this reduces onboard power consumption—beneficial for energy-intensive tasks (e.g., hyperspectral analysis) or long-endurance missions—it relies on stable connectivity and introduces latency, making it less suitable for time-sensitive applications. Swappable CCs provide modular flexibility, allowing drones to interchange specialized processors or sensors (e.g., swapping thermal cameras for multispectral imagers) via plug-and-play interfaces. This adaptability is valuable for wildlife studies requiring diverse data modalities in a single mission.
Selecting the optimal CC configuration involves balancing processing speed, power efficiency, and operational constraints to meet specific research needs.
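The trade-offs above can be summarized as a simple decision rule. The sketch below encodes them as illustrative heuristics (the rules are a reading of the text, not a prescriptive selection procedure):

```python
def choose_cc_configuration(needs_realtime: bool,
                            link_reliable: bool,
                            heavy_compute: bool,
                            multi_modal: bool) -> str:
    """Rule-of-thumb companion-computer selection following the
    trade-offs described in the text; purely illustrative."""
    if multi_modal:
        # Missions needing interchangeable sensors/processors
        # (e.g., thermal swapped for multispectral) favor modularity.
        return "swappable"
    if needs_realtime or not link_reliable:
        # Minimal latency and no dependence on external links:
        # process onboard (e.g., Jetson-class hardware).
        return "onboard"
    if heavy_compute:
        # Energy-intensive tasks (hyperspectral, LiDAR) with a stable
        # link can be offloaded to ground stations or the cloud.
        return "offboard"
    return "onboard"
```

For instance, a remote anti-poaching patrol with unreliable connectivity maps to "onboard", while a long-endurance hyperspectral habitat survey near infrastructure maps to "offboard".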

3.4. Machine Learning and Deep Learning Algorithms

ML is a broad field encompassing algorithms that learn from data, which can be divided into supervised learning (e.g., classification and regression), unsupervised learning (e.g., clustering), and reinforcement learning (e.g., autonomous decision making). DL algorithms represent a specialized subset that uses multi-layered neural networks to model complex data. Key DL architectures include Convolutional Neural Networks (CNNs) [29] and Generative Adversarial Network (GAN) [30], which are both suitable for image processing.
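To make the supervised-learning branch of this taxonomy concrete, the toy classifier below learns from labeled 2-D points and then predicts labels for new points, the "train on labeled data, then infer" workflow that underlies the detectors discussed in this section (the data and the nearest-centroid method are illustrative only):

```python
from collections import defaultdict
from math import dist

def fit_centroids(samples):
    """Supervised training step: samples is an iterable of
    ((x, y), label) pairs; returns {label: class centroid}."""
    sums = defaultdict(lambda: [0.0, 0.0, 0])
    for (x, y), label in samples:
        s = sums[label]
        s[0] += x; s[1] += y; s[2] += 1
    return {lbl: (sx / n, sy / n) for lbl, (sx, sy, n) in sums.items()}

def predict(centroids, point):
    """Inference step: assign the point to the nearest class centroid."""
    return min(centroids, key=lambda lbl: dist(centroids[lbl], point))
```

In unsupervised learning, by contrast, the labels would be absent and the algorithm (e.g., clustering) would have to discover the groups itself.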
CNNs serve as the backbone for most modern object detection systems, which fall into two main categories: two-stage and single-stage detectors. Two-stage detectors, such as the region-based Convolutional Neural Network (R-CNN) family, prioritize accuracy over speed by first proposing regions of interest and then classifying them. The original R-CNN extracts features from each proposed region independently, making it computationally heavy. Fast R-CNN improves efficiency by sharing convolutional features across regions. Mask R-CNN extends this framework by adding pixel-level segmentation capabilities. Cascade R-CNN refines detection quality through progressive bounding box refinement, making it suitable for high-precision applications. Notable DL backbones include ResNet [31], a residual network with improved performance on complex computer vision tasks, and DenseNet [32], a densely connected convolutional architecture in which each layer is linked to all preceding layers.
Single-stage detectors emphasize speed, making them ideal for real-time applications such as drone-based wildlife monitoring. YOLO (You Only Look Once) is a prominent example, processing images in a single pass by dividing them into grids and predicting bounding boxes and class probabilities simultaneously. Several specialized versions exist, such as YOLO-NAS, a cutting-edge object detection model that uses Neural Architecture Search (NAS) to balance speed and precision, and YOLO-PEST, an object detection model tailored to detecting pests in agricultural settings, particularly small or densely packed ones. Further customized YOLO architectures for small-target detection include SE-YOLO [33], which adds a Squeeze-and-Excitation channel attention mechanism; WILD-YOLO [33,34], an optimized YOLOv7 variant with significantly fewer parameters than the baseline model; and ALSS-YOLO [35], an efficient and lightweight detector optimized for thermal infrared (TIR) aerial images.
Other fast single-stage detectors, such as the Single Shot MultiBox Detector (SSD), leverage multi-scale feature maps to detect objects at various resolutions. Related architectures include RetinaNet [36] and EfficientNet [37], which uses a compound scaling method to uniformly scale network depth, width, and resolution.
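A post-processing step shared by virtually all of these detectors is non-maximum suppression (NMS), which discards overlapping candidate boxes using Intersection-over-Union (IoU). A minimal self-contained sketch of both operations:

```python
def iou(a, b):
    """Intersection-over-Union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def non_max_suppression(detections, iou_threshold=0.5):
    """detections: list of (box, score). Greedily keep the
    highest-scoring boxes, dropping any box whose IoU with an
    already-kept box exceeds the threshold."""
    kept = []
    for box, score in sorted(detections, key=lambda d: d[1], reverse=True):
        if all(iou(box, k[0]) <= iou_threshold for k in kept):
            kept.append((box, score))
    return kept
```

In an aerial survey, this is what prevents a single animal detected by several overlapping grid cells or anchors from being counted multiple times.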
Studies have also used GAN networks, like BATScan [38], an ML framework that uses two neural networks, a generator and a discriminator, in an adversarial training process to generate new data resembling the training data. The generator produces synthetic data, while the discriminator tries to distinguish between real and generated data, pushing the generator to create more realistic outputs.
Understanding these distinctions helps in selecting the right model for tasks like those discussed in the reviewed papers, where UAVs and AI-driven detection play a critical role in wildlife monitoring.

4. Drones and AI-Driven Solutions in Wildlife Monitoring

Advances in drone technology and AI have revolutionized wildlife monitoring, enabling a paradigm shift in ecological research and conservation. This section explores the latest developments in drone–AI solutions across the following wildlife domains:
  • Automatic species identification.
  • Tracking and movement analysis.
  • Anti-poaching and surveillance.
  • Population estimation.
  • Habitat analysis.
For each domain, a brief context and its significance, as well as the role of drone and AI applications, is provided, followed by a discussion of AI-driven methodologies from the recent literature that address specific wildlife challenges.

4.1. Automated Species Identification

Species identification relies on morphological and anatomical characteristics such as size, shape, color, and structural features, typically using dichotomous keys. Behavioral traits like bird calls and mating rituals have also served as important differentiators. However, these traditional approaches demand expert knowledge, are time-consuming, and often struggle with cryptic species [39]. AI solutions and thermal–visual imagery have revolutionized this field by enabling automated species detection and classification [7].
Recent studies have demonstrated significant advancements in multi-species identification. For large mammal detection, Ref. [40] proposed a Faster R-CNN model for identifying caribou in drone imagery, achieving 80% accuracy. Similarly, Ref. [41] combined YOLO and Faster R-CNN for Bengal tiger recognition in complex mangrove habitats. Further improving detection performance, Ref. [42] applied YOLOv4 to deer identification in forested drone footage, reaching 86% accuracy, while [43] combined YOLOv3 with Environment for Visualizing Images (ENVI), a software for geospatial image processing and analysis, for cattle detection achieving 95% detection and 88% posture classification accuracy.
Beyond traditional deep learning approaches, Ref. [44] introduced a point pattern analysis method for herd species identification (e.g., elephants and zebras), attaining a 96% F1-score, outperforming classical feature-based models. Meanwhile, Ref. [45] proposed an active learning strategy with a reusable CNN-based detector, significantly reducing the labeling effort for training. Their model was validated on UAV datasets containing diverse species, including kudus, giraffes, zebras, and black rhinos.
The fusion of thermal–visible images from drones together with several machine learning algorithms (Random Forest, SVM, and Boosted Regression Trees) was used in [46] to survey waterbird populations in Halimun-Salak, Indonesia, while [47] evaluated multiple YOLO variants (v5, v7, and v8) for animal detection (e.g., cows, deer, and horses), finding YOLOv5 superior to YOLOv7 in overall metrics. Similarly, Ref. [48] integrated thermal and visible imagery with a Sobel edge-based method for the real-time detection of large animals, achieving 80.4% accuracy.
The detection of small and cryptic species has also seen significant innovation through specialized methodologies. For instance, Refs. [49,50] presented a pipeline based on Faster R-CNN and YOLO for koala detection in drone thermal imagery, achieving 68–100% accuracy, while [51] applied YOLOv5 to nocturnal species (hares and roe deer) with thermal data. In marine environments, Ref. [52] evaluated CNN performance across species such as seals, turtles, and gannets, finding that detection accuracy varies with habitat complexity. Additionally, Ref. [53] demonstrated avian identification using YOLOv4. Customized YOLO architectures have further improved classification performance. For example, for small-target detection, Refs. [33,34] introduced SE-YOLO and WILD-YOLO, optimized YOLOv7 variants with improved performance over the baseline model, while [35] proposed ALSS-YOLO, an optimized detector for thermal aerial images. Additionally, Ref. [31] compared CNN and ResNet models on the classification of four species (deer, geese, cattle, and horses), with ResNet achieving 99.18% accuracy, significantly outperforming the CNN baseline (84.55%). Similarly, Ref. [38] developed BATScan, a GAN-based classifier capable of distinguishing over 30 bat species while differentiating them from birds and insects. Further innovations include RetinaNet, presented in [36], which improved mean average precision (mAP) by 11.3% on infrared datasets through channel enhancement. Finally, Ref. [54] combined YOLOv8 with heuristic methods for real-time animal detection in thermal surveys, achieving a recall of 95.23%.
All these contributions illustrate the evolution of species identification through drones and AI.

4.2. Tracking and Behavioral Analysis

Wildlife tracking refers to the methods used to monitor animal movements and locations over time, often employing technologies like GPS collars, camera traps, or radio telemetry [27]. Beyond tracking, wildlife behavioral analysis examines animal movement patterns to understand foraging strategies, mating behaviors, migration dynamics, predator avoidance, and responses to environmental changes [55]. In this aspect, drones enhance these studies by providing high-resolution, real-time, and flexible data collection. When integrated with Geographic Information Systems (GIS), drone-derived data can map movement paths, visualize trajectories, and enable advanced spatial analysis, offering deeper insights into animal behavior and habitat use [56].
Several studies have utilized drones and AI to implement systems for animal tracking. For example, a study [57] described the implementation of a real-time tracking system for Procapra goats, using a YOLOv7 model with enhanced object detection and tracking, while another study [32] presented DeepPoseKit, which uses an efficient multi-scale DL model, referred to as Stacked DenseNet, and a fast GPU-based pipeline to achieve real-time pose estimation. Similarly, Ref. [58] presented TRex, a fast solution for the real-time tracking of up to approximately 256 individuals simultaneously. TRex builds on the IDTRACKER.AI software, which can track individuals in large animal collectives from video recordings and identify up to 100 individuals, even if they are unmarked. For radio-tagged wildlife, UAV-based telemetry systems [59,60] have combined signal processing with a Particle Filter (PF) to track and localize animals with high accuracy. Ref. [61] extended this by optimizing UAV waypoints using a Pareto-based algorithm to minimize the uncertainty in VHF-tagged animal localization, and [62] applied Optimal Transport (OT), a probabilistic framework, to prioritize monitoring zones.
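The UAV telemetry systems cited above combine range-like radio measurements with a Particle Filter. A minimal bootstrap filter for 2-D tag localization might look like the following sketch; it uses synthetic range measurements and simple jittered resampling, and is not the cited systems' implementation:

```python
import math
import random

def particle_filter_localize(uav_positions, ranges, n_particles=2000,
                             area=10.0, sigma=0.5, seed=42):
    """Estimate a radio tag's 2-D position from range measurements
    taken at several UAV waypoints (minimal bootstrap particle filter)."""
    rng = random.Random(seed)
    particles = [(rng.uniform(0, area), rng.uniform(0, area))
                 for _ in range(n_particles)]
    for (ux, uy), r in zip(uav_positions, ranges):
        # Weight each particle by how well it explains the measurement.
        weights = []
        for (px, py) in particles:
            d = math.hypot(px - ux, py - uy)
            weights.append(math.exp(-((d - r) ** 2) / (2 * sigma ** 2)))
        total = sum(weights)
        if total == 0:
            continue  # degenerate step; keep the current particle set
        weights = [w / total for w in weights]
        # Resample proportionally to weight, with small jitter
        # to preserve particle diversity.
        particles = [(px + rng.gauss(0, 0.1), py + rng.gauss(0, 0.1))
                     for (px, py) in rng.choices(particles, weights=weights,
                                                 k=n_particles)]
    xs, ys = zip(*particles)
    return sum(xs) / n_particles, sum(ys) / n_particles
```

Each new waypoint measurement concentrates the particle cloud on the circle consistent with that range; the intersection of several such circles pins down the tag, which is why waypoint placement (as optimized in [61]) strongly affects localization uncertainty.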
Several studies have also utilized drones and AI to investigate animal behaviors. For example, a study [14] reviewed drone applications in behavior studies, highlighting AI’s role in biohybrid systems for behavioral experiments, and another study [9] examined the principles of DL methods for markerless pose tracking, emphasizing practical implementation and the integration of drones and computer vision for multi-animal behavior classification in natural environments. Furthermore, another study [63] described an autonomous UAV combined with a YOLOv3 model for object detection to study the group movements and behavior of a herd of Exmoor ponies. Similarly, Ref. [64] presented an automated drone equipped with computer vision and CNN models for video processing to track animal poses and movements while monitoring a group of gelada monkeys.
Collaborative drones have also been exploited to improve tracking efficiency in complex environments. For example, Ref. [65] developed an Ant Colony Optimization (ACO) algorithm to enhance drone-based deer tracking, demonstrating that adaptive strategies are crucial for handling dynamic animal movements. Reinforcement Learning (RL) further enhances multi-drone coordination, as demonstrated in [66] in robotic shepherding systems and [67] in tracking zebras via wireless sensor networks. These innovations demonstrate the benefits of multi-agent coordination in wildlife monitoring.
These contributions illustrate the rapid evolution of wildlife tracking analysis through drones, AI, and collaborative systems.

4.3. Surveillance and Anti-Poaching

Surveillance and anti-poaching encompass a range of measures and technologies designed to prevent illegal wildlife hunting (poaching) and monitor protected areas, playing a vital role in conserving endangered species and biodiversity. Traditional methods rely on ranger patrols and camera traps to deter poachers. Recent advancements have introduced paradigm-shifting solutions for wildlife protection, leveraging sophisticated detection systems and response mechanisms to improve surveillance and anti-poaching operations [68].
On the detection front, Ref. [69] presented an automated person detection system under occlusion conditions using Airborne Optical Sectioning (AOS), a synthetic aperture imaging technique that employs camera drones to capture unstructured thermal light fields. The detection is performed via a YOLO model running on an onboard drone microcontroller, achieving 89% accuracy under canopy cover. A similar approach is explored in [70]. To address occluded person detection in surveillance and anti-poaching applications, Ref. [71] introduced a UAV-based thermal imaging dataset comprising 8768 labeled images from diverse environments to facilitate model training. Meanwhile, Ref. [72] demonstrated a drone-based AI pipeline using a YOLOv5l6 model on a Jetson Xavier to detect free-ranging black rhinos, giraffes, ostriches, and springboks, showcasing the potential for automated megafauna monitoring. For the enhanced depth perception of occluded objects (e.g., humans and animals), Ref. [73] combined stereoscopic synthetic aperture imaging with AI, reporting a 40% improvement in detection. Nocturnal surveillance is tackled in [74], which integrates a customized GAN for nighttime imaging with a YOLO classifier trained and tested using the iNaturalist datasets [75]. Marine applications are also explored in [76], where an autonomous drone system supplemented with IoT wireless sensors, using aerial imagery and a region-based CNN, is proposed to prevent shark attacks, achieving 92% effectiveness in simulations.
Several implementations incorporate real-time response strategies. For instance, Ref. [68] proposed a UAV-based anti-poaching system using acoustic and pyroelectric infrared sensors to detect intruders. A custom-trained YOLO model identified poachers and animals, triggering alerts to authorities. The proof-of-concept employs a hexacopter with a Raspberry Pi 4B for autonomous operation. Similarly, Ref. [77] introduced a real-time rhino and vehicle detection system, linking drone footage via GSM to a conservation data center running a Faster R-CNN ResNet model. Furthermore, Ref. [78] described an alert system (SMS-based) for wild animal activity detection, utilizing the YOLOR (You Only Learn One Representation) model and achieving >98% mean accuracy across 25 classes in testing.
For broader applications, Ref. [79] presented a drone swarm strategy for search-and-rescue, wildlife observation, and wildfire detection, combining synthetic aperture sensing with adaptive particle swarm optimization to locate occluded targets in dense forests. Additionally, Ref. [80] evaluated UAV effectiveness in wildlife surveillance through agent-based modeling simulations, optimizing system parameters for protection strategies. At a systemic level, Ref. [81] outlined the WatchEDGE project, a distributed edge computing network optimized for AI-based image processing, enabling the real-time analysis of multi-source data (e.g., drone fleets) for remote wildlife surveillance.
These innovations demonstrate how integrated drone and AI methods overcome the traditional limitations in anti-poaching operations through enhanced detection accuracy, expanded coverage capabilities, and rapid response times.

4.4. Population Estimation

Population estimation is a critical task in wildlife conservation: quantifying animal numbers is necessary to evaluate conservation status and the effectiveness of protection efforts. Traditional methods, such as manual counting or camera traps, are often limited in scope and accuracy [82]. Drones overcome these challenges by conducting aerial surveys across vast, rugged, and dangerous terrains, capturing high-resolution images that AI algorithms analyze to count and classify animals [83].
In this regard, several recent studies have leveraged drone-based remote sensing and AI for wildlife population estimation. For example, a study [84] demonstrated that semi-automated counting from drone footage is more accurate than ground surveys; the counting relies on multicount, a Java-based image processing program in which the objects to be counted are manually marked. For large-scale surveys, Ref. [85] presented a semi-automated approach using the random forest ML algorithm to map targets of interest in drone imagery and estimate the populations of wildlife aggregations; it was tested on four complex breeding waterbird colonies on floodplains, ranging from 20,000 to 250,000 birds, providing estimates of bird nests. Similarly, Ref. [86] presented a semi-automated counting method based on a random forest ML classifier for identifying and counting waterbird species. The work also demonstrated the transferability of the classifier, training models with drone imagery from the Okavango Delta (Botswana) and testing them on Australian breeding colonies; the method was judged to be five times quicker than manual counting. Furthermore, Ref. [87] investigated the drone-based counting of waterbirds, achieving a 96% success rate across 343 cases: manual counting with ImageJ/Fiji (v1.53) software took 64 s per 100 birds, whereas the DenoiSeg tool, a neural network algorithm for instance segmentation, reduced this to 7 s per 100 birds. Ref. [88] presented a semi-automated manatee population estimation method using a fixed-wing drone and Kinovea, an analysis software used for identification by annotating the videos as each manatee became visible on the screen; abundance and distribution were estimated using a Bayesian closed capture-mark-recapture model. Additionally, Ref. [89] used Picterra, an online CNN-based platform, to identify and count pups and older seals for monitoring their population dynamics.
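Capture-mark-recapture abundance estimation, applied in Bayesian form in [88], builds on the classic Lincoln–Petersen idea: if a known set of individuals is identified ("marked") in a first survey pass, the re-sighting rate in a second pass reveals the total population size. As a minimal illustration (the simpler Chapman-corrected frequentist estimator, not the Bayesian model of [88]; the survey numbers are invented):

```python
def chapman_estimate(marked, captured, recaptured):
    """Chapman-corrected Lincoln-Petersen abundance estimator for a closed
    population: 'marked' individuals are identified in pass 1; a second pass
    yields 'captured' sightings, of which 'recaptured' are re-sightings."""
    return (marked + 1) * (captured + 1) / (recaptured + 1) - 1

# 19 animals identified in the first survey pass, 24 in the second,
# 9 of which were seen in both passes:
print(chapman_estimate(19, 24, 9))  # → 49.0
```

The Bayesian formulation in [88] additionally propagates uncertainty in detection probability, which matters for partially submerged animals like manatees.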
Thermal imaging is also exploited for surveying wildlife populations, as shown in [90], which implemented sampling strata analysis with “sgsR” (a statistical R-toolbox) for surveying Formosan sika deer, demonstrating the potential of thermal imaging drones for accurately estimating wildlife populations. Similarly, a study [91] combined drones with simultaneous visible and thermal imaging for European elk monitoring and surveying in Siberian winter forests, leveraging tailored software tools for image processing. Another study [92] used Otsu thresholding in thermal image segmentation to quantify canopy cover (a bias correction for animal counts), outperforming traditional Gaussian kernel, Chan–Vese active contour, logistic regression, and ML methods in both speed and accuracy.
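Otsu thresholding, as applied to thermal segmentation in [92], selects the intensity cut-off that maximizes the between-class variance of a grayscale histogram, separating warm foreground from cool background without any training data. A from-scratch sketch on a synthetic "thermal frame" (the intensity values are invented for illustration):

```python
def otsu_threshold(pixels, levels=256):
    """Otsu's method: return the threshold t maximizing the between-class
    variance of the histogram. pixels: integer intensities in [0, levels)."""
    pixels = list(pixels)
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w_bg, sum_bg = 0, 0.0
    for t in range(levels):
        w_bg += hist[t]                      # background weight up to t
        if w_bg == 0:
            continue
        w_fg = total - w_bg                  # foreground weight above t
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (total_sum - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Synthetic thermal frame: cool background around 40, warm animals around 200.
frame = [40] * 900 + [42] * 80 + [200] * 15 + [210] * 5
t = otsu_threshold(frame)
warm_pixels = sum(p > t for p in frame)  # → 20
```

Because the method is histogram-based, it runs in a single pass over each frame, consistent with the speed advantage reported in [92].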
Drones and AI are also applied to livestock management, as discussed in [93], which reviewed several AI techniques as well as the IoT for wildlife detection and counting. Ref. [94] used UAVs and CNN models for livestock monitoring, achieving 95% accuracy in counting cattle from a 100 m altitude. Likewise, Ref. [95] applied YOLOv2 and [96,97] applied Mask R-CNN to cattle detection and counting, while [98] applied a CNN to sheep counting. Furthermore, Ref. [99] discussed AnimalDrone, a large-scale video-based animal counting dataset, and developed a graph regularized flow attention network to perform density map estimation on video clips with arbitrary crowd densities. Another study [100] presented a UAV-assisted electronic sheepdog providing electronic fencing for protecting grazing areas.
These references underscore how the combination of drones and AI is revolutionizing wildlife population estimation, offering efficiency and scalability.

4.5. Habitat Analysis

Habitat analysis plays a critical role in safeguarding wildlife habitats by assessing ecological health through key applications such as forest monitoring, fire detection, vegetation mapping, and tree species classification, among others [101]. Drones enhance this process by capturing high-resolution aerial imagery, conducting 3D mapping, and surveying inaccessible areas, while AI leverages these data to train predictive models that forecast environmental threats, enabling the proactive safeguarding of natural habitats [46]. In this respect, several recent studies have leveraged drone-based remote sensing and AI for wildlife habitat monitoring.
Regarding forest and fire detection, a study [13] demonstrated AI-driven drones equipped with thermal and multispectral sensors for real-time species identification and wildfire detection, supported by Explainable Artificial Intelligence (XAI), a set of methods that explain the reasoning behind decisions, enabling users to trust and understand the results generated by AI algorithms. Another study [102] enhanced early fire warnings using a hybrid Transformer-CNN model on thermal imagery, while [103] optimized drone swarms with clustering algorithms (reaching a 97.2% success rate) for rapid fire and encroachment response. Another study [12] described several methods for monitoring animals during fires using drones equipped with visible and thermal cameras and onboard image processing capabilities.
For vegetation mapping, a study [104] achieved 99% accuracy in crop damage assessment using visible images from drones and a MATLAB-based technique combining morphological filters with color/texture image processing, whereas [105] employed UAVs with multi-spectral cameras and CNNs to attain a 97% dice score in vegetation segmentation. Furthermore, Ref. [106] applied YOLOv5 to UAV imagery for invasive Siam weed detection (with an F1-score of 0.88), and another study [107] used hyperspectral data for efficient alfalfa quality prediction. In tree and species classification, Ref. [108] proposed a CNN-based approach (obtaining more than 90% accuracy), enabling cost-effective forest inventories.
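The dice score reported in [105] measures the overlap between a predicted segmentation mask and the ground truth as twice the intersection over the summed mask sizes. A minimal sketch on toy binary masks:

```python
def dice_score(pred, truth):
    """Dice coefficient between two binary masks (flat sequences of 0/1):
    2 * |intersection| / (|pred| + |truth|)."""
    inter = sum(p and t for p, t in zip(pred, truth))
    size = sum(pred) + sum(truth)
    return 2 * inter / size if size else 1.0

# Toy 8-pixel masks: 3 pixels agree on vegetation, one false positive,
# one false negative.
pred  = [1, 1, 1, 0, 0, 1, 0, 0]
truth = [1, 1, 0, 0, 0, 1, 1, 0]
print(dice_score(pred, truth))  # → 0.75
```

Unlike plain pixel accuracy, the dice score is insensitive to the large background class, which is why it is the standard metric for sparse vegetation masks.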
For general wildlife monitoring, Ref. [109] introduced an energy-efficient UAV and Wireless Sensor Network (WSN) fusion scheme to minimize data loss, while [110] reviewed drone ecophysiology studies (2010–2023), emphasizing visible/thermal sensors for non-invasive health assessments. Furthermore, Ref. [16] analyzed drone-induced wildlife disturbance, recommending optimized flight protocols to reduce animal stress. Finally, in aquatic and marine monitoring, Ref. [111] documented a 12x increase in UAV deployment since 2013, highlighting AI-driven hyperspectral sensing for algal bloom detection and habitat assessment, and [112] reviewed UAV applications for fish stock assessment and habitat monitoring, emphasizing AI’s role in data analysis for sustainable practices.
These studies underscore AI-enhanced drones as transformative tools for sustainable habitat management.

5. Discussion

The data analysis reveals a clear dominance of AI-powered techniques in species identification (W1), with 34% of studies, reflecting the maturity of this research domain. Meanwhile, tracking and movement analysis (W2), anti-poaching and surveillance (W3), population estimation (W4), and habitat analysis (W5) show balanced distributions, each representing 16–21% of the literature, indicating parallel advancements in these interconnected domains.
A critical transversal issue across all previously discussed domains concerns wildlife dataset biases. These datasets frequently exhibit limitations such as geographic bias favoring certain regions (e.g., North America or African savannas) over tropical forests and polar areas; species bias focusing on charismatic megafauna (e.g., lions and elephants) with inadequate representation of small species (e.g., insects and amphibians); temporal bias in data collection, predominantly capturing daytime or seasonal behaviors while neglecting nocturnal behaviors; annotation errors from mislabeling, particularly for visually similar species; and insufficient samples of rare or endangered species for effective AI training. These biases can significantly compromise AI model performance in real-world conservation applications.
For automated species identification, the Faster R-CNN and YOLO models are used to detect large mammals such as caribou, Bengal tigers, deer, and cattle. Beyond traditional DL models, point pattern analysis and active learning strategies enhance identification accuracy while reducing labeling effort, particularly for species like elephants, zebras, and rhinos. In thermal–visible image fusion, YOLOv5 outperforms conventional ML and DL methods. For small and cryptic species (e.g., koalas), the Faster R-CNN and YOLO models are applied, while YOLOv5 aids in detecting nocturnal animals like hares and roe deer. CNNs are also utilized for marine and avian species recognition. Further advancements include custom YOLO architectures like SE-YOLO, WILD-YOLO, and ALSS-YOLO, which are optimized for small target detection, and GAN-based models (e.g., BATScan) for improved species differentiation. ResNet achieves high classification accuracy, while CE-RetinaNet enhances infrared detection performance.
Animal tracking applications use tools such as YOLOv7 with Deep SORT and Stacked DenseNet for pose estimation, and IDTRACKER.AI for multi-individual tracking. ML-enhanced UAV telemetry improves the localization of radio-tagged wildlife. Drones and AI also facilitate behavior analysis, leveraging YOLOv3 and CNNs for group movement studies, markerless pose tracking, and autonomous monitoring. Finally, collaborative drone systems optimize wildlife tracking through Ant Colony Optimization and Reinforcement Learning, enhancing efficiency in dynamic environments.
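The frame-to-frame association step underlying trackers such as Deep SORT can be illustrated with greedy IoU matching. The sketch below is a deliberate simplification: the real tracker additionally uses Kalman-filter motion prediction, appearance embeddings, and Hungarian assignment.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union else 0.0

def associate(tracks, detections, iou_min=0.3):
    """Greedily match existing track boxes to new-frame detections by IoU.
    tracks: {track_id: box}; returns {track_id: detection_index}.
    Unmatched detections would spawn new tracks in a full tracker."""
    pairs = sorted(((iou(box, det), tid, j)
                    for tid, box in tracks.items()
                    for j, det in enumerate(detections)), reverse=True)
    matches, used_t, used_d = {}, set(), set()
    for score, tid, j in pairs:          # best overlaps matched first
        if score < iou_min:
            break
        if tid not in used_t and j not in used_d:
            matches[tid] = j
            used_t.add(tid)
            used_d.add(j)
    return matches

# Two tracked animals; in the next frame both have moved slightly.
tracks = {1: (0, 0, 10, 10), 2: (50, 50, 60, 60)}
dets = [(51, 50, 61, 60), (1, 0, 11, 10)]
print(associate(tracks, dets))  # track 2 matches detection 0, track 1 matches detection 1
```

Motion prediction and appearance features become essential when animals occlude each other or leave the frame, which is exactly where greedy IoU matching breaks down.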
Anti-poaching efforts increasingly rely on drones with YOLO models and thermal imaging to identify occluded humans and animals, with GAN-enhanced night vision further improving surveillance. Real-time response systems integrate Faster R-CNN and YOLO to autonomously detect poachers and trigger alerts. Large-scale solutions deploy drone swarms with particle swarm optimization (PSO) and edge computing networks (e.g., WatchEDGE) for multi-source anti-poaching operations in challenging environments.
On the population estimation front, many papers indicate the use of drone imagery with semi-automated ML counting methods, proving faster and more accurate than manual surveys. Thermal imaging is also combined with AI for wildlife population surveys, employing statistical tools like “sgsR” and Otsu thresholding for accurate counts of species like Formosan sika deer and European elk in challenging environments. Drone imagery and AI solutions are also exploited in livestock, employing CNNs for animal counting, with innovations like the use of UAV-assisted electronic fencing for grazing management.
Finally, habitat analysis is tackled from several perspectives. For instance, forest and fire monitoring utilizes thermal/multispectral drones with hybrid Transformer-CNN models and swarm optimization for early wildfire detection. Vegetation mapping employs UAVs with visible/thermal imagery to feed CNNs and YOLO models for analyzing crop damage and detecting invasive species, highlighting ML’s role in data analysis. Lastly, in aquatic and marine monitoring, the increase in UAV deployment since 2013 is highlighted, along with AI-driven hyperspectral sensing for algal bloom detection, fish stock assessment, and habitat monitoring.
Table 2 summarizes the application of AI algorithms across wildlife domains.

6. Limitations and Future Directions

Despite the significant advances in drone and AI-driven wildlife monitoring, key challenges remain. This section examines the technical hurdles identified in the reviewed literature, highlights some legal restrictions, and explores potential solutions to maximize the impact of these technologies in wildlife monitoring efforts.

6.1. Technical Limitations

In automated species identification, Ref. [11] highlights critical gaps in data diversity and model generalizability, emphasizing the need for standardized open benchmarks to ensure robust performance across ecosystems. A further study [50] highlighted the lack of datasets for most endangered species, which poses a challenge for training ML and DL models. Ref. [6] stresses the importance of interdisciplinary collaboration between ecologists and data scientists to refine AI applications, ensuring that models align with real-world conservation needs. Technical hurdles remain, as noted by [113], including occlusion, background interference, and limitations in sensor resolution, which complicate detection in complex habitats. Additionally, Ref. [39] underscores the need for more advanced automated biometrics, such as footprint and facial recognition, to improve individual animal identification. Addressing these challenges will require continued innovation in multi-sensor fusion, adaptive learning techniques, and the construction of comprehensive, ecologically representative datasets.
AI applications in animal tracking and behavior analysis face multiple constraints. Ref. [14] identifies significant processing challenges for large behavioral datasets, particularly in heterogeneous landscapes. Geographic biases are evident, with most studies focusing on aquatic mammals and birds while neglecting terrestrial species [15]. DL applications encounter scalability issues with markerless pose estimation [9], and observer effects remain problematic for sensitive species. Refs. [14,114] advocate for species-specific protocols to mitigate drone-induced stress. These limitations highlight the need for standardized protocols and improved sensor technologies.
Regarding population estimation, it remains limited in aquatic environments [115], and advanced classifiers achieve a high accuracy rate only under optimal conditions [89]. Behavioral responses to drones introduce potential biases, for example, in waterbirds exhibiting flushing behavior during surveys [87]. Weather-dependent flight optimizations and habitat-specific protocols further complicate reliable population assessments [90].
Critical operational challenges persist in anti-poaching and surveillance applications. In this respect, Ref. [116] demonstrates that thermal drones achieve good accuracy in nighttime detection; however, performance degrades significantly at operational altitudes (>120 m). Power limitations create surveillance gaps when batteries drain, requiring complex solutions [117]. Nighttime surveillance depends on expensive thermal equipment [74]. Lastly, most aerial wildlife imagery is associated with specific studies and is not publicly available, leading to small sample sizes for algorithm training [83].
Habitat monitoring faces technical constraints in extreme conditions. Effective fire monitoring requires advanced thermal sensors and robust image recognition software [12], while taxonomic biases persist in data collection [110]. Dense canopies challenge species identification and health assessment algorithms [13], and operational variables like altitude and noise trigger stress responses in wildlife [16].

6.2. Legal Restrictions, Certification, and Costs

Finally, flying drones for wildlife research in restricted areas, such as national parks, protected zones, or near airports, requires strict compliance with aviation and environmental regulations [118]. Many countries mandate special permits for drone operations in these regions, often requiring researchers to demonstrate minimal disturbance to wildlife and adherence to no-fly zones (e.g., the US Federal Aviation Administration prohibits UAV flights in designated wilderness areas without authorization; the European Union’s EASA regulations impose similar restrictions under the specific category of drone operations; and wildlife agencies may enforce seasonal bans to protect sensitive species during breeding or migration). Additionally, certification requirements for flying drones vary by region, but typically include a remote pilot license. Costs also remain a critical factor: expenses include not only drone hardware (USD 1000–USD 10,000+) but also permits, insurance, training, and battery replacements, which can limit accessibility for underfunded organizations [7]. Balancing regulatory compliance, ecological sensitivity, and budget constraints is essential for effective and legal UAV deployment in wildlife studies.

6.3. Future Directions

The future of wildlife monitoring lies in developing smarter, more integrated technological solutions that can overcome the current limitations while minimizing ecological impact. Key advances must come in three areas: collaborative and standardized datasets, technical advancements, and ethical deployment.
To advance AI-driven wildlife monitoring, researchers should leverage platforms with datasets including drone imagery and other data streams, such as Conservation AI [119] for real-time species detection via edge/cloud computing, LILA BC [120] for curated training datasets, and Wildlife Insights [121] for automated image analysis using EfficientNet/ResNet architectures. Platforms like Movebank [122,123] enable migration modeling through GPS/telemetry data, while large-scale repositories like GBIF [124,125] with 2.3B+ species records and hybrid platforms like EarthRanger [126] integrate drones and sensor data for anti-poaching alerts, and iNaturalist [75,127] combines crowdsourced labeling with AI pre-sorting to scale biodiversity monitoring. Future development must prioritize standardized data formats, federated learning for cross-ecosystem generalization, and few-shot learning to overcome data scarcity in conservation applications.
On the technical front, next-generation monitoring systems must leverage multi-sensor fusion, enabling richer environmental insights. Edge computing and quantum-inspired algorithms will facilitate real-time analysis in remote areas, while autonomous drone swarms and hybrid air–ground networks (connected via 5G or LoRaWAN) will expand coverage and operational flexibility. These systems will allow dynamic flight path adjustments for tracking animal movements or detecting ecological threats. High-speed 5G networks will support real-time HD video streaming for rapid response (e.g., poaching detection), while LoRaWAN’s low-power, long-range connectivity will sustain long-term monitoring in remote regions. Additionally, pretrained foundation models tailored for wildlife applications could revolutionize species identification and behavior prediction.
Finally, while advances in drone technology enable unprecedented capabilities, such as predictive population modeling and individual animal tracking across entire habitats, their success hinges on addressing ethical challenges, including wildlife disturbance, stress, and habitat disruption. To minimize harm, operators must adopt standardized protocols that prioritize conservation, such as maintaining safe distances, avoiding sensitive biological periods (e.g., breeding or nesting seasons), and using low-noise drone models. Pre-flight assessments of species-specific tolerances and habitat conditions are critical, as is interdisciplinary collaboration between technologists and conservationists to enforce best practices like gradual approach techniques and altitude management. By integrating these measures, drones can achieve wildlife monitoring tasks while ensuring ethical deployment and minimizing ecological impact.

7. Conclusions

The integration of drones and AI-driven solutions has significantly advanced wildlife monitoring, offering numerous benefits across key applications. AI-powered techniques, particularly DL models like YOLO, Faster R-CNN, and GANs, have revolutionized species identification, achieving high accuracy in detecting diverse animals, from large mammals to cryptic and nocturnal species. Drones equipped with thermal and visual cameras enhance tracking and anti-poaching, enabling real-time surveillance and efficient population estimation, while multispectral sensors are used for large-scale environmental monitoring. AI-driven tools such as Deep SORT and IDTRACKER.AI improve animal movement analysis, while drone swarms and edge computing optimize anti-poaching operations in challenging terrains.
However, limitations persist. Data diversity and model generalizability remain critical challenges, with AI performance often hindered by occlusions, background interference, and sensor resolution constraints. Geographic and taxonomic biases in studies limit the applicability of these technologies across all ecosystems. Operational hurdles, such as battery life, weather dependencies, and wildlife stress responses, further restrict drone effectiveness. Additionally, the lack of standardized protocols and publicly available datasets impedes the development of robust AI solutions.
Despite these challenges, the combined potential of drones and AI in wildlife conservation is enormous. Addressing the current limitations, future wildlife monitoring requires smarter, integrated solutions focusing on three key areas: advanced sensors and edge computing for richer real-time data, adaptable AI models for improved analysis, and autonomous networks for expanded coverage. These innovations will enable predictive tracking and population modeling while minimizing ecological disruption. Success depends on interdisciplinary collaboration and standardized protocols to ensure responsible, high-accuracy deployment.

Funding

This research received no external funding.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Conflicts of Interest

The author declares no conflicts of interest.

Abbreviations

ACO: Ant Colony Optimization
AOS: Airborne Optical Sectioning
CC: Companion Computer
CNN: Convolutional Neural Network
DL: Deep Learning
DNN: Deep Neural Network
ENVI: Environment for Visualizing Images
GAN: Generative Adversarial Network
GIS: Geographic Information System
GPS: Global Positioning System
GSM: Global System for Mobile Communications
LiDAR: Light Detection and Ranging
LoRaWAN: Long-Range Wide-Area Network
ML: Machine Learning
PSO: Particle Swarm Optimization
R-CNN: Region-based Convolutional Neural Network
RGB: Red, Green, and Blue
RL: Reinforcement Learning
RNN: Recurrent Neural Network
ROV: Remotely Operated Vehicle
RTK-GNSS: Real-Time Kinematic Global Navigation Satellite System
UAV: Unmanned Aerial Vehicle
VHF: Very High Frequency
VTOL: Vertical Take-Off and Landing
WSN: Wireless Sensor Network
XAI: Explainable Artificial Intelligence
YOLO: You Only Look Once

References

  1. Abrahms, B.; Carter, N.H.; Clark-Wolf, T.J.; Gaynor, K.M.; Johansson, E.; McInturff, A.; Nisi, A.C.; Rafiq, K.; West, L. Climate Change as a Global Amplifier of Human–Wildlife Conflict. Nat. Clim. Chang. 2023, 13, 224–234. [Google Scholar] [CrossRef]
  2. Lahoz-Monfort, J.J.; Magrath, M.J.L. A Comprehensive Overview of Technologies for Species and Habitat Monitoring and Conservation. BioScience 2021, 71, 1038–1062. [Google Scholar] [CrossRef]
  3. Sparrow, B.D.; Edwards, W.; Munroe, S.E.M.; Wardle, G.M.; Guerin, G.R.; Bastin, J.-F.; Morris, B.; Christensen, R.; Phinn, S.; Lowe, A.J. Effective Ecosystem Monitoring Requires a Multi-Scaled Approach. Biol. Rev. 2020, 95, 1706–1719. [Google Scholar] [CrossRef]
  4. López, J.J.; Mulero-Pázmány, M. Drones for Conservation in Protected Areas: Present and Future. Drones 2019, 3, 10. [Google Scholar] [CrossRef]
  5. Farley, S.S.; Dawson, A.; Goring, S.J.; Williams, J.W. Situating Ecology as a Big-Data Science: Current Advances, Challenges, and Solutions. Bioscience 2018, 68, 563–576. [Google Scholar] [CrossRef]
  6. Tuia, D.; Kellenberger, B.; Beery, S.; Costelloe, B.R.; Zuffi, S.; Risse, B.; Mathis, A.; Mathis, M.W.; van Langevelde, F.; Burghardt, T.; et al. Perspectives in Machine Learning for Wildlife Conservation. Nat. Commun. 2022, 13, 792. [Google Scholar] [CrossRef]
  7. Corcoran, E.; Winsen, M.; Sudholz, A.; Hamilton, G. Automated Detection of Wildlife Using Drones: Synthesis, Opportunities and Constraints. Methods Ecol. Evol. 2021, 12, 1103–1114. [Google Scholar] [CrossRef]
  8. Kumar, D.; Jakhar, S.D. Artificial Intelligence in Animal Surveillance and Conservation. In Impact of Artificial Intelligence on Organizational Transformation; Scrivener Publishing: Austin, TX, USA, 2022; pp. 73–85. [Google Scholar] [CrossRef]
  9. Saoud, L.S.; Sultan, A.; Elmezain, M.; Heshmat, M.; Seneviratne, L.; Hussain, I. Beyond Observation: Deep Learning for Animal Behavior and Ecological Conservation. Ecol. Inform. 2024, 84, 102893. [Google Scholar] [CrossRef]
  10. Yousefi, D.B.M.; Rafie, A.S.M.; Al-Haddad, S.A.R.; Azrad, S. A Systematic Literature Review on the Use of Deep Learning in Precision Livestock Detection and Localization Using Unmanned Aerial Vehicles. IEEE Access 2022, 10, 80071–80091. [Google Scholar] [CrossRef]
  11. Axford, D.; Sohel, F.; Vanderklift, M.A.; Hodgson, A.J. Collectively Advancing Deep Learning for Animal Detection in Drone Imagery: Successes, Challenges, and Research Gaps. Ecol. Inform. 2024, 83, 102842. [Google Scholar] [CrossRef]
  12. Ivanova, S.; Prosekov, A.; Kaledin, A. A Survey on Monitoring of Wild Animals during Fires Using Drones. Fire 2022, 5, 60. [Google Scholar] [CrossRef]
  13. Buchelt, A.; Adrowitzer, A.; Kieseberg, P.; Gollob, C.; Nothdurft, A.; Eresheim, S.; Tschiatschek, S.; Stampfer, K.; Holzinger, A. Exploring Artificial Intelligence for Applications of Drones in Forest Ecology and Management. For. Ecol. Manag. 2024, 551, 121530. [Google Scholar] [CrossRef]
  14. Pedrazzi, L.; Naik, H.; Sandbrook, C.; Lurgi, M.; Fürtbauer, I.; King, A.J. Advancing Animal Behaviour Research Using Drone Technology. Anim. Behav. 2025, 222, 123147. [Google Scholar] [CrossRef]
  15. Mo, M.; Bonatakis, K. An Examination of Trends in the Growing Scientific Literature on Approaching Wildlife with Drones. Drone Syst. Appl. 2022, 10, 111–139. [Google Scholar] [CrossRef]
  16. Afridi, S.; Laporte-Devylder, L.; Maalouf, G.; Kline, J.M.; Penny, S.G.; Hlebowicz, K.; Cawthorne, D.; Lundquist, U.P.S. Impact of Drone Disturbances on Wildlife: A Review. Drones 2025, 9, 311. [Google Scholar] [CrossRef]
  17. Fergus, P.; Chalmers, C.; Longmore, S.; Wich, S. Harnessing Artificial Intelligence for Wildlife Conservation. Conservation 2024, 4, 685–702. [Google Scholar] [CrossRef]
  18. Reynolds, S.A.; Beery, S.; Burgess, N.; Burgman, M.; Butchart, S.H.M.; Cooke, S.J.; Coomes, D.; Danielsen, F.; Di Minin, E.; Durán, A.P.; et al. The Potential for AI to Revolutionize Conservation: A Horizon Scan. Trends Ecol. Evol. 2025, 40, 191–207. [Google Scholar] [CrossRef]
  19. Kumarasan, T.; Vinoth, M.; Durgaram, M.; Benedict, M. Drone-Enabled Wildlife Monitoring System: Revolutionizing Conservation Efforts. NeuroQuantology 2023, 18, 330–339. [Google Scholar] [CrossRef]
  20. Duffy, J.P.; Cunliffe, A.M.; DeBell, L.; Sandbrook, C.; Wich, S.A.; Shutler, J.D.; Myers-Smith, I.H.; Varela, M.R.; Anderson, K.; James Duffy, C.P. Location, Location, Location: Considerations When Using Lightweight Drones in Challenging Environments. Interdiscip. Perspect. Remote Sens. Ecol. Conserv. 2018, 4, 7–19. [Google Scholar] [CrossRef]
  21. Mo, M.; Bonatakis, K. Approaching Wildlife with Drones: Using Scientific Literature to Identify Factors to Consider for Minimising Disturbance. Aust. Zool. 2022, 42, 1–29. [Google Scholar] [CrossRef]
  22. Perz, R.; Wronowski, K.; Domanski, R.; Dąbrowski, I. Case Study of Detection and Monitoring of Wildlife by UAVs Equipped with RGB Camera and TIR Camera. Aircr. Eng. Aerosp. Technol. 2023, 95, 1461–1469. [Google Scholar] [CrossRef]
  23. Rietz, J.; van Beeck Calkoen, S.T.S.; Ferry, N.; Schlüter, J.; Wehner, H.; Schindlatz, K.-H.; Lackner, T.; von Hoermann, C.; Conraths, F.J.; Müller, J.; et al. Drone-Based Thermal Imaging in the Detection of Wildlife Carcasses and Disease Management. Transbound. Emerg. Dis. 2023, 2023, 5517000. [Google Scholar] [CrossRef] [PubMed]
  24. McCraine, D.; Samiappan, S.; Kohler, L.; Sullivan, T.; Will, D.J. Automated Hyperspectral Feature Selection and Classification of Wildlife Using Uncrewed Aerial Vehicles. Remote Sens. 2024, 16, 406. [Google Scholar] [CrossRef]
  25. Luan, K.; Zhao, X.; Kong, W.; Chen, T.; Xie, H.; Liu, X.; Wang, F. A Novel Ray Tracing Approach for Bathymetry Using UAV-Based Dual-Polarization Photon-Counting LiDAR. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 20284–20303. [Google Scholar] [CrossRef]
  26. Michez, A.; Broset, S.; Lejeune, P. Ears in the Sky: Potential of Drones for the Bioacoustic Monitoring of Birds and Bats. Drones 2021, 5, 9. [Google Scholar] [CrossRef]
  27. Hui, N.T.; Lo, E.K.; Moss, J.B.; Gerber, G.P.; Welch, M.E.; Kastner, R.; Schurgers, C. A More Precise Way to Localize Animals Using Drones. J. Field Robot. 2021, 38, 917–928. [Google Scholar] [CrossRef]
  28. Burgués, J.; Marco, S. Environmental Chemical Sensing Using Small Drones: A Review. Sci. Total Environ. 2020, 748, 141172. [Google Scholar] [CrossRef] [PubMed]
  29. Janiesch, C.; Zschech, P.; Heinrich, K. Machine Learning and Deep Learning. Electron Mark. 2021, 31, 685–695. [Google Scholar] [CrossRef]
  30. Bau, D.; Zhu, J.Y.; Strobelt, H.; Zhou, B.; Tenenbaum, J.B.; Freeman, W.T.; Torralba, A. GAN Dissection: Visualizing and Understanding Generative Adversarial Networks. In Proceedings of the 7th International Conference on Learning Representations, ICLR 2019, New Orleans, LA, USA, 6–9 May 2019. [Google Scholar]
  31. Zhou, M.; Elmore, J.A.; Samiappan, S.; Evans, K.O.; Pfeiffer, M.B.; Blackwell, B.F.; Iglay, R.B. Improving Animal Monitoring Using Small Unmanned Aircraft Systems (SUAS) and Deep Learning Networks. Sensors 2021, 21, 5697. [Google Scholar] [CrossRef]
  32. Graving, J.M.; Chae, D.; Naik, H.; Li, L.; Koger, B.; Costelloe, B.R.; Couzin, I.D. DeepPoseKit, a Software Toolkit for Fast and Robust Animal Pose Estimation Using Deep Learning. eLife 2019, 8, e47994. [Google Scholar] [CrossRef]
  33. Mou, C.; Liu, T.; Zhu, C.; Cui, X. WAID: A Large-Scale Dataset for Wildlife Detection with Drones. Appl. Sci. 2023, 13, 10397. [Google Scholar] [CrossRef]
  34. Mou, C.; Zhu, C.; Liu, T.; Cui, X. A Novel Efficient Wildlife Detecting Method with Lightweight Deployment on UAVs Based on YOLOv7. IET Image Process 2024, 18, 1296–1314. [Google Scholar] [CrossRef]
  35. He, A.; Li, X.; Wu, X.; Su, C.; Chen, J.; Xu, S.; Guo, X. ALSS-YOLO: An Adaptive Lightweight Channel Split and Shuffling Network for TIR Wildlife Detection in UAV Imagery. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 17308–17326. [Google Scholar] [CrossRef]
  36. Zhang, Y.; Cai, Z. CE-RetinaNet: A Channel Enhancement Method for Infrared Wildlife Detection in UAV Images. IEEE Trans. Geosci. Remote Sens. 2023, 61, 1–12. [Google Scholar] [CrossRef]
  37. Du, J. Understanding of Object Detection Based on CNN Family and YOLO. J. Phys. Conf. Ser. 2018, 1004, 012029. [Google Scholar] [CrossRef]
  38. Werber, Y.; Sextin, H.; Yovel, Y.; Sapir, N. BATScan: A Radar Classification Tool Reveals Large-Scale Bat Migration Patterns. Methods Ecol. Evol. 2023, 14, 1764–1779. [Google Scholar] [CrossRef]
  39. Petso, T.; Jamisola, R.S.; Mpoeleng, D. Review on Methods Used for Wildlife Species and Individual Identification. Eur. J. Wildl. Res. 2022, 68, 3. [Google Scholar] [CrossRef]
  40. Lenzi, J.; Barnas, A.F.; ElSaid, A.A.; Desell, T.; Rockwell, R.F.; Ellis-Felege, S.N. Artificial Intelligence for Automated Detection of Large Mammals Creates Path to Upscale Drone Surveys. Sci. Rep. 2023, 13, 947. [Google Scholar] [CrossRef]
  41. Bhattacharya, S.; Sultana, M.; Das, B.; Roy, B. A Deep Neural Network Framework for Detection and Identification of Bengal Tigers. Innov. Syst. Softw. Eng. 2024, 20, 151–159. [Google Scholar] [CrossRef]
  42. Rančić, K.; Blagojević, B.; Bezdan, A.; Ivošević, B.; Tubić, B.; Vranešević, M.; Pejak, B.; Crnojević, V.; Marko, O. Animal Detection and Counting from UAV Images Using Convolutional Neural Networks. Drones 2023, 7, 179. [Google Scholar] [CrossRef]
  43. Mücher, C.A.; Los, S.; Franke, G.J.; Kamphuis, C. Detection, Identification and Posture Recognition of Cattle with Satellites, Aerial Photography and UAVs Using Deep Learning Techniques. Int. J. Remote Sens. 2022, 43, 2377–2392. [Google Scholar] [CrossRef]
  44. Petso, T.; Jamisola, R.S.; Mpoeleng, D.; Bennitt, E.; Mmereki, W. Automatic Animal Identification from Drone Camera Based on Point Pattern Analysis of Herd Behaviour. Ecol. Inform. 2021, 66, 101485. [Google Scholar] [CrossRef]
  45. Kellenberger, B.; Marcos, D.; Lobry, S.; Tuia, D. Half a Percent of Labels Is Enough: Efficient Animal Detection in UAV Imagery Using Deep CNNs and Active Learning. IEEE Trans. Geosci. Remote Sens. 2019, 57, 9524–9533. [Google Scholar] [CrossRef]
  46. Rahman, D.A.; Sitorus, A.B.Y.; Condro, A.A. From Coastal to Montane Forest Ecosystems, Using Drones for Multi-Species Research in the Tropics. Drones 2021, 6, 6. [Google Scholar] [CrossRef]
  47. Krishnan, B.S.; Jones, L.R.; Elmore, J.A.; Samiappan, S.; Evans, K.O.; Pfeiffer, M.B.; Blackwell, B.F.; Iglay, R.B. Fusion of Visible and Thermal Images Improves Automated Detection and Classification of Animals for Drone Surveys. Sci. Rep. 2023, 13, 10385. [Google Scholar] [CrossRef]
  48. Lee, S.; Song, Y.; Kil, S.-H. Feasibility Analyses of Real-Time Detection of Wildlife Using UAV-Derived Thermal and RGB Images. Remote Sens. 2021, 13, 2169. [Google Scholar] [CrossRef]
  49. Corcoran, E.; Denman, S.; Hanger, J.; Wilson, B.; Hamilton, G. Automated Detection of Koalas Using Low-Level Aerial Surveillance and Machine Learning. Sci. Rep. 2019, 9, 3208. [Google Scholar] [CrossRef]
  50. Hamilton, G.; Corcoran, E.; Denman, S.; Hennekam, M.E.; Koh, L.P. When You Can’t See the Koalas for the Trees: Using Drones and Machine Learning in Complex Environments. Biol. Conserv. 2020, 247, 108598. [Google Scholar] [CrossRef]
51. Povlsen, P.; Bruhn, D.; Durdevic, P.; Arroyo, D.; Pertoldi, C. Using YOLO Object Detection to Identify Hare and Roe Deer in Thermal Aerial Video Footage—Possible Future Applications in Real-Time Automatic Drone Surveillance and Wildlife Monitoring. Drones 2024, 8, 2. [Google Scholar] [CrossRef]
  52. Dujon, A.M.; Ierodiaconou, D.; Geeson, J.J.; Arnould, J.P.Y.; Allan, B.M.; Katselidis, K.A.; Schofield, G. Machine Learning to Detect Marine Animals in UAV Imagery: Effect of Morphology, Spacing, Behaviour and Habitat. Remote Sens. Ecol. Conserv. 2021, 7, 341–354. [Google Scholar] [CrossRef]
  53. Mpouziotas, D.; Karvelis, P.; Tsoulos, I.; Stylios, C. Automated Wildlife Bird Detection from Drone Footage Using Computer Vision Techniques. Appl. Sci. 2023, 13, 7787. [Google Scholar] [CrossRef]
  54. Backman, K.; Wood, J.; Brandimarti, M.; Beranek, C.T.; Roff, A. Human Inspired Deep Learning to Locate and Classify Terrestrial and Arboreal Animals in Thermal Drone Surveys. Methods Ecol. Evol. 2025, 16, 1239–1254. [Google Scholar] [CrossRef]
  55. Jin, T.; Si, X.; Liu, J.; Ding, P. An Integrated Animal Tracking Technology Combining a GPS Tracking System with a UAV. Methods Ecol. Evol. 2023, 14, 505–511. [Google Scholar] [CrossRef]
  56. Quamar, M.M.; Al-Ramadan, B.; Khan, K.; Shafiullah, M.; El Ferik, S. Advancements and Applications of Drone-Integrated Geographic Information System Technology—A Review. Remote Sens. 2023, 15, 5039. [Google Scholar] [CrossRef]
57. Zhang, G.; Zhao, Y.; Fu, P.; Luo, W.; Shao, Q.; Zhang, T.; Yu, Z. A Reliable Unmanned Aerial Vehicle Multi-Target Tracking System with Global Motion Compensation for Monitoring Procapra przewalskii. Ecol. Inform. 2024, 81, 102556. [Google Scholar] [CrossRef]
58. Walter, T.; Couzin, I.D. TRex, a Fast Multi-Animal Tracking System with Markerless Identification, and 2D Estimation of Posture and Visual Fields. eLife 2021, 10, e64000. [Google Scholar] [CrossRef]
  59. Van Nguyen, H.; Chesser, M.; Koh, L.P.; Rezatofighi, S.H.; Ranasinghe, D.C. TrackerBots: Autonomous Unmanned Aerial Vehicle for Real-Time Localization and Tracking of Multiple Radio-Tagged Animals. J. Field Robot. 2019, 36, 617–635. [Google Scholar] [CrossRef]
  60. Shafer, M.W.; Vega, G.; Rothfus, K.; Flikkema, P. UAV Wildlife Radiotelemetry: System and Methods of Localization. Methods Ecol. Evol. 2019, 10, 1783–1795. [Google Scholar] [CrossRef]
  61. Mohammadi, M.; Shafer, M.W. UAV Path Planning for Precision Multi-Target Localization. IEEE Access 2025, 13, 63715–63728. [Google Scholar] [CrossRef]
  62. Kabir, R.H.; Lee, K. Wildlife Monitoring Using a Multi-UAV System with Optimal Transport Theory. Appl. Sci. 2021, 11, 4070. [Google Scholar] [CrossRef]
  63. Kavwele, C.M.; Hopcraft, J.G.C.; Davy, D.; Torney, C.J. Automated and Repeated Aerial Observations of GPS-Collared Animals Using UAVs and Open-Source Electronics. Ecosphere 2024, 15, e4841. [Google Scholar] [CrossRef]
  64. Koger, B.; Deshpande, A.; Kerby, J.T.; Graving, J.M.; Costelloe, B.R.; Couzin, I.D. Quantifying the Movement, Behaviour and Environmental Context of Group-Living Animals Using Drones and Computer Vision. J. Anim. Ecol. 2023, 92, 1357–1371. [Google Scholar] [CrossRef] [PubMed]
  65. Chowdhury, S.; Marufuzzaman, M.; Tunc, H.; Bian, L.; Bullington, W. A Modified Ant Colony Optimization Algorithm to Solve a Dynamic Traveling Salesman Problem: A Case Study with Drones for Wildlife Surveillance. J. Comput. Des. Eng. 2019, 6, 368–386. [Google Scholar] [CrossRef]
  66. Han, W.; Wang, J.; Wang, Y.; Xu, B. Multi-UAV Flocking Control With a Hierarchical Collective Behavior Pattern Inspired by Sheep. IEEE Trans. Aerosp. Electron. Syst. 2024, 60, 2267–2276. [Google Scholar] [CrossRef]
  67. Ergunsah, S.; Tümen, V.; Kosunalp, S.; Demir, K. Energy-Efficient Animal Tracking with Multi-Unmanned Aerial Vehicle Path Planning Using Reinforcement Learning and Wireless Sensor Networks. Concurr. Comput. 2023, 35, e7527. [Google Scholar] [CrossRef]
  68. Sangeetha, R.G.; Srivastava, Y.; Hemanth, C.; Naicker, H.S.; Kumar, A.P.; Vidhyadharan, S. Unmanned Aerial Surveillance and Tracking System in Forest Areas for Poachers and Wildlife. IEEE Access 2024, 12, 187572–187586. [Google Scholar] [CrossRef]
  69. Schedl, D.C.; Kurmi, I.; Bimber, O. Search and Rescue with Airborne Optical Sectioning. Nat. Mach. Intell. 2020, 2, 783–790. [Google Scholar] [CrossRef]
  70. Kurmi, I.; Schedl, D.C.; Bimber, O. Combined Person Classification with Airborne Optical Sectioning. Sci. Rep. 2022, 12, 3804. [Google Scholar] [CrossRef]
  71. Song, Z.; Yan, Y.; Cao, Y.; Jin, S.; Qi, F.; Li, Z.; Lei, T.; Chen, L.; Jing, Y.; Xia, J.; et al. An Infrared Dataset for Partially Occluded Person Detection in Complex Environment for Search and Rescue. Sci. Data 2025, 12, 300. [Google Scholar] [CrossRef]
  72. Hua, A.; Martin, K.; Shen, Y.; Chen, N.; Mou, C.; Sterk, M.; Reinhard, B.; Reinhard, F.F.; Lee, S.; Alibhai, S.; et al. Protecting Endangered Megafauna through AI Analysis of Drone Images in a Low-Connectivity Setting: A Case Study from Namibia. PeerJ 2022, 10, e13779. [Google Scholar] [CrossRef]
  73. Kerschner, R.; Nathan, R.J.A.A.; Mantiuk, R.K.; Bimber, O. Stereoscopic Depth Perception through Foliage. Sci. Rep. 2024, 14, 23056. [Google Scholar] [CrossRef] [PubMed]
  74. Madhasu, N.; Pande, S.D. Revolutionizing Wildlife Protection: A Novel Approach Combining Deep Learning and Night-Time Surveillance. Multimed. Tools Appl. 2024. [Google Scholar] [CrossRef]
75. How Are iNaturalist Data Used for Research? iNaturalist Help. Available online: https://help.inaturalist.org/en/support/solutions/articles/151000170341-how-are-inaturalist-data-used-for-research- (accessed on 14 April 2025).
  76. Li, X.; Huang, H.; Savkin, A.V. A Novel Method for Protecting Swimmers and Surfers from Shark Attacks Using Communicating Autonomous Drones. IEEE Internet Things J. 2020, 7, 9884–9894. [Google Scholar] [CrossRef]
  77. Chalmers, C.; Fergus, P.; Aday Curbelo Montanez, C.; Longmore, S.N.; Wich, S.A. Video Analysis for the Detection of Animals Using Convolutional Neural Networks and Consumer-Grade Drones. J. Unmanned Veh. Syst. 2021, 9, 112–127. [Google Scholar] [CrossRef]
  78. Natarajan, B.; Elakkiya, R.; Bhuvaneswari, R.; Saleem, K.; Chaudhary, D.; Samsudeen, S.H. Creating Alert Messages Based on Wild Animal Activity Detection Using Hybrid Deep Neural Networks. IEEE Access 2023, 11, 67308–67321. [Google Scholar] [CrossRef]
  79. Amala Arokia Nathan, R.J.; Kurmi, I.; Bimber, O. Drone Swarm Strategy for the Detection and Tracking of Occluded Targets in Complex Environments. Commun. Eng. 2023, 2, 55. [Google Scholar] [CrossRef]
  80. Monzambe, G.; Valentine, L.; Skosana, X. Performance Evaluation of Unmanned Aerial Vehicles Usage in Wildlife Surveillance Operations Using Agent-Based Simulation Modeling. Procedia CIRP 2024, 128, 804–809. [Google Scholar] [CrossRef]
  81. Maier, G.; Albanese, A.; Ciavotta, M.; Ciulli, N.; Giordano, S.; Giusti, E.; Salvatore, A.; Schembra, G. WatchEDGE: Smart Networking for Distributed AI-Based Environmental Control. Comput. Netw. 2024, 243, 110248. [Google Scholar] [CrossRef]
  82. He, G.; Yan, X.; Zhang, X.; Guo, M.; Wang, J.; Wei, Q.; Shen, Y.; Wang, C.; Lei, Y.; Jin, X.; et al. Undertaking Wildlife Surveys with Unmanned Aerial Vehicles in Rugged Mountains with Dense Vegetation: A Tentative Model Using Sichuan Snub-Nosed Monkeys in China. Glob. Ecol. Conserv. 2023, 48, e02685. [Google Scholar] [CrossRef]
  83. Iglay, R.B.; Jones, L.R.; Elmore, J.A.; Evans, K.O.; Samiappan, S.; Pfeiffer, M.B.; Blackwell, B.F. Wildlife Monitoring with Drones: A Survey of End Users. Wildl. Soc. Bull. 2024, 48, e1533. [Google Scholar] [CrossRef]
  84. Hodgson, J.C.; Mott, R.; Baylis, S.M.; Pham, T.T.; Wotherspoon, S.; Kilpatrick, A.D.; Raja Segaran, R.; Reid, I.; Terauds, A.; Koh, L.P. Drones Count Wildlife More Accurately and Precisely than Humans. Methods Ecol. Evol. 2018, 9, 1160–1167. [Google Scholar] [CrossRef]
  85. Lyons, M.B.; Brandis, K.J.; Murray, N.J.; Wilshire, J.H.; McCann, J.A.; Kingsford, R.T.; Callaghan, C.T. Monitoring Large and Complex Wildlife Aggregations with Drones. Methods Ecol. Evol. 2019, 10, 1024–1035. [Google Scholar] [CrossRef]
  86. Francis, R.J.; Lyons, M.B.; Kingsford, R.T.; Brandis, K.J. Counting Mixed Breeding Aggregations of Animal Species Using Drones: Lessons from Waterbirds on Semi-Automation. Remote Sens. 2020, 12, 1185. [Google Scholar] [CrossRef]
  87. Marchowski, D. Drones, Automatic Counting Tools, and Artificial Neural Networks in Wildlife Population Censusing. Ecol. Evol. 2021, 11, 16214–16227. [Google Scholar] [CrossRef] [PubMed]
  88. Edwards, H.H.; Hostetler, J.A.; Stith, B.M.; Martin, J. Monitoring Abundance of Aggregated Animals (Florida Manatees) Using an Unmanned Aerial System (UAS). Sci. Rep. 2021, 11, 12920. [Google Scholar] [CrossRef]
  89. Infantes, E.; Carroll, D.; Silva, W.T.A.F.; Härkönen, T.; Edwards, S.V.; Harding, K.C. An Automated Work-Flow for Pinniped Surveys: A New Tool for Monitoring Population Dynamics. Front. Ecol. Evol. 2022, 10, 905309. [Google Scholar] [CrossRef]
  90. Chang, B.; Hwang, B.; Lim, W.; Kim, H.; Kang, W.; Park, Y.-S.; Ko, D.W. Enhancing Wildlife Detection Using Thermal Imaging Drones: Designing the Flight Path. Drones 2025, 9, 52. [Google Scholar] [CrossRef]
91. Prosekov, A.; Vesnina, A.; Atuchin, V.; Kuznetsov, A. Robust Algorithms for Drone-Assisted Monitoring of Big Animals in Harsh Conditions of Siberian Winter Forests: Recovery of European Elk (Alces alces) in Salair Mountains. Animals 2022, 12, 1483. [Google Scholar] [CrossRef]
  92. Pagacz, S.; Witczuk, J. Estimating Ground Surface Visibility on Thermal Images from Drone Wildlife Surveys in Forests. Ecol. Inform. 2023, 78, 102379. [Google Scholar] [CrossRef]
  93. Alanezi, M.A.; Shahriar, M.S.; Hasan, M.B.; Ahmed, S.; Sha’aban, Y.A.; Bouchekara, H.R.E.H. Livestock Management With Unmanned Aerial Vehicles: A Review. IEEE Access 2022, 10, 45001–45028. [Google Scholar] [CrossRef]
94. De Vasconcellos, B.C.; Trindade, J.P.P.; Volk, L.B.d.S.; de Pinho, L.B. Method Applied to Animal Monitoring Through VANT Images. IEEE Lat. Am. Trans. 2020, 18, 1280–1287. [Google Scholar] [CrossRef]
  95. Shao, W.; Kawakami, R.; Yoshihashi, R.; You, S.; Kawase, H.; Naemura, T. Cattle Detection and Counting in UAV Images Based on Convolutional Neural Networks. Int. J. Remote Sens. 2020, 41, 31–52. [Google Scholar] [CrossRef]
  96. Xu, B.; Wang, W.; Falzon, G.; Kwan, P.; Guo, L.; Chen, G.; Tait, A.; Schneider, D. Automated Cattle Counting Using Mask R-CNN in Quadcopter Vision System. Comput. Electron. Agric. 2020, 171, 105300. [Google Scholar] [CrossRef]
  97. Xu, B.; Wang, W.; Falzon, G.; Kwan, P.; Guo, L.; Sun, Z.; Li, C. Livestock Classification and Counting in Quadcopter Aerial Images Using Mask R-CNN. Int. J. Remote Sens. 2020, 41, 8121–8142. [Google Scholar] [CrossRef]
  98. Sarwar, F.; Griffin, A.; Rehman, S.U.; Pasang, T. Detecting Sheep in UAV Images. Comput. Electron. Agric. 2021, 187, 106219. [Google Scholar] [CrossRef]
  99. Zhu, P.; Peng, T.; Du, D.; Yu, H.; Zhang, L.; Hu, Q. Graph Regularized Flow Attention Network for Video Animal Counting from Drones. IEEE Trans. Image Process. 2021, 30, 5339–5351. [Google Scholar] [CrossRef]
100. Wang, H.; Zhang, X.; Meng, X.; Song, W.; Chen, Z. Electronic Sheepdog: A Novel Method in UAV-Assisted Wearable Grazing Monitoring. IEEE Internet Things J. 2023, 10, 16036–16047. [Google Scholar] [CrossRef]
  101. Mangewa, L.J.; Ndakidemi, P.A.; Munishi, L.K. Integrating UAV Technology in an Ecological Monitoring System for Community Wildlife Management Areas in Tanzania. Sustainability 2019, 11, 6116. [Google Scholar] [CrossRef]
  102. Akyol, K. An Innovative Hybrid Method Utilizing Fused Transformer-Based Deep Features and Deep Neural Networks for Detecting Forest Fires. Adv. Space Res. 2025, 75, 8583–8598. [Google Scholar] [CrossRef]
  103. Melhim, L.K.B.; Jemmali, M.; Boulila, W.; Alazab, M.; Rani, S.; Campbell, H.; Amdouni, H. Leveraging Drone-Assisted Surveillance for Effective Forest Conservation: A Case Study in Australia’s Daintree Rainforest. IEEE Internet Things J. 2024, 11, 31167–31179. [Google Scholar] [CrossRef]
  104. Kuželka, K.; Surový, P. Automatic Detection and Quantification of Wild Game Crop Damage Using an Unmanned Aerial Vehicle (UAV) Equipped with an Optical Sensor Payload: A Case Study in Wheat. Eur. J. Remote Sens. 2018, 51, 241–250. [Google Scholar] [CrossRef]
  105. Ghazal, M.A.; Mahmoud, A.; Aslantas, A.; Soliman, A.; Shalaby, A.; Benediktsson, J.A.; El-Baz, A. Vegetation Cover Estimation Using Convolutional Neural Networks. IEEE Access 2019, 7, 132563–132576. [Google Scholar] [CrossRef]
  106. Mawardi, Z.; Gautam, D.; Whiteside, T.G. Utilization of Remote Sensing Dataset and a Deep Learning Object Detection Model to Map Siam Weed Infestations. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 18939–18948. [Google Scholar] [CrossRef]
  107. Feng, L.; Zhang, Z.; Ma, Y.; Sun, Y.; Du, Q.; Williams, P.; Drewry, J.; Luck, B. Multitask Learning of Alfalfa Nutritive Value From UAV-Based Hyperspectral Images. IEEE Geosci. Remote Sens. Lett. 2022, 19, 1–5. [Google Scholar] [CrossRef]
  108. Onishi, M.; Ise, T. Explainable Identification and Mapping of Trees Using UAV RGB Image and Deep Learning. Sci. Rep. 2021, 11, 903. [Google Scholar] [CrossRef]
  109. Lee, J.J. Minimizing Sensor Fusion Disruptions in UAV-Based Collaborative Remote Sensing for Wildlife Preservation. IEEE Sens. Lett. 2024, 8, 1–4. [Google Scholar] [CrossRef]
  110. Yaney-Keller, A.; McIntosh, R.R.; Clarke, R.H.; Reina, R.D. Closing the Air Gap: The Use of Drones for Studying Wildlife Ecophysiology. Biol. Rev. 2025, 100, 1206–1228. [Google Scholar] [CrossRef]
  111. Liu, X.; Ho, L.; Bruneel, S.; Goethals, P. Applications of Unmanned Vehicle Systems for Multi-Spatial Scale Monitoring and Management of Aquatic Ecosystems: A Review. Ecol. Inform. 2025, 85, 102926. [Google Scholar] [CrossRef]
  112. Ganie, P.A.; Khatei, A.; Posti, R.; Sidiq, M.J.; Pandey, P.K. Unmanned Aerial Vehicles in Fisheries and Aquaculture: A Comprehensive Overview. Environ. Monit. Assess. 2025, 197, 503. [Google Scholar] [CrossRef]
  113. Sundaram, N.; Meena, S.D. Integrated Animal Monitoring System with Animal Detection and Classification Capabilities: A Review on Image Modality, Techniques, Applications, and Challenges. Artif. Intell. Rev. 2023, 56, 1–51. [Google Scholar] [CrossRef]
  114. Schad, L.; Fischer, J. Opportunities and Risks in the Use of Drones for Studying Animal Behaviour. Methods Ecol. Evol. 2023, 14, 1864–1872. [Google Scholar] [CrossRef]
  115. Hays, G.C.; Taxonera, A.; Renom, B.; Fairweather, K.; Lopes, A.; Cozens, J.; Laloë, J.O. Changes in Mean Body Size in an Expanding Population of a Threatened Species. Proc. R. Soc. B 2022, 289, 20220696. [Google Scholar] [CrossRef]
  116. Puri, A. A Novel Wildlife Poaching Detection Solution Using Spatio-Temporal Data with Dynamic Time Warping. In Proceedings of the 2021 IEEE MIT Undergraduate Research Technology Conference (URTC), Virtual, 8–10 October 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 1–5. [Google Scholar]
  117. Siyanwal, R.; Agarwal, A.; Srirama, S.N. An Energy Efficient Fog-Based Internet of Things Framework to Combat Wildlife Poaching. Sustain. Comput. Inform. Syst. 2025, 45, 101070. [Google Scholar] [CrossRef]
  118. Stöcker, C.; Bennett, R.; Nex, F.; Gerke, M.; Zevenbergen, J. Review of the Current State of UAV Regulations. Remote Sens. 2017, 9, 459. [Google Scholar] [CrossRef]
  119. Getting Started–Conservation AI. Available online: https://www.conservationai.co.uk/getting-started/ (accessed on 15 April 2025).
  120. Data Sets-LILA BC. Available online: https://lila.science/datasets (accessed on 14 April 2025).
121. About Wildlife Insights AI | Wildlife Insights. Available online: https://www.wildlifeinsights.org/about-wildlife-insights-ai# (accessed on 9 June 2025).
  122. Movebank. Available online: https://www.movebank.org/cms/movebank-content/get-started (accessed on 14 April 2025).
  123. Kays, R.; Davidson, S.C.; Berger, M.; Bohrer, G.; Fiedler, W.; Flack, A.; Hirt, J.; Hahn, C.; Gauggel, D.; Russell, B.; et al. The Movebank System for Studying Global Animal Movement and Demography. Methods Ecol. Evol. 2022, 13, 419–431. [Google Scholar] [CrossRef]
  124. Ivanova, N.V.; Shashkov, M.P. The Possibilities of GBIF Data Use in Ecological Research. Russ. J. Ecol. 2021, 52, 1–8. [Google Scholar] [CrossRef]
  125. GBIF Dataset Search. Available online: https://www.gbif.org/dataset/search (accessed on 14 April 2025).
  126. Integrations-EarthRanger. Available online: https://www.earthranger.com/integrations (accessed on 9 June 2025).
127. Di Cecco, G.J.; Barve, V.; Belitz, M.W.; Stucky, B.J.; Guralnick, R.P.; Hurlbert, A.H. Observing the Observers: How Participants Contribute Data to iNaturalist and Implications for Biodiversity Science. Bioscience 2021, 71, 1179–1188. [Google Scholar] [CrossRef]
Figure 1. Search process diagram.
Figure 2. Distribution of publications across academic databases and wildlife domains. W1: Automatic Species Identification; W2: Tracking and Movement Analysis; W3: Surveillance and Anti-Poaching; W4: Population Estimation; W5: Habitat Analysis. Others encompasses papers from Oxford Academic (1), Canadian Science Publishing (2), and PeerJ (1).
Table 1. Identified publications across academic databases and wildlife research domains.
Database | W1 | W2 | W3 | W4 | W5 | Total
IEEE Xplore | 6 | 4 | 3 | 3 | 6 | 22
Springer | 10 | 2 | 5 | 2 | 1 | 20
Wiley | 5 | 6 | 0 | 6 | 1 | 18
MDPI | 6 | 2 | 1 | 3 | 2 | 14
ScienceDirect | 4 | 4 | 2 | 1 | 3 | 14
Taylor & Francis | 2 | 0 | 0 | 2 | 1 | 5
Others | 1 | 1 | 2 | 0 | 0 | 4
Total | 34 | 19 | 13 | 17 | 14 | 97
W1: Automatic Species Identification; W2: Tracking and Movement Analysis; W3: Surveillance and Anti-Poaching; W4: Population Estimation; W5: Habitat Analysis. Others encompasses papers from Oxford Academic (1), Canadian Science Publishing (2), and PeerJ (1).
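The internal consistency of Table 1 can be verified programmatically; the figures below are transcribed from the table itself, and the short script (illustrative only, not part of the original study) cross-checks the row and column totals:

```python
# Papers per database across the five wildlife domains (W1-W5),
# transcribed from Table 1.
counts = {
    "IEEE Xplore":      [6, 4, 3, 3, 6],
    "Springer":         [10, 2, 5, 2, 1],
    "Wiley":            [5, 6, 0, 6, 1],
    "MDPI":             [6, 2, 1, 3, 2],
    "ScienceDirect":    [4, 4, 2, 1, 3],
    "Taylor & Francis": [2, 0, 0, 2, 1],
    "Others":           [1, 1, 2, 0, 0],
}

# Per-database totals (rows) and per-domain totals (columns).
row_totals = {db: sum(v) for db, v in counts.items()}
col_totals = [sum(v[i] for v in counts.values()) for i in range(5)]

print(row_totals["IEEE Xplore"])  # 22
print(col_totals)                 # [34, 19, 13, 17, 14]
print(sum(col_totals))            # 97 papers reviewed in total
```

Both margins agree with the printed table: the domain totals sum to the 97 publications covered by the review.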
Table 2. Referenced AI algorithms across wildlife applications.
Algorithms | Wildlife Applications
Automated Species Identification
Faster R-CNN | Identification of caribou [37]; Bengal tiger [38]; koala [46,47].
CNN | Identification (kudus, giraffes, zebras, rhinos) [42]; marine species (seals, turtles, gannets) [49].
YOLOv3/4/5/7/8 | Deer in forests [39]; cattle [40]; cows, deer, horses [44]; nocturnal species (hares, roe deer) [48]; avian species [50]; arboreal species (koalas, gliders) [51].
Random Forest/SVM/Reg. Trees | Waterbird populations [43].
{SE,WILD,ALSS}-YOLO | Sheep, cattle, seal, camelus, zebra, kiang [30,31]; thermal images [32].
YOLO + TensorFlow | Herd species (elephants, zebras) [41].
ResNet | Species classification (deer, geese, cattle, horses) [28].
GAN (BATScan) | Bat species classification (>30 species) [35].
RetinaNet | Thermal imagery [33].
Sobel edge-based method | Detection of large mammals [45].
Tracking and Behavioral Analysis
CNN | Gelada monkey pose/movement tracking [61].
YOLOv3/7 | Przewalski's gazelle (Procapra przewalskii) tracking [54,60].
Reinforcement Learning | Robotic shepherding system [63]; zebra tracking [64].
DenseNet (DeepPoseKit) | Animal pose estimation [29].
idtracker.ai (TRex) | Multi-animal tracking and 2D pose estimation [55].
Particle Filter (PF) | Localization and tracking of multiple radio-tagged animals [56,57].
Pareto algorithm | UAV waypoint optimization for tracking VHF-tagged animals [58].
Optimal Transport (OT) | Multi-UAV modeling of animal herd movement with OT theory [59].
Ant Colony Opt. (ACO) | Deer-tracking strategies based on ACO [62].
Surveillance and Anti-Poaching
Region-based CNN | Shark detection from aerial imagery and IoT sensors [73].
ResNet | Rhino/vehicle detection with GSM alerts [74].
YOLOv3/v5/v8 | Occluded-person detection under canopy [66,67]; poacher detection and alert issuing [65]; occluded persons in complex environments [68]; surveillance of black rhinos, giraffes, ostriches, springboks [69].
YOLOR | SMS-based wild-animal activity alerts (25 classes) [75].
GAN and YOLO | Nocturnal surveillance trained with iNaturalist datasets [71,72].
Particle Swarm Opt. | Detection of standing or walking people in occluded forest [76].
Agent-based modeling | Evaluation of a wildlife surveillance operation [77].
Population Estimation
Random Forest | Mapping and counting nests in waterbird breeding colonies [82,83].
CNN | Counting animals from images at 100 m altitude [91]; sheep counting [95].
CNN and Picterra | Pinniped (seal) surveys [86].
Mask R-CNN | Cattle detection and counting [93,94].
YOLOv2 | Cattle detection and counting [92].
DenoiSeg | Waterbird counting during breeding and non-breeding periods [84].
TIR Object Finder software | Monitoring and surveying European elk [88].
Graph Reg. Flow Attention Net | Density-map estimation for video animal counting [96].
Otsu thresholding | Thermal-image segmentation for animal counts [89].
Habitat Analysis
CNN | Vegetation segmentation with multispectral cameras [102]; tree species classification for forest inventories [105].
Transformer–CNN hybrid | Early wildfire detection in thermal imagery [99].
YOLOv5 | Invasive Siam weed detection in UAV imagery [103].
LSTM-RNN | Alfalfa quality prediction from hyperspectral drone imagery [104].
Two-group clustering | Drone-swarm optimization for detecting fire encroachment [100].
MATLAB morphological filters | Delineation of wild-game crop damage in drone imagery [101].
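Some of the techniques in Table 2 are simple enough to sketch directly. The Otsu-thresholding approach listed under Population Estimation, for example, separates warm animal bodies from cooler background in a thermal frame and then counts the resulting bright blobs. The following is a minimal, self-contained illustration on a hypothetical toy frame, not the cited authors' code:

```python
def otsu_threshold(pixels):
    """Return the intensity (0-255) that maximizes between-class variance."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w_bg, sum_bg = 0, 0.0
    for t in range(256):
        w_bg += hist[t]          # background weight grows as t sweeps up
        if w_bg == 0:
            continue
        w_fg = total - w_bg
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (total_sum - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def count_blobs(image, threshold):
    """Count 4-connected components of pixels brighter than the threshold."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] > threshold and not seen[r][c]:
                count += 1
                stack = [(r, c)]
                seen[r][c] = True
                while stack:          # flood-fill this warm blob
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] > threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return count

# Toy 6x8 "thermal frame": two warm animals (~200) on cool ground (~30).
frame = [
    [30, 32, 31, 30, 30, 31, 30, 30],
    [30, 200, 210, 30, 30, 30, 30, 30],
    [30, 205, 198, 30, 30, 30, 30, 30],
    [30, 30, 30, 30, 30, 199, 201, 30],
    [30, 30, 30, 30, 30, 30, 203, 30],
    [30, 31, 30, 30, 30, 30, 30, 30],
]
t = otsu_threshold([p for row in frame for p in row])
print(count_blobs(frame, t))  # two separate warm blobs -> 2
```

Production pipelines typically use library implementations (e.g., OpenCV's `cv2.threshold` with `THRESH_OTSU`), but the logic above captures the core idea behind this class of thermal counting methods.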

Share and Cite

MDPI and ACS Style

Aliane, N. Drones and AI-Driven Solutions for Wildlife Monitoring. Drones 2025, 9, 455. https://doi.org/10.3390/drones9070455

