Review

Integration of Drones in Landscape Research: Technological Approaches and Applications

1 Department of Landscape Architecture, Architecture and Design Faculty, Atatürk University, 25240 Erzurum, Türkiye
2 Department of Architecture, Architecture and Design Faculty, Atatürk University, 25240 Erzurum, Türkiye
3 Doctoral Programme in Landscape Architecture, Institute of Science, Atatürk University, 25240 Erzurum, Türkiye
4 Horticulture Programme, Baskil Vocational College, Fırat University, 23119 Elazığ, Türkiye
* Author to whom correspondence should be addressed.
Drones 2025, 9(9), 603; https://doi.org/10.3390/drones9090603
Submission received: 31 May 2025 / Revised: 18 August 2025 / Accepted: 19 August 2025 / Published: 26 August 2025
(This article belongs to the Special Issue Drones for Green Areas, Green Infrastructure and Landscape Monitoring)

Abstract

Drones have rapidly emerged as transformative tools in landscape research, enabling high-resolution spatial data acquisition, real-time environmental monitoring, and advanced modelling that surpass the limitations of traditional methodologies. This scoping review systematically explores and synthesises the technological applications of drones within the context of landscape studies, addressing a significant gap in the integration of Uncrewed Aerial Systems (UASs) into environmental and spatial planning disciplines. The study investigates the typologies of drone platforms—including fixed-wing, rotary-wing, and hybrid systems—alongside a detailed examination of sensor technologies such as RGB, LiDAR, multispectral, and hyperspectral imaging. Following the PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews) guidelines, a comprehensive literature search was conducted across Scopus, Web of Science, and Google Scholar, utilising predefined inclusion and exclusion criteria. The findings reveal that drone technologies are predominantly applied in mapping and modelling, vegetation and biodiversity analysis, water resource management, urban planning, cultural heritage documentation, and sustainable tourism development. Notably, vegetation analysis and water management have shown a remarkable surge in application over the past five years, highlighting global shifts towards sustainability-focused landscape interventions. These applications are critically evaluated in terms of spatial efficiency, operational flexibility, and interdisciplinary relevance. This review concludes that integrating drones with Geographic Information Systems (GISs), artificial intelligence (AI), and remote sensing frameworks substantially enhances analytical capacity, supports climate-resilient landscape planning, and offers novel pathways for multi-scalar environmental research and practice.

1. Introduction

Landscape research plays a pivotal role in environmental planning, sustainable resource management, and urban design by providing essential spatial insights into natural and built environments. Traditional landscape analysis methods—including field surveys, satellite imagery, and Geographic Information Systems (GISs)—have long served as core tools in this field; however, they often fall short in terms of temporal resolution, operational flexibility, and cost-effectiveness [1]. The growing demand for real-time high-precision spatial data has underscored the limitations of these conventional techniques, creating a need for more agile and scalable alternatives.
Despite the growing volume of studies involving Uncrewed Aerial Systems (UASs), few reviews offer a comprehensive synthesis of their evolving roles within landscape research, particularly through a technological and application-oriented lens. As global landscape challenges become more urgent—ranging from urbanisation pressures and biodiversity loss to climate adaptation and sustainable development—the integration of drone technologies requires a more structured evaluation. This review responds to this need by mapping out typologies, application areas, and prospective innovations related to UAS use in landscape research. It also aims to provide researchers and practitioners with a consolidated framework to inform data-driven, adaptive, and sustainable planning strategies.
This scoping review is particularly essential due to the fragmented nature of drone-related landscape research, where existing studies often focus on specific technologies or isolated case areas without providing a comprehensive synthesis of technological typologies, sensor integrations, or application domains. Previous reviews have typically emphasised either agricultural uses or remote sensing capabilities in isolation, lacking an interdisciplinary framework that bridges landscape architecture, planning, environmental science, and geospatial engineering. In this context, this review fills a critical gap by systematically mapping current trends in UAS technologies, sensor capabilities, and landscape-related applications. Adhering to the PRISMA-ScR guidelines, this study contributes to the literature by not only cataloguing use cases, but also by identifying technological patterns and underexplored areas, which can inform future research agendas and policy developments.
Over the past decade, Uncrewed Aerial Systems (UASs), commonly referred to as drones, have emerged as transformative assets in landscape research, offering unprecedented capabilities in high-resolution imaging, precise spatial data collection, and three-dimensional (3D) modelling [2,3]. These platforms—ranging from fixed-wing and rotary-wing to hybrid systems—have become indispensable tools for landscape scientists, environmental engineers, and urban planners [4]. Drones not only provide a faster and more cost-effective alternative to traditional methods, but also enable comprehensive coverage of vast and inaccessible terrains, facilitating the precise monitoring of temporal and spatial landscape dynamics [5].
The integration of drones into landscape research has catalysed significant advancements across multiple domains, including land use mapping, vegetation and biodiversity assessment, water resource management, and the monitoring of ecological transformations [6]. Notably, the convergence of drone technologies with remote sensing techniques—such as multispectral, hyperspectral, and LiDAR sensors—has substantially enhanced data accuracy and resolution, thereby supporting robust scientific analyses and informed spatial decision making [7]. These advancements reflect a paradigm shift towards data-driven and technology-enhanced landscape management practices.
In response to the expanding landscape of drone applications, this scoping review aims to systematically identify, categorise, and synthesise the technological deployments of UASs in landscape research and planning. This review focuses on platform typologies (e.g., fixed-wing, rotary-wing, hybrid systems), sensor technologies (e.g., RGB, LiDAR, multispectral, hyperspectral), and their functional roles in mapping, monitoring, ecological assessment, cultural heritage documentation, and sustainable spatial planning. By mapping the breadth, depth, and interdisciplinary nature of these applications, this review seeks to provide a comprehensive foundation for future research, technological integration, and policy development in the evolving field of landscape studies. In doing so, the review not only consolidates fragmented insights across disciplines, but also contributes to a forward-looking research agenda that embraces technological innovation for resilient and adaptive landscape planning.

1.1. The Importance of Drone-Assisted Landscape Research

Landscape research is a multidisciplinary field that underpins effective environmental planning, conservation of ecosystem services, sustainable resource management, and biodiversity protection. In the context of accelerating urbanisation and industrialisation, landscape ecology has gained prominence as a critical framework for understanding and mitigating human-induced environmental impacts [8]. Traditional approaches, although valuable, often face challenges in accessing difficult terrains and achieving timely data collection, which has prompted the integration of drone technologies to enhance research efficiency and accuracy.
Both rural and urban landscapes serve as key study domains for assessing the complex interactions between human activities and ecosystem health. For instance, a European study examining urbanisation’s effects on green spaces and ecological connectivity underscored the pivotal role of strategic landscape planning in preventing biodiversity loss [9], while research in the United States demonstrated how advanced landscape modelling techniques could effectively evaluate the sustainability of urban growth [10]. These examples highlight that landscape configuration is not merely a passive backdrop, but an active component in maintaining ecological corridors that foster biodiversity and optimise ecosystem services [11]. In China, studies have confirmed that preserving the connectivity between agricultural lands and natural habitats contributes significantly to biodiversity conservation [12], illustrating the universal relevance of ecological continuity.
Water resource management, erosion control, and soil quality monitoring are equally vital aspects of landscape research. A notable study in the Amazon Basin of Brazil utilised drones to investigate the detailed impacts of deforestation on ecosystem dynamics, revealing substantial disruptions in the water cycle and highlighting the cascading effects on broader ecological functions [13,14]. By enabling rapid high-resolution data acquisition across diverse landscapes, drones empower researchers and policymakers to respond more swiftly and effectively to environmental challenges, thus reinforcing their indispensable role in contemporary landscape science.

1.2. Evolution of Technology and Drones

Uncrewed Aerial System (UAS) technology has evolved rapidly in recent years and has been optimised for a wide range of applications. Initially developed for military purposes, drones are now widely used in fields ranging from agriculture to urban planning and ecological monitoring [15]. Technological advancements have enhanced drone sensor capacities, flight durations, and data processing capabilities, making them more accessible and efficient for landscape research [7].
Modern drones are equipped with high-resolution cameras, multispectral and hyperspectral sensors, LiDAR systems, and thermal imaging devices. These technological innovations significantly contribute to applications such as land mapping, monitoring ecosystem changes, and managing water resources [6]. For example, LiDAR-based drones can provide detailed topographic data even beneath dense vegetation cover, improving terrain modelling processes [16].
The autonomous flight capabilities of drones, combined with artificial-intelligence-supported analysis tools, enable the development of smarter and more autonomous decision support systems. Machine learning algorithms help analyse data collected by drones more accurately and efficiently [17]. Consequently, critical issues such as landscape changes, erosion processes, and post-disaster recovery can be monitored more effectively.
In alignment with the PRISMA-ScR guidelines, this review is structured into six main sections. Section 2 outlines the materials and methods, detailing the systematic literature search, inclusion criteria, and data extraction process. Section 3 presents the typologies of drone platforms and sensor technologies, followed by Section 4, which explores the major application domains such as mapping, vegetation analysis, water resource management, cultural heritage documentation, and urban and rural planning. Section 5 discusses future technological trends and emerging innovations, including AI integration, new sensor capabilities, and swarm systems. Section 6 offers concluding insights and recommendations, highlighting knowledge gaps and policy implications. This structure provides a comprehensive and interdisciplinary framework to guide future research and practice in drone-assisted landscape studies.

2. Materials and Methods

This scoping review is grounded in an extensive literature search and a structured thematic synthesis, aiming to provide a comprehensive and methodologically rigorous assessment of drone applications in landscape research. The analytical framework was developed to explore the evolving typologies of UAS platforms and sensor technologies in relation to their spatial deployment, functional capacities, and environmental applications. This framework integrates both technological and spatial dimensions, enabling a multi-scalar assessment of how drones contribute to monitoring, mapping, modelling, and planning practices.
To ensure methodological transparency and reproducibility, the research process was systematically organised into seven interrelated stages: (1) database selection, (2) search query development, (3) record identification and de-duplication, (4) title and abstract screening, (5) full-text eligibility assessment, (6) data extraction, and (7) thematic synthesis. These stages are summarised in the PRISMA-based flow diagram (Figure 1), which illustrates the study selection and review workflow.
Although the review protocol was not registered in a formal registry (e.g., Open Science Framework), the methodological design aligns with established best practices for scoping reviews, particularly those outlined in the PRISMA-ScR guidelines. The iterative approach adopted in this review emphasises comprehensive coverage, continuous refinement of analytical categories, and full transparency in reporting. The thematic synthesis was guided by recurring codes and comparative patterns observed across studies.
The conceptual scope was informed by interdisciplinary research that integrates landscape planning, geospatial analysis, ecology, environmental monitoring, and cultural heritage studies. Particular emphasis was placed on investigating UAS applications in both natural and anthropogenic landscape contexts, especially those associated with sustainability, resilience, and climate-sensitive development goals.
The analytical framework supports multi-scalar assessment, spanning applications from site-specific erosion monitoring to regional-scale green infrastructure planning.
A total of 638 unique records were initially identified through the database search. After removing 48 duplicates, 590 studies were screened at the title and abstract level. Subsequently, 180 full-text articles were reviewed in detail. Based on predefined eligibility criteria, 87 studies were excluded due to irrelevance, lack of methodological detail, or insufficient thematic alignment. In the final synthesis, 93 studies were retained for data extraction and analysis. The selection process was independently conducted by two reviewers, with a third acting as an arbitrator in cases of disagreement. The overall review workflow is visualised in the PRISMA-based flow diagram (Figure 1), which was developed by the authors in accordance with the PRISMA Extension for Scoping Reviews guidelines [18].
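The record flow reported above can be summarised as a simple arithmetic consistency check (counts taken from the text; the variable names are shorthand):

```python
# Stage counts from the review's PRISMA-ScR record flow.
identified = 638
duplicates = 48
screened = identified - duplicates         # title/abstract screening pool
full_text = 180
excluded_full_text = 87
included = full_text - excluded_full_text  # studies retained for synthesis

print(screened, included)  # 590 93
```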
A structured data extraction template was developed using Microsoft Excel to ensure consistency and transparency. The template captured key metadata, including author(s), year, country, UAS platform type (fixed-wing, rotary-wing, hybrid), sensor technology (RGB, multispectral, LiDAR, thermal), spatial scale, landscape typology (e.g., urban, rural, protected area), and application domain (e.g., vegetation analysis, flood modelling, tourism planning). These variables were selected to represent the technological, spatial, and functional diversity of drone applications in landscape research. The emphasis on UAS types and sensor configurations reflects their relevance in determining operational range, data resolution, and environmental applicability. Spatial scales and landscape typologies were used to differentiate urban, rural, and natural contexts.
Figure 1 illustrates each step of the scoping review process, including identification, duplicate removal, screening, eligibility assessment, and inclusion. Data extraction did not involve any formal assumptions or simplifications; however, categorisation was informed by recurring themes and patterns identified during the full-text review and was refined through iterative discussion between reviewers. Thematic clustering was conducted inductively, resulting in five primary application domains: monitoring, mapping, modelling, decision support, and policy development. These clusters enabled comparative insights into how UAS technologies are operationalised across diverse landscape functions and contexts. The coding and thematic synthesis process is further supported by frequency tables and co-occurrence matrices (see Section 3).
A thematic content analysis was conducted to synthesise the extracted data across all included studies. An inductive coding approach was applied, allowing for emergent themes to be identified through close reading and iterative comparison of key variables. Frequency tables and code co-occurrence matrices were developed using Microsoft Excel to quantify the presence and intersection of concepts such as UAS platform types, sensor technologies, spatial scales, and application domains. Comparative summaries were then generated to highlight technological trends, recurring combinations (e.g., LiDAR + fixed-wing for erosion modelling), and geographic concentrations of drone use.
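The frequency tables and code co-occurrence matrices described above can be sketched with standard-library tools; the coded records below are hypothetical stand-ins for entries in the Excel extraction template:

```python
from collections import Counter
from itertools import combinations

# Hypothetical coded records: each study tagged with platform, sensor, domain,
# mirroring the structured data extraction template described in the text.
studies = [
    {"platform": "fixed-wing",  "sensor": "LiDAR",         "domain": "erosion modelling"},
    {"platform": "rotary-wing", "sensor": "multispectral", "domain": "vegetation analysis"},
    {"platform": "fixed-wing",  "sensor": "LiDAR",         "domain": "erosion modelling"},
    {"platform": "hybrid",      "sensor": "RGB",           "domain": "mapping"},
]

# Frequency table of platform-sensor combinations.
freq = Counter((s["platform"], s["sensor"]) for s in studies)

# Code co-occurrence: how often two tags appear in the same study.
cooc = Counter()
for s in studies:
    for a, b in combinations(sorted(s.values()), 2):
        cooc[(a, b)] += 1

print(freq.most_common(1))  # the most frequent platform-sensor pairing
```

The same counting logic, scaled to all 93 included studies, underlies the comparative summaries and recurring combinations (e.g., LiDAR + fixed-wing for erosion modelling) reported in the text.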
Particular emphasis was placed on analysing platform–function–sensor combinations and their alignment with spatial, ecological, and planning objectives. Where relevant, the analysis also examined patterns related to regional priorities, interdisciplinary collaboration trends, and the emergence of innovative UAS configurations in landscape research. Visualisation techniques—including bar charts, spatial distribution maps, and cluster diagrams—were used to identify dominant application clusters and methodological outliers. These visual outputs support the synthesis and are referenced in the results section.
The reporting structure follows a logical progression, beginning with broad thematic scoping and narrowing down to specific analytical insights. The PRISMA-ScR flow diagram (Figure 1) provides a transparent and reproducible summary of the review process, based on the methodological recommendations outlined by Tricco et al. [18].

Limitations of This Review

  • Language Bias: Only English-language publications were included, potentially omitting relevant studies in other languages.
  • Grey Literature Exclusion: Institutional reports, technical documentation, and non-peer-reviewed content were excluded, which may limit the completeness of the dataset.
  • Technological Focus: This review primarily focuses on technological aspects of UAS deployment and may underrepresent socio-political, economic, or participatory perspectives in landscape transformation.

3. Types of Drones Used in Landscape Research

Drones employed in landscape research are diverse in design and function, each offering distinct advantages based on operational requirements. These aerial systems are generally classified into three primary categories: fixed-wing, rotary-wing, and hybrid drones. The selection among these platforms depends on several factors, including the size and complexity of the study area, spatial and spectral data needs, flight duration, payload capacity, and prevailing weather conditions.
Fixed-wing drones are typically used for large-area mapping due to their extended flight range and speed, whereas rotary-wing drones—such as quadcopters, hexacopters, and octocopters—offer superior manoeuvrability and vertical take-off and landing (VTOL) capabilities, making them ideal for site-specific analysis in constrained environments. Hybrid drones combine features of both categories, allowing for flexible mission planning in complex terrain conditions.
Within the broader scope of landscape research, which integrates ecological, spatial, and cultural dimensions, landscape ecology occupies a more specific position as the study of spatial heterogeneity, connectivity, and ecological processes across scales. This conceptual distinction is essential, as landscape ecology provides a scientific framework that directly links spatial patterns to ecological functions. Drone-based remote sensing offers valuable opportunities to advance this field, particularly by enabling high-resolution analyses of landscape structure, spatial configuration, and habitat connectivity. Recent studies have highlighted the potential of UAS technologies in plant ecology, biodiversity-friendly agriculture, and wetland research, while also emphasising emerging opportunities and challenges for their integration into landscape ecology [19,20,21,22]. Collectively, these contributions demonstrate that UAS technologies can capture fine-scale ecological patterns and provide a crucial intermediate link between field measurements and satellite-based observations, thereby reinforcing the role of drones in advancing landscape ecological research.
Figure 2 and Figure 3 visually illustrate the structural differences and comparative performance metrics of these drone types, respectively. Table 1, Table 2, Table 3 and Table 4 provide a comprehensive typological framework for UAS use in landscape research. Table 1 presents foundational definitions of drone types, while Table 2 and Table 3 summarise their operational advantages and limitations, respectively. Table 4 outlines their primary application areas across diverse environmental contexts. Together, these tables support informed decision making regarding platform selection, sensor integration, and mission planning in landscape-related research scenarios.
Fixed-wing UASs stand out for their high endurance and wide area coverage, making them ideal for regional-scale mapping and land use monitoring. Rotary-wing drones excel in manoeuvrability and vertical flight, proving valuable in localised ecological assessments, forest canopy inspections, and urban green infrastructure analysis. Hybrid UASs merge the strengths of both systems, offering flexible deployment across complex terrains, albeit with higher technological and cost demands.
As summarised across Table 1, Table 2, Table 3 and Table 4, UAS platform selection in landscape applications is shaped by the spatial scale of the study area, topographical complexity, required data resolution, and intended ecological or planning objectives. These factors influence not only sensor compatibility and mission design, but also the overall feasibility and scientific validity of UAS-assisted research strategies.

3.1. Fixed-Wing Drones

Fixed-wing drones are widely utilised in landscape research due to their superior flight endurance and wide area coverage capacity. Their aerodynamic design enables them to cover long distances efficiently, which is particularly useful in projects involving large-scale agricultural landscapes, forest reserves, or coastal systems [24].
In rural applications, these platforms support precise and consistent monitoring of crop productivity, land degradation, and water resource patterns. Their high cruising altitude and stable flight paths make them ideal for acquiring uniform image resolution across expansive territories. This capability is frequently employed in soil condition assessments and topographic change detection, where rapid and repeated coverage is required [25].
Ecological research benefits from the use of fixed-wing UASs in tracking long-term changes in vegetation distribution and monitoring biodiversity corridors. Their payload capacity enables the integration of multispectral or LiDAR sensors to enhance the detection of subtle environmental variations [26].
Within the domain of environmental and spatial planning, fixed-wing drones contribute to the development of landscape-scale models, spatial baselines, and risk maps. Their role in land use change detection and large-area ecosystem evaluation has positioned them as essential tools in resilient planning strategies and sustainable landscape interventions [27].

3.2. Rotary-Wing Drones

Rotary-wing drones are increasingly prevalent in landscape research due to their high manoeuvrability and capacity to operate within spatially constrained environments. Their multirotor systems enable stable hovering, making them indispensable for tasks that require stationary observation, such as urban vegetation analysis, canopy-level inspection, and façade mapping [28]. This capacity is especially beneficial in densely built areas where fixed-wing drones cannot safely navigate or land [51].
In both urban and rural landscape planning processes, rotary-wing drones are actively employed in land use classification, green infrastructure modelling, and micro-scale hydrological assessments. Their compatibility with vertical and low-altitude flight paths facilitates the fine-scale mapping of ground conditions, vegetation fragmentation, and irrigation patterns [29]. These features are particularly advantageous when working on localised water management issues and designing ecological connectivity networks [52].
Equipped with multispectral and hyperspectral sensors, rotary-wing drones also play a vital role in agricultural and ecological monitoring. They support the accurate detection of plant health status, disease spread, and crop water requirements, contributing to the development of precision farming strategies and sustainable land management [30]. In this regard, their integration with real-time image processing systems has enhanced their use in post-disaster recovery and remote sensing-based ecological assessments [53,54].
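As an illustration of how multispectral bands support the plant-health assessments mentioned above, the widely used Normalised Difference Vegetation Index (NDVI) can be computed per pixel from red and near-infrared (NIR) reflectance; the reflectance values below are hypothetical:

```python
def ndvi(nir, red):
    """Normalised Difference Vegetation Index for one pixel.

    Inputs are reflectance values in [0, 1] from a multispectral sensor.
    Healthy vegetation reflects strongly in NIR and absorbs red light,
    pushing NDVI towards 1; bare soil or stressed cover sits near 0.
    """
    if nir + red == 0:
        return 0.0
    return (nir - red) / (nir + red)

# Hypothetical reflectance pairs for two canopy conditions.
healthy = ndvi(nir=0.50, red=0.08)   # high NIR, low red -> vigorous canopy
stressed = ndvi(nir=0.30, red=0.20)  # weaker contrast -> sparse/stressed cover
print(round(healthy, 2), round(stressed, 2))
```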
From a broader planning and resilience perspective, rotary-wing UASs contribute to environmental integrity evaluation, carbon sequestration modelling, and rapid environmental impact assessments following natural disturbances [31]. Their flexibility in capturing localised data makes them valuable in adaptive landscape design scenarios, especially where timely data collection is essential [55].
However, these drones face key limitations compared to fixed-wing systems. Their shorter flight duration and reduced battery life limit the scope of data acquisition per mission, making them less suitable for large-scale landscape research [56]. Frequent battery swaps or recharging are required to maintain consistent coverage, which increases operational complexity in longitudinal studies or remote locations.

3.3. Hybrid Drones

Hybrid drones combine the technical advantages of both fixed-wing and rotary-wing platforms, offering a unique blend of long flight duration, hovering capability, and flexible take-off and landing. Their ability to perform vertical take-off and landing (VTOL) allows them to operate in spatially limited environments, while still maintaining the capacity for long-range and high-altitude data acquisition [32,57,58].
These drones are well suited for diverse landscape applications that require both wide area scanning and close-range detail capture. Their hybrid design supports efficient multi-scale analysis, enabling simultaneous use in regional ecosystem surveys and site-specific environmental inspections. Comparative operational studies have shown that hybrid drones generally achieve longer flight times than traditional rotary-wing systems, thereby expanding their data acquisition scope and enhancing their suitability for complex landscape research tasks [33,59].
Hybrid UASs are increasingly being employed in disaster management, post-disaster urban reconstruction, and environmental change detection. Their ability to operate in unstable or hazardous terrain without the need for runway space has made them effective tools for rapid data collection following natural disasters such as floods or earthquakes [34,60]. They are also applied in large-scale monitoring tasks in both urban and rural settings, including ecological network mapping and floodplain evaluation [61].
In landscape planning and green infrastructure development, hybrid drones have become reliable assets for monitoring vegetation cover, modelling ecological corridors, and assessing land use transitions in peri-urban regions. Their high-altitude stability and sensor versatility allow for the accurate mapping of transportation infrastructure and green spaces [35,62,63]. These drones are also proving effective in emerging smart city applications such as urban heat island modelling and air quality assessment, where continuous multi-layered data streams are critical [64].
In addition to the structural classification of UAS platforms (fixed-wing, rotary-wing, and hybrid), the selection of appropriate autopilot systems and ground control software plays a vital role in ensuring successful mission execution and data integrity in landscape research. Open-source flight control platforms such as Ardupilot and PX4 are widely used due to their adaptability, modular architecture, and cost-efficiency [65,66]. These systems are often integrated with planning and monitoring software like Mission Planner and QGroundControl, which enable users to define flight paths, set altitude thresholds, and monitor real-time telemetry data [67].
For researchers conducting image-based data acquisition, these systems offer robust tools for waypoint navigation, auto-take-off and landing, and sensor triggering, which are critical for generating consistent geospatial datasets. Moreover, autopilot systems facilitate time-stamped coordination between multiple onboard instruments—such as RGB cameras, multispectral sensors, and GNSS/IMU units—by maintaining precise flight logs and synchronisation protocols [68]. This automation reduces operational complexity and improves the reproducibility of spatial data collection in ecological and environmental studies.
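As an illustration of the waypoint-based mission planning these ground stations support, the sketch below generates a simple serpentine ("lawnmower") survey grid of the kind flown for image-based data acquisition. All coordinates and spacing values are hypothetical, and real missions would be defined in metres and uploaded via tools such as Mission Planner or QGroundControl:

```python
def survey_grid(lat0, lon0, rows, cols, spacing_deg):
    """Generate a serpentine (lawnmower) waypoint grid for an area survey.

    lat0/lon0 mark the south-west corner; spacing_deg is the grid spacing
    in decimal degrees. Values here are illustrative only.
    """
    waypoints = []
    for r in range(rows):
        line = [(lat0 + r * spacing_deg, lon0 + c * spacing_deg)
                for c in range(cols)]
        if r % 2 == 1:          # reverse every other line -> serpentine path
            line.reverse()
        waypoints.extend(line)
    return waypoints

# A 3 x 4 grid over a hypothetical site, returned in flight order.
wps = survey_grid(39.90, 41.27, rows=3, cols=4, spacing_deg=0.0005)
print(len(wps))  # 12
```

Reversing alternate lines minimises turn-around distance between passes, which is the same pattern most autopilot survey tools produce for photogrammetric coverage.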

3.4. Different Sensors and Data Collection Capacities

In recent years, the integration of advanced sensor technologies with Uncrewed Aerial Systems (UASs) has significantly transformed the scope and precision of landscape research. Drones equipped with diverse sensor systems—including RGB, multispectral, thermal cameras, LiDAR, and hyperspectral imaging systems—have become indispensable for acquiring spatially explicit, high-resolution data used in vegetation monitoring, terrain analysis, ecosystem mapping, and environmental diagnostics [63,69]. These sensors contribute to a wide range of applications, from agricultural productivity assessment to the analysis of urban green spaces and cultural heritage sites. By capturing visual, thermal, and spectral information, drones enable researchers and planners to engage in more efficient modelling, classification, and prediction tasks across diverse landscape settings. With the recent incorporation of AI-based analytical frameworks, sensor-equipped UASs now also facilitate real-time environmental decision making and dynamic system monitoring. In the following sections, each sensor type is examined in terms of its specific contributions to landscape-scale data acquisition, its methodological evolution, and its potential for AI-driven integration in contemporary environmental research.
A critical yet often overlooked aspect of UAS-based data acquisition is the synchronisation of multiple onboard sensors, particularly GPS modules, inertial measurement units (IMUs), and camera systems. These components frequently operate at different temporal resolutions and rarely initiate recording simultaneously, leading to temporal mismatches in the collected datasets. Consequently, extensive post-processing is often required to align time-stamped data streams accurately, especially in projects requiring sub-meter spatial fidelity. Addressing these challenges has become increasingly important with the growing integration of multi-sensor systems in landscape research, where positional accuracy and temporal consistency directly influence the validity of spatial analyses [70,71].
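The post-processing alignment described above frequently reduces to resampling one time-stamped stream onto another's timestamps, for example interpolating GNSS positions logged at 5 Hz onto irregular camera trigger times. A minimal pure-Python sketch of this linear interpolation step follows (illustrative only; operational workflows typically rely on dedicated geospatial libraries):

```python
def interpolate_track(track_times, track_values, query_times):
    """Linearly interpolate a 1-D logged signal onto new timestamps.

    track_times must be sorted ascending; values outside the logged
    interval are clamped to the first/last sample.
    """
    out = []
    for t in query_times:
        if t <= track_times[0]:
            out.append(track_values[0]); continue
        if t >= track_times[-1]:
            out.append(track_values[-1]); continue
        # find the pair of logged samples bracketing the query time
        for i in range(1, len(track_times)):
            if track_times[i] >= t:
                t0, t1 = track_times[i - 1], track_times[i]
                v0, v1 = track_values[i - 1], track_values[i]
                w = (t - t0) / (t1 - t0)
                out.append(v0 + w * (v1 - v0))
                break
    return out
```

The same resampling is applied per coordinate axis (latitude, longitude, altitude) and per attitude angle before images are georeferenced.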

3.4.1. RGB Cameras

RGB cameras are among the most fundamental and widely adopted sensor types in drone-assisted landscape research due to their ability to capture high-resolution true-colour imagery. Their straightforward design, affordability, and compatibility with lightweight UAS platforms make them particularly attractive for broad-scale terrain mapping, crop monitoring, and visual surface modelling. These sensors have been widely applied in both agricultural and environmental contexts, offering an accessible yet robust means of obtaining spatial data. Typically operating in the visible range (400–700 nm) with three spectral bands (R, G, B), these sensors produce spatial resolutions ranging from 1 to 10 cm/pixel, depending on altitude and optics [72].
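The quoted 1–10 cm/pixel range follows directly from camera geometry: ground sampling distance (GSD) equals flight altitude multiplied by the sensor's pixel pitch, divided by the focal length. A minimal sketch of this relation (the example values in the test are hypothetical, not tied to any particular camera):

```python
def ground_sampling_distance(altitude_m, pixel_pitch_um, focal_length_mm):
    """Ground sampling distance (cm/pixel) for a nadir-pointing camera.

    GSD = altitude * pixel size / focal length, converted to centimetres.
    """
    pixel_pitch_m = pixel_pitch_um * 1e-6
    focal_length_m = focal_length_mm * 1e-3
    return altitude_m * pixel_pitch_m / focal_length_m * 100.0
```

Halving the flight altitude therefore halves the GSD, at the cost of smaller image footprints and longer flight lines for the same coverage.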
For example, Şanlıyüksel Yücel and Yücel (2024) employed UAS-mounted RGB and thermal sensors to investigate geothermal surface dynamics in the Kocabaşlar region of Türkiye, highlighting the utility of RGB imaging in geothermal monitoring [73]. In contrast, studies such as Agrawal and Arafat (2024) demonstrated how RGB imagery, when integrated with machine learning algorithms, could effectively predict yield performance in cultivated areas by analysing spectral variability [74]. Similarly, Abbas et al. (2023) focused on spatial patterns of crop stress detection, reinforcing the value of RGB data in delineating agronomic intervention zones [75]. Despite their limited spectral range compared to multispectral or hyperspectral systems, RGB sensors remain critical tools for initial assessments and rapid monitoring operations, particularly in settings where technical or financial constraints limit the deployment of more complex equipment.
These examples collectively underscore the sensor’s adaptability across both advanced and resource-constrained research settings. Moreover, the global distribution of case studies—from geothermal landscapes in Türkiye to precision farming applications in South Asia—illustrates RGB sensors’ cross-sectoral and geographic relevance. As a foundational data collection tool, RGB cameras continue to play a central role in informing land use decisions and documenting environmental change. Their utility also sets a baseline for understanding the comparative advantages offered by more specialised sensor types, such as multispectral and hyperspectral systems.

3.4.2. Multispectral Cameras

Multispectral cameras have become indispensable in landscape research, particularly for applications requiring detailed assessments of vegetation health, phenology, and crop productivity. These sensors capture reflectance data across discrete spectral bands, enabling the calculation of vegetation indices such as NDVI and SAVI, which are essential for understanding plant vigour, chlorophyll content, and canopy structure [76]. Most multispectral cameras operate across 4 to 8 discrete spectral bands (e.g., blue, green, red, red-edge, NIR), with spatial resolutions typically ranging from 5 to 20 cm/pixel depending on flight altitude and optics [44]. While their use is most prevalent in agricultural landscapes, multispectral imaging also supports ecosystem diagnostics, urban green space monitoring, and biodiversity conservation initiatives.
Several studies have demonstrated the power of these sensors across varied crop types and geographies. For instance, Tang et al. (2023) employed UAS-based multispectral imaging coupled with chemometric modelling to assess the quality of fresh tea leaves in East Asia, focusing on post-harvest quality parameters [77]. In a different climatic context, Atkinson Amorim et al. (2022) estimated biomass and nitrogen levels in winter wheat in temperate Europe, offering a data-rich alternative to manual sampling [78]. In South America, Berveglieri et al. (2024) predicted soybean yield using high-resolution multispectral data, while Saravia et al. (2022) mapped maize field variability to support precision agriculture strategies [79,80]. These examples illustrate how multispectral sensors not only inform crop health evaluations, but also contribute to yield forecasting, field variability mapping, and resource optimisation. Due to their lightweight and modular design, multispectral cameras are generally compatible with both rotary-wing and fixed-wing UASs, although gimbal stabilisation and spectral calibration tools are often required for precision measurements [44].
A notable strength of multispectral imaging lies in its ability to detect subtle biophysical differences across land cover types, which is particularly advantageous in heterogeneous or transitional landscapes. However, limitations such as sensitivity to atmospheric conditions or calibration inconsistencies should be acknowledged when comparing multi-regional applications. While agricultural applications dominate the current literature, recent studies have begun to explore the application of multispectral UASs in turfgrass monitoring, showing promising results for urban green infrastructure management and sports turf evaluation [81,82]. The potential of multispectral UASs in urban landscape planning—such as monitoring ornamental plant vitality or identifying thermal stress in parks—remains underexplored and represents a valuable future direction.
In sum, multispectral sensors occupy a strategic position in the spectrum of drone-based sensing technologies. Their capacity to bridge scale, precision, and thematic relevance makes them particularly effective in integrating ecological insights with sustainable land use planning across diverse environmental contexts.

3.4.3. Hyperspectral Sensors

Hyperspectral sensors represent one of the most sophisticated tools in UAS-based landscape research, offering high spectral resolution through the collection of contiguous spectral bands. This enables the highly detailed characterisation of the biochemical and biophysical properties of soils, vegetation, and surface water systems. Their utility is particularly evident in precision agriculture, soil diagnostics, and aquatic ecosystem assessments, where subtle spectral differences reveal information beyond the reach of RGB or multispectral sensors. Hyperspectral sensors typically operate with 50 to over 200 contiguous narrow bands across the visible to shortwave infrared spectrum (e.g., 400–2500 nm), allowing for the detection of subtle spectral signatures. However, their spatial resolution is often coarser (e.g., 20–50 cm/pixel) compared to RGB or multispectral sensors [83].
Sethy et al. (2022), for instance, used hyperspectral drone imagery to detect early-stage crop lesions and assess complex soil parameters, including moisture content and pH levels [84]. These applications underscore the role of hyperspectral data in supporting targeted interventions and sustainable agricultural practices. However, limitations exist; as Khan et al. (2022) highlighted, the inherently lower spatial resolution of hyperspectral sensors often requires spectral unmixing algorithms to accurately interpret mixed-pixel data [85]. While such algorithms improve analytical fidelity, they introduce additional computational complexity and may limit operational scalability in large or fragmented landscapes. Due to their weight, power consumption, and data volume, hyperspectral cameras are generally deployed on rotary-wing UASs with high payload capacity or on larger fixed-wing platforms; they often require specialised gimbals and radiometric calibration systems [83].
Despite these constraints, the value of hyperspectral imaging is considerable. Xu et al. (2021) illustrated its potential in capturing physiological stress indicators and mineral composition data, contributing to refined land use classification and nutrient management strategies [86]. Across case studies from South and East Asia, hyperspectral sensors have proven critical in quantifying landscape-level biogeochemical cycles and vegetation health with high thematic precision.
Interestingly, while most hyperspectral UAS applications focus on agricultural and ecological domains, their potential in urban and cultural landscape analysis—such as detecting building material deterioration or mapping heritage site vegetation—remains largely untapped. Given their spectral sensitivity, these sensors are well suited to identifying minute surface variations that could inform conservation planning, archaeological diagnostics, and resilience assessments in both natural and built environments.
In summary, hyperspectral UAS systems enable deep environmental insight through their fine spectral granularity. Their continued integration with spatial modelling frameworks and AI-driven classification techniques holds substantial promise for advancing both scientific understanding and planning precision in complex landscape systems.
The evolution of sensor integration in UAS platforms has expanded beyond traditional imaging technologies to include environmental in situ measurement systems. Among these, atmospheric sensors such as temperature, humidity, barometric pressure, and gas concentration sensors have become vital tools for landscape-scale microclimatic analysis, especially in contexts where real-time low-altitude data acquisition is necessary. Unlike optical sensors, these in situ systems do not rely on reflectance or surface imaging, but instead measure ambient conditions directly during flight, enabling the precision monitoring of eco-hydrological and agro-meteorological parameters [87,88].
These atmospheric sensors are typically mounted on rotary-wing or hybrid UAS platforms flying at altitudes below 400 feet, where vertical microclimate gradients are most prominent. Applications include mapping temperature inversions across valley floors, measuring moisture levels over agricultural plots, and detecting pollutant levels in urban canyons. Data obtained from these sensors are often synchronised with GPS and IMU outputs and processed in real time or post-flight via onboard microcontrollers and edge AI systems. This integration supports enhanced environmental diagnostics, such as early drought detection, frost risk modelling, and localised emission tracking, areas previously underserved by satellite or ground-based monitoring systems [89,90].
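One of the applications noted above, mapping temperature inversions, reduces computationally to locating layers in a vertical profile where temperature increases rather than decreases with altitude. A minimal sketch using hypothetical profile data:

```python
def detect_inversions(profile):
    """Return (bottom_alt, top_alt) layers where temperature rises with height.

    profile: list of (altitude_m, temp_c) pairs sorted by increasing altitude,
    e.g. as sampled by a rotary-wing UAS during a vertical ascent.
    """
    layers = []
    for (a0, t0), (a1, t1) in zip(profile, profile[1:]):
        if t1 > t0:                      # warming with height = inversion layer
            layers.append((a0, a1))
    return layers
```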
Beyond imaging applications, drones have also been equipped with onboard microclimatic sensors capable of capturing real-time data on temperature, humidity, barometric pressure, and soil moisture. These in situ sensing tools—when integrated with autopilot systems, GNSS units, and onboard data loggers—form a comprehensive UAS platform for environmental monitoring. They have proven vital in landscape site analysis, especially for adaptive irrigation design, slope stability monitoring, and thermal comfort evaluation in landscape architecture and planning [89].

3.4.4. Thermal Cameras

Thermal cameras have become increasingly prominent in drone-assisted landscape research due to their ability to capture thermal radiation in the infrared (IR) spectrum and provide surface temperature measurements with high sensitivity. Operating typically within the 7.5–14 µm wavelength range and offering resolutions such as 640 × 512 pixels with radiometric accuracies of around ±2 °C, these sensors enable the visualisation of temperature-driven environmental variations that remain undetectable to RGB or multispectral sensors [91,92,93]. This capability makes them particularly effective for monitoring physiological plant stress, land–atmosphere interactions, and thermal anomalies in both natural and built environments.
In the context of landscape ecology, thermal imaging supports a wide array of applications, including assessments of plant water stress, surface energy balance, urban heat island mapping, and microclimatic differentiation. Zarco-Tejada et al. (2012) demonstrated the utility of combined thermal and hyperspectral imaging in detecting early-stage crop water deficits, outperforming conventional ground-based methods in accuracy and coverage [94]. In urban contexts, thermal data have been applied to evaluate the cooling efficiency of green infrastructure, identify impervious surface heat loading, and inform climate-sensitive spatial planning strategies [44].
Thermal imaging is also critical in wildlife monitoring, especially for nocturnal species or animals concealed under dense vegetation. Its application in fire ecology has proven valuable in identifying pre-fire risk zones, tracking active fire fronts, and analysing post-fire regeneration patterns [95]. In sensitive ecosystems such as wetlands, thermal cameras enable the real-time monitoring of habitat temperature dynamics, species distribution, and behavioural patterns with minimal disturbance [96].
In terms of UAS integration, thermal cameras are generally compatible with multirotor systems due to their compact design and low power requirements. However, high-resolution thermal sensors may necessitate stabilised gimbal systems, external cooling units, and radiometric calibration tools, which can impose payload constraints on lighter UASs [97]. Optimal data acquisition typically requires the consideration of environmental variables such as solar angle, surface reflectivity, and atmospheric conditions, often favouring missions during sunrise or sunset for enhanced contrast.
Recent studies also highlight the growing role of AI-based image processing techniques in classifying thermal imagery and automating landscape diagnostics. Deep learning algorithms facilitate the identification of specific land cover components—including water surfaces, animal groups, vegetation clusters, or anthropogenic structures—supporting the development of real-time decision support systems [98]. These advances transform thermal imaging from a passive observational tool into an active component of intelligent environmental monitoring frameworks.
In summary, UAS-integrated thermal sensors offer unique advantages for landscape ecology, biodiversity studies, thermal urban planning, and disaster risk reduction. When combined with other sensing modalities such as RGB, multispectral, or LiDAR, thermal data significantly enhance the spatial, temporal, and functional dimensions of landscape analysis, providing multi-layered insights for adaptive ecosystem management and sustainable land use planning.

3.4.5. LiDAR Sensors

LiDAR (Light Detection and Ranging) sensors have become a cornerstone in UAS-assisted landscape studies, enabling the precise three-dimensional measurements essential for modelling terrain morphology, estimating above-ground biomass, and analysing forest canopy structure. Unlike passive optical sensors, LiDAR actively emits laser pulses and measures their return time, allowing for vegetation penetration and the generation of detailed vertical profiles. This capability renders LiDAR indispensable for ecological monitoring, spatial planning, and terrain classification, especially in densely vegetated or topographically complex environments [69]. Typical UAS-mounted LiDAR systems operate at point densities ranging from 10 to 200 points/m², with vertical accuracies between 5 and 15 cm depending on flight altitude and scan angle. Full-waveform and discrete return systems enable the detailed structural reconstruction of forest canopies and terrain morphology [99].
Geographically diverse applications highlight LiDAR’s global relevance. In Canada’s boreal forest zones, UAS-mounted LiDAR has been successfully used to estimate forest biomass and quantify above-ground carbon stocks, supporting long-term ecological monitoring efforts [100]. In the Amazon rainforest, similar systems mapped tree height and canopy density, outperforming traditional ground-based surveys in both speed and spatial coverage [101]. Meanwhile, in post-fire European landscapes, LiDAR was employed to document vegetation regeneration stages, offering insights into successional dynamics and informing adaptive restoration strategies [102]. These cases collectively underscore LiDAR’s versatility in addressing varying ecological and climatic challenges across continents.
Despite its strengths, LiDAR technology comes with notable limitations. The high cost of equipment, data acquisition, and post-processing—combined with large data volumes and the need for specialised software—can limit accessibility for small-scale or resource-constrained projects. Furthermore, while LiDAR excels in structural mapping, it lacks the spectral resolution needed for biochemical analyses, often necessitating its integration with multispectral or hyperspectral data sources. Due to the payload weight and energy requirements of LiDAR units, they are commonly integrated with rotary-wing UASs with high thrust-to-weight ratios, or with fixed-wing drones for large-area mapping; their deployment also requires onboard inertial measurement units (IMUs) and GNSS systems for accurate georeferencing [99].
The growing inclusion of LiDAR in conservation and planning frameworks reflects its expanding role beyond forest ecology. Recent studies have explored its use in delineating wildlife corridors, modelling floodplains, and even documenting archaeological topographies and urban green infrastructure. This broadened applicability suggests that, when integrated with other geospatial datasets and modelling techniques, LiDAR serves not only as a diagnostic tool, but also as a driver of precision landscape interventions.
In summary, LiDAR-equipped UAS systems offer unparalleled structural insight, making them indispensable in landscape-level monitoring, carbon accounting, and resilience planning. Their effectiveness increases exponentially when paired with complementary sensor technologies and applied in interdisciplinary planning contexts.
Collectively, the integration of diverse sensor types with UAS platforms reflects a rapidly evolving technological landscape in landscape research. RGB cameras offer affordability and operational simplicity, making them ideal for baseline assessments and widespread adoption in both rural and peri-urban environments. Multispectral sensors build on this foundation by enabling detailed vegetation analysis through spectral index modelling, while hyperspectral systems extend these capabilities further into biochemical diagnostics, albeit with greater complexity and cost. LiDAR, by contrast, offers unmatched spatial fidelity and vertical structural insights, proving essential for topographically complex or densely vegetated landscapes.
Each sensor type exhibits unique strengths aligned with specific research and planning needs: RGB and multispectral systems are particularly effective in agriculture and urban vegetation monitoring; hyperspectral sensors are preferred for precision soil and crop stress analysis; and LiDAR excels in terrain reconstruction and biomass estimation. The case studies also demonstrate regional usage patterns—such as LiDAR in boreal and tropical forests, or hyperspectral sensors in East Asian agronomic contexts—highlighting the contextual appropriateness of sensor selection.
Going forward, the real potential lies in the synergistic use of multiple sensors within integrated UAS frameworks. Combining structural, spectral, and thermal data streams will enable more robust multi-dimensional landscape diagnostics, supporting resilient land use planning, ecological forecasting, and environmental policy development in the era of accelerating change.
Despite the accuracy of LiDAR-based structural mapping, UAS-generated orthomosaics often suffer from spatial misalignment when compared to satellite imagery or manned aerial data. This discrepancy arises primarily due to errors in onboard GPS systems and limitations in photogrammetric projection models, especially in areas with complex topography or insufficient ground control. To overcome these challenges, researchers increasingly employ Real-Time Kinematic (RTK) GPS technologies in conjunction with Ground Control Points (GCPs), enhancing georeferencing accuracy and spatial consistency. Alternatively, post-processing techniques such as image-to-image registration are used to rectify geometric distortions and align datasets effectively. These practices are critical for generating actionable UAS maps that can be reliably integrated into multi-source landscape analyses and geospatial planning frameworks [88,89,103].
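The simplest form of the image-to-image correction described above is estimating and removing a constant horizontal offset between GCP positions as measured on the orthomosaic and their surveyed (RTK) coordinates. A minimal sketch follows; a full registration would instead fit an affine or projective transform to account for rotation and scale:

```python
def mean_shift(measured, surveyed):
    """Average (dx, dy) offset between paired (x, y) coordinate lists.

    measured: GCP positions read from the UAS orthomosaic
    surveyed: the same GCPs' RTK-surveyed reference coordinates
    """
    n = len(measured)
    dx = sum(s[0] - m[0] for m, s in zip(measured, surveyed)) / n
    dy = sum(s[1] - m[1] for m, s in zip(measured, surveyed)) / n
    return dx, dy

def apply_shift(points, shift):
    """Translate every (x, y) point by the estimated offset."""
    dx, dy = shift
    return [(x + dx, y + dy) for x, y in points]
```

The residuals remaining after the shift (or full transform) is applied provide the standard accuracy report for the corrected orthomosaic.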

3.4.6. Camera Selection Criteria and Integration Issues

The selection of appropriate imaging sensors is a critical determinant in the methodological design of drone-based landscape research. As Uncrewed Aerial Systems (UASs) continue to evolve as platforms for environmental monitoring, the diversity of camera types—ranging from conventional RGB to advanced hyperspectral and LiDAR sensors—demands a structured framework for evaluating their suitability across research contexts.
Each sensor type offers specific advantages based on its spectral resolution and spatial fidelity. RGB cameras, with their trichromatic simplicity, remain widely used for baseline mapping and surface classification due to their affordability and broad UAS compatibility [93]. In contrast, multispectral cameras enable the derivation of vegetation indices (e.g., NDVI, SAVI) by capturing data across selected narrow bands, supporting ecological diagnostics in agriculture and urban green space planning [91,104]. Hyperspectral sensors extend this capacity by offering hundreds of contiguous spectral bands, facilitating the detection of biochemical traits such as leaf chlorophyll, lignin content, or soil mineralogy [105].
Thermal cameras capture long-wave infrared radiation and are essential for monitoring surface temperature, evapotranspiration, and habitat suitability, particularly in water-stressed ecosystems or wildlife tracking applications [94,96]. LiDAR, as an active remote sensing system, produces detailed 3D point clouds, enabling topographic modelling, canopy structure analysis, and terrain segmentation, especially under dense vegetation [44].
Sensor performance must also be evaluated in relation to UAS platform capabilities. While RGB and basic multispectral sensors are lightweight and easily mounted on small multirotor drones, hyperspectral and LiDAR systems often require high payload capacity, extended flight endurance, and enhanced onboard processing [97]. Fixed-wing UASs are typically used for covering large landscapes at the expense of vertical manoeuvrability, whereas multirotors provide spatial precision and hover stability but are limited by battery life and range.
Thermal sensors, depending on whether they are radiometric or non-radiometric, may demand specific mounting orientations and flight schedules (e.g., early morning or late evening) to minimise ambient interference [106]. Hyperspectral systems frequently require gimbals, radiometric calibration panels, and auxiliary storage systems, further increasing the complexity and cost of deployment [105,107]. LiDAR units necessitate the integration of inertial measurement units (IMUs) and high-precision GNSS systems for accurate georeferencing and point cloud processing [44].
Beyond hardware compatibility, researchers must consider the downstream data processing demands. High-resolution imagery generates significant data volumes that necessitate robust computing resources, cloud processing capabilities, and software ecosystems capable of handling multi-layer geospatial datasets [108]. Additionally, sensor fusion approaches—combining RGB with multispectral, or thermal with LiDAR—are increasingly used to enhance interpretive power, enabling multi-modal environmental diagnostics [109].
As illustrated in Table 5 and Table 6, UAS-compatible camera types differ not only in their technical specifications—such as spectral range, spatial resolution, and physical weight—but also in terms of operational requirements, integration constraints, and application suitability. While Table 5 focuses on sensor-specific technical parameters that influence data fidelity and system compatibility, Table 6 highlights practical deployment issues including UAS platform match, ecological or analytical use cases, and field integration needs. These distinctions emphasise that trade-offs between data richness, payload limitations, flight performance, and cost-efficiency must be carefully evaluated in sensor selection, particularly in multidisciplinary projects involving ecology, environmental planning, and geospatial decision making.

3.5. Drone Operating Conditions and Application Challenges

Despite their considerable advantages in landscape research, drones present a number of operational limitations and contextual challenges that may restrict their effective deployment. These challenges encompass a range of technical, environmental, legal, and logistical issues that must be addressed to ensure optimal data collection and integration. For example, weather conditions such as strong winds, heavy rainfall, or low visibility can significantly impair drone flight stability and sensor performance, particularly in large-scale or mountainous terrain. In addition, battery limitations remain a critical concern, especially for rotary-wing platforms with shorter flight durations, often requiring frequent recharging or mission interruptions during extended surveys [59,69].
Other complications stem from regulatory constraints, including airspace restrictions, flight permissions, and legal compliance, which vary significantly between countries and often require coordination with aviation authorities. Terrain complexity, vegetation density, and the electromagnetic environment may also introduce data noise or interference in sensor readings, complicating post-processing tasks and reducing analytical reliability. These factors illustrate that, while drones hold great promise, their practical integration into landscape research necessitates careful planning, scenario-specific calibration, and often hybridised workflows combining traditional and UAS-based methods.

3.5.1. Sensitivity to Weather Conditions

One of the most prominent operational challenges in drone-assisted landscape research is the sensitivity of UAS platforms to variable weather conditions. Because drone flights occur entirely outdoors, environmental factors such as wind, precipitation, humidity, and fog can significantly impair both flight stability and sensor functionality. Fixed-wing drones are particularly susceptible to wind turbulence, making it difficult to maintain a stable trajectory during data collection. Similarly, rotary-wing drones—although more manoeuvrable—suffer from diminished performance under high humidity or foggy conditions, often resulting in sensor inaccuracy and signal loss [59].
Several case studies have illustrated the disruptive impact of weather on UAS operations. For instance, a study conducted along Scotland’s coastal regions found that unpredictable wind patterns and rapid atmospheric shifts caused repeated interruptions in drone flight paths, undermining the accuracy of coastal erosion modelling [110]. In a dense urban setting such as Tokyo, studies indicate that high humidity and fog can degrade the performance of thermal and optical sensors for urban heat island analysis, and, without proper atmospheric correction, this may compromise the reliability of spatial planning models [111].
A Brazilian field experiment showed that strong wind gusts and abrupt temperature variations impaired the effectiveness of drone-based pesticide and fertiliser spraying, resulting in irregular droplet deposition that reduced crop yield predictability and overall operational efficiency [112]. In the mountainous terrain of Norway, severe snow and avalanche conditions were noted to cause data gaps in LiDAR-equipped UAS operations, introducing uncertainties in terrain modelling and directly influencing infrastructure and transportation planning decisions [113].
Together, these examples underscore the necessity of weather-aware flight planning, adaptive operational protocols, and platform-specific calibrations. They also highlight the essential role of pre-flight environmental assessments in ensuring the success and reliability of drone applications in landscape research.

3.5.2. Regulatory and Legal Constraints

The deployment of drones in landscape research is inherently tied to national and international airspace regulations, which often impose critical constraints on operational flexibility. Particularly in urban areas, protected ecological zones, or near strategic infrastructure, regulatory barriers such as altitude restrictions, mandatory licensing, data registration protocols, and flight permit requirements may significantly delay or limit UAS operations [69].
A study conducted in Germany’s protected natural areas highlighted the difficulty of integrating drone-based habitat mapping into landscape monitoring workflows. Researchers reported significant delays due to the lengthy permit acquisition process, which complicated pre-flight logistics and reduced project responsiveness in dynamic ecological settings [114]. In the United States, drone-supported infrastructure analysis within smart city frameworks faced notable challenges as well. Strict airspace controls around metropolitan zones led to prolonged waiting periods for flight authorisations, impeding timely data collection for transportation and mobility planning [115].
In the agricultural sector, airspace policies have similarly complicated UAS applications. A study in France revealed that regulatory limitations on drone-assisted pesticide spraying disproportionately affected small-scale farmers, many of whom lacked the resources or administrative support to navigate compliance processes. As a result, the adoption of precision agriculture practices remained uneven, despite the technological readiness of UAS platforms [116]. Cartographic and urban planning efforts have also been impacted. In a doctoral research project conducted in Canada, municipal land use modelling was delayed due to layered regulatory frameworks that required multi-tiered approvals for drone flights, thereby extending project timelines and increasing operational costs [117].
These examples reflect the pervasive influence of regulatory systems on drone utilisation. While safety and privacy considerations justify such frameworks, they also highlight the need for more adaptive, transparent, and field-specific regulatory models, particularly in research contexts that depend on rapid, flexible, and large-scale spatial data acquisition.

3.5.3. Energy and Battery Capacity

Energy limitations remain one of the most significant barriers to the widespread use of drones in landscape research, particularly for large-scale or long-duration projects. Rotary-wing UASs, while highly manoeuvrable and ideal for close-range surveys, have limited flight endurance due to their small battery capacity. This makes them unsuitable for expansive monitoring tasks without frequent interruptions. Fixed-wing drones offer greater range and flight duration but often require larger batteries, resulting in higher energy demands and weight penalties. Despite the use of advanced lithium-polymer battery technologies and the experimental integration of solar-assisted systems, current energy storage solutions are insufficient for the demands of many applied landscape research scenarios [60].
Field-based evidence illustrates these limitations across various sectors. In Canada’s forested regions, a study focusing on wildfire mapping using rotary-wing drones reported frequent disruptions in aerial data collection due to battery depletion, ultimately impacting the continuity of the spatial analysis process [118]. Similarly, fixed-wing drones deployed for precision agriculture in remote rural zones of Brazil encountered significant difficulty in completing planned flight paths. Their batteries could not support long-distance data capture missions, resulting in partial or incomplete mappings of crop performance [119].
Urban infrastructure planning also suffers from this constraint. In cities where drones are employed to monitor road networks and transport flow, energy limitations have forced project managers to install additional charging stations or rely on battery swaps to maintain consistent operations, an approach that increases cost and logistical complexity [120]. Meanwhile, in cartography and offshore construction, particularly in wind turbine siting projects, research has shown that high energy consumption restricts drone flight autonomy. Planned routes had to be curtailed or adjusted mid-operation, leading to data gaps and reduced modelling accuracy [121].
These examples underline how battery endurance remains a universal challenge across drone applications. Although technological developments such as autonomous energy management systems and real-time recharging infrastructure are on the horizon, their adoption remains limited. More importantly, enhancing drone capacity will also require regulatory innovation and AI-assisted flight path optimisation. As concluded in recent studies, overcoming energy limitations—alongside improving operational autonomy and legal flexibility—could dramatically expand the analytical power and spatial reach of drones in landscape research and planning [118,120,122].
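The practical effect of endurance limits on survey planning can be approximated with a back-of-envelope calculation. The Python sketch below estimates the area a rotary-wing UAS can photograph on a single battery cycle from cruise speed, usable flight time, image swath, and side overlap; all figures are illustrative assumptions rather than values drawn from the cited studies.

```python
# Back-of-envelope estimate of the area a rotary-wing UAS can survey on one
# battery cycle. All figures are illustrative assumptions, not measured values.
speed_m_s = 8.0       # cruise speed (m/s)
endurance_min = 25.0  # usable flight time per battery (minutes)
swath_m = 60.0        # ground footprint width of one image pass (m)
side_overlap = 0.7    # side overlap between adjacent flight lines

effective_swath_m = swath_m * (1.0 - side_overlap)   # new ground per pass
distance_m = speed_m_s * endurance_min * 60.0        # total flight distance
area_ha = distance_m * effective_swath_m / 10_000.0  # coverage in hectares

print(f"approx. {area_ha:.1f} ha per battery cycle")  # approx. 21.6 ha
```

Under these assumptions a single battery covers roughly 20 hectares, which makes clear why landscape-scale campaigns require either multiple battery swaps or fixed-wing platforms with longer endurance.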

4. Applications of Drones in Landscape Research

The integration of drone technology into landscape research has enabled a paradigm shift in how spatial, environmental, and cultural systems are analysed and interpreted. Uncrewed Aerial Systems (UASs), equipped with high-resolution cameras and multi-sensor payloads, have enhanced the speed, accuracy, and scope of data collection processes that once depended on labour-intensive and time-consuming field methods. These systems offer precise imaging, three-dimensional (3D) terrain modelling, and real-time monitoring capacities, thereby supporting a wide range of applications that span from natural ecosystem assessment to urban infrastructure analysis and heritage conservation [58].
Figure 4 provides a visual representation of the typical methodological workflow involved in UAS-based landscape studies, adapted from Karahan et al. [123]. This process generally begins with site selection and pre-flight planning, followed by data acquisition through platform-specific flight operations. Post-processing and data analysis—often involving Geographic Information Systems (GISs) and remote sensing software, and, in some cases, complemented by AI-supported analytics—then translate raw sensor outputs into actionable spatial insights. In the subsections that follow, the multifaceted applications of drones in landscape research are explored in detail. These include vegetation analysis and biodiversity monitoring, water resource mapping, urban and rural land use modelling, post-disaster assessment, and cultural landscape documentation. Each application demonstrates the potential of UAS technologies not only to capture dynamic landscape processes with high temporal and spatial resolution, but also to support more sustainable and informed decision making in environmental planning and policy development.

4.1. Mapping and Modelling

Drone technologies have dramatically advanced the fields of land mapping and three-dimensional (3D) modelling, reshaping the methodologies used in landscape research and planning. When integrated with Geographic Information Systems (GISs), drones enable the acquisition of high-resolution spatial data in near real time, facilitating the detailed characterisation of topography, land use patterns, and ecosystem structures [124]. In particular, 3D modelling has proven essential for understanding the morphological features of landscapes, offering valuable input for ecological assessments and sustainable spatial planning [125].
In the context of land mapping, UASs equipped with LiDAR and multispectral sensors have outperformed traditional satellite-based methods in both accuracy and temporal flexibility. These drones can generate finely detailed models of agricultural zones, urban development areas, and conservation landscapes. For instance, Guebsi et al. (2024) demonstrated that UAS-assisted mapping techniques enhanced irrigation management efficiency compared to conventional approaches, highlighting their potential for more precise and sustainable water resource use [126]. Similarly, Mashala et al. (2023) demonstrated that drone imagery significantly enhanced the detection of land use changes in rapidly developing urban environments [127]. Research by Xiao et al. (2022) further highlighted the effectiveness of drones in monitoring erosion dynamics in protected nature reserves [128].
A region-specific case study in Türkiye illustrates the practical advantages of drone-supported mapping. Conducted in the Uzundere region of the Çoruh Valley, the research addressed the limitations of satellite-based orthoimages—particularly their seasonal bias toward periods of dense vegetation—which posed significant challenges for identifying agricultural terraces. To overcome this, high-resolution images were captured at altitudes ranging between 25 and 500 metres using drones equipped with nadir-facing (90-degree) cameras. Although LiDAR and multispectral sensors were not available, the study validated the effectiveness of nadir RGB imaging as an alternative data source.
Using ArcGIS 10.8.2 (ESRI, Redlands, CA, USA) and QGIS 3.34.11 (QGIS Development Team, Open-Source Geospatial Foundation, Beaverton, OR, USA) software, these UAS-acquired images were georeferenced and processed to produce updated and precise maps of the agricultural terraces in Erikli Village. The analysis not only delineated rural–urban boundaries, but also assessed terrace abandonment and physical degradation. Compared to conventional mapping methods, this drone-based workflow provided higher spatial fidelity and contributed substantially to regional landscape change monitoring (Figure 5 and Figure 6). This case study reflects original research conducted in 2023 in Uzundere, Türkiye, as part of the ongoing doctoral dissertation of Oğuz Gökçe under the supervision of Prof. Dr. Faris Karahan at Atatürk University. As the dissertation has not yet been publicly defended or archived, the results presented here are not cited as a formal reference. However, the fieldwork methodology and preliminary findings form the empirical basis of this section and are available upon request or can be provided as supplementary material.
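At its simplest, the georeferencing step described above amounts to fitting an affine transformation between image pixel positions and surveyed ground coordinates. The Python sketch below illustrates that fit with NumPy least squares; the ground control points and coordinate values are hypothetical, and production workflows would instead rely on the georeferencing tools built into GIS or photogrammetry software such as those used in this study.

```python
import numpy as np

# Hypothetical ground control points: image pixel (col, row) positions matched
# to surveyed map coordinates (x, y) in metres (e.g. a projected UTM CRS).
pixels = np.array([[0.0, 0.0], [4000.0, 0.0], [0.0, 3000.0], [4000.0, 3000.0]])
coords = np.array([[500000.0, 4500000.0],
                   [500200.0, 4500000.0],
                   [500000.0, 4499850.0],
                   [500200.0, 4499850.0]])

# Least-squares fit of a 2-D affine transform: map = params @ [col, row, 1]
A = np.column_stack([pixels, np.ones(len(pixels))])
px, *_ = np.linalg.lstsq(A, coords[:, 0], rcond=None)
py, *_ = np.linalg.lstsq(A, coords[:, 1], rcond=None)

def pixel_to_map(col, row):
    """Project an image pixel into map coordinates via the fitted transform."""
    return float(px @ [col, row, 1.0]), float(py @ [col, row, 1.0])
```

With more than the minimum number of control points, the same least-squares fit also yields residuals that indicate how well the affine model explains the survey measurements.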

4.2. Three-Dimensional Modelling

Three-dimensional (3D) modelling has emerged as a core analytical function in drone-assisted landscape research, particularly through its integration with Geographic Information Systems (GISs). This fusion allows researchers to move beyond flat, two-dimensional representations, enabling volumetric and elevation-based analyses that are essential for assessing soil erosion patterns, landslide susceptibility, and urban sprawl. High-resolution drone imagery, when processed through photogrammetric software and overlaid on geospatial platforms, supports highly accurate digital terrain models that offer both visual clarity and analytical depth [60].
The utility of 3D modelling has been documented across a variety of landscape contexts. In a study by Hussain et al. (2022), drones were deployed in landslide-affected regions to estimate soil displacement volumes [129]. The resulting 3D models achieved higher measurement accuracy and spatial resolution than those obtained through traditional ground surveying techniques, providing valuable data for risk management and geotechnical planning [129]. Similarly, Siddiq et al. (2022) explored the application of 3D terrain reconstruction in coastal zones vulnerable to sea level rise [130]. Their research not only mapped the scale of erosion, but also aided in the simulation of mitigation strategies, thus bridging environmental monitoring with policy development [130].
Urban transformation areas have also benefited from this methodology. Hu and Minner (2023) evaluated the use of 3D modelling in tracking land use dynamics associated with rapid urban expansion [131]. Their study demonstrated how drone-acquired data can detect subtle morphological changes in terrain, informing zoning decisions and urban design strategies with higher precision [131].
In sum, the integration of UAS-based imaging and 3D modelling technologies has revolutionised how landscape researchers approach topographic analysis. It enables temporal comparisons, disaster forecasting, and planning simulations that are critical for evidence-based decision making in both rural and urban contexts.
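Many of the volumetric analyses mentioned above, such as estimating soil displacement after a landslide, reduce to differencing two co-registered digital elevation models and summing the per-cell changes. A minimal NumPy sketch, using tiny illustrative grids rather than real survey data, is as follows:

```python
import numpy as np

cell_area_m2 = 0.25  # one cell of a 0.5 m resolution DEM (illustrative)

# Tiny illustrative elevation grids (metres) from two survey epochs
dem_before = np.array([[100.0, 100.5],
                       [101.0, 101.5]])
dem_after  = np.array([[ 99.0, 100.0],
                       [101.2, 101.5]])

diff = dem_after - dem_before
eroded_m3    = -diff[diff < 0].sum() * cell_area_m2  # material lost
deposited_m3 =  diff[diff > 0].sum() * cell_area_m2  # material gained
net_m3       = diff.sum() * cell_area_m2             # net volume change
```

In practice, the accuracy of such cut-and-fill estimates depends on the co-registration of the two surveys, which is why ground control points or RTK positioning are typically required.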

4.3. Vegetation and Ecosystem Analyses

Drone technologies have emerged as indispensable tools for vegetation analysis and ecosystem monitoring, offering new levels of precision and spatial coverage in the assessment of ecological dynamics. Uncrewed Aerial Systems (UASs) equipped with multispectral and hyperspectral sensors provide detailed spectral data that allow for the detection of plant health status, species composition, and ecosystem functionality. These systems are particularly useful in large and ecologically sensitive landscapes where traditional ground-based methods are either impractical or inefficient [113].
Monitoring vegetation is a fundamental component of biodiversity conservation and sustainable land management. Through high-resolution aerial surveys, drones can capture changes in canopy structure, detect early signs of plant stress, and support habitat mapping in both protected and disturbed areas. The integration of drone imagery with geospatial analysis tools has enabled more frequent and non-invasive monitoring regimes, which are essential for adaptive ecosystem management practices [76].
This section explores the role of drones in vegetation and ecosystem analyses through several interrelated subdomains, including species-level vegetation mapping, plant health and stress detection, habitat quality evaluation, and ecological succession tracking. By examining case studies from different biomes and landscape types, it highlights the diverse applications of UASs in understanding and managing terrestrial ecosystems under climate, land use, and anthropogenic pressures.

4.3.1. Vegetation Monitoring

Vegetation monitoring represents one of the most significant applications of drone technology in both agricultural and ecological research. UASs equipped with multispectral cameras and vegetation indices—particularly the Normalized Difference Vegetation Index (NDVI)—allow researchers to assess plant vitality, biomass distribution, and photosynthetic activity across vast and diverse landscapes [124]. These tools enable high-frequency observations that are crucial for detecting anomalies, managing crop performance, and understanding broader ecological trends.
A growing body of literature supports the effectiveness of drones in monitoring vegetation conditions with greater spatial and temporal resolution than satellite-based systems. For instance, Ajakwe et al. (2024) reported that drones enabled the early detection of disease outbreaks in crops, which improved agricultural productivity by allowing for timely intervention and resource optimisation [132]. Similarly, Bollas et al. (2021) used multispectral UASs to monitor vegetation change in temperate forest ecosystems [133]. Their results indicated significantly higher classification accuracy and spectral fidelity compared to traditional satellite platforms, particularly under canopy-dense conditions [133].
In the context of wetland ecology, Zheng et al. (2022) employed UAS imagery to identify plant species distributions and assess habitat changes over time [134]. This approach provided detailed insights into spatial heterogeneity and enabled the creation of high-resolution habitat maps, which are vital for conservation planning and biodiversity protection in dynamic aquatic environments [134].
These examples underscore the value of drone-based vegetation monitoring in supporting data-driven decision making in agriculture, forestry, and conservation. The integration of NDVI and spectral analysis within UAS workflows continues to evolve, offering new pathways for precision landscape management and ecosystem resilience assessment.
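The NDVI referred to throughout this subsection is computed per pixel as (NIR − Red) / (NIR + Red), yielding values between −1 and +1, with dense, healthy vegetation approaching +1. A minimal Python sketch with illustrative reflectance values:

```python
import numpy as np

def ndvi(nir, red):
    """Per-pixel NDVI: (NIR - Red) / (NIR + Red), in the range [-1, 1]."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / np.clip(nir + red, 1e-6, None)  # guard against /0

# Illustrative reflectance values: healthy vegetation reflects strongly in
# the near-infrared and absorbs red light, pushing NDVI towards +1.
nir_band = np.array([[0.50, 0.45], [0.30, 0.10]])
red_band = np.array([[0.08, 0.10], [0.20, 0.09]])
print(ndvi(nir_band, red_band))
```

The same array-based computation scales directly from these toy values to full multispectral orthomosaics, which is what makes NDVI mapping routine in UAS workflows.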

4.3.2. Habitat Assessments

The ability of drones to capture high-resolution spatial and spectral data has significantly expanded the scope of habitat assessments in landscape research. UAS technologies provide a powerful platform for ecosystem monitoring, particularly in tracking habitat changes, identifying sensitive ecological zones, and supporting biodiversity conservation efforts. By offering non-invasive and repeatable survey capabilities, drones enable the rapid assessment of landscape-level patterns that are otherwise difficult to detect through traditional field-based methods [122].
Lyu et al. (2022) conducted a comprehensive meta-analysis that confirmed the effectiveness of UASs in biodiversity monitoring, highlighting their capacity to detect vegetation structure, wildlife presence, and ecological connectivity across diverse biomes [135]. Recent advancements in artificial intelligence and sensor technologies have further enhanced this capacity. Kaur et al. (2024), for example, used deep learning techniques in combination with drone imagery to identify endangered species within fragmented habitats, achieving high accuracy rates under variable lighting and terrain conditions [136]. Similarly, Povlsen et al. (2023) demonstrated the utility of thermal imaging and object detection algorithms embedded in UAS platforms for monitoring wildlife activity in open and forested environments [137].
A case study conducted in the Erikli village of the Uzundere Valley offers a compelling local example of how drones can enhance habitat evaluation. In this study, drone imagery was collected over multiple years, seasons, and phenological periods (Figure 7, Figure 8, Figure 9 and Figure 10). Notably, Cornelian cherry trees (Cornus mas), which bloom earlier than most other fruit species, were successfully identified in early spring imagery captured on 2 April 2024. Their flowering phenology allowed for clear visual differentiation from surrounding vegetation, aiding in habitat classification and species distribution mapping.
The study also integrated agricultural meteorological data with knowledge of seasonal physiological cycles, improving the interpretability of drone-acquired imagery. This integration facilitated the monitoring of flowering and leaf-shedding patterns, particularly during spring and autumn transitions. Compared to traditional field surveys, drone-assisted monitoring proved to be faster, more scalable, and more accurate in identifying plant species and assessing microhabitat characteristics within the terraced agricultural landscape.

4.4. Water Resource Management

Effective water resource management remains a cornerstone of sustainable environmental planning, especially in the context of intensifying climate variability, rapid urban expansion, and increased ecological vulnerability. In recent years, Uncrewed Aerial Systems (UASs) have emerged as transformative tools for hydrological observation and risk mitigation, offering spatially precise, high-frequency data that exceed the capabilities of conventional ground-based or satellite systems. Drone-based monitoring now encompasses a wide range of hydrological functions, including surface water mapping, sediment transport analysis, water quality surveillance, and floodplain delineation [58].
One of the primary advantages of UAS platforms lies in their flexibility and rapid deployment capacity, which is particularly beneficial for capturing dynamic hydrological events such as flash floods or pollutant discharges. Their seamless integration with Geographic Information Systems (GISs), remote sensing technologies, and artificial intelligence algorithms enables the multi-scale modelling of water flow, sediment behaviour, and flood risks. These integrated systems support real-time decision making, improve early warning capabilities, and contribute to climate adaptation strategies through enhanced water system diagnostics.
In the following subsections, this section explores two primary thematic domains in UAS-assisted water resource management: hydrological mapping and flood risk analysis. Drawing upon recent empirical studies from both rural and urban settings, the discussion illustrates how drones are reshaping our ability to interpret, predict, and manage water-related dynamics across diverse environmental and infrastructural contexts.

4.4.1. Hydrological Mapping

Drones have emerged as highly effective tools for hydrological mapping, enabling the continuous monitoring of water levels, pollutant dispersion, and sediment dynamics in diverse aquatic environments such as rivers, lakes, wetlands, and estuarine systems [59]. Their high-resolution imaging and operational flexibility have redefined traditional hydrological assessment methods, delivering near-real-time data with enhanced spatial granularity.
Case studies from Türkiye and East Asia reflect the growing regional adaptation of drone-based hydrological techniques. In Türkiye, Yavuz and Tufekcioglu (2023) analysed decades of flood events and demonstrated that UASs significantly improved response times and situational awareness during emergencies, especially in topographically constrained or inaccessible areas [138]. In contrast, Liu et al. (2024) focused on wetland ecosystems, revealing that drones offer superior accuracy and coverage in water quality assessments compared to ground-based sampling, particularly for tracking turbidity, algal blooms, and surface pollutants [139]. Complementing these efforts, Dai et al. (2022) applied UAS imaging in riverine systems to trace sediment transport patterns, contributing to the modelling of erosion, deposition, and channel morphology [140].
While the thematic focus of these studies varies—from disaster response to ecological diagnostics and geomorphic process tracking—they collectively highlight several key advantages of drone-based hydrological mapping. First, UASs enable highly targeted data collection in both spatially constrained and ecologically sensitive areas. Second, their integration with GIS and modelling platforms allows for dynamic system visualisation across time and space. Lastly, the regional diversity of these applications—from Türkiye’s floodplain crises to China’s wetland monitoring—illustrates the global adaptability of drones in diverse hydrological contexts.
In sum, UAS-assisted hydrological mapping has evolved from a supplementary technique into a core methodological pillar for water system analysis. As environmental variability intensifies, the role of drones in providing precise, scalable, and context-sensitive hydrological data is poised to expand further across both scientific research and policy-oriented planning.

4.4.2. Flood Risk Analyses

The integration of UAS technologies into flood risk analysis has substantially enhanced our ability to detect, model, and visualise flood-prone areas. Drones provide high-resolution topographic data that feed into dynamic hydrological modelling systems, supporting real-time risk mapping, vulnerability assessment, and early warning infrastructure [125,140]. These technological advantages position UASs as critical tools in proactive disaster preparedness and strategic mitigation planning.
Darji et al. (2024) showcased how UASs can generate Digital Surface Models (DSMs) that, when coupled with two-dimensional hydrodynamic simulations, produce detailed flood hazard maps across various recurrence intervals [141]. Their work illustrated the importance of spatial granularity in delineating water pathways and inundation zones for land use planning [141]. In contrast, Munawar et al. (2021) applied UAS surveillance in conjunction with deep learning algorithms to create a real-time disaster response system capable of detecting and counting flood victims [142]. This innovative approach shifted the focus from predictive mapping to operational response, demonstrating the adaptability of UASs in emergency scenarios [142]. From a long-term planning perspective, Trepekli et al. (2022) utilised UAS imagery and GIS data to evaluate building exposure to historical flood levels, enabling tailored adaptation strategies for urban resilience under changing climate conditions [143].
While each of these studies targets different phases of flood risk management—forecasting, real-time response, and long-term spatial planning—they collectively demonstrate the breadth of UAS applications across the entire disaster management cycle. Methodologically, the use of DSMs, machine learning, and GIS platforms illustrates how UAS data can be tailored to various technical contexts, enhancing both predictive accuracy and operational efficiency. The case studies also highlight the cross-scalar adaptability of drones, from site-specific building analysis to regional floodplain modelling.
In sum, UAS-assisted flood risk analysis bridges the gap between hazard detection and actionable planning. By enabling multi-modal data integration and rapid environmental assessment, drones empower both decision-makers and emergency responders to anticipate, mitigate, and recover from hydrological disasters with increased confidence and precision.
Hydrological mapping is oriented toward the continuous observation of water levels, sediment dynamics, and pollutant dispersion, whereas flood risk analyses are geared toward scenario modelling, vulnerability assessment, and disaster mitigation. These complementary domains reflect distinct but overlapping technical priorities: the former emphasises ecological diagnostics and geomorphic interpretation, whereas the latter prioritises spatial risk visualisation and emergency responsiveness.
Methodologically, both domains benefit from the integration of UAS data with GIS platforms, yet diverge in analytical approaches: hydrological mapping often relies on spectral and visual data, while flood risk analysis incorporates DSM generation, hydrodynamic modelling, and AI-driven classification. Moreover, the geographic diversity of case studies—from Türkiye and China to India and Europe—reveals how UAS-based techniques are tailored to regional hydrological challenges, including flash floods, wetland degradation, and urban inundation.
As water-related risks intensify under climate change, the continued refinement of UAS-based hydrological tools will be essential. Future research and policy frameworks should increasingly focus on harmonising these methods within integrated watershed and disaster risk management systems, ensuring that UAS technologies not only detect, but also anticipate and reduce vulnerabilities in both natural and built environments.
In addition to remote hydrological observations, recent developments in in situ atmospheric sensing have enabled drones to measure key microclimatic variables such as temperature, humidity, and evapotranspiration directly during low-altitude flights. These measurements—often enabled by onboard environmental sensors—are critical for water balance analysis, irrigation scheduling, and land–atmosphere interaction studies in both natural and agricultural landscapes [88,89].
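As one example of how such onboard temperature measurements can feed into water balance analysis, temperature-based approximations such as the Hargreaves equation estimate reference evapotranspiration from daily temperature extremes and extraterrestrial radiation. The sketch below uses illustrative input values, and the cited studies do not necessarily employ this particular formulation:

```python
import math

def hargreaves_et0(t_mean, t_min, t_max, ra_mm_day):
    """Hargreaves reference evapotranspiration estimate (mm/day).

    ra_mm_day is extraterrestrial radiation expressed as the equivalent
    depth of evaporable water (mm/day).
    """
    return 0.0023 * ra_mm_day * (t_mean + 17.8) * math.sqrt(t_max - t_min)

# Illustrative mid-summer daily readings
print(hargreaves_et0(t_mean=24.0, t_min=18.0, t_max=30.0, ra_mm_day=15.0))
# approx. 5.0 mm/day
```

Such estimates, combined with precipitation data, allow a simple daily water balance (storage change = rainfall − evapotranspiration − runoff) to inform irrigation scheduling in the agricultural landscapes discussed above.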

4.5. Cultural Heritage and Archaeological Documentation

The integration of drone technologies into cultural heritage and archaeological research has introduced a paradigm shift in how historical sites are documented, monitored, and preserved. Traditionally reliant on manual surveys and ground-level observations, the field now benefits from the non-invasive, scalable, and high-resolution data collection capabilities of UASs. These systems are particularly valuable in documenting fragile or hard-to-access structures, enabling researchers to generate accurate three-dimensional (3D) models of archaeological remains and architectural heritage without causing physical disturbance [60].
UASs have proven instrumental in supporting both preventive and corrective conservation practices. Their ability to capture multispectral, thermal, and high-resolution visual data enhances the capacity to detect structural deterioration, material fatigue, and environmental impacts over time. These features are increasingly used by archaeologists, conservation architects, and heritage site managers to plan restoration projects, assess ongoing degradation, and monitor post-intervention outcomes [76].
In the sections that follow, two key applications of drones in cultural heritage studies are examined: first, their role in structure and site documentation; and second, their use in supporting restoration and conservation strategies. Together, these applications highlight the critical role UASs play in preserving global cultural heritage under the growing pressures of climate change, tourism, and urban development.

4.5.1. Structure and Site Documentation

The use of drones for the digital documentation of architectural heritage and archaeological sites has brought significant advancements in accuracy, accessibility, and preservation efforts. Their ability to generate high-resolution three-dimensional (3D) models through photogrammetric techniques and LiDAR scanning provides researchers and conservation professionals with precise datasets that were previously difficult to obtain through conventional methods [53]. This approach is especially beneficial in remote or structurally fragile locations, where physical access may be restricted or potentially damaging.
In a study conducted by Treccani et al. (2024), UASs were deployed to digitally document a series of historical fortifications in the Terracorpo region of Italy [144]. The research team successfully produced detailed 3D models of remote castles, enabling virtual exploration and long-term preservation planning without disturbing the physical integrity of the structures [144]. Similarly, López-Herrera et al. (2025) applied high-resolution drone mapping techniques to the archaeological remains of the Almenara Castle in Spain [145]. By utilising DJI Phantom 4 RTK (DJI, Shenzhen, China) platforms, they created centimetre-level accurate 3D reconstructions that are now used for monitoring structural stability and informing restoration strategies [145].
Fiz et al. (2022) also demonstrated the value of drone-based photogrammetry in archaeological contexts [146]. Their study revealed that UAS-assisted 3D modelling not only enhanced the spatial accuracy of cultural site documentation, but also facilitated interdisciplinary collaboration between historians, architects, and conservationists by providing a shared digital framework for site analysis and decision making [146].
These examples illustrate the evolving role of drones in the digital heritage landscape, offering a reliable, non-invasive, and scalable solution for documenting historic sites with scientific precision.

4.5.2. Restoration and Conservation Efforts

High-resolution imagery obtained via drones has emerged as a fundamental asset in the restoration and conservation of historic structures. Through aerial surveys and multispectral imaging, drones enable the precise detection of material wear, surface erosion, and structural degradation, phenomena that are often difficult to identify through conventional inspection techniques. Their non-invasive nature, combined with the ability to produce repeatable time-series data, provides conservationists with a reliable method for monitoring long-term deterioration processes and guiding restoration interventions [69].
For example, De Fino et al. (2023) utilised drone-based photogrammetry to assess the gradual decay of ancient stone structures in Rome [147]. Their study demonstrated that UASs could reveal subtle surface changes over extended periods, allowing experts to detect degradation before it becomes structurally critical, thereby improving the effectiveness of preventive maintenance planning [147]. Similarly, Kerle et al. (2019) employed multispectral UAS data to evaluate the integrity of historical buildings by mapping material weathering and identifying early-stage damage [148]. By integrating advanced image processing algorithms, the researchers highlighted the value of combining UAS imagery with AI-based analytical tools for post-disaster damage assessments in cultural heritage contexts [148].
In another study, Laohaviraphap and Waroonkun (2024) applied thermal imaging via drones to detect micro-level surface erosion and thermal anomalies in architectural heritage structures [149]. Their findings underscored the utility of thermal UAS surveys in uncovering hidden structural vulnerabilities, especially in tropical and monsoon-prone regions where traditional inspection methods are often hindered by climatic constraints [149].
These examples illustrate how drone technologies not only enhance the precision of damage assessment in heritage conservation, but also expand the methodological toolkit available for preserving the integrity and authenticity of historical landmarks across varying environmental and cultural contexts.

4.6. Urban Planning and Environmental Monitoring

Urbanisation and rapid land use change have significantly heightened the demand for innovative tools that can support real-time monitoring and sustainable planning in urban environments. Drones have emerged as vital instruments in this context, offering high-resolution spatial data, flexible deployment capabilities, and integration with Geographic Information Systems (GISs) and remote sensing technologies. These features make them especially suitable for urban planning tasks such as green infrastructure mapping, air quality assessment, traffic flow analysis, and land use change detection [150].
The COVID-19 pandemic further catalysed the adoption of UASs in urban planning by demonstrating their value in non-contact data collection, public health surveillance, and emergency logistics. These expanded roles contributed to the development of new paradigms in smart city governance and environmental management, where drones were increasingly used to evaluate ecological performance indicators and inform policy decisions in real time.
In this section, the diverse applications of drones in urban and environmental contexts are examined through three primary themes: (i) green infrastructure and land use monitoring, (ii) air pollution and urban heat island analysis, and (iii) urban expansion and zoning control. Each of these categories is discussed with reference to recent empirical studies, highlighting how drone technologies have reshaped data-driven urban planning, especially under the pressures of public health crises and environmental degradation.

4.6.1. Green Infrastructure, Spatial Water Management, and Urban Ecology

The integration of drones into urban ecology and spatial environmental monitoring has created new opportunities for understanding the distribution and performance of green infrastructure in cities. UAS-based analyses have proven essential for identifying the spatial continuity of urban green spaces, detecting fragmentation in ecological corridors, and assessing public use of open spaces, particularly during disruptive periods such as the COVID-19 pandemic [150,151].
For instance, Li et al. (2024) utilised thermal data collected by drones to quantify the cooling effects of urban parks, revealing their capacity to mitigate the urban heat island (UHI) phenomenon and improve microclimatic comfort [152]. Similarly, Shao et al. (2021) examined the contribution of green roof systems to urban air quality using drone imagery to monitor vegetation performance and pollutant dispersion [153]. During the pandemic, drone technology enabled planners to monitor the increased demand for outdoor environments, offering insights into the adaptive use of public green spaces for recreation, mental health, and physical distancing compliance [150].
Beyond surface vegetation monitoring, drones have also become vital tools for urban water management. Liu et al. (2021) applied multispectral imaging and LiDAR analyses to evaluate the water needs of urban green areas, enabling more accurate irrigation scheduling and resource allocation [154]. In another study, Munawar et al. (2021) demonstrated that UAS-based assessments of urban flood risks and rainwater drainage systems were not only faster, but also more cost-effective than traditional surveying methods [155]. Complementing these efforts, Mishra et al. (2023) proposed the use of drone-derived geospatial data to simulate subsurface water flows in built-up environments, providing planners with actionable data to guide sustainable water resource strategies [156].
Collectively, these studies affirm the transformative role of drones in mapping, managing, and planning urban ecological infrastructure. Through their multi-sensor capabilities and rapid data acquisition, UASs support the design of climate-resilient, ecologically integrated cities capable of adapting to environmental and public health challenges.

4.6.2. Drone Utilisation During the Pandemic

The COVID-19 pandemic significantly altered patterns of urban mobility and public space use, prompting the rapid deployment of drone technologies for environmental monitoring and crisis management. Equipped with multispectral sensors, gas detection systems, and real-time image analysis capabilities, UASs offered unique advantages in monitoring air pollution, spatial occupancy, and human activity trends without direct human contact [157].
In the context of air quality surveillance, drones enabled the precise measurement and mapping of pollutant concentrations in densely populated and industrial zones. Özer (2024) used UASs to collect air samples in major Turkish cities, generating high-resolution dispersion maps for nitrogen dioxide (NO2) and PM2.5 pollutants, which provided real-time insights into the spatial dynamics of air pollution [158]. Similarly, Bakirci (2024) utilised drones to detect industrial emission sources, creating detailed air quality models that helped identify pollution hotspots in heavy industrial areas [159]. Notably, while traffic volumes declined during lockdowns, emissions from specific manufacturing zones increased, a trend effectively captured and analysed using UAS-based environmental monitoring systems [160].
Beyond air quality analysis, drones were instrumental in studying the dynamic use of public spaces during the pandemic. They were deployed to assess occupancy patterns, monitor behavioural shifts, and evaluate compliance with social distancing measures [161]. Zhang et al. (2023) demonstrated that UAS-generated spatial data enabled the analysis of square occupancy rates, revealing changes in user behaviour in response to pandemic-related restrictions [162]. Valdez-Delgado et al. (2023) conducted void space analyses using drone imagery to evaluate how city squares were managed during the pandemic, providing empirical data on how people reappropriated urban spaces [163]. In another study, Olivatto et al. (2023) explained how UAS-assisted mapping informed the development of alternative public space utilisation plans, guiding more adaptive and resilient urban design strategies during public health crises [164].
These examples underscore how drones became vital tools for real-time urban diagnostics, offering scalable and contactless solutions for managing environmental and spatial challenges under pandemic conditions.

4.7. Use of Drones in the Landscape Sector

In the evolving discipline of landscape architecture and engineering, drones have become indispensable tools for enhancing spatial data acquisition, accelerating project workflows, and improving the accuracy of design interventions. Their integration into the landscape sector enables professionals to overcome the limitations of traditional site analysis methods by offering high-resolution imaging, precise topographic modelling, and real-time spatial assessments. From vegetation health monitoring to infrastructure layout optimisation, drones contribute not only to the aesthetic and functional quality of landscape designs, but also to their environmental sustainability and resilience.
Particularly within the field of landscape engineering, drones are increasingly utilised for terrain analysis, slope stability evaluations, and sustainable drainage planning, domains where timely and georeferenced data are essential. Their ability to integrate with Geographic Information Systems (GISs), remote sensing (RS), and artificial-intelligence-driven analytics also enhances multi-scalar planning strategies. This section explores the multifaceted applications of drones in landscape-related practice across three main domains: (i) design and planning processes, (ii) vegetation and structural inventory tracking, and (iii) spatially oriented engineering solutions. Each domain is elaborated upon through recent empirical findings, underscoring the technological and ecological value of drone-supported landscape practice.

4.7.1. Landscape Design and Planning

The integration of drones into the landscape design process has markedly improved the precision, efficiency, and responsiveness of spatial planning practices. Through high-resolution imaging, digital terrain modelling, and orthophoto mapping, drones facilitate comprehensive site analyses, enabling landscape architects to make more informed and timely decisions [165,166]. Cureton (2020) highlighted the use of drone-derived Digital Surface Models (DSMs) and orthophotos in planning projects as tools that can significantly improve workflow efficiency and enhance stakeholder responsiveness [167]. Similarly, Ma et al. (2024) demonstrated that UASs are particularly effective in urban landscape design, where they support the analysis of green space distribution and pedestrian circulation patterns [168]. These contributions highlight the role of drones in advancing data-driven design methodologies that align with the ecological and functional imperatives of contemporary landscape architecture [168].

4.7.2. Vegetation and Structural Inventory Tracking

Monitoring and documenting vegetation and structural assets are fundamental components of landscape planning and maintenance, particularly in large-scale or ecologically sensitive projects. Drone-mounted multispectral and hyperspectral sensors offer an advanced means of assessing plant health by detecting physiological stress, disease indicators, and biomass variations [169]. Pun et al. (2025) highlighted that drone-assisted NDVI analyses allowed for the early identification of plant diseases and enabled the continuous evaluation of vegetation density and health, contributing to more responsive green infrastructure management [170]. Complementing this, Wagner and Egerer (2022) demonstrated how drone-based species mapping in urban parks and gardens played a pivotal role in conserving botanical diversity and maintaining ecological balance within designed landscapes [171]. In addition to vegetation tracking, drones also provide crucial data for hardscape planning. Xu and Diao (2023) showed that 3D structural models generated through drone imagery enhanced the spatial organisation of built elements—such as walkways, seating zones, and pergolas—ensuring that design interventions were both functional and context-sensitive [172].
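To illustrate the band arithmetic underlying the NDVI-based health assessments described above, the following Python sketch computes an NDVI map and flags potentially stressed pixels. It is a simplified illustration: the red and near-infrared bands are assumed to have already been extracted from a multispectral orthomosaic as NumPy arrays, and the 0.4 stress threshold is hypothetical.

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Normalised Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Values near +1 indicate dense, healthy vegetation; values near 0
    indicate bare soil or built surfaces; negative values suggest water.
    """
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    denom = nir + red
    # Avoid division by zero over masked / no-data pixels.
    return np.where(denom == 0, 0.0, (nir - red) / denom)

def stress_mask(ndvi_map: np.ndarray, threshold: float = 0.4) -> np.ndarray:
    """Flag pixels whose NDVI falls below a (hypothetical) stress threshold."""
    return ndvi_map < threshold
```

In practice, operational stress thresholds vary with species, season, and sensor calibration, which is why the studies cited above pair NDVI maps with ground-truth observations rather than applying a fixed cut-off.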

4.7.3. Landscape Engineering and Spatial Applications

In the domain of landscape engineering, drones serve as powerful tools for collecting high-resolution spatial data essential for terrain analysis, erosion control, soil permeability assessment, and water management. Their capacity to generate accurate surface models and geospatial data layers allows engineers and planners to make data-informed decisions in real time. Ghorbanzadeh et al. (2019) illustrated the utility of drones in predicting soil loss and conducting slope stability analyses for erosion control projects, significantly improving pre-intervention modelling accuracy [173]. Building on this, Koganti et al. (2021) employed UAS-acquired data to evaluate soil permeability and optimise drainage systems, contributing to the design of more resilient and sustainable landscape infrastructure [174]. Furthermore, Sibanda et al. (2021) demonstrated that drone-based monitoring in urban park settings contributed to improved water use efficiency, emphasising the role of drones in sustainable water resource management within the context of public green spaces [175]. These applications highlight the critical role drones play in landscape engineering by facilitating rapid, cost-effective, and ecologically sound design and maintenance interventions.

4.8. Use of Drones in Rural Landscapes, Nature Conservation, and Sustainable Tourism

The integration of drones into rural landscape management, nature conservation, and sustainable tourism practices has introduced transformative changes in spatial planning and environmental monitoring. By enabling high-resolution data acquisition, real-time surveillance, and precision analysis, drones have become vital instruments in supporting sustainable land use, biodiversity protection, and the development of environmentally conscious tourism strategies. Their applications range from monitoring ecological integrity in protected areas and assessing the condition of agricultural terraces to designing ecotourism corridors and optimising visitor mobility. As pressures on rural ecosystems intensify due to climate change and land use dynamics, the role of UAS technologies in adaptive management and multi-scalar spatial planning becomes increasingly significant. This section explores the broad spectrum of drone applications in these domains under three interrelated themes: rural landscape and agricultural management, nature conservation and protected area monitoring, and sustainable tourism and ecotourism development.

4.8.1. Rural Landscape Management and Agricultural Landscape Applications

Drones have emerged as indispensable tools in rural landscape planning and agricultural land management, providing high-resolution spatial data critical for monitoring soil and water resources, managing erosion risks, mapping agricultural terraces, and supporting pastoral and rangeland systems [176,177]. Their ability to operate flexibly across vast terrains enables real-time analysis that enhances both precision agriculture and environmental sustainability.
In a study by Gioia et al. (2021), UASs were successfully used to perform spatial analysis of agricultural terraces in rural landscapes, significantly aiding in erosion risk assessments and soil stability modelling [178]. Delavarpour et al. (2021) further demonstrated that drones facilitate the monitoring of pastoral activities by evaluating rangeland quality and optimising livestock distribution based on vegetation availability and terrain conditions [179]. Meanwhile, Cienciała et al. (2022) conducted a comprehensive study on how UAS-based imaging contributes to the updating of cadastral maps and facilitates land consolidation processes, particularly in fragmented rural agricultural zones [180].
The contribution of drones to rural ecosystem preservation is further evidenced by Zhang et al. (2021), who found that UASs offer an effective means to detect land use changes over time, enabling timely interventions for sustainable land management [181]. In irrigation-focused research, Meron et al. (2025) highlighted the capacity of drones to measure soil moisture levels with high accuracy, ultimately improving water-use efficiency and supporting climate-resilient farming practices [182]. Similarly, Abdullah et al. (2023) emphasised the benefits of remote sensing-enabled drones in early disease detection within crop systems, noting their potential to reduce pesticide usage, labour input, and costs while significantly enhancing agricultural productivity [183].
Collectively, these applications underscore the strategic importance of drones in addressing the complex spatial, ecological, and economic challenges faced by contemporary rural landscapes.

4.8.2. Nature Conservation, Biodiversity Monitoring, and Protected Area Management

The integration of drone technologies into nature conservation projects has substantially enhanced the efficiency and precision of monitoring efforts related to biodiversity, habitat quality, and protected area management [184]. With their capacity to access remote and ecologically sensitive zones, drones facilitate detailed assessments that are essential for preserving ecological integrity and supporting adaptive management strategies.
One of the key areas of application is visitor management in national parks. Donaire et al. (2020) demonstrated that UAS-based tracking of visitor movements provides critical data for managing carrying capacity and minimising ecological disturbance in heavily trafficked natural reserves [185]. Beyond human dynamics, drones are also indispensable for long-term ecological monitoring. Tait et al. (2019) utilised multispectral imagery to evaluate vegetation shifts in biosphere reserves, revealing patterns of habitat degradation and ecological stress that would otherwise remain undetected through conventional field surveys [186].
In restoration ecology, Rudge et al. (2022) highlighted the role of drone-assisted mapping in expediting rehabilitation efforts in nature parks by enabling the precise identification of degraded zones and supporting targeted interventions [187]. The use of UASs in avian ecology is also expanding, as demonstrated by Han et al. (2017), who tracked bird migration pathways to evaluate the effects of habitat fragmentation and climate-induced change [188].
Moreover, thermal imaging capabilities of drones are increasingly applied in wildlife population studies. Beaver et al. (2020) employed such systems to analyse population dynamics in dense forest ecosystems, significantly contributing to species monitoring and behavioural research [189]. The security dimension of conservation has also been strengthened by drones. Bhatia et al. (2023) evaluated drone-based surveillance programs designed to detect and deter illegal poaching activities in protected areas, showing that UASs can serve as a proactive enforcement and deterrence mechanism [190].
Together, these applications illustrate how drone technology has become a transformative tool in the field of nature conservation, enhancing spatial accuracy, reducing human impact, and supporting sustainable management of biologically rich and vulnerable landscapes.

4.8.3. Sustainable Tourism and Ecotourism Management

Drones are increasingly being deployed as critical tools in the planning and monitoring of sustainable tourism and ecotourism practices. Their ability to capture high-resolution spatial data and provide real-time insights enables informed decision making across tourism development, conservation management, and visitor experience design [176].
One of the prominent roles of drones in sustainable tourism lies in environmentally conscious investment planning. Seștraș et al. (2020) illustrated that drone-assisted land use analyses facilitate the identification of low-impact and ecologically compatible locations for tourism infrastructure, thereby reducing environmental degradation risks during development processes [191]. Likewise, Chen and Yuan (2020) demonstrated how UAS-generated scenic route maps support route optimisation by enhancing both visual accessibility and aesthetic value for visitors, contributing to the overall sustainability of tourism experiences [192].
Seasonal variations in tourist flow and ecological sensitivity further highlight the need for spatial intelligence in tourism route design. Tomczyk et al. (2023) emphasised the value of drone-derived spatiotemporal datasets in balancing visitor density with environmental carrying capacity [193]. Their study showcased how UAS monitoring supported the redistribution of tourist traffic to less vulnerable areas during peak seasons [193]. In a related study, Liao et al. (2021) found that drones enabled the continuous tracking of tourist densities within natural parks, allowing for proactive mobility management strategies and minimising the risk of ecosystem overuse [194].
Beyond visitor management, drones have also proven essential in preserving fragile ecosystems affected by tourism pressure. Díaz-Delgado and Mücher (2019) employed multispectral drone imagery to detect subtle indicators of environmental degradation in ecotourism zones, informing mitigation measures aimed at reducing overtourism impacts [195]. In a broader context, Wang et al. (2025) demonstrated that UAS-based spatial analyses substantially contributed to the protection of scenic rural landscapes and supported the development of sustainable tourism routes that harmonised with natural terrain and cultural heritage values [196]. Taken together, these diverse applications underscore the vital role of drones in advancing sustainability objectives in tourism planning. Their integration into spatial analysis, route modelling, and conservation oversight not only enhances operational efficiency, but also strengthens the resilience of tourism landscapes against anthropogenic stressors. As drone technologies continue to evolve, they are expected to play an increasingly central role in facilitating adaptive management strategies across both rural and protected environments [160,176,197].

4.9. Summary Analysis of UAS–Sensor Applications

The effective integration of Uncrewed Aerial Systems (UASs) with diverse sensor technologies has significantly transformed landscape research across ecological, cultural, and urban contexts. As outlined in Table 1, Table 2, Table 3, Table 4, Table 5 and Table 6 and detailed in Section 4.1, Section 4.2, Section 4.3, Section 4.4, Section 4.5, Section 4.6 and Section 4.7, the landscape-related applications of UASs vary not only by thematic focus—such as vegetation monitoring, hydrological mapping, cultural heritage documentation, or tourism planning—but also by the operational configurations of the drone platform and its onboard sensors. This section synthesises the key UAS–sensor combinations employed across these domains, critically analysing their suitability, methodological strengths, and deployment trade-offs in the context of landscape-based inquiries.
Fixed-wing drones, for instance, are most advantageous in applications that require wide area coverage and long-endurance flights, such as agricultural landscape mapping, hydrological watershed assessments, or regional ecosystem analysis. When paired with LiDAR or high-resolution RGB sensors, these drones excel in generating accurate digital elevation models (DEMs), delineating large-scale land use patterns, and monitoring floodplains. However, their inability to hover and limited manoeuvrability constrain their use in confined or topographically complex environments. In contrast, rotary-wing drones are more frequently deployed in applications requiring vertical take-off and landing (VTOL), high spatial precision, and operational flexibility, such as vegetation health monitoring, urban heat island analysis, or archaeological site documentation. Rotary drones are especially compatible with multispectral, thermal, and RGB sensors, which enable detailed spectral analyses and object-based classification across fine spatial scales.
Hybrid UAS platforms—combining the endurance of fixed-wing drones with the manoeuvrability of rotary systems—have emerged in the recent literature as promising tools for multi-scale projects. Their dual-flight modes enable transitions between hovering and forward motion, making them ideal for long transects over mountainous or heterogeneous terrain. In applications such as biodiversity monitoring, soil erosion modelling, or slope stability assessments, hybrid drones integrated with multispectral and thermal sensors have demonstrated superior adaptability. Despite their operational benefits, high procurement and maintenance costs remain a challenge for broader adoption.
In terms of sensor deployment, the RGB camera remains the most widely used due to its affordability and high-resolution imaging capabilities, especially in initial site surveys, 3D modelling, and photogrammetric applications. Multispectral sensors, offering narrowband spectral resolution, are extensively employed in plant health monitoring, canopy structure mapping, and wetland classification. Thermal sensors, on the other hand, are instrumental in applications such as urban microclimate studies, habitat occupancy tracking, and energy efficiency audits of landscape structures. LiDAR sensors, though cost-intensive, are essential for detailed terrain and vegetation modelling, particularly in forested or densely vegetated areas where optical sensors may be obstructed. The emergence of hyperspectral sensors and data fusion techniques also suggests new research opportunities in species-level vegetation analysis and material composition studies.
Sensor selection is influenced by various factors including data fidelity, payload weight, power consumption, flight time, and post-processing requirements. For instance, while LiDAR offers superior vertical accuracy, its integration demands high-endurance platforms and sophisticated ground control calibration. Thermal sensors, while lightweight, require stable hovering and specific environmental conditions for accurate interpretation. Moreover, the integration of artificial intelligence (AI) with these sensor types—such as deep learning in object detection or automated classification using spectral indices—has expanded the analytical potential of drone-based studies, particularly in urban resilience assessments and biodiversity monitoring.
As summarised across this review, the choice of UAS and sensor configurations must be aligned with the spatial scale of the research, the ecological sensitivity of the study area, and the desired temporal resolution of data. There is no universally optimal combination; instead, successful UAS applications are those that strategically balance data richness, flight constraints, processing demands, and contextual priorities. This integrated perspective is especially important for interdisciplinary research projects, where ecological planning, cultural heritage preservation, and rural development converge. As landscape research continues to evolve in response to climate change, urbanisation, and digital innovation, the ability to customise UAS–sensor configurations will remain critical for generating robust, context-sensitive, and policy-relevant spatial knowledge.

5. Future Perspectives and Innovative Applications

5.1. Future Technological Developments in Drones

The role of drones in landscape research is undergoing a paradigm shift, catalysed by rapid advancements in sensor technology, artificial intelligence (AI)-enabled analytics, and autonomous navigation systems [176]. This transformation is not merely technical, but epistemological, reshaping how landscape knowledge is produced, visualised, and operationalised. The convergence of disciplines such as landscape architecture, urban and regional planning, ecological engineering, and remote sensing signals a shift toward integrated data-driven environmental governance [150].
At the heart of this convergence is the increasing reliance on real-time geospatial intelligence to manage complex landscape systems under accelerating climate change, urbanisation, and biodiversity loss. Emerging UAS platforms now support AI-based image interpretation, terrain classification, and predictive environmental modelling, enabling researchers and planners to detect land use changes, forecast vegetation dynamics, and simulate anthropogenic impacts with unprecedented accuracy [197].
Crucially, drones are evolving into interoperable platforms that function within broader spatial intelligence ecosystems, coordinating with Geographic Information Systems (GISs), remote sensing networks, and digital twin environments to enable real-time simulations and scenario planning [160]. This integration empowers multi-scalar modelling of ecological feedback, infrastructure risks, and spatial inequalities.
In this interdisciplinary and technologically fluid context, drones are not just enhancing research, but also transforming operational landscape management. From precision agriculture and conservation planning to urban resilience design, the drone of the future will be autonomous, adaptive, and analytics-driven, capable of delivering actionable real-time insights across both natural and human-modified environments.

5.1.1. AI-Supported Advances in Data Collection and Analysis

The convergence of artificial intelligence (AI) and machine learning (ML) with UAS platforms is revolutionising the landscape research paradigm by transforming how environmental data are collected, processed, and interpreted. AI-enabled drones are evolving from passive data-gathering tools into autonomous analytic platforms capable of managing entire data cycles, from high-frequency acquisition to real-time predictive modelling [152]. This shift not only enhances data throughput, but also enables spatial decision making with greater temporal precision and thematic depth.
In urban contexts, AI-integrated UASs embedded within Geographic Information Systems (GISs) and digital twin environments are increasingly used to model heat island dynamics, monitor vehicular emissions, and track vegetation health gradients. These intelligent agents continuously update geospatial data layers, allowing for real-time responsiveness in adaptive green infrastructure planning [156]. Meanwhile, in agricultural and rural settings, AI-supported drones facilitate the early detection of crop diseases, optimise irrigation scheduling, and assess soil nutrient levels, thereby improving both yield and environmental sustainability [170].
The versatility of these systems becomes even more apparent when extended to ecological modelling and climate change mitigation. When coupled with satellite-based communication and 5G connectivity, drones can undertake the wide-scale near-real-time monitoring of deforestation patterns, carbon sequestration processes, and ecosystem restoration activities, even in inaccessible or fragile terrains [189]. Such capabilities are especially critical in developing countries or post-disaster zones where traditional data acquisition is logistically and financially prohibitive.
These developments signify more than just technological progression; they represent a methodological transformation in landscape science. By autonomously converting raw sensor data into predictive environmental intelligence, AI-powered UASs are reshaping not only how landscapes are analysed, but also how they are governed, restored, and designed. Their cross-sectoral applicability underscores their potential to unify fragmented planning domains and catalyse a more integrated, real-time, and responsive model of environmental management.
The practical deployment of artificial intelligence (AI) in landscape research typically involves a combination of supervised and unsupervised machine learning models, including Convolutional Neural Networks (CNNs), Support Vector Machines (SVMs), and Random Forest (RF) classifiers. These algorithms are often embedded into UAS-GIS systems to detect patterns in large geospatial datasets. For instance, CNNs are commonly employed for automatic object recognition and classification tasks using RGB and multispectral imagery, while SVMs have been applied to distinguish between land cover types in heterogeneous terrains [198,199,200]. Meanwhile, Random Forest models are frequently preferred for vegetation classification due to their robustness in handling noisy or incomplete datasets derived from drone-acquired imagery [201,202].
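A Random Forest land-cover classifier of the kind cited above can be sketched with scikit-learn. The per-pixel spectral samples below are synthetic stand-ins for labelled training data digitised from drone orthomosaics; the band means and class labels are illustrative assumptions, not values from any study in this review.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic per-pixel reflectance features (red, green, blue, NIR),
# standing in for labelled samples extracted from drone imagery.
n = 600
vegetation = rng.normal([0.08, 0.12, 0.06, 0.55], 0.03, size=(n, 4))
bare_soil = rng.normal([0.30, 0.25, 0.20, 0.35], 0.03, size=(n, 4))
water = rng.normal([0.05, 0.07, 0.09, 0.03], 0.02, size=(n, 4))

X = np.vstack([vegetation, bare_soil, water])
y = np.repeat(["vegetation", "soil", "water"], n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Random Forests tolerate noisy, correlated bands well, which is one
# reason the literature favours them for drone-derived vegetation mapping.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

In an operational workflow the feature matrix would come from co-registered band stacks (often augmented with spectral indices and texture measures), and accuracy would be reported against an independent, ground-truthed validation set.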
From a sensor integration standpoint, AI applications are increasingly tailored to the specific data outputs of onboard imaging technologies. RGB imagery supports visual pattern recognition and change detection in urban or agricultural settings, whereas multispectral and hyperspectral sensors enable AI algorithms to assess vegetation indices such as NDVI and Red Edge Position for crop monitoring or habitat stress evaluation [203,204]. Thermal sensors, in turn, are coupled with deep learning models to track surface temperature anomalies and wildlife activity under various environmental conditions. LiDAR data, when integrated with 3D point cloud analysis and AI clustering techniques, have demonstrated utility in topographic modelling, biomass estimation, and canopy structure analysis [205,206]. These synergies between AI algorithms and sensor data sources ensure that landscape analytics are both context-sensitive and methodologically scalable.
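The surface-temperature anomaly tracking mentioned above can be illustrated with a deliberately simple z-score detector. This is a conceptual stand-in for the deep learning models cited in the text, not a reproduction of them; the 3-sigma threshold is an assumption.

```python
import numpy as np

def thermal_anomalies(temps: np.ndarray, z_thresh: float = 3.0) -> np.ndarray:
    """Flag pixels whose temperature deviates strongly from the scene mean.

    A z-score threshold stands in for learned anomaly detectors; it
    illustrates the underlying idea of flagging outlier surface pixels.
    """
    mu = temps.mean()
    sigma = temps.std()
    if sigma == 0:
        return np.zeros_like(temps, dtype=bool)
    return np.abs(temps - mu) / sigma > z_thresh
```

Real thermal pipelines additionally correct for emissivity, atmospheric attenuation, and time-of-day effects before any anomaly logic is applied, which is why the text stresses that thermal interpretation requires stable hovering and specific environmental conditions.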
The integration of next-generation sensor technologies into UAS platforms is redefining the accuracy, efficiency, and strategic value of landscape research. Among the most impactful sensors are LiDAR, hyperspectral, and thermal imaging systems, each offering distinct strengths aligned with specific analytical tasks. LiDAR excels in generating high-fidelity 3D terrain models, particularly in forested or topographically complex regions; hyperspectral sensors capture subtle biochemical variations in vegetation and soil, supporting early diagnostics of plant stress and nutrient deficiencies; and thermal imaging enables real-time detection of hydrological stress, surface heating, and material degradation in both natural and built environments [133,147].
The true innovation, however, lies in the synergistic use of these sensors. When integrated with big data analytics and Geographic Information Systems (GISs), these technologies create multi-layered environmental datasets capable of supporting predictive scenario modelling. For instance, in smart cities and protected areas, the fusion of multispectral drone imagery with AI-supported analytics facilitates early detection of ecosystem degradation, allowing for pre-emptive data-informed conservation strategies [131].
Technological evolution is not limited to sensing capabilities alone. New advances in drone endurance, particularly through hybrid-powered platforms combining electric batteries with solar or hydrogen fuel sources, are addressing persistent limitations in flight duration [148]. These systems are proving essential for long-duration missions over large, remote, or infrastructure-poor landscapes, such as desertified regions, mountainous watersheds, or post-fire forest zones. In parallel, autonomous flight algorithms now allow drones to dynamically adjust their paths in response to real-time environmental inputs, improving both spatial coverage and energy efficiency [118].
Taken together, these technological developments signify more than hardware progress; they reflect a system-level shift toward intelligent, energy-conscious, and application-specific drone architectures. These UASs are not only expanding the operational frontier of landscape research, but are also aligning with sustainability imperatives by reducing carbon footprints and enhancing the continuity of ecosystem-scale monitoring. As such, they represent a foundational component of next-generation planning, conservation, and adaptation frameworks across ecological, agricultural, and urban landscapes.

5.1.2. Swarm Technology and Autonomous Landscape Analyses

Swarm drone technology, wherein multiple UASs operate in a synchronised and semi-autonomous manner, represents a transformative leap in landscape research. This innovation enables distributed yet coordinated data acquisition across heterogeneous and expansive terrains, significantly reducing operational time and enhancing analytical resolution. Applications are rapidly expanding in domains such as climate change monitoring, watershed dynamics, and urban green infrastructure modelling, where the scale and complexity of the landscape often surpass the capabilities of single-drone missions [108].
The core strength of swarm-enabled UASs lies in their collective intelligence and redundancy. By coordinating multiple units with minimal overlap, swarms can conduct real-time, high-frequency assessments of ecologically sensitive regions while maintaining data continuity, even when individual drones fail or disengage. When equipped with multispectral, thermal, or LiDAR sensors, these systems can generate comprehensive, multidimensional datasets that are critical for mapping microclimates, quantifying carbon fluxes, and evaluating structural landscape connectivity.
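A minimal toy model of this redundancy (the sector identifiers, drone identifiers, and round-robin assignment rule are assumptions chosen for illustration, not a real swarm-control protocol) shows how a swarm can preserve full survey coverage when one unit drops out mid-mission:

```python
def assign(sectors, drones):
    """Round-robin assignment of survey sectors to active drone ids."""
    plan = {d: [] for d in drones}
    for i, s in enumerate(sectors):
        plan[drones[i % len(drones)]].append(s)
    return plan

sectors = list(range(12))                 # 12 survey sectors
plan = assign(sectors, ["d1", "d2", "d3"])

# Suppose d2 fails after completing only its first sector (sector 1);
# its unfinished sectors are redistributed among the survivors.
completed_by_d2 = plan["d2"][:1]
unfinished = plan["d2"][1:]
survivors = ["d1", "d3"]
for i, s in enumerate(unfinished):
    plan[survivors[i % len(survivors)]].append(s)
del plan["d2"]

covered = sorted(plan["d1"] + plan["d3"] + completed_by_d2)
print(covered == sectors)  # True: full coverage despite the failure
```

The point of the sketch is structural: because tasks are decomposed into small, reassignable units, no single platform is a point of failure for the dataset as a whole.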
Moreover, the integration of onboard AI modules and machine learning algorithms enhances the autonomy of these systems. Adaptive swarms are increasingly capable of executing complex tasks—such as post-disaster reconnaissance, real-time biodiversity monitoring, and erosion risk detection—without human intervention. Through real-time cloud platforms and edge computing, these drones dynamically adjust their trajectories based on incoming sensor data, enabling rapid insight delivery for decision-makers [147].
Looking ahead, swarm drones are poised to become foundational instruments in interdisciplinary research ecosystems that span landscape architecture, ecology, urban resilience, and environmental engineering. Their capacity to simulate dynamic land use scenarios, assess habitat fragmentation, and model infrastructure vulnerability under climate extremes will reshape how spatial phenomena are studied, forecasted, and governed.
As landscape systems grow more interconnected and data-intensive, swarm-based autonomous drones offer a scalable, resilient, and ecologically attuned solution. Their deployment marks the next frontier in the transition toward high-resolution, high-frequency, and systems-level environmental analysis, bridging the gap between real-time observation and anticipatory landscape planning.
Collectively, the technological trajectories explored in this section signal a profound reconfiguration of the drone’s role in landscape research, from passive data collectors to intelligent, autonomous, and system-integrated spatial agents. The convergence of AI-driven analytics, advanced sensor technologies, hybrid energy systems, and swarm intelligence reveals a clear shift from isolated observations toward comprehensive real-time environmental intelligence. Each innovation serves a complementary function: AI enhances interpretive capacity and adaptability; next-generation sensors enrich data dimensionality; hybrid energy systems extend operational endurance; and swarm architectures enable spatial scalability and redundancy.
This technological synergy is particularly consequential in addressing the grand challenges of our time: climate adaptation, biodiversity conservation, food security, and resilient urbanisation. Drones are no longer peripheral tools but central actors in a distributed geospatial intelligence network capable of informing anticipatory planning, multi-scale policy formation, and dynamic ecological stewardship. Moreover, the integration of these systems into digital twins, GIS platforms, and cloud-based analytics establishes the infrastructural basis for a new generation of landscape simulation and governance models.
Looking forward, the emphasis must shift from adopting individual technologies to designing interoperable and ethically informed UAS ecosystems. These ecosystems should be context-sensitive, energy-efficient, and inclusively governed to ensure that the benefits of drone innovations extend equitably across geographic regions and institutional domains. In this sense, the future of drones in landscape research lies not merely in technological sophistication, but in their capacity to catalyse systemic transformation across environmental science, design practice, and spatial policy.

5.2. Emerging Applications of Drones in Landscape Research

The integration of drones into landscape research is undergoing a rapid transformation, driven by overarching global imperatives such as the United Nations Sustainable Development Goals (SDGs), the escalating urgency of climate change mitigation, the spatial restructuring induced by post-pandemic recovery, and accelerating digitalisation in environmental monitoring. These multi-scalar dynamics have not only expanded the functional reach of drone applications, but have also repositioned drones as intelligent adaptive agents embedded within interdisciplinary research ecosystems [106].
Recent technological breakthroughs have significantly upgraded the analytical and operational potential of UASs. Advanced sensor platforms—particularly LiDAR, hyperspectral, and thermal imaging—now enable precise, multidimensional data acquisition across diverse ecological, agricultural, and urban settings. These sensors facilitate the detection and monitoring of critical environmental indicators, including biodiversity loss, soil erosion trajectories, changes in hydrological regimes, and the carbon sequestration performance of urban green infrastructure [116]. Unlike traditional monitoring systems, UASs offer higher temporal resolution, increased spatial specificity, and logistical flexibility, making them invaluable in rapidly shifting landscape conditions.
Moreover, the convergence of drone technologies with artificial intelligence (AI), machine learning (ML), and Geographic Information Systems (GISs) is redefining the scale and speed of environmental analysis. This integration supports dynamic spatial modelling and real-time environmental assessment, enabling responsive interventions in ecosystem management and spatial planning [108]. Such hybrid systems are particularly powerful in constructing predictive models that account for both spatial heterogeneity and temporal variability, capabilities essential for adaptive governance in complex landscapes.
Looking toward the future, drones are expected to evolve beyond their current roles as mobile data acquisition tools into fully integrated components of intelligent networked environmental management systems. Their deployment is expanding across diverse domains such as smart city modelling, agroecological zoning, biodiversity conservation planning, and climate-resilient infrastructure monitoring. The synergy between drone-based remote sensing, real-time analytics, and cloud-based data ecosystems is positioning UASs as cornerstone technologies for addressing the multifaceted environmental challenges of the Anthropocene [127].

5.2.1. The Evolving Role of Drones in Sustainable Landscape Management

Within the framework of the United Nations Sustainable Development Goals (SDGs), drone technologies have become pivotal in fostering the sustainability, resilience, and equity of both urban and rural landscapes. Their increasing integration into interdisciplinary fields—such as urban ecology, landscape architecture, and environmental engineering—has enhanced their functional relevance in planning, managing, and monitoring green infrastructure networks [130]. In particular, drones serve as efficient mediators between environmental data acquisition and spatial planning decisions, supporting sustainability-oriented transformations in complex built and natural environments.
Modern UASs are now routinely equipped with multispectral and hyperspectral sensors that allow for fine-grained assessments of environmental quality. In urban contexts, these technologies enable drones to monitor tree vitality, detect vegetation stress, and evaluate air pollution gradients. When paired with Geographic Information Systems (GISs), such data can inform the design of adaptive green space strategies that aim to reduce urban heat island (UHI) effects, enhance ecosystem connectivity, and improve public health outcomes [133,146]. This synergy between drones and spatial informatics thus contributes to holistic environmental governance, bridging real-time monitoring with long-term planning objectives.
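A widely used vitality indicator derived from such multispectral bands is the Normalized Difference Vegetation Index, NDVI = (NIR − Red)/(NIR + Red). The sketch below uses invented reflectance values and an assumed stress threshold purely for illustration; operational workflows would apply the same arithmetic per pixel across calibrated orthomosaics:

```python
def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red); healthy canopy scores high."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

# One invented row of pixel reflectances: vigorous vegetation reflects
# strongly in the near-infrared and absorbs red light.
nir_row = [0.52, 0.48, 0.20, 0.55]
red_row = [0.08, 0.10, 0.18, 0.07]

values = [round(ndvi(n, r), 2) for n, r in zip(nir_row, red_row)]
stressed = [i for i, v in enumerate(values) if v < 0.3]  # assumed threshold
print(values, stressed)  # [0.73, 0.66, 0.05, 0.77] [2]
```

Pixel index 2 is flagged as potentially stressed; in practice such flags would be ground-truthed before informing management decisions.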
Looking forward, drones are also expected to support the operationalisation of carbon-neutral urban design principles. Developments in renewable-energy-powered drones—utilising solar panels, hydrogen fuel cells, or hybrid propulsion systems—point toward environmentally responsible trajectories in drone-based landscape monitoring. These UASs can facilitate sustainable applications ranging from real-time air quality surveillance and smart waste logistics to the optimisation of urban water infrastructure [147]. When deployed strategically, drones can help cities meet ambitious SDG targets related to sustainable infrastructure, clean air and water, and climate resilience.
In sum, the evolving role of drones in sustainable landscape management is not only a matter of technological advancement, but also reflects a paradigm shift in spatial thinking. Drones are redefining how urban and ecological systems are perceived, designed, and governed, offering flexible, scalable, and data-driven pathways toward achieving global environmental goals.

5.2.2. The Role of Drones in Climate Change Mitigation Research

The escalating impacts of global climate change have amplified the demand for innovative and adaptive technologies capable of addressing complex environmental transformations. In this regard, drones have emerged as pivotal tools in climate change mitigation and adaptation research, offering unprecedented access to high-resolution spatial data, rapid deployment capacity, and compatibility with remote sensing, GIS, and real-time modelling platforms [130]. Their versatility and responsiveness make them especially valuable in dynamic and often hazardous landscapes where conventional data acquisition methods fall short.
Empirical studies from diverse geographies illustrate the multifaceted role of drones in climate-focused environmental monitoring. In the Brazilian Amazon, thermal imaging conducted via UASs enabled the detection of post-wildfire biomass loss, facilitating faster planning for ecosystem rehabilitation efforts [101]. In the United States, drone-enabled bathymetric measurements produced more accurate models of water level change than traditional techniques, enhancing the reliability of hydrological assessments under increasing climatic variability [89]. These examples underscore not only the technical efficiency of drones, but also their value in generating timely, site-specific data that inform adaptive management strategies.
Beyond standalone use, drones are increasingly integrated with next-generation technologies such as 5G communication networks, satellite-linked geolocation systems, and cloud-based artificial intelligence (AI) frameworks. This convergence is expanding the scope of UAS applications from local monitoring to large-scale real-time environmental diagnostics. Domains particularly sensitive to climate variability—such as agricultural water management, glacier melt modelling, permafrost thaw detection, and desertification surveillance—stand to benefit significantly from these enhanced drone ecosystems [154]. The integration of AI and edge computing further enables real-time anomaly detection, predictive trend analysis, and scenario modelling at unprecedented resolution and speed.
As climate change continues to challenge the resilience of ecosystems and the sustainability of human settlements, drones are no longer merely passive observers but active instruments within adaptive governance systems. Their ability to transform environmental signals into actionable spatial intelligence empowers decision-makers across disciplines, ranging from urban planning and disaster risk reduction to conservation policy and landscape restoration. In this sense, UASs represent not only a technological innovation, but also a paradigm shift in how societies interact with and respond to rapidly changing environmental conditions.

5.2.3. AI and Big-Data-Driven Landscape Analyses

The convergence of artificial intelligence (AI), machine learning (ML), and drone technologies is revolutionising the paradigm of landscape research by introducing automated, high-speed, and precision-based analytical capabilities. These integrated systems are transforming how environmental and spatial data are processed, moving beyond traditional methods to generate predictive models and actionable insights that enhance the effectiveness of sustainable spatial planning and adaptive landscape management [154]. In climate-sensitive and rapidly transforming landscapes, AI-powered drones excel in automating data classification, anomaly detection, and multitemporal change analysis, thereby enabling proactive responses to ecological shifts.
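In its simplest form, the automated multitemporal change analysis described here amounts to differencing a vegetation index between two survey dates and flagging cells whose values dropped beyond a threshold. The grids, dates, and threshold below are invented for illustration; real pipelines operate on co-registered raster stacks and typically combine thresholding with learned classifiers:

```python
def change_mask(t0, t1, drop=0.2):
    """Per-cell True where the index fell by more than `drop` between dates."""
    return [[(a - b) > drop for a, b in zip(r0, r1)]
            for r0, r1 in zip(t0, t1)]

# Two invented 2x2 NDVI grids from successive survey flights.
june = [[0.80, 0.75],
        [0.70, 0.72]]
july = [[0.78, 0.40],
        [0.69, 0.35]]  # two cells degraded sharply

mask = change_mask(june, july)
flagged = sum(cell for row in mask for cell in row)
print(flagged)  # 2 cells flagged for follow-up inspection
```

Anomaly detection generalises this step: instead of a fixed threshold, a model of expected seasonal variation defines what counts as a meaningful departure.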
One of the most transformative applications of this technological synergy is the integration of drone-collected data with digital twin environments. These platforms simulate real-time urban dynamics using continuously updated geospatial inputs, facilitating critical modelling functions such as disaster risk prediction, flood vulnerability mapping, and urban sprawl forecasting [108]. For instance, a study conducted in China demonstrated that integrating UAS-generated 3D urban models with Geographic Information Systems (GISs) led to notable improvements in the predictive performance of air pollution models compared to conventional data sources [115]. This outcome highlights the added analytical depth made possible through AI-enhanced drone systems in urban resilience studies.
Similarly, a European case study on wetland conservation employed hyperspectral imaging drones to detect nuanced ecological changes with substantially greater data precision compared to traditional monitoring methods. The high-resolution datasets enabled the identification of subtle habitat disruptions, supporting more anticipatory and effective habitat management interventions [127]. The capacity to detect such ecological variances, often invisible to conventional sensors, demonstrates the strategic advantage of integrating AI and big data with remote sensing platforms.
Looking forward, drones powered by AI and connected to big data infrastructures will increasingly be deployed to monitor ecological corridors, model biodiversity distribution shifts under climate change scenarios, and optimise urban green infrastructure networks. These systems will not only facilitate retrospective analysis of landscape change, but also support real-time scenario planning and forecasting. As such, the emerging research paradigm is transitioning from descriptive landscape monitoring to prescriptive spatial governance, where UASs act as intelligent agents guiding environmental decision making across scales and sectors.

6. Conclusions and Recommendations

This scoping review has synthesised the expanding role of drone technologies within the field of landscape research, emphasising how these tools have progressed from experimental surveying instruments to integral elements of interdisciplinary spatial planning. The reviewed literature, coupled with thematic categorisations and illustrative case studies, reveals that Uncrewed Aerial Systems (UASs) are now central to monitoring, modelling, and managing dynamic landscape systems. These contributions span a variety of domains—including agricultural productivity, cultural heritage conservation, ecological monitoring, and disaster risk planning—where UASs offer unparalleled advantages in data resolution, temporal precision, and operational flexibility.
A key insight emerging from this study is the degree to which technological innovation has transformed drone functionality. With the integration of multispectral, hyperspectral, thermal, and LiDAR sensors, UASs now support high-resolution environmental monitoring across both natural and human-modified landscapes. Their capabilities are further amplified when combined with AI-enhanced analytics, Geographic Information Systems (GISs), and digital twin technologies. Such configurations enable not only real-time data interpretation, but also predictive scenario modelling, positioning UASs as pivotal instruments in future-oriented landscape governance.
Nevertheless, several challenges continue to hinder the full-scale deployment of drone technologies. Operational limitations such as weather sensitivity, short battery life, and limited payload capacity persist, particularly in remote or topographically complex areas. On a regulatory level, legal ambiguities surrounding UAS use in heritage zones, protected environments, and dense urban contexts pose barriers to consistent practice. Moreover, ethical concerns relating to data privacy, surveillance, and consent are increasingly pressing, requiring the development of robust and transparent policy frameworks.
From a conceptual standpoint, this review identifies a notable underrepresentation of participatory, justice-oriented, and community-led approaches in current UAS applications. Most drone-based research remains technocentric, with limited engagement from indigenous knowledge systems or grassroots planning frameworks. Addressing this imbalance will be critical for ensuring that drone-supported solutions are not only scientifically robust, but also socially inclusive and contextually sensitive.
In light of these challenges and opportunities, this review proposes a strategic pathway for future integration. Promoting greater interoperability between UASs and spatial data infrastructures—particularly GIS and IoT systems—will support more agile real-time decision making. Further, enhancing the role of AI-based predictive models may strengthen climate resilience planning, biodiversity forecasting, and land degradation monitoring. Addressing energy autonomy through hybrid drone platforms, such as those powered by solar or hydrogen systems, will be essential for expanding coverage and continuity in large-scale or remote studies. At the same time, refining the legal governance of UAS operations must be prioritised, ensuring procedural accountability and ethical data use across diverse settings. Most importantly, the advancement of cross-disciplinary education and collaboration—particularly between landscape architects, planners, ecologists, data scientists, and policymakers—will be instrumental in translating technological progress into transformative landscape solutions.
In conclusion, UASs are no longer ancillary tools in landscape research; they have become foundational to the spatial sciences. Their capacity to deliver high-frequency, context-specific, and multi-dimensional data allows for the design of more resilient, adaptive, and equitable landscapes. As this review underscores, the future of landscape planning lies in integrating technological precision with environmental stewardship and social responsibility. Drone-based methodologies, when applied critically and inclusively, hold immense potential to reshape how we understand, manage, and co-create the landscapes of the Anthropocene.

Author Contributions

Conceptualisation, A.K. and N.D.; Methodology, F.K. and O.G.; Software, O.G.; Validation, M.Ö., A.K., and N.D.; Formal analysis, M.Ö.; Investigation, F.K. and O.G.; Resources, A.K. and N.D.; Data curation, M.Ö.; Writing—original draft preparation, A.K., F.K., and N.D.; Writing—review and editing, N.D. and M.Ö.; Visualisation, F.K., N.D., and O.G.; Supervision, F.K. and N.D.; Project administration, A.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research did not receive direct financial support. However, a limited portion of the drone imagery used as reference material in this review was originally obtained through a prior field project partially supported by Atatürk University, under the Scientific Research Projects (BAP) Coordination Unit, grant number FDK–2022–10399. This funding, coordinated by Faris Karahan with Oğuz Gökçe as project researcher, covered only the drone filming during that earlier study. All analysis, interpretation, and additional fieldwork for this article were fully conducted and funded by the authors.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Gonzalez, L.F.; Montes, G.A.; Puig, E.; Johnson, S.; Mengersen, K.; Gaston, K.J. Unmanned Aerial Vehicles (UAVs) and Artificial Intelligence Revolutionizing Wildlife Monitoring and Conservation. Sensors 2016, 16, 97. [Google Scholar] [CrossRef]
  2. Zhang, C.; Kovacs, J.M. The Application of Small Unmanned Aerial Systems for Precision Agriculture: A Review. Precis. Agric. 2012, 13, 693–712. [Google Scholar] [CrossRef]
  3. Turner, D.; Lucieer, A.; Watson, C. An Automated Technique for Generating Georectified Mosaics from Ultra-High Resolution Unmanned Aerial Vehicle (UAV) Imagery, Based on Structure from Motion (SfM) Point Clouds. Remote Sens. 2012, 4, 1392–1410. [Google Scholar] [CrossRef]
  4. Jones, G.P.; Pearlstine, L.G.; Percival, H.F. An Assessment of Small Unmanned Aerial Vehicles for Wildlife Research. Wildl. Soc. Bull. 2006, 34, 750–758. [Google Scholar] [CrossRef]
  5. Chabot, D.; Bird, D.M. Small Unmanned Aircraft: Precise and Convenient New Tools for Surveying Wetlands. J. Unmanned Veh. Syst. 2013, 1, 15–24. [Google Scholar] [CrossRef]
  6. Colomina, I.; Molina, P. Unmanned Aerial Systems for Photogrammetry and Remote Sensing: A Review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97. [Google Scholar] [CrossRef]
  7. Watts, A.C.; Perry, J.H.; Smith, S.E.; Burgess, M.A.; Wilkinson, B.E.; Szantoi, Z.; Ifju, P.G.; Percival, H.F. Small Unmanned Aircraft Systems for Low-Altitude Aerial Surveys. J. Wildl. Manag. 2010, 74, 1614–1619. [Google Scholar] [CrossRef]
  8. Wu, J. Urban Ecology and Sustainability: The State-of-the-Science and Future Directions. Landsc. Urban Plan. 2014, 125, 209–221. [Google Scholar] [CrossRef]
  9. Niemelä, J.; Saarela, S.R.; Söderman, T.; Kopperoinen, L.; Yli-Pelkonen, V.; Väre, S.; Kotze, D.J. Using the Ecosystem Services Approach for Better Planning and Conservation of Urban Green Spaces: A Finland Case Study. Biodivers. Conserv. 2010, 19, 3225–3243. [Google Scholar] [CrossRef]
  10. Turner, B.; Devisscher, T.; Chabaneix, N.; Woroniecki, S.; Messier, C.; Seddon, N. The Role of Nature-Based Solutions in Supporting Social-Ecological Resilience for Climate Change Adaptation. Annu. Rev. Environ. Resour. 2022, 47, 123–148. [Google Scholar] [CrossRef]
  11. Fischer, J.; Lindenmayer, D.B.; Hobbs, R. Landscape Pattern and Biodiversity. In Princeton Guide to Ecology; Levin, S.A., Ed.; Princeton University Press: Princeton, NJ, USA, 2009; pp. 310–320. [Google Scholar]
  12. Zhang, X.; Jin, X.; Liang, X.; Ren, J.; Han, B.; Liu, J.; Yan, H.; Wang, H.; Zhou, X.; Sun, X.; et al. Implications of Land Sparing and Sharing for Maintaining Regional Ecosystem Services: An Empirical Study from a Suitable Area for Agricultural Production in China. Sci. Total Environ. 2022, 820, 153330. [Google Scholar] [CrossRef]
  13. Bennett, J.; Marandure, T.; Hawkins, H.J.; Mapiye, C.; Palmer, A.; Lemke, S.; Moradzadeh, M. A Conceptual Framework for Understanding Ecosystem Trade-Offs and Synergies, in Communal Rangeland Systems. Ecosyst. Serv. 2023, 61, 101533. [Google Scholar] [CrossRef]
  14. Beuchle, R.; Achard, F.; Bourgoin, C.; Vancutsem, C.; Eva, H.; Follador, M. Deforestation and Forest Degradation in the Amazon; European Union: Luxembourg, 2021. [Google Scholar] [CrossRef]
  15. Anderson, K.; Gaston, K.J. Lightweight Unmanned Aerial Vehicles Will Revolutionize Spatial Ecology. Front. Ecol. Environ. 2013, 11, 138–146. [Google Scholar] [CrossRef]
  16. Wallace, L.; Lucieer, A.; Watson, C.; Turner, D. Development of a UAV-LiDAR System with Application to Forest Inventory. Remote Sens. 2012, 4, 1519–1543. [Google Scholar] [CrossRef]
  17. Guan, S.; Zhu, Z.; Wang, G. A Review on UAV-Based Remote Sensing Technologies for Construction and Civil Applications. Drones 2022, 6, 117. [Google Scholar] [CrossRef]
  18. Tricco, A.C.; Lillie, E.; Zarin, W.; O’Brien, K.K.; Colquhoun, H.; Levac, D.; Moher, D.; Peters, M.D.J.; Horsley, T.; Weeks, L.; et al. PRISMA Extension for Scoping Reviews (PRISMA-ScR): Checklist and Explanation. Ann. Intern. Med. 2018, 169, 467–473. [Google Scholar] [CrossRef]
  19. Villarreal, M.L.; Bishop, T.B.B.; Sankey, T.T.; Smith, W.K.; Burgess, M.A.; Caughlin, T.T.; Gillan, J.K.; Havrilla, C.A.; Huang, T.; LeBeau, R.L.; et al. Applications of unoccupied aerial systems (UAS) in landscape ecology: A review of recent research, challenges and emerging opportunities. Landsc. Ecol. 2025, 40, 43. [Google Scholar] [CrossRef]
  20. Sun, Z.; Wang, X.; Wang, Z.; Yang, L.; Xie, Y.; Huang, Y. UAVs as remote sensing platforms in plant ecology: Review of applications and challenges. J. Plant Ecol. 2021, 14, 1003–1020. [Google Scholar] [CrossRef]
  21. Librán-Embid, F.; Klaus, F.; Tscharntke, T.; Grass, I. Unmanned aerial vehicles for biodiversity-friendly agricultural landscapes: A systematic review. Sci. Total Environ. 2020, 732, 139204. [Google Scholar] [CrossRef]
  22. Dronova, I.; Kislik, C.; Dinh, Z.; Kelly, M. A review of unoccupied aerial vehicle use in wetland applications: Emerging opportunities in approach, technology, and data. Drones 2021, 5, 45. [Google Scholar] [CrossRef]
  23. Terra Drone Saudi Arabia. Fixed-Wings vs Multirotor: Which One to Choose? Available online: https://terra-drone.com.sa/fixed-wings-vs-multirotor-which-one-to-choose/ (accessed on 30 May 2025).
  24. Cao, Z.; Kooistra, L.; Wang, W.; Guo, L.; Valente, J. Real-Time Object Detection Based on UAV Remote Sensing: A Systematic Literature Review. Drones 2023, 7, 620. [Google Scholar] [CrossRef]
  25. Fascista, A. Toward Integrated Large-Scale Environmental Monitoring Using WSN/UAV/Crowdsensing: A Review of Applications, Signal Processing, and Future Perspectives. Sensors 2022, 22, 1824. [Google Scholar] [CrossRef]
  26. Mohsan, S.A.H.; Khan, M.A.; Noor, F.; Ullah, I.; Alsharif, M.H. Towards the unmanned aerial vehicles (UAVs): A comprehensive review. Drones 2022, 6, 147. [Google Scholar] [CrossRef]
  27. İnan, A.T.; Ceylan, M. Aerodynamic analysis of fixed-wing unmanned aerial vehicles moving in swarm. Appl. Sci. 2024, 14, 6463. [Google Scholar] [CrossRef]
  28. Gholami, A. Exploring drone classifications and applications: A review. Int. J. Eng. Geosci. 2024, 9, 418–442. [Google Scholar] [CrossRef]
  29. Lockhart, K.; Sandino, J.; Amarasingam, N.; Hann, R.; Bollard, B.; Gonzalez, F. Unmanned aerial vehicles for real-time vegetation monitoring in Antarctica: A review. Remote Sens. 2025, 17, 304. [Google Scholar] [CrossRef]
  30. Savinelli, B.; Tagliabue, G.; Vignali, L.; Garzonio, R.; Gentili, R.; Panigada, C.; Rossini, M. Integrating drone-based LiDAR and multispectral data for tree monitoring. Drones 2024, 8, 744. [Google Scholar] [CrossRef]
  31. Yucesoy, E.; Balcik, B.; Coban, E. The role of drones in disaster response: A literature review of operations research applications. Int. Trans. Oper. Res. 2025, 32, 545–589. [Google Scholar] [CrossRef]
  32. Choi, H.W.; Kim, H.J.; Kim, S.K.; Na, W.S. An overview of drone applications in the construction industry. Drones 2023, 7, 515. [Google Scholar] [CrossRef]
  33. Bayomi, N.; Fernandez, J.E. Eyes in the sky: Drones applications in the built environment under climate change challenges. Drones 2023, 7, 637. [Google Scholar] [CrossRef]
  34. Lu, T.; Wan, L.; Qi, S.; Gao, M. Land cover classification of UAV remote sensing based on transformer–CNN hybrid architecture. Sensors 2023, 23, 5288. [Google Scholar] [CrossRef]
  35. Javed, A.R.; Shahzad, F.; ur Rehman, S.; Zikria, Y.B.; Razzak, I.; Jalil, Z.; Xu, G. Future smart cities: Requirements, emerging technologies, applications, challenges, and future aspects. Cities 2022, 129, 103794. [Google Scholar] [CrossRef]
  36. Gallacher, D. Ecological Monitoring of Arid Rangelands Using micro-UAVs (Drones). In Proceedings of the Sixth Health and Environment Conference, HBMsU Congress, Dubai, United Arab Emirates, 2015; p. 181. Available online: https://www.researchgate.net/publication/281546741_Ecological_Monitoring_of_Arid_Rangelands_using_Micro-_UAVs_drones (accessed on 11 August 2025).
  37. Pandey, S.; Kumari, N.; Mallick, L. Review on Assessment of Land Degradation in Watershed Using Geospatial Technique Based on Unmanned Aircraft Systems. In Unmanned Aircraft Systems; Gupta, S.K., Kumar, M., Nayyar, A., Mahajan, S., Eds.; Wiley: Hoboken, NJ, USA, 2024; Chapter 7. [Google Scholar] [CrossRef]
  38. Masný, M.; Weis, K.; Biskupič, M. Application of fixed-wing UAV-based photogrammetry data for snow depth mapping in alpine conditions. Drones 2021, 5, 114. [Google Scholar] [CrossRef]
  39. Rehak, M.; Skaloud, J. Fixed-wing micro aerial vehicle for accurate corridor mapping. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 2, 23–31. [Google Scholar] [CrossRef]
  40. Mirka, B. Evaluation of Thermal Infrared Imaging from Unmanned Aerial Vehicles for Arboreal Wildlife Surveillance. Master’s Thesis, San Diego State University, San Diego, CA, USA, 2020. [Google Scholar]
  41. Gonçalves, J.; Henriques, R.; Alves, P.; Sousa-Silva, R.; Monteiro, A.T.; Lomba, Â.; Honrado, J. Evaluating an unmanned aerial vehicle-based approach for assessing habitat extent and condition in fine-scale early successional mountain mosaics. Appl. Veg. Sci. 2016, 19, 132–146. [Google Scholar] [CrossRef]
  42. Sharma, H.; Sidhu, H.; Bhowmik, A. Remote sensing using unmanned aerial vehicles for water stress detection: A review focusing on specialty crops. Drones 2025, 9, 241. [Google Scholar] [CrossRef]
  43. Roni, P.; Clark, C.; Ross, K.; Camp, R.; Krall, M.; Hall, J.; Brown, R. Using Remote Sensing and Other Techniques to Assess and Monitor Large Floodplain and Riparian Restoration Projects; Recreation and Conservation Office: Olympia, WA, USA, 2020. [Google Scholar]
  44. Manfreda, S.; McCabe, M.F.; Miller, P.E.; Lucas, R.; Pajuelo Madrigal, V.; Mallinis, G.; Ben Dor, E.; Helman, D.; Estes, L.; Ciraolo, G.; et al. On the use of unmanned aerial systems for environmental monitoring. Remote Sens. 2018, 10, 641. [Google Scholar] [CrossRef]
  45. Aasen, H.; Burkart, A.; Bolten, A.; Bareth, G. Generating 3D hyperspectral information with lightweight UAV snapshot cameras for vegetation monitoring: From camera calibration to quality assurance. ISPRS J. Photogramm. Remote Sens. 2015, 108, 245–259. [Google Scholar] [CrossRef]
  46. Vaglio Laurin, G.; Puletti, N.; Chen, Q.; Lindsell, J.A.; Coomes, D.A.; Del Frate, F.; Pirotti, F.; Papale, D.; Valentini, R. Tree height estimation in tropical forests using UAVs. Remote Sens. 2019, 11, 2645. [Google Scholar] [CrossRef]
  47. Javernick, L.; Brasington, J.; Caruso, B. Modeling the topography of river corridors: A structure-from-motion photogrammetry approach. Geomorphology 2014, 213, 166–182. [Google Scholar] [CrossRef]
  48. Vélez-Nicolás, M.; García-López, S.; Barbero, L.; Ruiz-Ortiz, V.; Sánchez-Bellón, Á. Applications of Unmanned Aerial Systems (UASs) in Hydrology: A Review. Remote Sens. 2021, 13, 1359. [Google Scholar] [CrossRef]
  49. Zhang, J.; Wang, C.; Wang, M. A review of fixed-wing drone applications in dryland ecosystem monitoring. Ecol. Indic. 2023, 152, 110281. [Google Scholar] [CrossRef]
  50. Grau, J.; Liang, K.; Ogilvie, J.; Arp, P.; Li, S.; Robertson, B.; Meng, F.R. Improved accuracy of riparian zone mapping using near ground unmanned aerial vehicle and photogrammetry method. Remote Sens. 2021, 13, 1997. [Google Scholar] [CrossRef]
  51. Alshaibani, W.T.; Shayea, I.; Caglar, R.; Din, J.; Daradkeh, Y.I. Mobility management of unmanned aerial vehicles in ultra-dense heterogeneous networks. Sensors 2022, 22, 6013. [Google Scholar] [CrossRef]
  52. Muhmad Kamarulzaman, A.M.; Wan Mohd Jaafar, W.S.; Mohd Said, M.N.; Saad, S.N.M.; Mohan, M. UAV implementations in urban planning and related sectors of rapidly developing nations: A review and future perspectives for Malaysia. Remote Sens. 2023, 15, 2845. [Google Scholar] [CrossRef]
  53. Lu, B.; Dao, P.D.; Liu, J.; He, Y.; Shang, J. Recent advances of hyperspectral imaging technology and applications in agriculture. Remote Sens. 2020, 12, 2659. [Google Scholar] [CrossRef]
  54. Kim, J.; Kang, Y.; Kim, D.; Son, S.; Kim, E.J. Carbon storage and sequestration analysis by urban park grid using i-tree eco and drone-based modeling. Forests 2024, 15, 683. [Google Scholar] [CrossRef]
  55. Al Shafian, S.; Hu, D. Integrating machine learning and remote sensing in disaster management: A decadal review of post-disaster building damage assessment. Buildings 2024, 14, 2344. [Google Scholar] [CrossRef]
  56. Messaoudi, K.; Oubbati, O.S.; Rachedi, A.; Lakas, A.; Bendouma, T.; Chaib, N. A survey of UAV-based data collection: Challenges, solutions and future perspectives. J. Netw. Comput. Appl. 2023, 216, 103670. [Google Scholar] [CrossRef]
  57. Zhang, Z.; Zhu, L. A review on unmanned aerial vehicle remote sensing: Platforms, sensors, data processing methods, and applications. Drones 2023, 7, 398. [Google Scholar] [CrossRef]
  58. Luo, J.; Tian, Y.; Wang, Z. Research on unmanned aerial vehicle path planning. Drones 2024, 8, 51. [Google Scholar] [CrossRef]
  59. Wang, Y.; Kumar, L.; Raja, V.; AL-bonsrulah, H.A.; Kulandaiyappan, N.K.; Amirtharaj Tharmendra, A.; Al-Bahrani, M. Design and innovative integrated engineering approaches-based investigation of hybrid renewable energized drone for long endurance applications. Sustainability 2022, 14, 16173. [Google Scholar] [CrossRef]
  60. Salmoral, G.; Rivas Casado, M.; Muthusamy, M.; Butler, D.; Menon, P.P.; Leinster, P. Guidelines for the use of unmanned aerial systems in flood emergency response. Water 2020, 12, 521. [Google Scholar] [CrossRef]
  61. Tonti, I.; Lingua, A.M.; Piccinini, F.; Pierdicca, R.; Malinverni, E.S. Digitalization and spatial documentation of post-earthquake temporary housing in Central Italy: An integrated geomatic approach involving UAV and a GIS-based system. Drones 2023, 7, 438. [Google Scholar] [CrossRef]
  62. Dainelli, R.; Toscano, P.; Di Gennaro, S.F.; Matese, A. Recent advances in unmanned aerial vehicles forest remote sensing—A systematic review. Part II: Research applications. Forests 2021, 12, 397. [Google Scholar] [CrossRef]
  63. Ezenne, G.I.; Jupp, L.; Mantel, S.K.; Tanner, J.L. Current and potential capabilities of UAS for crop water productivity in precision agriculture. Agric. Water Manag. 2019, 218, 158–164. [Google Scholar] [CrossRef]
  64. Cho, Y.I.; Yoon, D.; Lee, M.J. Comparative analysis of urban heat island cooling strategies according to spatial and temporal conditions using unmanned aerial vehicles (UAV) observation. Appl. Sci. 2023, 13, 10052. [Google Scholar] [CrossRef]
  65. Shahbazi, M.; Théau, J.; Ménard, P. Recent applications of unmanned aerial imagery in natural resource management. GISci. Remote Sens. 2014, 51, 339–365. [Google Scholar] [CrossRef]
  66. Meier, L.; Tanskanen, P.; Heng, L.; Lee, G.H.; Fraundorfer, F.; Pollefeys, M. PIXHAWK: A micro aerial vehicle design for autonomous flight using onboard computer vision. Auton. Robot. 2012, 33, 21–39. [Google Scholar] [CrossRef]
  67. Darweesh, H.; Takeuchi, E.; Takeda, K.; Ninomiya, Y.; Sujiwo, A.; Morales, L.Y.; Kurniawati, H.; Berrio, L.C.; Rhee, S.; Kato, S. Open source integrated planner for autonomous navigation in highly dynamic environments. J. Robot. Mechatron. 2017, 29, 668–684. [Google Scholar] [CrossRef]
  68. Popescu, D.; Stoican, F.; Stamatescu, G.; Chenaru, O.; Ichim, L. A survey of collaborative UAV–WSN systems for efficient monitoring. Sensors 2019, 19, 4690. [Google Scholar] [CrossRef]
  69. Pádua, L.; Vanko, J.; Hruška, J.; Adão, T.; Sousa, J.J.; Peres, E.; Morais, R. UAS, sensors, and data processing in agroforestry: A review towards practical applications. Int. J. Remote Sens. 2017, 38, 2349–2391. [Google Scholar] [CrossRef]
  70. Núñez, J.M.; Araújo, M.G.; García-Tuñón, I. Real-time telemetry system for monitoring motion of ships based on inertial sensors. Sensors 2017, 17, 948. [Google Scholar] [CrossRef]
  71. Coviello, G.; Avitabile, G. Multiple synchronized inertial measurement unit sensor boards platform for activity monitoring. IEEE Sens. J. 2020, 20, 8771–8777. [Google Scholar] [CrossRef]
  72. Um, J.S. Imaging sensors. In Drones as Cyber-Physical Systems: Concepts and Applications for the Fourth Industrial Revolution; Springer: Singapore, 2019; pp. 177–225. [Google Scholar]
  73. Yücel, M.A.; Şanlıyüksel Yücel, D. UAV-based RGB and TIR imaging for geothermal monitoring: A case study at Kestanbol geothermal field, Northwestern Turkey. Environ. Monit. Assess. 2023, 195, 541. [Google Scholar] [CrossRef] [PubMed]
  74. Agrawal, J.; Arafat, M.Y. Transforming farming: A review of AI-powered UAV technologies in precision agriculture. Drones 2024, 8, 664. [Google Scholar] [CrossRef]
  75. Abbas, A.; Zhang, Z.; Zheng, H.; Alami, M.M.; Alrefaei, A.F.; Abbas, Q.; Shahzad, A.; Rauf, A.; Raza, M.Q.; Ahmad, S.; et al. Drones in plant disease assessment, efficient monitoring, and detection: A way forward to smart agriculture. Agronomy 2023, 13, 1524. [Google Scholar] [CrossRef]
  76. Arabi Aliabad, F.; Ghafarian Malamiri, H.; Sarsangi, A.; Sekertekin, A.; Ghaderpour, E. Identifying and monitoring gardens in urban areas using aerial and satellite imagery. Remote Sens. 2023, 15, 4053. [Google Scholar] [CrossRef]
  77. Tang, T.; Luo, Q.; Yang, L.; Gao, C.; Ling, C.; Wu, W. Research review on quality detection of fresh tea leaves based on spectral technology. Foods 2023, 13, 25. [Google Scholar] [CrossRef]
  78. Atkinson Amorim, J.G.; Schreiber, L.V.; Quadros de Souza, M.R.; Negreiros, M.; Susin, A.; Bredemeier, C.; Trentin, C.; Vian, A.L.; Andrades-Filho, C.O.; Doering, D.; et al. Biomass estimation of spring wheat with machine learning methods using UAV-based multispectral imaging. Int. J. Remote Sens. 2022, 43, 4758–4773. [Google Scholar] [CrossRef]
  79. Berveglieri, A.; Imai, N.N.; Watanabe, F.S.Y.; Tommaselli, A.M.G.; Ederli, G.M.P.; de Araújo, F.F.; Yamashita, M.P.; Tucci, C.A.; Caires, S.M.; Oliveira, R.A.B.; et al. Remote prediction of soybean yield using UAV-based hyperspectral imaging and machine learning models. AgriEngineering 2024, 6, 3242–3260. [Google Scholar] [CrossRef]
  80. Saravia, D.; Salazar, W.; Valqui-Valqui, L.; Quille-Mamani, J.; Porras-Jorge, R.; Corredor, F.A.; Rojas-Huaranga, L.; Meza-Canales, A.; Medina-Villacorta, G.; Delgado-Castro, J.; et al. Yield predictions of four hybrids of maize (Zea mays) using multispectral images obtained from UAV in the coast of Peru. Agronomy 2022, 12, 2630. [Google Scholar] [CrossRef]
  81. Hernandez, A.; Bushman, S.; Johnson, P.; Robbins, M.D.; Patten, K. Prediction of turfgrass quality using multispectral UAV imagery and ordinal forests: Validation using a fuzzy approach. Agronomy 2024, 14, 2575. [Google Scholar] [CrossRef]
  82. Parra, L.; Ahmad, A.; Zaragoza-Esquerdo, M.; Ivars-Palomares, A.; Sendra, S.; Lloret, J. A comprehensive survey of drones for turfgrass monitoring. Drones 2024, 8, 563. [Google Scholar] [CrossRef]
  83. Sun, W.; Du, Q. Hyperspectral band selection: A review. IEEE Geosci. Remote Sens. Mag. 2019, 7, 118–139. [Google Scholar] [CrossRef]
  84. Sethy, P.K.; Pandey, C.; Sahu, Y.K.; Behera, S.K. Hyperspectral imagery applications for precision agriculture—A systemic survey. Multimed. Tools Appl. 2022, 81, 3005–3038. [Google Scholar] [CrossRef]
  85. Khan, A.; Vibhute, A.D.; Mali, S.; Patil, C.H. A systematic review on hyperspectral imaging technology with a machine and deep learning methodology for agricultural applications. Ecol. Inform. 2022, 69, 101678. [Google Scholar] [CrossRef]
  86. Xu, S.; Wang, M.; Shi, X.; Yu, Q.; Zhang, Z. Integrating hyperspectral imaging with machine learning techniques for the high-resolution mapping of soil nitrogen fractions in soil profiles. Sci. Total Environ. 2021, 754, 142135. [Google Scholar] [CrossRef]
  87. Lorek, A.; Majewski, J. Humidity measurement in carbon dioxide with capacitive humidity sensors at low temperature and pressure. Sensors 2018, 18, 2615. [Google Scholar] [CrossRef] [PubMed]
  88. Tauro, F.; Selker, J.S.; van de Giesen, N.; Abrate, T.; Manfreda, S.; Caylor, K.; Caparrini, F.; Benveniste, J.; Moller, D.; Yepez, E.; et al. Measurements and observations in the XXI century (MOXXI): Innovation and multi-disciplinarity to sense the hydrological cycle. Hydrol. Sci. J. 2018, 63, 169–196. [Google Scholar] [CrossRef]
  89. Tmušić, G.; Manfreda, S.; Aasen, H.; James, M.R.; Gonçalves, G.; Ben-Dor, E.; Brook, A.; Polinova, M.; Arranz Justel, J.J.; Lüpfert, E.; et al. Current practices in UAS-based environmental monitoring. Remote Sens. 2020, 12, 1001. [Google Scholar] [CrossRef]
  90. Díaz-Delgado, R.; Ónodi, G.; Kröel-Dulay, G.; Kertész, M. Enhancement of ecological field experimental research by means of UAV multispectral sensing. Drones 2019, 3, 7. [Google Scholar] [CrossRef]
  91. Yang, G.; Liu, J.; Zhao, C.; Li, Z.; Huang, Y.; Yu, H.; Xu, B.; Yang, X.; Zhu, D.; Zhang, R.; et al. Unmanned aerial vehicle remote sensing for field-based crop phenotyping: Current status and perspectives. Front. Plant Sci. 2017, 8, 1111. [Google Scholar] [CrossRef]
  92. Pamula, G.; Ramachandran, A. Thermal management for unmanned aerial vehicle payloads: Mechanisms, systems, and applications. Drones 2025, 9, 350. [Google Scholar] [CrossRef]
  93. Aasen, H.; Honkavaara, E.; Lucieer, A.; Zarco-Tejada, P.J. Quantitative remote sensing at ultra-high resolution with UAV spectroscopy: A review of sensor technology, measurement procedures, and data correction workflows. Remote Sens. 2018, 10, 1091. [Google Scholar] [CrossRef]
  94. Zarco-Tejada, P.J.; González-Dugo, V.; Berni, J.A. Fluorescence, temperature and narrow-band indices acquired from a UAV platform for water stress detection using a micro-hyperspectral imager and a thermal camera. Remote Sens. Environ. 2012, 117, 322–337. [Google Scholar] [CrossRef]
  95. Barmpoutis, P.; Papaioannou, P.; Dimitropoulos, K.; Grammalidis, N. A review on early forest fire detection systems using optical remote sensing. Sensors 2020, 20, 6442. [Google Scholar] [CrossRef] [PubMed]
  96. Chrétien, L.P.; Théau, J.; Menard, P. Wildlife multispecies remote sensing using visible and thermal infrared imagery acquired from an unmanned aerial vehicle (UAV). Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 40, 241–248. [Google Scholar] [CrossRef]
  97. Gonzalez-Dugo, V.; Zarco-Tejada, P.; Berni, J.A.; Suarez, L.; Goldhamer, D.; Fereres, E. Almond tree canopy temperature reveals intra-crown variability that is water stress-dependent. Agric. For. Meteorol. 2012, 154, 156–165. [Google Scholar] [CrossRef]
  98. Awais, M.; Li, W.; Cheema, M.J.M.; Zaman, Q.U.; Shaheen, A.; Aslam, B.; Zhu, W.; Ajmal, M.; Faheem, M.; Hussain, S.; et al. UAV-based remote sensing in plant stress imaging using high-resolution thermal sensor for digital agriculture practices: A meta-review. Int. J. Environ. Sci. Technol. 2023, 20, 1135–1152. [Google Scholar] [CrossRef]
  99. Kotarski, D.; Piljek, P.; Pranjić, M.; Kasać, J. Concept of a modular multirotor heavy lift unmanned aerial vehicle platform. Aerospace 2023, 10, 528. [Google Scholar] [CrossRef]
  100. Queinnec, M.; White, J.C.; Coops, N.C. Comparing airborne and spaceborne photon-counting LiDAR canopy structural estimates across different boreal forest types. Remote Sens. Environ. 2021, 262, 112510. [Google Scholar] [CrossRef]
  101. d’Oliveira, M.V.N.; Figueiredo, E.O.; de Almeida, D.R.A.; Oliveira, L.C.; Silva, C.A.; Nelson, B.W.; Pimentel, T.P.M.; Silva, C.E.A.; Valbuena, R. Impacts of selective logging on Amazon Forest canopy structure and biomass with a LiDAR and photogrammetric survey sequence. For. Ecol. Manag. 2021, 500, 119648. [Google Scholar] [CrossRef]
  102. Kurbanov, E.; Vorobev, O.; Lezhnin, S.; Sha, J.; Wang, J.; Cole, J.; Dergunov, D.; Wang, Y. Remote sensing of forest burnt area, burn severity, and post-fire recovery: A review. Remote Sens. 2022, 14, 4714. [Google Scholar] [CrossRef]
  103. Green, D.R.; Gregory, B.J.; Karachok, A. (Eds.) Unmanned Aerial Remote Sensing: UAS for Environmental Applications; CRC Press: Boca Raton, FL, USA, 2020. [Google Scholar]
  104. Alabi, T.R.; Abebe, A.T.; Chigeza, G.; Fowobaje, K.R. Estimation of soybean grain yield from multispectral high-resolution UAV data with machine learning models in West Africa. Remote Sens. Appl. Soc. Environ. 2022, 27, 100782. [Google Scholar] [CrossRef]
  105. Xu, Y.; Li, X.; Pan, Y.; Liu, Q. Applications of hyperspectral remote sensing in vegetation monitoring: A review. Sensors 2021, 21, 8485. [Google Scholar] [CrossRef]
  106. Larrañaga, A.; Aragonés, D.; Recio, B.; Hernández-Lasheras, J.; Jiménez-Brenes, F.M.; Pérez-Ortiz, M.; Peña, J.M. Fire detection and early alert systems using thermal UAV imagery: A systematic review. Drones 2022, 6, 192. [Google Scholar] [CrossRef]
  107. Wu, H.; Prasad, S. Deep learning for UAV-based thermal remote sensing: A review and future perspectives. ISPRS J. Photogramm. Remote Sens. 2021, 175, 146–163. [Google Scholar] [CrossRef]
  108. Candiago, S.; Remondino, F.; De Giglio, M.; Dubbini, M.; Gattelli, M. Evaluating multispectral images and vegetation indices for precision farming applications from UAV images. Remote Sens. 2015, 7, 4026–4047. [Google Scholar] [CrossRef]
  109. Zhao, J.; Wu, Y.; Deng, R.; Xu, S.; Gao, J.; Burke, A. A survey of autonomous driving from a deep learning perspective. ACM Comput. Surv. 2025, 57, 1–60. [Google Scholar] [CrossRef]
  110. Verfaillie, M.; Cho, E.; Dwyre, L.; Khan, I.; Wagner, C.; Jacobs, J.M.; Hunsaker, A. UAS remote sensing applications to abrupt cold region hazards. Front. Remote Sens. 2023, 4, 1095275. [Google Scholar] [CrossRef]
  111. Choi, Y.-Y.; Suh, M.-S.; Park, K.-H. Assessment of surface urban heat islands over three megacities in East Asia using land surface temperature data retrieved from COMS. Remote Sens. 2014, 6, 5852–5867. [Google Scholar] [CrossRef]
  112. Arruda de Lima, G.S.; Ferreira, M.E.; Sales, J.C.; de Souza Passos, J.; Maggiotto, S.R.; Madari, B.E.; Carvalho, M.T.M.; de Almeida Machado, P.L.O. Evapotranspiration measurements in pasture, crops, and native Brazilian Cerrado based on UAV-borne multispectral sensor. Environ. Monit. Assess. 2024, 196, 1105. [Google Scholar] [CrossRef]
  113. McCormack, E.; Humstad, T.; Salazar, S.; Frauenfelder, R.; Dupuy, B.; Hendrikx, J.; Dahle, H.; Solbakken, E. Operational integration of uncrewed aerial vehicles into roadway agencies’ snow avalanche risk assessment process. In Proceedings of the International Snow Science Workshop (ISSW), Tromsø, Norway, 23–27 September 2024; pp. 1100–1106. [Google Scholar]
  114. Thiel, C.; Mueller, M.M.; Epple, L.; Thau, C.; Hese, S.; Voltersen, M.; Henkel, A. UAS imagery-based mapping of coarse wood debris in a natural deciduous forest in Central Germany (Hainich National Park). Remote Sens. 2020, 12, 3293. [Google Scholar] [CrossRef]
  115. Cohen, A.P.; Shaheen, S.A.; Farrar, E.M. Urban Air Mobility: History, Ecosystem, Market Potential, and Challenges. IEEE Trans. Intell. Transp. Syst. 2021, 22, 6074–6087. [Google Scholar] [CrossRef]
  116. Portillo, H.; Barbosa, R.; Beckwith, M.; Gullen, T.; Haynie, R.; Hovinga, S.; Kesavaraju, B.; Lang, E.; Livingston, P.; Newton, N.; et al. Industry (UAPASTF) Response to Pesticide Regulators’ “State of the Knowledge” Review of Drone Use for Pesticide Application: Best Practices for Safe and Effective Application of Pesticides. Drones 2025, 9, 388. [Google Scholar] [CrossRef]
  117. Jenkins, N. An Application of Aerial Drones in Zoning and Urban Land Use Planning in Canada: A Preliminary Review of Current Policies, Restrictions and Planning Direction for Aerial Drones in Canadian Cities. Ph.D. Thesis, Toronto Metropolitan University, Toronto, ON, Canada, 2013. [Google Scholar]
  118. Keerthinathan, P.; Amarasingam, N.; Hamilton, G.; Gonzalez, F. Exploring unmanned aerial systems operations in wildfire management: Data types, processing algorithms and navigation. Int. J. Remote Sens. 2023, 44, 5628–5685. [Google Scholar] [CrossRef]
  119. Grando, L.; Jaramillo, J.F.G.; Leite, J.R.E.; Ursini, E.L. Systematic Literature Review Methodology for Drone Recharging Processes in Agriculture and Disaster Management. Drones 2025, 9, 40. [Google Scholar] [CrossRef]
  120. Lucic, M.C.; Bouhamed, O.; Ghazzai, H.; Khanfor, A.; Massoud, Y. Leveraging UAVs to enable dynamic and smart aerial infrastructure for ITS and smart cities: An overview. Drones 2023, 7, 79. [Google Scholar] [CrossRef]
  121. Kulsinskas, A.; Durdevic, P.; Ortiz-Arroyo, D. Internal wind turbine blade inspections using UAVs: Analysis and design issues. Energies 2021, 14, 294. [Google Scholar] [CrossRef]
122. Agrawal, P.; Nagarkar, D.; Khandelwal, P.; Sheikh, Z.; Kaur, G.; Pinjarkar, L. Unmanned aerial vehicles (UAVs): Exploring AI-driven applications, challenges and future prospects across diverse domains. In Proceedings of the 2024 10th International Conference on Electrical Energy Systems (ICEES), Chennai, India, 22–24 August 2024; pp. 1–5. [Google Scholar]
  123. Karahan, A.; Gökçe, O.; Demircan, N.; Özgeriş, M.; Karahan, F. Integrating UAV photogrammetry and GIS to assess terrace landscapes in mountainous northeastern Türkiye for sustainable land management. Sustainability 2025, 17, 5855. [Google Scholar] [CrossRef]
  124. Rolando, A.; Scandiffio, A. Mapping landscape components by UAV multispectral surveying platform. In Misura/Dismisura; Franco Angeli: Milan, Italy, 2024; pp. 3627–3633. [Google Scholar]
  125. Zhang, C.; Chen, J.; Li, P.; Han, S.; Xu, J. Integrated high-precision real scene 3D modeling of karst cave landscape based on laser scanning and photogrammetry. Sci. Rep. 2024, 14, 20485. [Google Scholar] [CrossRef] [PubMed]
  126. Guebsi, R.; Mami, S.; Chokmani, K. Drones in precision agriculture: A comprehensive review of applications, technologies, and challenges. Drones 2024, 8, 686. [Google Scholar] [CrossRef]
  127. Mashala, M.J.; Dube, T.; Mudereri, B.T.; Ayisi, K.K.; Ramudzuli, M.R. A Systematic Review on Advancements in Remote Sensing for Assessing and Monitoring Land Use and Land Cover Changes Impacts on Surface Water Resources in Semi-Arid Tropical Environments. Remote Sens. 2023, 15, 3926. [Google Scholar] [CrossRef]
  128. Xiao, W.; Ren, H.; Sui, T.; Zhang, H.; Zhao, Y.; Hu, Z. A drone- and field-based investigation of the land degradation and soil erosion at an opencast coal mine dump after 5 years’ evolution of natural processes. Int. J. Coal Sci. Technol. 2022, 9, 42. [Google Scholar] [CrossRef]
  129. Hussain, Y.; Schlögel, R.; Innocenti, A.; Hamza, O.; Iannucci, R.; Martino, S.; Havenith, H.B. Review on the geophysical and UAV-based methods applied to landslides. Remote Sens. 2022, 14, 4564. [Google Scholar] [CrossRef]
  130. Siddiq, M.I.; Daud, M.E.; Kaamin, M.; Mokhtar, M.; Omar, A.S.; Duong, N.A. Investigation of Pantai Punggur coastal erosion by using UAV photogrammetry. Int. J. Nanoelectron. Mater. 2022, 15, 61–69. [Google Scholar]
  131. Hu, D.; Minner, J. UAVs and 3D city modeling to aid urban planning and historic preservation: A systematic review. Remote Sens. 2023, 15, 5507. [Google Scholar] [CrossRef]
  132. Ajakwe, S.O.; Esomonu, N.F.; Deji-Oloruntoba, O.; Ajakwe, I.U.; Lee, J.M.; Kim, D.S. Machine learning in UAV-assisted smart farming. In Applications of Machine Learning in UAV Networks; IGI Global: Hershey, PA, USA, 2024; pp. 217–245. [Google Scholar]
  133. Bollas, N.; Kokinou, E.; Polychronos, V. Comparison of Sentinel-2 and UAV multispectral data for use in precision agriculture: An application from northern Greece. Drones 2021, 5, 35. [Google Scholar] [CrossRef]
  134. Zheng, J.Y.; Hao, Y.Y.; Wang, Y.C.; Zhou, S.Q.; Wu, W.B.; Yuan, Q.; Gao, H.; Jiang, S.; Hu, C.; Liu, H.; et al. Coastal wetland vegetation classification using pixel-based, object-based and deep learning methods based on RGB-UAV. Land 2022, 11, 2039. [Google Scholar] [CrossRef]
  135. Lyu, X.; Li, X.; Dang, D.; Dou, H.; Wang, K.; Lou, A. Unmanned aerial vehicle (UAV) remote sensing in grassland ecosystem monitoring: A systematic review. Remote Sens. 2022, 14, 1096. [Google Scholar] [CrossRef]
  136. Kaur, A.; Kukreja, V.; Chattopadhyay, S.; Verma, A.; Sharma, R. Empowering wildlife conservation with a fused CNN-SVM deep learning model for multi-classification using drone-based imagery. In Proceedings of the 2024 IEEE International Conference on Interdisciplinary Approaches in Technology and Management for Social Innovation (IATMSI), New Delhi, India, 14–16 March 2024; Volume 2, pp. 1–4. [Google Scholar]
137. Povlsen, P.; Bruhn, D.; Durdevic, P.; Arroyo, D.O.; Pertoldi, C. Using YOLO object detection to identify hare and roe deer in thermal aerial video footage—Possible future applications in real-time automatic drone surveillance and wildlife monitoring. Drones 2024, 8, 2. [Google Scholar] [CrossRef]
  138. Yavuz, M.; Tufekcioglu, M. Assessment of flood-induced geomorphic changes in Sidere Creek of the mountainous basin using small UAV-based imagery. Sustainability 2023, 15, 11793. [Google Scholar] [CrossRef]
  139. Liu, X.; Wang, Y.; Chen, T.; Gu, X.; Zhang, L.; Li, X.; Zhao, Z.; Zeng, Z.; Wang, H.; Liu, Z.; et al. Monitoring water quality parameters of freshwater aquaculture ponds using UAV-based multispectral images. Ecol. Indic. 2024, 167, 112644. [Google Scholar] [CrossRef]
  140. Dai, W.; Qian, W.; Liu, A.; Wang, C.; Yang, X.; Hu, G.; Tang, G. Monitoring and modeling sediment transport in space in small loess catchments using UAV-SfM photogrammetry. Catena 2022, 214, 106244. [Google Scholar] [CrossRef]
  141. Darji, K.; Vyas, U.; Patel, D.; Singh, S.K.; Dubey, A.K.; Gupta, P.; Singh, R.P. UAV-based comprehensive modelling approach for flood hazard assessment and mitigation planning. Phys. Chem. Earth 2024, 135, 103609. [Google Scholar] [CrossRef]
  142. Munawar, H.S.; Ullah, F.; Qayyum, S.; Heravi, A. Application of deep learning on UAV-based aerial images for flood detection. Smart Cities 2021, 4, 1220–1242. [Google Scholar] [CrossRef]
  143. Trepekli, K.; Balstrøm, T.; Friborg, T.; Fog, B.; Allotey, A.N.; Kofie, R.Y.; Møller-Jensen, L. UAV-borne, LiDAR-based elevation modelling: A method for improving local-scale urban flood risk assessment. Nat. Hazards 2022, 113, 423–451. [Google Scholar] [CrossRef]
  144. Treccani, D.; Adami, A.; Fregonese, L. Drones and real-time kinematic base station integration for documenting inaccessible ruins: A case study approach. Drones 2024, 8, 268. [Google Scholar] [CrossRef]
  145. López-Herrera, J.; López-Cuervo, S.; Pérez-Martín, E.; Maté-González, M.Á.; Izquierdo, C.V.; Peñarroya, J.M.; Herrero-Tejedor, T.R. Evaluation of 3D models of archaeological remains of Almenara Castle using two UAVs with different navigation systems. Heritage 2025, 8, 22. [Google Scholar] [CrossRef]
  146. Fiz, J.I.; Martín, P.M.; Cuesta, R.; Subías, E.; Codina, D.; Cartes, A. Examples and results of aerial photogrammetry in archeology with UAV: Geometric documentation, high resolution multispectral analysis, models and 3D printing. Drones 2022, 6, 59. [Google Scholar] [CrossRef]
  147. De Fino, M.; Galantucci, R.A.; Fatiguso, F. Condition assessment of heritage buildings via photogrammetry: A scoping review from the perspective of decision makers. Heritage 2023, 6, 7031–7066. [Google Scholar] [CrossRef]
  148. Kerle, N.; Nex, F.; Gerke, M.; Duarte, D.; Vetrivel, A. UAV-based structural damage mapping: A review. ISPRS Int. J. Geo-Inf. 2019, 9, 14. [Google Scholar] [CrossRef]
  149. Laohaviraphap, N.; Waroonkun, T. Integrating artificial intelligence and the internet of things in cultural heritage preservation: A systematic review of risk management and environmental monitoring strategies. Buildings 2024, 14, 3979. [Google Scholar] [CrossRef]
  150. Quamar, M.M.; Al-Ramadan, B.; Khan, K.; Shafiullah, M.; El Ferik, S. Advancements and applications of drone-integrated geographic information system technology—A review. Remote Sens. 2023, 15, 5039. [Google Scholar] [CrossRef]
  151. Zheng, S.; Meng, C.; Xue, J.; Wu, Y.; Liang, J.; Xin, L.; Zhang, L. UAV-based spatial pattern of three-dimensional green volume and its influencing factors in Lingang New City in Shanghai, China. Front. Earth Sci. 2021, 15, 543–552. [Google Scholar] [CrossRef]
  152. Li, S.; Zhu, Y.; Wan, H.; Xiao, Q.; Teng, M.; Xu, W.; He, Y.; Li, Z.; Zhang, H.; Zhou, D.; et al. Effectiveness of potential strategies to mitigate surface urban heat island: A comprehensive investigation using high-resolution thermal observations from an unmanned aerial vehicle. Sustain. Cities Soc. 2024, 113, 105716. [Google Scholar] [CrossRef]
  153. Shao, H.; Song, P.; Mu, B.; Tian, G.; Chen, Q.; He, R.; Kim, G. Assessing city-scale green roof development potential using unmanned aerial vehicle (UAV) imagery. Urban For. Urban Green. 2021, 57, 126954. [Google Scholar] [CrossRef]
  154. Liu, H.; Xiao, P.; Zhang, X.; Zhou, X.; Li, J.; Guo, R. Object-based island green cover mapping by integrating UAV multispectral image and LiDAR data. J. Appl. Remote Sens. 2021, 15, 034512. [Google Scholar] [CrossRef]
  155. Munawar, H.S.; Hammad, A.W.; Waller, S.T.; Thaheem, M.J.; Shrestha, A. An integrated approach for post-disaster flood management via the use of cutting-edge technologies and UAVs: A review. Sustainability 2021, 13, 7925. [Google Scholar] [CrossRef]
  156. Mishra, V.; Avtar, R.; Prathiba, A.P.; Mishra, P.K.; Tiwari, A.; Sharma, S.K.; Sharma, S.; Ashtikar, R.; Islam, A.R.M.T.; Sharma, A.; et al. Uncrewed aerial systems in water resource management and monitoring: A review of sensors, applications, software, and issues. Adv. Civ. Eng. 2023, 2023, 3544724. [Google Scholar] [CrossRef]
  157. Gupta, R.; Bhattacharya, P.; Tanwar, S.; Sharma, R.; Alqahtani, F.; Tolba, A.; Sangaiah, A.K.; Kumar, N.; Elhoseny, M.; Raboaca, M.S. Fight against future pandemics: UAV-based data-centric social distancing, sanitizing, and monitoring scheme. Drones 2022, 6, 381. [Google Scholar] [CrossRef]
  158. Özer, M.M. Predictive modeling of urban air pollution using machine learning and unmanned aerial vehicle platforms. In Innovative Applications of Artificial Neural Networks to Data Analytics and Signal Processing; Springer: Cham, Switzerland, 2024; pp. 79–115. [Google Scholar]
  159. Bakirci, M. Enhancing air pollution mapping with autonomous UAV networks for extended coverage and consistency. Atmos. Res. 2024, 306, 107480. [Google Scholar] [CrossRef]
  160. Butilă, E.V.; Boboc, R.G. Urban traffic monitoring and analysis using unmanned aerial vehicles (UAVs): A systematic literature review. Remote Sens. 2022, 14, 620. [Google Scholar] [CrossRef]
  161. Tal, D.; Altschuld, J. Drone Technology in Architecture, Engineering and Construction: A Strategic Guide to Unmanned Aerial Vehicle Operation and Implementation; John Wiley & Sons: Hoboken, NJ, USA, 2021. [Google Scholar]
  162. Zhang, R.; Cao, L.; Liu, Y.; Guo, R.; Luo, J.; Shu, P. Decoding spontaneous informal spaces in old residential communities: A drone and space syntax perspective. ISPRS Int. J. Geo-Inf. 2023, 12, 452. [Google Scholar] [CrossRef]
  163. Valdez-Delgado, K.M.; Garcia-Salazar, O.; Moo-Llanes, D.A.; Izcapa-Treviño, C.; Cruz-Pliego, M.A.; Domínguez-Posadas, G.Y.; Danis-Lozano, R. Mapping the urban environments of Aedes aegypti using drone technology. Drones 2023, 7, 581. [Google Scholar] [CrossRef]
  164. Olivatto, T.F.; Inguaggiato, F.F.; Stanganini, F.N. Urban mapping and impacts assessment in a Brazilian irregular settlement using UAV-based imaging. Remote Sens. Appl. Soc. Environ. 2023, 29, 100911. [Google Scholar] [CrossRef]
  165. Skondras, A.; Karachaliou, E.; Tavantzis, I.; Tokas, N.; Valari, E.; Skalidi, I.; Bouvet, G.A.; Stylianidis, E. UAV mapping and 3D modeling as a tool for promotion and management of the urban space. Drones 2022, 6, 115. [Google Scholar] [CrossRef]
  166. Prabowo, G.; Bisri, M.; Asmara, R.; Budiyanto, A.S.; Aziz, A. Development of a precision map of the village based on drone imagery to improve the quality of planning. J. Pembang. Alam Lestari 2025, 16, 39–47. [Google Scholar] [CrossRef]
  167. Cureton, P. Drone Futures: UAS in Landscape and Urban Design; Routledge: London, UK, 2020. [Google Scholar]
  168. Ma, Q.; Zhang, J.; Li, Y. Advanced integration of urban street greenery and pedestrian flow: A multidimensional analysis in Chengdu’s central urban district. ISPRS Int. J. Geo-Inf. 2024, 13, 254. [Google Scholar] [CrossRef]
  169. Neupane, K.; Baysal-Gurel, F. Automatic identification and monitoring of plant diseases using unmanned aerial vehicles: A review. Remote Sens. 2021, 13, 3841. [Google Scholar] [CrossRef]
  170. Pun, K.B.; Katiyar, H.; Midde, S.; Paul, M.; Vigneshwaran, K.; Kannan, S.; Desouza, V.A.; Nughal, J.; Das, S.; Bhattacharyya, U.K.; et al. Artificial intelligence and drone-assisted plant disease diagnosis: A review of advancements and challenges. Int. J. Adv. Biochem. Res. 2025, 9, 358–366. [Google Scholar] [CrossRef]
  171. Wagner, B.; Egerer, M. Application of UAV remote sensing and machine learning to model and map land use in urban gardens. J. Urban Ecol. 2022, 8, juac008. [Google Scholar] [CrossRef]
  172. Xu, H.; Diao, J. Accurate design and layout of landscape elements based on improved particle swarm optimization. Comput. Aided Des. Appl. 2023, 21, 49–65. [Google Scholar] [CrossRef]
  173. Ghorbanzadeh, O.; Meena, S.R.; Blaschke, T.; Aryal, J. UAV-based slope failure detection using deep-learning convolutional neural networks. Remote Sens. 2019, 11, 2046. [Google Scholar] [CrossRef]
  174. Koganti, T.; Ghane, E.; Martinez, L.R.; Iversen, B.V.; Allred, B.J. Mapping of agricultural subsurface drainage systems using unmanned aerial vehicle imagery and ground penetrating radar. Sensors 2021, 21, 2800. [Google Scholar] [CrossRef]
  175. Sibanda, M.; Mutanga, O.; Chimonyo, V.G.; Clulow, A.D.; Shoko, C.; Mazvimavi, D.; Odindo, A.O.; Slotow, R.; Mabhaudhi, T. Application of drone technologies in surface water resources monitoring and assessment: A systematic review of progress, challenges, and opportunities in the global south. Drones 2021, 5, 84. [Google Scholar] [CrossRef]
  176. Jiménez López, J.; Mulero-Pázmány, M. Drones for conservation in protected areas: Present and future. Drones 2019, 3, 10. [Google Scholar] [CrossRef]
  177. Kumar, S.; Meena, R.S.; Sheoran, S.; Jangir, C.K.; Jhariya, M.K.; Banerjee, A.; Raj, A. Remote sensing for agriculture and resource management. In Natural Resources Conservation and Advances for Sustainability; Elsevier: Amsterdam, The Netherlands, 2022; pp. 91–135. [Google Scholar]
  178. Gioia, D.; Minervino Amodio, A.; Maggio, A.; Sabia, C.A. Impact of land use changes on the erosion processes of a degraded rural landscape: An analysis based on high-resolution DEMs, historical images, and soil erosion models. Land 2021, 10, 673. [Google Scholar] [CrossRef]
  179. Delavarpour, N.; Koparan, C.; Nowatzki, J.; Bajwa, S.; Sun, X. A technical study on UAV characteristics for precision agriculture applications and associated practical challenges. Remote Sens. 2021, 13, 1204. [Google Scholar] [CrossRef]
  180. Cienciała, A.; Sobura, S.; Sobolewska-Mikulska, K. Optimising land consolidation by implementing UAV technology. Sustainability 2022, 14, 4412. [Google Scholar] [CrossRef]
  181. Zhang, H.; Wang, L.; Tian, T.; Yin, J. A review of unmanned aerial vehicle low-altitude remote sensing (UAV-LARS) use in agricultural monitoring in China. Remote Sens. 2021, 13, 1221. [Google Scholar] [CrossRef]
  182. Meron, M.; Peres, M.; Levin-Orlov, V.; Shoshani, G.; Marchaim, U.; Chen, A. Irrigation uniformity assessment with high-resolution aerial sensors. Int. J. Appl. Earth Obs. Geoinf. 2025, 137, 104446. [Google Scholar] [CrossRef]
  183. Abdullah, H.M.; Mohana, N.T.; Khan, B.M.; Ahmed, S.M.; Hossain, M.; Islam, K.S.; Islam, S.; Sultana, S.; Das, B.; Akter, S.; et al. Present and future scopes and challenges of plant pest and disease (P&D) monitoring: Remote sensing, image processing, and artificial intelligence perspectives. Remote Sens. Appl. Soc. Environ. 2023, 32, 100996. [Google Scholar] [CrossRef]
  184. Wich, S.A.; Koh, L.P. Conservation Drones: Mapping and Monitoring Biodiversity; Oxford University Press: Oxford, UK, 2018. [Google Scholar]
  185. Donaire, J.A.; Galí, N.; Gulisova, B. Tracking visitors in crowded spaces using zenith images: Drones and time-lapse. Tour. Manag. Perspect. 2020, 35, 100680. [Google Scholar] [CrossRef]
  186. Tait, L.; Bind, J.; Charan-Dixon, H.; Hawes, I.; Pirker, J.; Schiel, D. Unmanned aerial vehicles (UAVs) for monitoring macroalgal biodiversity: Comparison of RGB and multispectral imaging sensors for biodiversity assessments. Remote Sens. 2019, 11, 2332. [Google Scholar] [CrossRef]
  187. Rudge, M.L.; Levick, S.R.; Bartolo, R.E.; Erskine, P.D. Developing landscape-scale forest restoration targets that embrace spatial pattern. Landsc. Ecol. 2022, 37, 1747–1760. [Google Scholar] [CrossRef]
  188. Han, Y.G.; Yoo, S.H.; Kwon, O. Possibility of applying unmanned aerial vehicle (UAV) and mapping software for the monitoring of waterbirds and their habitats. J. Ecol. Environ. 2017, 41, 21. [Google Scholar] [CrossRef]
  189. Beaver, J.T.; Baldwin, R.W.; Messinger, M.; Newbolt, C.H.; Ditchkoff, S.S.; Silman, M.R. Evaluating the use of drones equipped with thermal sensors as an effective method for estimating wildlife. Wildl. Soc. Bull. 2020, 44, 434–443. [Google Scholar] [CrossRef]
  190. Bhatia, D.; Dhillon, A.S.; Hesse, H. Preliminary design of an UAV based system for wildlife monitoring and conservation. In Proceedings of the International Conference on Aeronautical Sciences, Engineering and Technology; Springer Nature: Singapore, 2023; pp. 51–63. [Google Scholar]
  191. Seștraș, P.; Roșca, S.; Bilașco, Ș.; Naș, S.; Buru, S.M.; Kovács, L.; Török, Z.; Andreica, M.E.; Stăvaru, V.G.; Crișan, M.E.; et al. Feasibility assessments using unmanned aerial vehicle technology in heritage buildings: Rehabilitation-restoration, spatial analysis and tourism potential analysis. Sensors 2020, 20, 2054. [Google Scholar] [CrossRef]
  192. Chen, N.; Yuan, Q. 5G technologies and tourism environmental carrying capacity based on planning optimization with remote sensing systems. In Proceedings of the 2020 Fourth International Conference on I-SMAC (IoT in Social, Mobile, Analytics and Cloud) (I-SMAC), Palladam, India, 7–9 October 2020; pp. 322–325. [Google Scholar]
  193. Tomczyk, A.M.; Ewertowski, M.W.; Creany, N.; Ancin-Murguzur, F.J.; Monz, C. The application of unmanned aerial vehicle (UAV) surveys and GIS to the analysis and monitoring of recreational trail conditions. Int. J. Appl. Earth Obs. Geoinf. 2023, 123, 103474. [Google Scholar] [CrossRef]
  194. Liao, Z.; Ma, Y.; Huang, J.; Wang, J.; Wang, J. HOTSPOT: A UAV-assisted dynamic mobility-aware offloading for mobile-edge computing in 3-D space. IEEE Internet Things J. 2021, 8, 10940–10952. [Google Scholar] [CrossRef]
  195. Díaz-Delgado, R.; Mücher, S. Editorial of special issue “Drones for biodiversity conservation and ecological monitoring”. Drones 2019, 3, 47. [Google Scholar] [CrossRef]
  196. Wang, S.; Liu, H.; Rinaldi, M.; Tsang, Y.P. Evaluating the impact of air corridors on the environment and public interests. Transp. Res. Part D Transp. Environ. 2025, 143, 104732. [Google Scholar] [CrossRef]
  197. Samaras, S.; Diamantidou, E.; Ataloglou, D.; Sakellariou, N.; Vafeiadis, A.; Magoulianitis, V.; Zioulis, N.; Karaköse, M.; Tzovaras, D.; Votis, K. Deep learning on multi sensor data for counter UAV applications—A systematic review. Sensors 2019, 19, 4837. [Google Scholar] [CrossRef]
  198. Huang, F.; Xiong, H.; Chen, S.; Lv, Z.; Huang, J.; Chang, Z.; Catani, F. Slope stability prediction based on a long short-term memory neural network: Comparisons with convolutional neural networks, support vector machines and random forest models. Int. J. Coal Sci. Technol. 2023, 10, 18. [Google Scholar] [CrossRef]
  199. Koirala, A.; Walsh, K.B.; Wang, Z.; McCarthy, C. Deep learning for real-time fruit detection and orchard fruit load estimation: Benchmarking of ‘MangoYOLO’. Precis. Agric. 2019, 20, 1107–1135. [Google Scholar] [CrossRef]
  200. Guo, Q.; Zhang, J.; Guo, S.; Ye, Z.; Deng, H.; Hou, X.; Zhang, H. Urban tree classification based on object-oriented approach and random forest algorithm using unmanned aerial vehicle (UAV) multispectral imagery. Remote Sens. 2022, 14, 3885. [Google Scholar] [CrossRef]
  201. Shen, N.; Feng, F.; Xu, C.; Li, X.; Chiriacò, M.V.; Lafortezza, R. Drone-based assessment of urban green space structure and cooling capacity. Urban For. Urban Green. 2025, 82, 128953. [Google Scholar] [CrossRef]
  202. Jo, W.K.; Park, J.H. High-accuracy tree type classification in urban forests using drone-based RGB imagery and optimized SVM. Korean J. Remote Sens. 2025, 41, 209–223. [Google Scholar] [CrossRef]
  203. Yang, Q.; Shi, L.; Han, J.; Yu, J.; Huang, K. A near real-time deep learning approach for detecting rice phenology based on UAV images. Agric. For. Meteorol. 2020, 287, 107938. [Google Scholar] [CrossRef]
  204. Pi, W.; Du, J.; Bi, Y.; Gao, X.; Zhu, X. 3D-CNN based UAV hyperspectral imagery for grassland degradation indicator ground object classification research. Ecol. Inform. 2021, 62, 101278. [Google Scholar] [CrossRef]
  205. Revenga, J.C.; Trepekli, K.; Oehmcke, S.; Jensen, R.; Li, L.; Igel, C.; Tøttrup, C.; Krogh, S.A.; Elberling, B.; Friborg, T. Above-ground biomass prediction for croplands at a sub-meter resolution using UAV–LiDAR and machine learning methods. Remote Sens. 2022, 14, 3912. [Google Scholar] [CrossRef]
  206. Feroz, S.; Abu Dabous, S. UAV-based remote sensing applications for bridge condition assessment. Remote Sens. 2021, 13, 1809. [Google Scholar] [CrossRef]
Figure 1. PRISMA-ScR-based flow diagram of the study selection process. The figure was developed by the authors in accordance with the PRISMA Extension for Scoping Reviews guidelines [18].
Figure 2. Schematic representation of drone types commonly used in landscape research. The diagram categorises UASs into fixed-wing (a), hybrid (b), and rotary-wing configurations, with rotary-wing systems further subdivided into quadcopter (c1), hexacopter (c2), and octocopter (c3) according to design characteristics and flight dynamics [23].
Figure 3. Comparative assessment of drone types in landscape research based on the authors’ synthesis of Table 1, Table 2, Table 3 and Table 4 and the cited references.
Figure 4. Methodological workflow of drone applications in landscape research, adapted from Karahan et al. [123].
Figure 5. Satellite map (left), overlaid satellite and drone imagery (middle), and terrace density map (right) illustrating terrace density and rural–urban structure distribution in Uzundere, produced with drone and GIS assistance. The data derive from a doctoral research project (Oğuz Gökçe, supervised by Prof. Dr. Faris Karahan, Atatürk University, 2023), currently in progress and not yet publicly archived. All visuals are presented with ethical approval and for illustrative purposes only.
Figure 6. Agricultural terraces in Erikli on the slopes of Tav Mountain (drone image captured on 2 April 2024). (This image was obtained as part of ongoing doctoral fieldwork conducted in Uzundere. As the dissertation is under development, the visual is included with permission and does not represent a formally published dataset).
Figure 7. Drone image of Erikli agricultural terraces taken on 30 March 2022. (This image was captured as part of the author’s ongoing doctoral research project, with the aim of monitoring seasonal dynamics in terraced landscapes. It is presented here to illustrate the drone-based documentation process. Used with permission from the doctoral research dataset).
Figure 8. Drone image of Erikli agricultural terraces taken on 2 April 2024. (Collected during the doctoral fieldwork, this image supports longitudinal observations of vegetation growth and terrace stability. It serves as an example of site-specific UAS monitoring in the context of landscape research. Used with permission).
Figure 9. Drone image of Erikli agricultural terraces taken on 23 September 2022. (Acquired through repeated drone flights in the doctoral research, this image documents transitional conditions during the post-harvest season. Included to demonstrate temporal comparative analysis supported by UAS imaging).
Figure 10. Drone image of Erikli agricultural terraces taken on 7 November 2024. (Captured under the same doctoral research initiative, this image reflects late-season changes in vegetation and soil patterns. It completes the seasonal timeline illustrated in Figure 7, Figure 8 and Figure 9. All images are used with academic permission from the original dataset).
Table 1. Definitions of UAS types used in landscape research.
Drone Type | Definition | References
Fixed-Wing Drones | Fixed-wing drones are characterised by a rigid wing structure similar to traditional airplanes, enabling longer flight times, higher altitudes, and more energy-efficient coverage of large areas. Due to their aerodynamic efficiency, they are widely used in landscape-scale ecological mapping, coastal monitoring, and long-term environmental change detection. These drones are particularly suitable for applications requiring high-speed data acquisition over linear or extensive terrains, such as forest boundaries, river corridors, or agricultural fields. However, they often require runway or catapult-assisted launches and have limited hovering capability. Recent studies have shown their growing utility in dryland vegetation surveys, glacier margin monitoring, and savannah ecosystem tracking. | [24,25,26,27,28,29]
Rotary-Wing Drones | Rotary-wing drones, including quadcopters and multirotor systems, are distinguished by their vertical take-off and landing (VTOL) capability and their ability to hover and manoeuvre in tight spaces. These drones offer exceptional control for low-altitude imaging and are ideal for capturing detailed spatial data in confined, topographically complex, or inaccessible environments such as urban green corridors, riparian zones, or archaeological sites. Their precise flight control makes them suitable for photogrammetry, 3D modelling, thermal mapping, and cultural heritage documentation. Despite shorter flight durations and limited coverage range compared to fixed-wing types, they are favoured in fine-scale landscape analysis and localised ecological monitoring. | [28,29,30,31,32,33,34,35]
Hybrid Drones | Hybrid drones combine the endurance and range of fixed-wing systems with the manoeuvrability and VTOL capacity of rotary-wing drones. This integration enables both long-distance data collection and precision hovering, making hybrid platforms ideal for large-scale monitoring with intermittent high-detail observation. These systems are particularly effective in dynamic or transitional landscapes—such as mountainous terrain, coastlines, or habitat corridors—where flexibility in flight patterns and efficient area coverage are essential. Although more complex in terms of engineering and cost, hybrid drones are increasingly used in conservation planning, biodiversity monitoring, and agroforestry applications requiring adaptive and responsive flight strategies. | [24,27,28,29,32,33,34,35]
Table 2. Operational advantages of UAV types in landscape research.
Drone Type | Advantages | References
Fixed-Wing Drones | Fixed-wing UASs offer significant advantages for landscape research requiring large-area coverage, stable flight trajectories, and energy efficiency. Their aerodynamically efficient structure enables them to conduct long-endurance missions at higher altitudes and faster speeds compared to multirotor systems, making them ideal for mapping linear features such as rivers, ridgelines, or agricultural belts. These systems are particularly suited for multispectral and hyperspectral imaging, where flight uniformity and spectral continuity are crucial for accurate vegetation analysis, land use classification, and habitat mapping. In semi-arid ecosystems, mountainous regions, or remote agricultural zones, fixed-wing drones facilitate broad-scale monitoring and longitudinal change detection with reduced operational costs. | [24,25,26,27,28,29,30,31,32,33,36,37,38,39]
Rotary-Wing Drones | Rotary-wing UASs offer unmatched manoeuvrability, vertical take-off and landing (VTOL) capabilities, and stable hovering, making them ideal for high-resolution data collection in confined, fragmented, or ecologically sensitive landscapes. Their ability to operate in narrow corridors, steep terrain, and vegetated areas enables precision monitoring of habitats, riparian zones, cultural sites, and urban vegetation. These drones are particularly compatible with thermal imaging systems, allowing for the real-time assessment of surface temperature anomalies, wildlife presence, and water stress in plant communities. Their flexibility supports adaptive survey missions and frequent revisit cycles essential for temporal analysis. The integration with RGB, multispectral, and thermal sensors further enhances their operational value in ecosystem diagnostics, disaster response, and urban green infrastructure planning. | [28,29,30,31,32,33,34,35,39,40,41,42,43]
Hybrid Drones | Hybrid drones combine the endurance and speed of fixed-wing platforms with the vertical take-off and landing (VTOL) and hovering capabilities of rotary-wing systems. This dual functionality makes them particularly advantageous in landscape research that spans diverse or fragmented terrains such as mountainous ecotones, coastal transition zones, or habitat corridors. Their flexibility supports efficient mission planning in regions where conventional UASs may face limitations due to terrain constraints or lack of infrastructure. Hybrid UASs are well suited for carrying multispectral, hyperspectral, and LiDAR payloads in complex settings where both broad coverage and site-specific detail are needed. This makes them ideal for tasks such as erosion monitoring, biodiversity mapping, and post-disaster environmental assessment. Their capacity to switch between dynamic coverage and localised inspection empowers researchers to adapt flight parameters in real time, thus enhancing data precision and temporal responsiveness. | [32,33,34,35]
Table 3. Operational limitations of UAS types in landscape research.
Drone Type | Disadvantages | References
Fixed-Wing Drones | Despite their advantages in range and endurance, fixed-wing UASs exhibit several operational limitations in landscape research. These systems typically require larger take-off and landing areas, which limits their deployability in dense vegetation, rugged topographies, or confined urban zones. Their inability to hover restricts low-altitude inspections, making them less suitable for site-specific diagnostics or under-canopy mapping. Moreover, real-time data acquisition and thermal imaging tasks are constrained due to platform motion and limited payload stability. Their structural design also complicates multi-sensor integration, particularly when combining high-resolution RGB with LiDAR or thermal payloads in a single mission. These challenges hinder their flexibility in small-scale or heterogeneous environments where rapid manoeuvring and precise altitude control are essential. | [24,25,26,27,31,34]
Rotary-Wing Drones | While rotary-wing UASs excel in manoeuvrability and localised data acquisition, they are significantly constrained by limited battery life, low flight altitudes, and short operational range. These limitations restrict their utility in large-scale landscape studies or in monitoring linear and remote features such as river basins or mountain ridges. Furthermore, their sensitivity to wind conditions and weather variability may reduce flight stability and image quality, particularly for high-resolution mapping tasks. Payload capacity is also a concern, limiting the concurrent use of multiple sensors like LiDAR and thermal cameras. Additionally, the repeated short flights required to cover larger areas increase data fragmentation, post-processing time, and mission planning complexity. | [24,28,29,30,31,32,33]
Hybrid Drones | Although hybrid UASs offer the combined advantages of rotary- and fixed-wing systems, they also inherit key operational trade-offs from both. Their increased mechanical complexity results in higher maintenance requirements and potential failure points, especially in prolonged missions or adverse weather conditions. Hybrid drones generally carry a heavier structural load, reducing their battery efficiency and limiting their flight duration relative to fixed-wing counterparts. Moreover, their platform-specific software, calibration procedures, and multi-sensor payload integration often demand advanced piloting and planning skills. These factors can limit their widespread adoption in rapid-response surveys or low-resource landscape research contexts. Additionally, because hybrid UASs are relatively new to the market, long-term performance data and standardised operational protocols remain limited. | [25,26,27,30,32,33,34,35]
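The battery-life and coverage constraints noted above for rotary-wing platforms can be made concrete with a back-of-envelope mission-sizing calculation. The sketch below is illustrative only: the swath width, side overlap, cruise speed, endurance, and turn margin are assumed values for a generic multirotor, not specifications of any particular airframe discussed in the reviewed literature.

```python
import math

# Back-of-envelope sizing for a lawnmower-pattern multirotor survey.
# All numeric inputs are illustrative assumptions, not vendor specs.

def flights_needed(area_m2, swath_m, side_overlap, speed_ms, endurance_min,
                   turn_margin=0.8):
    """Estimate how many battery cycles a survey of area_m2 requires.

    turn_margin discounts endurance lost to turns, climb, and transit.
    """
    effective_swath = swath_m * (1.0 - side_overlap)   # spacing between flight lines (m)
    total_line_length = area_m2 / effective_swath      # summed length of all lines (m)
    usable_seconds = endurance_min * 60 * turn_margin
    metres_per_flight = speed_ms * usable_seconds
    return math.ceil(total_line_length / metres_per_flight)

# 1 km^2 with an assumed 100 m swath, 70% side overlap,
# 8 m/s cruise, and 25 min endurance:
print(flights_needed(1_000_000, swath_m=100, side_overlap=0.7,
                     speed_ms=8, endurance_min=25))  # -> 4
```

Even under these optimistic assumptions a single square kilometre demands several battery swaps, which illustrates why the table above flags data fragmentation and mission-planning complexity as rotary-wing limitations.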
Table 4. Functional comparison of UAS types in landscape research: applications and technical relevance.
Drone Type | Application Areas | References
Fixed-Wing Drones | Fixed-wing UASs are extensively used in landscape research projects that require efficient coverage of large and remote territories. Their long flight endurance and consistent speed make them ideal for agricultural zoning, watershed delineation, and land degradation monitoring in semi-arid and mountainous regions. They are frequently employed in large-scale crop mapping, forest biomass estimation, and topographic modelling using photogrammetry or multispectral sensors. In post-disaster landscapes, fixed-wing drones enable the rapid assessment of erosion patterns, river dynamics, and terrain deformation. Their high-altitude operation supports strategic regional planning and ecological forecasting, especially in areas where ground-based surveys are impractical. | [24,25,26,27,44,45,46,47,48]
Rotary-Wing Drones | Rotary-wing UASs are particularly effective in localised landscape research requiring high spatial detail, frequent revisit cycles, and access to ecologically complex or confined environments. They are widely used in biodiversity monitoring, wildlife habitat assessment, and vegetation health studies in forests, wetlands, and urban green infrastructure. Their ability to hover and operate at low altitudes makes them ideal for detailed canopy mapping, invasive species detection, cultural heritage site inspection, and monitoring of ecosystem restoration zones. Additionally, their compatibility with thermal cameras allows for targeted applications such as wildlife tracking, evapotranspiration analysis, and urban heat island mapping. These drones also facilitate participatory research and citizen science in landscape stewardship initiatives. | [28,29,30,31,38,49,50]
Hybrid Drones | Hybrid UASs are increasingly employed in interdisciplinary landscape research where both long-range coverage and vertical take-off and landing capabilities are required. Their dual functionality enables comprehensive environmental monitoring in mountainous, forested, and semi-urban regions with limited infrastructure. These drones are especially valuable in ecosystem-level projects combining LiDAR, hyperspectral, and thermal imaging to assess vegetation structure, surface temperature anomalies, and soil conditions in a single flight. They are also utilised in adaptive land use planning, protected area surveillance, and climate resilience modelling. Their operational flexibility makes them well suited for longitudinal studies that demand both high spatial detail and broad spatial extent. | [24,28,32,33,34,35]
Table 5. Technical specifications of UAS-compatible camera types.
Camera Type | Spectral Bands | Spatial Resolution | Typical Weight | Key References
RGB | 3 (R, G, B) | 2–10 cm | <500 g | [91,93]
Multispectral | 4–8 discrete bands | 5–15 cm | 500–1000 g | [91,104,108]
Hyperspectral | 50–200+ contiguous bands | 20–50 cm | >1.5 kg | [105,107,109]
Thermal | 1 (thermal IR) | 10–50 cm | 300–800 g | [94,96,106]
LiDAR | N/A (active laser pulses) | 5–15 cm (vertical) | 2–4 kg | [44,97,109]
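The weight column of Table 5 effectively determines which sensors a given platform can lift. As a minimal sketch, the table can be re-expressed as a lookup structure and screened against a payload budget; the weights below are representative upper-bound figures taken from the table's ranges, not specifications of particular products.

```python
# Table 5 re-expressed as a small lookup so that sensor options can be
# screened against a platform's payload budget. Weights are the table's
# representative upper-bound values (grams), not specific products.

SENSORS = {
    "RGB":           {"bands": 3,   "gsd_cm": (2, 10),  "weight_g": 500},
    "Multispectral": {"bands": 8,   "gsd_cm": (5, 15),  "weight_g": 1000},
    "Hyperspectral": {"bands": 200, "gsd_cm": (20, 50), "weight_g": 1500},
    "Thermal":       {"bands": 1,   "gsd_cm": (10, 50), "weight_g": 800},
    "LiDAR":         {"bands": 0,   "gsd_cm": (5, 15),  "weight_g": 4000},
}

def feasible_sensors(payload_budget_g):
    """Return the sensors whose weight fits within the payload budget."""
    return sorted(name for name, spec in SENSORS.items()
                  if spec["weight_g"] <= payload_budget_g)

print(feasible_sensors(900))   # small multirotor -> ['RGB', 'Thermal']
```

The same structure extends naturally to other columns, for example filtering by required ground sampling distance as well as weight.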
Table 6. Operational considerations and application areas of UAS camera types.
Camera Type | UAS Compatibility | Main Applications | Integration Needs | Key References
RGB | Multirotor, fixed-wing | Visual mapping, land use | Minimal (plug-and-fly) | [91,93]
Multispectral | Multirotor | Vegetation indices, crop monitoring | Radiometric calibration, GNSS sync | [91,104,108]
Hyperspectral | High-end multirotor or fixed-wing | Biochemical analysis, soil diagnostics | Gimbal, external storage, spectral unmixing software | [105,107,109]
Thermal | Multirotor | Water stress, heat islands, wildlife detection | Radiometric correction, optimal flight timing | [94,96,106]
LiDAR | Heavy-lift multirotor or fixed-wing | Terrain modelling, canopy structure, biomass estimation | GNSS/IMU, gimbal, point cloud processing tools | [44,97,109]
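As a minimal illustration of the "vegetation indices" entry for multispectral cameras in the table above, the sketch below computes NDVI from red and near-infrared reflectance bands with NumPy. It assumes the bands have already been radiometrically calibrated and co-registered as equal-shape arrays; the toy values are hypothetical, not survey data.

```python
import numpy as np

def ndvi(red, nir, eps=1e-9):
    """NDVI = (NIR - Red) / (NIR + Red), bounded in [-1, 1].

    eps guards against division by zero over water or shadow pixels.
    """
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (nir - red) / (nir + red + eps)

# Toy 2x2 reflectance patches: healthy vegetation reflects strongly in
# the NIR band, bare soil much less so.
red = np.array([[0.05, 0.06],   # vegetated pixels
                [0.30, 0.28]])  # bare-soil pixels
nir = np.array([[0.60, 0.55],
                [0.32, 0.30]])
print(np.round(ndvi(red, nir), 2))
```

High values in the first row and near-zero values in the second reflect the vegetation/soil contrast, which is why the table lists radiometric calibration as an integration need: uncalibrated digital numbers would distort the ratio.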