Review

UAVs in Urban Blue–Green Infrastructure Management: A Comprehensive Review of Sensors, Methods, and Applications

1 Department of Environmental Management and Protection, Faculty of Geo-Data Science, Geodesy and Environmental Engineering, AGH University of Krakow, 30-059 Krakow, Poland
2 Department of Integrated Geodesy and Cartography, Faculty of Geo-Data Science, Geodesy and Environmental Engineering, AGH University of Krakow, 30-059 Krakow, Poland
3 Department of Photogrammetry, Remote Sensing of Environment and Spatial Engineering, Faculty of Geo-Data Science, Geodesy and Environmental Engineering, AGH University of Krakow, 30-059 Krakow, Poland
* Authors to whom correspondence should be addressed.
Sustainability 2026, 18(6), 3064; https://doi.org/10.3390/su18063064
Submission received: 12 January 2026 / Revised: 18 March 2026 / Accepted: 18 March 2026 / Published: 20 March 2026

Abstract

Urban blue–green infrastructure (BGI), comprising vegetation and aquatic elements, is fundamental to city resilience and climate adaptation. Effective BGI management necessitates high-resolution, spatially accurate data for which Unmanned Aerial Vehicles (UAVs) have emerged as versatile monitoring tools. This study provides a critical synthesis and analytical evaluation of UAV-based technologies for BGI management from 2018 to 2025. Following a PRISMA-guided methodology, the review evaluates dominant research themes, sensor technologies (RGB, multispectral, thermal, LiDAR, and water and air quality sensors), and analytical methods. Departing from traditional descriptive reviews, this study appraises the operational maturity of these technologies using an adapted Technology Readiness Level (TRL) framework. The analysis identifies a significant “maturity gap” between standardized structural mapping (TRL 9) and experimental functional assessments of environmental conditions (TRL 4–6). Notably, the article includes a detailed analysis of specific UAV platforms and sensors, providing specifications of technological capabilities. By identifying critical technical, regulatory, and economic bottlenecks, this review provides a robust, evidence-based foundation for the deployment of drones in enhancing urban resilience and sustainable environmental governance.

1. Introduction

1.1. Importance of Blue–Green Infrastructure in Urban Resilience

Cities worldwide face escalating threats from rapid urbanization and the impacts of climate change [1]. In response, there is an increasing need to shift from conventional ‘gray’ urban infrastructure to adaptive blue–green infrastructure (BGI) solutions. Systematic development, maintenance, and monitoring of BGI are crucial for strengthening urban resilience, quality of life, and sustainability [2,3]. BGI initiatives support the development of ‘regenerative cities,’ in which human–nature interactions aim to facilitate ecological restoration [4]. Urban BGI is a key concept for improving the sustainable development of urban environments [5]. BGI refers to a network of interlinked urban land and water features, located within the boundary of a metropolitan area, that deliver ecological and social benefits: it improves air and water quality, provides flood protection, enhances biodiversity, creates recreational spaces, and supports public health. BGI includes both human-made and natural spaces designed to offer ecosystem services and raise urban living standards [4,6,7]. Urban BGI thus expands the concept of Green Infrastructure (GI) by explicitly adding ‘blue’ water-related elements to the green network. Vegetation elements are a core component of BGI, often classified as GI or Urban Green Spaces (UGS) [6,8]. This category is broad and includes public, formally designated areas such as parks, urban forests, and street trees, as well as informal green spaces. The latter include home gardens, orchards, green roofs, grasslands, agricultural land, post-industrial sites, vacant lots, brownfields, green spaces along railway tracks, reclaimed landfills, riverbanks, and other unmanaged or spontaneous green spaces [8,9,10,11,12]. Green infrastructure is often closely associated with blue infrastructure.
The ‘blue’ elements include a variety of natural, semi-natural, and constructed water bodies, water systems, and other water-related components of the urban environment. These range from river networks, marshes, lakes, floodplains, park and garden ponds, and marine ecosystems to designed systems such as bioswales, rain gardens, canals, aquaculture infrastructure, constructed wetlands, and retention basins [6,13,14,15,16,17,18,19].
Urban planners and decision-makers value BGI mainly for managing extreme weather and mitigating risks such as flash floods and urban heat islands. BGI is also crucial for conserving biodiversity and supporting ecosystems [20,21]. This shift towards blue–green infrastructure is increasingly reflected in national and supranational policy frameworks. As more than half of the global population now resides in urban areas, environmental and planning policies place growing emphasis on the role of urban BGI in delivering ecosystem services and enhancing urban resilience [5,22]. In many major economies, BGI has moved from being a supplementary planning element to a strategic component of urban policy. In the United States, this approach is exemplified by the Environmental Protection Agency’s Green Infrastructure Program. The programme frames BGI as a practical tool for stormwater management and climate adaptation in urban areas. Beyond technical solutions, it explicitly incorporates social dimensions. These include environmental education and support for community-led initiatives focused on creating and maintaining BGI. In this way, the programme links environmental performance with improvements in residents’ quality of life [14,23].
Within the European Union, BGI has been embedded more strongly at the strategic policy level. The Green Infrastructure Strategy promotes the systematic integration of BGI across multiple sectoral policies. It emphasises BGI’s role in nature restoration and biodiversity enhancement. In urban contexts, the Strategy identifies the protection, restoration, and creation of BGI as core elements of spatial planning [2]. These objectives are further reinforced by the EU Biodiversity Strategy for 2030. This policy encourages Member States to mainstream BGI and nature-based solutions in urban development and provides a framework for targeted investment [21]. A different but complementary policy pathway can be observed in China. There, the concept of ‘Sponge Cities’ has been implemented through state-led programmes. This approach operationalises BGI through measures such as rain gardens, constructed wetlands, and permeable surfaces. These solutions aim to manage urban rainfall, reduce flood risk, and improve water quality. Evaluations of pilot projects initiated in 2014 show how green infrastructure solutions are translated into large-scale urban practice. They also illustrate how implementation is shaped by evolving regulatory and planning frameworks [24,25]. Despite shared objectives related to climate adaptation and urban resilience, these policy approaches differ substantially in their modes of governance, degrees of institutional centralisation, and mechanisms of stakeholder engagement. These differences have important implications for the effectiveness and transferability of BGI solutions across diverse urban and political contexts.

1.2. Challenges in Management and Monitoring of Urban BGI

The importance of monitoring and managing BGI lies in its multi-faceted contribution to urban resilience, economic efficiency, and public well-being. From an economic perspective, rigorous monitoring enables precise cost–benefit analyses, demonstrating that investments in BGI are often more sustainable than traditional ‘gray’ infrastructure [26]. The significance of BGI monitoring lies in its role as a fundamental strategy for climate change adaptation and sustainable urban expansion. As global urban populations rise, cities face escalating threats from pollution and resource depletion, rendering nature-based solutions indispensable for maintaining the urban quality of life [22,27]. However, the long-term provision of essential ecosystem services—such as carbon sequestration, biodiversity conservation, and hydrological regulation—cannot be guaranteed without rigorous, systematic management [12,28,29]. While Urban BGI represents a comprehensive solution that harmonizes social and environmental objectives, its ultimate success is predicated on maintaining its operational service life through active, data-driven oversight. Furthermore, as cities face intensifying climate-induced hazards, monitored BGI serves as a critical mechanism for disaster risk reduction, particularly by mitigating stormwater runoff and flood risks [30,31]. Ultimately, robust management frameworks ensure that BGI continues to deliver the ecosystem services required to create livable, climate-resilient environments that address both the environmental stability and the mental health needs of urban populations.
Comprehensive monitoring of BGI must integrate three core pillars: hydrological efficiency, ecological health, and climatic impact. Assessing the hydrological performance of BGI requires understanding antecedent conditions, specifically soil moisture and storage levels, that dictate how the system functions during a storm event [32,33]. To mitigate extreme weather, monitoring frameworks must prioritize measuring key metrics, including retention capacity, detention times, and outflow intensities [34]. Simultaneously, monitoring vegetation health is paramount, as the survival and vitality of plants directly govern their capacity to accumulate pollutants and regulate urban microclimates through evapotranspiration and shading [35,36,37]. These cooling mechanisms are increasingly vital for mitigating the Surface Urban Heat Island (SUHI) effect, particularly amid escalating climate-induced urban hazards [38,39]. To address these challenges effectively, an integrative approach is required that combines remote sensing data with the needs of the local community and with analysis of BGI ecosystem services. In urban ecosystems, it is essential to identify strategic areas within the spatial mosaic of the city where maintaining a healthy BGI ratio is crucial to ensuring an appropriate standard of living for residents. The management of urban BGI may encounter institutional, social, and technical barriers, whose nature and extent usually depend on levels of environmental awareness and economic development. In many countries, educating local communities and decision-makers to change perceptions of BGI in urban asset management remains crucial. Only broad access to BGI data provides a comprehensive overview of environmental and social performance. Such evidence-based knowledge is instrumental in shaping social attitudes and expectations [40], while ensuring the inclusion of BGI in integrated urban planning, a prerequisite for sustainable BGI governance [41,42].
The management and monitoring of urban blue–green infrastructure face critical hurdles stemming from spatial heterogeneity, temporal dynamics, and constraints on data accuracy. A significant challenge lies in implementing systematic, long-term monitoring programs, which are essential for evaluating the performance of BGI elements such as street trees and green roofs [43]. A primary technical challenge is the spatial fragmentation of the diverse urban environment, which necessitates high-resolution mapping to capture small-scale features. Widely used analytical tools, such as Geographic Information Systems (GIS) and satellite remote sensing, often struggle with spectral and temporal variability and the inherent heterogeneity of urban land cover. Freely accessible remote sensing imagery is typically characterized by medium spatial resolution [44]. Specifically, Landsat data maintain a 30 m resolution [45,46,47], whereas Sentinel-2 offers a finer 10 m resolution for its primary bands [48,49]. This creates a significant disparity in observational scale: a single Sentinel-2 pixel covers an area of 100 m², while a Landsat pixel encompasses 900 m². At a 30 m resolution, distinct environmental features often blur or aggregate into a single data point, a phenomenon commonly referred to as the ‘mixed pixel’ effect. When a single pixel represents a heterogeneous mix of vegetation, small water bodies, and impervious surfaces (such as concrete), the analysis may yield biased results on the cooling efficiency of BGI. Furthermore, such resolutions are often insufficient to detect small-scale urban features, including pocket parks or rain gardens [38,44,48,50]. Despite these spatial limitations, a distinct advantage of Landsat imagery is the inclusion of two thermal infrared bands, which are essential for assessing the impact of BGI on mitigating the Urban Heat Island (UHI) effect [51,52,53].
Consequently, given these constraints, satellite imagery remains best suited for continuous, large-scale monitoring of urban BGI in analyses where fine spatial detail is not a primary requirement.
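The scale disparity described above can be made concrete with a short calculation of the ground area represented by a single pixel at different resolutions; this is a minimal sketch, and the ~3 cm UAV ground sampling distance is an illustrative value rather than a figure drawn from the reviewed studies:

```python
# Ground area covered by a single square pixel for a given ground sampling distance (GSD).
def pixel_area_m2(gsd_m: float) -> float:
    """Area on the ground (in m^2) represented by one pixel with side gsd_m metres."""
    return gsd_m ** 2

landsat = pixel_area_m2(30.0)    # Landsat: 30 m resolution -> 900 m^2 per pixel
sentinel2 = pixel_area_m2(10.0)  # Sentinel-2 primary bands: 10 m -> 100 m^2 per pixel
uav = pixel_area_m2(0.03)        # illustrative UAV survey at ~3 cm GSD -> 0.0009 m^2

# How many UAV-resolution pixels fit inside one satellite pixel?
print(f"Landsat pixel:    {landsat:.0f} m^2 (~{landsat / uav:,.0f} UAV pixels)")
print(f"Sentinel-2 pixel: {sentinel2:.0f} m^2 (~{sentinel2 / uav:,.0f} UAV pixels)")
```

The calculation illustrates why sub-decimetre UAV imagery resolves features such as pocket parks or rain gardens that vanish inside a single 30 m satellite pixel.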
Unmanned Aerial Vehicles (UAVs) offer a viable solution to these challenges. Under the regulatory framework of the European Union, UAVs (also referred to as drones or unmanned aircraft) are defined as ‘any aircraft operating or designed to operate autonomously or to be piloted remotely without a pilot on board’ [54]. Definitions of nearly identical scope are provided by the US Federal Aviation Administration [55] and the Civil Aviation Administration of China [56]. Furthermore, these regulatory bodies emphasize that, beyond the aircraft itself, executing a mission requires a comprehensive Unmanned Aircraft System (UAS), which encompasses the aircraft as well as the supporting network, the remote pilot station, and all other equipment and personnel necessary to control the unmanned aircraft [54,55,57,58]. In environmental research, UAVs serve as versatile platforms for transporting and deploying diverse sensor payloads. Implementing UAV technology may streamline the monitoring and management of complex, mosaic-like urban BGI systems: these aerial platforms make remote or restricted locations accessible and deliver precise, high-resolution spatial data. Global utilization of these systems is expanding across diverse scientific and industrial sectors, with wide-scale application already evident in precision agriculture and forestry, where drone integration effectively minimizes operational costs while optimizing the efficiency of sustainable management [59,60,61,62]. Hence, integrating UAV technology into the management of urban BGI represents a logical and necessary progression for modern environmental governance.
Although existing literature frequently catalogs drone applications within precision agriculture and commercial forestry, syntheses dedicated specifically to the urban Blue-Green Infrastructure domain remain relatively scarce. In light of this gap, the primary objective of this review is to provide an analytical assessment of current UAV technology for urban BGI monitoring. To provide distinct added value beyond a standard descriptive compilation, this manuscript shifts toward a management-oriented perspective. A central component of this approach is the evaluation of recent (2018–2025) hardware, sensor arrays, and AI-driven processing advances through a structured Technology Readiness Level (TRL) framework. Applying the TRL scale allows for a more rigorous determination of operational maturity, explicitly distinguishing experimental or proof-of-concept payloads from methods that are robust enough for routine municipal deployment. By systematically examining these technological readiness levels alongside operational constraints, regulatory barriers, and data-processing trade-offs, this review seeks to bridge the disconnect between remote sensing engineering and urban environmental governance. Ultimately, by linking technological maturity with practical oversight needs, the article identifies underlying methodological bottlenecks and outlines a consolidated roadmap for transitioning UAV-based BGI interventions from isolated pilot studies to standardized operational practices, thereby supporting broader urban environmental governance and climate resilience strategies.

2. Methodology

2.1. Research Design and Review Framework

This study adopts a scoping review methodology informed by the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines [63], aiming to ensure transparency and reproducibility in the literature search and screening process rather than full compliance with the 27-item PRISMA checklist. The PRISMA framework was selectively applied to structure the identification, screening, and eligibility stages of the literature selection process, including database selection, search string formulation, and exclusion criteria. This methodological choice reflects the interdisciplinary and heterogeneous nature of UAV-based research in urban blue–green infrastructure, where studies differ substantially in objectives, spatial scales, sensor configurations, analytical methods, and reported outcomes. Under such conditions, applying a full PRISMA protocol would limit the inclusion of relevant methodological contributions and could introduce artificial comparability across fundamentally different study designs. Consequently, a PRISMA-informed scoping review was considered the most appropriate approach to critically synthesise current technologies and trends, and to evaluate the operational maturity and limitations of UAV-based approaches in urban BGI management. The review focuses on the use of unmanned aerial vehicles for monitoring, managing, and maintaining urban blue–green infrastructure. Both review articles and original research papers, as well as case studies, are included to capture methodological developments, empirical evidence, and applied implementations. The scope encompasses UAV-based acquisition of optical, multispectral, hyperspectral, thermal, and spatial data, as well as advanced data processing approaches including artificial intelligence (AI), machine learning (ML), and computer vision techniques. In addition, the review considers non-imaging UAV applications, such as environmental sampling and targeted interventions relevant to urban BGI. 
To address the operational readiness and maturity concerns associated with diverse UAV-based monitoring techniques, this review employs the Technology Readiness Level (TRL) framework as a primary analytical tool. Originally conceptualized by the National Aeronautics and Space Administration (NASA) to provide a standardized metric for evaluating the maturity of evolving technologies [64,65], the TRL scale enables a systematic comparison of technical solutions regardless of their specific engineering domain.

2.2. Data Acquisition and Search Strategy

The primary search was conducted exclusively using the Scopus database, which provides comprehensive coverage of peer-reviewed technical and environmental literature. Restricting the search to a single comprehensive database ensured methodological transparency, reproducibility, and consistency in metadata within a rapidly evolving, highly interdisciplinary research field. Scopus offers broad coverage across environmental sciences, urban studies, and engineering, which aligns with the multidisciplinary nature of UAV-based research on urban blue–green infrastructure. The Scopus database provides transparent indexing rules, stable bibliographic records, and advanced query tools that enable exact replication of search strategies. The final bibliometric survey covered the period from January 2018 to December 2025. The search strategy combined Boolean operators (AND/OR) with specific thematic keywords. To ensure comprehensive coverage of the interdisciplinary research field, multiple logical search strings were developed and executed separately in Scopus. The core query referenced unmanned aerial vehicles in combination with urban blue-green infrastructure. Furthermore, the search included management, monitoring, and assessment (Table 1). These queries were combined using the ‘Combine Search’ function with the AND operator. This multi-query strategy enables sensitivity analysis across a broader range of UAV-BGI applications. The search strategy employed Scopus’s ‘Combine Search’ functionality to integrate multiple query strings within a single controlled database environment, thereby ensuring internal consistency of results and avoiding redundancy at the database level. It is acknowledged that this approach does not capture all potentially relevant publications indexed elsewhere; however, the objective of this scoping review was not exhaustive coverage, but the identification of dominant research patterns and methodological approaches. 
Inclusion criteria were defined to capture peer-reviewed journal articles written in English and explicitly addressing UAV-based applications relevant to urban blue–green infrastructure. No formal quality appraisal or risk-of-bias assessment was conducted, as the review aims to map technological diversity rather than to hierarchically rank evidence strength.
The literature search was initiated using three predefined search strings (Table 1). The initial query returned 1257 records. The earliest publication identified through this search was published in 2004. However, an examination of the thematic scope of the oldest records indicated that the first study directly relevant to the objectives of this review dates back to 2012. This study addressed the use of UAV imagery for tree species classification in an urban park environment [66].
Analysis of the temporal distribution of the retrieved publications revealed that research on the application of UAVs for monitoring and managing blue–green infrastructure began to expand more rapidly approximately a decade ago (Figure 1). This trend coincides with the accelerated development of UAV platforms and onboard sensing technologies. To focus on recent applications and current research challenges, the subsequent search was restricted to the period 2018–2025. During this period, substantial advances were made in both UAV hardware and sensor capabilities. After applying the temporal constraint, the number of retrieved records was reduced to 1103 (Figure 2).
In the next step, the search was limited to peer-reviewed journal articles and conference publications. This refinement resulted in a final dataset of 1067 records, including 672 journal articles, 302 conference papers, 63 conference reviews, and 30 review papers.
The screening of these records was then conducted in two successive stages.
  • Stage 1: Title screening
To further narrow the dataset’s scope, an automated title screening was performed. At this stage, explicit reference to UAV-related technology was required in the article title. To achieve this, Search String 1 (UAV terminology) was restricted exclusively to the title field. This procedure reduced the dataset to 492 records.
  • Stage 2: Title and abstract screening
The remaining publications were then assessed individually based on their titles and abstracts. At this stage, 366 documents were excluded according to the following criteria:
  • The study addressed environmental monitoring issues but focused on non-urban ecosystems, such as forests, water bodies, protected areas (e.g., national parks), or wildlife inventories (e.g., crocodiles, deer, elephants).
  • The study concerned UAV applications in urban areas but focused primarily on grey infrastructure, such as roads or parking facilities.
  • The use of UAVs was marginal to the core topic, for example, studies centred on land use/land cover classification or vegetation mapping at spatial scales exceeding the urban context.
A total of 126 articles that passed the title and abstract screening were retrieved in full text and subjected to detailed analysis (Figure 2). The selected publications were examined to extract standardised information related to UAV platform specifications, sensor types, data processing workflows, and the specific application domains in which UAVs were used for monitoring and managing urban blue–green infrastructure.
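The record counts reported across the successive screening stages can be cross-checked with a short bookkeeping script; this is a minimal sketch that simply reproduces the figures stated above:

```python
# Bookkeeping of the PRISMA-informed screening stages described in this section.
records = {
    "initial_query": 1257,           # all records returned by the combined queries
    "after_2018_2025_filter": 1103,  # temporal restriction to 2018-2025
    "after_doctype_filter": 1067,    # journal articles and conference publications only
    "after_title_screening": 492,    # Stage 1: UAV terminology required in the title
    "excluded_on_abstract": 366,     # Stage 2: title/abstract exclusions
}

by_type = {"journal_articles": 672, "conference_papers": 302,
           "conference_reviews": 63, "review_papers": 30}

# The document-type counts must sum to the filtered dataset ...
assert sum(by_type.values()) == records["after_doctype_filter"]

# ... and Stage 2 exclusions leave the 126 full-text articles analysed in detail.
full_text = records["after_title_screening"] - records["excluded_on_abstract"]
print(f"Articles retained for full-text analysis: {full_text}")  # 126
```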
Despite the systematic approach adopted in this review, several limitations should be acknowledged. First, the literature search was restricted to publications indexed in the Scopus database and to documents written in English, which may have led to the omission of relevant studies published in other languages or in regional journals. Second, the applied search strings focused explicitly on UAV-related terminology and urban blue–green infrastructure, potentially excluding studies addressing similar applications under alternative terminology or broader conceptual frameworks. Third, the temporal restriction to the period 2018–2025 may have limited the inclusion of some earlier studies. However, it is unlikely that the exclusion of earlier publications resulted in the omission of any key themes related to the use of UAVs in the management and monitoring of urban blue–green infrastructure.

2.3. UAV Technology Readiness Level (TRL) Framework

In the context of this study, the TRL framework is adapted to assess the implementation feasibility of UAV platforms and sensors within urban Blue-Green Infrastructure (BGI) management. Drawing on the principles outlined in the NASA Technology Readiness Assessment (TRA) guidelines [65] and the foundational work on technology maturity life cycles [64], the applied scale is categorized into four consolidated clusters:
  • Conceptual and Preliminary Research (TRL 1–3): This phase encompasses basic scientific research, initial proof-of-concept validations, and the formulation of technological applications. It identifies nascent methods that have yet to be tested in representative environments.
  • Experimental and Pilot Validation (TRL 4–6): This stage represents technologies demonstrated and validated in relevant environments. In the BGI context, this includes pilot studies conducted in urban parks, riparian zones, or managed green spaces, where the system is tested against realistic operational constraints.
  • Operational Qualification and Demonstration (TRL 7–8): At this level, systems or models have been completed and qualified through rigorous testing in actual operational environments, such as municipal BGI management workflows or integrated urban monitoring programs.
  • Full Operational Maturity and Standardization (TRL 9): This final stage denotes actual systems proven through successful mission operations. Technologies at TRL 9 are commercially available, standardized, and possess a high degree of reliability for large-scale urban deployment.
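The four consolidated clusters can be expressed as a simple lookup that assigns any TRL score to its cluster; this is a minimal sketch of the classification scheme described above, with an illustrative function name:

```python
def trl_cluster(trl: int) -> str:
    """Map a Technology Readiness Level (1-9) to the consolidated cluster used in this review."""
    if not 1 <= trl <= 9:
        raise ValueError("TRL must be between 1 and 9")
    if trl <= 3:
        return "Conceptual and Preliminary Research (TRL 1-3)"
    if trl <= 6:
        return "Experimental and Pilot Validation (TRL 4-6)"
    if trl <= 8:
        return "Operational Qualification and Demonstration (TRL 7-8)"
    return "Full Operational Maturity and Standardization (TRL 9)"

# Example: the 'maturity gap' noted in the Abstract spans these two clusters.
print(trl_cluster(9))  # standardized structural mapping
print(trl_cluster(5))  # experimental functional assessment
```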
By mapping the technologies and areas of UAV application in urban BGI monitoring presented in the analyzed literature against these benchmarks, the review provides a critical appraisal of the current “maturity gap” in BGI monitoring. This facilitates the identification of technologies ready for immediate implementation and those that require further field validation.

3. Application Domains of UAVs in Urban BGI

The PRISMA-based literature review revealed a diverse but clearly structured body of research on the use of UAVs for monitoring and managing urban blue–green infrastructure. The analysed publications were grouped into six thematic categories reflecting dominant application domains and research perspectives (Figure 3). To maintain a robust and mutually exclusive framework for quantitative synthesis, each publication was assigned to a unique category representing its core application domain. While several studies exhibited cross-disciplinary characteristics, they were categorized according to their predominant thematic scope and primary study focus. This hierarchical approach allows for a clear identification of research trends while highlighting the current scarcity of integrated, system-level analyses that address the synergistic interactions between diverse urban BGI components. The distribution of articles across these categories indicates that UAV-based studies have so far focused primarily on vegetation-related components of green infrastructure (urban parks/forests, individual trees) and water quality monitoring within blue infrastructure. Furthermore, the analysis evaluates the operational strengths and limitations of specific UAV-based applications (Table 2). This critical assessment identifies key logistical, technical, and environmental constraints that influence the practical implementation of drone technology in complex urban settings. The following subsections synthesise the thematic scope of the reviewed literature and highlight how UAV technologies are applied across different dimensions of urban BGI monitoring and management.

3.1. UAV Applications in the Monitoring and Management of Green Infrastructure (GI)

The largest group of analysed publications focuses on the use of UAVs for monitoring and managing urban green infrastructure, accounting for approximately 39% of the reviewed studies (Figure 3). This body of literature is dominated by applications related to vegetation structure characterisation [67,68,69,70,71], biomass estimation [72,73], and the assessment of ecosystem services (e.g., reducing urban heat island effect [67]) provided by urban green spaces. UAV platforms equipped with RGB, multispectral, and LiDAR sensors are widely used to derive high-resolution three-dimensional information [74,75], vegetation height [76], leaf area index [77], and carbon storage [78,79] at the level of individual trees or entire parks (Table 2). Advanced data processing techniques, including machine learning [79,80] and deep learning-based semantic segmentation [81,82], are frequently employed to support automated vegetation classification, species identification, and change detection. These studies demonstrate that UAVs enable fine-scale, spatially explicit monitoring of urban vegetation that is difficult to achieve using satellite or ground-based methods alone, thereby supporting evidence-based planning and management of green infrastructure.
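Vegetation indices such as NDVI, one of the typical outcomes of these GI studies, are computed per pixel from multispectral bands as (NIR − Red)/(NIR + Red). A minimal NumPy sketch follows; the band values are illustrative reflectances, not data from the reviewed studies:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    # Return 0 where both bands are zero (e.g. no-data pixels) to avoid division by zero.
    return np.where(denom == 0, 0.0, (nir - red) / np.where(denom == 0, 1.0, denom))

# Illustrative 2x2 reflectance patches: dense canopy (high NIR) versus bare surfaces.
nir = np.array([[0.60, 0.55], [0.30, 0.10]])
red = np.array([[0.10, 0.12], [0.20, 0.10]])
print(np.round(ndvi(nir, red), 2))
```

Values near +1 indicate vigorous vegetation, values near 0 bare or impervious surfaces, which is what makes NDVI a convenient per-pixel proxy for tree health classes at UAV resolutions.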
Table 2. Technical overview of UAV-based sensing platforms and their application domains within urban BGI frameworks.

Category: Green Infrastructure (GI)
Dominant UAV sensors: RGB cameras; multispectral cameras; hyperspectral cameras; thermal cameras; LiDAR
Main application domains: vegetation structure analysis; biomass estimation; carbon sequestration estimation; species classification; vegetation change detection; ecosystem service assessment
Operational strengths: ultra-high spatial resolution allows for individual tree crown (ITC) analysis; ability to perform multi-temporal phenological monitoring
Operational limitations: sensitivity to illumination and weather conditions; requirement for rigorous radiometric calibration; high computational costs associated with point cloud processing
Typical outcomes: vegetation indices (e.g., NDVI); Canopy Height Model (CHM); Leaf Area Index (LAI); tree health class; biomass measurement; carbon storage estimation
Representative articles: [67,68,69,70,71,72,73,74,76,77,78,79,80,81,82]

Category: Blue Infrastructure (BI)
Dominant UAV sensors: RGB cameras; multispectral cameras; hyperspectral cameras; thermal cameras; LiDAR; in situ sensors; water sampling systems
Main application domains: identification of water bodies; water quality change monitoring; eutrophication monitoring; algal bloom detection; water temperature estimation; water volume estimation; water pollution analysis; water sampling
Operational strengths: facilitates access to hard-to-reach reservoirs or riparian zones; enables high-precision mapping of localized contaminant plumes
Operational limitations: monitoring limited to the surface layer only; errors may result from sun glint and specular reflection; depth of signal penetration depends on turbidity
Typical outcomes: turbidity (NTU); chlorophyll-a; total suspended solids (TSS); surface water temperature; electrical conductivity (EC); dissolved oxygen (DO); pH; phosphates; nitrites
Representative articles: [83,84,85,86,87,88,89,90,91,92]

Category: BGI and the urban thermal environment
Dominant UAV sensors: RGB cameras; multispectral cameras; thermal cameras
Main application domains: urban heat island analysis; cooling effect of vegetation and water features; heat stress assessment
Operational strengths: measurement of micro-scale temperature dynamics in “urban canyons”; precise identification of heat-leaking hotspots
Operational limitations: lower resolution of thermal sensors compared to RGB; atmospheric humidity interference; uncertainty in surface emissivity values
Typical outcomes: Land Surface Temperature (LST); thermal comfort indices; Sky View Factor (SVF); heat island intensity
Representative articles: [93,94,95,96,97,98,99,100,101,102]

Category: Human–Nature interactions in BGI
Dominant UAV sensors: RGB cameras; video imaging cameras
Main application domains: behavior mapping; park use intensity monitoring; spatial activity pattern monitoring; recreational impact assessment
Operational strengths: non-invasive data collection (at appropriate altitudes); relatively low-cost acquisition of time-varying data in selected urban spaces
Operational limitations: potential acoustic disturbance to humans and fauna; canopy occlusion limits the detection of ground features
Typical outcomes: temporally variable data on the ways and intensity of use of specific urban spaces
Representative articles: [103,104,105,106,107,108]

Category: Air quality monitoring related to BGI
Dominant UAV sensors: gas sensors; PM sensors; RGB cameras; air sampling systems
Main application domains: monitoring of emitted pollutants; vertical profiling of pollutants; assessment of the spatial variability of air quality; mapping pollutant concentrations in residential areas; control of emissions from chimneys; interaction between vegetation and pollutant dispersion
Operational strengths: ability to generate vertical concentration profiles; high mobility allowing real-time tracking of point-source emission plumes
Operational limitations: influence of the propeller downwash effect on the sensors; need to mount sensors on an extension arm; reduced flight time with heavier chemical sensors
Typical outcomes: information on emissions, e.g., suspended particulate matter (PM10 and PM2.5); vertical pollutant gradients of carbon monoxide (CO), nitrogen dioxide (NO2), ammonia (NH3), carbon dioxide (CO2), and methane (CH4)
Representative articles: [109,110,111,112,113,114,115,116,117]

Category: Miscellaneous applications
Dominant UAV sensors: RGB cameras; multispectral cameras; LiDAR
Main application domains: land cover change detection; land–water interface monitoring; integrated blue–green–grey infrastructure mapping
Operational strengths: relatively low-cost provision of highly accurate, up-to-date data
Operational limitations: possible stress caused to wildlife by propeller noise
Typical outcomes: maps of current land use; visualisation of the temporal variability of littoral areas
Representative articles: [118,119,120,121,122]

3.2. UAV Applications in the Monitoring and Management of Blue Infrastructure (BI)

Approximately 30% of the analysed articles address UAV-based monitoring of blue infrastructure (Figure 3), with a strong emphasis on urban water quality assessment. These studies primarily focus on small and medium-sized water bodies, such as rivers, reservoirs, lakes, and aquaculture ponds, where traditional monitoring approaches are often spatially limited or logistically demanding. High-resolution images obtained using UAVs enable accurate identification of water bodies in complex urban environments [123]. UAVs equipped with multispectral and hyperspectral sensors are used to retrieve key water quality parameters, including turbidity [83,91], suspended solids [90,124], dissolved oxygen [84,87,88], chemical oxygen demand [84,88,91], nutrients [83,84,88,91], chlorophyll-a [83,89], and indicators of eutrophication [92] (Table 2). Machine learning algorithms play a central role in parameter inversion and predictive modelling, enabling the translation of spectral information into quantitative water quality metrics [83,85,86,92]. In addition to remote sensing-based approaches, some studies integrate UAV platforms with in situ sensor systems to enable real-time or near-real-time monitoring [125,126,127,128]. Collectively, this literature highlights the potential of UAVs as flexible tools for supporting operational management and early warning systems in urban blue infrastructure.
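The inversion principle behind these water quality studies can be sketched minimally: a regression model is fitted between band reflectances and co-located field measurements, then applied pixel-wise. The sketch below uses a plain linear least-squares fit on synthetic data (the reviewed studies employ more sophisticated machine learning models); the band composition, turbidity relation, and all values are illustrative only.

```python
# Minimal sketch of empirical water-quality inversion: a linear model mapping
# UAV band reflectances to turbidity. Purely illustrative synthetic data.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration set: green, red, NIR reflectance at 100 sample points,
# with turbidity (NTU) loosely tied to the red/green reflectance ratio.
bands = rng.uniform(0.02, 0.30, size=(100, 3))          # columns: G, R, NIR
turbidity = 5.0 + 80.0 * bands[:, 1] / (bands[:, 0] + bands[:, 1])
turbidity += rng.normal(0.0, 1.0, size=100)             # sampling noise

# Design matrix: intercept plus the three band reflectances.
X = np.column_stack([np.ones(len(bands)), bands])
coef, *_ = np.linalg.lstsq(X, turbidity, rcond=None)

# Apply the fitted model to a new pixel's reflectances.
new_pixel = np.array([1.0, 0.08, 0.12, 0.05])
print(f"predicted turbidity: {new_pixel @ coef:.1f} NTU")
```

In practice, the reviewed studies replace the linear fit with ensemble or deep learning regressors and validate against independent in situ samples.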

3.3. UAV Applications Linking BGI with Thermal Environment and Urban Climate Processes

A smaller but coherent group of studies (nearly 6%) investigates UAV applications at the interface between blue–green infrastructure and urban thermal conditions (Figure 3). The studies mainly focus on evaluating [93,95,102] or mitigating [99] the urban heat island effect. These articles predominantly employ UAV-based thermal imaging (infrared sensors—TIR) to assess surface temperature patterns [97], cooling effects of vegetation and water features [93,98,101], and spatial variability of urban heat stress [100] (Table 2). UAV observations are often combined with object-based image analysis or machine learning techniques to quantify the cooling performance of parks, green roofs, stormwater features, and vegetated corridors [94,96]. The reviewed studies demonstrate that UAVs provide valuable high-resolution data for evaluating the microclimatic functions of BGI, supporting urban climate adaptation strategies and the optimisation of nature-based cooling interventions.

3.4. UAV Applications in Analysing Human–Nature Interactions Within Urban BGI

Approximately 7% of the reviewed publications focus on the use of UAVs to study human–nature interactions in urban blue–green infrastructure (Figure 3). This literature explores how spatial characteristics of parks and green spaces influence patterns of human behaviour, physical activity, and recreational use [105,106,107,108]. UAV imagery is applied to map pedestrian movement, space occupancy, and user density, often in combination with behavioural observation frameworks and deep learning-based detection methods [103,104,105] (Table 2). These studies demonstrate the potential of UAVs to support evidence-based design and management of public green spaces by linking physical characteristics of BGI with social use patterns. At the same time, authors highlight methodological and ethical considerations related to privacy and data interpretation in UAV-based behavioural studies.

3.5. UAV Applications in Air Quality Monitoring Related to Urban BGI

Another 6% of the analysed articles address UAV-based air quality monitoring in urban environments (Figure 3), often in relation to the modifying role of blue–green infrastructure. These studies typically employ UAV-mounted sensor packages to measure the vertical and horizontal distributions of air pollutants, including particulate matter [110,111,114], carbon dioxide [112], ozone, and nitrogen dioxide [109] (Table 2). UAVs are used to capture fine-scale spatial gradients and vertical profiles that are difficult to observe using fixed monitoring stations. Although BGI is not always the primary focus, these studies provide important insights into how urban vegetation and surface characteristics influence pollutant dispersion and exposure patterns. The reviewed literature suggests that UAV-based air quality monitoring can complement traditional networks and contribute to integrated assessments of environmental performance in BGI-rich urban areas [113,115].

3.6. Miscellaneous UAV Applications Related to Urban BGI

The remaining 7% of publications fall into a heterogeneous category addressing miscellaneous or integrative applications (Figure 3). These studies include UAV-based land cover change detection [118,119,120], combined monitoring of land–water interface zones [121], and integration of UAV data with terrestrial laser scanning [129] (Table 2). While these contributions do not always focus explicitly on BGI management, they demonstrate the versatility of UAV platforms and their capacity to support multi-scale, multi-component analyses of urban environments. This category also reflects emerging research directions in which UAV data are increasingly integrated with other geospatial technologies to support holistic urban environmental assessments.
Evaluating the scope of existing applications points to a noticeable compartmentalization in BGI monitoring. Current drone-based methodologies show considerable sophistication when targeting isolated environmental variables, whether that involves extracting individual tree canopies or calculating surface water turbidity. Nevertheless, a clear gap persists regarding studies that approach BGI as a unified, interacting ecological network. For example, while aerial thermal mapping is routinely used to measure the localised cooling provided by urban greenery, researchers rarely combine these observations with concurrent water-quality data from adjacent aquatic features. Treating these domains separately restricts the capacity to evaluate how terrestrial and water systems mutually reinforce one another. Overcoming this methodological divide appears necessary to support integrated and effective climate adaptation strategies in urban planning.

4. Overview of UAV-Mounted Sensors for Blue–Green Infrastructure (BGI) Monitoring

Unmanned aerial vehicles have become an important platform for environmental sensing in urban areas. Their flexibility, high spatial resolution, and ability to operate below cloud cover make them particularly suitable for monitoring urban blue–green infrastructure. UAV-mounted sensors enable the acquisition of detailed spatial and temporal information on vegetation, surface water bodies, and land–water interfaces, which are difficult to observe using satellite or ground-based methods alone. In the context of urban BGI, UAV sensors support both diagnostic assessments and operational management by providing timely, site-specific data (Table 3). The methodological framework for the architecture of data acquisition and processing using unmanned aerial vehicles (UAVs) for multidimensional BGI analysis of urban areas is shown in Figure 4.

4.1. Red Green Blue (RGB) Optical Sensors

Digital cameras operating in the visible spectrum (RGB) remain the primary tool for high-resolution monitoring of urban BGI due to their cost-effectiveness and ease of integration with lightweight UAV platforms. RGB cameras use a color filter array to capture red (620–750 nm), blue (450–495 nm), and green (495–570 nm) wavelengths [130]. This data is processed into a two-dimensional, full-color image that represents the environment. Modern UAV-mounted RGB sensors provide sub-decimeter spatial resolution, which is essential for the detailed identification of fine-scale urban features [131,132,133]. The obtained resolution depends on the focal length and flight altitude [134]. Images captured by RGB cameras are suitable for mapping land cover [67,119,135], vegetation structure [131,136,137,138,139], and surface features [140,141,142]. In urban BGI applications, RGB data are commonly used for tree crown delineation [74,76,143,144], canopy cover estimation [44,145,146], park [66,108] and green roof [133] applications, and for visual inspection of blue infrastructure elements [89,123] (Figure 5).
In the context of BGI management, RGB imagery is extensively used to create high-density 3D point clouds and Digital Surface Models (DSMs). These spatial products enable the precise assessment of tree height [76], canopy volume [72,80,129], and the structural integrity of blue infrastructure elements, such as retention pond embankments. Furthermore, despite the lack of near-infrared bands, high-resolution RGB data allow for the calculation of specialized indices, such as the Visible Atmospherically Resistant Index (VARI) [147,148] and the Triangular Greenness Index (TGI) [149,150,151], which are effective for monitoring urban lawn health and detecting chlorophyll degradation in ornamental street trees. Due to their low cost and ease of deployment, RGB sensors are often used as a baseline data source or combined with other sensors in multi-sensor workflows. However, their limited spectral information restricts the direct retrieval of biophysical parameters. Further limitations stem from adverse meteorological conditions that directly affect the effectiveness and reliability of data acquisition: high winds or precipitation impose physical constraints on flight operations. Furthermore, solar-induced fluctuations in the Earth’s magnetic field, quantified by the Planetary K-index, may significantly impact mission stability. High Kp-index values can lead to GNSS positioning errors, intermittent satellite connectivity, or complete signal loss. Beyond flight safety, data integrity is susceptible to atmospheric interference; factors such as fog, smoke, or haze reduce visibility, thereby degrading the quality and interpretability of the acquired RGB imagery. In addition to atmospheric and safety constraints, RGB sensors possess inherent radiometric limitations that affect data consistency.
Most consumer-grade RGB cameras lack a downwelling light sensor, making radiometric calibration across different flight missions difficult, especially under variable cloud cover. In complex urban environments, high-reflectance surfaces (e.g., water surface, concrete, glass) often cause sensor saturation, while deep shadows in urban canyons lead to significant loss of detail in BGI areas. Furthermore, the broad spectral sensitivity of RGB bands prevents the detection of early-stage vegetation stress or specific nutrient deficiencies, which typically manifest in the Red-Edge or Near-Infrared regions before becoming detectable in the visible spectrum.
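As an illustration of the RGB-only indices discussed above, the sketch below computes VARI and the commonly used simplified reflectance form of TGI with NumPy. The pixel values are illustrative and not drawn from any reviewed study.

```python
# Sketch: RGB-only vegetation indices computed per pixel with NumPy.
# VARI = (G - R) / (G + R - B); TGI (simplified reflectance form)
# = G - 0.39*R - 0.61*B, with band reflectances scaled to 0-1.
import numpy as np

def vari(r, g, b, eps=1e-6):
    """Visible Atmospherically Resistant Index from RGB reflectance arrays."""
    return (g - r) / (g + r - b + eps)

def tgi(r, g, b):
    """Triangular Greenness Index (simplified reflectance form)."""
    return g - 0.39 * r - 0.61 * b

# Toy 2x2 "image": column 0 greenish (lawn), column 1 reddish (bare soil).
r = np.array([[0.10, 0.30], [0.12, 0.28]])
g = np.array([[0.25, 0.22], [0.27, 0.20]])
b = np.array([[0.08, 0.15], [0.07, 0.16]])

print(np.round(vari(r, g, b), 2))
print(np.round(tgi(r, g, b), 3))
```

Both indices rank the vegetated pixels above the soil pixels, which is the property exploited for lawn health monitoring.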

4.2. Multispectral Sensors

Multispectral sensors integrated with UAV platforms provide advanced analytical capabilities for the comprehensive monitoring of urban blue–green infrastructure (BGI). By capturing spectral reflectance in the near-infrared (NIR) and red-edge bands, these systems facilitate the precise assessment of vegetation health and the detection of physiological plant stress before visual symptoms appear in the RGB spectrum [71,125]. The acquisition of high-resolution multispectral data enables the calculation of diverse vegetation indices, such as the Leaf Area Index (LAI) [77], Normalized Difference Vegetation Index (NDVI) [71,152], Difference Vegetation Index (DVI), Green Normalized Difference Vegetation Index (GNDVI), Green Ratio Vegetation Index (GRVI), Ratio Vegetation Index (RVI), Renormalized Difference Vegetation Index (RDVI), Wide Dynamic Range Vegetation Index (WDRVI), Modified Chlorophyll Absorption in Reflectance Index (MCARI), Normalized Red-Edge Vegetation Index (NREVI), and Red-Edge Chlorophyll Index (RECI). These indices may serve as reliable proxies for greenery productivity and carbon sequestration potential [78].
The Normalized Difference Vegetation Index (NDVI) serves as a primary metric for quantifying urban green infrastructure by leveraging the contrast between vegetation’s high reflectance in the near-infrared (NIR) spectrum and its absorption in the red band. In the context of urban monitoring, NDVI derived from UAV platforms provides the high spatial resolution necessary to differentiate small-scale vegetation patches from complex impervious surfaces. Research demonstrates that NDVI is highly effective for identifying photosynthetic activity and mapping the distribution of green space. Consequently, UAV-based NDVI analysis has become an essential tool for periodic urban forest inventories and the evaluation of ecosystem services, offering a cost-effective alternative to traditional ground-based surveys [71,152,153,154,155]. Furthermore, the unique spectral signatures provided by these sensors support robust species differentiation and the mapping of complex urban vegetation structures [68]. The high temporal flexibility of UAV missions also enables continuous monitoring of seasonal dynamics and phenological shifts within green spaces, providing urban planners with critical data to assess ecosystem resilience to heat stress and drought [67,122].
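The NDVI computation described above, together with its green-band counterpart GNDVI, reduces to a normalized difference of reflectance bands. A minimal sketch, with illustrative reflectance values for a dense canopy pixel and a sealed-surface pixel:

```python
# Sketch: NDVI and GNDVI from UAV multispectral reflectance arrays.
# NDVI = (NIR - Red) / (NIR + Red); GNDVI = (NIR - Green) / (NIR + Green).
import numpy as np

def normalized_difference(a, b, eps=1e-6):
    """Generic normalized-difference index, guarded against division by zero."""
    return (a - b) / (a + b + eps)

nir   = np.array([0.45, 0.20])   # [dense canopy, sealed surface] (illustrative)
red   = np.array([0.05, 0.18])
green = np.array([0.10, 0.15])

ndvi  = normalized_difference(nir, red)
gndvi = normalized_difference(nir, green)
print(np.round(ndvi, 2), np.round(gndvi, 2))
```

The strong NIR/red contrast of photosynthetically active vegetation drives NDVI towards 1, while impervious surfaces remain near 0, which is what allows small green patches to be separated from sealed surfaces at UAV resolution.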
UAV-based multispectral sensors have demonstrated significant utility for systematic monitoring of urban blue infrastructure, particularly for water quality assessment in narrow, complex river networks. By recording reflectance in discrete bands, such as green, red, and near-infrared (NIR), these systems allow for the effective inversion of critical water parameters, including turbidity [83], suspended solids (TSS) [83,156], pH [83], nutrients (ammonium nitrogen, total phosphorus) [83,85,91], chemical oxygen demand (COD) [85,91] and chlorophyll-a concentrations [156]. The integration of multispectral data with ensemble machine learning models facilitates the automated detection of inland water boundaries and the identification of floating macro-debris or vegetation [123]. These sensors provide a cost-effective alternative to traditional sampling, enabling municipal authorities to monitor the biochemical status of urban ponds and canals with high spatial frequency and operational efficiency.
Despite these advantages, the practical application of multispectral sensors in urban environments faces significant technical constraints. A primary challenge is maintaining radiometric consistency across different flight missions. Accurate quantitative analysis requires rigorous calibration using downwelling light sensors and pre-flight reflectance panels to account for changes in solar irradiance. Without these steps, vegetation indices such as NDVI can be misinterpreted due to cloud cover fluctuations rather than actual physiological changes. Furthermore, most affordable multispectral cameras have lower spatial resolution compared to RGB sensors, which may lead to mixed-pixel effects in fragmented urban landscapes where vegetation, shadows, and man-made materials coexist in close proximity. Additionally, specular reflection and sun glint on urban water surfaces often saturate multispectral bands, complicating the accurate retrieval of biochemical water parameters.
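The panel-based calibration mentioned above is commonly implemented as the empirical line method: a per-band linear fit between raw digital numbers (DN) observed over reference panels and their known reflectance, applied to the whole image. A minimal single-band sketch with illustrative panel values:

```python
# Sketch of the empirical line method for radiometric calibration:
# fit reflectance = gain * DN + offset from reference panels, then apply.
import numpy as np

# Known panel reflectances and the DNs measured over them in one band
# (illustrative numbers, not from any specific sensor).
panel_reflectance = np.array([0.05, 0.20, 0.50])
panel_dn          = np.array([310.0, 1180.0, 2920.0])

# Linear fit: degree-1 polynomial returns (gain, offset).
gain, offset = np.polyfit(panel_dn, panel_reflectance, 1)

# Convert a raw image band (toy 2x2 DN array) to surface reflectance.
dn_image = np.array([[500.0, 1500.0], [2500.0, 900.0]])
reflectance = gain * dn_image + offset
print(np.round(reflectance, 3))
```

Repeating the fit for every band and every flight keeps indices such as NDVI comparable across missions flown under different illumination.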

4.3. Hyperspectral Sensors

Hyperspectral imaging represents the most advanced tier of remote sensing for urban BGI, offering narrow-band spectral information across hundreds of contiguous wavelengths. Hyperspectral sensors gather data across the visible (VIS), near-infrared (NIR), and short-wave infrared (SWIR) spectral ranges, enabling spectral analysis beyond the scope of RGB or multispectral systems [157]. In the visible and near-infrared regions, hyperspectral imaging samples hundreds or even thousands of contiguous spectral bands, providing substantially richer spectral information about observed objects. The result is a data cube that simultaneously provides continuous spectral information along its depth and spatial information along its height and width [158]. A hyperspectral camera acquires a near-continuous spectrum divided into narrow bands, typically 5 to 20 nm wide [130]. To better understand the visual features of lakes and rivers and achieve precise water quality monitoring, future work is expected to combine hyperspectral sensors offering more and narrower bands with more sophisticated artificial intelligence techniques to analyse the optical properties of water bodies [159].
This ultra-high spectral resolution allows for the identification of subtle chemical and physiological changes in plants, facilitating the detection of specific contaminants or nutrient deficiencies that are indistinguishable to multispectral or RGB sensors [84,86,160,161,162]. The richness of the hyperspectral data cube further supports complex classification tasks, including the differentiation of physiologically similar invasive species and the dynamic monitoring of urban water pollution events [80]. By employing deep learning architectures for hyperspectral band selection and feature extraction, researchers can develop highly accurate models for evaluating the overall ecological health and functional stability of fragmented urban habitats [68]. Hyperspectral imaging systems provide enhanced analytical detail for urban water resource management by capturing continuous, narrow-band spectral information. The UAV-based hyperspectral data are instrumental for the multi-parameter inversion of water quality indicators, such as total suspended solids (TSS), nitrogen concentrations, and chlorophyll-a levels in urban inland waters [83,85,86]. This high-density data also allows the identification of subtle absorption features associated with specific chemical pollutants and harmful algal blooms (HABs), which are often indistinguishable using multispectral sensors [84,125]. The application of deep learning architectures to hyperspectral imagery facilitates the multi-parameter inversion of water quality indicators in heterogeneous environments, providing accurate maps of total suspended solids and dissolved organic matter [86,163]. Furthermore, hyperspectral data support the dynamic monitoring of urban inland water pollution events, offering a non-contact method to evaluate the ecological health and functional stability of aquatic ecosystems in real-time [80,164].
Despite the analytical superiority of hyperspectral imaging, several critical limitations hinder its routine deployment in urban BGI management. The primary challenge is the ‘curse of dimensionality’, which requires specialized high-performance computing and substantial storage infrastructure to handle the massive volume of data produced. Most professional hyperspectral sensors are highly sensitive to UAV instability. Even minor vibrations or deviations in pitch and roll can cause significant geometric distortions, necessitating high-grade Inertial Measurement Units (IMU) and complex post-processing for accurate orthorectification. Furthermore, the signal-to-noise ratio (SNR) in hyperspectral bands is typically lower than in multispectral systems due to the extreme narrowness of the spectral channels. This necessitates ideal lighting conditions and slower flight speeds, which reduces overall operational efficiency and limits the windows of opportunity for data acquisition in cloud-prone urban regions.
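Working with a hyperspectral data cube typically begins with selecting narrow bands by wavelength. The sketch below picks the bands nearest two target wavelengths from a synthetic cube and computes a red-edge normalized-difference index; the cube dimensions, 5 nm band grid, and all values are illustrative.

```python
# Sketch: wavelength-based band selection from a hyperspectral cube and
# computation of a chlorophyll-sensitive red-edge index (synthetic data).
import numpy as np

rng = np.random.default_rng(1)
wavelengths = np.arange(400.0, 1000.0, 5.0)      # 120 contiguous 5 nm bands
cube = rng.uniform(0.05, 0.40, size=(10, 10, wavelengths.size))  # H x W x bands

def nearest_band(wl_grid, target_nm):
    """Index of the band centre closest to the requested wavelength."""
    return int(np.argmin(np.abs(wl_grid - target_nm)))

# Red-edge (~705 nm) vs. red (~665 nm) normalized difference, per pixel.
i_re, i_red = nearest_band(wavelengths, 705), nearest_band(wavelengths, 665)
index_map = ((cube[:, :, i_re] - cube[:, :, i_red])
             / (cube[:, :, i_re] + cube[:, :, i_red]))
print(index_map.shape)
```

The same nearest-band lookup generalizes to any narrow-band index or absorption-feature analysis, which is where hyperspectral data outperform broad-band multispectral sensors.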

4.4. Thermal Sensors (TIR)

Thermal infrared sensors mounted on unmanned aerial vehicles (UAVs) provide high-resolution observations of land surface temperature (LST), a key variable for assessing the performance of urban BGI under heat-stress conditions. These sensors typically operate in the long-wave infrared range (8–15 μm) and use microbolometer detectors to capture spatially explicit thermal patterns across heterogeneous urban surfaces [165,166]. When deployed on UAV platforms, thermal cameras enable flexible, repeatable, and fine-scale mapping of temperature contrasts between vegetated and sealed surfaces, which is not achievable with conventional satellite data.
The UAV-based thermal imagery has proven particularly valuable for identifying cooling effects associated with urban green spaces, tree canopies, water features, and green roofs (Figure 6). Several studies have demonstrated that UAV-derived LST maps can capture short-term and diurnal temperature dynamics, allowing for a detailed assessment of evapotranspiration-driven cooling and shading effects [67,94,98,99]. Such analyses are essential for evaluating how different BGI components contribute to thermal regulation at the neighborhood scale. Recent UAV studies on urban heat island mitigation show that thermal observations can effectively distinguish the cooling performance of various land-cover types across different spatial and temporal conditions. For example, high-resolution UAV surveys revealed significant temperature reductions in urban green spaces compared to surrounding built-up areas, particularly during periods of intense solar radiation. Comparative analyses of UHI cooling strategies further indicate that vegetation-based solutions often provide stronger and more spatially consistent cooling than reflective materials, especially in areas with lower sky view factors [93,99,101]. UAV-based thermal monitoring has also been applied to evaluate the effectiveness of BGI as a nature-based solution for UHI mitigation. Studies analysing surface temperature changes over urban green areas confirm that UAV data support evidence-based planning by identifying priority zones for greening interventions and by quantifying their cooling potential under real climatic conditions [100,101]. This capability is particularly relevant for adaptive urban management, where rapid feedback on BGI performance is required. Despite these advantages, thermal UAV observations remain sensitive to atmospheric conditions, surface emissivity assumptions, and acquisition timing, which may introduce uncertainties when comparing results across sites or seasons. 
Consequently, thermal UAV data should be interpreted as a complementary tool that supports, rather than replaces, long-term monitoring and integrated urban climate assessments.
Despite their utility, several technical limitations must be considered when using UAV-based thermal sensors. Most portable TIR cameras use uncooled microbolometers, which are prone to changes in sensor response due to internal temperature fluctuations during flight. Converting raw thermal signals into accurate LST is challenging because of the high variability in surface emissivity in urban areas. Materials such as glass, polished metal, and dry asphalt have widely different emissivities, which can lead to measurement errors of several degrees Celsius unless properly calibrated. Additionally, the relatively low spatial resolution of thermal sensors (typically 640 × 512 pixels) compared to RGB cameras may result in spatial blurring at the edges of small BGI features, such as narrow bioswales or young street trees.
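The emissivity sensitivity noted above can be illustrated with the simplified Stefan–Boltzmann relation LST = Tb · ε^(−1/4) (temperatures in kelvin). This is a deliberately reduced sketch: operational workflows additionally correct for atmospheric transmission and reflected downwelling radiance, and the emissivity values below are illustrative rather than material-specific calibration figures.

```python
# Minimal sketch of an emissivity correction for UAV thermal data using the
# simplified Stefan-Boltzmann relation LST = Tb * emissivity**(-1/4).
import numpy as np

def brightness_to_lst(tb_kelvin, emissivity):
    """Brightness temperature -> surface temperature (simplified, no atmosphere)."""
    return tb_kelvin * np.power(emissivity, -0.25)

tb = np.array([300.0, 300.0, 300.0])          # identical brightness temperatures
emissivity = np.array([0.98, 0.95, 0.90])     # illustrative surface emissivities
lst = brightness_to_lst(tb, emissivity)
print(np.round(lst - 273.15, 1))              # degrees Celsius
```

Even this simplified relation shows that a modest emissivity misassignment shifts the retrieved surface temperature by several kelvin, consistent with the measurement errors described above.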

4.5. LiDAR Systems

UAV-mounted Light Detection and Ranging (LiDAR) systems provide high-accuracy three-dimensional information on the structure of urban BGI. They enable direct measurement of vegetation height, canopy structure, and surface roughness, which are critical parameters for assessing ecosystem services such as shading, ventilation, and runoff regulation. Data collected with LiDAR allow the generation of Digital Terrain Models (DTMs) and Canopy Height Models (CHMs) [72,73,167]. In contrast to passive optical sensors, LiDAR systems actively emit laser pulses that may penetrate the vegetation canopy. This penetration capability is essential for accurately estimating individual tree parameters, such as stem volume and precise crown diameter, especially in dense urban forest environments where photogrammetric point clouds often fail to capture sub-canopy details [74,76,167]. However, several factors may influence the quality of laser-derived data. The accuracy of the resulting point cloud is highly dependent on the quality of the onboard Inertial Measurement Unit (IMU) and GNSS system. Errors in sensor orientation or GNSS multipath effects in narrow urban ‘canyons’ can lead to significant geometric misalignments between flight strips, requiring complex boresight calibration. Furthermore, LiDAR data provide high-fidelity structural metrics that are crucial for estimating aboveground biomass (AGB) and assessing vertical complexity in urban parks [79]. The integration of these laser-derived point clouds into urban planning workflows enables a more reliable quantification of ecosystem services, including the shading effects and local cooling intensity provided by multi-layered vegetation structures [67].
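The CHM derivation mentioned above reduces to a raster difference, DSM minus DTM, with small negative residuals clipped to zero. A minimal sketch on illustrative elevation grids:

```python
# Sketch: Canopy Height Model (CHM) = DSM - DTM from LiDAR-derived rasters.
# The 3x3 elevation grids below are illustrative.
import numpy as np

dsm = np.array([[212.4, 215.0, 210.1],
                [213.2, 218.6, 210.0],
                [210.2, 214.1, 209.9]])   # surface elevations (m a.s.l.)
dtm = np.array([[210.0, 210.1, 210.2],
                [210.0, 210.2, 210.1],
                [210.1, 210.0, 210.0]])   # bare-earth elevations (m a.s.l.)

# Clip small negative residuals (interpolation noise over bare ground) to zero.
chm = np.clip(dsm - dtm, 0.0, None)
print(np.round(chm, 1))
print(f"max canopy height: {chm.max():.1f} m")
```

Tree height, crown delineation, and AGB models are then derived from this CHM raster or directly from the classified point cloud.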
While optical sensors focus on biochemical properties, UAV-based LiDAR provides critical structural and bathymetric data for urban blue infrastructure management. Active laser scanning is particularly valuable for delineating precise water-land boundaries and mapping the complex geometry of riverbanks and drainage ditches, which are often obscured by riparian vegetation [74,123]. The integration of LiDAR-derived Digital Elevation Models (DEMs) with multispectral imagery enhances the accuracy of urban water extraction and hydrological modeling by providing the necessary vertical context [122]. Furthermore, high-density point clouds facilitate the assessment of shoreline erosion and the monitoring of siltation processes in urban ponds and canals [164]. This structural information, combined with water quality data, enables a holistic evaluation of BGI functionality, supporting more effective flood risk management and biodiversity conservation strategies in densely built environments.
Although UAV-mounted LiDAR offers unmatched capabilities for three-dimensional characterisation of urban vegetation and terrain, its broader adoption in routine BGI monitoring remains limited by cost and the more complex processing requirements of point cloud data. Consequently, its use is often restricted to targeted studies or combined with optical data in multi-sensor workflows.

4.6. In Situ Monitoring and Aerial Water Sampling

While the use of multispectral and hyperspectral sensors for the inversion of water quality parameters from spectral signatures was extensively discussed in the previous section, UAV platforms offer broader analytical capabilities through direct-contact methods. Beyond remote sensing, the integration of physical sampling and contact-based sensing for in situ measurements has redefined the role of UAVs in hydro-environmental studies [168,169]. These systems typically integrate electrochemical probes into a stabilized housing, allowing for the real-time acquisition of fundamental water and air quality indicators [125,126].
An alternative, increasingly popular approach involves using UAVs as autonomous sampling platforms. In this workflow, the drone is equipped with mechanical sampling devices designed to collect physical water volumes at specific GPS coordinates and depths (e.g., SPH Engineering Remote Water Sampling System) [126,128,170]. This method is particularly valuable for analysing non-optically active pollutants, such as heavy metals, phosphates, and specific bacterial indicators, which require standardized laboratory protocols for precise quantification [127,128]. By combining in situ sensing with physical sampling, UAVs provide a comprehensive monitoring framework that bridges the gap between large-scale aerial observations and rigorous chemical analysis.
Modern UAV-based platforms may be equipped with integrated Water Quality Measurement Systems (WQMS) for in situ analysis. These platforms, often designed with flotation assistance and environmentally friendly casings, allow sensors to be submerged directly in urban reservoirs and rivers [168]. This approach enables the simultaneous measurement of physical and chemical indicators such as temperature, electrical conductivity (EC), dissolved oxygen (DO), and pH [168,169], which are commonly measured to assess water quality in rivers, ponds, and lakes. Such high-frequency monitoring is critical for protecting delicate urban ecosystems, where human activities often lead to rapid degradation of water quality. Technically, these systems use specialized payloads, such as the YSI EXO series multiparameter sondes, while experimental ultralight sensors (approx. 220 g) have been developed for targeted nutrient determination, including phosphate and nitrite [126]. A technical limitation is the weight of multi-parameter probes, especially those not specifically designed for UAV platforms. The maximum take-off mass (MTOM) of a drone also limits the size of the water sample that can be taken from a reservoir for ex situ laboratory testing (Table 4).
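The MTOM constraint described above can be expressed as a simple payload budget for the water sample. The masses and safety margin in the sketch below are illustrative, not taken from any platform datasheet.

```python
# Sketch: maximum water sample volume permitted by the MTOM payload budget.
# All masses are illustrative assumptions.
WATER_DENSITY_KG_PER_L = 1.0

def max_sample_volume_l(mtom_kg, empty_platform_kg, sampler_kg, margin_kg=0.2):
    """Remaining payload budget for water, converted to litres."""
    budget_kg = mtom_kg - empty_platform_kg - sampler_kg - margin_kg
    return max(budget_kg, 0.0) / WATER_DENSITY_KG_PER_L

# Hypothetical multirotor: 6.0 kg MTOM, 4.2 kg airframe + battery, 0.8 kg sampler.
volume = max_sample_volume_l(6.0, 4.2, 0.8)
print(f"max sample volume: {volume:.1f} L")
```

This is why ex situ assays from small UAVs are often limited to sub-litre samples, as noted in the studies cited above.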
Despite their high operational flexibility, these systems face significant limitations. The UAV’s proximity to the water surface introduces the risk of “propeller downwash,” which can induce additional water aeration or turbulence, potentially distorting measurements of dissolved gases or surface-layer parameters [127]. Furthermore, the payload capacity of small-scale UAVs limits the volume of water samples that can be collected for ex situ laboratory analysis, often limiting the scope of comprehensive chemical assays. The weight of the collected samples may affect the stability of the aircraft and increase energy consumption, thereby reducing the maximum flight time [128]. From a legal and operational standpoint, water-contact missions in urban areas are frequently constrained by stringent aviation safety protocols, as flights over water bodies in proximity to public infrastructure require specialized flight permits, protocols for sample collection, and waterproof fail-safe mechanisms to prevent environmental contamination in the event of a crash [171].
The deployment of UAVs equipped with atmospheric sensors represents a paradigm shift in urban environmental monitoring, effectively bridging the gap between coarse satellite data and stationary ground stations. In the troposphere, these platforms serve as essential tools for vertical profiling of air pollutants, allowing the examination of concentration gradients that are often inaccessible to traditional monitoring networks. These systems are predominantly utilized for real-time detection of chemical hazards [172], mapping smog concentrations in residential areas [116], and inspecting industrial chimney emissions [117]. Technically, UAVs utilize a diverse array of sensors (Table 4). The most common analyses concern suspended particulate matter (PM10 and PM2.5) and gaseous emissions, including carbon monoxide (CO), nitrogen dioxide (NO2), ammonia (NH3), and sulfur dioxide (SO2) [109,110,111,173]. Furthermore, specialized UAV payloads have been developed for isotopic signature identification of CO2 and CH4. These systems exhibit the sensitivity required to distinguish emission plumes from atmospheric background levels, providing a robust framework for tracking greenhouse gas sources and verifying local carbon budgets [174]. Sensors dedicated to UAVs can simultaneously analyse multiple parameters depending on the selected configuration (e.g., the DJI Sniffer4d V2 Multi-gas Detection System can analyse up to 9 compounds). Sensors can also be highly specialised and analyse a single pollutant, such as the Falcon Plus TDLAS Methane Leak Detector from SPH Engineering, which is used to detect methane leaks from landfills or transmission facilities. Despite these advancements, several critical limitations constrain the efficacy of UAV-based air quality assessments. A primary technical challenge is the “downwash effect,” where rotor-induced turbulence biases sampling accuracy by disturbing the local air mass before it reaches the sensor intake [172].
This phenomenon necessitates careful consideration of sensor placement and intake design to ensure representative data collection [175]. Although UAVs offer unprecedented spatial mobility [176], their operational use is often limited by strict aviation regulations. In particular, flights in heavily urbanised areas, beyond visual line of sight (BVLOS), require appropriate pilot licenses and compliance with a series of procedures for obtaining permits from national aviation authorities, which may limit the frequency and scope of monitoring missions [57,177].
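As a brief note on data handling: electrochemical and NDIR gas sensors typically report mixing ratios in ppm, while air quality limits are usually stated in µg/m³. The standard conversion, sketched below, assumes a molar volume of 24.45 L/mol (25 °C, 1013 hPa); the function name and example values are illustrative, not drawn from any of the cited systems.

```python
def ppm_to_ugm3(ppm, molar_mass, molar_volume=24.45):
    """Convert a gas mixing ratio (ppm) to a mass concentration (ug/m3).

    Assumes a molar volume of 24.45 L/mol (25 degC, 1013 hPa), the
    reference conditions under which limits are commonly stated.
    """
    return ppm * molar_mass * 1000.0 / molar_volume

# Hypothetical example: an NO2 reading of 0.05 ppm (molar mass 46.01 g/mol)
print(f"{ppm_to_ugm3(0.05, 46.01):.1f} ug/m3")  # approx. 94.1 ug/m3
```

Reporting both units side by side makes UAV transects directly comparable with readings from stationary regulatory monitoring stations.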
Table 4. Example devices dedicated to UAV applications for water and air quality testing.
| Monitoring Domain | Sensor Type | Technical Data on the Performed Measurements | Weight |
|---|---|---|---|
| In situ water sensing | YSI EXO Multiparameter Water Quality Sonde EXO1S/EXO2S/EXO3S (4/5/7 sensor ports; capability to attach to UAVs) [178] | EXO water sensors: ISE ammonium, ISE chloride, ISE nitrate, DO, pH, EC, ORP, temperature, TAL-Chlorophyll, TAL-Phycocyanin, TAL-Phycoerythrin, turbidity, UV nitrate, depth | 480/1060 g |
| Water sampling | SPH Engineering Remote Water Sampling System [179] | Sampling for ex situ testing. Water sampler volume: up to 1 dm3 (DJI M300 RTK drone) or up to 5 dm3 (DJI M600 Pro drone) | up to 5000 g |
| In situ air sensing | Scentroid DR2000 [180] | Up to 4 electrochemical sensors: PM1, PM2.5, and PM10, VOCs, CO2, NOx, CH4, temperature, relative humidity, and barometric pressure. Detection methods: electrochemistry, photoionization detection (PID), non-dispersive infrared (NDIR), laser particulate counter | 520 g/640 g (base/fully loaded) |
| In situ air sensing, air sampling | DJI Sniffer4d V2 Multi-gas Detection System [181] | Up to 9 configurable parameters: PM2.5, PM10, O2, O3, NO2, CO, CO2, SO2, H2S, CH4, Cl2, VOCs, odor (OU). Gas sampling module. Detection methods: electrochemistry, photoionization detection (PID), non-dispersive infrared (NDIR), laser scattering | 400–500 g |
| In situ air sensing | SPH Engineering Falcon Plus TDLAS Methane Leak Detector [182] | Methane detection via infrared laser (TDLAS). Minimal detectable flow rate: 1 g/h (approx. 500 ppm). Detection range: reliable methane detection from distances of 10 to 80 m | 360 g |
UAV-mounted sensors offer a diverse, rapidly evolving set of tools for monitoring urban blue–green infrastructure (Table 4). Each sensor type offers specific strengths and limitations depending on the targeted BGI component and management objective. As UAV platforms increasingly support multi-sensor data acquisition, the volume and heterogeneity of collected data continue to grow. For example, while multispectral sensors are instrumental in calculating robust vegetation indices (e.g., NDVI), their efficacy is often contingent upon ambient lighting conditions and sensor calibration. In contrast, thermal infrared (TIR) sensors provide critical data on the Urban Heat Island (UHI) effect, yet they often suffer from lower spatial resolution than their RGB counterparts. Therefore, achieving truly comprehensive BGI characterization requires an approach that fuses data from multiple sensors. This trend underscores the importance of robust analytical methods that transform raw sensor data into meaningful indicators for planning and management. Accordingly, the following chapter focuses on analytical techniques and data processing workflows, including machine learning and deep learning approaches, that enable practical interpretation of UAV-derived data for urban BGI applications.
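Vegetation indices such as NDVI reduce to simple per-pixel band arithmetic on co-registered multispectral imagery. A minimal sketch, with hypothetical reflectance values (dense canopy vs. sparse cover):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel.

    eps guards against division by zero over dark surfaces (e.g., water).
    """
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    return (nir - red) / (nir + red + eps)

# Hypothetical reflectances: healthy canopy (0.45 NIR / 0.05 red)
# vs. bare soil (0.30 NIR / 0.25 red)
print(ndvi([0.45, 0.30], [0.05, 0.25]).round(2))
```

In practice the inputs are full orthomosaic bands, and reliable values depend on the radiometric calibration and lighting constraints noted above.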

5. Analytical Methods and Data Processing Workflows

The dynamic development of computational methods in artificial intelligence, such as machine learning and neural networks, has enabled much more automated and precise analysis of UAV data [72,183]. The diversity of BGI research, ranging from satellite-based multispectral imaging to ground-based LiDAR and classic surveying methods, requires advanced computational tools that enable the integration and interpretation of large datasets in an efficient and automated manner [72,74,184]. Contemporary approaches can be grouped into several categories; this chapter focuses on land-cover and water-body classification, predictive modelling of ecological parameters, and object detection for BGI asset inventory [185,186,187].

5.1. Classification of Vegetation and Water Bodies

The classification of vegetation using UAV data and broadly understood AI methods is currently one of the most dynamically developing areas of environmental research [79,188]. Thanks to imaging with a resolution much higher than that of satellites, it is possible to precisely identify plant species, land-cover types, and community structures, which is essential for assessing the state of ecosystems in BGI monitoring [189]. In UAV research, there has been a marked increase in the use of artificial intelligence-based methods, including both classic ML algorithms and deep learning models. For example, CNN and DL models are used to identify tree species by automatically extracting textures, colour patterns, and spatial arrangements from images, reaching an accuracy of 0.93 [190]. In addition, new algorithms and methods are being developed in research on vegetation index determination. Experimental results show that considering tree crown height significantly improves classification accuracy, achieving an overall accuracy of 93.82% and a Kappa coefficient of 0.91 [76]. Three machine learning algorithms have been evaluated for classifying complex natural habitat communities: random forests (RF), support vector machines (SVM), and averaged neural networks (avNNet), using spectral enhancement and transformation techniques, field-collected data, soil data, texture, and spectral indices. The overall accuracy ranged from 83.8% to 93.7%, and the kappa (k) values ranged from 0.79 to 0.92; based on these results, the authors concluded that the RF algorithm is a reliable choice for classifying complex forest vegetation, including surrounding wetland communities [191]. Deep learning techniques, incorporating feature engineering elements and attention mechanisms, are also playing an increasingly important role, reducing classification errors and improving the distinction of small plant objects across diverse urban conditions, including areas affected by heat islands [68].
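The overall accuracy and kappa figures reported above are both derived from a confusion matrix. A minimal sketch of the two measures, using a hypothetical 3-class matrix:

```python
import numpy as np

def overall_accuracy(cm):
    """Fraction of correctly classified samples (diagonal over total)."""
    return np.trace(cm) / cm.sum()

def cohen_kappa(cm):
    """Kappa = (p_o - p_e) / (1 - p_e), where p_e is the chance agreement
    estimated from the row and column marginals of the matrix."""
    n = cm.sum()
    p_o = np.trace(cm) / n
    p_e = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical confusion matrix (rows: reference, columns: predicted)
cm = np.array([[50, 3, 2],
               [4, 45, 1],
               [1, 2, 42]])
print(f"OA = {overall_accuracy(cm):.3f}, kappa = {cohen_kappa(cm):.3f}")
```

Because kappa discounts agreement expected by chance, it is the more conservative of the two and is routinely reported alongside overall accuracy in the studies cited here.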
Contextual feature selection, including RGB indices, textures, and combinations of optimized SVM models, also enables effective vegetation mapping in heterogeneous environments, achieving accuracies above 87% even for RGB images alone [192]. Using UAVs and spectral resolution analysis with machine learning (SVM), dynamic thresholds applicable to crops, trees, and shrubs have been established, achieving consistent performance across species without multispectral data. These techniques enable effective extraction of different plant cover types without the need for multispectral sensors (Kappa > 0.84) [193]. The high effectiveness of tree classification is also confirmed by studies combining crown extraction with advanced convolutional neural networks and deep learning models [194].

5.2. Object Detection for BGI Asset Inventory

Object detection in UAV images forms the basis for inventorying blue-green infrastructure in urban or urbanized areas, enabling automatic recognition of elements such as trees, shrubs, water reservoirs, retention systems, and small infrastructure devices [195,196]. In the case of multispectral satellite missions such as Landsat or Sentinel-2, the low spatial resolution of 10–30 m yields highly generalized data that cannot support such object-level applications [197,198]. Due to the nature of data acquired at low flight altitudes, object detection faces several challenges: the dominance of small objects, scale variability, object overlap, and complex urban backgrounds. Research clearly indicates that classic detection architectures require specialized modifications to cope with these conditions, especially with low visual signal clarity and high detail density [199].
Current advancements in UAV-based BGI measurements focus on enhancing frame regression and small-object representation. For example, the use of the VIOU vector function improves the accuracy of object location determination, especially for small objects, by approximately 9.7% [200]. Solutions using Siamese networks are becoming increasingly popular for this type of analysis, especially for detecting changes over time, e.g., in monitoring small technical objects on building roofs, as demonstrated in [201]. The multi-temporal approach used there improves detection accuracy by more than 5.96%, which, translated to BGI monitoring, enables detection of changes, e.g., in retention systems or biological elements. Such analyses in urban areas are becoming increasingly important, combining detection with reinforcement learning to adapt detection strategies to changing environmental conditions. An example algorithm, the object detection technique (ODT), combines whale optimization with deep reinforcement learning to improve feature extraction; compared to other methods, it more effectively handles overlapping objects common in dense urban environments [202]. A special case of BGI detection is the recognition of underwater objects and elements. UAV LiDAR bathymetry enables the detection of small objects in water even at a depth of 12 m, as exemplified in [203], based on very high-density data of 42 points/m2, which is vital for assessing the condition of the bottom and hydrotechnical elements of BGI.
In this experiment, the authors tracked cubes with sides of 1 m (1 m3) and 2 m (8 m3) on and below the sea surface. Remote sensing data obtained by very high-resolution UAVs, combined with machine learning and deep learning methods, enable tracking of subtle physiological changes in plants, their hydration status, and spatial variation in land cover, as confirmed by comprehensive analyses of UAV systems presented in [204]. Similar approaches are used in the study of small water bodies, where hyperspectral UAV data combined with ML models enable precise estimation of water quality parameters, even in narrow, shallow urban rivers, as demonstrated in [205]. In turn, the work of Ngo et al. demonstrates the potential of multispectral UAV images and deep learning segmentation models (U-Net, MSNet) in the detection of small water bodies, including puddles, drainage ditches, and seasonal ponds, achieving very high accuracy (Dice > 0.92) in complex tropical conditions [206].
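The localization and segmentation scores cited in this section (IoU-based box regression, Dice > 0.92) rest on standard overlap metrics. A minimal sketch of both, using hypothetical boxes and masks:

```python
import numpy as np

def iou(a, b):
    """Intersection-over-Union of two axis-aligned boxes (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter)

def dice(pred, truth):
    """Dice coefficient for binary masks: 2|A intersect B| / (|A| + |B|)."""
    pred, truth = np.asarray(pred, bool), np.asarray(truth, bool)
    return 2.0 * np.logical_and(pred, truth).sum() / (pred.sum() + truth.sum())

print(round(iou((0, 0, 2, 2), (1, 1, 3, 3)), 3))   # overlap of 1 in a union of 7
print(round(dice([1, 1, 1, 0], [1, 1, 0, 0]), 3))  # 2*2 / (3 + 2)
```

For the small objects that dominate UAV scenes, even one-pixel boundary errors shift these ratios sharply, which is one reason small-object detection demands the specialized architectures discussed above.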

5.3. Predictive Modeling (Health, Biomass, Water Quality)

UAV-based predictive modeling is a significant leap forward in the management of urban blue-green infrastructure (BGI) [72]. It enables a transition from purely descriptive mapping to anticipatory assessment of ecosystem health, biomass, and water-quality changes. The availability of high-resolution UAV multispectral, hyperspectral, and thermal data has enabled the application of machine learning (ML) and deep learning (DL) techniques to model nonlinear relationships between environmental variables and ecosystem states [207,208]. The prediction of biomass and vegetation health is probably the area of application that has evolved most significantly. Many studies have shown that UAV-based spectral indices, structural metrics, and biophysical variables can be brought together in an ML framework to produce accurate estimations of above-ground biomass for various vegetation types [209,210]. Comparative studies reveal that advanced learning techniques, such as ensemble methods and deep neural networks, are consistently superior to traditional regression models in capturing spatial heterogeneity and temporal variability [211]. In this manner, vegetation condition becomes the primary driver of ecosystem service supply for urban green infrastructure elements such as parks, riparian buffers, green roofs, and urban grasslands [12].
Recent advancements highlight the need for multi-temporal UAV observations to improve the accuracy and repeatability of biomass prediction. Studies employing dense time series show that including phenological information greatly improves model generalizability and transferability to new sites. These results are particularly relevant for monitoring BGI performance over time under climatic and anthropogenic change [42].
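The regression workflow described above can be sketched in miniature. The data here are synthetic and the model is an ordinary least-squares fit rather than the ensemble or deep models used in the cited studies; the point is only to show how UAV-derived predictors (a vegetation index and canopy height) are related to field-measured biomass:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Synthetic UAV predictors and a field-measured response (illustrative only)
ndvi = rng.uniform(0.2, 0.9, n)      # multispectral vegetation index
height = rng.uniform(1.0, 15.0, n)   # canopy height from SfM / LiDAR, in m
biomass = 3.0 * ndvi + 0.8 * height + rng.normal(0.0, 0.3, n)  # toy units

X = np.column_stack([ndvi, height, np.ones(n)])  # design matrix + intercept
coef, *_ = np.linalg.lstsq(X, biomass, rcond=None)
pred = X @ coef
r2 = 1.0 - ((biomass - pred) ** 2).sum() / ((biomass - biomass.mean()) ** 2).sum()
print(f"R^2 = {r2:.3f}")
```

Real studies replace the linear fit with random forests or neural networks precisely because the index-biomass relationship is nonlinear and site-dependent, which is where the transferability problems discussed in Section 5.4 arise.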
Predicting water quality is another frontier in research, enabled by UAV-based predictive models. The use of high spatial resolution UAV multispectral and hyperspectral imagery has led to the reliable estimation of water quality parameters such as turbidity, chlorophyll-related parameters, and trophic state indices, mainly in small and medium-sized urban water bodies [159,212]. Deep learning methods, such as convolutional neural networks, can also help precisely delineate urban surface water extents and thus facilitate the dynamic monitoring of water quality at a very local scale [123,213]. Such capabilities are of utmost importance for evaluating the performance of blue infrastructure components such as retention ponds, constructed wetlands, and urban rivers. UAV thermal remote sensing adds a biological aspect to predictive modeling in that it provides an indirect means of measuring plant water stress and evapotranspiration patterns, which are closely linked to vegetation health and the water regulation functions of BGI [214]. Structured analyses reveal a trend toward greater use of thermal data alongside multispectral observations and ML-based models, which, in turn, facilitate better prediction of ecosystem responses to heatwaves and drought episodes in urban environments [214].
Predictive modeling is gradually becoming a major tool in integrated assessment frameworks for urban blue–green infrastructure from a system-level viewpoint. Reviews of UAV applications in water resource management highlight the benefits of integrating UAV-based forecasts with hydrological and hydraulic models for improving stormwater management and flood mitigation strategies [163,164]. Moreover, the performance assessment of BGI networks, which heavily relies on data-driven approaches, is gradually opening the door to linking predicted biophysical variables to ecosystem services and urban resilience indicators. On the whole, the literature reviewed attests to the viability of UAV-based predictive modeling as a scalable, flexible, and data-rich approach for assessing vegetation health, biomass dynamics, and water quality within urban blue–green infrastructure systems. Future methodological developments, driven mainly by multi-sensor data fusion, long-term UAV time series, and hybrid physics-informed ML models, are anticipated to further elevate the role of UAVs in BGI planning and urban ecosystem management [215,216].

5.4. Challenges and Limitations in UAV-Based BGI Data Processing

While the analytical workflows presented in this chapter demonstrate the rapid advancement of UAV-based BGI monitoring, they also reveal significant operational and methodological barriers that hinder the full automation of data processing. A primary constraint lies in the limited transferability of results: while classification and predictive algorithms often yield near-optimal performance metrics in controlled environments, their robustness is frequently compromised by environmental factors. Seasonal phenology, diurnal lighting shifts, and heterogeneous terrain characteristics introduce significant spectral noise, making models highly sensitive to the specific conditions under which the training data were acquired. In practice, this means that a measurement scheme that works well in one area of operation may not produce the expected results in another (e.g., a different city, a different season, or a different terrain).
The absence of unified validation frameworks for UAV-derived datasets remains a critical bottleneck, hindering the establishment of reproducible measurement protocols. Current analytical workflows often lack rigorous quality assurance standards, which prevents cross-study comparisons and complicates the development of universal, industry-standard procedures for BGI asset assessment. Classic accuracy measures are widely reported, but without clear guidelines in this area (e.g., consistent class definitions or an account of the impact of spatial resolution), doubts may remain about the quality of the data obtained. Especially in urban environments, where small, diverse, and partially obscured objects dominate, detection requires multiscale methods and solutions that reduce the impact of complex backgrounds, which increases the complexity of designing UAV measurements and transferring them to other sites.
Reliable estimation of vegetation indices and hydro-chemical parameters is inherently dependent on high-density point clouds and hyperspectral datasets. However, acquiring and subsequently harmonizing such granular data is prohibitively costly and requires iterative sensor calibration. Furthermore, despite the conceptual shift toward integrated BGI management, empirical analyses often treat these components in isolation, yielding fragmented results that fail to capture the complex ecological synergies between urban water bodies and green spaces. In this sense, a key task for the next stage is to standardize the comprehensive development of measurements, from planning, through their correct acquisition, to the final processing of results, so that UAV methods can function as tools for continuous and, above all, reliable monitoring and management of BGI.

6. Discussion

6.1. The Maturity Gap: From Structural to Functional Monitoring

The systematic mapping of UAV applications in urban Blue–Green Infrastructure management reveals a landscape of varying technological maturity. By applying the Technology Readiness Level (TRL) framework, this synthesis moves beyond a descriptive compilation toward a critical evaluation of the field’s operational readiness. The TRL was used to assess both the main sensor categories (Table 5) and the application domains (Table 6). A primary finding is the significant heterogeneity between structural and functional BGI monitoring capabilities. The evaluation of sensor technologies against the TRL framework (Table 5) reveals a distinct dichotomy in operational readiness. The rationale for this classification stems from the degree to which these sensors have transitioned from controlled, experimental setups to robust, municipal-level integration. Sensors that rely on highly automated data processing and are supported by standardised commercial software are deemed fully operational (TRL 9). Conversely, sensors whose application in complex urban environments requires continuous manual calibration or experimental flight protocols are classified at lower maturity stages. Technologies dedicated to 3D mapping and the calculation of basic vegetation indices (such as NDVI/NIR) have achieved a state of full operational readiness, categorized as TRL 9 (Table 6). These methods use LiDAR structural data and multispectral camera spectral data to provide high-resolution, spatially explicit characterizations of green and blue elements. As detailed in Table 5, both RGB and LiDAR sensors have attained a TRL of 9, justified by their reliance on mature workflows and their established utility in standardized 3D data acquisition. Similarly, multispectral cameras are classified at TRL 9, given their proven capacity to generate reliable vegetation health indices using commercially available, integrated hardware.
However, the transition from simple structural mapping to functional ecological assessment remains a substantial research gap. While canopy height and geometry can be measured with high precision, the real-time assessment of functional parameters, such as air and water quality, currently resides at a lower maturity level, estimated at TRL 6 (Table 6). The primary research hurdle here lies in the lack of standardized calibration protocols capable of accounting for the “urban canyon” effect. In these environments, microclimatic turbulence and radiometric noise from artificial surfaces frequently compromise the accuracy of both thermal and multispectral sensors. This limitation is reflected in the classification of thermal and hyperspectral cameras (Table 5). Thermal sensors are designated at TRL 6–7 due to their high sensitivity to dynamic environmental variables, necessitating specific flight scheduling and complex radiometric calibration that largely confines their use to scientific research rather than routine municipal operations. Hyperspectral cameras face even greater operational hurdles, classified at TRL 5–6, owing to substantial hardware costs, significant computational overhead, and a pronounced lack of automated, end-to-end processing pipelines suitable for city-scale deployment. Furthermore, as Table 5 illustrates, the deployment of contact water samplers and gas/air quality sensors remains experimental (TRL 4–6). The rationale for this lower maturity rating involves the significant operational risks associated with water takeoffs or landings, the lack of standardized submersion protocols, and the persistent technical challenges posed by rotor downwash interference, which can severely compromise data integrity.

6.2. The Integration Gap: Transcending Compartmentalization

A critical barrier to holistic urban management is the thematic compartmentalization identified in the literature. The dominance of component-specific applications, where “Green,” “Blue,” and “Thermal” elements are analyzed in isolation, suggests that the current scientific focus is predominantly “sensor-centric” rather than “system-centric.” As illustrated in the proposed framework (Figure 7), the true utility of UAVs lies in multi-sensor data fusion rather than a “sensor-centric” approach.
Currently, only a minority of studies address the interactions between green and blue components or their combined effects at the system level. There is a notable scarcity of integrated algorithms capable of simultaneously processing structural (e.g., LiDAR) and biochemical (e.g., multispectral) data to predict overall BGI resilience. This fragmentation complicates the translation of UAV-derived indicators into integrated management frameworks required for urban resilience planning.
The diagram (Figure 7) illustrates the proposed integrative framework for UAV-supported monitoring of urban Blue–Green Infrastructure, shifting the focus from isolated data layers to functional ecosystem synergies. The central hub represents the UAV platform as a versatile carrier for five distinct data streams: Structural Data (LiDAR), Spectral Data (multispectral cameras), Thermal Data (IR cameras), Temporal Data (RGB sensors), and Environmental Data (specialized water and air quality sensors). These multi-sensor inputs are synthesized to capture the complex interplay between the primary BGI components: Green Infrastructure (Vegetation), Blue Infrastructure (Water), and the Thermal Environment (Microclimate). The framework specifically highlights critical functional interactions, such as:
  • Evapotranspiration Cooling Effect: The moderation of thermal stress by green assets.
  • Runoff Regulation: The synergistic management of hydrological cycles between green and blue components.
  • Water Body Cooling Effect: The local microclimatic regulation provided by urban water systems.
By integrating these dynamic data streams, the framework transcends simple structural mapping to support evidence-based Urban Management Outcomes, specifically targeting sustainable planning, climate adaptation, and cost-efficiency in municipal governance. Future research must, therefore, prioritize cross-domain workflows that link vegetation health directly to microclimatic regulation and runoff mitigation, rather than treating them as separate variables.

6.3. Technological Drivers and Environmental Sensing

Recent advancements in platform design and sensor technology are rapidly addressing these limitations, expanding the potential for continuous, management-oriented monitoring. Contemporary UAV systems increasingly feature extended flight endurance, higher payload capacities, and improved reliability, driven by the adoption of lightweight composite materials and efficient power systems. Concurrently, the miniaturization of sensors has democratized access to multispectral, thermal, hyperspectral, and LiDAR units, as well as lightweight in situ environmental sensors [59,217,218,219].
These hardware developments are crucial for elevating the TRL of functional monitoring applications. They enable multi-sensor data acquisition and higher temporal sampling frequencies, which are essential for capturing dynamic urban environmental processes. The RGB, multispectral cameras, and LiDAR technologies used on UAVs have reached full technological maturity (Table 5). Nevertheless, hardware capability alone does not guarantee effective management utility, leading to the challenge of data integration.
An overarching issue emerging from the synthesized literature is the persistent disconnect between data acquisition and practical policymaking. While contemporary drone platforms and deep learning algorithms generate highly detailed outputs, city administrators routinely lack the operational frameworks needed to convert this high-dimensional information into concrete environmental strategies. The technical sophistication and computational demands of these processing pipelines regularly exceed the resources and technical expertise available within municipal planning departments. Consequently, if UAV systems are to become standard instruments for BGI oversight, future methodologies must place greater emphasis on data distillation. Translating complex remote sensing metrics into standardized, accessible ecosystem health indicators appears to be a necessary step in aligning advanced spatial analytics with day-to-day urban governance.

6.4. Economic and Socio-Legal Constraints

Although UAVs offer high-resolution data, there is a lack of studies comparing the Return on Investment (ROI) of UAV-based monitoring versus traditional ground-based or satellite approaches in municipal management. Bridging this gap is essential for ensuring that UAV-supported monitoring leads to the desired outcome of long-term cost-efficiency in urban governance.
Moreover, the gap between technical feasibility and large-scale implementation is widened not only by economic considerations but also by regulatory factors. While hardware for monitoring of human-nature interaction is advancing (TRL 4–5), “regulatory readiness” often lags behind. Ethical and legal constraints, particularly GDPR compliance, dictate the spatial and temporal resolution of BGI data collection in densely populated urban areas.
The integration of Unmanned Aerial Vehicles into urban BGI management is profoundly constrained by a complex regulatory system governing airspace safety and data privacy. Operational limitations are primarily dictated by stringent pilot certification requirements and aircraft safety standards, particularly for missions conducted over built-up and populated areas as well as for missions performed beyond visual line of sight (BVLOS). The legal regulations being introduced often classify UAV missions based on the potential risk to people and the environment. The European Union and the People’s Republic of China regulations require pilots conducting missions in urban areas to hold appropriate licenses. Flights are often classified as “special” missions, which often require a comprehensive risk assessment (SORA) and explicit permission from national aviation authorities [54,56,220]. Furthermore, the proximity of BGI components to critical infrastructure and metropolitan airports frequently places urban study areas within restricted “UAS Geographical Zones” (Geo-zones), where flights are strictly prohibited or require individual authorisation from national civil aviation agencies in order to prevent collisions and interference with manned aviation [221]. Similarly, in the United States, drone operators performing missions over people must meet a number of requirements regarding their licenses, mission requirements (obtaining special permission from the Federal Aviation Administration), and UAV-specific technical safety requirements (e.g., flight termination systems) [55].
Modern legal frameworks globally impose significant constraints on the acquisition and processing of high-resolution imagery that may be captured during urban BGI monitoring. In the European Union, the General Data Protection Regulation (GDPR) mandates a “privacy by design” approach, requiring monitoring missions to minimize data collection and to undergo a mandatory Data Protection Impact Assessment. Furthermore, EASA guidelines strictly integrate these privacy requirements into the Specific Category risk assessment (SORA) [54,222,223]. In contrast, the United States lacks a uniform federal equivalent to the GDPR, relying instead on fragmented sectoral and state-level privacy laws. This regulatory decentralization often forces operators to navigate a patchwork of municipal restrictions on take-off and operational zones, while the heightened risk of litigation further necessitates the strict minimization of stored datasets [55,224,225]. The regulations in China introduce further limitations through strong state oversight and the Personal Information Protection Law. Authorities impose rigorous controls on the data flow of images containing critical infrastructure, often subjecting commercial databases to state scrutiny [57,226]. Across all of these jurisdictions, the inadvertent collection of identifiable facial or locational data poses a significant risk of legal liability.

7. Conclusions

This review synthesizes recent advances (2018–2025) in the application of unmanned aerial vehicles (UAVs) for monitoring and management of urban blue–green infrastructure (BGI). The analysed literature demonstrates that UAV-based approaches have reached a high level of methodological maturity, particularly in the acquisition of high-resolution spatial data and the integration of advanced analytical workflows. By applying a structured Technology Readiness Level (TRL) framework to the reviewed studies, this article provides a formal reference point for assessing the transition from experimental research to operational municipal practice. Across a wide range of application domains, UAVs have proven capable of delivering spatially explicit, timely, and cost-effective information that directly supports evidence-based decision-making in urban environmental management.
From a state-of-the-art perspective, the strongest development has occurred in vegetation monitoring and blue infrastructure assessment. UAV-mounted RGB, multispectral, thermal, and LiDAR sensors, together with air- and water-quality sensors, are now routinely used to quantify vegetation structure, biomass, physiological condition, and ecosystem services. The TRL assessment conducted in this study confirms that while structural 3D mapping and basic spectral indices have reached full operational maturity (TRL 9), functional assessments such as cooling-effect quantification or complex air and water quality modeling currently reside at lower maturity stages (TRL 4–7), indicating a persistent “functional maturity gap”. The increasing adoption of machine learning and deep learning techniques enables automated classification, object detection, and predictive modeling, allowing UAV-derived data to move beyond descriptive mapping toward operational and anticipatory management applications.
The review highlights UAVs as transformative tools for urban BGI management. Their ability to operate at very high spatial resolutions, below cloud cover, and with flexible temporal frequency addresses key limitations of satellite and ground-based monitoring systems. Furthermore, this study bridges the gap between technical feasibility and practical implementation by providing a comparative analysis of specific UAV platforms and sensors, including technical specifications and approximate market costs, thus serving as a pragmatic guide for municipal asset management. UAVs enable the monitoring of fragmented BGI elements such as pocket parks, street trees, green roofs, retention ponds, and narrow urban waterways, which are often invisible or poorly represented in conventional datasets. As such, UAVs support a shift toward data-driven, adaptive management of BGI that aligns with broader goals of urban resilience, climate adaptation, and nature-based solutions.
However, the thematic compartmentalization observed in this review underscores a fundamental limitation in the current deployment of UAV technologies for urban BGI. While individual domains, such as thermal monitoring or vegetation analysis, are technologically mature, there is a profound lack of methodological integration across the individual spheres of the urban environment. This study identifies this ‘siloed’ research landscape as a primary barrier to advancing an integrated conceptual framework for BGI management. Future research must pivot from component-specific observations toward holistic, system-level assessments that utilize UAVs to capture the complex interdependencies between hydrological, thermal, and ecological urban systems.
The successful deployment of UAVs for urban BGI monitoring also depends on navigating the demanding intersection of aviation safety protocols and privacy laws across different jurisdictions. While technical sensor capabilities continue to advance, operational scalability is hindered by the administrative burden of risk assessments and the geographical constraints imposed by restricted urban airspaces. Addressing these regulatory and data protection challenges is therefore essential for moving from experimental drone applications to standardized, long-term environmental management practices.
Key conclusions of this review include:
  • UAV-based monitoring of individual elements of urban BGI has reached a high level of technical and methodological maturity, with structural mapping achieving TRL 9 readiness.
  • A significant “maturity gap” exists between structural mapping and functional ecological assessment (TRL 4–7), requiring further methodological standardization.
  • UAVs provide unique advantages for capturing fine-scale spatial heterogeneity in complex urban environments.
  • Integration of UAV data with ML and DL methods enables automated, predictive, and management-oriented analyses.
  • The inclusion of approximate cost–benefit considerations for various sensor types provides a necessary foundation for municipal budgetary planning.
  • Current research involving the use of UAVs is dominated by single-component studies, highlighting the need for more integrated BGI assessments.
  • Widespread use of UAV-based urban BGI monitoring is hindered by complex regulatory frameworks and privacy laws, which require special licences, flight authorisations, and the application of standardized risk assessments (e.g., SORA) and “privacy by design” data protocols.

8. Future Directions

To bridge the identified maturity gaps and transition UAV-based BGI monitoring from experimental validation to standardized urban governance, future research must move beyond sensor-specific observations toward an integrated, system-level approach. The following priority areas are proposed to guide upcoming developments:
  • Multi-sensor fusion and cross-platform interoperability
Future efforts should prioritize the development of automated, end-to-end workflows that synthesize data from heterogeneous sources. Rather than relying on isolated spectral or structural metrics, research must focus on the algorithmic fusion of LiDAR-derived canopy geometry with radiometric thermal data and ground-based IoT sensor streams. Establishing standardized, cross-jurisdictional protocols for such data integration is essential for quantifying complex ecological functions, such as the real-time cooling efficiency of urban greenery or the dynamic hydraulic conductivity of blue infrastructure, across varying urban morphologies.
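As a purely illustrative sketch of such pixel-level fusion logic (synthetic arrays stand in for co-registered LiDAR and thermal rasters; all array names, values, and height-class thresholds are hypothetical, and a real workflow would first resample and co-register georeferenced rasters), a canopy height model can be stratified against a land-surface-temperature raster to derive a simple cooling-effect statistic:

```python
import numpy as np

# Synthetic, co-registered 100 x 100 rasters (in practice: resampled GeoTIFFs).
rng = np.random.default_rng(42)
canopy_height = rng.gamma(shape=2.0, scale=4.0, size=(100, 100))      # metres
# Assume taller canopy lowers land-surface temperature (LST), plus noise.
lst = 38.0 - 0.4 * canopy_height + rng.normal(0.0, 1.0, (100, 100))   # deg C

# Stratify pixels into canopy-height classes and compare mean LST per class.
bins = [0.0, 2.0, 5.0, 10.0]                     # class edges in metres
labels = np.digitize(canopy_height, bins[1:])    # class indices 0..3
mean_lst = np.array([lst[labels == k].mean() for k in range(4)])

# A crude "cooling effect": mean LST difference between the lowest
# and the highest vegetation class.
cooling_effect = mean_lst[0] - mean_lst[-1]
print(f"Mean LST per height class [deg C]: {np.round(mean_lst, 1)}")
print(f"Estimated cooling effect of tall canopy: {cooling_effect:.1f} deg C")
```

The same stratification pattern extends naturally to further co-registered layers (e.g., ground-based IoT humidity grids), which is the essence of the fusion workflows advocated above.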
  • Integration with Urban Digital Twins (UDT) and smart city frameworks
A critical frontier lies in synchronising UAV-derived biophysical parameters with dynamic Urban Digital Twins. Future research should investigate the creation of automated pipelines that feed high-resolution aerial data directly into 3D city models. This integration will enable high-fidelity predictive simulations of microclimatic shifts and localised flood risks under diverse climate change scenarios. Transitioning from reactive monitoring to such proactive, AI-driven ecosystem management is vital for enhancing the climate resilience of densely populated areas.
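A minimal, hypothetical sketch of such an ingestion pipeline (the asset schema, identifiers, and the 10% change-alert threshold are invented for illustration; a production digital twin would use a standardized 3D city model rather than in-memory records) could look as follows:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Dict, List, Optional, Tuple

@dataclass
class TreeAsset:
    """One BGI asset inside a (highly simplified) urban digital twin."""
    asset_id: str
    canopy_height_m: float
    observed_at: datetime
    history: List[Tuple[datetime, float]] = field(default_factory=list)

def ingest_uav_observation(twin: Dict[str, TreeAsset], asset_id: str,
                           canopy_height_m: float) -> Optional[str]:
    """Push a UAV-derived measurement into the twin; flag large canopy losses."""
    now = datetime.now(timezone.utc)
    asset = twin.get(asset_id)
    if asset is None:
        twin[asset_id] = TreeAsset(asset_id, canopy_height_m, now)
        return None
    # Archive the previous state before overwriting it.
    asset.history.append((asset.observed_at, asset.canopy_height_m))
    alert = None
    if canopy_height_m < 0.9 * asset.canopy_height_m:
        alert = f"{asset_id}: canopy height dropped >10%, schedule inspection"
    asset.canopy_height_m = canopy_height_m
    asset.observed_at = now
    return alert

twin: Dict[str, TreeAsset] = {}
ingest_uav_observation(twin, "tree-001", 12.4)          # first survey
alert = ingest_uav_observation(twin, "tree-001", 9.8)   # follow-up flight
print(alert)
```

The change-detection rule stands in for the predictive simulations discussed above: once UAV-derived parameters are versioned per asset, the twin can trigger proactive management actions instead of merely recording observations.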
  • Advanced AI architectures for functional and predictive analytics
While current AI applications in BGI are dominated by land-cover classification, there is a pressing need for specialized deep learning architectures trained on the unique heterogeneity of urban environments. Future models should shift toward functional monitoring—predicting BGI health, carbon sequestration rates, and runoff regulation capacity. Developing “explainable AI” (XAI) models that provide interpretable outputs for municipal decision-makers will be crucial in overcoming the current “data-to-decision” gap identified in this review.
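One simple, model-agnostic diagnostic in the XAI spirit is permutation feature importance; the sketch below (with synthetic UAV-derived predictors and an ordinary least-squares surrogate in place of a trained deep network; all variable names and coefficients are hypothetical) illustrates the idea:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
# Hypothetical UAV-derived predictors of a BGI "health" score.
ndvi   = rng.uniform(0.1, 0.9, n)   # multispectral vegetation index
height = rng.gamma(2.0, 4.0, n)     # LiDAR canopy height (m)
lst    = rng.normal(30.0, 3.0, n)   # thermal land-surface temperature (deg C)
X = np.column_stack([ndvi, height, lst])
y = 2.0 * ndvi + 0.1 * height - 0.05 * lst + rng.normal(0.0, 0.1, n)

# Fit an ordinary least-squares model as a stand-in for a trained network.
beta, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), X]), y, rcond=None)

def predict(M):
    return beta[0] + M @ beta[1:]

base_mse = np.mean((predict(X) - y) ** 2)

# Permutation importance: how much does the error grow when one feature
# is shuffled? An interpretable, model-agnostic attribution for managers.
importance = {}
for j, name in enumerate(["NDVI", "canopy_height", "LST"]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])
    importance[name] = np.mean((predict(Xp) - y) ** 2) - base_mse
print(importance)
```

Reporting such per-feature attributions alongside predictions is one concrete way to narrow the "data-to-decision" gap, since municipal decision-makers can see which UAV-derived variables drive a model's output.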
  • Navigating the socio-technical and economic landscape
As technical barriers subside, the scientific community must increasingly focus on the regulatory-economic nexus. This includes developing “privacy by design” data protocols to ensure compliance with stringent legal frameworks (e.g., GDPR) in urban settings. Furthermore, there is a lack of longitudinal studies evaluating the long-term Return on Investment (ROI) of UAV deployments. Rigorous cost–benefit analyses comparing autonomous aerial systems with traditional satellite or ground-based monitoring are necessary to provide the economic justification required for full-scale municipal adoption and the establishment of continuous monitoring protocols within urban management departments.

Author Contributions

Conceptualization, M.J.; methodology, M.J.; investigation, M.J., K.M. and F.B.; resources, M.J., K.M. and F.B.; data curation, M.J., K.M. and F.B.; writing—original draft preparation, M.J., K.M. and F.B.; writing—review and editing, M.J., K.M. and A.B.; visualization, M.J. and F.B.; supervision, M.J., K.M. and A.B.; project administration, M.J.; funding acquisition, A.B. Authors’ contributions: M.J. 70%, K.M. 15%, A.B. 10% and F.B. 5%. All authors have read and agreed to the published version of the manuscript.

Funding

This research project was partly supported by the program ‘Excellence initiative—research university’ for the AGH University of Krakow and partly by a subvention of the AGH University of Krakow, No. 16.16.150.545.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding authors.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Almaaitah, T.; Appleby, M.; Rosenblat, H.; Drake, J.; Joksimovic, D. The Potential of Blue-Green Infrastructure as a Climate Change Adaptation Strategy: A Systematic Literature Review. Blue-Green Syst. 2021, 3, 223–248. [Google Scholar] [CrossRef]
  2. European Commission. Green Infrastructure (GI)—Enhancing Europe’s Natural Capital: Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions; European Commission: Brussels, Belgium, 2013. [Google Scholar]
  3. Pradana, M.R.; Wibowo, A.; Semedi, J.M. Multi-Perspective Evaluation of Urban Green Views: Spatial and Street-View Data Integration in Sudirman Central Business District, Indonesia. Geomat. Environ. Eng. 2025, 19, 91–114. [Google Scholar] [CrossRef]
  4. Bernaciak, A.; Bernaciak, A.; Fortuński, B. Blue-green infrastructure of a regenerative city. Econ. Environ. 2024, 91, 978. [Google Scholar] [CrossRef]
  5. Gong, X.; Chang, C.C. Monetized Estimates of the Ecosystem Service Value of Urban Blue and Green Infrastructure and Analysis: A Case Study of Changsha, China. Sustainability 2022, 14, 16092. [Google Scholar] [CrossRef]
  6. Maes, J.; Zulian, G.; Günther, S.; Thijssen, M.; Raynal, J. Enhancing Resilience of Urban Ecosystems Through Green Infrastructure (EnRoute); Final Report, EUR 29630 EN; Publications Office of the European Union: Luxembourg, 2019. [Google Scholar]
  7. Li, J.; Xu, H.; Ren, M.; Duan, J.; You, W.; Zhou, Y. Knowledge Mapping of Cultural Ecosystem Services Applied on Blue-Green Infrastructure—A Scientometric Review with CiteSpace. Forests 2024, 15, 1736. [Google Scholar] [CrossRef]
  8. Biernacka, M.; Kronenberg, J.; Łaszkiewicz, E.; Czembrowski, P.; Amini Parsa, V.; Sikorska, D. Beyond Urban Parks: Mapping Informal Green Spaces in an Urban–Peri-Urban Gradient. Land Use Policy 2023, 131, 106746. [Google Scholar] [CrossRef]
  9. Błasik, M.; Wang, T.; Kazak, J.K. The Effectiveness of Master Plans: Case Studies of Biologically Active Areas in Suburban Zones. Geomat. Environ. Eng. 2022, 16, 27–40. [Google Scholar] [CrossRef]
  10. Sun, C.Y.; Chiang, T.P.; Wu, Y.W. Residents’ Perceptions of Informal Green Spaces in High-Density Cities: Urban Land Governance Implications from Taipei. Land 2025, 14, 1466. [Google Scholar] [CrossRef]
  11. Archiciński, P.; Przybysz, A.; Sikorska, D.; Wińska-Krysiak, M.; Da Silva, A.R.; Sikorski, P. Conservation Management Practices for Biodiversity Preservation in Urban Informal Green Spaces: Lessons from Central European City. Land 2024, 13, 764. [Google Scholar] [CrossRef]
  12. Wang, X.; Hu, Q.; Zhang, R.; Sun, C.; Wang, M. Ecosystem Services in Urban Blue-Green Infrastructure: A Bibliometric Review. Water 2025, 17, 2273. [Google Scholar] [CrossRef]
  13. Jakubiak, M.; Bojarski, B.; Bień, M.; Stonawski, B.; Oglęcki, P. Influence of Fish Ponds on the Benthic Invertebrate Composition in Hydrological Networks of Selected Fish Farms in Southern Poland. Folia Biol. 2022, 70, 11–18. [Google Scholar] [CrossRef]
  14. Environmental Protection Agency US. Green Infrastructure Case Studies: Municipal Policies for Managing Stormwater with Green Infrastructure (EPA-841-F-10-004); United States Environmental Protection Agency: Washington, DC, USA, 2010.
  15. Bojarski, B.; Jakubiak, M.; Szczerbik, P.; Bień, M.; Klaczak, A.; Stański, T.; Witeska, M. The Influence of Fish Ponds on Fish Assemblages of Adjacent Watercourses. Pol. J. Environ. Stud. 2022, 31, 609–617. [Google Scholar] [CrossRef]
  16. McNabb, T.; Charters, F.J.; Challies, E.; Dionisio, R. Unlocking Urban Blue-Green Infrastructure: An Interdisciplinary Literature Review Analysing Co-Benefits and Synergies between Bio-Physical and Socio-Cultural Outcomes. Blue-Green Syst. 2024, 6, 217–231. [Google Scholar] [CrossRef]
  17. Wróbel, J.; Gałczyńska, M.; Tański, A.; Korzelecka-Orkisz, A.; Formicki, K. The Challenges of Aquaculture in Protecting the Aquatic Ecosystems in the Context of Climate Changes. J. Water Land Dev. 2023, 231–241. [Google Scholar] [CrossRef]
  18. Lach, S.K.; Kopacz, M.T. Forms of Nature Protection Occurring on Artificial Water Reservoirs in Poland. Ecol. Eng. Environ. Technol. 2025, 26, 346–351. [Google Scholar] [CrossRef]
  19. Szombara, S.; Lewińska, P.; Żądło, A.; Róg, M.; Maciuk, K. Analyses of the Prądnik Riverbed Shape Based on Archival and Contemporary Data Sets—Old Maps, LiDAR, DTMs, Orthophotomaps and Cross-Sectional Profile Measurements. Remote Sens. 2020, 12, 2208. [Google Scholar] [CrossRef]
  20. Antoszewski, P.; Świerk, D.; Krzyżaniak, M.; Choryński, A. Legal Tools for Blue-Green Infrastructure Planning—Based on the Example of Poznań (Poland). Sustainability 2024, 16, 141. [Google Scholar] [CrossRef]
  21. European Commission. EU Biodiversity Strategy for 2030; Publications Office of the European Union: Luxembourg, 2021. [Google Scholar]
  22. Jakubiak, M.; Chmielowski, K. Identification of Urban Water Bodies Ecosystem Services. Acta Sci. Pol. Form. Circumiectus 2020, 19, 73–82. [Google Scholar] [CrossRef]
  23. Environmental Protection Agency US. Land Use and Green Infrastructure Scorecard. Low Impact Development Strategies to Protect Water Resources; EPA 833R23002; United States Environmental Protection Agency: Washington, DC, USA, 2023.
  24. Yin, D.; Xu, C.; Jia, H.; Yang, Y.; Sun, C.; Wang, Q.; Liu, S. Sponge City Practices in China: From Pilot Exploration to Systemic Demonstration. Water 2022, 14, 1531. [Google Scholar] [CrossRef]
  25. Shang, S.; Wang, L.; Wang, Y.; Su, X.; Li, L.; Xia, X. Exploration of Sponge City Construction in China from the Perspective of Typical Cases. Front. Earth Sci. 2023, 11, 1238203. [Google Scholar] [CrossRef]
  26. Wilbers, G.J.; de Bruin, K.; Seifert-Dähnn, I.; Lekkerkerk, W.; Li, H.; Budding-Polo Ballinas, M. Investing in Urban Blue–Green Infrastructure—Assessing the Costs and Benefits of Stormwater Management in a Peri-Urban Catchment in Oslo, Norway. Sustainability 2022, 14, 1934. [Google Scholar] [CrossRef]
  27. Bogacki, M.; Neverova-Dziopak, E.; Yedoyan, T.; Dziopak, J. Evolution of Cities under Climate Change: Greening and Blue-Green Infrastructure. J. Archit. Eng. Res. 2025, 8, 22–33. [Google Scholar] [CrossRef]
  28. Wang, J.; Foley, K. Promoting Climate-Resilient Cities: Developing an Attitudinal Analytical Framework for Understanding the Relationship between Humans and Blue-Green Infrastructure. Environ. Sci. Policy 2023, 146, 133–143. [Google Scholar] [CrossRef]
  29. Mazur, R.; Jakubiak, M.; Santos, L. Environmental Factors Affecting the Efficiency of Water Reservoir Restoration Using Microbiological Biotechnology. Sustainability 2024, 16, 266. [Google Scholar] [CrossRef]
  30. Pochodyła, E.; Glińska-Lewczuk, K.; Jaszczak, A. Blue-Green Infrastructure as a New Trend and an Effective Tool for Water Management in Urban Areas. Landsc. Online 2021, 92, 1–20. [Google Scholar] [CrossRef]
  31. Dao, C.; Qi, J. Seeing and Thinking about Urban Blue–Green Space: Monitoring Public Landscape Preferences Using Bimodal Data. Buildings 2024, 14, 1426. [Google Scholar] [CrossRef]
  32. Richter, M.; Dickhaut, W. Long-Term Performance of Blue-Green Roof Systems—Results of a Building-Scale Monitoring Study in Hamburg, Germany. Water 2023, 15, 2806. [Google Scholar] [CrossRef]
  33. Cristiano, E.; Annis, A.; Apollonio, C.; Pumo, D.; Urru, S.; Viola, F.; Deidda, R.; Pelorosso, R.; Petroselli, A.; Tauro, F.; et al. Multilayer Blue-Green Roofs as Nature-Based Solutions for Water and Thermal Insulation Management. Hydrol. Res. 2022, 53, 1129–1149. [Google Scholar] [CrossRef]
  34. Wu, X.; Willems, P. Assessing Blue-Green Infrastructures for Urban Flood and Drought Mitigation under Changing Climate Scenarios. J. Hydrol. Reg. Stud. 2025, 62, 102798. [Google Scholar] [CrossRef]
  35. Jakubiak, M.; Panek, E.; Urbański, K.; Victória, S.S.; Lach, S.; Maciuk, K.; Kopacz, M. Nature-Based Solutions in Sustainable Cities: Trace Metal Accumulation in Urban Forests of Vienna (Austria) and Krakow (Poland). Sustainability 2025, 17, 7042. [Google Scholar] [CrossRef]
  36. Śliwka, M.; Jakubiak, M. Application of Laser Stimulation of Some Hydrophytes Species for More Efficient Biogenic Elements Phytoremediation. Proc. ECOpole 2010, 4, 205–211. [Google Scholar]
  37. de Rijke, C.A.; Lim, N.J.; Iqbal, A.; Brandt, S.A.; Sahlin, E.A.U. A Systematic Review of Blue-Green Infrastructure’s Role and Relevance in the Mitigation and Management of Climate-Induced Hazards in x-Minute Cities. Plan. Pract. Res. 2025, 1–29. [Google Scholar] [CrossRef]
  38. Głowienka, E.; Kucza, M. Persistent Urban Park Cooling Effects in Krakow: A Satellite-Based Analysis of Land Surface Temperature Patterns (1990–2018). Remote Sens. 2025, 17, 3608. [Google Scholar] [CrossRef]
  39. Czyża, S.; Kowalczyk, A.M. Applying GIS in Blue-Green Infrastructure Design in Urban Areas for Better Life Quality and Climate Resilience. Sustainability 2024, 16, 5187. [Google Scholar] [CrossRef]
  40. Langeveld, J.G.; Cherqui, F.; Tscheikner-Gratl, F.; Muthanna, T.M.; Juarez, M.F.D.; Leitão, J.P.; Roghani, B.; Kerres, K.; do Céu Almeida, M.; Werey, C.; et al. Asset Management for Blue-Green Infrastructures: A Scoping Review. Blue-Green Syst. 2022, 4, 272–290. [Google Scholar] [CrossRef]
  41. Sörensen, J.; Persson, A.S.; Olsson, J.A. A Data Management Framework for Strategic Urban Planning Using Blue-Green Infrastructure. J. Environ. Manag. 2021, 299, 113658. [Google Scholar] [CrossRef]
  42. Boguniewicz-Zabłocka, J.; Łukasiewicz, E. Blue–Green Infrastructure Effectiveness for Urban Stormwater Management: A Multi-Scale Residential Case Study. Land 2025, 14, 1340. [Google Scholar] [CrossRef]
  43. Richter, M.; Heinemann, K.; Meiser, N.; Dickhaut, W. Trees in Sponge Cities—A Systematic Review of Trees as a Component of Blue-Green Infrastructure, Vegetation Engineering Principles, and Stormwater Management. Water 2024, 16, 655. [Google Scholar] [CrossRef]
  44. Neyns, R.; Canters, F. Mapping of Urban Vegetation with High-Resolution Remote Sensing: A Review. Remote Sens. 2022, 14, 1031. [Google Scholar] [CrossRef]
  45. Seeberg, G.; Hostlowsky, A.; Huber, J.; Kamm, J.; Lincke, L.; Schwingshackl, C. Evaluating the Potential of Landsat Satellite Data to Monitor the Effectiveness of Measures to Mitigate Urban Heat Islands: A Case Study for Stuttgart (Germany). Urban Sci. 2022, 6, 82. [Google Scholar] [CrossRef]
  46. de Almeida, C.R.; Furst, L.; Gonçalves, A.; Teodoro, A.C. Remote Sensing Image-Based Analysis of the Urban Heat Island Effect in Bragança, Portugal. Environments 2022, 9, 98. [Google Scholar] [CrossRef]
  47. Michalowska, K.; Glowienka, E.; Hejmanowska, B. Temporal Satellite Images in the Process of Automatic Efficient Detection of Changes of the Baltic Sea Coastal Zone. In Proceedings of the IOP Conference Series: Earth and Environmental Science; Institute of Physics Publishing: Bristol, UK, 2016; Volume 44. [Google Scholar]
  48. Duan, Q.; Tan, M.; Guo, Y.; Wang, X.; Xin, L. Understanding the Spatial Distribution of Urban Forests in China Using Sentinel-2 Images with Google Earth Engine. Forests 2019, 10, 729. [Google Scholar] [CrossRef]
  49. Wu, S.; Song, Y.; An, J.; Lin, C.; Chen, B. High-Resolution Greenspace Dynamic Data Cube from Sentinel-2 Satellites over 1028 Global Major Cities. Sci. Data 2024, 11, 909. [Google Scholar] [CrossRef] [PubMed]
  50. Głowienka, E.; Michałowska, K. Analyzing the Impact of Simulated Multispectral Images on Water Classification Accuracy by Means of Spectral Characteristics. Geomat. Environ. Eng. 2020, 14, 47–58. [Google Scholar] [CrossRef]
  51. Na, N.; Xu, D.; Fang, W.; Pu, Y.; Liu, Y.; Wang, H. Automatic Detection and Dynamic Analysis of Urban Heat Islands Based on Landsat Images. Remote Sens. 2023, 15, 4006. [Google Scholar] [CrossRef]
  52. Zwolska, A.; Polrolniczak, M.; Kolendowicz, L. Remote Sensing-Based Analysis of Urban Land Cover Changes and Surface Urban Heat Island Dynamics Using Landsat and Local Climate Zones Classification in Poznań, Poland. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2025, 18, 16020–16037. [Google Scholar] [CrossRef]
  53. Głowienka, E.; Malinverni, E.S.; Sanità, M.; Michałowska, K.; Kucza, M. Harmonizing Satellite Thermal Data with Ground-Based Observations for Climate Long-Term Monitoring. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. 2025, 48, 127–132. [Google Scholar] [CrossRef]
  54. European Commission. Commission Implementing Regulation (EU) 2019/947 of 24 May 2019 on the Rules and Procedures for the Operation of Unmanned Aircraft; Regulation (EU) 2019/947; The European Union: Brussels, Belgium, 2019. [Google Scholar]
  55. Federal Aviation Administration. Small Unmanned Aircraft Systems (UAS) Regulations; 14 C.F.R. Part 107; US Federal Aviation Administration: Washington, DC, USA, 2016.
  56. Civil Aviation Administration of China (CAAC). Regulations on the Management of Civil Unmanned Aerial Vehicle Operation Safety; No. AC-91-03; Civil Aviation Administration of China: Beijing, China, 2017.
  57. Civil Aviation Administration of China (CAAC). Regulations on Real-Name Registration of Civil Unmanned Aircraft Systems; No. AP-45-AA-2017-03; Aircraft Airworthiness Certification Department, Civil Aviation Administration of China: Beijing, China, 2017.
  58. AAP-06 2020; NATO Glossary of Terms and Definitions. NATO Standardization Office: Brussels, Belgium, 2020.
  59. Dainelli, R.; Toscano, P.; Di Gennaro, S.F.; Matese, A. Recent Advances in Unmanned Aerial Vehicles Forest Remote Sensing—A Systematic Review. Part Ii: Research Applications. Forests 2021, 12, 397. [Google Scholar] [CrossRef]
  60. Duarte, A.; Borralho, N.; Cabral, P.; Caetano, M. Recent Advances in Forest Insect Pests and Diseases Monitoring Using UAV-Based Data: A Systematic Review. Forests 2022, 13, 911. [Google Scholar] [CrossRef]
  61. Olson, D.; Anderson, J. Review on Unmanned Aerial Vehicles, Remote Sensors, Imagery Processing, and Their Applications in Agriculture. Agron. J. 2021, 113, 971–992. [Google Scholar] [CrossRef]
  62. Velusamy, P.; Rajendran, S.; Mahendran, R.K.; Naseer, S.; Shafiq, M.; Choi, J.G. Unmanned Aerial Vehicles (Uav) in Precision Agriculture: Applications and Challenges. Energies 2022, 15, 217. [Google Scholar] [CrossRef]
  63. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 Statement: An Updated Guideline for Reporting Systematic Reviews. BMJ 2021, 372, 71. [Google Scholar] [CrossRef]
  64. Salazar, G.; Russi-Vigoya, M.N. Technology Readiness Level as the Foundation of Human Readiness Level. Ergon. Des. 2021, 29, 25–29. [Google Scholar] [CrossRef]
  65. Hirshorn, S.; Jefferies, S. Final Report of the NASA Technology Readiness Assessment (TRA) Study Team; NASA: Washington, DC, USA, 2016.
  66. Gini, R.; Passoni, D.; Pinto, L.; Sona, G. Aerial Images from an UAV System: 3D Modeling and Tree Species Classification in a Park Area. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, XXXIX-B1, 361–366. [Google Scholar] [CrossRef]
  67. Shen, N.; Feng, F.; Xu, C.; Li, X.; Chiriacò, M.V.; Lafortezza, R. Drone-Based Assessment of Urban Green Space Structure and Cooling Capacity. Urban For. Urban Green. 2025, 112, 128953. [Google Scholar] [CrossRef]
  68. Cao, Q.; Li, M.; Yang, G.; Tao, Q.; Luo, Y.; Wang, R.; Chen, P. Urban Vegetation Classification for Unmanned Aerial Vehicle Remote Sensing Combining Feature Engineering and Improved DeepLabV3+. Forests 2024, 15, 382. [Google Scholar] [CrossRef]
  69. Senanayake, S.M.R.B.; Herath, H.M.K.K.M.B.; Yasakethu, S.L.P.; Madhusanka, B.G.D.A. Semantic Segmentation in Unmanned Aerial Vehicle Surveillance for Detailed Urban Tree Mapping and Species Classification. In Proceedings of the 2024 6th International Conference on Advancements in Computing, ICAC 2024; Institute of Electrical and Electronics Engineers Inc.: Piscataway, NJ, USA, 2024; pp. 444–449. [Google Scholar]
  70. Akca, S. Evaluating Urban Green Spaces Using UAV-Based Green Leaf Index. Mersin Photogramm. J. 2024, 6, 52–59. [Google Scholar] [CrossRef]
  71. Lee, G.; Hwang, J.; Cho, S. A Novel Index to Detect Vegetation in Urban Areas Using Uav-Based Multispectral Images. Appl. Sci. 2021, 11, 3472. [Google Scholar] [CrossRef]
  72. Lin, J.; Chen, D.; Wu, W.; Liao, X. Estimating Aboveground Biomass of Urban Forest Trees with Dual-Source UAV Acquired Point Clouds. Urban For. Urban Green. 2022, 69, 127521. [Google Scholar] [CrossRef]
  73. Bin Shafaat, O.; Kauhanen, H.; Julin, A.; Jaalama, K.; Vaaja, M.T. Comparing Airborne Laser Scanning and UAV Photogrammetry for Estimating Aboveground Biomass of Individual Urban Trees in Helsinki. Urban For. Urban Green. 2025, 112, 128936. [Google Scholar] [CrossRef]
  74. Ghanbari Parmehr, E.; Amati, M. Individual Tree Canopy Parameters Estimation Using Uav-Based Photogrammetric and Lidar Point Clouds in an Urban Park. Remote Sens. 2021, 13, 2062. [Google Scholar] [CrossRef]
  75. Apollo, M.; Mostowska, J.; Maciuk, K.; Wengel, Y.; Jones, T.E.; Cheer, J.M. Peak-Bagging and Cartographic Misrepresentations: A Call to Correction. Curr. Issues Tour. 2021, 24, 1970–1975. [Google Scholar] [CrossRef]
  76. Li, R.; Bai, Z.; Ye, C.; Ablameyko, S.; Ye, S. Urban Green Space Vegetation Height Modeling and Intelligent Classification Based on UAV Multi-Spectral and Oblique High-Resolution Images. Urban For. Urban Green. 2025, 107, 128785. [Google Scholar] [CrossRef]
  77. Li, Y.; Wang, B.; Zhao, X.; Zhang, Y.; Qiao, L. Inversion and Analysis of Leaf Area Index (LAI) of Urban Park Based on Unmanned Aerial Vehicle (UAV) Multispectral Remote Sensing and Random Forest (RF). PLoS ONE 2025, 20, e0320608. [Google Scholar] [CrossRef]
  78. Cheng, H.; Wang, Y.; Shan, L.; Chen, Y.; Yu, K.; Liu, J. Mapping Fine-Scale Carbon Sequestration Benefits and Landscape Spatial Drivers of Urban Parks Using High-Resolution UAV Data. J. Environ. Manag. 2024, 370, 122319. [Google Scholar] [CrossRef]
  79. Wei, W.; Li, J. Assessing the Three-Dimensional Vegetation Carbon Sink of Urban Green Spaces Using Unmanned Aerial Vehicles and Machine Learning. Ecol. Indic. 2025, 173, 113380. [Google Scholar] [CrossRef]
  80. Li, S.; Li, W.; Yu, M.; Chen, D.; Xu, M.; Ren, M.; Yang, X. Urban Three-Dimension Green Quantity Estimation: An Approach Utilizing UAV, Satellite Imagery, and Machine Learning. Remote Sens. Appl. 2025, 39, 101691. [Google Scholar] [CrossRef]
  81. Yiğit, A.Y. Deep Learning-Based Palm Tree Detection for Urban Green Space Monitoring Using High-Resolution UAV Imagery. Trans. GIS 2025, 29, e70171. [Google Scholar] [CrossRef]
  82. Liu, Y.; Kong, G.; Shen, X.; Miao, S. A Fully Integrated Deep Learning Framework for Semantic Segmentation of Vegetation Classification Based on Active Learning Strategies and UAV Remote Sensing. In Proceedings of the Advances in Computer Science and Ubiquitous Computing. CUTECSA 2023; Lecture Notes in Electrical Engineering; Park, J.S., Yang, L.T., Pan, Y., Park, J.J., Eds.; Springer: Singapore, 2024; Volume 1190, pp. 247–252. [Google Scholar]
  83. Chen, B.; Mu, X.; Chen, P.; Wang, B.; Choi, J.; Park, H.; Xu, S.; Wu, Y.; Yang, H. Machine Learning-Based Inversion of Water Quality Parameters in Typical Reach of the Urban River by UAV Multispectral Data. Ecol. Indic. 2021, 133, 108434. [Google Scholar] [CrossRef]
  84. Chen, J.; Wang, J.; Feng, S.; Zhao, Z.; Wang, M.; Sun, C.; Song, N.; Yang, J. Study on Parameter Inversion Model Construction and Evaluation Method of UAV Hyperspectral Urban Inland Water Pollution Dynamic Monitoring. Water 2023, 15, 4131. [Google Scholar] [CrossRef]
  85. Lei, X.; Jiang, J.; Deng, Z.; Wu, D.; Wang, F.; Lai, C.; Wang, Z.; Chen, X. An Ensemble Machine Learning Model to Estimate Urban Water Quality Parameters Using Unmanned Aerial Vehicle Multispectral Imagery. Remote Sens. 2024, 16, 2246. [Google Scholar] [CrossRef]
  86. Li, H.; Wang, N.; Du, Z.; Huang, D.; Shi, M.; Zhong, Z.; Yuan, D. Multi-Parameter Water Quality Inversion in Heterogeneous Inland Waters Using UAV-Based Hyperspectral Data and Deep Learning Methods. Remote Sens. 2025, 17, 2191. [Google Scholar] [CrossRef]
  87. Liu, X.; Wang, Y.; Gu, X.; Li, M.; Lv, W.; Li, X.; Tang, R.; Chen, G.; Zhang, B.; Liu, S.; et al. Dynamic Mapping of Dissolved Oxygen in Freshwater Aquaculture Ponds Using UAV Multispectral Imagery. Ecol. Inform. 2025, 91, 103388. [Google Scholar] [CrossRef]
  88. Liu, C.; Zhou, X.; Zhou, Y.; Akbar, A. Multi-Temporal Monitoring of Urban River Water Quality Using Uav-Borne Multi-Spectral Remote Sensing. In Proceedings of the International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences-ISPRS Archives; International Society for Photogrammetry and Remote Sensing: Hannover, Germany, 2020; Volume 43, pp. 1469–1475. [Google Scholar]
  89. Nevárez, M.; Sigala, M. Estimation of Chlorophyll-a in Urban Lakes Using Drones. Tecnol. Y Cienc. Del Agua 2022, 13, 101–135. [Google Scholar] [CrossRef]
  90. Tang, Y.; Pan, Y.; Zhang, L.; Yi, H.; Gu, Y.; Sun, W. Efficient Monitoring of Total Suspended Matter in Urban Water Based on UAV Multi-Spectral Images. Water Resour. Manag. 2023, 37, 2143–2160. [Google Scholar] [CrossRef]
  91. Zheng, Z.; Jiang, Y.; Zhang, Q.; Zhong, Y.; Wang, L. A Feature Selection Method Based on Relief Feature Ranking with Recursive Feature Elimination for the Inversion of Urban River Water Quality Parameters Using Multispectral Imagery from an Unmanned Aerial Vehicle. Water 2024, 16, 1029. [Google Scholar] [CrossRef]
  92. Wu, D.; Jiang, J.; Wang, F.; Luo, Y.; Lei, X.; Lai, C.; Wu, X.; Xu, M. Retrieving Eutrophic Water in Highly Urbanized Area Coupling UAV Multispectral Data and Machine Learning Algorithms. Water 2023, 15, 354. [Google Scholar] [CrossRef]
  93. Hasyim, A.W.; Anggraini, I.A.; Usman, F.; Isdianto, A. Evaluating Urban Heat Island Effects in Malang City Parks Using UAV and OBIA Technologies. Int. J. Sustain. Dev. Plan. 2025, 20, 1633–1644. [Google Scholar] [CrossRef]
  94. Kim, W.; Kim, E.; Song, W. Evaluating the Cooling Benefits of Rainwater Spraying in Urban Environments Using Machine Learning and UAV Thermal Imaging. Landsc. Ecol. Eng. 2025, 21, 643–654. [Google Scholar] [CrossRef]
  95. Kim, D.; Yu, J.; Yoon, J.; Jeon, S.; Son, S. Comparison of Accuracy of Surface Temperature Images from Unmanned Aerial Vehicle and Satellite for Precise Thermal Environment Monitoring of Urban Parks Using in Situ Data. Remote Sens. 2021, 13, 1977. [Google Scholar] [CrossRef]
  96. Gu, J.; Kim, D.; Jun, C.; Son, S. Quantitative Assessment of Factors That Influence Heat Vulnerability in Residential Areas Using Machine Learning and Unmanned Aerial Vehicle. City Environ. Interact. 2025, 27, 100214. [Google Scholar] [CrossRef]
  97. Dieter, G.; McDonald, W. Drone Remote Sensing to Define Heat Exchange between Urban Surfaces and Stormwater Runoff. J. Hydroinform. 2024, 26, 2475–2488. [Google Scholar] [CrossRef]
  98. Trzeciak, M.; Sikorska, D. Application of UAV and Ground Measurements for Urban Vegetation Cooling Benefit Assessment, Wilanów Palace Case Study. Sci. Rev. Eng. Environ. Sci. 2024, 33, 53–68. [Google Scholar] [CrossRef]
  99. Cho, Y.I.; Jung, J.A.; Lee, M.J. Use of Unmanned Aerial Vehicles to Explore the Structural Characteristics of Urban Spaces for Outdoor Heat Stress Assessment and Comparative Analysis of Heat Island Cooling Strategies. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 15420–15430. [Google Scholar] [CrossRef]
  100. Cho, Y.I.; Yoon, D.; Lee, M.J. Comparative Analysis of Urban Heat Island Cooling Strategies According to Spatial and Temporal Conditions Using Unmanned Aerial Vehicles (UAV) Observation. Appl. Sci. 2023, 13, 10052. [Google Scholar] [CrossRef]
  101. Lee, S.B.; Kil, S.H.; Yun, Y.J.; Choi, Y.E. An Analysis of Surface Temperature Changes for Urban Green Space Using Unmanned Aerial Vehicles. J. People Plants Environ. 2022, 25, 685–701. [Google Scholar] [CrossRef]
  102. Xu, S.; Yang, K.; Xu, Y.; Zhu, Y.; Luo, Y.; Shang, C.; Zhang, J.; Zhang, Y.; Gao, M.; Wu, C. Urban Land Surface Temperature Monitoring and Surface Thermal Runoff Pollution Evaluation Using UAV Thermal Remote Sensing Technology. Sustainability 2021, 13, 11203. [Google Scholar] [CrossRef]
  103. Cano-Ciborro, V.; Medina, A.; Burgueño, A.; González-Rodríguez, M.; Díaz, D.; Zambrano, M.R. Mapping Public Space Micro Occupations: Drone Driven Predictions of Spatial Behaviors in Carapungo, Quito. Environ. Plan. B Urban Anal. City Sci. 2025, 52, 629–645. [Google Scholar] [CrossRef]
  104. Duan, L.; Cheng, J.; Huang, S.; Long, X.; Li, L.; Liu, W. Drone-Based Spatial Gait Analysis in an Urban Park Across Age Groups Using a Deep Learning Approach. In Proceedings of the 2025 IEEE 13th Conference on Serious Games and Applications for Health (SeGAH 2025); Institute of Electrical and Electronics Engineers Inc.: Piscataway, NJ, USA, 2025. [Google Scholar]
  105. Zhang, R.; Cao, L.; Wang, L.; Wang, L.; Wang, J.; Xu, N.; Luo, J. Assessing the Relationship between Urban Park Spatial Features and Physical Activity Levels in Residents: A Spatial Analysis Utilizing Drone Remote Sensing. Ecol. Indic. 2024, 166, 112520. [Google Scholar] [CrossRef]
  106. Sikorsky, K.; Sharp, R.; Wilkes, J.; Fefer, J.; Nelson, K. The Use of Drones for Recreational Impact Monitoring of Public Lands. J. Park Recreat. Adm. 2023, 41, 68–84. [Google Scholar] [CrossRef]
  107. Park, K.; Christensen, K.; Lee, D. Unmanned Aerial Vehicles (UAVs) in Behavior Mapping: A Case Study of Neighborhood Parks. Urban For. Urban Green. 2020, 52, 126693. [Google Scholar] [CrossRef]
  108. Park, K. Park and Neighborhood Attributes Associated with Park Use: An Observational Study Using Unmanned Aerial Vehicles. Environ. Behav. 2019, 52, 518–543. [Google Scholar] [CrossRef]
  109. Chen, L.; Pang, X.; Li, J.; Xing, B.; An, T.; Yuan, K.; Dai, S.; Wu, Z.; Wang, S.; Wang, Q.; et al. Vertical Profiles of O3, NO2 and PM in a Major Fine Chemical Industry Park in the Yangtze River Delta of China Detected by a Sensor Package on an Unmanned Aerial Vehicle. Sci. Total Environ. 2022, 845, 157113. [Google Scholar] [CrossRef]
  110. Han, L.; Zhao, J.; Zhang, J.; Gao, Y.; Xin, K. Vertical Distribution of Urban Near-Surface Pollutant PM2.5 Based on UAV Monitoring Platform. Chem. Eng. Trans. 2018, 71, 25–30. [Google Scholar] [CrossRef]
  111. Xin, K.; Zhao, J.; Ma, X.; Han, L.; Liu, Y.; Zhang, J.; Gao, Y. Effect of Urban Underlying Surface on PM2.5 Vertical Distribution Based on UAV in Xi’an, China. Environ. Monit. Assess. 2021, 193, 312. [Google Scholar] [CrossRef]
  112. Kokate, P.; Middey, A.; Sadistap, S. Atmospheric CO2 Level Measurement and Discomfort Index Calculation with the Use of Low-Cost Drones. Eng. Technol. Appl. Sci. Res. 2023, 13, 11728–11734. [Google Scholar] [CrossRef]
  113. Bakirci, M. Evaluating the Impact of Unmanned Aerial Vehicles (UAVs) on Air Quality Management in Smart Cities: A Comprehensive Analysis of Transportation-Related Pollution. Comput. Electr. Eng. 2024, 119, 109556. [Google Scholar] [CrossRef]
  114. Lyu, R.; Zhang, J.; Pang, J.; Zhang, J. Modeling the Impacts of 2D/3D Urban Structure on PM2.5 at High Resolution by Combining UAV Multispectral/LiDAR Measurements and Multi-Source Remote Sensing Images. J. Clean. Prod. 2024, 437, 140613. [Google Scholar] [CrossRef]
  115. Dobrzański, M.; Muniak, D.P.; Müller, J.; Cichowicz, R. The Impact of Power Units on Air Quality on a University Campus Located in the Center of an Urban Agglomeration. Energy 2025, 324, 135993. [Google Scholar] [CrossRef]
  116. Pochwała, S.; Gardecki, A.; Lewandowski, P.; Somogyi, V.; Anweiler, S. Developing of Low-Cost Air Pollution Sensor—Measurements with the Unmanned Aerial Vehicles in Poland. Sensors 2020, 20, 3582. [Google Scholar] [CrossRef] [PubMed]
  117. Klimczyk, M. The Concept of a Collection System for Gas Mixture from the Interior of Chimney Openings for Unmanned Flying Systems. Adv. Sci. Technol. Res. J. 2021, 15, 191–196. [Google Scholar] [CrossRef]
  118. Polat, N.; Memduhoğlu, A. Assessing Spatiotemporal LST Variations in Urban Landscapes Using Diurnal UAV Thermography. Appl. Sci. 2025, 15, 3448. [Google Scholar] [CrossRef]
  119. Jech, J.; Komarkova, J.; Sedlak, P. Land Cover Change Detection near Small Water Bodies Based on RGB UAV Data: Case Study of the Pond Baroch, Czech Republic. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2022, 43, 617–623. [Google Scholar] [CrossRef]
  120. Jumaat, N.F.H.; Ahmad, B.; Dutsenwai, H.S. Land Cover Change Mapping Using High Resolution Satellites and Unmanned Aerial Vehicle. In Proceedings of the IOP Conference Series: Earth and Environmental Science; Institute of Physics Publishing: Bristol, UK, 2018; Volume 169. [Google Scholar]
  121. Efimov, D.; Shablov, A.; Shavalieva, E. Environmental Monitoring in the “Land–Water” Contact Zone of Water Bodies with the Help of Small Unmanned Aerial Vehicles. In Proceedings of the 10th International Conference on Recent Advances in Civil Aviation; Lecture Notes in Mechanical Engineering; Gorbachev, O.A., Gao, X., Li, B., Eds.; Springer: Singapore, 2023; pp. 405–412. [Google Scholar]
  122. Jia, J.; Cui, W.; Liu, J. Urban Catchment-Scale Blue-Green-Gray Infrastructure Classification with Unmanned Aerial Vehicle Images and Machine Learning Algorithms. Front. Environ. Sci. 2022, 9, 778598. [Google Scholar] [CrossRef]
  123. Li, W.; Li, Y.; Gong, J.; Feng, Q.; Zhou, J.; Sun, J.; Shi, C.; Hu, W. Urban Water Extraction with UAV High-Resolution Remote Sensing Data Based on an Improved U-Net Model. Remote Sens. 2021, 13, 3165. [Google Scholar] [CrossRef]
  124. Pillay, S.J.; Bangira, T.; Sibanda, M.; Kebede Gurmessa, S.; Clulow, A.; Mabhaudhi, T. Assessing Drone-Based Remote Sensing for Monitoring Water Temperature, Suspended Solids and CDOM in Inland Waters: A Global Systematic Review of Challenges and Opportunities. Drones 2024, 8, 733. [Google Scholar] [CrossRef]
  125. Hagh, S.F.; Amngostar, P.; Zylka, A.; Zimmerman, M.; Cresanti, L.; Karins, S.; O’Neil-Dunne, J.P.; Ritz, K.; Williams, C.J.; Morales-Williams, A.M.; et al. Autonomous UAV-Mounted LoRaWAN System for Real-Time Monitoring of Harmful Algal Blooms (HABs) and Water Quality. IEEE Sens. J. 2024, 24, 11414–11424. [Google Scholar] [CrossRef]
  126. Lin, B.; Xu, J.; Yin, C.; Chen, L.; You, Y.; Hu, L. An Ultralight Dual-Wavelength and Dual-Beam Chemical Sensor on Small UAV for in-Situ Determination of Phosphate and Nitrite in Surface Water. Sens. Actuators B Chem. 2022, 368, 132235. [Google Scholar] [CrossRef]
  127. Sanim, K.R.I.; Kalaitzakis, M.; Kosaraju, B.; Kitzhaber, Z.; English, C.; Vitzilaios, N.; Myrick, M.; Hodgson, M.; Richardson, T. Development of an Aerial Drone System for Water Analysis and Sampling. In Proceedings of the 2022 International Conference on Unmanned Aircraft Systems, ICUAS 2022; Institute of Electrical and Electronics Engineers Inc.: Piscataway, NJ, USA, 2022; pp. 1601–1607. [Google Scholar]
  128. Shelare, S.D.; Aglawe, K.R.; Waghmare, S.N.; Belkhode, P.N. Advances in Water Sample Collections with a Drone—A Review. In Proceedings of the Materials Today: Proceedings; Elsevier Ltd.: Amsterdam, The Netherlands, 2021; Volume 47, pp. 4490–4494. [Google Scholar]
  129. Bin Shafaat, O.; Kauhanen, H.; Julin, A.; Vaaja, M. Unveiling Urban Vegetation Monitoring: Integrating Multitemporal Terrestrial Laser Scanning and UAV Photogrammetry Datasets for Change Detection. In Proceedings Volume 13198, Remote Sensing Technologies and Applications in Urban Environments IX; SPIE The International Society for Optical Engineering: Edinburgh, UK, 2024; p. 101915. [Google Scholar]
  130. Estrada, J.S.; Fuentes, A.; Reszka, P.; Auat Cheein, F. Machine Learning Assisted Remote Forestry Health Assessment: A Comprehensive State of the Art Review. Front. Plant Sci. 2023, 14, 1139232. [Google Scholar] [CrossRef] [PubMed]
  131. Schiefer, F.; Kattenborn, T.; Frick, A.; Frey, J.; Schall, P.; Koch, B.; Schmidtlein, S. Mapping Forest Tree Species in High Resolution UAV-Based RGB-Imagery by Means of Convolutional Neural Networks. ISPRS J. Photogramm. Remote Sens. 2020, 170, 205–215. [Google Scholar] [CrossRef]
  132. Lin, F.C.; Chuang, Y.C. Interoperability Study of Data Preprocessing for Deep Learning and High-Resolution Aerial Photographs for Forest and Vegetation Type Identification. Remote Sens. 2021, 13, 4036. [Google Scholar] [CrossRef]
  133. Akhie, A.A.; Joksimovic, D. Monitoring of a Productive Blue-Green Roof Using Low-Cost Sensors. Sensors 2023, 23, 9788. [Google Scholar] [CrossRef]
  134. Ecke, S.; Dempewolf, J.; Frey, J.; Schwaller, A.; Endres, E.; Klemmt, H.; Tiede, D.; Seifert, T. UAV-Based Forest Health Monitoring: A Systematic Review. Remote Sens. 2022, 14, 3205. [Google Scholar] [CrossRef]
  135. Puniach, E.; Gruszczyński, W.; Ćwiąkała, P.; Strząbała, K.; Pastucha, E. Recognition of Urbanized Areas in UAV-Derived Very-High-Resolution Visible-Light Imagery. Remote Sens. 2024, 16, 3444. [Google Scholar] [CrossRef]
  136. Marcial-Pablo, M.d.J.; Gonzalez-Sanchez, A.; Jimenez-Jimenez, S.I.; Ontiveros-Capurata, R.E.; Ojeda-Bustamante, W. Estimation of Vegetation Fraction Using RGB and Multispectral Images from UAV. Int. J. Remote Sens. 2019, 40, 420–438. [Google Scholar] [CrossRef]
  137. Zhang, L.; Niu, Y.; Zhang, H.; Han, W.; Li, G.; Tang, J.; Peng, X. Maize Canopy Temperature Extracted from UAV Thermal and RGB Imagery and Its Application in Water Stress Monitoring. Front. Plant Sci. 2019, 10, 1270. [Google Scholar] [CrossRef]
  138. Wan, L.; Cen, H.; Zhu, J.; Zhang, J.; Zhu, Y.; Sun, D.; Du, X.; Zhai, L.; Weng, H.; Li, Y.; et al. Grain Yield Prediction of Rice Using Multi-Temporal UAV-Based RGB and Multispectral Images and Model Transfer—A Case Study of Small Farmlands in the South of China. Agric. For. Meteorol. 2020, 291, 108096. [Google Scholar] [CrossRef]
  139. Müllerová, J.; Gago, X.; Bučas, M.; Company, J.; Estrany, J.; Fortesa, J.; Manfreda, S.; Michez, A.; Mokroš, M.; Paulus, G.; et al. Characterizing Vegetation Complexity with Unmanned Aerial Systems (UAS)—A Framework and Synthesis. Ecol. Indic. 2021, 131, 108156. [Google Scholar] [CrossRef]
  140. Ćwiąkała, P.; Kocierz, R.; Puniach, E.; Nędzka, M.; Mamczarz, K.; Niewiem, W.; Wiącek, P. Documentation of Hiking Trails and Wooden Areas Using Unmanned Aerial Vehicles (UAV) in Tatra National Park. Infrastruct. Ecol. Rural. Areas 2017, IV/2/2017, 1545–1561. [Google Scholar] [CrossRef]
  141. Puniach, E.; Bieda, A.; Ćwiąkała, P.; Kwartnik-Pruc, A.; Parzych, P. Use of Unmanned Aerial Vehicles (UAVs) for Updating Farmland Cadastral Data in Areas Subject to Landslides. ISPRS Int. J. Geoinf. 2018, 7, 331. [Google Scholar] [CrossRef]
  142. Walusiak, G.; Witek, M.; Niedzielski, T. Histogram-Based Edge Detection for River Coastline Mapping Using UAV-Acquired RGB Imagery. Remote Sens. 2024, 16, 2565. [Google Scholar] [CrossRef]
  143. Onishi, M.; Ise, T. Explainable Identification and Mapping of Trees Using UAV RGB Image and Deep Learning. Sci. Rep. 2021, 11, 903. [Google Scholar] [CrossRef] [PubMed]
  144. Syetiawan, A.; Susetyo, D.B.; Lumban-Gaol, Y.; Susilo, S.; Ardha, M.; Susilo, Y.; Wahono. Deep Learning-Based Palm Tree Detection in Unmanned Aerial Vehicle Imagery with Mask R-CNN. Telkomnika (Telecommun. Comput. Electron. Control) 2025, 23, 156–165. [Google Scholar] [CrossRef]
  145. Fernandez-Gallego, J.A.; Kefauver, S.C.; Kerfal, S.; Araus, J.L. Comparative Canopy Cover Estimation Using RGB Images from UAV and Ground. In Proceedings of the Remote Sensing for Agriculture, Ecosystems, and Hydrology XX; Neale, C.M., Maltese, A., Eds.; SPIE: Bellingham, WA, USA, 2018; p. 20. [Google Scholar]
  146. Park, G.; Song, B.; Park, K. Mapping Individual Tree Crowns to Extract Morphological Attributes in Urban Areas Using Unmanned Aerial Vehicle-Based LiDAR and RGB Data. Ecol. Inform. 2025, 88, 103165. [Google Scholar] [CrossRef]
  147. Anzar, S.M.; Sherin, K.; Panthakkan, A.; Al Mansoori, S.; Al-Ahmad, H. Evaluation of UAV-Based RGB and Multispectral Vegetation Indices for Precision Agriculture in Palm Tree Cultivation. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2025, XLVIII-G-2025, 163–170. [Google Scholar] [CrossRef]
  148. Rodrigo-Comino, J.; Gatea Al-Shammary, A.A.; Duran-Zuazo, V.H.; Serrano-Bernardo, F.; Caballero-Calvo, A.; Rodriguez-Galiano, V. The Limits of RGB-Based Vegetation Indexes under Canopy Degradation: Insights from UAV Monitoring of Harvested Cereal Fields. Drones Auton. Veh. 2026, 3, 10021. [Google Scholar] [CrossRef]
  149. Starý, K.; Jelínek, Z.; Kumhálova, J.; Chyba, J.; Balážová, K. Comparing RGB-Based Vegetation Indices from Uav Imageries to Estimate Hops Canopy Area. Agron. Res. 2020, 18, 2592–2601. [Google Scholar] [CrossRef]
  150. Polat, N.; Memduhoğlu, A.; Kaya, Y. Triangular Greenness Index Analysis for Monitoring Fungal Disease in Pine Trees: A UAV-Based Approach. Bartın Orman Fak. Derg. 2024, 26, 1–15. [Google Scholar] [CrossRef]
  151. De Ocampo, A.L.P. Dynamic Coefficient Triangular Greenness Index for Aerial Phenotyping in a Liberica Coffee Farm. Rev. Int. Geomat. 2025, 34, 731–749. [Google Scholar] [CrossRef]
  152. Isibue, E.W.; Pingel, T.J. Unmanned Aerial Vehicle Based Measurement of Urban Forests. Urban For. Urban Green. 2020, 48, 126574. [Google Scholar] [CrossRef]
  153. Moreno, R.; Ojeda, N.; Azócar, J.; Venegas, C.; Inostroza, L. Application of NDVI for Identify Potentiality of the Urban Forest for the Design of a Green Corridors System in Intermediary Cities of Latin America: Case Study, Temuco, Chile. Urban For. Urban Green. 2020, 55, 126821. [Google Scholar] [CrossRef]
  154. Zhang, K.E.; Okazawa, H.; Yamazaki, Y.; Hayashi, K.; Tsuji, O. Relationship between NDVI and Canopy Cover Sensed by Small UAV Under Different Ground Resolution. IJERD—Int. J. Environ. Rural. Dev. 2021, 12, 122–128. [Google Scholar]
  155. Lee, G.; Kim, G.; Min, G.; Kim, M.; Jung, S.; Hwang, J.; Cho, S. Vegetation Classification in Urban Areas by Combining UAV-Based NDVI and Thermal Infrared Image. Appl. Sci. 2023, 13, 515. [Google Scholar] [CrossRef]
  156. Román, A.; Tovar-Sánchez, A.; Gauci, A.; Deidun, A.; Caballero, I.; Colica, E.; D’Amico, S.; Navarro, G. Water-Quality Monitoring with a UAV-Mounted Multispectral Camera in Coastal Waters. Remote Sens. 2023, 15, 237. [Google Scholar] [CrossRef]
  157. Gano, B.; Bhadra, S.; Vilbig, J.M.; Ahmed, N.; Sagan, V.; Shakoor, N. Drone-based Imaging Sensors, Techniques, and Applications in Plant Phenotyping for Crop Breeding: A Comprehensive Review. Plant Phenome J. 2024, 7, e20100. [Google Scholar] [CrossRef]
  158. Nguyen, C.; Sagan, V.; Bhadra, S.; Moose, S. UAV Multisensory Data Fusion and Multi-Task Deep Learning for High-Throughput Maize Phenotyping. Sensors 2023, 23, 1827. [Google Scholar] [CrossRef] [PubMed]
  159. Xiao, Y.; Guo, Y.; Yin, G.; Zhang, X.; Shi, Y.; Hao, F.; Fu, Y. UAV Multispectral Image-Based Urban River Water Quality Monitoring Using Stacked Ensemble Machine Learning Algorithms—A Case Study of the Zhanghe River, China. Remote Sens. 2022, 14, 3272. [Google Scholar] [CrossRef]
  160. Tang, H.; Miao, F.; Yang, J.; Wu, B.; Zhang, Q.; Hao, H. Considering the Composite Tree Attributes Extracted by UAV Can Improve the Accuracy of Street Tree Species Classification. Dendrobiology 2024, 91, 85–99. [Google Scholar] [CrossRef]
  161. Ferreira, M.P.; Martins, G.B.; de Almeida, T.M.H.; da Silva Ribeiro, R.; da Veiga Júnior, V.F.; da Silva Rocha Paz, I.; de Siqueira, M.F.; Kurtz, B.C. Estimating Aboveground Biomass of Tropical Urban Forests with UAV-Borne Hyperspectral and LiDAR Data. Urban For. Urban Green. 2024, 96, 128362. [Google Scholar] [CrossRef]
  162. Zhong, H.; Lin, W.; Liu, H.; Ma, N.; Liu, K.; Cao, R.; Wang, T.; Ren, Z. Identification of Tree Species Based on the Fusion of UAV Hyperspectral Image and LiDAR Data in a Coniferous and Broad-Leaved Mixed Forest in Northeast China. Front. Plant Sci. 2022, 13, 964769. [Google Scholar] [CrossRef]
  163. Mishra, V.; Avtar, R.; Prathiba, A.P.; Mishra, P.K.; Tiwari, A.; Sharma, S.K.; Singh, C.H.; Chandra Yadav, B.; Jain, K. Uncrewed Aerial Systems in Water Resource Management and Monitoring: A Review of Sensors, Applications, Software, and Issues. Adv. Civ. Eng. 2023, 2023, 3544724. [Google Scholar] [CrossRef]
  164. de Oliveira Farias, M.; Cirilo, J.A.; Ribeiro Neto, A. Unmanned Aerial Vehicles (UAVS) in Water Resources Management: A Systematic Review. Rev. Gestão Soc. E Ambient. 2025, 19, e012237. [Google Scholar] [CrossRef]
  165. Heinemann, S.; Siegmann, B.; Thonfeld, F.; Muro, J.; Jedmowski, C.; Kemna, A.; Kraska, T.; Muller, O.; Schultz, J.; Udelhoven, T.; et al. Land Surface Temperature Retrieval for Agricultural Areas Using a Novel UAV Platform Equipped with a Thermal Infrared and Multispectral Sensor. Remote Sens. 2020, 12, 1075. [Google Scholar] [CrossRef]
  166. Sagan, V.; Maimaitijiang, M.; Sidike, P.; Eblimit, K.; Peterson, K.T.; Hartling, S.; Esposito, F.; Khanal, K.; Newcomb, M.; Pauli, D.; et al. UAV-Based High Resolution Thermal Imaging for Vegetation Monitoring, and Plant Phenotyping Using ICI 8640 P, FLIR Vue Pro R 640, and Thermomap Cameras. Remote Sens. 2019, 11, 330. [Google Scholar] [CrossRef]
  167. Karahan, A.; Demircan, N.; Özgeriş, M.; Gökçe, O.; Karahan, F. Integration of Drones in Landscape Research: Technological Approaches and Applications. Drones 2025, 9, 603. [Google Scholar] [CrossRef]
  168. Koparan, C.; Koc, A.B.; Privette, C.V.; Sawyer, C.B. In Situ Water Quality Measurements Using an Unmanned Aerial Vehicle (UAV) System. Water 2018, 10, 264. [Google Scholar] [CrossRef]
  169. Koparan, C.; Koc, A.B.; Privette, C.V.; Sawyer, C.B. Autonomous in Situ Measurements of Noncontaminant Water Quality Indicators and Sample Collection with a UAV. Water 2019, 11, 604. [Google Scholar] [CrossRef]
  170. Maciuk, K. Different Approaches in GLONASS Orbit Computation from Broadcast Ephemeris. Geod. Vestn. 2016, 60, 455–466. [Google Scholar] [CrossRef]
  171. Zhang, R.; Wang, Z.; Li, X.; She, Z.; Wang, B. Water Quality Sampling and Multi-Parameter Monitoring System Based on Multi-Rotor UAV Implementation. Water 2023, 15, 2129. [Google Scholar] [CrossRef]
  172. Allers, M.; Ahrens, A.; Hitzemann, M.; Bock, H.; Wolf, T.; Radunz, J.; Meyer, F.; Wilsenack, F.; Zimmermann, S.; Ficks, A. Real-Time Remote Detection of Airborne Chemical Hazards—An Unmanned Aerial Vehicle (UAV) Carrying an Ion Mobility Spectrometer. IEEE Sens. J. 2023, 23, 16562–16570. [Google Scholar] [CrossRef]
  173. Camarillo-Escobedo, R.; Flores, J.L.; Marin-Montoya, P.; García-Torales, G.; Camarillo-Escobedo, J.M. Smart Multi-Sensor System for Remote Air Quality Monitoring Using Unmanned Aerial Vehicle and LoRaWAN. Sensors 2022, 22, 1706. [Google Scholar] [CrossRef] [PubMed]
  174. Leitner, S.; Feichtinger, W.; Mayer, S.; Mayer, F.; Krompetz, D.; Hood-Nowotny, R.; Watzinger, A. UAV-Based Sampling Systems to Analyse Greenhouse Gases and Volatile Organic Compounds Encompassing Compound-Specific Stable Isotope Analysis. Atmos. Meas. Tech. 2023, 16, 513–527. [Google Scholar] [CrossRef]
  175. Villa, T.F.; Salimi, F.; Morton, K.; Morawska, L.; Gonzalez, F. Development and Validation of a UAV Based System for Air Pollution Measurements. Sensors 2016, 16, 2202. [Google Scholar] [CrossRef]
  176. Jońca, J.; Pawnuk, M.; Bezyk, Y.; Arsen, A.; Sówka, I. Drone-Assisted Monitoring of Atmospheric Pollution—A Comprehensive Review. Sustainability 2022, 14, 11516. [Google Scholar] [CrossRef]
  177. EASA. Commission Implementing Regulation (EU) 2019/947 of 24 May 2019 on the Rules and Procedures for the Operation of Unmanned Aircraft Systems; European Union Aviation Safety Agency: Brussels, Belgium, 2019.
  178. YSI. EXO3s Multi-Parameter Sonde. Available online: https://www.ysi.com/exo3s (accessed on 2 March 2026).
  179. SPH Engineering. Remote Water Sampling System. Available online: https://shop.sphengineering.com/products/water-sampler (accessed on 2 March 2026).
  180. Scentroid. DR2000 Drone Based Air Quality Monitor. Available online: https://scentroid.com/products/analyzers/dr2000-flying-lab/ (accessed on 2 March 2026).
  181. DJI. Sniffer4D V2 Multi-Gas Detection System. Available online: https://enterprise.dji.com/ecosystem/sniffer-v2 (accessed on 2 March 2026).
  182. SPH Engineering. Falcon Plus TDLAS Methane Leak Detector. Available online: https://shop.sphengineering.com/products/falcon-plus-tdlas-methane-detector (accessed on 2 March 2026).
  183. Yang, Z.; Zhang, Y.; Zeng, J.; Yang, Y.; Jia, Y.; Song, H.; Lv, T.; Sun, Q.; An, J. AI-Driven Safety and Security for UAVs: From Machine Learning to Large Language Models. Drones 2025, 9, 392. [Google Scholar] [CrossRef]
  184. Khaliq, A.; Comba, L.; Biglia, A.; Ricauda Aimonino, D.; Chiaberge, M.; Gay, P. Comparison of Satellite and UAV-Based Multispectral Imagery for Vineyard Variability Assessment. Remote Sens. 2019, 11, 436. [Google Scholar] [CrossRef]
  185. Wang, X.; Peng, Y.; Shen, C. Efficient Feature Fusion for UAV Object Detection. arXiv 2025, arXiv:2501.17983. [Google Scholar] [CrossRef]
  186. Hua, W.; Chen, Q. A Survey of Small Object Detection Based on Deep Learning in Aerial Images. Res. Sq. 2023, 58, 162. [Google Scholar] [CrossRef]
  187. Tang, G.; Ni, J.; Zhao, Y.; Gu, Y.; Cao, W. A Survey of Object Detection for UAVs Based on Deep Learning. Remote Sens. 2023, 16, 149. [Google Scholar] [CrossRef]
  188. Medeiros, B.M.; Cândido, B.; Jimenez, P.A.J.; Avanzi, J.C.; Silva, M.L.N. UAV-Based Soil Water Erosion Monitoring: Current Status and Trends. Drones 2025, 9, 305. [Google Scholar] [CrossRef]
  189. Bozcan, I.; Kayacan, E. AU-AIR: A Multi-Modal Unmanned Aerial Vehicle Dataset for Low Altitude Traffic Surveillance. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA); IEEE: Piscataway, NJ, USA, 2020; pp. 8504–8510. [Google Scholar]
  190. Nepi, L.; Quattrini, G.; Pesaresi, S.; Mancini, A.; Pierdicca, R. AI-Based Estimation of Forest Plant Community Composition from UAV Imagery. Ecol. Inform. 2025, 90, 103199. [Google Scholar] [CrossRef]
  191. Bhatt, P.; Maclean, A.L. Comparison of High-Resolution NAIP and Unmanned Aerial Vehicle (UAV) Imagery for Natural Vegetation Communities Classification Using Machine Learning Approaches. GIsci. Remote Sens. 2023, 60, 2177448. [Google Scholar] [CrossRef]
  192. Dong, J.; Zhang, J.; Zhang, S.; Yu, Z.; Song, Z.; Meng, T. Vegetation Extraction through UAV RGB Imagery and Efficient Feature Selection. PLoS ONE 2025, 20, e0322180. [Google Scholar] [CrossRef] [PubMed]
  193. Wang, M.; Zhang, Z.; Gao, R.; Zhang, J.; Feng, W. Unmanned Aerial Vehicle (UAV) Imagery for Plant Communities: Optimizing Visible Light Vegetation Index to Extract Multi-Species Coverage. Plants 2025, 14, 1677. [Google Scholar] [CrossRef]
  194. Miao, S.; Zhang, K.; Zeng, H.; Liu, J. Improving Artificial-Intelligence-Based Individual Tree Species Classification Using Pseudo Tree Crown Derived from Unmanned Aerial Vehicle Imagery. Remote Sens. 2024, 16, 1849. [Google Scholar] [CrossRef]
  195. Wu, X.; Li, W.; Hong, D.; Tao, R.; Du, Q. Deep Learning for Unmanned Aerial Vehicle-Based Object Detection and Tracking: A Survey. IEEE Geosci. Remote Sens. Mag. 2022, 10, 91–124. [Google Scholar] [CrossRef]
  196. Huang, Y.; Ou, B.; Meng, K.; Yang, B.; Carpenter, J.; Jung, J.; Fei, S. Tree Species Classification from UAV Canopy Images with Deep Learning Models. Remote Sens. 2024, 16, 3836. [Google Scholar] [CrossRef]
  197. Głowienka, E.; Zembol, N. Forest Community Mapping Using Hyperspectral (CHRIS/PROBA) and Sentinel-2 Multispectral Images. Geomat. Environ. Eng. 2022, 16, 103–117. [Google Scholar] [CrossRef]
  198. Feng, X.; He, L.; Cheng, Q.; Long, X.; Yuan, Y. Hyperspectral and Multispectral Remote Sensing Image Fusion Based on Endmember Spatial Information. Remote Sens. 2020, 12, 1009. [Google Scholar] [CrossRef]
  199. Yuan, Z.; Gong, J.; Guo, B.; Wang, C.; Liao, N.; Song, J.; Wu, Q. Small Object Detection in UAV Remote Sensing Images Based on Intra-Group Multi-Scale Fusion Attention and Adaptive Weighted Feature Fusion Mechanism. Remote Sens. 2024, 16, 4265. [Google Scholar] [CrossRef]
  200. Lu, S.; Lu, H.; Dong, J.; Wu, S. Object Detection for UAV Aerial Scenarios Based on Vectorized IOU. Sensors 2023, 23, 3061. [Google Scholar] [CrossRef]
  201. Tao, S.; Yang, M.; Wang, M.; Yang, R.; Shen, Q. Small Object Change Detection in UAV Imagery via a Siamese Network Enhanced with Temporal Mutual Attention and Contextual Features: A Case Study Concerning Solar Water Heaters. ISPRS J. Photogramm. Remote Sens. 2024, 218, 352–367. [Google Scholar] [CrossRef]
  202. Abu-Khadrah, A.; Al-Qerem, A.; Hassan, M.R.; Ali, A.M.; Jarrah, M. Drone-Assisted Adaptive Object Detection and Privacy-Preserving Surveillance in Smart Cities Using Whale-Optimized Deep Reinforcement Learning Techniques. Sci. Rep. 2025, 15, 9931. [Google Scholar] [CrossRef]
  203. Wang, D.; Xing, S.; He, Y.; Yu, J.; Xu, Q.; Li, P. Evaluation of a New Lightweight UAV-Borne Topo-Bathymetric LiDAR for Shallow Water Bathymetry and Object Detection. Sensors 2022, 22, 1379. [Google Scholar] [CrossRef] [PubMed]
  204. Yang, X.; Chen, J.; Lu, X.; Liu, H.; Liu, Y.; Bai, X.; Qian, L.; Zhang, Z. Advances in UAV Remote Sensing for Monitoring Crop Water and Nutrient Status: Modeling Methods, Influencing Factors, and Challenges. Plants 2025, 14, 2544. [Google Scholar] [CrossRef] [PubMed]
  205. Liu, B.; Li, T. A Machine-Learning-Based Framework for Retrieving Water Quality Parameters in Urban Rivers Using UAV Hyperspectral Images. Remote Sens. 2024, 16, 905. [Google Scholar] [CrossRef]
  206. Ngo, P.L.; Pham, V.H.; Bui, N.L.; Phan, H.A.T.; Vo, H.B.; Velavan, T.P.; Tran, D.K. Detection of Small Water Bodies for Vector Control Using Deep Learning on Multispectral Imagery from Unmanned Aerial Vehicles. Discov. Artif. Intell. 2025, 5, 170. [Google Scholar] [CrossRef]
  207. Alvarez-Mendoza, C.I.; Guzman, D.; Casas, J.; Bastidas, M.; Polanco, J.; Valencia-Ortiz, M.; Montenegro, F.; Arango, J.; Ishitani, M.; Selvaraj, M.G. Predictive Modeling of Above-Ground Biomass in Brachiaria Pastures from Satellite and UAV Imagery Using Machine Learning Approaches. Remote Sens. 2022, 14, 5870. [Google Scholar] [CrossRef]
  208. Zheng, C.; Abd-Elrahman, A.; Whitaker, V.; Dalid, C. Prediction of Strawberry Dry Biomass from UAV Multispectral Imagery Using Multiple Machine Learning Methods. Remote Sens. 2022, 14, 4511. [Google Scholar] [CrossRef]
  209. Alba, E.; Morais, J.E.F.d.; Santos, W.V.T.d.; Silva, J.E.d.S.; Oresca, D.; de Souza, L.S.B.; Bezerra, A.C.; Silva, E.A.; Silva, T.G.F.d.; Inacio Silva, J.R. Advances in Semi-Arid Grassland Monitoring: Aboveground Biomass Estimation Using UAV Data and Machine Learning. Grasses 2025, 4, 48. [Google Scholar] [CrossRef]
  210. Dlamini, C.M.; Odindi, J.; Matongera, T.N.; Mutanga, O. The Use of Unmanned Aerial Vehicle (UAV) Remotely Sensed Data and Biophysical Variables to Predict Maize Above-Ground Biomass (AGB) in Small-Scale Farming Systems. Remote Sens. Appl. 2025, 39, 101706. [Google Scholar] [CrossRef]
  211. Tunca, E.; Köksal, E.S.; Akay, H.; Öztürk, E.; Taner, S. Novel Machine Learning Framework for High-Resolution Sorghum Biomass Estimation Using Multi-Temporal UAV Imagery. Int. J. Environ. Sci. Technol. 2025, 22, 13673–13688. [Google Scholar] [CrossRef]
  212. Lin, C.-Y.; Tsai, M.-S.; Tsai, J.T.H.; Lu, C.-C. Prediction of Carlson Trophic State Index of Small Inland Water from UAV-Based Multispectral Image Modeling. Appl. Sci. 2022, 13, 451. [Google Scholar] [CrossRef]
  213. Liu, B.; Zhu, X.X.; Ding, Q.; Li, P.; Xi, H.; Li, T.; Luo, H. Integrated Retrieval of Water Quality Parameters Using UAV Hyperspectral Images and Satellite Imagery: Leveraging Deep Learning and Attention Mechanisms for Precision. Ecol. Indic. 2025, 179, 114191. [Google Scholar] [CrossRef]
  214. Ndlovu, H.S.; Odindi, J.; Sibanda, M.; Mutanga, O. A Systematic Review on the Application of UAV-Based Thermal Remote Sensing for Assessing and Monitoring Crop Water Status in Crop Farming Systems. Int. J. Remote Sens. 2024, 45, 4923–4960. [Google Scholar] [CrossRef]
  215. Quoc, V.; Nguyen, T.; Trung, H.; Nguyen, M. An Assessment of Green Space, Blue Space and Green Infrastructure Using Remote Sensing Approach; Research Report No. DMI-0111/2019; Vietnam National Space Center, Vietnam Academy of Science and Technology: Hanoi, Vietnam, 2019.
  216. Atkinson Amorim, J.G.; Schreiber, L.V.; de Souza, M.R.Q.; Negreiros, M.; Susin, A.; Bredemeier, C.; Trentin, C.; Vian, A.L.; de Oliveira Andrades-Filho, C.; Doering, D.; et al. Biomass Estimation of Spring Wheat with Machine Learning Methods Using UAV-Based Multispectral Imaging. Int. J. Remote Sens. 2022, 43, 4758–4773. [Google Scholar] [CrossRef]
  217. Bayomi, N.; Fernandez, J.E. Eyes in the Sky: Drones Applications in the Built Environment under Climate Change Challenges. Drones 2023, 7, 637. [Google Scholar] [CrossRef]
  218. Jakubiak, M.; Sroka, K.; Maciuk, K.; Abazeed, A.; Kovalova, A.; Santos, L. Unmanned Aerial Vehicles (UAVs) in the Energy and Heating Sectors: Current Practices and Future Directions. Energies 2025, 19, 5. [Google Scholar] [CrossRef]
  219. Mohsan, S.A.H.; Khan, M.A.; Noor, F.; Ullah, I.; Alsharif, M.H. Towards the Unmanned Aerial Vehicles (UAVs): A Comprehensive Review. Drones 2022, 6, 147. [Google Scholar] [CrossRef]
  220. Liu, M.; Liu, S. Comparative Study on the Legal Supervision System of Low-Altitude Aircraft in China and Europe Based on Key Risks. Eng. Proc. 2024, 80, 13. [Google Scholar] [CrossRef]
  221. Grote, M.; Pilko, A.; Scanlan, J.; Cherrett, T.; Dickinson, J.; Smith, A.; Oakey, A.; Marsden, G. Sharing Airspace with Uncrewed Aerial Vehicles (UAVs): Views of the General Aviation (GA) Community. J. Air Transp. Manag. 2022, 102, 102218. [Google Scholar] [CrossRef]
	222. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC (General Data Protection Regulation); Publications Office of the European Union: Luxembourg, 2016; Volume L 119, pp. 1–88.
  223. Bassi, E.; Bloise, N.; Dirutigliano, J.; Fici, G.P.; Pagallo, U.; Primatesta, S.; Quagliotti, F. The Design of GDPR-Abiding Drones Through Flight Operation Maps: A Win–Win Approach to Data Protection, Aerospace Engineering, and Risk Management. Minds Mach. 2019, 29, 579–601. [Google Scholar] [CrossRef]
  224. Fehling, C.; Saraceni, A. Technical and Legal Critical Success Factors: Feasibility of Drones & AGV in the Last-Mile-Delivery. Res. Transp. Bus. Manag. 2023, 50, 101029. [Google Scholar] [CrossRef]
  225. Fadhil, T.H.; Al-Haddad, L.A.; Al-Karkhi, M.I. Legal Accountability and UAV Fault Diagnosis Explainable AI in Aviation Safety and Regulatory Compliance for Liability Challenges. Discov. Artif. Intell. 2025, 5, 410. [Google Scholar] [CrossRef]
  226. He, M.; Chen, Y. Personal Data Protection in China: Progress, Challenges and Prospects in the Age of Big Data and AI. Telecomm. Policy 2025, 49, 103076. [Google Scholar] [CrossRef]
Figure 1. Analysis of the publication year of articles for search queries.
Figure 2. The flow chart of the research framework for article selection based on the PRISMA guidelines [63].
Figure 3. Thematic distribution (%) of analysed publications concerning UAV-based monitoring of urban blue–green infrastructure (BGI).
Figure 4. Methodological framework for UAV-enabled data acquisition and processing architecture for multidimensional urban BGI analysis.
Figure 5. High-resolution UAV-based RGB imagery enabling the precise identification and spatial mapping of individual tree crowns (Eucalyptus plantation, Portugal, 2025).
Figure 6. Comparative visualisation of the cooling effect of urban greenery in a city square, captured by UAV: (a) thermal infrared imaging showing the spatial distribution of surface temperature, where lower thermal values correspond to vegetated areas, demonstrating evapotranspiration cooling and shading functions; (b) high-resolution RGB optical imagery providing true-colour context and the precise location of the monitored urban green infrastructure (Ponta do Sol, Cape Verde, 2025).
Figure 7. The Integrative UAV-BGI Framework: Synergizing Multi-Sensor Data Streams for Holistic Urban Ecosystem Monitoring.
Table 1. The three search queries used in the search strategy.
| Search Query No. | Topic | Search String |
|---|---|---|
| 1 | UAV terminology | ('unmanned aerial vehicle*' OR UAV OR drone*) |
| 2 | BGI | ('blue-green infrastructure' OR 'green infrastructure' OR 'urban green space*' OR 'urban vegetation' OR 'urban water*' OR 'urban reservoir' OR 'urban lake' OR 'urban wetland' OR 'urban water bod*' OR park*) |
| 3 | Management, monitoring, and assessment | ('monitor*' OR 'manag*' OR 'assess*' OR 'mapping' OR 'maintenance' OR 'inspection' OR 'spray*' OR 'sampl*') |
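The three query blocks above are combined conjunctively in the database search. As an illustrative sketch (the exact concatenation syntax depends on the database; the separate 'inspection' and 'spray*' terms here are an assumed reading of the third block), the full search string can be assembled like this:

```python
# Hypothetical reconstruction of the combined search string from Table 1.
# Each block is ORed internally; the three blocks are joined with AND.
uav = "('unmanned aerial vehicle*' OR UAV OR drone*)"
bgi = ("('blue-green infrastructure' OR 'green infrastructure' OR "
       "'urban green space*' OR 'urban vegetation' OR 'urban water*' OR "
       "'urban reservoir' OR 'urban lake' OR 'urban wetland' OR "
       "'urban water bod*' OR park*)")
actions = ("('monitor*' OR 'manag*' OR 'assess*' OR 'mapping' OR "
           "'maintenance' OR 'inspection' OR 'spray*' OR 'sampl*')")

query = " AND ".join([uav, bgi, actions])
print(query)
```

Wildcards (`*`) and quoted phrases are interpreted by the bibliographic database itself (e.g., Scopus or Web of Science), not by this snippet.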
Table 3. UAV-mounted sensors for monitoring urban blue–green infrastructure: components and outputs.
| Sensor Type | BGI Components | Typical Outputs | Management Relevance |
|---|---|---|---|
| RGB optical | urban vegetation, parks, green roofs, riverbanks | Orthophotos, canopy cover, crown delineation, land cover maps | Inventory of green spaces, visual inspection, change detection |
| Multispectral | urban green infrastructure, small water bodies | Vegetation indices (NDVI, GNDVI), chlorophyll proxies, turbidity indicators | Vegetation health monitoring, seasonal dynamics, basic water quality assessment |
| Hyperspectral | urban vegetation, lakes, rivers | Species-level classification, stress indicators, detailed water quality parameters | Advanced diagnostics, early stress detection, targeted interventions |
| Thermal infrared | parks, water bodies, permeable surfaces | Land surface temperature, cooling intensity, thermal heterogeneity | Urban heat mitigation assessment, climate adaptation planning |
| LiDAR | trees, terrain, mixed green–blue structures | Vegetation height, canopy volume, digital terrain models | Structural analysis, biomass estimation, flood and runoff modelling |
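The vegetation indices listed for multispectral sensors follow standard band-ratio formulas. As a minimal sketch (assuming reflectance rasters already co-registered as NumPy arrays; band names are illustrative), NDVI is computed per pixel as (NIR − Red) / (NIR + Red):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Inputs are per-pixel reflectance arrays from a multispectral sensor.
    Pixels with zero total reflectance (e.g., deep shadow) are set to NaN
    to avoid division by zero.
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    return np.where(denom == 0, np.nan, (nir - red) / denom)

# Healthy vegetation reflects strongly in NIR, so NDVI approaches +1;
# stressed vegetation, bare soil, and water give low or negative values.
nir_band = np.array([[0.45, 0.50], [0.05, 0.30]])
red_band = np.array([[0.05, 0.10], [0.04, 0.30]])
print(ndvi(nir_band, red_band))
```

GNDVI follows the same pattern with the green band substituted for red, which is why both indices appear together in multispectral workflows.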
Table 5. Classification of UAV sensor categories according to Technology Readiness Levels (TRL).
| Sensor Category | TRL Score | Primary Application Domain | Rationale for Classification |
|---|---|---|---|
| RGB cameras | 9 | Structural mapping, 3D tree modeling, land-cover classification | Mature workflows and a high degree of image-processing automation; routinely used in urban green infrastructure management. |
| LiDAR | 9 | Accurate biomass estimation, vertical structure analysis, digital terrain modeling | Standardized 3D data acquisition; high precision in penetrating complex urban canopies; industry-standard hardware. |
| Multispectral cameras | 9 | Basic vegetation health indices (e.g., NDVI), chlorophyll estimation | Established spectral indices; commercially available integrated sensors; minor challenges remain in atmospheric correction. |
| Thermal cameras | 6–7 | Surface temperature mapping, Urban Heat Island (UHI) monitoring, cooling efficiency | Highly sensitive to changing environmental conditions, requiring specific flight scheduling; still largely confined to scientific research. |
| Hyperspectral cameras | 5–6 | Tree species identification, early disease detection, complex biochemical mapping | High computational load, significant hardware costs, and no automated city-scale data-processing pipelines. |
| Contact water samplers/in situ probes | 4–6 | Real-time water quality, water sample collection from inaccessible sites | Experimental payload integration, significant operational risks (water takeoff/landing), no standard submersion protocols. |
| Gas/air quality sensors | 5 | Vertical pollutant profiling, emission source detection | Interference from rotor downwash; requires special conditions and flight patterns to ensure data integrity. |
Table 6. Technology Readiness Level (TRL) assessment of UAV-based BGI monitoring applications.
| BGI Monitoring Domain | TRL Score | Maturity Status | Rationale for Classification |
|---|---|---|---|
| Vegetation health (GI) | 9 | Fully Operational | Standardized workflows using multispectral sensors provide high-accuracy vegetation indices (NDVI, NDRE). Calibration protocols are mature, and results correlate strongly with ground-truth physiological data. |
| Structural/3D mapping | 9 | Fully Operational | Photogrammetric and LiDAR-based point clouds have reached sub-centimeter precision. Automated generation of Canopy Height Models (CHM) and Digital Terrain Models (DTM) is now a standard tool in urban forestry and flood risk assessment. |
| Surface temperature/microclimate | 6–7 | System Prototype | Thermal sensors are mature, but urban environments introduce significant noise due to variations in the emissivity of artificial materials and "urban canyon" effects. Absolute temperature accuracy still requires complex atmospheric and surface-specific corrections. |
| Water quality (hyperspectral) | 6 | Operational Demo | Effective for detecting surface-level parameters such as turbidity, chlorophyll-a, and cyanobacteria blooms using specialized spectral bands. However, vertical profile analysis and bathymetry in turbid urban waters remain challenging without extensive in situ calibration. |
| Water and air quality (sensors) | 5 | Experimental | The sensors themselves are mature, but their integration and use on UAV platforms is, in many cases, still experimental. |
| Social/human–nature | 4 | Conceptual/Pilot | The technical capability to track human flows exists, but deployment is bottlenecked by stringent legal frameworks (such as GDPR) and ethical concerns. Research is currently limited to anonymized pilot studies with restricted operational scalability. |
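The Canopy Height Model mentioned for the structural/3D mapping domain is derived by differencing two raster products: the Digital Surface Model (top of canopy) and the Digital Terrain Model (bare earth). A minimal sketch, assuming both rasters are already co-registered NumPy arrays in metres (elevation values here are illustrative):

```python
import numpy as np

def canopy_height_model(dsm: np.ndarray, dtm: np.ndarray) -> np.ndarray:
    """CHM = DSM (top of surface) minus DTM (bare earth).

    Small negative residuals, which can occur over bare ground due to
    interpolation noise, are clipped to zero so heights stay physical.
    """
    chm = dsm.astype(np.float64) - dtm.astype(np.float64)
    return np.clip(chm, 0.0, None)

dsm = np.array([[212.5, 210.1], [208.0, 207.9]])  # surface elevations (m)
dtm = np.array([[200.0, 200.2], [208.1, 207.9]])  # ground elevations (m)
print(canopy_height_model(dsm, dtm))  # tree heights; bare-ground pixels -> 0.0
```

In production workflows the same subtraction is applied to georeferenced rasters (e.g., via rasterio or GDAL) after resampling both models to a common grid.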