Article

An Effective Process to Use Drones for Above-Ground Biomass Estimation in Agroforestry Landscapes

by
Andsera Adugna Mekonen
1,
Claudia Conte
1,* and
Domenico Accardo
1,2
1
Department of Industrial Engineering, University of Naples Federico II, Piazzale Tecchio 80, 80125 Naples, Italy
2
Center for Advanced Metrology and Technology Services CeSMA, University of Naples Federico II, Corso Nicolangelo Protopisani, 70, 80146 Naples, Italy
*
Author to whom correspondence should be addressed.
Aerospace 2025, 12(11), 1001; https://doi.org/10.3390/aerospace12111001
Submission received: 17 September 2025 / Revised: 5 November 2025 / Accepted: 6 November 2025 / Published: 8 November 2025
(This article belongs to the Section Aeronautics)

Abstract

Above-ground biomass in agroforestry refers to the total mass of living vegetation, primarily trees and shrubs, integrated into agricultural landscapes. It plays a key role in climate change mitigation by capturing and storing carbon. Accurate estimation of above-ground biomass in agroforestry systems requires effective drone deployment and sensor management. This study presents a detailed methodology for biomass estimation using Unmanned Aircraft Systems, based on an experimental campaign conducted in the Campania region of Italy. Multispectral drone platforms were used to generate calibrated reflectance maps and derive vegetation indices for biomass estimation in agroforestry landscapes. Integrating field-measured tree attributes with remote sensing indices improved the accuracy and efficiency of biomass prediction. Following the assessment of mission parameters, flights were conducted using a commercial drone to demonstrate consistency of results across multiple altitudes. Terrain-follow mode and high image overlap were employed to evaluate ground sampling distance sensitivity, radiometric performance, and overall data quality. The outcome is a defined process that enables agronomists to effectively estimate above-ground biomass in agroforestry landscapes using drone platforms, following the procedure outlined in this paper. Predictive performance was evaluated using standard model metrics, including R2, RMSE, and MAE, which are essential for replicability and comparison in future studies.

1. Introduction

Above-Ground Biomass (AGB) refers to the total mass of living plant components located above the soil surface, including stems, branches, foliage, and reproductive structures. It is typically expressed in units of mass per unit area, such as megagrams per hectare (Mg/ha). AGB serves as a critical biophysical indicator for assessing vegetation productivity, characterizing forest structure, and quantifying terrestrial carbon stocks [1]. Accurate estimation of AGB is essential for monitoring ecosystem services, guiding sustainable land-use practices, and assessing carbon sequestration potential within the broader context of climate change mitigation. Climate change exposes ecosystems to intense and rapid environmental shifts, primarily driven by rising Greenhouse Gas (GHG) concentrations. The concentration of atmospheric CO2 has significantly increased due to human activities, such as fossil fuel consumption and land-use changes [2]. Replacing fossil fuels with non-carbon energy sources is one widely accepted mitigation strategy [3]. Among nature-based climate solutions, agroforestry, the practice of integrating trees and shrubs into farming systems, has emerged as a promising approach to enhance environmental sustainability, food security, and climate resilience [4].
Agroforestry systems are multifunctional landscapes that support carbon sequestration, biodiversity conservation, and rural development. In addition to storing atmospheric carbon in woody biomass and soil, they can also mitigate emissions of potent GHGs such as methane and nitrous oxide [5,6]. However, agroforestry systems often exhibit complex canopy structures and heterogeneous vegetation, making accurate estimation of AGB particularly challenging when using traditional field-based techniques or coarse-resolution satellite imagery [7,8].
AGB is a crucial biophysical indicator used to monitor crop growth status, assess carbon storage, and optimize agricultural management practices [9]. The ability to estimate AGB accurately and rapidly is essential for evaluating the effectiveness of agroforestry as a climate mitigation strategy. A recent report by the European Environment Agency predicts that climate change could reduce the EU’s agricultural income by up to 16% by 2050, with significant regional variation [2]. In this context, agroforestry represents a viable adaptation strategy, receiving growing support from international scholars and policymakers [9].
The implementation efficiency of AGB estimation methods should be critically assessed to ensure accuracy, scalability, and cost-effectiveness in operational monitoring [6]. Among these methods, remote sensing plays a critical role by enabling spatially explicit and non-destructive biomass assessment. Platforms such as satellites, aircraft, balloons, and Unmanned Aircraft Systems (UASs) are used to acquire spatial data with different resolution levels and flexibility of implementation.
UASs have revolutionized field-scale monitoring through their flexibility, low operational costs, and capability to capture high spatial and temporal resolution imagery [10]. Their use enables precise estimation of Plant Height (PH), a key parameter linked to biomass, which is often used in conjunction with Canopy Height Modeling (CHM) and vegetation indices to estimate AGB [11].
In this study, a set of vegetation indices, i.e., Normalized Difference Vegetation Index (NDVI), Green Normalized Difference Vegetation Index (GNDVI), Leaf Chlorophyll Index (LCI), Normalized Difference Red Edge Index (NDRE), and Optimized Soil-Adjusted Vegetation Index (OSAVI), was selected due to its sensitivity to canopy chlorophyll content, structural complexity, and biomass density. NDVI and OSAVI are widely used for assessing vegetation vigor and biomass estimation, particularly under variable soil backgrounds and mixed canopy conditions [12,13]. GNDVI and NDRE enhance sensitivity to chlorophyll and nitrogen concentration, providing improved discrimination in dense and high-biomass vegetation canopies [14,15]. LCI offers complementary information on leaf pigment and chlorophyll content, allowing better characterization of photosynthetic activity and canopy heterogeneity in agroforestry systems [15].
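As a minimal illustration, the five indices can be computed per pixel from calibrated reflectance arrays in the Green, Red, Red Edge, and NIR bands using their standard formulas; the sample reflectance values below are hypothetical, not measurements from the campaign.

```python
import numpy as np

def vegetation_indices(green, red, red_edge, nir):
    """Compute the five study indices from calibrated per-pixel
    reflectance arrays (values in [0, 1])."""
    ndvi = (nir - red) / (nir + red)                  # vegetation vigor / biomass
    gndvi = (nir - green) / (nir + green)             # chlorophyll sensitivity
    ndre = (nir - red_edge) / (nir + red_edge)        # nitrogen / dense canopies
    lci = (nir - red_edge) / (nir + red)              # leaf chlorophyll content
    osavi = 1.16 * (nir - red) / (nir + red + 0.16)   # soil-adjusted, L = 0.16
    return {"NDVI": ndvi, "GNDVI": gndvi, "NDRE": ndre,
            "LCI": lci, "OSAVI": osavi}

# Hypothetical reflectance values for one well-vegetated pixel
vi = vegetation_indices(green=np.array([0.08]), red=np.array([0.05]),
                        red_edge=np.array([0.20]), nir=np.array([0.45]))
```

In practice these formulas are applied band-wise to the whole calibrated reflectance raster, producing one index map per flight.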
The performance of UAS in estimating AGB is sensitive to a range of system configuration parameters, including flight altitude, sensor angle, image overlap, and image-processing workflows. The choice of sensor type, such as RGB or multispectral cameras, and the processing software algorithms significantly affect data quality and the reliability of biomass estimates. Recent studies have emphasized the importance of standardized guidelines and best practices to optimize UAS-based AGB estimation across diverse environments [16,17]. Operational challenges associated with external factors such as diverse weather conditions, limited visibility, high wind speeds, airspace regulations, no-fly zone restrictions, and the necessity of maintaining visual line-of-sight with the UAS operator must be carefully considered to ensure successful mission execution and the integrity of the collected data.
This study investigates the optimal configuration of UAS platforms for AGB estimation through an experimental campaign conducted in an agroforestry environment in the Campania region, Italy. A commercial drone equipped with multispectral and RGB cameras was used to perform several flight tests at different altitudes. The proposed methodology enables AGB estimation using multiple approaches, including regression-based modeling and machine learning algorithms, by exploiting a combination of sensor data. This study aims to develop an effective end-to-end procedure for AGB estimation in agroforestry landscapes, using a preliminary experimental campaign to evaluate flight parameters and sensor performance within the study area.

2. Materials and Methods

2.1. Problem Assessment

Estimating AGB in agroforestry systems is essential for understanding carbon dynamics, informing climate-smart land-use strategies, and supporting sustainable management practices. However, the structural and spectral complexity of agroforestry environments, characterized by multi-strata vegetation layers, diverse canopy architecture, and high spatial heterogeneity, poses significant challenges for remote-sensing-based AGB estimation. These systems often include trees, shrubs, and crops of varying heights and species within the same plot, leading to issues such as canopy occlusion, shadow effects, and spectral mixing, all of which hinder reliable data acquisition using traditional satellite or aerial platforms.
UAS, particularly small drones equipped with high-resolution sensors, have emerged as valuable tools for addressing these challenges of biomass estimation in agroforestry. They offer flexible, low-altitude, and high-frequency data acquisition capabilities. However, the potential of UAS in agroforestry environments depends heavily on appropriate platform selection, sensor integration, and system configuration. Recent studies have emphasized the need for vegetation and site-specific drone guidance strategies to enhance the accuracy and reproducibility of AGB estimation [18,19]. In response to this need and the minimum requirements for accurate AGB estimation, we identified a set of key parameters that significantly influence estimation outcomes in agroforestry environments. The reference parameters for drone usage in precision agriculture that must be considered are summarized in Table 1.
The parameters reported in Table 1 must be carefully evaluated and integrated into mission planning and data acquisition procedures to ensure the accuracy and consistency of AGB estimation. By aligning technical settings with the specific requirements of agroforestry environments, this study aims to establish an effective methodology for vegetation index calculation and subsequent biomass modeling.
Among the parameters listed, flight altitude, image overlap, and flight duration play a critical role in determining the quality and reliability of data acquisition in agroforestry environments. Flight altitude directly influences the Ground Sampling Distance (GSD) and spatial resolution, affecting the level of detail captured for tree crowns and understory vegetation. Image overlap ensures complete coverage of the area of interest and supports accurate orthomosaic generation by reducing gaps and improving geometric consistency. Meanwhile, flight duration is essential for efficient mission planning, ensuring that the area can be surveyed within operational constraints while maintaining image quality. These parameters, alongside sensor selection, georeferencing, and environmental considerations, must be carefully optimized and integrated into mission planning to ensure accurate, reproducible AGB estimation. By systematically addressing these factors, this study establishes a methodology that supports high-quality vegetation index computation and robust biomass modeling in complex agroforestry landscapes.
Rotary-wing drones, such as quadcopters, are particularly well-suited for agroforestry due to their ability to hover and maneuver in constrained and irregular terrains, making them ideal for capturing detailed canopy structures in multilayered environments. Fixed-wing drones, while advantageous for covering larger areas, often lack the flexibility needed to navigate around tall trees or sloped terrain, limiting their applicability in such settings [20]. Selecting the appropriate drone platform is essential for acquiring spatial data that accurately captures both vertical and horizontal variability within agroforestry plots.
Sensor choice is equally critical. While RGB and multispectral sensors remain widely used for generating vegetation indices such as NDVI or NDRE, they may not adequately capture the structural complexity inherent in agroforestry systems. LiDAR sensors, by contrast, provide three-dimensional measurements of canopy height, shape, and density, offering significant advantages in estimating biomass, especially when vegetation is vertically stratified. Recent advancements have demonstrated that combining LiDAR with multispectral data enables more robust modeling by integrating both spectral and structural information, resulting in improved AGB prediction performance in mixed landscapes [21,22].
Mission planning parameters, including flight altitude, ground sampling distance, image overlap, speed, weather, and lighting, significantly influence data quality. Flying at lower altitudes improves the resolution of canopy features; however, such operations must be carefully managed to mitigate the risk of collisions within dense vegetation. Midday flights are generally preferred, as they minimize shadow effects and provide more uniform illumination, an important consideration in agroforestry environments, where complex canopy structures often result in uneven light distribution and subsequent data distortion [23]. In addition, the placement of Ground Control Points (GCPs) and the use of high-accuracy geolocation methods (e.g., Real-Time Kinematic GNSS, RTK-GNSS) are necessary to ensure precise orthomosaic generation and accurate alignment between image data and ground truth measurements [24]. Mission execution challenges, such as weather, visibility, and wind conditions, airspace regulations, no-fly-zone restrictions, and the need to maintain visual line-of-sight with the remote pilot, must also be considered.
Processing and modeling approaches must also evolve in tandem with improved data acquisition strategies. The use of Convolutional Neural Networks (CNNs), U-Net architectures, and ensemble learning frameworks has allowed researchers to extract relevant spectral-textural features, map vegetation classes, and estimate biomass at high spatial resolution with notable accuracy improvements [25].
The success of drone-based monitoring in agroforestry systems depends on selecting suitable platforms and sensors that align with the spatial complexity, vegetation diversity, and monitoring objectives typical of these environments. Agroforestry landscapes often feature multi-strata canopies, mixed species, and varied terrain, all of which demand precise, adaptable, and reliable remote sensing solutions.
Considering these factors, it is evident that AGB estimation in agroforestry systems cannot rely on generic drone workflows designed for simpler agricultural or forest environments. Instead, a targeted approach that combines appropriate drone platform selection, tailored sensor payloads, and site-specific mission configurations is required to accurately capture the spatial and structural heterogeneity of agroforestry landscapes. Recent research emphasizes the need for standardized, ecosystem-sensitive drone guidance protocols to ensure data reliability, modeling consistency, and replicability across diverse agroforestry contexts [26].

2.2. Methodology

This section presents flowcharts illustrating a dual-level procedural methodology for drone-based image acquisition tailored to AGB estimation in agroforestry landscapes. The methodology workflow, shown in Figure 1, outlines the conceptual and operational foundation of the study, including problem definition, payload selection, and mission planning, followed by data acquisition, vegetation parameter extraction, and machine learning-based biomass modeling. Figure 1 summarizes this procedural workflow, structured into three main stages: planning, data acquisition, and modeling.
The workflow begins with problem definition, including the identification of study sites, target species, and biomass estimation objectives. Drone and sensor selection is guided by these objectives to meet spatial, spectral, and operational requirements. Flight mission planning defines optimal flight paths, altitude, image overlap, coverage, and georeferencing strategies while accounting for environmental conditions.
During data acquisition, drones capture high-resolution multispectral imagery, from which the tree structural parameters, such as height and diameter, are extracted using image processing and zonal statistics. These parameters, together with field measurements, are then preprocessed for model development. Machine learning algorithms are subsequently trained and validated to predict AGB, generating spatially explicit biomass predictions. Finally, the results are analyzed to assess biomass distribution patterns and evaluate workflow performance, supporting agroforestry management and carbon assessment. This workflow demonstrates a systematic, iterative approach that ensures robust and reproducible AGB estimation.
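The model-development step above can be sketched with a simple regression baseline. The small table of predictors below (a hypothetical NDVI value and canopy height per tree) and the AGB targets are illustrative placeholders, and a plain least-squares fit stands in for the machine learning algorithms used in the study; the evaluation metrics, however, are the ones the paper reports (R², RMSE, MAE).

```python
import numpy as np

# Hypothetical training data: [NDVI, canopy height (m)] per tree,
# with field-measured AGB (kg) as the target variable.
X = np.array([[0.62, 4.1], [0.71, 5.3], [0.55, 3.2],
              [0.80, 6.8], [0.66, 4.9], [0.74, 5.9]])
y = np.array([38.0, 55.0, 27.0, 79.0, 47.0, 63.0])

# Ordinary least squares with an intercept column (illustrative model).
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coef

# Standard evaluation metrics used for replicability and comparison.
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot                 # coefficient of determination
rmse = np.sqrt(np.mean((y - y_hat) ** 2))  # root mean squared error
mae = np.mean(np.abs(y - y_hat))           # mean absolute error
```

In an operational workflow, the same metric computations are applied to held-out validation trees rather than the training set, so that R², RMSE, and MAE reflect true predictive performance.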
The methodology presented can be applied to a typical use case by adopting the applicative workflow shown in Figure 2, which illustrates how the research can be implemented within a practical drone-based mission framework.
The applicative workflow begins with identifying the area of interest and evaluating the expected AGB estimation performance, along with regulatory and environmental constraints. Based on these criteria, an appropriate drone platform and sensor payload are selected. This is followed by detailed mission planning, with attention to flight altitude, image overlap, and lighting conditions to ensure optimal image quality and spatial resolution. After mission execution, the collected imagery is processed into georeferenced orthomosaics and vegetation index layers using photogrammetric tools. These data layers are then analyzed to extract key biophysical parameters, such as Diameter at Breast Height (DBH) and Plant Height (PH), which are used to develop data-driven models for accurate biomass estimation. The final stage involves interpreting the results to support agroforestry management and carbon assessment objectives.

3. Hardware Equipment and Software Tools

3.1. Drone Platform Selection

Drones used in agroforestry can be broadly classified into multirotor, fixed-wing, and hybrid Vertical Takeoff and Landing (VTOL) systems. Each type involves specific trade-offs in flight endurance, maneuverability, spatial resolution, and payload capacity [17,18].
Multirotor drones, such as quadcopters and hexacopters, are ideal for small to medium-sized agroforestry plots. Their VTOL capability, stability in flight, and ease of operation make them well-suited for capturing high-resolution imagery in complex environments with tree-crop mixtures and uneven terrain [19].
Fixed-wing drones offer higher endurance and coverage per flight (up to 300 ha), making them suitable for large-scale agroforestry mapping projects [27,28]. However, they require open areas for launch and landing and are less flexible in rugged or fragmented landscapes [20]. VTOL hybrid drones combine the vertical agility of a multirotor with the efficiency of fixed-wing flight, though they remain expensive and require greater piloting expertise [21].
The selection of onboard sensors is crucial in determining the type and quality of biophysical and biochemical information that can be extracted from drone imagery. In agroforestry systems where vegetation is marked by structural complexity, species diversity, and spatial heterogeneity, sensor choice must be aligned with specific monitoring objectives such as tree health assessment, biomass estimation, carbon accounting, and crop stress detection [24,29].
Multispectral sensors can enhance the analytical capabilities of drone imagery by capturing reflectance data beyond the visible spectrum, specifically in the green, red, red edge, and near-infrared (NIR) bands. The DJI Mavic 3 Multispectral™ (Mavic 3M) drone, manufactured by DJI in Shenzhen, China, and used in the experimental study, integrates four 5 MP multispectral cameras alongside a 20 MP RGB camera and a dedicated sunlight sensor for radiometric calibration, as reported in Table 2.
Mavic 3M can capture subtle variations in chlorophyll content, leaf structure, and canopy stress, making it particularly valuable in mixed agroforestry systems where species and condition variability are high [22,23]. Multispectral imagery enabled the computation of advanced vegetation indices, all of which demonstrated correlations with AGB derived from field measurements [30].
These indices served as effective proxies for structural attributes, allowing AGB and carbon stock to be estimated via regression modeling. Despite its limitation to visible-spectrum data, the RGB sensor demonstrated strong potential in low-cost AGB assessment and spatial tree mapping, especially where rapid deployment and maneuverability are essential. The specifications of the Mavic 3M multispectral payload, comprising four 5 MP multispectral cameras, a 20 MP RGB camera, and a dedicated sunlight sensor for radiometric calibration, are reported in Table 3.
The comparison between the minimum drone requirements for AGB estimation in agroforestry systems and the specifications of the Mavic 3M shows that the selected platform consistently meets, and in several aspects exceeds, the operational standards for this application. Its multirotor design provides maneuverability and adaptability for small to medium plots, while the 30 min flight time allows efficient single-mission coverage. The inclusion of RTK positioning is recommended to ensure centimeter-level accuracy, which is essential for aligning spectral information with field-based measurements.
From a sensing perspective, the Mavic 3M integrates a 20 MP RGB sensor and four multispectral bands (Green, Red, Red Edge, NIR), enabling the computation of vegetation indices linked to biomass modeling. Its Ground Sampling Distance of 0.6 cm/pixel (RGB) and 1.1 cm/pixel (multispectral) far surpasses the minimum requirement of ≤10 cm/pixel, ensuring precise canopy-level characterization. The DJI Pilot 2™ autonomous flight system, equipped with terrain-follow functionality, further supports repeatable and consistent data collection across heterogeneous landscapes.
In terms of robustness, the Mavic 3M withstands winds up to 43 km/h, more than double the minimum tolerance, making it suitable for typical agroforestry field conditions. With a storage capacity of up to 512 GB, it can accommodate large image datasets, while its compatibility with DJI Terra™ version 5.0.1 (developed by DJI, Shenzhen, China), Agisoft version 2.2.0 (Agisoft LLC, St. Petersburg, Russia), and QGIS™ version 3.32 (developed by the QGIS Project, headquartered in Grüt, Switzerland) provides a streamlined processing pipeline from photogrammetric Orthomosaic generation to geospatial feature extraction.
To ensure an efficient and integrated workflow for AGB estimation, several software tools were employed for various stages of data acquisition, processing, and analysis.
Table 4 summarizes the main software components used in this study, highlighting their respective functions and compatibility. DJI Pilot 2™ was utilized for flight planning and mission execution, while DJI Terra and Agisoft facilitated photogrammetric processing and orthomosaic generation. QGIS supported feature extraction and GIS integration, and Python™ version 3.12 (developed by the Python Software Foundation, headquartered in Wilmington, DE, USA) was employed for modeling and workflow automation, ensuring a streamlined and reproducible data analysis pipeline.
Sensor integration plays a critical role in agroforestry UAS applications, where both structural and biochemical traits must be captured. The Mavic 3M combines a 20 MP RGB sensor with four multispectral bands (Green, Red, Red Edge, NIR), enabling simultaneous high-resolution imaging and vegetation index computation for biomass modeling. In contrast, some platforms require separate RGB and multispectral payloads, increasing complexity, while others lack native multispectral capability. The selected platform's integrated suite simplifies data collection and post-processing, offering a practical solution for applying the proposed methodology to agroforestry monitoring.
Overall, the Mavic 3M proves to be a cost-effective, dependable, and scalable solution for above-ground biomass estimation in agroforestry systems. Its strong balance of flight endurance, measurement precision, and seamless software integration makes it an optimal choice for both research-driven studies and operational landscape monitoring.

3.2. Payload Selection

The success of drone-based remote sensing relies heavily on the design and quality of its onboard payload. In complex agroforestry systems, where vegetation shows high structural and spectral variability, payloads need to accurately capture multi-dimensional data to support both spectral and structural analyses [31,32]. To ensure effective biomass estimation, the following minimum payload requirements were considered: the availability of multispectral and RGB sensors (optionally thermal or LiDAR), integrated system design with minimal setup, synchronized data acquisition with GPS/IMU, and spectral resolution sufficient for vegetation index (VI) computation, such as NDVI, GNDVI, and NDRE. These criteria are summarized in Table 5.
To address these requirements, this study employed the Mavic 3M, which is equipped with a compact, yet powerful multi-sensor payload designed for precision vegetation monitoring. The payload includes a 20 MP RGB camera (4/3″ CMOS) for high-resolution true-color imagery and four 5 MP narrowband multispectral sensors targeting the Green (560 ± 16 nm), Red (650 ± 16 nm), Red Edge (730 ± 16 nm), and Near-Infrared (860 ± 26 nm) regions of the spectrum. An integrated Downwelling Light Sensor (DLS) provides in-flight radiometric calibration, ensuring consistent reflectance data across illumination conditions [33,34]. In addition, Ground Control Points (GCPs) were deployed for plot-level georeferencing of individual trees, aligning remote sensing outputs with field measurements such as DBH and tree height [33].
Recent agronomic studies confirm that the Mavic 3M spectral configuration effectively captures chlorophyll-related signals, enabling accurate computation of vegetation indices, including NDVI, GNDVI, NDRE, OSAVI, and LCI, all strongly correlated with crop biomass and canopy structure [33]. For example, in precision farming applications such as wheat and palm plantations, multispectral indices like NDRE and GNDVI have demonstrated sensitivity to nitrogen and chlorophyll content [34,35]. Research in diversified cropping systems further indicates that combining UAS-derived vegetation indices with texture and structural features significantly enhances AGB estimation [36].
Practitioners report that radiometric calibration via the DLS is essential for reliable multispectral outputs, and while WebODM™ or PIX4Dfields™ can process the imagery effectively, careful band alignment is critical due to potential misregistration issues between spectral bands [33,34].

3.3. Mission Profile

A well-planned flight mission is vital for obtaining high-quality, repeatable multispectral data using the DJI Mavic 3 Multispectral (Mavic 3M) in agroforestry AGB assessments. This mission was designed to capture fine-scale spectral and structural features across a mixed tree-crop system in the Campania region, Italy. Key mission parameters such as flight overlap, time of day, path orientation, and scheduling significantly influence data quality and consistency. These factors ensure optimal image acquisition for accurate vegetation analysis and repeatable monitoring over time.
The investigated mission profile parameters are:
  • Flight overlap (front/side): Typically, 70–90% overlap is required for photogrammetric processing and vegetation index mosaicking.
  • Lighting conditions: Influence illumination uniformity and shadow formation, particularly in RGB and multispectral acquisitions.
  • Flight path orientation: Should align with crop rows or terrain slopes to improve interpretability and structural analysis.
  • Repeatability and scheduling: Enables temporal monitoring of crop growth stages, stress dynamics, or seasonal variation.
All flights occurred between 10:00 and 12:00 local time under clear skies to minimize shadow effects and ensure consistent illumination, aligning with recent recommendations for improving vegetation index accuracy in drone-based plant monitoring [37]. A grid-based flight plan was executed using DJI Pilot 2 and DJI Terra, flying at 30 m Above Ground Level (AGL). This altitude produced a GSD of 0.8 cm/pixel (RGB) and 1.4 cm/pixel (multispectral), which offers a balanced trade-off between spatial detail and coverage [38].
To ensure high-quality photogrammetric products and spectral mosaics, the drone operated with 80% front lap and 70% side lap at a flight speed of 3–5 m/s, which is consistent with recent UAS workflows in structurally heterogeneous agricultural landscapes [39]. The use of terrain-follow flight mode allowed the drone to maintain a steady altitude above ground, minimizing GSD distortion in areas with sloping terrain [40]. Attention should be given to avoiding collisions with tall fixed obstacles; although many drones are equipped with anti-collision sensors, foliage remains difficult to detect and avoid.
Radiometric calibration was achieved using the Mavic 3M’s integrated downwelling light sensor (DLS), which records solar irradiance in real time, and optional reflectance targets placed in the field. This methodology supports consistent reflectance values across scenes and over time [41].
Each flight covered approximately two hectares and lasted 35 to 40 min, well within the Mavic 3M nominal 43 min battery endurance [40]. Imagery was processed in DJI Terra to generate orthomosaics, Digital Surface Models (DSMs), and reflectance-calibrated vegetation index maps.
Post-flight quality control included visual inspection of image sharpness, shadow coverage, and band alignment. Occasional misalignments among the multispectral bands, as reported in recent Mavic 3M field evaluations, were corrected during processing or, when necessary, the missions were re-flown to ensure data consistency.
The mission should integrate optimized altitude, overlap, radiometric calibration, RTK/GCP georeferencing, and real-time terrain follow, reflecting current best practices in UAS-based biomass mapping in agroforestry systems [32,37,42].
The drone is assumed to cover a square area of surface $S$ and lateral size $L = \sqrt{S}$ by performing a serpentine path. Each straight path along the forward direction has a length $L$, while the image capture separation in the side direction, $ICS_S$, gives the distance between two straight legs (Figure 3).
Given $h_{AGL}$ the selected drone mission altitude above ground level and $\theta_F$ the forward-direction Field of View, the forward-direction swath $SW_F$ is given by Equation (1):

$$SW_F = 2\, h_{AGL} \tan\left(\frac{\theta_F}{2}\right) \quad (1)$$

If $\theta_S$ is the side-direction Field of View, the side-direction swath $SW_S$ is given by Equation (2):

$$SW_S = 2\, h_{AGL} \tan\left(\frac{\theta_S}{2}\right) \quad (2)$$

The forward-direction and side-direction Ground Sampling Distances, i.e., $GSD_F$ and $GSD_S$, are given by Equations (3) and (4):

$$GSD_F = \frac{SW_F}{N_{pix,F}} = \frac{2\, h_{AGL} \tan(\theta_F/2)}{N_{pix,F}} \quad (3)$$

$$GSD_S = \frac{SW_S}{N_{pix,S}} = \frac{2\, h_{AGL} \tan(\theta_S/2)}{N_{pix,S}} \quad (4)$$

where $N_{pix,F}$ and $N_{pix,S}$ are, respectively, the number of pixels in the vertical and horizontal directions of the adopted photodetector array.
Starting from a 100% frontal overlap condition, when the drone travels a distance $d$ along the forward direction, the overlap is reduced by an amount proportional to $2d$, because the non-overlapping surfaces increase linearly both ahead of and behind the remaining overlapping surface. The image capture distance in the forward direction, $ICD_F$, is defined as the traveled distance associated with a given level of frontal overlap $OL_F$. It is given by Equation (5):

$$ICD_F = \frac{1 - OL_F}{2}\, SW_F = (1 - OL_F)\, h_{AGL} \tan\left(\frac{\theta_F}{2}\right) \quad (5)$$

As a result, for an assigned image capture interval $ICI$, the resulting average forward speed along the route is given by Equation (6):

$$V = \frac{ICD_F}{ICI} = \frac{(1 - OL_F)\, SW_F}{2\, ICI} = \frac{(1 - OL_F)\, h_{AGL} \tan(\theta_F/2)}{ICI} \quad (6)$$

where $OL_F$ is the assigned overlap fraction between two subsequent images acquired along the forward direction.
The image capture separation in the side direction, $ICS_S$, is given by Equation (7):

$$ICS_S = \frac{1 - OL_S}{2}\, SW_S = (1 - OL_S)\, h_{AGL} \tan\left(\frac{\theta_S}{2}\right) \quad (7)$$

where $OL_S$ is the assigned overlap fraction between two subsequent images acquired along the side direction. The overall number of straight-line paths in the serpentine, $N_{LINES}$, is given by Equation (8):

$$N_{LINES} = \mathrm{floor}\left(\frac{L}{\frac{1 - OL_S}{2}\, SW_S}\right) = \mathrm{floor}\left(\frac{L}{(1 - OL_S)\, h_{AGL} \tan(\theta_S/2)}\right) \quad (8)$$

where $\mathrm{floor}(x)$ is the mathematical operator that rounds the real number $x$ to the nearest integer less than or equal to it.
Since moving in the side direction over a total distance $L$ adds one leg length to the straight paths counted by Equation (8), the total distance traveled by the drone to execute its mission, $TOT_{DIST}$, is given by Equation (9):

$$TOT_{DIST} = L\,(N_{LINES} + 1) = L\left[\mathrm{floor}\left(\frac{L}{(1 - OL_S)\, h_{AGL} \tan(\theta_S/2)}\right) + 1\right] \quad (9)$$
The mission time t m i s s i o n given by the ratio of the total distance traveled T O T D I S T and the average forward speed V as reported by Equation (10):
t m i s s i o n = T O T D I S T V = L   f l o o r L 1 O L S     h A G L   tan θ S 2 + 1 1 O L F     h A G L   tan θ F 2 I C I
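Equations (1)–(10) can be collected into a small mission-budget calculator. The sketch below follows the formulas directly; the field-of-view angles and pixel counts used in any call are illustrative placeholders, not the Mavic 3M's published specifications:

```python
import math

def mission_budget(h_agl, fov_f_deg, fov_s_deg, npix_f, npix_s,
                   ol_f, ol_s, area_m2, ici_s=1.0):
    """Mission budget from Equations (1)-(10). Distances in metres, times in seconds."""
    L = math.sqrt(area_m2)                                        # lateral size of the square area
    sw_f = 2 * h_agl * math.tan(math.radians(fov_f_deg) / 2)      # Eq. (1) forward swath
    sw_s = 2 * h_agl * math.tan(math.radians(fov_s_deg) / 2)      # Eq. (2) side swath
    gsd_f = sw_f / npix_f                                         # Eq. (3) forward GSD
    gsd_s = sw_s / npix_s                                         # Eq. (4) side GSD
    icd_f = (1 - ol_f) / 2 * sw_f                                 # Eq. (5) forward capture distance
    v = icd_f / ici_s                                             # Eq. (6) average forward speed
    ics_s = (1 - ol_s) / 2 * sw_s                                 # Eq. (7) side capture separation
    n_lines = math.floor(L / ics_s)                               # Eq. (8) number of legs
    tot_dist = L * (n_lines + 1)                                  # Eq. (9) total distance
    t_mission = tot_dist / v                                      # Eq. (10) mission time
    return {"swath_f_m": sw_f, "gsd_f_m": gsd_f, "gsd_s_m": gsd_s,
            "speed_mps": v, "n_lines": n_lines, "time_min": t_mission / 60}
```

Because only the forward speed depends on the frontal overlap, raising it from 80% to 90% at a fixed altitude exactly doubles the mission time, which matches the trend reported for the 25 m flights.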
Operationally, the Mavic 3M supports flights at 25 m AGL, achieving a GSD of ~0.6 cm/pixel for RGB and ~1.1 cm/pixel for the multispectral bands. Forward and side overlaps of ~80% enable the generation of high-quality orthomosaics and DSMs suitable for structural feature extraction. Comparative trials in winter wheat have reported that Mavic 3M imagery provides slightly better image quality than other UAS platforms, though validation remains necessary to ensure consistent biomass inversion across different spectral and structural datasets.

3.4. Features of Interest: DBH, PH, and VIs

Accurate estimation of AGB in agroforestry systems depends on identifying and extracting critical vegetation parameters. Among these, Diameter at Breast Height (DBH), Plant Height (PH), and Vegetation Indices (VIs) are widely recognized as the most influential. These parameters capture the structural, textural, and spectral characteristics of vegetation and serve as the foundation for most remote sensing-based models used to estimate AGB via UAS. DBH is a key biometric indicator closely associated with tree volume and biomass. Although DBH cannot be directly measured from UAS imagery, it can be inferred indirectly through its correlations with plant height, crown width, and spectral attributes. Advanced LiDAR-derived canopy height models have demonstrated high accuracy in predicting DBH, achieving R2 values up to 0.90 and relative RMSE below 11% at the plot level [43]. Similarly, in Mediterranean mixed forests, integrating hand-held and UAS-based LiDAR has resulted in DBH estimates with minimal bias (~2–3 cm) [44].
PH is another key structural variable that can be directly derived from UAS data using photogrammetric surface models. Studies have consistently shown PH to be a strong predictor of AGB. For example, UAS-derived height and crown dimensions were used to estimate AGB in Quercus ilex saplings with R2 values ranging from 0.78 to 0.89 [45]. In row-crop systems such as maize, UAS-based models combining PH and canopy coverage also produced high predictive performance [46].
VIs are spectral indicators used to infer physiological and biochemical properties of vegetation, such as chlorophyll content and canopy density. Traditional indices such as NDVI, SAVI, and EVI are widely applied but can saturate under high-biomass conditions. Red-edge and narrow-band indices, such as NDRE and the Modified Chlorophyll Absorption Reflectance Index (MCARI), offer improved sensitivity in such environments [47]. The VIs considered in this study are listed in Table 6.
Machine learning and statistical models that integrate structural (PH), spectral (VIs), and derived DBH estimates offer the most robust frameworks for AGB prediction. Multi-modal approaches have consistently outperformed single-source methods in heterogeneous landscapes [48]. Reviews of UAS-based forest studies confirm PH as the most frequently and successfully estimated metric [48,49], and studies such as [50,51] provide concrete evidence that RGB sensors, when paired with proper analytical frameworks, can generate accurate AGB maps even in structurally diverse agroforestry systems.
Recent studies further reinforce the centrality of DBH, PH, and VIs in AGB estimation by demonstrating their predictive power across varied forest ecosystems and remote sensing modalities. For instance, Ref. [52] showed that LiDAR-derived canopy height metrics such as height sum and height mean were strongly correlated with field-measured AGB (r = 0.83 and r = 0.72, respectively), confirming PH as a dependable structural proxy. Similarly, Ref. [53] emphasized the utility of integrating multiple vegetation indices and LiDAR metrics in Random Forest models, achieving RMSE values as low as 27.19 Mg ha−1. In a broader context, Ref. [54] demonstrated that stacking models combining vegetation height, remote sensing-based derived VIs, and topographic variables outperformed individual learners, with R2 values reaching 0.74. These findings underscore the robustness of combining structural and spectral features, particularly PH and VIs, as scalable predictors of AGB across heterogeneous landscapes.
Table 6. Estimated tree parameters: VI.
Vegetation Indices/Bands | Formula | Reference
Normalized Difference Vegetation Index | NDVI = (NIR − R)/(NIR + R) | [55]
Green Normalized Difference Vegetation Index | GNDVI = (NIR − G)/(NIR + G) | [55]
Normalized Difference Red Edge | NDRE = (NIR − RE)/(NIR + RE) | [55]
Optimized Soil-Adjusted Vegetation Index | OSAVI = (NIR − R)/(NIR + R + 0.16) | [56]
Leaf Chlorophyll Index | LCI = (NIR − RE)/(NIR + R) | [55]
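The Table 6 indices reduce to per-pixel band arithmetic on the calibrated reflectance maps. A minimal NumPy sketch, where the small epsilon guard against division by zero is an implementation choice not stated in the paper:

```python
import numpy as np

def vegetation_indices(nir, r, g, re, eps=1e-9):
    """Compute the Table 6 indices per pixel from reflectance arrays in [0, 1]."""
    nir, r, g, re = (np.asarray(b, dtype=float) for b in (nir, r, g, re))
    return {
        "NDVI":  (nir - r) / (nir + r + eps),        # Normalized Difference Vegetation Index
        "GNDVI": (nir - g) / (nir + g + eps),        # Green NDVI
        "NDRE":  (nir - re) / (nir + re + eps),      # Normalized Difference Red Edge
        "OSAVI": (nir - r) / (nir + r + 0.16),       # soil-adjusted; 0.16 is the standard offset
        "LCI":   (nir - re) / (nir + r + eps),       # Leaf Chlorophyll Index
    }
```

Each returned array has the same shape as the input bands, so the functions apply unchanged to whole orthomosaic layers or to single test pixels.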
In conclusion, the synergistic use of DBH (proxy), PH (structure), and VIs (spectral features) forms the foundation for UAS-guided AGB estimation. These features are not only technically extractable using cost-effective UAS platforms but also computationally scalable through machine learning integration.
Multispectral imagery enabled the computation of advanced vegetation indices (NDVI, GNDVI, LCI, NDRE, and OSAVI), all of which were validated against field measurements [30]. The Mavic 3M's ability to capture subtle variations in chlorophyll content, leaf structure, and canopy stress makes it particularly valuable in mixed agroforestry systems where species and condition variability are high [22,23]. These indices served as effective proxies for structural attributes, allowing AGB and carbon stock to be estimated via regression modeling. Despite its limitation to visible-spectrum data, the RGB sensor demonstrated strong potential for low-cost AGB assessment and spatial tree mapping, especially where rapid deployment and maneuverability are essential.

4. Mission Planning and Operational Execution

Accurate estimation of AGB in agroforestry environments requires not only high-resolution data acquisition but also a well-structured and reproducible workflow. In this study, the experimental design was developed to assess the performance of the Mavic 3M drone (Figure 4A), integrating photogrammetric processing, GIS-based spatial analysis, and machine learning-driven predictive modeling. This multi-step approach provides a robust and scalable methodology for tree-level AGB estimation.
Field campaigns were planned to assess the influence of spatial resolution, image overlap, and vegetation index quality on model performance, as well as the responsiveness of predictive algorithms to field-measured parameters such as DBH and PH. Flights were conducted at altitudes of 20 m, 25 m, and 30 m to evaluate the effect of GSD on canopy structure detection and vegetation index extraction. Missions were pre-programmed using the DJI Pilot 2™ application (DJI, Shenzhen, China), as shown in Figure 4D, allowing precise definition of flight paths, 80–90% front and side overlap, flight speed (3–4 m/s), and terrain-follow mode to maintain consistent altitude over heterogeneous canopies. As shown in Figure 4, flights were performed under clear-sky conditions between 10:00 and 12:00 local time to minimize shadows and ensure radiometric stability.
The Mavic 3M acquired high-resolution RGB and multispectral imagery (green, red, red-edge, and NIR bands), with radiometric calibration enabled via the integrated sunlight sensor (Figure 5). Following acquisition, images were processed in DJI Terra™, where radiometric calibration, band alignment, and stitching generated a reflectance-corrected orthomosaic for each spectral band. These orthomosaics were combined into multispectral layers, enabling the computation of five vegetation indices (NDVI, GNDVI, NDRE, LCI, and OSAVI) that served as the primary inputs for predictive modeling. Figure 6 shows examples of the spatial distribution of four of these indices (GNDVI, NDRE, LCI, and OSAVI), highlighting variations in canopy vigor and chlorophyll content across the agroforestry landscape. The photogrammetric workflow ensured high-quality, spatially consistent mosaics, critical for subsequent per-tree feature extraction.
Feature extraction was performed in QGIS™ using the Zonal Statistics tool. Individual tree crowns were manually delineated on the orthomosaic, and pixel-level vegetation index values were aggregated to produce canopy-level metrics. These metrics were then linked to field-measured DBH and PH, creating a comprehensive dataset that integrates structural and spectral information for each tree.
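In the study this aggregation was performed with the QGIS Zonal Statistics tool; the same per-crown averaging can be sketched in NumPy, assuming a hypothetical label raster in which each manually delineated crown polygon has been rasterised to an integer id:

```python
import numpy as np

def crown_zonal_stats(index_map, crown_labels):
    """Mean vegetation-index value for each delineated tree crown.

    index_map:    2-D array of one VI (e.g. NDVI), aligned to the orthomosaic grid.
    crown_labels: 2-D integer array on the same grid; 0 = background,
                  k > 0 = pixels belonging to crown k (stand-in for crown polygons).
    """
    stats = {}
    for k in np.unique(crown_labels):
        if k == 0:           # skip background pixels
            continue
        stats[int(k)] = float(index_map[crown_labels == k].mean())
    return stats
```

The resulting per-crown means can then be joined to the field-measured DBH and PH records by tree id to build the modeling dataset.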
The results highlight several operational insights: maintaining a flight altitude of ~25 m provided an optimal GSD (~1.1 cm/pixel) for individual tree canopy mapping, while high image overlaps and midday flight scheduling minimized radiometric variability and shadow artifacts. Overall, the combination of high-resolution multispectral sensors, in-flight radiometric calibration, DJI Terra™ photogrammetric processing, QGIS™ feature extraction, and Python-based machine learning produced a robust, reproducible, and operationally applicable workflow for AGB estimation in complex agroforestry landscapes. The methodology proved cost-effective, scalable, and dependable for ecological monitoring and sustainable land management.
In summary, the combined hardware–software framework, spanning UAS-based data acquisition through regression analysis, demonstrated high robustness, ease of integration, and scientific reliability for remote biomass estimation.
The experiment demonstrated both scientific reliability and operational practicality, making the adopted platform a benchmark for biomass monitoring and vegetation analysis until next-generation hyperspectral or AI-enhanced UAS platforms become widely available.
According to our previous study [57], different regression approaches were evaluated for predicting AGB. The selected regression models were chosen because they consistently provided the best predictive performance on our dataset. These approaches are well suited to AGB estimation in heterogeneous agroforestry systems, as they effectively capture the relationships between structural features (DBH and PH) and spectral features (VIs). This algorithm-based comparison underscores the varying contributions of vegetation indices to AGB estimation, with VIs demonstrating acceptable predictive capacity for biomass in the study area.

5. Data Analysis and Insights

From an operational standpoint, the experiment proved highly efficient for both field deployment and post-processing. The drone covered approximately two hectares per sortie while maintaining high spatial consistency. The RTK module, combined with strategically placed Ground Control Points (GCPs), ensured precise georeferencing, which was vital for linking field measurements with extracted VI values during zonal statistics in QGIS. The integration with DJI Terra and the seamless export to Python-based modeling pipelines further underscored its readiness for scalable applications. The detailed procedure is as follows: multispectral drone imagery from the Mavic 3M was used to generate vegetation indices (NDVI, GNDVI, NDRE, LCI, and OSAVI). The quantitative values of these indices were extracted using zonal statistics in QGIS based on the canopy extent of individual trees. These vegetation index values served as independent variables, while the AGB computed from DBH and PH using allometric equations was used as the dependent variable.
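The paper does not reproduce its allometric equation. As one common illustrative choice, the pantropical model of Chave et al. (2014), AGB = 0.0673 (ρ D² H)^0.976, can be sketched as follows; the coefficients and the default wood density here are assumptions, and the study's own site-specific allometry should be substituted:

```python
def agb_allometric(dbh_cm, height_m, rho=0.6):
    """Per-tree above-ground biomass (kg) from a generic pantropical allometry.

    dbh_cm:   diameter at breast height D, in cm.
    height_m: plant height H, in m.
    rho:      wood density in g/cm^3 (0.6 is a placeholder, not a study value).
    """
    return 0.0673 * (rho * dbh_cm ** 2 * height_m) ** 0.976
```

Because AGB scales almost linearly with ρD²H, errors in DBH propagate roughly twice as strongly as errors in height, which is one reason accurate DBH proxies matter for the regression targets.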
The analytical workflow was completed within a Python environment, where four machine learning regression approaches, Multiple Linear Regression (MLR), Decision Tree Regression (DTR), Random Forest Regression (RF), and Support Vector Regression (SVR), were implemented using the scikit-learn library. The pipeline encompassed data loading, preprocessing, a 70/30 train–test split, model fitting, and evaluation using R2, RMSE, and MAE as performance metrics. The workflow executed smoothly, with high compatibility and processing speed, ensuring reliable regression outputs. Figure 7 shows an example of how the performance of the RF regression model was evaluated: a scatter plot comparing predicted against field-measured AGB values (R2 = 0.782, MAE = 0.006, RMSE = 0.006). The trend line indicates a slight underestimation bias, suggesting that the model tends to predict slightly lower biomass values than those measured in the field. Complementing this, a bar chart ranks the vegetation indices by their contribution to the model, highlighting GNDVI and NDRE as the most influential predictors. These results confirm the importance of red-edge and green-band reflectance in accurately estimating biomass in agroforestry systems.
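The scikit-learn pipeline described above can be sketched as follows; the hyperparameters shown (tree count, SVR kernel) are illustrative defaults, since the paper does not report model settings:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_error

def evaluate_models(X, y, seed=42):
    """70/30 train-test split, fit the four regressors, report R2 / RMSE / MAE."""
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=seed)
    models = {
        "MLR": LinearRegression(),
        "DTR": DecisionTreeRegressor(random_state=seed),
        "RF":  RandomForestRegressor(n_estimators=200, random_state=seed),
        "SVR": SVR(kernel="rbf"),
    }
    scores = {}
    for name, model in models.items():
        pred = model.fit(X_tr, y_tr).predict(X_te)
        scores[name] = {
            "R2":   r2_score(y_te, pred),
            "RMSE": float(np.sqrt(mean_squared_error(y_te, pred))),
            "MAE":  mean_absolute_error(y_te, pred),
        }
    return scores
```

Here `X` would hold the per-tree VI means (independent variables) and `y` the allometric AGB values (dependent variable), mirroring the dataset construction described above.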
The regression analysis demonstrated that vegetation indices derived from multispectral imagery possess strong predictive capacity for estimating AGB in agroforestry systems. Among the tested models, Decision Tree Regression achieved the highest performance (R2 = 0.95), benefiting from the small dataset that allowed a single tree to capture underlying relationships effectively. Multiple Linear Regression followed with R2 = 0.83, while Support Vector Regression showed limited accuracy (R2 = 0.62), constrained by data size and kernel optimization. Random Forest Regression yielded an R2 of 0.78, with a Mean Absolute Error (MAE) of 0.006 and Root Mean Squared Error (RMSE) of 0.006, indicating consistent prediction reliability across the test set.
The small dataset favored Decision Tree Regression, allowing a single tree to model the underlying relationships effectively. In contrast, Random Forest's ensemble averaging may have reduced sensitivity to fine-scale patterns, while SVR performance was limited by the data size and kernel optimization. The MAE of 0.006 quantifies the average absolute deviation between predicted and actual biomass values, while the RMSE of 0.006 reflects the standard deviation of prediction errors, underscoring the model's consistency and reliability across the test set, as shown in Table 7.
The regression results were consistent with benchmarks and reflected the good reliability of the chosen software stack. The high accuracy achieved, particularly by the Decision Tree model, highlights not only the predictive strength of the algorithms but also the quality and consistency of the input data derived from the UAS platform and processing pipeline.
The detailed experiment in [57] refers to our previous work, which serves as a benchmark for the current study. In that earlier study, we evaluated the performance of different vegetation indices for AGB estimation. Building on those findings, the present work establishes a complete end-to-end framework that follows the procedures and methodological steps outlined there, from platform selection and data acquisition to modeling and evaluation.
Based on the experimental analysis, the proposed methodology can be extended to other regions for UAS-based AGB estimation. The workflow integrates UAS-acquired multispectral imagery, vegetation index extraction, and regression modeling, but its successful implementation depends on careful parameter selection and mission planning. Critical flight parameters such as altitude, front overlap, GSD for the multispectral and RGB sensors, swath width, flight speed, and mission time per hectare must be optimized to balance data quality and operational efficiency. Table 8 presents the estimated flight speeds and mission durations required to survey a 1-hectare area with the selected drone, based on a 1 s image capture interval, a 75% side overlap, a fixed field size of 100 m × 100 m, and synchronization between image capture and flight speed; it also summarizes the parameter configurations assessed in the experiment, highlighting how variations in altitude and overlap influence image resolution, swath coverage, and survey duration. These guidelines ensure that the method remains adaptable while maintaining accuracy and reproducibility across diverse agroforestry environments. The results clearly demonstrate how changes in flight altitude and front overlap impact survey efficiency.
The results highlight the strong relationship between flight altitude, front overlap, and mission time for area surveying. Increasing the front overlap from 80% to 90% extends mission duration, particularly at lower altitudes. For instance, at 25 m altitude, the mission time nearly doubles from 17.2 min (80% overlap) to 34.4 min (90% overlap). As shown in Figure 8, higher altitudes (e.g., 35 m) reduce survey duration due to the wider swath coverage (as low as 9.1 min at 80% overlap), but this comes at the cost of lower ground sampling resolution. Conversely, lower altitudes (e.g., 25 m) provide finer spatial resolution (1.1 cm/pixel multispectral, 0.6 cm/pixel RGB) but require significantly longer survey times, especially at higher overlap settings.
Figure 8 visualizes these relationships, illustrating how greater overlap consistently extends mission duration and how higher altitudes cluster at shorter times due to broader swath coverage. This highlights the importance of balancing spatial resolution, data redundancy, and operational efficiency when planning UAS missions for AGB estimation.
Although the proposed workflow can be extended to other regions, differences in canopy structure, vegetation type, and illumination conditions may require local calibration to maintain model reliability. These findings are consistent with previous UAS-based remote sensing studies that reported similar trade-offs between flight altitude, image overlap, and mapping accuracy, e.g., [23,37,39]. Higher altitudes reduce mission duration but increase GSD, while greater overlaps improve 3D reconstruction and radiometric consistency at the expense of longer flight times. Such relationships have also been emphasized in operational UAS photogrammetry guidelines, underscoring the need to balance spatial resolution, accuracy, and efficiency in mission planning.
In general, the development of an end-to-end process for AGB estimation in agroforestry landscapes is central to this study’s contribution. By integrating mission planning, sensor calibration, terrain-aware flight execution, and data fusion between field measurements and remote sensing outputs, the proposed workflow ensures both methodological consistency and operational adaptability. This structured approach enables agronomists and land managers to make informed decisions about flight altitude, image overlap, and payload selection based on site-specific constraints and desired spatial resolution. Moreover, the process facilitates reproducibility and scalability across diverse agroforestry contexts, supporting more accurate biomass quantification and enhancing the role of UAS platforms in climate-smart land management. The results demonstrate that such a process not only improves data quality and prediction capability but also streamlines mission execution, making AGB estimation more accessible and efficient for practitioners.

6. Conclusions

This study proposes an end-to-end, operationally viable methodology for UAS-assisted AGB estimation in agroforestry systems, in which sensor selection, mission planning, and analytical modeling are consolidated into a unified, field-deployable workflow. The result confirms that UAS-derived multispectral imagery provides sufficient detail to capture biophysical variability at high spatial resolution, thereby supporting accurate and scalable AGB modeling in heterogeneous agroforestry landscapes.
Vegetation indices derived from multispectral imagery, such as NDVI, NDRE, GNDVI, OSAVI, and LCI, exhibited strong correlations with key biophysical indicators, including DBH and PH. When used as predictors in biomass estimation models, these indices enabled high-accuracy results, with some models achieving a coefficient of determination (R2) of up to 0.95. Although DTR was effective in this study, the findings suggest that combining multiple machine learning algorithms (e.g., Random Forest, Support Vector Regression, Gradient Boosting) can further enhance model robustness and adaptability across different agroforestry conditions.
From an operational perspective, the study employed a set of geometric and sensor-specific equations to perform the mission budget of the most important flight parameters, such as swath width, GSD, flight speed, and mission duration. These equations were used to generate the estimates reported in the paper, providing a quantitative basis for optimizing drone mission planning under different altitudes and image overlap settings. This structured approach ensures repeatability and field efficiency in biomass data acquisition.
Drone imagery collected at 25 m above ground level achieved a GSD of approximately 1.1 cm/pixel, suitable for detailed canopy analysis at the individual tree level. Additional features such as radiometric calibration, RTK positioning, and compatibility with tools like DJI Terra, QGIS, and Python-based pipelines supported seamless integration from image capture to biomass modeling and zonal statistics.
Beyond its technical contributions, this study underscores the critical role of drone-based AGB estimation in climate change mitigation. Agroforestry systems function as important carbon sinks and biodiversity reservoirs, and scalable, cost-effective biomass quantification methods are essential for tracking carbon stocks, supporting ecosystem service payment schemes, informing REDD+ initiatives, and contributing to national greenhouse gas inventories.
In conclusion, drone-based remote sensing offers a practical, accurate, and scalable approach for above-ground biomass estimation in agroforestry landscapes. The combination of high-resolution multispectral imaging, structured mission planning, and flexible machine learning frameworks provides a solid foundation for both scientific research and operational monitoring. Until next-generation technologies such as hyperspectral or AI-enhanced drone systems become universally available, this integrated approach represents a robust, climate-relevant tool for sustainable land management and environmental assessment. Despite the promising results, this study was conducted at a single agroforestry site under specific environmental conditions and within a single time zone, using a small dataset. These constraints limit the generalizability and repeatability of the findings across diverse agroecological contexts. To support broader application, adaptive corrections and methodological refinements should be considered. Future research should prioritize multi-site validation across varying agroforestry typologies, climatic zones, and seasonal conditions. Incorporating structural sensors, multi-temporal datasets, and deep learning approaches may further enhance the accuracy and scalability of AGB estimation workflows. Expanding the dataset and validating the methodology under heterogeneous conditions will be essential for developing robust, transferable models suitable for operational deployment.

Author Contributions

Conceptualization, D.A.; Data curation, A.A.M.; Formal analysis, D.A.; Investigation, A.A.M.; Methodology, A.A.M.; Software, A.A.M.; Supervision, D.A.; Validation, C.C.; Visualization, C.C.; Writing—original draft, A.A.M.; Writing—review and editing, C.C. and D.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research is related to a PhD program at the University of Naples Federico II, Department of Industrial Engineering. Andsera Adugna Mekonen's PhD scholarship is supported by ENI S.p.A., Italy, within the "Young talents from Africa" initiative.

Data Availability Statement

The data supporting the findings of this study are available from the corresponding author upon reasonable request. The data are not publicly available because they are actively involved in another experiment that is still in progress.

Acknowledgments

The authors would like to thank the Laboratory for Innovative Flight Technologies (LIFT) team at the University of Naples Federico II for their assistance during field data acquisition. Special thanks to ENI S.p.A. for supporting the doctoral research of Andsera Adugna Mekonen.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
AGB: Above-Ground Biomass
AGL: Above Ground Level
CHM: Canopy Height Model
CNN: Convolutional Neural Network
DBH: Diameter at Breast Height
DSM: Digital Surface Model
DLS: Downwelling Light Sensor
DTR: Decision Tree Regression
ExR: Excess Red Index
GCP: Ground Control Point
GHG: Greenhouse Gas
GNDVI: Green Normalized Difference Vegetation Index
GSD: Ground Sampling Distance
LCI: Leaf Chlorophyll Index
LiDAR: Light Detection and Ranging
LIFT: Laboratory for Innovative Flight Technologies
MAE: Mean Absolute Error
MLR: Multiple Linear Regression
NDRE: Normalized Difference Red Edge Index
NIR: Near-Infrared
OSAVI: Optimized Soil Adjusted Vegetation Index
PH: Plant Height
PPK: Post-Processed Kinematic
RF: Random Forest
RGBRI: Red-Green-Blue Ratio Index
RGB: Red-Green-Blue
RMSE: Root Mean Square Error
RTK: Real-Time Kinematic
SfM: Structure from Motion
SVR: Support Vector Regression
UAS: Unmanned Aircraft System(s)
VI: Vegetation Indices
VEG: Vegetation Index
VTOL: Vertical Take-Off and Landing

References

  1. Chave, J.; Andalo, C.; Brown, S.; Cairns, M.A.; Chambers, J.Q.; Eamus, D.; Fölster, H.; Fromard, F.; Higuchi, N.; Kira, T.; et al. ECOSYSTEM ECOLOGY Tree allometry and improved estimation of carbon stocks and balance in tropical forests. Oecologia 2005, 145, 87–99. [Google Scholar] [CrossRef] [PubMed]
  2. Costa, J.M.; Egipto, R.; Aguiar, F.C.; Marques, P.; Nogales, A.; Madeira, M. The role of soil temperature in mediterranean vineyards in a climate change context. Front. Media S.A. 2023, 14, 1145137. [Google Scholar] [CrossRef] [PubMed]
  3. Song, R.; Zhu, Z.; Zhang, L.; Li, H.; Wang, H. A Simple Method Using an Allometric Model to Quantify the Carbon Sequestration Capacity in Vineyards. Plants 2023, 12, 997. [Google Scholar] [CrossRef] [PubMed]
  4. Raj, A.; Kumar, M.; Ram, J.; Meena, S. Agroforestry for Monetising Carbon Credits; Springer: Berlin/Heidelberg, Germany, 2025. [Google Scholar]
  5. Tamga, D.K.; Latifi, H.; Ullmann, T.; Baumhauer, R.; Thiel, M.; Bayala, J. Modelling the spatial distribution of the classification error of remote sensing data in cocoa agroforestry systems. Agrofor. Syst. 2022, 97, 109–119. [Google Scholar] [CrossRef]
  6. Williams, J.N.; Morandé, J.A.; Vaghti, M.G.; Medellín-Azuara, J.; Viers, J.H. Ecosystem services in vineyard landscapes: A focus on aboveground carbon storage and accumulation. Carbon Balance Manag. 2020, 15, 23. [Google Scholar] [CrossRef]
  7. Bégué, A.; Arvor, D.; Bellon, B.; Betbeder, J.; De Abelleyra, D.; Ferraz, R.P.D.; Lebourgeois, V.; Lelong, C.; Simões, M.; Verón, S.R. Remote sensing and cropping practices: A review. Remote. Sens. 2018, 10, 99. [Google Scholar] [CrossRef]
  8. Sharma, P.; Bhardwaj, D.R.; Singh, M.K.; Nigam, R.; Pala, N.A.; Kumar, A.; Verma, K.; Kumar, D.; Thakur, P. Geospatial technology in agroforestry: Status, prospects, and constraints. Environ. Sci. Pollut. Res. 2022, 30, 116459–116487. [Google Scholar] [CrossRef]
  9. Hu, X.; Li, Z.; Chen, J.; Nie, X.; Liu, J.; Wang, L.; Ning, K. Carbon sequestration benefits of the grain for Green Program in the hilly red soil region of southern China. Int. Soil Water Conserv. Res. 2021, 9, 271–278. [Google Scholar] [CrossRef]
Figure 1. Methodology workflow.
Figure 2. Applicative workflow.
Figure 3. Drone path during image capture.
Figure 4. Drone hardware setup. (A) Mavic 3M drone; (B) multispectral camera view; (C) photo during image acquisition; (D) flight route planning.
Figure 5. Example multispectral images captured by the Mavic 3M drone over the agroforestry site. (A) Flight Path; (B) RGB image; (C) Green band; (D) NIR band; (E) Red band; (F) Red Edge band.
Figure 6. Example spectral indices. (A) GNDVI; (B) NDRE; (C) LCI; (D) OSAVI.
Figure 7. An example of Random Forest model performance and feature importance for AGB prediction, showing predicted vs. actual values and ranking of vegetation indices by contribution.
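A model of the kind summarized in Figure 7 can be sketched with scikit-learn's `RandomForestRegressor`, whose `feature_importances_` attribute yields the ranking of vegetation indices by contribution. The data below are synthetic stand-ins (not the study's measurements), with an assumed NDVI-dominated response chosen purely for illustration:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
features = ["NDVI", "GNDVI", "NDRE", "LCI", "OSAVI"]

# Synthetic stand-in data: rows are sample plots, columns are index values
X = rng.uniform(0.1, 0.9, size=(200, len(features)))
# Hypothetical AGB response dominated by NDVI, with mild noise
y = 0.05 * X[:, 0] + 0.02 * X[:, 2] + rng.normal(0, 0.002, 200)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, y)

# Rank indices by contribution, as in a feature-importance plot
for name, imp in sorted(zip(features, model.feature_importances_),
                        key=lambda p: -p[1]):
    print(f"{name}: {imp:.3f}")
```

The importances are normalized to sum to 1; for comparability with the evaluation metrics reported later, predictive performance should be assessed on a held-out split rather than on the training data.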
Figure 8. Effect of overlap percentage on height–time dynamics.
Table 1. Key parameters and specifications for multispectral UAS data acquisition and processing.

| Parameter | Unit/Details | Purpose/Notes |
|---|---|---|
| Drone platform | Multispectral payload | Multispectral data acquisition |
| Sensor spectral bands | NIR, Red Edge, Red, Green; 10-bit radiometric resolution | Capture vegetation reflectance for indices |
| Ground Sampling Distance (GSD) | cm/pixel | Defines the spatial detail of imagery |
| Flight altitude | meters (m) | Affects GSD and coverage |
| Flight speed | m/s | Ensures image sharpness and overlap |
| Image overlap | % (front lap/side lap) | Ensures complete coverage and mosaicking |
| Coverage area | ha or m² | Area surveyed per mission |
| Georeferencing | RTK/PPK, GCPs | Ensures spatial accuracy of imagery |
| Environmental light | Sun angle (°), cloud cover (%) | Minimizes shadows, maintains consistent illumination, and ensures radiometric accuracy of spectral data |
| Time/Seasonality | Date, crop growth stage | Captures data under similar phenological stages and lighting conditions |
| Vegetation indices | NDVI, NDRE, GNDVI, LCI, and OSAVI | Quantify vegetation status and biomass |
| Post-processing | Orthomosaic, 3D model, DEM/DSM | Generates final data products for analysis |
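The vegetation indices listed in Table 1 are simple band-ratio operations on calibrated reflectance. A minimal NumPy sketch using their standard formulations (LCI here follows one common formulation, (NIR − RedEdge)/(NIR + Red); OSAVI uses the usual 0.16 soil-adjustment constant):

```python
import numpy as np

def ndvi(nir, red):
    # Normalized Difference Vegetation Index
    return (nir - red) / (nir + red)

def gndvi(nir, green):
    # Green NDVI: substitutes the green band for red
    return (nir - green) / (nir + green)

def ndre(nir, red_edge):
    # Normalized Difference Red Edge index
    return (nir - red_edge) / (nir + red_edge)

def osavi(nir, red, soil_factor=0.16):
    # Optimized Soil-Adjusted Vegetation Index
    return (nir - red) / (nir + red + soil_factor)

def lci(nir, red_edge, red):
    # Leaf Chlorophyll Index (one common formulation)
    return (nir - red_edge) / (nir + red)

# Example on small reflectance arrays (values in [0, 1]); in practice these
# would be full reflectance-map rasters loaded as 2D arrays.
nir = np.array([0.55, 0.60])
red = np.array([0.08, 0.10])
print(ndvi(nir, red))  # high values indicate dense green vegetation
```

Applied pixel-wise to the calibrated reflectance maps, these functions produce index rasters such as those shown in Figure 6.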
Table 2. Optical specifications of the RGB and multispectral sensors installed aboard the Mavic 3M drone.

| Parameter | RGB Sensor | Multispectral Sensor |
|---|---|---|
| FOV H × V (deg) | 73.2° × 53.0° | 61.2° × 48.1° |
| Array size H × V (pixels) | 5280 × 3956 | 2592 × 1944 |
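Footprint and ground sampling distance follow from the specifications in Table 2 under a simple pinhole model for a nadir-pointing camera. As a plausibility check (an inference, not a stated spec), the swath and GSD values reported in Table 8 are consistent with applying this model to the vertical axis of each sensor:

```python
import math

def footprint_m(altitude_m, fov_deg):
    # Ground footprint of one image axis for a nadir-pointing camera
    return 2 * altitude_m * math.tan(math.radians(fov_deg / 2))

def gsd_cm(altitude_m, fov_deg, pixels):
    # Ground sampling distance along that axis, in cm/pixel
    return footprint_m(altitude_m, fov_deg) / pixels * 100

# Multispectral sensor, 48.1° axis with 1944 pixels, at 25 m altitude
print(round(footprint_m(25, 48.1), 1))   # 22.3 m, matching Table 8's swath
print(round(gsd_cm(25, 48.1, 1944), 2))  # 1.15 cm/px, ~ Table 8's 1.1 cm/px
# RGB sensor, 53.0° axis with 3956 pixels, at 25 m altitude
print(round(gsd_cm(25, 53.0, 3956), 2))  # 0.63 cm/px, ~ Table 8's 0.6 cm/px
```

The linear dependence on altitude explains why the GSD column in Table 8 scales proportionally from 25 m to 35 m.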
Table 3. Minimum drone requirements for AGB estimation in agroforestry compared with Mavic 3M specifications.

| Parameter | Raw Requirement | Mavic 3M Specification | Remarks |
|---|---|---|---|
| Platform type | Multirotor | Multirotor | Optimal for small to medium agroforestry plots |
| Flight time | 30–43 min | 30 min | Single-flight mapping capability |
| GPS accuracy | RTK/PPK | RTK included | High-precision mapping |
| Sensor compatibility | Multispectral, RGB, optional LiDAR | 4 multispectral bands + RGB sensor | Suitable for vegetation indices and structure modeling |
| Ground Sampling Distance | ≤10 cm/pixel | RGB: 0.6 cm/px; multispectral: 1.1 cm/px | High-resolution image acquisition |
| Autonomous flight capability | Pre-set path with terrain adaptation | Via DJI Pilot 2 with built-in flight planning | Enables consistent, repeatable flights |
| Weather tolerance | ≥20 km/h wind resistance | 43 km/h wind resistance | Robust for typical agroforestry conditions |
| Data storage | ≥64 GB | Up to 512 GB microSD | Sufficient for large image datasets |
Table 4. Software integration and workflow for photogrammetric and modeling analysis.

| Software | Primary Function | Integration/Workflow Role |
|---|---|---|
| DJI Pilot 2™ | Flight planning and mission execution | Data acquisition from UAV platforms |
| DJI Terra/Agisoft Metashape | Orthomosaic generation and 3D reconstruction | Photogrammetric processing |
| QGIS | Feature extraction and visualization | GIS integration and spatial analysis |
| Python | Data modeling and automation | Workflow automation and custom analysis pipelines |
Table 5. Payload: minimum requirements.

| Parameter | Raw Requirement | Mavic 3M Specification | Remarks |
|---|---|---|---|
| Sensor type | Multispectral + RGB (optionally thermal or LiDAR) | 4 multispectral bands + RGB | Meets standard for vegetation analysis |
| Sensor integration | Integrated with minimal setup | Fully integrated sensors | Simplifies operation |
| Payload weight capacity | ≥500 g (if external sensor required) | Built-in sensors | No external payload required |
| Data synchronization | Time-synchronized with GPS/IMU | GNSS + RTK synchronized | Ensures geospatial accuracy |
| Spectral resolution | Bands suitable for VIs: NDVI, GNDVI, NDRE, LCI, and OSAVI | RGB, Green, Red, Red Edge, NIR | Suitable for AGB vegetation index calculation |
Table 7. Model evaluation metrics.

| Metric | Value | Description |
|---|---|---|
| R² score | 0.782 | Proportion of AGB variance explained by the model |
| MAE | 0.006 | Mean Absolute Error: average magnitude of prediction errors (in AGB units) |
| RMSE | 0.006 | Root Mean Square Error: standard deviation of prediction errors |
| Trend line equation | y = 0.534x + 0.048 | Regression fit between actual and predicted AGB values |
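The metrics in Table 7 have closed-form definitions, so replication studies can compute them directly. A self-contained sketch (the AGB values below are hypothetical, not the study's data):

```python
import math

def mae(actual, predicted):
    # Mean Absolute Error: average magnitude of prediction errors
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    # Root Mean Square Error: penalizes large errors more heavily than MAE
    return math.sqrt(sum((a - p) ** 2
                         for a, p in zip(actual, predicted)) / len(actual))

def r2(actual, predicted):
    # Coefficient of determination: 1 - SS_res / SS_tot
    mean_a = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    return 1 - ss_res / ss_tot

# Hypothetical field vs. predicted AGB values, for illustration only
actual = [0.010, 0.020, 0.030, 0.040]
predicted = [0.012, 0.019, 0.033, 0.037]
print(round(r2(actual, predicted), 3),
      round(mae(actual, predicted), 5),
      round(rmse(actual, predicted), 5))
```

Because RMSE squares the residuals before averaging, RMSE ≥ MAE always holds, with equality only when all errors have the same magnitude; reporting both, as in Table 7, indicates how evenly the errors are distributed.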
Table 8. Summary of the key parameters investigated in this study, serving as reference values for the proposed methodology.

| Altitude [m] | Front Overlap [%] | GSD Multispectral [cm/px] | GSD RGB [cm/px] | Swath Width Multispectral [m] | Average Speed [m/s] | Mission Time [min/ha] |
|---|---|---|---|---|---|---|
| 25 | 80 | 1.1 | 0.6 | 22.3 | 2.2 | 17.2 |
| 25 | 85 | 1.1 | 0.6 | 22.3 | 1.7 | 23.0 |
| 25 | 90 | 1.1 | 0.6 | 22.3 | 1.1 | 34.4 |
| 30 | 80 | 1.4 | 0.8 | 26.8 | 2.7 | 11.8 |
| 30 | 85 | 1.4 | 0.8 | 26.8 | 2.0 | 15.7 |
| 30 | 90 | 1.4 | 0.8 | 26.8 | 1.3 | 23.7 |
| 35 | 80 | 1.6 | 0.9 | 31.2 | 3.1 | 9.1 |
| 35 | 85 | 1.6 | 0.9 | 31.2 | 2.3 | 12.1 |
| 35 | 90 | 1.6 | 0.9 | 31.2 | 1.6 | 18.1 |
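The speed and mission-time columns of Table 8 are consistent with a simple coverage model: speed limited by the front overlap at a roughly fixed capture interval, and per-hectare time set by speed times line spacing. The ~2 s interval and the 80% side overlap used below are inferences from the table, not stated specifications, and small discrepancies (turnarounds, acceleration) remain:

```python
def flight_speed(swath_m, front_overlap, capture_interval_s=2.0):
    # Max speed at which consecutive images keep the requested front overlap
    return (1 - front_overlap) * swath_m / capture_interval_s

def mission_time_min_per_ha(speed_ms, swath_m, side_overlap=0.80):
    # Time to sweep one hectare with parallel lines at the given side overlap
    line_spacing = (1 - side_overlap) * swath_m
    coverage_m2_per_s = speed_ms * line_spacing
    return 10_000 / coverage_m2_per_s / 60

v = flight_speed(22.3, 0.80)            # 25 m altitude, 80% front overlap
print(round(v, 1))                       # close to the 2.2 m/s in Table 8
print(round(mission_time_min_per_ha(v, 22.3), 1))  # close to 17.2 min/ha
```

This reproduces the table's trend: raising the front overlap from 80% to 90% halves the admissible speed and roughly doubles the mission time per hectare, while higher altitudes widen the swath and reduce flight time at the cost of coarser GSD.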
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
