Article

UAV-Spherical Data Fusion Approach to Estimate Individual Tree Carbon Stock for Urban Green Planning and Management

by Mattia Balestra 1,*, MD Abdul Mueed Choudhury 1, Roberto Pierdicca 2, Stefano Chiappini 2 and Ernesto Marcheggiani 1

1 Department of Agricultural, Food and Environmental Sciences, Università Politecnica delle Marche, 60131 Ancona, Italy
2 Department of Construction, Civil Engineering and Architecture, Marche Polytechnic University, 60131 Ancona, Italy
* Author to whom correspondence should be addressed.
Remote Sens. 2024, 16(12), 2110; https://doi.org/10.3390/rs16122110
Submission received: 20 March 2024 / Revised: 3 June 2024 / Accepted: 7 June 2024 / Published: 11 June 2024

Abstract: Due to ever-accelerating urbanization in recent decades, exploring the contribution of trees to mitigating atmospheric carbon in urban areas has become a paramount concern. Remote sensing-based approaches have been primarily implemented to estimate the tree-stand atmospheric carbon stock (CS) of trees in parks and along streets. However, a convenient yet high-accuracy computation methodology is hardly available. This study introduces an approach that was tested on a small urban area. A data fusion approach based on a three-dimensional (3D) computation methodology was applied to estimate the individual tree CS. This photogrammetry-based technique employed an unmanned aerial vehicle (UAV) and spherical image data to compute the total height (H) and diameter at breast height (DBH) of each tree, and consequently to estimate the tree-stand CS. A regression analysis was conducted to compare the results with those obtained from high-cost laser scanner data. Our study demonstrates the applicability of this method, highlighting its advantages, even for large city areas, over other approaches that are often more expensive. This approach could serve as an efficient tool for assisting urban planners in ensuring the proper utilization of the available green space, especially in a complex urban environment.

1. Introduction

In recent decades, green infrastructure has become an essential component of sustainable urban landscape planning in response to the threats posed by relentless urbanization. Urban green space (UGS) significantly enhances the quality of cities by mitigating urban heat island effects, reducing flood risk, lowering atmospheric carbon dioxide (CO2) levels [1], and improving mental health and well-being. Although approximately 56.2% of the world’s population resides in urban areas [2], the supply of UGS is diminishing to a level that poses a significant threat to urban ecosystems [3]. Urban areas, with their people, cars, and industries, contribute 70% of total CO2 emissions [4,5]. As a result, trees in urban areas are increasingly recognized as a key tool for absorbing CO2 and mitigating the impacts of such emissions [6,7]. These trees sequester atmospheric carbon, helping to delay the adverse effects of climate change and contributing to the accumulation of carbon in the soil [8]. According to previous studies, the yearly reduction in carbon emissions can reach 18 kg tree−1 in a city area [9,10]. Given these findings, various approaches based on advanced remote-sensing technologies have been employed to assess the contribution of urban trees to atmospheric carbon stock (CS). There is a growing demand among city arborists and policymakers for a convenient approach to understanding the role of urban trees, one that can lead to an efficient management system for UGS. Accurate information on tree attributes (i.e., diameter at breast height (DBH), total height (H), and above-ground biomass (AGB)) and species in urban areas is essential for evaluating the existing tree-stand CS and establishing a proper monitoring and management system [11,12]. Urban forest CS, estimated from the AGB [13,14], also has significant value for evaluating urban ecosystem services.
Traditionally, the assessment of these attributes has relied on random field sampling and visual interpretation of aerial photos, which are generally expensive, labor-intensive, and time-consuming, and often cannot cover a larger area of interest [15,16]. Recently, remote-sensing tools have gained widespread recognition, many of them based on LiDAR-derived methods [17,18,19]. Despite achieving high accuracy, LiDAR-based approaches face challenges in data processing, portability, and hardware costs [19,20]. Therefore, photogrammetric approaches based on unmanned aerial vehicle (UAV) images have been successfully adopted as an alternative to the frequent use of LiDAR data [21,22]. These photogrammetric methods, based on the Structure from Motion (SfM) algorithm, reconstruct 3D models from which various 2D and 3D tree-stand attributes can be estimated accurately [23,24,25]. Yet it remains a major challenge to introduce a convenient, low-cost approach that quantifies tree-stand CS with high accuracy, especially in urban agglomerations, which are complex entities that are difficult to govern. Several studies have suggested different approaches [25,26], which may be practical but often focus on a limited set of urban tree attributes.
This study tests a photogrammetric approach for estimating urban tree CS, particularly applicable to larger urban areas. The overarching goal is to develop an efficient CS computation method tailored to the complex urban environment, with a focus on dominant tree populations. While estimating individual tree CS presents various technical complexities [27,28,29] and traditional methodological challenges, this study utilized UAV images and adopted a user-friendly photogrammetric approach. Using the SfM algorithm together with image-matching algorithms is well established, particularly for measuring linear metrics such as the DBH, H, crown spread, and stem radius [30,31]. In this case, spherical images were employed for the 3D reconstruction, enabling the DBH estimation. Spherical image data were collected in a typical urban environment using a GoPro MAX multidirectional camera capable of recording a 360° view video. This approach saved time compared to the traditional method of capturing multiple images from various angles and directions for each tree. Other studies have also utilized spherical images for tree semantic segmentation in urban green areas [32]. When conducting geometric characterizations, an evaluation of the obtained accuracies is crucial to validate the subsequent analysis. To assess and validate the methodology, high-cost LiDAR data were used as reference data for each tree. The total CS of each tree was estimated using a well-known allometric model that utilizes the computed H and DBH values [33]. The results of the total CS computation were discussed and assessed through both a regression analysis and the root mean square deviation (RMSD) to illustrate the approach’s applicability [34].
We can summarize the main contributions of this paper as follows: (i) it proposes an alternative, agile, and low-cost methodology for creating 3D point clouds of urban green areas, particularly of individual trees, by merging UAV and spherical images; (ii) it uses the collected data to compare single-tree CS estimates in urban environments, including a qualitative comparison with other well-established methods, such as LiDAR-based approaches. Comparisons were also drawn to understand its applicability in developing countries, where it could replace expensive data sources such as LiDAR data [35].
While our methodology is more complex than LiDAR acquisition, it can assist researchers, government policymakers, and urban planners in monitoring and assessing the value of the available UGS [25,36,37]. In this way, the study’s outcomes contribute to a better understanding of how to estimate tree dendrometry and predict urban CS in typical city areas, i.e., by using low-cost sensors and photogrammetric techniques instead of LiDAR-derived methods.

2. Materials and Methods

2.1. Study Area

The research site is situated in central Italy (the Marche region) within the city of Ancona on a university campus. It encompasses an approximate area of 4000 m2, which includes 20 urban trees, as illustrated in Figure 1. During the field survey, the dominant tree species were identified, i.e., Pinus pinea L., commonly referred to as the “Stone pine”, and Platanus hispanica Mill. ex Münchh., more commonly known as the “London plane”.

2.2. Data Acquisition

During the field data collection, we focused on urban trees (i.e., woody or tall perennial plants), excluding ornamental herbs, shrubs, and grassland areas. The main goal was to create a 3D model for measuring the tree metrics (DBH and H) needed for the CS computation. An allometric model well established for Italian tree species was utilized for the CS computation [33]. We used a DJI Air 2S, a small UAV, to capture images at a resolution of 20 MP (5472 × 3078 pixels, 8.38 mm focal length, 100 ISO). The flight lasted approximately 14 min at a height of 40 m, with a ground sampling distance of 1 cm per pixel. The flight was planned using the Litchi app, with a cruising speed of 2.5 m per second and a photo capture interval of 2.5 s, resulting in a total of 301 images. During the flight mission on 13 May 2022, the weather conditions were favorable: sunny and free from sudden wind gusts. A total of 5 ground control points (GCPs) had been placed beforehand, and their XYZ coordinates in WGS 84 UTM 33N were collected using a HiPer VR Topcon GNSS antenna with the RTK (real-time kinematic) method, yielding a real-time accuracy of around 1 cm. To acquire the 360° panoramic data, a GoPro MAX camera was used; this system, which represents the main novelty of this work, replaced the standard reflex cameras commonly used in the SfM pipeline [38]. The acquisition scheme for both the aerial and spherical images is depicted in Figure 2.
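As a rough cross-check of the flight parameters above, the ground sampling distance can be derived from the camera geometry. The sketch below assumes a 13.2 mm sensor width (typical of a 1-inch sensor such as the DJI Air 2S carries); that value is an assumption not stated in the text, while the focal length, altitude, and image width are taken from the survey description.

```python
# Hedged sketch: GSD from sensor geometry; sensor width is an assumed value.
def ground_sampling_distance(sensor_width_m, altitude_m, focal_length_m, image_width_px):
    """GSD (m/pixel) = sensor width * flight height / (focal length * image width)."""
    return (sensor_width_m * altitude_m) / (focal_length_m * image_width_px)

gsd = ground_sampling_distance(
    sensor_width_m=13.2e-3,   # ASSUMED 1-inch sensor width, not from the paper
    altitude_m=40.0,          # flight height from the survey
    focal_length_m=8.38e-3,   # focal length from the survey
    image_width_px=5472,      # image width from the survey
)
print(f"GSD ≈ {gsd * 100:.2f} cm/pixel")  # on the order of the ~1 cm reported
```

The result lands close to the 1 cm per pixel stated for the 40 m flight, which is consistent with the chosen altitude.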
The spherical image data were generated by capturing an 8 min video (4096 × 2048 pixel resolution, 30 fps, 3.73 GB, .360 file format) in handheld mode, carrying the GoPro MAX on a pole and walking through the urban tree rows. We imported the .360 video file into the GoPro Player, a free video player, and saved it in the .mp4 file format. The latter was then processed in photogrammetric software (Agisoft Metashape Professional, version 1.8.4), where the spherical images were exported as frames. Given the experimental nature of the proposed methodology, it was necessary to validate it against a benchmark dataset. Therefore, we opted to use a mobile mapping system (MMS) to collect point cloud data for the case study area, which served as our reference data. These 3D data were used for comparison with the SfM point cloud obtained from the fusion of the UAV and GoPro MAX data. Similar approaches are quite common in the literature, and interested readers can find further insights in [39]. The LiDAR data were collected using a mobile laser scanner (MLS), the Kaarta Stencil-2, a commercial mobile scanner equipped with a Velodyne VLP-16 LiDAR. As an MLS, this device can be mounted on various mobile platforms, but in our case, it was carried on a hand-held pole. The tree species information and image data were collected on 13 May 2022. During the field survey, we focused only on the dominant tree species in the selected location. The SfM approach, including all necessary data processing and the CS computation, was executed on a laptop with a 2.30 GHz Intel Core i7-11800H (11th generation) processor supported by an NVIDIA GeForce RTX 3050 graphics card.

2.3. Data Processing

The SfM photogrammetric approach was conducted using Agisoft Metashape Professional (AMP) software, version 1.8.4 (https://www.agisoft.com/ accessed on 2 June 2024). The workflow consisted of different stages of image processing in Metashape, including image quality control, mask application, image alignment, and the generation of a dense point cloud. Then, different steps in CloudCompare were performed to obtain the final segmented trees, which were used as the input for the tree metrics computation needed to estimate the CS. Figure 3 illustrates all these steps.
The UAV images and the spherical data (360° video) were imported into the AMP software interface to begin the preprocessing phase. Initially, we extracted the 360° images from the spherical data, selecting only the high-quality ones for the subsequent steps. We used the “Estimate image quality” algorithm in the “Photos” menu to assess the image quality of the spherical images. After discarding the poor-quality images, we applied individual masks to the remaining ones. In contrast, the UAV image data were aligned without a masking procedure. The masks were used to exclude the image edges from the AMP computation of the 360° frames, as the calibration of the spherical cameras was too noisy to capture the outermost portions of the image. To maintain the original resolution, we set the accuracy to “high” in Metashape for image matching. We used both the “Generic” preselection and the “Sequential” preselection modes to identify the overlapping images. As a default setting, we employed 40,000 key points and 4000 tie points to ensure proper image matching. Once the alignments were completed, we optimized the resulting sparse point clouds through a gradual outlier selection process. We removed the worst 5% of the points obtained in the sparse point clouds using the tools available in AMP. The criteria for selecting these points were “reprojection error, reconstruction uncertainty, and projection accuracy”. We utilized the collected GCPs for georeferencing and scaling the models using the guided marker positioning system. Both the spherical and UAV images shared the same set of GCPs, streamlining the survey process and eliminating the need to acquire GCPs for each survey (Table 1 and Table 2).
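The edge-masking idea described above, keeping only the central portion of each extracted 360° frame where the spherical calibration is reliable, can be sketched as follows. This is a minimal numpy illustration, not the mask format AMP actually uses, and the `keep_fraction` value is a hypothetical parameter, not one reported in the paper.

```python
import numpy as np

def central_mask(height, width, keep_fraction=0.8):
    """Binary mask that keeps only the central region of a frame.

    Pixels near the borders (where equirectangular frames are most
    distorted) are masked out. keep_fraction is a hypothetical value.
    """
    mask = np.zeros((height, width), dtype=bool)
    dh = int(height * (1 - keep_fraction) / 2)  # rows trimmed top and bottom
    dw = int(width * (1 - keep_fraction) / 2)   # columns trimmed left and right
    mask[dh:height - dh, dw:width - dw] = True
    return mask

# Frame size matches the 4096 x 2048 video resolution from the survey.
m = central_mask(2048, 4096, keep_fraction=0.8)
print(f"fraction of pixels kept: {m.mean():.2f}")  # ~0.64 for keep_fraction=0.8
```

Such a mask would then be applied per frame before matching, so that tie points are only extracted from the low-distortion central band.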
The final step involved creating the dense point clouds; for both the UAV and the spherical photogrammetry, the dense cloud generation parameters were set to the “high” quality and “aggressive” filtering modes. The “aggressive” filtering mode was selected because preserving the smallest details was unnecessary [40], except for reconstructing the tree trunks as accurately as possible. Still, with the final aim of achieving an accurate point cloud, several chunks were created, splitting the study area into 7 sub-zones so that a 3D model could be reconstructed for each sub-zone. This workspace setup enabled batch processing with reduced computational effort, simplifying the construction of the dense cloud even with fewer images. We then cleaned the clouds using the “Select points by color” tool within the AMP software interface; this filtering removed all the blue points representing the sky, resulting in a cleaner dense point cloud with less noise. The dense clouds obtained from the spherical and UAV photogrammetry were exported in the LAS format and then imported into the open-source CloudCompare (CC) software (version 2.13.1). Using this software, the dense cloud resulting from the UAV photogrammetry and the one from the spherical photogrammetry were merged to create a complete 3D model of the study area. Thus, the required 3D model illustrating the whole study area was reconstructed (Figure 4).
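The two cleaning and fusion operations above can be sketched with plain numpy arrays. The blue-dominance rule below is a simple stand-in for the behavior of the “Select points by color” tool, not its exact logic, and the `blue_margin` threshold and toy coordinates are illustrative assumptions.

```python
import numpy as np

def remove_sky_points(cloud, blue_margin=30):
    """Drop blue-dominant points; cloud is an (N, 6) array of x, y, z, r, g, b."""
    r, g, b = cloud[:, 3], cloud[:, 4], cloud[:, 5]
    keep = ~((b > r + blue_margin) & (b > g + blue_margin))
    return cloud[keep]

def merge_clouds(uav_cloud, spherical_cloud):
    """Concatenate the two clouds, as done when merging them in CloudCompare."""
    return np.vstack([uav_cloud, spherical_cloud])

uav = np.array([[0.0, 0.0, 1.0, 120, 110, 100],   # trunk/ground-colored point
                [1.0, 2.0, 30.0, 90, 120, 220]])  # sky-colored point
sph = np.array([[0.5, 0.1, 1.3, 80, 70, 60]])
merged = merge_clouds(remove_sky_points(uav), sph)
print(merged.shape)  # (2, 6): the sky point was removed before merging
```

In the actual workflow the same operations run on millions of points, but the logic, color-based rejection followed by concatenation, is the same.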
Later, the MLS point clouds were imported into CC and georeferenced for further computations. In CC, the ground points of the point clouds were identified by applying the cloth simulation filter (CSF), setting the cloth resolution, the max iterations, and the classification threshold as advanced parameters in the plugin [41]. These points were then converted into a 2D raster using the rasterize tool in CloudCompare (Figure 5). By computing the distance between the off-ground points and this ground raster, we normalized the point clouds.
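The normalization step, subtracting a rasterized ground surface from the off-ground points, can be sketched as follows. This is a minimal illustration of the idea, not the CSF or CloudCompare implementation; the cell size and toy coordinates are assumptions.

```python
import numpy as np

def normalize_heights(points, ground, cell=1.0):
    """Replace each point's z with its height above the local ground cell.

    ground and points are (N, 3) arrays; the ground is "rasterized" by
    keeping the minimum elevation per grid cell (cell size is illustrative).
    """
    gi = np.floor(ground[:, :2] / cell).astype(int)
    ground_z = {}
    for key, z in zip(map(tuple, gi), ground[:, 2]):
        ground_z[key] = min(ground_z.get(key, np.inf), z)
    out = points.copy()
    for k, p in enumerate(points):
        key = tuple(np.floor(p[:2] / cell).astype(int))
        out[k, 2] = p[2] - ground_z.get(key, 0.0)  # height above ground
    return out

ground = np.array([[0.2, 0.3, 100.0], [1.4, 0.6, 100.5]])  # CSF-classified ground
pts = np.array([[0.5, 0.5, 112.0], [1.5, 0.5, 110.5]])     # off-ground points
print(normalize_heights(pts, ground)[:, 2])  # [12. 10.]
```

After this step, the z coordinate of every canopy point directly expresses height above ground, which is what the tree metrics computation needs.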
Subsequently, we used both the 3D model derived from the LiDAR data and the 3D model from photogrammetry (Figure 6) to extract the DBH and H, with the LiDAR-derived values serving as the reference for comparison. We extracted the DBH and H with the “tree_metrics” function of “VoxR”, a voxel-based “R” package [42]. VoxR is based on voxelization algorithms; a point cloud can be imported as a data frame from which the required tree metrics are extracted.
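To make the two extracted metrics concrete, the sketch below shows a deliberately simplified stand-in, not the VoxR voxel-based algorithm: H is taken as the maximum normalized height, and the DBH as twice the mean radial distance of points in a thin slice around breast height (1.3 m). The synthetic trunk and slice width are illustrative assumptions.

```python
import numpy as np

def tree_height(points):
    """Tree height as the maximum normalized z (points are (N, 3))."""
    return points[:, 2].max()

def dbh_from_slice(points, breast_height=1.3, slice_half_width=0.05):
    """DBH from points in a thin horizontal slice around 1.3 m."""
    z = points[:, 2]
    ring = points[np.abs(z - breast_height) < slice_half_width, :2]
    center = ring.mean(axis=0)                            # trunk center estimate
    radius = np.linalg.norm(ring - center, axis=1).mean() # mean radial distance
    return 2.0 * radius

# Synthetic trunk: a ring of radius 0.25 m at 1.3 m plus a treetop at 12 m.
theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
ring = np.column_stack([0.25 * np.cos(theta), 0.25 * np.sin(theta),
                        np.full(100, 1.3)])
trunk = np.vstack([ring, [[0.0, 0.0, 12.0]]])
print(tree_height(trunk), dbh_from_slice(trunk))  # 12.0 and ~0.5
```

Real point clouds require the occlusion handling and voxel aggregation that VoxR provides; this sketch only conveys what the two numbers mean geometrically.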
After extracting the DBH and the H, we proceeded with the estimation of the AGB to derive the tree-stand CS. In this phase, we utilized a well-known allometric model to estimate the AGB of the individual trees. In that work [33], the authors developed prediction equations for estimating the various components of above-ground tree volume and phytomass for 25 significant tree species in Italy. The equations were constructed from data collected on approximately 1300 sampling units (sample trees) representing these species. With these prediction equations, it was possible to compute the stem volume (v), stem phytomass (dw1), small branch phytomass (dw2), stump phytomass (dw3), and the total above-ground phytomass (dw4), which is the sum of the three previously mentioned phytomass components [33]. This model uses the extracted DBH (m) and H (m), together with the species, to estimate the total AGB in kilograms (kg). The total AGB tree−1 estimation was necessary because the tree above-ground CS is commonly assumed to be 50% of the total AGB [43,44,45]. The total AGB tree−1 (kg) was therefore multiplied by a conversion factor of 0.5 to obtain the total tree-stand CS (kg) [46,47].
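The final step can be sketched as below. The allometric coefficients are hypothetical placeholders, NOT the published species-specific values from the model in [33]; only the structure, AGB predicted from DBH and H, then CS taken as 50% of AGB, follows the text.

```python
# Hedged sketch of the CS computation. Coefficients a and b are placeholders,
# not the species-specific values published in the allometric model [33].
def above_ground_biomass_kg(dbh_m, h_m, a=250.0, b=0.95):
    """Generic allometric form AGB = a * (DBH^2 * H)^b, with placeholder a, b."""
    return a * (dbh_m ** 2 * h_m) ** b

def carbon_stock_kg(dbh_m, h_m):
    """Above-ground carbon stock assumed to be 50% of AGB [43-47]."""
    return 0.5 * above_ground_biomass_kg(dbh_m, h_m)

# Example with the DBH and H magnitudes typical of the surveyed trees.
cs = carbon_stock_kg(dbh_m=0.5, h_m=12.0)
print(f"CS ≈ {cs:.1f} kg")
```

With the real model, `a` and `b` (and the functional form) would be looked up per species, here Pinus pinea and Platanus hispanica, before applying the 0.5 carbon fraction.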

3. Results

The results of the total CS computation were assessed through both a regression analysis and the RMSD, comparing the LiDAR-derived values with the DBH and H values computed from the UAV-spherical fusion data (Table 3 and Table 4). The results revealed strong relationships, although the coefficient of determination for the DBH (R2 = 0.91) was lower than that for the H (R2 = 0.98) (Figure 7, Figure 8, Figure 9 and Figure 10).
In Table 4, the results show that the tree-stand CS of each tree was consistent with the values reported in the literature for both species [27,48,49]. The computed CS tree−1 was compared with the LiDAR-derived values to validate the metric accuracy of the estimations. For the total tree-stand CS, we calculated an RMSD of 58.05 kg tree−1 and an RMSD% of 7.60%. The coefficient of determination for the CS computation (Figure 11 and Figure 12) was also greater than 90% (R2 = 0.94), which underscores the applicability and significance of the tested methodology.
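The two validation metrics used above can be computed as follows. The sample values are illustrative placeholders, not the measurements from Table 3 or Table 4; only the definitions of R2 and RMSD (with RMSD% taken relative to the reference mean) follow the text.

```python
import math

def r_squared(reference, estimated):
    """Coefficient of determination of the estimates against the reference."""
    n = len(reference)
    mean_ref = sum(reference) / n
    ss_tot = sum((r - mean_ref) ** 2 for r in reference)
    ss_res = sum((r - e) ** 2 for r, e in zip(reference, estimated))
    return 1.0 - ss_res / ss_tot

def rmsd(reference, estimated):
    """Root mean square deviation between reference and estimates."""
    n = len(reference)
    return math.sqrt(sum((r - e) ** 2 for r, e in zip(reference, estimated)) / n)

lidar_cs = [700.0, 820.0, 610.0, 905.0]  # illustrative reference CS values (kg)
photo_cs = [690.0, 850.0, 600.0, 880.0]  # illustrative photogrammetric estimates (kg)
mean_ref = sum(lidar_cs) / len(lidar_cs)
print(f"R2 = {r_squared(lidar_cs, photo_cs):.3f}")
print(f"RMSD% = {100 * rmsd(lidar_cs, photo_cs) / mean_ref:.2f}%")
```

Run per tree over the full set of 20 trees, these are the quantities reported as R2 = 0.94, RMSD = 58.05 kg tree−1, and RMSD% = 7.60% in the text.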

4. Discussion

This study used UAV and spherical image data fusion to obtain dense point clouds and calculate single-tree CS, reconstructing a complete 3D model of the entire study area. Even though the study was conducted on a relatively small area with a limited number of trees, it highlights the method’s potential, particularly regarding cost and the accuracy of the computations (i.e., AGB, CS, etc.). For instance, photogrammetry could offer the advantage of regularly monitoring urban trees, providing access to tree parameters even with low-cost sensors. For the computation of the DBH, the spherical image data and derived point clouds were matched with the UAV data, and the outcomes were strongly related (R2 = 0.91) to the LiDAR-based computations. The results (Table 3 and Table 4) also show no significant discrepancies in the computed tree parameters (i.e., the DBH, H, and CS) between the LiDAR and the photogrammetric point clouds. The CS estimation showed small differences in most cases, except for a few trees with larger differences (e.g., tree ID two). These differences were mostly due to difficulties during the DBH and H estimations. In particular, calculating the DBH posed challenges when the tree trunk was surrounded by overgrown herbs and shrubs. However, as demonstrated by the R2 values and the errors, the overall computation showed a robust correlation between the methods. The remaining discrepancies might be due to tree shape and verticality, together with the occlusions that occur in a conventional urban setting.
This approach could prove more practical and economical than the more expensive LiDAR-based approaches [50,51,52,53]. An MLS point cloud is certainly easier to acquire and process than the photogrammetric data analyzed in this paper, which must pass through an SfM pipeline, but the high accuracy achieved using the fused spherical and UAV data indicates significant potential for future applications, especially where budgets do not allow the purchase of professional laser scanners. One of the most visible advantages of an MLS is its long detection range, usually tens of meters; in our case, it was set at 100 m. This extended range allowed for the comprehensive reconstruction of the detected urban trees. Such a range is not feasible with spherical cameras, which rely on a passive sensor and operate effectively at 5 to 7 m; their coverage is determined by the field of view, and the quality of the reconstruction depends on recording each tree from multiple positions. Furthermore, the level of detail of the tree reconstruction is related to the acquisition distance.
The advantages of using devices like the GoPro MAX or other spherical cameras are their low price, light weight, and compact design, as well as the fact that much of the population is already familiar with them, making them more accessible. Furthermore, in the future, these types of instruments will likely be enhanced with additional sensors or functions, making them easier to operate. In our case, the primary advantage is the faster data acquisition enabled by 360° mobile photogrammetry. Of course, a stop-and-go approach with only one tree photographed at a time can reach sub-centimeter accuracy, as demonstrated in other studies [26,54]. On the other hand, mobile photogrammetry requires good overlap among images; since the video is captured by a constantly moving operator, the data must not be blurred, and the weather conditions must be ideal. Moreover, spherical images offer the advantage of capturing both sides of an urban area at the same time, but they tend to have more deformation along the edges. In this work, to obtain a high-quality dense cloud, we proposed masking these edges and keeping only the central area of the extracted frames.
Several studies have demonstrated that the application of LiDAR data is significantly more expensive than existing UAV-based photogrammetric approaches [55,56]. For instance, a report from the United States Geological Survey published in 2017 indicated that the total cost of LiDAR data acquisition could vary from approximately USD 220 to USD 700 per square mile (1 sq mile = 2.6 sq km) [57]. Another study found that the photogrammetric technique could extract sufficient information about tree canopy cover with low-cost sensors, which can be 20 times cheaper than LiDAR sensors [55]. A small UAV equipped with a 360-degree spherical camera, such as the Insta360 Sphere, is readily available for collecting urban tree data and costs USD 1499. Such a UAV can be used for numerous field surveys covering larger areas. In this context, lightweight RGB UAVs are often recommended, as UAV-LiDAR instruments tend to be relatively heavy, affecting factors such as flight time, operational flexibility, and applicability in densely populated areas [58,59].
The LiDAR-derived computation was used here as reference data, as it is widely employed to compute biomass and carbon storage in urban vegetation [50,51,52,53] with improved accuracy. During the validation process, it became evident that, despite some limitations (i.e., overgrown herbs and shrubs), the DBH computation for the tree trunks was highly correlated (R2 = 0.91). Moreover, the coefficient of determination of the regression between the LiDAR CS and the photogrammetric point cloud-derived computations was approximately 0.94, offering substantial evidence of the methodology’s applicability. In terms of cost-effectiveness and potential, this approach provides a more feasible option, especially in developing countries, where expensive data sources such as professional laser scanner acquisitions are rarely available. However, it should be noted that this study was also performed to explore the perspectives and opportunities of the applied methodology. It would be interesting to assess the CS computation efficiency for a larger city area, applying this methodology to a greater variety of surveyed tree species. Currently, other low-cost LiDAR devices, such as those embedded in iPhones or iPads, can assist in reconstructing tree DBH and H, presenting another viable option for exploration.

5. Conclusions

A photogrammetric approach was successfully tested for its applicability, both in terms of cost efficiency and user convenience. A convenient computation method is essential for understanding the potential atmospheric CS, especially that of urban street trees. It also helps identify the contributions of the dominant tree species in the urban environment and their ecological benefits. This CS computation approach can be extended to city areas where estimating and monitoring urban trees remains a considerable challenge, without the need for manual tree marking and inspection. That is why this study was designed, i.e., to understand and explore the feasibility of the applied approach. Further studies are needed, especially in the context of larger urban areas, to provide more recommendations regarding the capabilities of this photogrammetric approach. Nevertheless, in developing countries, or wherever the application of LiDAR data is not cost-effective, this approach could serve as a feasible alternative, recognizing the scope of photogrammetric techniques. It could also assist city planners and policymakers in implementing an efficient urban green management system, ensuring robust urban tree monitoring, and fostering an improved urban environment for residents. It is important to highlight the main drawback of our proposed system: despite being cost-effective and easy to use in the acquisition phase, the data processing stage still relies on expert operators. This limitation could hinder broader adoption in current practice, as the data processing step requires significant computational resources and time. Nevertheless, future technological developments will yield more user-friendly pipelines to overcome these challenges.
As directions for future research, we plan to expand the scope to include a wider variety of tree species and more diverse urban environments in order to generalize our findings and improve the robustness of our methods. Testing our approach on different tree species would provide insights into species-specific variations and enhance the accuracy of our metrics.

Author Contributions

Conceptualization, M.A.M.C. and E.M.; methodology, M.B., M.A.M.C. and E.M.; software, M.B.; validation, M.B. and M.A.M.C.; formal analysis, M.B. and M.A.M.C.; investigation, M.B. and M.A.M.C.; resources, R.P. and E.M.; data curation, M.B. and M.A.M.C.; writing—original draft preparation, M.B., M.A.M.C. and E.M.; writing—review and editing, M.B., M.A.M.C., S.C., E.M. and R.P.; visualization, M.B., M.A.M.C. and S.C.; supervision, E.M. and R.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The data will be made available upon request.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Nowak, D.J.; Crane, D.E. Carbon storage and sequestration by urban trees in the USA. Environ. Pollut. 2002, 116, 381–389.
  2. Haase, D.; Güneralp, B.; Dahiya, B.; Bai, X.; Elmqvist, T. Global urbanization. Urban Planet Knowl. Towar. Sustain. Cities 2018, 19, 326–339.
  3. Puplampu, D.A.; Boafo, Y.A. Exploring the impacts of urban expansion on green spaces availability and delivery of ecosystem services in the Accra metropolis. Environ. Chall. 2021, 5, 100283.
  4. Ribeiro, H.V.; Rybski, D.; Kropp, J.P. Effects of changing population or density on urban carbon dioxide emissions. Nat. Commun. 2019, 10, 3204.
  5. Suwardhi, D.; Fauzan, K.N.; Harto, A.B.; Soeksmantono, B.; Virtriana, R.; Murtiyoso, A. 3D Modeling of Individual Trees from LiDAR and Photogrammetric Point Clouds by Explicit Parametric Representations for Green Open Space (GOS) Management. ISPRS Int. J. Geo-Inf. 2022, 11, 174.
  6. Baines, O.; Wilkes, P.; Disney, M. Quantifying urban forest structure with open-access remote sensing data sets. Urban For. Urban Green. 2020, 50, 126653.
  7. Doick, K.J.; Davies, H.J.; Moss, J.; Coventry, R.; Handley, P.; VazMonteiro, M.; Rogers, K.; Simpkin, P. The Canopy Cover of England’s Towns and Cities: Baselining and setting targets to improve human health and well-being. In Proceedings of the Trees, People and the Built Environment III, International Urban Trees Research Conference, Birmingham, UK, 3–14 April 2011; University of Birmingham: Birmingham, UK, 2017; pp. 5–6.
  8. Jo, H.-K.; Kim, J.-Y.; Park, H.-M. Carbon reduction and planning strategies for urban parks in Seoul. Urban For. Urban Green. 2019, 41, 48–54.
  9. Kanniah, K.D.; Muhamad, N.; Kang, C.S. Remote sensing assessment of carbon storage by urban forest. IOP Conf. Ser. Earth Environ. Sci. 2014, 18, 12151.
  10. Rosenfeld, A.H.; Akbari, H.; Romm, J.J.; Pomerantz, M. Cool communities: Strategies for heat island mitigation and smog reduction. Energy Build. 1998, 28, 51–62.
  11. Nowak, D.J.; Greenfield, E.J.; Hoehn, R.E.; Lapoint, E. Carbon storage and sequestration by trees in urban and community areas of the United States. Environ. Pollut. 2013, 178, 229–236.
  12. McPherson, E.G. Carbon Dioxide Reduction through Urban Forestry: Guidelines for Professional and Volunteer Tree Planters; US Department of Agriculture, Forest Service, Pacific Southwest Research Station: Placerville, CA, USA, 1999; Volume 171.
  13. Lin, J.; Chen, D.; Wu, W.; Liao, X. Estimating aboveground biomass of urban forest trees with dual-source UAV acquired point clouds. Urban For. Urban Green. 2022, 69, 127521.
  14. Penman, J.; Gytarsky, M.; Hiraishi, T.; Krug, T.; Kruger, D.; Pipatti, R.; Buendia, L.; Miwa, K.; Ngara, T.; Tanabe, K. Good practice guidance for land use, land-use change and forestry. In Good Practice Guidance for Land Use, Land-Use Change and Forestry; IPCC: Geneva, Switzerland, 2003.
  15. Li, X.; Chen, W.Y.; Sanesi, G.; Lafortezza, R. Remote sensing in urban forestry: Recent applications and future directions. Remote Sens. 2019, 11, 1144.
  16. Song, Y.; Imanishi, J.; Sasaki, T.; Ioki, K.; Morimoto, Y. Estimation of broad-leaved canopy growth in the urban forested area using multi-temporal airborne LiDAR datasets. Urban For. Urban Green. 2016, 16, 142–149.
  17. Cabo, C.; Del Pozo, S.; Rodríguez-Gonzálvez, P.; Ordóñez, C.; González-Aguilera, D. Comparing terrestrial laser scanning (TLS) and wearable laser scanning (WLS) for individual tree modeling at plot level. Remote Sens. 2018, 10, 540.
  18. Calders, K.; Adams, J.; Armston, J.; Bartholomeus, H.; Bauwens, S.; Bentley, L.P.; Chave, J.; Danson, F.M.; Demol, M.; Disney, M.; et al. Terrestrial laser scanning in forest ecology: Expanding the horizon. Remote Sens. Environ. 2020, 251, 112102.
  19. Shao, T.; Qu, Y.; Du, J. A low-cost integrated sensor for measuring tree diameter at breast height (DBH). Comput. Electron. Agric. 2022, 199, 107140.
  20. Hyyppä, E.; Yu, X.; Kaartinen, H.; Hakala, T.; Kukko, A.; Vastaranta, M.; Hyyppä, J. Comparison of backpack, handheld, under-canopy UAV, and above-canopy UAV laser scanning for field reference data collection in boreal forests. Remote Sens. 2020, 12, 3327.
  21. Tao, W.; Lei, Y.; Mooney, P. Dense point cloud extraction from UAV captured images in forest area. In Proceedings of the 2011 IEEE International Conference on Spatial Data Mining and Geographical Knowledge Services, Fuzhou, China, 29 June–1 July 2011; IEEE: Piscataway, NJ, USA, 2011; pp. 389–392.
  22. Yang, W.; Liu, Y.; He, H.; Lin, H.; Qiu, G.; Guo, L. Airborne LiDAR and Photogrammetric Point Cloud Fusion for Extraction of Urban Tree Metrics According to Street Network Segmentation. IEEE Access 2021, 9, 97834–97842.
  23. Akpo, H.A.; Atindogbé, G.; Obiakara, M.C.; Adjinanoukon, A.B.; Gbedolo, M.; Fonton, N.H. Accuracy of common stem volume formulae using terrestrial photogrammetric point clouds: A case study with savanna trees in Benin. J. For. Res. 2021, 32, 2415–2422. [Google Scholar] [CrossRef]
  24. Fekry, R.; Yao, W.; Cao, L.; Shen, X. Ground-based/UAV-LiDAR data fusion for quantitative structure modeling and tree parameter retrieval in subtropical planted forest. For. Ecosyst. 2022, 9, 100065. [Google Scholar] [CrossRef]
  25. SADEGHIAN, H.; Naghavi, H.; Maleknia, R.; Soosani, J.; Pfeifer, N. Estimating the attributes of urban trees using terrestrial photogrammetry. Environ. Monit. Assess. 2022, 194, 625. [Google Scholar] [CrossRef] [PubMed]
  26. Mokroš, M.; Výbošťok, J.; Grznárová, A.; Bošela, M.; Šebeň, V.; Merganič, J. Non-destructive monitoring of annual trunk increments by terrestrial structure from motion photogrammetry. PLoS ONE 2020, 15, e0230082. [Google Scholar] [CrossRef] [PubMed]
  27. Moreno, G.; Martínez Carretero, E.; Vázquez, J.; Vento, B.; Ontivero, M.; Duplancic, A.; Alcalá, J. Quantifying the role of Platanus hispanica in carbon storage in an urban forest in central west Argentina. Arboric. J. 2023, 45, 118–131. [Google Scholar] [CrossRef]
  28. Schreyer, J.; Tigges, J.; Lakes, T.; Churkina, G. Using airborne LiDAR and QuickBird data for modelling urban tree carbon storage and its distribution-a case study of Berlin. Remote Sens. 2014, 6, 10636–10655. [Google Scholar] [CrossRef]
  29. Strohbach, M.W.; Haase, D. Above-ground carbon storage by urban trees in Leipzig, Germany: Analysis of patterns in a European city. Landsc. Urban Plan. 2012, 104, 95–104. [Google Scholar] [CrossRef]
  30. Miller, J.; Morgenroth, J.; Gomez, C. 3D modelling of individual trees using a handheld camera: Accuracy of height, diameter and volume estimates. Urban For. Urban Green. 2015, 14, 932–940. [Google Scholar] [CrossRef]
  31. Panagiotidis, D.; Surový, P.; Kuželka, K. Accuracy of Structure from Motion models in comparison with terrestrial laser scanner for the analysis of DBH and height influence on error behaviour. J. For. Sci. 2016, 62, 357–365. [Google Scholar] [CrossRef]
  32. Choi, K.; Lim, W.; Chang, B.; Jeong, J.; Kim, I.; Park, C.-R.; Ko, D.W. An automatic approach for tree species detection and profile estimation of urban street trees using deep learning and Google street view images. Isprs J. Photogramm. Remote Sens. 2022, 190, 165–180. [Google Scholar] [CrossRef]
  33. Tabacchi, G.; Di Cosmo, L.; Gasparini, P. Aboveground tree volume and phytomass prediction equations for forest species in Italy. Eur. J. For. Res. 2011, 130, 911–934. [Google Scholar] [CrossRef]
  34. Guerra-Hernández, J.; Cosenza, D.N.; Rodriguez, L.C.E.; Silva, M.; Tomé, M.; Díaz-Varela, R.A.; González-Ferreiro, E. Comparison of ALS-and UAV (SfM)-derived high-density point clouds for individual tree detection in Eucalyptus plantations. Int. J. Remote Sens. 2018, 39, 5211–5235. [Google Scholar] [CrossRef]
  35. Lian, X.; Zhang, H.; Xiao, W.; Lei, Y.; Ge, L.; Qin, K.; He, Y.; Dong, Q.; Li, L.; Han, Y. Biomass Calculations of Individual Trees Based on Unmanned Aerial Vehicle Multispectral Imagery and Laser Scanning Combined with Terrestrial Laser Scanning in Complex Stands. Remote Sens. 2022, 14, 4715. [Google Scholar] [CrossRef]
  36. Lee, J.-H.; Ko, Y.; McPherson, E.G. The feasibility of remotely sensed data to estimate urban tree dimensions and biomass. Urban For. Urban Green. 2016, 16, 208–220. [Google Scholar] [CrossRef]
  37. Mokroš, M.; Liang, X.; Surový, P.; Valent, P.; Čerňava, J.; Chudý, F.; Tunák, D.; Saloň, Š.; Merganič, J. Evaluation of close-range photogrammetry image collection methods for estimating tree diameters. ISPRS Int. J. Geo-Inf. 2018, 7, 93. [Google Scholar] [CrossRef]
  38. Balestra, M.; Tonelli, E.; Vitali, A.; Urbinati, C.; Frontoni, E.; Pierdicca, R. Geomatic Data Fusion for 3D Tree Modeling: The Case Study of Monumental Chestnut Trees. Remote Sens. 2023, 15, 2197. [Google Scholar] [CrossRef]
  39. Chiappini, S.; Pierdicca, R.; Malandra, F.; Tonelli, E.; Malinverni, E.S.; Urbinati, C.; Vitali, A. Comparing Mobile Laser Scanner and manual measurements for dendrometric variables estimation in a black pine (Pinus nigra Arn.) plantation. Comput. Electron. Agric. 2022, 198, 107069. [Google Scholar] [CrossRef]
  40. Tinkham, W.T.; Swayze, N.C. Influence of Agisoft Metashape parameters on UAS structure from motion individual tree detection from canopy height models. Forests 2021, 12, 250. [Google Scholar] [CrossRef]
  41. Zhang, W.; Qi, J.; Wan, P.; Wang, H.; Xie, D.; Wang, X.; Yan, G. An easy-to-use airborne LiDAR data filtering method based on cloth simulation. Remote Sens. 2016, 8, 501. [Google Scholar] [CrossRef]
  42. Lecigne, B.; Delagrange, S.; Messier, C. Exploring trees in three dimensions: VoxR, a novel voxel-based R package dedicated to analysing the complex arrangement of tree crowns. Ann. Bot. 2018, 121, 589–601. [Google Scholar] [CrossRef] [PubMed]
  43. Losi, C.J.; Siccama, T.G.; Condit, R.; Morales, J.E. Analysis of alternative methods for estimating carbon stock in young tropical plantations. For. Ecol. Manag. 2003, 184, 355–368. [Google Scholar] [CrossRef]
  44. Vashum, K.T.; Jayakumar, S. Methods to estimate above-ground biomass and carbon stock in natural forests-a review. J. Ecosyst. Ecography 2012, 2, 1–7. [Google Scholar] [CrossRef]
  45. Whittaker, R.H.; Likens, G.E. Carbon in the biota. In Brookhaven Symposia in Biology; Associates Universities: Washington, DC, USA, 1973; Volume 30, pp. 281–302. [Google Scholar] [PubMed]
  46. Solomon, S.; Qin, D.; Manning, M.; Chen, Z.; Marquis, M.; Averyt, K.; Tignor, M.; Miller, H. IPCC, 2007: Climate Change 2007: The Physical Science Basis. In Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change; Cambridge University Press: Cambridge, UK; New York, NY, USA; 996p.
  47. Vicharnakorn, P.; Shrestha, R.P.; Nagai, M.; Salam, A.P.; Kiratiprayoon, S. Carbon stock assessment using remote sensing and forest inventory data in Savannakhet, Lao PDR. Remote Sens. 2014, 6, 5452–5479. [Google Scholar] [CrossRef]
  48. Correia, A.C.; Faias, S.P.; Ruiz-Peinado, R.; Chianucci, F.; Cutini, A.; Fontes, L.; Manetti, M.C.; Montero, G.; Soares, P.; Tomé, M. Generalized biomass equations for Stone pine (Pinus pinea L.) across the Mediterranean basin. For. Ecol. Manag. 2018, 429, 425–436. [Google Scholar] [CrossRef]
  49. Cutini, A.; Chianucci, F.; Manetti, M.C. Allometric relationships for volume and biomass for stone pine (Pinus pinea L.) in Italian coastal stands. Iforest-Biogeosciences For. 2013, 6, 331. [Google Scholar] [CrossRef]
  50. Alonzo, M.; McFadden, J.P.; Nowak, D.J.; Roberts, D.A. Mapping urban forest structure and function using hyperspectral imagery and lidar data. Urban For. Urban Green. 2016, 17, 135–147. [Google Scholar] [CrossRef]
  51. Mitchell, M.G.E.; Johansen, K.; Maron, M.; McAlpine, C.A.; Wu, D.; Rhodes, J.R. Identification of fine scale and landscape scale drivers of urban aboveground carbon stocks using high-resolution modeling and mapping. Sci. Total Environ. 2018, 622, 57–70. [Google Scholar] [CrossRef] [PubMed]
  52. Raciti, S.M.; Hutyra, L.R.; Newell, J.D. Mapping carbon storage in urban trees with multi-source remote sensing data: Relationships between biomass, land use, and demographics in Boston neighborhoods. Sci. Total Environ. 2014, 500, 72–83. [Google Scholar] [CrossRef] [PubMed]
  53. Shrestha, R.; Wynne, R.H. Estimating biophysical parameters of individual trees in an urban environment using small footprint discrete-return imaging lidar. Remote Sens. 2012, 4, 484–508. [Google Scholar] [CrossRef]
  54. Mikita, T.; Janata, P.; Surový, P. Forest stand inventory based on combined aerial and terrestrial close-range photogrammetry. Forests 2016, 7, 165. [Google Scholar] [CrossRef]
  55. Ghanbari Parmehr, E.; Amati, M. Individual tree canopy parameters estimation using UAV-based photogrammetric and LiDAR point clouds in an urban park. Remote Sens. 2021, 13, 2062. [Google Scholar] [CrossRef]
  56. Hummel, S.; Hudak, A.T.; Uebler, E.H.; Falkowski, M.J.; Megown, K.A. A comparison of accuracy and cost of LiDAR versus stand exam data for landscape management on the Malheur National Forest. J. For. 2011, 109, 267–273. [Google Scholar] [CrossRef]
  57. Sugarbaker, L.J.; Eldridge, D.F.; Jason, A.L.; Lukas, V.; Saghy, D.L.; Stoker, J.M.; Thunen, D.R. Status of the 3D Elevation Program, 2015; US Geological Survey: Reston, VA, USA, 2017. [Google Scholar]
  58. Nex, F.; Armenakis, C.; Cramer, M.; Cucci, D.A.; Gerke, M.; Honkavaara, E.; Kukko, A.; Persello, C.; Skaloud, J. UAV in the advent of the twenties: Where we stand and what is next. ISPRS J. Photogramm. Remote Sens. 2022, 184, 215–242. [Google Scholar] [CrossRef]
  59. Stöcker, C.; Bennett, R.; Nex, F.; Gerke, M.; Zevenbergen, J. Review of the current state of UAV regulations. Remote Sens. 2017, 9, 459. [Google Scholar] [CrossRef]
Figure 1. The three-lined green area in the city of Ancona representing the study area. The Italian regions are shown in red.
Figure 2. Photogrammetric process for the point cloud generation. (A) GoPro MAX video path. (B) DJI Air 2S automated flight path, with a preview of some of the images collected. In both panels, the blue marks are the estimated camera positions and the black lines are the camera axes; the numbers indicate the GCPs fixed in the study area.
Figure 3. The workflow of the tested methodology (field survey and 3D model reconstruction) for the individual tree CS computations.
Figure 4. 3D merged model of the whole study area (UAV + spherical photogrammetry).
Figure 5. Dense cloud normalization, based on the distance between each point and the 2D ground raster.
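The normalization step shown in Figure 5 amounts to subtracting, from each point's elevation, the value of the ground-raster (DTM) cell the point falls in, so that z becomes height above ground. A minimal pure-Python sketch of that operation (the function and variable names are illustrative, not the authors' code):

```python
def normalize_heights(points, dtm, origin, cell_size):
    """Subtract the ground elevation under each point.

    points: iterable of (x, y, z) tuples in the raster's CRS.
    dtm: 2D list (rows x cols) of ground elevations.
    origin: (x0, y0) of the raster's lower-left corner.
    cell_size: raster resolution in meters.
    """
    normalized = []
    for x, y, z in points:
        # Locate the raster cell containing the point.
        col = int((x - origin[0]) // cell_size)
        row = int((y - origin[1]) // cell_size)
        normalized.append(z - dtm[row][col])  # height above ground
    return normalized

# Toy example: flat ground at 100 m elevation, one point 12.5 m above it.
dtm = [[100.0] * 10 for _ in range(10)]
print(normalize_heights([(5.0, 5.0, 112.5)], dtm, (0.0, 0.0), 1.0))  # → [12.5]
```

In practice this is performed on the dense cloud with dedicated tools (e.g., rasterized DTM subtraction in point cloud software); the sketch only illustrates the geometry of the operation.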
Figure 6. The same scene in (A) the LiDAR point cloud used as reference data and (B) the dense cloud obtained with the UAV-spherical data fusion.
Figure 7. Comparison of the DBH obtained using UAV-spherical and LiDAR data.
Figure 8. DBH errors from the UAV-spherical fusion approach compared to the DBH computed from the professional LiDAR device.
Figure 9. Comparison of the H obtained using UAV-spherical and LiDAR data.
Figure 10. H errors from the UAV-spherical fusion approach compared to the H computed from the professional LiDAR device.
Figure 11. Comparison of the tree-stand CS computed using UAV-spherical and LiDAR data.
Figure 12. CS errors from the UAV-spherical fusion approach compared to the CS computed from the professional LiDAR device.
Table 1. SfM errors in both meters and pixels for the UAV survey.

| GCP | Device | X Error [m] | Y Error [m] | Z Error [m] | Total Error [m] | Error [pix] |
|-----|--------|-------------|-------------|-------------|-----------------|-------------|
| 101 | DJI Air 2S | 0.007356 | 0.011780 | 0.000029 | 0.013889 | 1.546 |
| 102 | DJI Air 2S | −0.009771 | −0.024303 | −0.000097 | 0.026194 | 0.709 |
| 103 | DJI Air 2S | 0.022293 | 0.022065 | 0.000155 | 0.031367 | 2.923 |
| 104 | DJI Air 2S | −0.043506 | −0.013217 | 0.000336 | 0.045471 | 3.594 |
| 105 | DJI Air 2S | 0.023628 | 0.003673 | −0.000423 | 0.023917 | 1.705 |
| Total | | 0.024890 | 0.016760 | 0.000255 | 0.030009 | 2.146 |
Table 2. SfM errors in both meters and pixels for the spherical data survey.

| GCP | Device | X Error [m] | Y Error [m] | Z Error [m] | Total Error [m] | Error [pix] |
|-----|--------|-------------|-------------|-------------|-----------------|-------------|
| 101 | GoPro MAX | −0.001599 | −0.009673 | −0.008452 | 0.012945 | 0.673 |
| 102 | GoPro MAX | −0.004356 | −0.003929 | 0.019521 | 0.020384 | 0.990 |
| 103 | GoPro MAX | 0.011675 | 0.002757 | −0.024057 | 0.026882 | 1.046 |
| 104 | GoPro MAX | −0.016791 | −0.016406 | 0.014687 | 0.027691 | 0.832 |
| 105 | GoPro MAX | 0.005251 | 0.008889 | −0.001792 | 0.010479 | 0.707 |
| Total | | 0.009668 | 0.009641 | 0.015812 | 0.020892 | 0.846 |
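Assuming, as the magnitudes in Tables 1 and 2 suggest, that each GCP's total error is the Euclidean norm of its per-axis residuals and that the "Total" row is the root mean square (RMS) across GCPs, the reported figures can be reproduced. The snippet below is an illustrative check under that assumption, not the authors' processing code:

```python
import math

def total_error(x, y, z):
    # Euclidean norm of the per-axis residuals of one GCP.
    return math.sqrt(x**2 + y**2 + z**2)

def rms(values):
    # Root mean square across GCPs.
    return math.sqrt(sum(v**2 for v in values) / len(values))

# Per-axis residuals in meters for the UAV survey (Table 1).
uav_axis_errors = {
    101: (0.007356, 0.011780, 0.000029),
    102: (-0.009771, -0.024303, -0.000097),
    103: (0.022293, 0.022065, 0.000155),
    104: (-0.043506, -0.013217, 0.000336),
    105: (0.023628, 0.003673, -0.000423),
}

totals = {gcp: total_error(*e) for gcp, e in uav_axis_errors.items()}
print(round(totals[101], 6))                 # ~0.013889 (Table 1, GCP 101)
print(round(rms(list(totals.values())), 6))  # ~0.030009 (Table 1, "Total")
```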
Table 3. The DBH (cm) and H (m) obtained from both datasets.

| Tree ID | DBH LiDAR (cm) | DBH UAV-Spherical (cm) | H LiDAR (m) | H UAV-Spherical (m) |
|---------|----------------|------------------------|-------------|---------------------|
| 1 | 61.10 | 66.83 | 11.24 | 10.56 |
| 2 | 62.11 | 70.25 | 9.84 | 10.23 |
| 3 | 53.85 | 56.35 | 10.50 | 10.58 |
| 4 | 54.64 | 60.80 | 10.71 | 10.25 |
| 5 | 50.62 | 56.07 | 10.73 | 10.15 |
| 6 | 43.08 | 44.84 | 10.78 | 10.27 |
| 7 | 47.64 | 49.94 | 11.31 | 11.25 |
| 8 | 61.04 | 65.70 | 12.51 | 11.87 |
| 9 | 56.93 | 59.18 | 12.79 | 12.62 |
| 10 | 50.44 | 51.07 | 11.77 | 11.65 |
| 11 | 52.25 | 55.92 | 12.45 | 12.15 |
| 12 | 53.55 | 58.55 | 14.04 | 13.69 |
| 13 | 48.59 | 51.70 | 12.52 | 12.42 |
| 14 | 50.60 | 48.29 | 12.86 | 13.00 |
| 15 | 75.30 | 78.49 | 14.24 | 13.68 |
| 16 | 52.10 | 52.71 | 13.79 | 13.17 |
| 17 | 49.67 | 54.45 | 12.84 | 12.57 |
| 18 | 43.32 | 48.22 | 11.06 | 10.44 |
| 19 | 52.87 | 54.69 | 12.81 | 12.59 |
| 20 | 55.05 | 54.35 | 19.55 | 19.43 |

RMSD: DBH 4.02 cm absolute (12.48%); H 0.41 m absolute (4.21%).
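The absolute RMSD values reported with Table 3 follow the standard definition, i.e., the root of the mean squared per-tree difference between the UAV-spherical estimates and the LiDAR reference. A short sketch reproducing them from the table's columns:

```python
import math

def rmsd(reference, estimate):
    # Root-mean-square difference between paired per-tree measurements.
    return math.sqrt(sum((e - r) ** 2 for r, e in zip(reference, estimate))
                     / len(reference))

# DBH (cm) and H (m) per tree, from Table 3.
dbh_lidar = [61.10, 62.11, 53.85, 54.64, 50.62, 43.08, 47.64, 61.04, 56.93,
             50.44, 52.25, 53.55, 48.59, 50.60, 75.30, 52.10, 49.67, 43.32,
             52.87, 55.05]
dbh_uav = [66.83, 70.25, 56.35, 60.80, 56.07, 44.84, 49.94, 65.70, 59.18,
           51.07, 55.92, 58.55, 51.70, 48.29, 78.49, 52.71, 54.45, 48.22,
           54.69, 54.35]
h_lidar = [11.24, 9.84, 10.50, 10.71, 10.73, 10.78, 11.31, 12.51, 12.79,
           11.77, 12.45, 14.04, 12.52, 12.86, 14.24, 13.79, 12.84, 11.06,
           12.81, 19.55]
h_uav = [10.56, 10.23, 10.58, 10.25, 10.15, 10.27, 11.25, 11.87, 12.62,
         11.65, 12.15, 13.69, 12.42, 13.00, 13.68, 13.17, 12.57, 10.44,
         12.59, 19.43]

print(round(rmsd(dbh_lidar, dbh_uav), 2))  # → 4.02 (cm)
print(round(rmsd(h_lidar, h_uav), 2))      # → 0.41 (m)
```

The percentage figures (12.48% and 4.21%) depend on the normalization chosen by the authors and are not reproduced here.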
Table 4. The estimated CS (kg tree⁻¹) from both datasets, including the per-tree differences.

| Tree ID | Tree Species | CS UAV-Spherical (kg tree⁻¹) | CS LiDAR (kg tree⁻¹) | Difference (kg tree⁻¹) |
|---------|--------------|------------------------------|----------------------|------------------------|
| 1 | Pinus pinea L. | 593.82 | 528.56 | 65.26 |
| 2 | Pinus pinea L. | 635.50 | 477.99 | 157.51 |
| 3 | Pinus pinea L. | 423.01 | 383.56 | 39.45 |
| 4 | Pinus pinea L. | 477.34 | 402.72 | 74.62 |
| 5 | Pinus pinea L. | 401.78 | 346.28 | 55.50 |
| 6 | Pinus pinea L. | 260.17 | 251.98 | 8.19 |
| 7 | Pinus pinea L. | 353.38 | 323.37 | 30.01 |
| 8 | Pinus pinea L. | 645.37 | 586.85 | 58.52 |
| 9 | Pinus pinea L. | 556.56 | 522.12 | 34.44 |
| 10 | Pinus pinea L. | 382.75 | 377.04 | 5.71 |
| 11 | Pinus pinea L. | 478.64 | 428.11 | 50.53 |
| 12 | Pinus pinea L. | 591.06 | 507.10 | 83.96 |
| 13 | Pinus pinea L. | 418.14 | 372.42 | 45.72 |
| 14 | Pinus pinea L. | 381.81 | 414.89 | 33.08 |
| 15 | Pinus pinea L. | 1061.18 | 1016.23 | 44.95 |
| 16 | Pinus pinea L. | 460.86 | 471.41 | 10.55 |
| 17 | Pinus pinea L. | 469.23 | 399.03 | 70.20 |
| 18 | Pinus pinea L. | 305.80 | 261.55 | 44.25 |
| 19 | Pinus pinea L. | 474.29 | 450.81 | 23.48 |
| 20 | Platanus hispanica | 423.48 | 435.57 | 12.09 |

RMSD: 58.05 kg tree⁻¹ absolute (7.60%).
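Per-tree CS values such as those in Table 4 are typically obtained by converting DBH and H into aboveground biomass through a species-specific allometric equation and applying a carbon fraction of roughly 0.5, a widely used convention in forest carbon accounting. The sketch below illustrates that pipeline only; the coefficients a and b are hypothetical placeholders, not the species-specific values used in this study:

```python
def carbon_stock(dbh_cm, height_m, a=0.05, b=0.92, carbon_fraction=0.5):
    """Illustrative allometric pipeline (coefficients a, b are hypothetical).

    Single-tree biomass models often take the form AGB = a * (DBH^2 * H)^b;
    CS is then the carbon fraction (~0.5) of the aboveground biomass.
    """
    agb_kg = a * (dbh_cm ** 2 * height_m) ** b  # aboveground biomass, kg
    return carbon_fraction * agb_kg             # carbon stock, kg tree^-1

# Larger trees store more carbon under any monotonic allometry:
# tree 1 (61.10 cm, 11.24 m) vs. tree 6 (43.08 cm, 10.78 m) from Table 3.
print(carbon_stock(61.10, 11.24) > carbon_stock(43.08, 10.78))  # → True
```

With calibrated, species-specific coefficients (e.g., from national biomass equation tables), the same two inputs DBH and H are all that the 3D models need to supply, which is what makes the UAV-spherical fusion workable for CS estimation.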