Article

UAV Photogrammetry for Estimating Stand Parameters of an Old Japanese Larch Plantation Using Different Filtering Methods at Two Flight Altitudes

by Jeyavanan Karthigesu 1,2, Toshiaki Owari 3,*, Satoshi Tsuyuki 1 and Takuya Hiroshima 1

1 Department of Global Agricultural Sciences, Graduate School of Agricultural and Life Sciences, The University of Tokyo, Tokyo 113-8657, Japan
2 Department of Agronomy, Faculty of Agriculture, University of Jaffna, Jaffna 40000, Sri Lanka
3 The University of Tokyo Hokkaido Forest, Graduate School of Agricultural and Life Sciences, The University of Tokyo, Furano 079-1563, Hokkaido, Japan
* Author to whom correspondence should be addressed.
Sensors 2023, 23(24), 9907; https://doi.org/10.3390/s23249907
Submission received: 23 October 2023 / Revised: 8 December 2023 / Accepted: 14 December 2023 / Published: 18 December 2023

Abstract

Old plantations are iconic sites, and estimating stand parameters is crucial for valuation and management. This study aimed to estimate stand parameters of a 115-year-old Japanese larch (Larix kaempferi (Lamb.) Carrière) plantation at the University of Tokyo Hokkaido Forest (UTHF) in central Hokkaido, northern Japan, using unmanned aerial vehicle (UAV) photogrammetry. High-resolution RGB imagery was collected using a DJI Matrice 300 real-time kinematic (RTK) at altitudes of 80 and 120 m. Structure from motion (SfM) technology was applied to generate 3D point clouds and orthomosaics. We used different filtering methods, search radii, and window sizes for individual tree detection (ITD), and tree height (TH) and crown area (CA) were estimated from a canopy height model (CHM). Additionally, a freely available shiny R package (SRP) and manually digitalized CA were used. A multiple linear regression (MLR) model was used to estimate the diameter at breast height (DBH), stem volume (V), and carbon stock (CST). Higher accuracy was obtained for ITD (F-score: 0.8–0.87) and TH (R2: 0.76–0.77; RMSE: 1.45–1.55 m) than for other stand parameters. Overall, the flying altitude of the UAV and selected filtering methods influenced the success of stand parameter estimation in old-aged plantations, with the UAV at 80 m generating more accurate results for ITD, CA, and DBH, while the UAV at 120 m produced higher accuracy for TH, V, and CST with Gaussian and mean filtering.

1. Introduction

Old plantations are iconic sites that have great value, and assessing their stand parameters is therefore extremely important [1]. The Japanese larch (Larix kaempferi (Lamb.) Carrière) is a coniferous species endemic to Honshu Island, central Japan [2,3,4,5], but it is a non-native and key plantation species in Hokkaido, northern Japan [6]. It is an economically important deciduous conifer that grows in cool-temperate forests [4]. Japanese larch has characteristics suitable for forestry, and plantations were introduced to Hokkaido from the central mountainous region of Honshu in the early part of the last century [3]. These plantations have succeeded owing to the species' rapid growth and its disease and cold resistance compared with other planted species. Japanese larch was therefore used extensively for reforestation in northern Japan from the 1960s to the 1970s [7]. According to the Forestry Agency, in the National Forest Inventory of Japan, only 3% of the total forest area consists of larch (including both natural and plantation forests), whereas its proportion in Japan's planted forests is 10% [8]. The recommended cutting period of Japanese larch is 40–60 years [9,10]. We considered our study site to be an old plantation because it was more than 100 years old, i.e., approximately double the recommended cutting period. However, stand parameter data for old larch plantations are scarce in the region.
Forest inventory information is extremely important for forest management. The tree height (TH) and diameter at breast height (DBH) provide useful information in the field of forest research, allowing for the quantification of timber resources; the evaluation of the ecological and economic value of the forest stand; the computation of the number of individual trees, stem volume (V), and carbon stock (CST); and an understanding of the rate and pattern of forest regeneration [11,12,13]. In forest management, TH and crown area (CA) are used to develop allometric equations for CST calculation and a broad range of stand attributes [14,15]. For example, TH is used for both individual tree volume and stand volume estimation. The estimated volume of a forest is important for assessing the hydrological cycle [16] and the production capacity of the site and of a single tree [17]. Further, stand density, competition, and survival are characterized by CA [18,19]. Stand density is a significant parameter that explains the dimensions and distribution of trees [20].
The collection of field data is laborious, time-consuming, and only appropriate for small forest stands. High accuracy in stand parameter estimation is difficult when using remote sensing technology due to issues related to uncertainty, technology, the availability of high-spatial-resolution data, and cost. Light detection and ranging (LiDAR) provides accurate data but is not suitable for large-scale forest monitoring due to its high cost. Satellite data can be very affordable; however, there are several limiting factors, such as a relatively low spatial resolution, occlusion by cloud cover, and difficulties in obtaining the data at specific times. Unmanned aerial vehicle (UAV) technology is a recent advance in remote sensing that can be used to characterize plantation forests [21]. UAVs are the most efficient platforms for obtaining remotely sensed data and can provide extremely high-spatial-resolution, low-cost data and cloud-free images with high versatility, flexibility, and adaptability [22]. UAV photogrammetry provides high-resolution images for estimating individual tree position and TH and for crown delineation with high accuracy [14,23]. However, digital terrain models (DTMs) generated from UAV photogrammetry lack accuracy due to the occlusion effect [24,25,26]. It has been reported that UAV photogrammetry during leaf-off conditions is able to generate an accurate UAV DTM [27]. Moe et al. [28] achieved a high accuracy (63–73%) for forest canopy classification in a complex mixed conifer–broadleaf forest using a combination of a UAV digital surface model (DSM) and an airborne LiDAR DTM. Wang et al. [27] used a radial basis function neural network (RBFNN) with spatial interpolation to achieve high accuracy in a UAV DSM. Hastaoglu et al. [29] used an inverse distance weighted (IDW) model that took into account the field slope and the directional distributions of reference points in IDW-based interpolations to increase the accuracy of the DTM.
However, in a dense forest, it is not possible to derive an accurate DTM using photogrammetric methods, because insufficient ground surface is visible in the aerial images [24,25,26]. Xu et al. [30] built a high-precision DTM from the point cloud generated by LiDAR and then subtracted the DTM from the DSM generated from the photogrammetric point cloud to obtain the CHM.
Tree crowns and other structural variables are extracted either from canopy height models (CHMs) or normalized point clouds for individual tree detection (ITD). Individual tree metrics are extracted within the segmented tree crowns. However, tree density, forest type, and tree species are the main factors influencing the accuracy of tree crown detection [31,32]. Satisfactory results have been obtained for conifer plantations in many studies [33,34], and stand parameter estimation in old Sugi (Cryptomeria japonica), Hinoki (Chamaecyparis obtusa), and other conifer plantations has been performed using UAV technology [35,36,37,38]. However, there have been no studies of old larch plantations using UAV technology. Many algorithms have been used to distinguish tree crowns, namely, inverse watershed segmentation (IWS), watershed segmentation (WS), seed growing segmentation (SG), and object-based image segmentation (OBIA) [39,40]. In addition, various smoothing techniques, such as lowpass, highpass, Gaussian, and mean filtering, at different kernel sizes, have been used in many studies. Improvements in the quality of remote sensing data and processing workflows have recently enabled remote forest mapping to become more analogous to field-based approaches that involve detecting and characterizing individual trees [41,42,43]. Nasiri et al. [44] found that lowpass filtering with a circular neighborhood at a 25-cell radius (kernel size; cell size with respect to the size of the largest area within the scene) provided highly accurate ITD. Similarly, various software packages have been used for crown segmentation, such as eCognition (Trimble Inc., Sunnyvale, CA, USA), Labkit (in Fiji) [45], and ArcGIS (ESRI, Redlands, CA, USA). In QGIS (Open Source Geospatial Foundation), the System for Automated Geoscientific Analyses (SAGA) software (version 2.1.4) is used to conduct the segmentation process.
The WS approach has been shown to have an acceptable ability to delineate tree crowns using a CHM in a closed forest canopy structure [14,46]. Moe et al. [28] studied crown segmentation using OBIA and a multiresolution segmentation algorithm using the eCognition Developer, and accuracy was confirmed by manual delineation of crown cover (CC). Different algorithms give different tree crown diameters for different flight altitudes [40].
In this study, we examined the capabilities of high-resolution UAV imagery to estimate ITD, TH, CA, DBH, V, and CST using various filtering methods, and flight altitudes of 80 and 120 m, in an old larch plantation site. We considered the following questions: Can a UAV generate an accurate CHM and high-resolution orthomosaic in an old larch plantation? How do different UAV flying altitudes and filtering methods improve ITD? Can UAV photogrammetry estimate TH accurately in an old larch plantation? Can UAV photogrammetry estimate the CA and CC? What are the most important UAV-derived metrics for estimating DBH, V, and CST? To estimate these stand parameters, we used various filtering methods, i.e., lowpass, Gaussian, and mean, at different search radii and window sizes using a combination of ArcGIS Pro and QGIS in the SAGA and the open-source shiny R package (SRP) [47,48].

2. Review of Literature

Photogrammetry is a technique that derives the required information by creating a 3D model from 2D images. Common points are matched from a series of overlapping 2D images to create the 3D model through Structure-from-Motion (SfM) technology [49,50,51]. The photogrammetry technique has been applied in many fields such as surveying, civil engineering, urban planning, gas detection, fire monitoring, archeology, mining, industry, urban management, agriculture, and forest management [49,51,52].
In the sustainable forest management approach, the estimation of forest stand parameters is extremely important. Gómez et al. [53] stated that age class, stem density, stem frequency, DBH, CA, crown closure, mean crown size, crown width, circumference, TH, mean stand height, maximum height, basal area, biomass, and stand V are forest structural parameters estimated in many studies using high spatial resolution (HSR) satellite imagery (IKONOS, Pan, Pan-sharpened, QuickBird, and SPOT). Gómez et al. [53] extracted quadratic mean diameter, basal area, and tree density as forest structural parameters to assess wood volume and biomass using QuickBird-2 imagery. Spatial resolution is an important consideration when using remote sensing for forest characterization [54]. In addition to the use of HSR satellite imagery, recent advances in UAVs have provided high-resolution imagery, enabling more reliable forest structure estimates with high accuracy. Jayathunga et al. [55] estimated the standard deviation of height, percentile height, coefficient of variation in height, skewness, kurtosis, and canopy cover above mean height using a fixed-wing UAV. Gao et al. [56] combined UAV laser scanning and ground-based backpack laser scanning to extract individual tree structural parameters and fit volume models in subtropical planted forests in southeastern China.
Belmonte et al. [57] found that UAV-photogrammetric estimates of individual tree height and crown diameter were most accurate at low stand density, with significantly reduced accuracy at high stand density. Individual DBH and stand-level estimates of basal area, stand density, and canopy cover (CC) are commonly used as forest mensuration metrics. Kameyama and Sugiura [58] estimated the CA and TH using different SfM software packages, such as Terra Mapper (version 2.5.1), PhotoScan (version 1.3.2.4205), and Pix4Dmapper (version 4.5.6), to process aerial images acquired by UAV at different altitudes. The UAV has been used successfully in several recent studies to predict the DBH distribution of trees [59], mean TH [60], and aboveground CST [10]. In addition, the high point density of UAV data allows the crowns of individual trees to be delineated, which improves the accuracy of ITD [61]. The SRP is a freely available application developed with a LiDAR analysis tool and a standalone R package called treetop. The treetop package is publicly hosted on shinyapps.io, a service platform for shiny web apps (https://carlosasilva.shinyapps.io/weblidar-treetop/, accessed on 10 February 2023). Detailed methods for its application are provided by Silva et al. [47,62]. The treetop package is capable of fast and effective ITD and crown delineation and is also applicable to UAV-derived CHMs [47]. The local maximum (LM) algorithm finds maxima in the CHM that indicate treetops [63]. The treetops and CA were extracted automatically by adjusting two types of window sizes, referred to as the smoothing window size (SWS) and the fixed window size (FWS). Spurious local maxima detected in the CHM are eliminated by applying a smoothing filter, which increases tree detection accuracy [64]. The Voronoi tessellation-based algorithm is particularly suitable for dense areas of conifer [65] and broadleaf forests [66].
The Voronoi tessellation algorithm was considered suitable for our study area due to the dense canopy of the old larch plantation. Moe et al. [28] visually interpreted the orthomosaic to digitize the conifer tree crown due to the absence of field data for CA and reported that the manual CA had high accuracy compared with the field CA. Mohan et al. [67] visually interpreted high-resolution imagery.
DBH measurement in the field provides an accurate estimate that is highly correlated with other tree parameters [68]. DBH is used as a predictor variable to develop stem V equations, tree growth models, and biomass equations. LiDAR data have been used in many studies for the estimation of individual tree DBH [60,68,69]. Liang et al. [70] reported that the trunk position and DBH accuracy of individual trees were 88.2% and 90.4%, respectively, using an SfM point cloud. Piermattei et al. [71] found that the tree detection rate and the bias of the extracted DBH were 69–98% and 1.13 cm, respectively, using SfM point clouds. Sun et al. [72] applied different methods, such as a linear regression model, a linear model with ridge regularization, support vector regression, random forest, an artificial neural network, and k-nearest neighbors, to predict the individual DBH of larch (Larix olgensis) using UAV-LiDAR. They reported that all methods except linear regression improved the accuracy of the predictions.
In old-growth forests, stand parameters have been estimated at local to regional scales using detailed data, often from airborne laser scanning [73]. ITD, TH, CA, lying deadwood, standing deadwood, canopy cover, stand height, stand density, dominant height, height distribution, gap detection, aboveground biomass, timber V, and tree species have been estimated in old-growth forests using airborne laser scanning, optical very-high-resolution imagery, and synthetic aperture radar. Qiu et al. [74] estimated the TH, DBH, crown width, and age in an old pear orchard using UAV photogrammetry and obtained RMSEs of 0.1814 m, 3.0039 cm, 0.3292 m, and 4.3753 years, respectively. Holiakaa et al. [75] used UAV photogrammetry to estimate ITD, TH, and biomass in Scots pine forests of different ages, including a 115-year-old stand. Zhou and Zhang [76] estimated the TH, CA, and biomass of larch (Larix gmelinii) and Chinese pine (Pinus tabuliformis) plantations of different ages using UAV oblique photogrammetry. Although many previous studies have demonstrated the great potential of UAVs for estimating forest structural parameters and their advantages over airborne LiDAR, the use of UAV photogrammetry with RGB imagery in old larch plantations has not been fully explored.

3. Materials and Methods

3.1. Study Site

Figure 1 is a map of the study area. A 115-year-old Japanese larch plantation site (43°12′55″ N, 142°23′7″ E to 43°13′8″ N, 142°23′31″ E) was selected at the University of Tokyo Hokkaido Forest (UTHF) in Furano City, on Hokkaido Island in northern Japan [77]. The site was planted in 1908 with a seedling density of 3000 stems ha−1. The larch plantation is located in sub-compartment 87B of the UTHF. The study area covers 0.93 ha; the mean annual temperature is 6.6 °C and the annual precipitation is 1196 mm, measured at the arboretum (230 m a.s.l.). Snow covers the ground from late November to early April, with a maximum depth of approximately 1 m. The elevation is 250–300 m above sea level, and the slope is 18–20°.

3.2. Field Data

A field survey was conducted in November 2022. A total of 136 individual larch trees were measured. Seven individual trees of other species were identified in the forest but were not sampled. The tree spatial position, TH, and DBH (1.3 m above ground) were measured. The TH was measured using a Vertex III hypsometer and transponder (Haglöf Sweden AB, Långsele, Sweden). The tree DBH was measured using a diameter tape. The tree spatial locations were measured in 2007 using an Impulse laser rangefinder with a Mapstar electronic compass module (Laser Technology, Inc., Centennial, CO, USA). The ITD, basal area (BA), V, and CST were calculated from these field-measured parameters. We used a species-specific volume table provided by the UTHF. The CST was calculated using the following allometric Equation (1) [78]:
CST = Σj (Vj × Dj × BEFj × (1 + Rj) × CF)
where CST is the carbon stock in living biomass (MgC ha–1); V is the merchantable volume (m3 ha–1), i.e., the volume estimated for each tree species based on the yield table developed for a given region, site class, and stand age; D is the wood density (t–d.m. m–3); BEF is the biomass expansion factor for the conversion of volume; R is the root-to-shoot ratio; CF is the carbon fraction of dry matter (MgC t–d.m.–1); and j is the tree species [78]. For larch, the values of D, BEF, R, and CF were 0.404, 1.15, 0.29, and 0.51, respectively, as suggested by the Greenhouse Gas Inventory Office of Japan and the Ministry of the Environment, Japan [78].
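As a concrete illustration, Equation (1) reduces to a per-tree product summed over trees of one species. The function below is a minimal sketch: the default parameters are the larch values listed above, while the function name and the example volumes are hypothetical and for illustration only.

```python
def carbon_stock(volumes, density=0.404, bef=1.15, root_shoot=0.29, carbon_fraction=0.51):
    """Carbon stock in living biomass (MgC) for one species, following
    Equation (1): CST = V x D x BEF x (1 + R) x CF, summed over trees.
    Default parameter values are those given in the text for Japanese larch."""
    return sum(v * density * bef * (1.0 + root_shoot) * carbon_fraction
               for v in volumes)

# Hypothetical merchantable volumes (m3) for three trees, for illustration only
cst = carbon_stock([2.5, 3.1, 4.0])
```

Per-hectare values follow by dividing by the stand area, as in Table 1.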
The summary statistics of the field data are given in Table 1. The tree density, stand volume, and CST of the larch trees in the stand were 147 stems ha–1, 543 m3 ha−1, and 168 MgC ha−1, respectively.

3.3. UAV Data

The UAV image collection process and the parameter settings in the field are given in Figure 2 and Table 2, respectively. The UAV imagery was acquired using a Matrice 300 real-time kinematic (RTK) drone with a Zenmuse P1 sensor (DJI, Shenzhen, China) on 13 October 2022. The front and side overlap were both 90%. Flight planning was performed using DJI Pilot2 software (version 6.1.2), and the location details were uploaded to the UAV in the field. Two ground control points (GCPs) were used (Figure A1). The GCPs, take-off, and landing points were set in available open areas before the flight missions [77]. The xyz coordinates of the GCPs were recorded with an RTK global navigation satellite system (GNSS) receiver (DG-PRO1RWS, BizStation Corp., Tokyo, Japan), with a positional accuracy of <0.02 m. Two batteries were required for a one-time flight of approximately 30 min, which was less than the theoretical time (55 min) due to the environmental conditions and the time allocated for the flight to return to the station. The flight missions proceeded at flight heights of 80 m (UAV 80 m) and 120 m (UAV 120 m). The terrain following mode was selected in the settings of the UAV flight missions.

3.4. Data Analysis

3.4.1. UAV Image Processing

The overall workflow of the study is shown in Figure 3. The professional photogrammetric processing software Agisoft Metashape 1.8.4 (Agisoft LLC, St. Petersburg, Russia) was used for UAV image processing. The parameter settings for UAV image processing are given in Table A1. The processing comprised image alignment, dense point cloud generation, digital elevation model (DEM) generation, and orthomosaic generation. Medium accuracy was set to optimize the camera location, orientation, and other internal parameters during the image alignment and dense point cloud generation stages, to reduce the processing time (Table A2). Image processing was performed separately for UAV 80 m and UAV 120 m.
The GCPs were added to each corresponding image for optimization of the camera locations and orientations, as well as other internal camera parameters. The depth filtering mode in the photogrammetric process is intended to remove noticeable outliers while preserving as much as possible the detailed elements of the three-dimensional model [79]. Tavasci et al. [79], Moe et al. [77], and Jayathunga et al. [15] used mild depth filtering for the automated removal of outliers. We also used mild depth filtering to remove outliers. The Tokyo Japan Plane Rectangular CS XII (EPSG: 2454) coordinate system was used for georeferencing. We followed the Agisoft Metashape default settings for the DEM and orthomosaic building stages. Orthomosaics were exported in GeoTIFF format, and dense point clouds were exported in LAS format.

3.4.2. Generation of the CHM

The LAS files of the 3D point clouds generated by Agisoft Metashape were used to generate the DSM. The LAS files were input to ArcGIS Pro (version 2.8) for DSM generation. First, an LAS dataset was used as input to ArcGIS. Then, a raster was created from the LAS dataset [80], with the file value set to elevation. For DSM generation, the LAS file was filtered to the first return, and the value was set to maximum. A binning approach was adopted, with values assigned to the nearest cell and voids filled using the natural neighbor technique. The UAV DTM was not accurate due to the occlusion effect of the top canopy. LiDAR can penetrate the forest canopy to the interior and the ground through laser echoes, thereby obtaining vertical forest structure information and enabling the generation of a high-precision DTM. Therefore, we generated the CHM for each UAV flight by subtracting the LiDAR DTM from the corresponding UAV DSM pixel by pixel. We used the LiDAR DTM generated in 2018 by the UTHF using an Optech Airborne Laser Terrain Mapper (ALTM) Orion M300 sensor (Teledyne Technologies, Thousand Oaks, CA, USA) mounted on a helicopter that flew 600 m above ground at a speed of 140.4 km h–1. The course overlap, pulse rate, scan angle, beam divergence, and point density of the LiDAR data were 50%, 100 kHz, ±20°, 0.16 mrad, and 11.6 points per m2, respectively. The flight was designed to optimize image overlap and distribution, using high-resolution imagery across the survey area to generate a dense and accurate point cloud. The GCPs were measured using RTK GNSS for precise measurement, and the coordinates (latitude, longitude, and elevation) were recorded. Classified LiDAR point data, e.g., ground, non-ground, first, second, third, and last returns, were delivered by the data provider (Hokkaido Aero Asahi, Hokkaido, Japan), and the data were stored in LAS format.
The DTM was generated from LiDAR point clouds using well-distributed GCPs (seven checkpoints) spatially and covered a representative portion of the terrain all over the UTHF. The LiDAR ground returns were used to develop a digital terrain model (LiDAR-DTM) [77]. Minimum height, maximum height, average height, RMSE, and standard deviation of the derived LiDAR DTM were 0.02 m, 0.14 m, 0.00 m, 0.061 m, and 0.061 m, respectively. Both average height and standard deviation were below the limit of 0.25 m (Work Regulations Article 326-3, Hokkaido Aero Asahi, Hokkaido, Japan), confirming that the local elevation and laser elevation were consistent. In our study, the stand area of the LiDAR DTM was clipped to generate the CHM.
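The CHM generation step described above (UAV DSM minus LiDAR DTM) reduces to a per-pixel raster subtraction. A minimal numpy sketch, assuming the two rasters are already co-registered on the same grid and using a hypothetical nodata convention (the study performed this step in ArcGIS Pro, and real data would be read with a GIS library):

```python
import numpy as np

def canopy_height_model(dsm, dtm, nodata=-9999.0):
    """Per-pixel CHM = UAV DSM minus LiDAR DTM. Negative differences
    (terrain/matching noise) are clamped to zero, and nodata cells in
    either input are propagated to the output."""
    dsm = np.asarray(dsm, dtype=float)
    dtm = np.asarray(dtm, dtype=float)
    chm = dsm - dtm
    chm[chm < 0] = 0.0
    chm[(dsm == nodata) | (dtm == nodata)] = nodata
    return chm

# Toy 2 x 2 grids (metres): surface heights minus terrain heights
chm = canopy_height_model([[250.0, 282.5], [260.0, -9999.0]],
                          [[248.0, 251.0], [261.0, 255.0]])
```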

3.4.3. Individual Tree Detection

The UAV ITD approach used a combination of filtering methods, search radii, and window sizes, as shown in Table 3. The CHM was used as the input for ITD. Various filtering methods were used, such as lowpass, Gaussian, and mean. Different search radii/sigma values, window sizes, and circular searches were used to achieve highly accurate tree detection. The local minima and maxima algorithm in QGIS SAGA was used to identify individual treetops in the filtered CHM. We also used the SRP for treetop detection, TH, and crown delineation [47,62]. In our study, we performed the ITD with the SRP using two window sizes, FWS = 3 × 3 and 5 × 5 and SWS = 3 × 3 and 5 × 5, where the maximum crown factor and exclusion parameters were set to 0.4 and 0.7, respectively. The TH threshold used was 1.37 m, with a 0.5 m resolution in the CHM, as a default setting (see the shiny web apps for more detail).
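The smoothing-plus-local-maximum step can be sketched as follows. This is a minimal Python analogue of the QGIS SAGA / SRP workflow, not the software actually used in the study; the function name, the synthetic CHM, and the parameter defaults (Gaussian sigma, square window, 1.37 m threshold as in the SRP default above) are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def detect_treetops(chm, sigma=1.0, window=3, height_threshold=1.37):
    """Local-maximum treetop detection on a smoothed CHM: smooth to suppress
    spurious maxima, then keep pixels that are the maximum of their search
    window and taller than the height threshold."""
    smoothed = gaussian_filter(np.asarray(chm, dtype=float), sigma=sigma)
    local_max = maximum_filter(smoothed, size=window)
    is_top = (smoothed == local_max) & (smoothed > height_threshold)
    return np.argwhere(is_top)  # (row, col) pixel positions of candidate tops

# Synthetic 9 x 9 CHM with two isolated canopy peaks
chm = np.zeros((9, 9))
chm[2, 2] = 30.0
chm[6, 6] = 28.0
tops = detect_treetops(chm)
```

Increasing `sigma` or `window` merges nearby maxima and reduces the number of detected treetops, mirroring the effect of larger search radii reported in the Results.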
The LM algorithm identified a field treetop as a UAV treetop when their locations were similar or identical; these cases indicated correctly detected trees (true positive; TP). When both the field treetop and the UAV treetop fell within the CHM of the respective tree crown, the tree was classified as TP. We also validated the detected trees when the location of the field treetop matched the UAV treetop in the CHM. When no UAV treetop was close to the field treetop in the CHM of a tree crown, the tree was counted as incorrectly undetected (false negative; FN); this included cases where a UAV treetop was close to the field treetop of another tree but not to the CHM of the respective field tree. Cases with no field treetop but a UAV treetop in the CHM indicated incorrectly detected trees (false positive; FP), and cases with neither a field treetop nor a UAV treetop in the CHM were classified as correctly undetected (true negative; TN). The distance threshold for searching field treetops neighboring the UAV treetops was the search radius or sigma, based on the window sizes in the filtered or smoothed CHM.
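The TP/FN/FP classification above amounts to matching each field treetop to at most one UAV treetop within the search radius. The sketch below is a simplification of the CHM-crown-based rules described above, assuming planar coordinates in metres and a greedy nearest-neighbor rule; the function name and example coordinates are hypothetical.

```python
import numpy as np

def match_treetops(field_xy, uav_xy, max_dist=2.0):
    """Greedy one-to-one matching of field treetops to UAV-detected treetops
    within a distance threshold (the search radius above). Matched pairs are
    TP, unmatched field trees FN, and unmatched UAV detections FP."""
    field = np.asarray(field_xy, dtype=float).reshape(-1, 2)
    uav = np.asarray(uav_xy, dtype=float).reshape(-1, 2)
    used, tp = set(), 0
    for fx, fy in field:
        d = np.hypot(uav[:, 0] - fx, uav[:, 1] - fy)
        for j in np.argsort(d):
            if d[j] > max_dist:
                break  # nearest unused detection is already too far
            if j not in used:
                used.add(j)
                tp += 1
                break
    return tp, len(field) - tp, len(uav) - tp  # TP, FN, FP

tp, fn, fp = match_treetops([(0.0, 0.0), (10.0, 0.0)],
                            [(0.5, 0.0), (10.1, 0.0), (20.0, 20.0)])
```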

3.4.4. Tree Height Estimation

The generated CHM was used to extract the TH using different filtering methods. The CHM was clipped within the stand area using the Extract by Mask function [81] in ArcGIS Pro to extract the individual TH using different algorithms. The UAV tree locations were identified by the LM algorithm with lowpass, mean, and Gaussian filtering. First, the UAV treetops were converted to raster data using the point-to-raster function to make them compatible for extraction. Then, they were input to the CHM, and the spatial locations of the UAV treetops were used as the input raster or feature mask data. Finally, the extracted raster data of the UAV treetops were converted from raster to point data to obtain the TH attribute table. The summary statistics function was used to derive the mean, minimum, maximum, standard deviation, and variance for the respective UAV TH. We also derived the TH information from the SRP, which automatically generated the TH [47].
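Conceptually, the extraction step amounts to sampling the CHM raster at the detected treetop (row, col) positions and summarizing the resulting heights. A minimal Python sketch with a synthetic CHM (the study performed this step with ArcGIS Pro raster tools; the function name and data are illustrative):

```python
import numpy as np

def extract_tree_heights(chm, treetops_rc):
    """Sample the CHM at treetop (row, col) positions to obtain per-tree
    heights, then compute the summary statistics described above."""
    chm = np.asarray(chm, dtype=float)
    rows, cols = zip(*treetops_rc)
    heights = chm[list(rows), list(cols)]
    summary = {"mean": float(heights.mean()), "min": float(heights.min()),
               "max": float(heights.max()), "std": float(heights.std(ddof=1))}
    return heights, summary

# Synthetic 3 x 3 CHM (m) with two treetop pixels
heights, stats = extract_tree_heights([[0.0, 0.0, 30.0],
                                       [0.0, 25.0, 0.0],
                                       [0.0, 0.0, 0.0]],
                                      [(0, 2), (1, 1)])
```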

3.4.5. Tree Crown Delineation

The UAV orthomosaic was used to manually digitalize the CA because of its high resolution. First, a new shape file was created, and the respective tree crown was then manually delineated using a freehand tool on the newly created shape file while viewing the orthomosaic in ArcGIS Pro. Then, CA was determined using a geometric calculation. We also used the SRP to derive the CA. We set the parameters by adjusting the window sizes (Table 3). As FWS increased, the number of trees detected decreased [47,82]. Once individual treetops were detected, their crown boundaries were delineated using the Voronoi tessellation-based algorithm developed by Silva et al. [62]. This algorithm was operated with the LM algorithm and used the maximum crown factor and exclusion parameters, both ranging from 0 to 1, to define the crown boundaries on the UAV CHM, delimiting the boundary of the grid cells belonging to each tree. In this study, we used the CA derived from manual digitalization and the SRP.

3.4.6. Tree DBH, V, and CST Estimation

In previous studies, individual tree parameters were estimated by relating segmented tree crowns to manually digitized polygons using one-to-one relationships [37]. Moe et al. [28] extracted individual tree parameters from the manually delineated tree crowns using LiDAR and UAV-DAP-normalized point clouds in FUSION software. In this study, we used the RGB imagery generated from UAV point clouds in the ArcGIS Pro software (version 2.8) package and the SRP to derive the structural variables. The dependent variables of tree DBH, V, and CST were modeled with independent variables such as the manual CA, tree crown perimeter (C_peri), near distance (ND), SRP CA, and UAV flight height derived from the lowpass, mean, and Gaussian filtering methods.

3.4.7. Accuracy Assessment and Validation

For accuracy assessment and validation, the parameters extracted from the UAV were compared with the field-measured parameters of the larch trees. The seven trees of other species found in the larch plantation were excluded from the analysis.
The F-score was calculated for ITD. The F-score is based on the harmonic mean of precision and recall. The evaluation produced three types of segmentation results. If a tree existed and was identified successfully, it was labeled TP, representing correct segmentation. If a tree existed but was not detected, it was labeled FN, representing under-segmentation [83]. Similarly, if a tree was detected but did not exist on the ground, it was labeled FP, representing over-segmentation. The overall accuracy of individual tree detection was calculated using the F-score, based on precision and recall [84,85]:
Precision = TP / (TP + FP)
Recall = TP / (TP + FN)
F-score = (2 × Precision × Recall) / (Precision + Recall)
where precision represents detection accuracy (commission) (2); recall represents detection rate (omission) (3); and the F-score is the weighted average taking both detection rate and detection accuracy into consideration (4).
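Equations (2)–(4) can be computed directly from the TP, FP, and FN counts. In the sketch below, the counts are hypothetical and chosen only to illustrate the calculation; they are not results from this study.

```python
def detection_scores(tp, fp, fn):
    """Precision (Eq. 2), recall (Eq. 3), and F-score (Eq. 4) for ITD."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f_score = (2 * precision * recall / (precision + recall)
               if precision + recall else 0.0)
    return precision, recall, f_score

# Hypothetical counts for illustration (not results from this study)
precision, recall, f_score = detection_scores(tp=130, fp=20, fn=6)
```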
For the regression of TH and CA, a simple linear regression model was used. The most common methods used for statistical analysis and validation of ground data are the root mean square error (RMSE) (5) and coefficient of determination (R2) (6) [11].
RMSE = √[(1/N) Σi (yi − ŷi)²]
R2 = 1 − Σi (yi − ŷi)² / Σi (yi − ȳ)²
where yi = observed value of y; ŷi = predicted value of y; ȳ = mean value of y; and N = number of observations.
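Equations (5) and (6) in code form, with hypothetical observed and predicted tree heights (in metres) used purely for illustration:

```python
import numpy as np

def rmse(y, y_hat):
    """Root mean square error, Equation (5)."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    return float(np.sqrt(np.mean((y - y_hat) ** 2)))

def r_squared(y, y_hat):
    """Coefficient of determination, Equation (6)."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)

# Hypothetical observed vs. predicted tree heights (m), for illustration only
error = rmse([30.0, 32.0, 35.0, 33.0], [31.0, 32.0, 35.0, 32.0])
fit = r_squared([30.0, 32.0, 35.0, 33.0], [31.0, 32.0, 35.0, 32.0])
```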
For tree DBH, V, and CST, a multiple linear regression (MLR) model (7) was fitted in RStudio (R version 4.2.2). The final models were selected based on Akaike's information criterion (AIC) and stepwise variable selection for UAV-derived stand parameters [86], in which variables with a variance inflation factor (VIF) > 5 were removed to avoid multicollinearity [87]. The leave-one-out cross-validation method was used to validate the accuracy of the selected models.
$$y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \cdots + \beta_n X_n + \epsilon \tag{7}$$
where $y$ is the predicted value of the dependent variable (DBH, V, or CST); $\beta_0$ is the intercept; $\beta_1, \beta_2, \ldots, \beta_n$ are the regression coefficients of the independent variables $X_1, X_2, \ldots, X_n$; and $\epsilon$ is the model error. The candidate independent variables were UAV TH, manual CA, UAV CA, tree crown perimeter (C_peri), and near distance (ND) at the respective UAV flight altitude.
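The fitting and validation steps can be illustrated compactly. The study itself used R with stepwise AIC selection; the sketch below, in Python/numpy, shows only the two core pieces, an ordinary least squares fit of Equation (7) and leave-one-out cross-validation, using made-up predictor values (UAV TH and manual CA) that are not the study's data:

```python
import numpy as np

def fit_mlr(X, y):
    """Ordinary least squares for y = b0 + b1*X1 + ... + bn*Xn (Eq. (7))."""
    A = np.column_stack([np.ones(len(X)), X])   # prepend intercept column
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta                                 # [b0, b1, ..., bn]

def loocv_rmse(X, y):
    """Leave-one-out cross-validation: refit with each observation held out."""
    X, y = np.asarray(X, dtype=float), np.asarray(y, dtype=float)
    errors = []
    for i in range(len(y)):
        keep = np.arange(len(y)) != i
        beta = fit_mlr(X[keep], y[keep])
        errors.append(y[i] - (beta[0] + X[i] @ beta[1:]))
    return float(np.sqrt(np.mean(np.square(errors))))

# Hypothetical predictors: UAV TH (m) and manually digitalized CA (m^2)
X = np.array([[33.5, 52.0], [37.4, 61.3], [32.1, 48.7],
              [39.0, 66.2], [36.0, 58.9], [34.4, 55.1]])
dbh = np.array([55.2, 63.8, 51.9, 68.4, 60.1, 57.6])  # illustrative field DBH (cm)
beta = fit_mlr(X, dbh)
cv_rmse = loocv_rmse(X, dbh)
```

LOOCV refits the model once per tree, so each prediction is made for an observation the model never saw; the resulting RMSE is a less optimistic estimate of accuracy than the in-sample fit.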

4. Results

4.1. The CHM and Orthomosaic

The CHM was generated at two flight altitudes from the respective UAV DSM and LiDAR DTM (Figure 4), and orthomosaics were created on which the tree locations and stand area were mapped (Figure A1). The maximum TH in the CHMs was 40.77 m for UAV 80 m (Figure 4c) and 42.07 m for UAV 120 m (Figure 4e); both were lower than the field maximum TH of 42.90 m (Table 1). We generated high-resolution orthomosaics with pixel resolutions of 3.25 cm/pix for UAV 80 m and 5.39 cm/pix for UAV 120 m, in which the tree crowns were clearly visible. The total number of images was 1257 for UAV 80 m and 342 for UAV 120 m, all with a resolution of 8192 × 5460 pixels (Table A3).
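The CHM construction above reduces to a per-cell subtraction of the terrain model from the surface model. A toy numpy sketch (elevation values invented for illustration; real rasters would be read with a GIS library):

```python
import numpy as np

# Toy 3 x 3 elevation grids (m a.s.l.) on the same grid:
# UAV-SfM digital surface model (DSM) and LiDAR digital terrain model (DTM)
dsm = np.array([[412.0, 415.5, 410.2],
                [418.7, 441.3, 417.9],
                [411.4, 416.8, 409.5]])
dtm = np.array([[405.0, 405.2, 405.1],
                [405.4, 405.6, 405.5],
                [405.2, 405.3, 405.0]])

chm = dsm - dtm                 # canopy height model = surface minus terrain
chm = np.clip(chm, 0.0, None)   # clamp negative (below-ground) noise to zero
max_th = float(chm.max())       # tallest canopy cell
```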

4.2. Individual Tree Detection and Tree Density

The ITD results are given in Table 4, and the UAV and field treetops for part of the stand are shown in Figure A2. A total of 143 individual trees were identified in the field, comprising 136 larch trees and seven trees of other species within the stand area: five Pinus nigra Arnold, one Kalopanax septemlobus (Thunb.) Koidz., and one Abies sachalinensis (F. Schmidt) Mast. Only larch trees were considered for ITD and the other stand parameter estimations. For ITD, more UAV treetops were identified by LM lowpass (LML) filtering than by LM Gaussian (LMG) filtering. For both methods, the number of UAV treetops was significantly (p < 0.0001) higher than the number of field treetops, and it decreased as the search radius increased. The LM algorithm detected the field treetops at a threshold search radius of 2 m with LML filtering, for which the optimum number of UAV treetops was 176 at UAV 80 m and 178 at UAV 120 m. We also compared these results with treetop detection using the SRP. The SRP treetop detection performance was higher for the combination of 5 × 5 FWS and 3 × 3 SWS than for 3 × 3 FWS and 3 × 3 SWS, with mean and Gaussian filtering at threshold search radii of 5 and 2 m, respectively. When the FWS or SWS decreased, the SRP detected more treetops than were present in the field. For the UAV 80 m flight, ITD was 144 with shiny mean (SM) filtering and 145 with shiny Gaussian (SG) filtering; for the UAV 120 m flight, ITD was 142 with both SM and SG filtering.
The F-score for ITD ranged from 0.76 to 0.87, indicating that both LML filtering and the SRP performed well in detecting treetops. However, the F-score was higher for the SRP (0.87 at UAV 80 m and 0.83 at UAV 120 m) than for LML filtering (0.79 at UAV 80 m and 0.76 at UAV 120 m). More UAV treetops were detected with LM filtering than with the SRP; therefore, FPs were high with LM filtering. The precision value was higher than the recall value for all methods, and ITD accuracy was higher at UAV 80 m than at UAV 120 m. Tree density was calculated from the ITD. The UAV tree density exceeded the field tree density (147 stems ha−1) at both flight altitudes due to the higher number of UAV treetop detections, ranging from 155 to 190 stems ha−1 at UAV 80 m and from 153 to 192 stems ha−1 at UAV 120 m.

4.3. Tree Height

Table 5 shows the TH estimated from the LM algorithm and the SRP. Both indicated that the mean UAV TH was slightly higher than the field TH at both flight altitudes, whereas the maximum and minimum UAV TH were slightly lower than the field maximum and minimum TH. Figure 5 shows the results of a simple linear regression of field TH against UAV TH. The R2 value for TH was slightly higher, and the RMSE lower, at UAV 120 m than at UAV 80 m. Similarly, across the filtering methods, the R2 value for LM filtering was lower, and the RMSE higher, than for SRP filtering. At UAV 80 m, the R2 and RMSE values were 0.71 and 1.73 m, 0.75 and 1.55 m, and 0.75 and 1.54 m for LML, SM, and SG filtering, respectively. At UAV 120 m, the R2 and RMSE values were 0.76 and 1.59 m, 0.76 and 1.45 m, and 0.77 and 1.45 m for LML, SM, and SG filtering, respectively.

4.4. CA and CC Percentages

Tree crowns delineated manually and by the SRP are presented in Figure 6. We manually delineated the crowns of 136 larch trees (yellow) and seven other trees (orange) (Figure 6a,b), while the SRP delineated the tree crowns automatically (Figure 6c,d). The manual CA of larch trees (56.65 m2) was lower than the SRP CA obtained with both mean (63.72 m2) and Gaussian (63.83 m2) filtering at UAV 80 m. At UAV 120 m, the manual CA (56.44 m2) was slightly higher than the SRP mean filtering CA (55.57 m2) and slightly lower than the Gaussian filtering CA (56.64 m2). The maximum CA was higher with manual crown delineation than with the SRP at both UAV altitudes (Table 6). Manual CA delineation was more accurate than SRP crown delineation owing to the use of a high-resolution orthomosaic in which the larch crowns were easily visible. The CC percentage of larch with manual crown delineation (74.7%) was lower than that with the SRP (92.0−92.7% at UAV 80 m and 78.2% at UAV 120 m). The total CC (including other species) estimated via manual crown delineation was 77.4% at both UAV altitudes, whereas the total CC obtained by the SRP was 96% at UAV 80 m and 82.3% at UAV 120 m.
The results of the simple linear regression analysis are given in Figure 7. The R2 values for crown delineation were lower than those of the TH regression and were higher at UAV 80 m than at UAV 120 m. At UAV 80 m, the R2 and RMSE values were 0.30 and 20.85 m2, and 0.30 and 20.76 m2, for mean and Gaussian filtering, respectively. At UAV 120 m, the R2 and RMSE values were 0.21 and 20.02 m2, and 0.21 and 19.96 m2, for mean and Gaussian filtering, respectively.

4.5. DBH, V, and CST

The results of the MLR models for tree DBH, V, and CST are given in Table 7, and Figure 8 shows scatter plots of the predicted versus field-estimated values. In the models, TH derived from Gaussian filtering and the manually digitalized CA performed better than the metrics from the other filtering methods. The DBH model had a lower R2 value (0.27) than the V and CST models, with an RMSE of 5.64 cm. The R2 values of the V and CST models were 0.30 and 0.29, with RMSEs of 0.87 m3 tree−1 and 0.24 MgC tree−1, respectively.

5. Discussion

5.1. Individual Tree Detection and Tree Density

For ITD, the careful selection of algorithms, together with suitable filtering/smoothing methods and window sizes, influenced the accuracy of treetop detection [67,88]. The LM algorithm has strong potential for treetop detection [67], mainly in conifer plantations, and we applied it with different CHM filtering methods, search radii, and window sizes. The F-score showed that ITD accuracy was higher at UAV 80 m (0.87) than at UAV 120 m (0.83). Mohan et al. [82] reported an F-score of 0.87 for ITD in a mixed conifer forest canopy. ITD accuracy is generally higher in conifer plantations than in mixed broadleaf forests due to their homogeneous structure. Our stand was also a conifer plantation, but an old one with overlapping canopies. Young et al. [48] reported that the accuracy of ITD and the resulting tree maps was generally maximized by collecting imagery at high altitude (120 m) with at least 90% image-to-image overlap in structurally complex mixed conifer forests, where ITD F-scores ranged from 0.67 to 0.87; however, their TH accuracy (R2 = 0.95) was higher than ours. In this study, the F-score decreased as tree density increased. We estimated tree density from the ITD results. We also tested the applicability of the SRP, an open-source application limited to a 30-megabyte input and a 0.5 m CHM resolution [47]; if it accepted larger inputs and a wider range of resolutions, ITD accuracy would likely improve. A detailed analysis is required to further increase ITD accuracy. Using UAVs with hyperspectral imagery may also increase ITD accuracy; Nevalainen et al. [89] obtained a high F-score of 0.93 with hyperspectral imagery for ITD in boreal forest.
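The LM workflow discussed above, smoothing the CHM and then keeping cells that are the maximum within a search window and above a height threshold, can be sketched as follows. This is a generic illustration on an invented toy CHM, not the parameterization used in the study:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def detect_treetops(chm, sigma=1.0, window=3, min_height=2.0):
    """Local-maximum (LM) treetop detection on a smoothed CHM.

    Gaussian smoothing suppresses small within-crown bumps before the
    local-maximum test; `window` (in cells) acts as the search window size.
    """
    smoothed = gaussian_filter(chm, sigma=sigma)
    local_max = maximum_filter(smoothed, size=window) == smoothed
    treetops = local_max & (smoothed > min_height)  # drop ground/shrub cells
    return np.argwhere(treetops)                    # (row, col) per treetop

# Toy CHM (m) with two isolated crowns on a 9 x 9 grid
chm = np.zeros((9, 9))
chm[2, 2] = 30.0
chm[6, 6] = 28.0
tops = detect_treetops(chm, sigma=0.8, window=3)
```

Enlarging `window` (or `sigma`) merges nearby maxima and so reduces the number of detected treetops, mirroring the effect of increasing the search radius described in the results.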

5.2. Tree Height

In our study, TH accuracy was high, with a higher R2 and a lower RMSE for the UAV 120 m flight compared to the UAV 80 m flight. Pourreza et al. [81] reported that UAV data acquisition did not differ significantly among three altitudes (25, 50, and 100 m) using a local network RTK system (NRTK), except for the mean values calculated at 100 m. They also obtained a strong positive relationship between the measured and estimated TH (R2 > 0.99) at all three flight altitudes. The RMSE values for estimated TH at flight altitudes of 25, 50, and 100 m were 0.9%, 4.3%, and 10.2%, respectively, and the corresponding mean absolute error (MAE) values were 0.04, 0.21, and 0.52. Additionally, their findings indicated an underestimation of TH that increased with increasing UAV flight altitude. Islami et al. [90] reported a high R2 value of 0.935 at a flight height of 100 m; this was higher than the values for the 80 and 120 m flight altitudes in the present study, whereas their RMSE was lower at 100 m than at the other altitudes. Nasiri et al. [44] reported R2 and RMSE values for TH, using a UAV and the LM algorithm, of 0.808 and 3.22 m, respectively. This RMSE was higher than our value (1.45–1.73 m), although it was calculated using a different method, while we obtained a lower R2 value (0.71–0.77). In a study of a cashew plantation, Mot et al. [80] reported that the highest R2 (0.60) was obtained from a 50 m UAV flight, whereas the 200 m flight achieved an R2 of only 0.50. They also noted that their proposed method was applicable only to open terrain where TH was <12 m, due to a design limitation of the pipe meters (i.e., a straight tube used to measure height). Because cashew trees have a complex leaf structure, identifying the treetop was a challenge.
The TH accuracy reported for conifer plantations in other studies was higher than in the present study owing to the younger ages of those stands. Ota et al. [38] found R2 and RMSE values for mean TH from a CHM in the ranges of 0.89–0.92 and 1.24–1.31 m, respectively, in an area dominated by 62-year-old plantations of evergreen conifers including Sugi and Hinoki. Krause et al. [11] reported a treetop detection rate >80% using the LM algorithm in a 40-year-old conifer plantation, with TH R2 values of 0.97–0.99 and RMSEs of 0.30–0.48 m. In general, our R2 value was low compared with studies of other conifer plantations. This may be due to the old-growth condition of the larch plantation in the present study, which had reached the stage of canopy overlap. We extracted TH based on the spatial position of the UAV tree location; hence, TH accuracy depended on the accuracy of ITD. The accuracy of canopy detection was low due to the loss of apical dominance in old trees [91,92]. Some larch trees had a higher field TH than UAV TH; this was due to the edge effect of some trees in the field, which placed them in the high-height range of the CHM. Similarly, some trees had a lower field TH than UAV TH because they were located outside the high-height range of the CHM, and no suitable height range in the CHM matched the field TH; this may reflect lower trees being overtopped by neighboring crowns. For these reasons, the TH accuracy of the old larch plantation was lower than that reported for other conifer plantations. Therefore, suitable algorithms and technologies that can penetrate or scan the vertical distribution of the tree canopy, such as UAV LiDAR [93,94], should be considered for old plantations in future analyses.

5.3. Crown Delineation and CC Percentage

In our study, the mean manual CA and SRP CA values were not statistically significantly different. The R2 value was higher at the lower altitude (UAV 80 m), with a slight change in the RMSE. Pourreza et al. [81] found that the mean crown diameters based on field measurements and UAV estimations were not statistically significantly different at any flight altitude; the RMSEs for the estimated tree crown diameter at flight altitudes of 25, 50, and 100 m were 2.2%, 4.6%, and 10.7%, respectively. They also reported an underestimation of crown diameter that increased with UAV flight altitude and tended to increase with tree size. In our study, CA was overestimated by the SRP, and the overestimation decreased at the higher altitude due to the lower pixel resolution and point density. Nasiri et al. [44] reported R2 and RMSE values for crown diameter using the LM algorithm of 0.923 and 0.81 m (7.02%), respectively. Our correlations for CA (0.45–0.55) were consistent with those of Moe et al. [28], who reported correlations between UAV and manual CA values of 0.45–0.57 for broadleaf tree species. They also reported low correlations between UAV and field-measured CA values (0.23–0.44), but higher correlations between manual and field-measured CA values (0.63–0.72). We analyzed the relationship between the manual and SRP CA values. The manual CA was lower than the SRP CA for many larch trees, resulting in overestimation; this was because the SRP delineated the tree crown on the CHM including the shadowed area (i.e., the extended lower canopy of the tree crown), whereas the manual CA was delineated based on the visual appearance of the larch crown in the orthomosaic. Conversely, the manual CA was higher than the SRP CA for some trees, resulting in underestimation, because the SRP split the larch tree crown into two or more crowns. This explains the low accuracy of CA in our study.
Manual crown delineation was more accurate in delineating multiple tree crowns due to the high-resolution orthomosaic. Therefore, various robust delineation approaches are needed to derive the CA of old larch plantations when the tree canopy contains multiple overlapping crowns.

5.4. Tree DBH, V, and CST

Our field stand parameter values were high because the plantation was 115 years old, whereas plantations in other studies were much younger. The DBH of larch ranged from 10.9 to 23.7 cm in a 60-year-old Japanese larch plantation in central Japan [95]. Kita et al. [96] reported TH, DBH, V, stand density, stand volume, and CST in the ranges of 20.8–21.6 m, 21.4–27.7 cm, 0.393–0.587 m3 tree−1, 460–896 stem ha−1, 276–353 m3 ha−1, and 84.6–106.1 MgC ha−1, respectively, in a 31-year-old larch plantation. For single-tree management, the individual tree DBH is an important variable; however, estimating it directly from point-density-related remote sensing metrics is difficult. In this study, we therefore developed models using UAV-derived image metrics [28,97,98]. Yu et al. [97] found that the best model estimated individual tree DBH from tree crown and height metrics, and Chen et al. [98] reported that the best model estimated individual tree V from LiDAR height and crown metrics. In previous studies, tree crown and TH measured in the field were used to develop DBH models [99,100]. In our study, we used UAV TH and CA metrics derived with various filtering methods, as well as the manual CA, to develop the DBH, V, and CST models. The model results revealed that manual CA values, together with UAV TH values, could better estimate DBH, V, and CST. Moe et al. [28] obtained R2 values of 0.32−0.47 using field CA and TH, and of 0.4−0.56 using manual CA and the 99th percentile of TH, in a mixed conifer–broadleaf forest. Our R2 values (0.27−0.32) for the DBH, V, and CST models were close to those of Moe et al. [28]; this may be due to the complexity of the old stand. However, a higher R2 value could be obtained using LiDAR point clouds and other structural and textural UAV metrics.

5.5. Parameter Setting during the Photogrammetric Process

SfM and multi-view stereo (MVS) techniques were used in the UAV photogrammetry pipeline and were processed in a fully automated way [101]. In SfM, a 3D model of an object is built from 2D photographs taken at different positions [102]; the model is created from common features identified as matching points, or key points, in the 2D images using the scale-invariant feature transform (SIFT) algorithm [103]. We used the same settings in Agisoft Metashape (version 1.8.4) during image alignment and the other processing steps for both UAV altitudes (80 and 120 m) to maintain consistency. Mousavi et al. [101] reported that the tie point settings used during photogrammetric processing affect the accuracy of image orientation and the outcome. Barazzetti [104] found that the improvement in precision is significant for a small number of points, whereas a huge number of 3D points does not provide significant improvement. Reprojection error, multiplicity, intersection angle, and the a posteriori standard deviation are considered quality parameters during point cloud extraction [105]; aggregating these quality metrics allows low-quality tie points to be removed before the orientation results are refined in a new adjustment. Low values of reprojection error and a posteriori standard deviation, together with high values of multiplicity and intersection angle, indicate high accuracy during the image alignment process. In this study, the mean reprojection error was higher for UAV 120 m (3.41 pix) than for UAV 80 m (2.84 pix). We will consider the multi-criteria decision-making (MCDM) algorithm developed by Mousavi et al. [106] in future analyses to reduce the reprojection error.
However, our RMSE during image alignment was 0.000553 m and 0.331 pix for UAV 80 m, and 0.000481 m and 0.367 pix for UAV 120 m, when two GCPs were added in Agisoft Metashape, confirming the accuracy of the measurements. Tavasci et al. [79] obtained an RMSE of 0.06 m with seven GCPs and a GSD of 0.03 m using RTK GNSS, confirming the good quality of their measurements. Izere [107] stated that a Phantom 4 RTK UAV with RTK GNSS enabled highly accurate plant height estimation without GCPs. Accordingly, it has been suggested that such accurate positioning information can serve as a viable alternative to the traditional use of GCPs for georeferencing photogrammetric models [108,109,110]. Tahar [111] found that the error range decreased when seven or more GCPs were used over 150 ha. Kalacska et al. [112] concluded that where repeatability and adherence to a high level of accuracy are needed, only RTK and PPK systems should be used without GCPs. Stott et al. [113] reported that using no GCPs and five GCPs, with 3300 independent, spatially distributed RTK-GNSS-surveyed checkpoints, gave RMSEs of 0.066 and 0.072 m, respectively. Štroner et al. [114] combined a DJI Phantom 4 RTK with RTK-GNSS methods, giving the best results for both the vertical and horizontal components, although a small number of GCPs (at least one) or quality camera pre-calibration is advisable where the terrain is difficult for SfM evaluation. Martínez-Carricondo et al. [115] minimized altimetry errors by placing 1.7 GCPs around the edge of the study area. Yu et al. [116] reported that 12 and 18 GCPs were optimal for areas of 7–39 ha and 342 ha, respectively.
GCPs rigorously incorporated into the adjustment remain mandatory to control network deformation, and accuracy also depends on the software [58]. We therefore used two highly accurate GCPs surveyed with RTK GNSS and calibrated images for the photogrammetric process in Agisoft Metashape. Additionally, Swayze et al. [43] stated that the Metashape-estimated horizontal alignment error did not differ significantly with UAV flight altitude. We also found that the image resolution decreased as altitude increased (3.25 cm/pix at UAV 80 m and 5.39 cm/pix at UAV 120 m), which also affected the accuracy of the outcome. Frey et al. [117] noted that the influence of flight parameters on TH and crown diameter has been studied, but that a knowledge gap remains for other stand parameters; they also reported that TH accuracy was high for all UAV flight parameters, whereas DBH accuracy was higher at lower altitudes. Image alignment and the positional accuracy of the point clouds are sources of error when extracting individual tree locations and DBH; Swayze et al. [43] used a 4 m buffer during the tree-matching process to reduce spatial positional errors. In this study, the accuracy of ITD and TH was obtained without a buffer.

6. Conclusions

Many studies have been conducted in old-growth forests and plantations using various remote sensing technologies but, to the best of our knowledge, stand parameters had not previously been estimated using UAV technology in an old Japanese larch plantation. The old larch plantation in this study had mean TH, DBH, BA, V, and CST values of 35.2 m, 60.9 cm, 0.3 m2 tree−1, 3.76 m3 tree−1, and 1.15 MgC tree−1, respectively, while tree density, stand V, and stand CST were 154 stems ha−1, 543 m3 ha−1, and 168 MgC ha−1, respectively. The CHM was generated from the UAV DSM and LiDAR DTM to ensure the accuracy of the extracted stand parameters. From the UAV photogrammetry results, the accuracy of ITD and TH was higher than that of CA, DBH, V, and CST. Between the two flying altitudes, the accuracy of ITD, CA, and DBH was highest at UAV 80 m, whereas the accuracy of TH, V, and CST was highest at UAV 120 m.
Increasing the search radius and window size improved the ITD rate. Among the filtering methods, the accuracy of TH was highest with both mean and Gaussian filtering, while that of CA was highest with Gaussian filtering; overall, higher accuracy was obtained with Gaussian and mean filtering. Only the high-resolution UAV orthomosaic enabled highly accurate manual crown delineation. For DBH, V, and CST estimation, the best models were obtained by fitting the manually digitalized CA and UAV TH metrics. We found that the accuracy of the stand parameters depended on the UAV flight altitude and the filtering method used; forest managers should therefore be aware of this dependence when estimating stand parameters. In general, varying the flying altitude and related algorithms, together with the use of various filtering methods, may improve stand parameter estimation in old Japanese larch plantations, and, as in other conifer plantations, we expect variation in the estimated values. However, detailed investigation is needed for other old conifer plantations with different stand structures, such as tree crown shape, crown area distribution, canopy-top acuteness, and number of branches. Future studies should focus on refining these methods, exploring the potential of other algorithms and techniques, and using high-resolution hyperspectral imagery for more accurate and efficient tree detection and stand parameter estimation with UAV photogrammetry.

Author Contributions

For conceptualization, methodology, formal analysis, and writing—original draft preparation, J.K.; resources, supervision, writing—review and editing, T.O.; writing—review and editing, S.T.; writing—review and editing, T.H. All authors have read and agreed to the published version of the manuscript.

Funding

The LiDAR data were provided by the UTHF from the datasets funded by JURO KAWACHI DONATION FUND, the grant of joint research between the UTHF and Oji Forest & Products Co., Ltd., and JSPS KAKENHI No. 16H04946. This work was partially supported by JSPS KAKENHI grant number 18K05742 and Project for co-creation of a regenerative, recycling-oriented future society through the creation of new value from trees and plants by Sumitomo Forestry Co., Ltd. and the University of Tokyo.

Informed Consent Statement

Not applicable.

Data Availability Statement

The field and UAV datasets presented in this study are available on request from the corresponding author.

Acknowledgments

We would like to thank the technical staff of the UTHF—Masaki Matsui, Noriyuki Kimura, Nozomi Oikawa, Shinya Inukai, and Yuji Nakagawa—for their significant contribution in field measurements and UAV data collection at the study site.

Conflicts of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest, and that the funders were not involved in the study design; the collection, analysis, or interpretation of data; the writing of this article; or the decision to submit it for publication.

Appendix A

Table A1. Parameter setting of UAV image processing.
Photogrammetric Process | Parameters
Image alignment | Accuracy: medium; pair selection: reference; key points: 40,000; tie points: 1000
Guided marker positioning | 4
Camera optimization parameters | F, b1, b2, cx, cy, k1–k4, p1, p2
Building dense cloud | Quality: medium; depth filtering: mild
Building mesh | Surface type: medium; source data: dense cloud; interpolation: disabled; face count: high
Building DEM | Type: geographic; source data: dense cloud; interpolation: enabled
Building orthomosaic | Type: geographic; surface: DEM; blending mode: mosaic; hole filling: enabled
Table A2. Processing time of UAV imagery in Agisoft Metashape (version 1.8.4).
Photogrammetric Process | Setting | UAV 80 m | UAV 120 m
Image alignment | Matching time | 15 min 34 s | 4 min 9 s
Image alignment | Alignment time | 25 min 55 s | 3 min 34 s
Camera optimization | Optimization time | 1 min 13 s | 13 s
Building dense cloud | Processing time | 1 h 1 min | 18 min 5 s
Building dense cloud | Generation time | 2 h 7 min | 18 min 5 s
Building dense cloud | Depth map and reconstruction | 1 h 30 min | 1 h 7 min
Building DEM | Processing time | 2 min 6 s | 15 s
Building orthomosaic | Processing time | 1 h 28 min | 1 h 5 min
Systems used: Agisoft Metashape Professional, version 1.8.4 build 14856; OS: Windows 64-bit; RAM: 31.71 GB; CPU: 11th Gen Intel(R) Core(TM) i7-11800H @ 2.30 GHz; GPU: NVIDIA GeForce RTX 3050 Ti Laptop GPU.
Table A3. Results of drone-image processing.
UAV Attribute | UAV 80 m | UAV 120 m
Acquired images | 1257 | 372
Flying altitude | 80 m | 120 m
Point density | 931 points/m2 | 328 points/m2
Pixel resolution | 3.25 cm/pix | 5.39 cm/pix
Image resolution | 8192 × 5460 pix | 8192 × 5460 pix
Ground resolution | 0.813 cm/pix | 1.35 cm/pix
Tie points | 1,109,968 | 483,325
Projections | 3,707,621 | 1,030,596
Mean reprojection error | 2.84 pix | 3.04 pix
Total error with GCPs | 0.331 pix | 0.367 pix
Dense cloud points | 136,581,158 | 74,639,588
Coordinate system | JGD2000 Japan–19 zone XII / GSIGEO 2000 geoid (both altitudes)
Figure A1. Representation of the field tree location in the respective orthomosaics: (a) Field tree location and stand area in derived orthomosaic at UAV 80 m; (b) Field tree location and stand area in derived orthomosaic at UAV 120 m. Red dots indicate the spatial location of the trees while numbers (yellow color) represent the respective tree number labeled in the field. Two orange dots represent the distribution of the ground control points, GCP 1 and GCP 2.
Figure A2. Representation of UAV treetop and field treetop for part of the stand, in which illustration of TP, FN, and FP: (a) LM treetop (b) SRP treetop. Where TP—is the number of correctly detected trees; FP—is the number of incorrectly detected trees; FN—is the number of incorrectly undetected trees; TN—not applicable, is denoted as those places where no tree exists and the model finds no trees.

References

  1. Yang, Z.; Zheng, Q.; Zhuo, M.; Zeng, H.; Hogan, J.A.; Lin, T.C. A Culture of Conservation: How an Ancient Forest Plantation Turned into an Old-Growth Forest Reserve–The Story of the Wamulin Forest. People Nat. 2021, 3, 1014–1024.
  2. Hoshi, H. Forest Tree Genetic Resources Conservation Stands of Japanese Larch (Larix kaempferi (Lamb.) Carr.); Genetic Resources Department, Forest Tree Breeding Center: Ibaraki, Japan, 2004; Volume 1. Available online: https://www.ffpri.affrc.go.jp/ftbc/research/kakonokouhousi/documents/e-tokubetu.pdf (accessed on 1 March 2023).
  3. Sato, M.; Seki, K.; Kita, K.; Moriguchi, Y.; Hashimoto, M.; Yunoki, K.; Ohnishi, M. Comparative Analysis of Diterpene Composition in the Bark of the Hybrid Larch F1, Larix gmelinii var. japonica × L. kaempferi and their Parent Trees. J. Wood Sci. 2009, 55, 32–40.
  4. Mishima, K.; Hirakawa, H.; Iki, T.; Fukuda, Y.; Hirao, T.; Tamura, A.; Takahashi, M. Comprehensive Collection of Genes and Comparative Analysis of Full-Length Transcriptome Sequences from Japanese Larch (Larix kaempferi) and Kuril Larch (Larix gmelinii var. japonica). BMC Plant Biol. 2022, 22, 470.
  5. Nagamitsu, T.; Nagasaka, K.; Yoshimaru, H.; Tsumura, Y. Provenance Tests for Survival and Growth of 50-Year-Old Japanese Larch (Larix kaempferi) Trees related to Climatic Conditions in Central Japan. Tree Genet. Genomes 2014, 10, 87–99.
  6. Nagaike, T. Snag Abundance and Species Composition in a Managed Forest Landscape in Central Japan Composed of Larix kaempferi Plantations and Secondary Broadleaf Forests. Silva Fenn. 2009, 43, 755–766.
  7. Takata, K.; Kurlnobu, S.; Koizumi, A.; Yasue, K.; Tamai, Y.; Kisanuki, M. Bibliography on Japanese Larch (Larix kaempferi (Lamb.) Carr.). Eurasian J. For. Res. 2005, 8, 111–126.
  8. Forestry Agency. State of Japan's Forests and Forest Management-3rd Country Report of Japan to the Montreal Process; Japan, 2019. Available online: https://www.maff.go.jp/e/policies/forestry/attach/pdf/index-8.pdf (accessed on 20 March 2023).
  9. Kitao, N. Current State of Larch-Forestry in Hokkaido [Japan]: Area Studies for the Management of Experiment Forests of Kyoto University. Bull. Kyoto Univ. For. 1983, 55, 107–121. (In Japanese)
  10. Torita, H.; Masaka, K. Influence of Planting Density and Thinning on Timber Productivity and Resistance to Wind Damage in Japanese Larch (Larix kaempferi) Forests. J. Environ. Manag. 2020, 268, 110298.
  11. Krause, S.; Sanders, T.G.M.; Mund, J.P.; Greve, K. UAV-Based Photogrammetric Tree Height Measurement for Intensive Forest Monitoring. Remote Sens. 2019, 11, 758.
  12. Phalla, T.; Ota, T.; Mizoue, N.; Kajisa, T.; Yoshida, S.; Vuthy, M.; Heng, S. The Importance of Tree Height in Estimating Individual Tree Biomass while Considering Errors in Measurements and Allometric Models. Agrivita 2018, 40, 131–140.
  13. Ramli, M.F.; Tahar, K.N. Homogeneous Tree Height Derivation from Tree Crown Delineation using Seeded Region Growing (SRG) Segmentation. Geo-Spat. Inf. Sci. 2020, 23, 195–208.
  14. Panagiotidis, D.; Abdollahnejad, A.; Surový, P.; Chiteculo, V. Determining Tree Height and Crown Diameter from High-Resolution UAV Imagery. Int. J. Remote Sens. 2017, 38, 2392–2410.
  15. Jayathunga, S.; Owari, T.; Tsuyuki, S.; Hirata, Y. Potential of UAV Photogrammetry for Characterization of Forest Canopy Structure in Uneven-Aged Mixed Conifer–Broadleaf Forests. Int. J. Remote Sens. 2020, 41, 53–73.
  16. Sadeghi, S.M.M.; Attarod, P.; Pypker, T.G. Differences in Rainfall Interception during the Growing and Non-Growing Seasons in a Fraxinus rotundifolia Mill. Plantation Located in a Semiarid Climate. J. Agr. Sci. Tech. 2015, 17, 145–156.
  17. Thenkabail, P.S. Land Resources Monitoring, Modeling, and Mapping with Remote Sensing; Thenkabail, P.S., Ed.; CRC Press: Boca Raton, FL, USA, 2015; ISBN 9780429089442.
  18. Gao, H.; Bi, H.; Li, F. Modelling Conifer Crown Profiles as Nonlinear Conditional Quantiles: An Example with Planted Korean Pine in Northeast China. For. Ecol. Manag. 2017, 398, 101–115.
  19. Valjarević, A.; Djekić, T.; Stevanović, V.; Ivanović, R.; Jandziković, B. GIS Numerical and Remote Sensing Analyses of Forest Changes in the Toplica Region for the Period of 1953–2013. Appl. Geogr. 2018, 92, 131–139.
  20. Avery, T.E.; Burkhart, H. Forest Measurements, 5th ed.; McGraw Hill: Boston, MA, USA, 2002.
  21. Thiel, C.; Schmullius, C. Comparison of UAV Photograph-Based and Airborne LiDAR-Based Point Clouds over Forest from a Forestry Application Perspective. Int. J. Remote Sens. 2017, 38, 2411–2426.
  22. Zarco-Tejada, P.J.; Diaz-Varela, R.; Angileri, V.; Loudjani, P. Tree Height Quantification using Very High Resolution Imagery Acquired from an Unmanned Aerial Vehicle (UAV) and Automatic 3D Photo-Reconstruction Methods. Eur. J. Agron. 2014, 55, 89–99. [Google Scholar] [CrossRef]
  23. Gu, J.; Grybas, H.; Congalton, R.G. Individual Tree Crown Delineation from UAS Imagery Based on Region Growing and Growth Space Considerations. Remote Sens. 2020, 12, 2363. [Google Scholar] [CrossRef]
  24. Bohlin, J.; Wallerman, J.; Fransson, J.E.S. Forest Variable Estimation using Photogrammetric Matching of Digital Aerial Images in Combination with a High-Resolution DEM. Scand. J. For. Res. 2012, 27, 692–699. [Google Scholar] [CrossRef]
  25. Järnstedt, J.; Pekkarinen, A.; Tuominen, S.; Ginzler, C.; Holopainen, M.; Viitala, R. Forest Variable Estimation using a High-Resolution Digital Surface Model. ISPRS J. Photogramm. Remote Sens. 2012, 74, 78–84. [Google Scholar] [CrossRef]
  26. White, J.C.; Wulder, M.A.; Vastaranta, M.; Coops, N.C.; Pitt, D.; Woods, M. The Utility of Image-Based Point Clouds for Forest Inventory: A Comparison with Airborne Laser Scanning. Forests 2013, 4, 518–536. [Google Scholar] [CrossRef]
  27. Wang, X.; Zhao, Q.; Han, F.; Zhang, J.; Jiang, P. Canopy Extraction and Height Estimation of Trees in a Shelter Forest Based on Fusion of an Airborne Multispectral Image and Photogrammetric Point Cloud. J. Sens. 2021, 2021, 5519629. [Google Scholar] [CrossRef]
  28. Moe, K.T.; Owari, T.; Furuya, N.; Hiroshima, T.; Morimoto, J. Application of UAV Photogrammetry with LiDAR Data to Facilitate the Estimation of Tree Locations and DBH Values for High-Value Timber Species in Northern Japanese Mixed-Wood Forests. Remote Sens. 2020, 12, 2865. [Google Scholar] [CrossRef]
  29. Hastaoglu, K.Ö.; Gogsu, S.; Gul, Y. Determining the Relationship between the Slope and Directional Distribution of the UAV Point Cloud and the Accuracy of Various IDW Interpolation. Int. J. Eng. Geosci. 2022, 7, 161–173. [Google Scholar] [CrossRef]
  30. Xu, Z.; Shen, X.; Cao, L.; Coops, N.C.; Goodbody, T.R.H.; Zhong, T.; Zhao, W.; Sun, Q.; Ba, S.; Zhang, Z.; et al. Tree Species Classification using UAS-Based Digital Aerial Photogrammetry Point Clouds and Multispectral Imageries in Subtropical Natural Forests. Int. J. Appl. Earth Obs. Geoinf. 2020, 92, 102173. [Google Scholar] [CrossRef]
  31. Kaartinen, H.; Hyyppä, J.; Yu, X.; Vastaranta, M.; Hyyppä, H.; Kukko, A.; Holopainen, M.; Heipke, C.; Hirschmugl, M.; Morsdorf, F.; et al. An International Comparison of Individual Tree Detection and Extraction using Airborne Laser Scanning. Remote Sens. 2012, 4, 950–974. [Google Scholar] [CrossRef]
  32. Vauhkonen, J.; Seppänen, A.; Packalén, P.; Tokola, T. Improving Species-Specific Plot Volume Estimates Based on Airborne Laser Scanning and Image Data using Alpha Shape Metrics and Balanced Field Data. Remote Sens. Environ. 2012, 124, 534–541. [Google Scholar] [CrossRef]
  33. Guerra-Hernández, J.; Cosenza, D.N.; Rodriguez, L.C.E.; Silva, M.; Tomé, M.; Díaz-Varela, R.A.; González-Ferreiro, E. Comparison of ALS- and UAV(SfM)-Derived High-Density Point Clouds for Individual Tree Detection in Eucalyptus Plantations. Int. J. Remote Sens. 2018, 39, 5211–5235. [Google Scholar] [CrossRef]
  34. Kukunda, C.B.; Duque-Lazo, J.; González-Ferreiro, E.; Thaden, H.; Kleinn, C. Ensemble Classification of Individual Pinus Crowns from Multispectral Satellite Imagery and Airborne LiDAR. Int. J. Appl. Earth Obs. Geoinf. 2018, 65, 12–23. [Google Scholar] [CrossRef]
  35. Takahashi, T.; Yamamoto, K.; Senda, Y.; Tsuzuku, M. Estimating Individual Tree Heights of Sugi (Cryptomeria japonica D. Don) Plantations in Mountainous Areas using Small-Footprint Airborne LiDAR. J. For. Res. 2005, 10, 135–142. [Google Scholar] [CrossRef]
  36. Machimura, T.; Fujimoto, A.; Hayashi, K.; Takagi, H.; Sugita, S. A Novel Tree Biomass Estimation Model Applying the Pipe Model Theory and Adaptable to UAV-Derived Canopy Height Models. Forests 2021, 12, 258. [Google Scholar] [CrossRef]
  37. Iizuka, K.; Yonehara, T.; Itoh, M.; Kosugi, Y. Estimating Tree Height and Diameter at Breast Height (DBH) from Digital Surface Models and Orthophotos obtained with an Unmanned Aerial System for a Japanese Cypress (Chamaecyparis obtusa) Forest. Remote Sens. 2018, 10, 13. [Google Scholar] [CrossRef]
  38. Ota, T.; Ogawa, M.; Mizoue, N.; Fukumoto, K.; Yoshida, S. Forest Structure Estimation from a UAV-Based Photogrammetric Point Cloud in Managed Temperate Coniferous Forests. Forests 2017, 8, 343. [Google Scholar] [CrossRef]
  39. Popescu, S.C.; Wynne, R.H. Seeing the Trees in the Forest. Photogramm. Eng. Remote Sens. 2004, 70, 589–604. [Google Scholar] [CrossRef]
  40. Abdullah, S.; Tahar, K.N.; Abdul Rashid, M.F.; Osoman, M.A. Estimating Tree Height Based on Tree Crown from UAV Imagery. Malays. J. Sustain. Environ. 2022, 9, 99. [Google Scholar] [CrossRef]
  41. Jeronimo, S.M.A.; Kane, V.R.; Churchill, D.J.; McGaughey, R.J.; Franklin, J.F. Applying LiDAR Individual Tree Detection to Management of Structurally Diverse Forest Landscapes. J. For. 2018, 116, 336–346. [Google Scholar] [CrossRef]
  42. Koontz, M.J.; Latimer, A.M.; Mortenson, L.A.; Fettig, C.J.; North, M.P. Cross-Scale Interaction of Host Tree Size and Climatic Water Deficit Governs Bark Beetle-Induced Tree Mortality. Nat. Commun. 2021, 12, 129. [Google Scholar] [CrossRef]
  43. Swayze, N.C.; Tinkham, W.T.; Vogeler, J.C.; Hudak, A.T. Influence of Flight Parameters on UAS-Based Monitoring of Tree Height, Diameter, and Density. Remote Sens. Environ. 2021, 263, 112540. [Google Scholar] [CrossRef]
  44. Nasiri, V.; Darvishsefat, A.A.; Arefi, H.; Pierrot-Deseilligny, M.; Namiranian, M.; Le Bris, A. Unmanned Aerial Vehicles (UAV)-Based Canopy Height Modeling under Leaf-on and Leaf-off Conditions for Determining Tree Height and Crown Diameter (Case Study: Hyrcanian Mixed Forest). Can. J. For. Res. 2021, 51, 962–971. [Google Scholar] [CrossRef]
  45. Arzt, M.; Deschamps, J.; Schmied, C.; Pietzsch, T.; Schmidt, D.; Tomancak, P.; Haase, R.; Jug, F. LABKIT: Labeling and Segmentation Toolkit for Big Image Data. Front. Comput. Sci. 2022, 4, 10. [Google Scholar] [CrossRef]
  46. Grznárová, A.; Mokroš, M.; Surový, P.; Slavík, M.; Pondelík, M.; Mergani, J. The Crown Diameter Estimation from Fixed Wing Type of UAV Imagery. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci.-ISPRS Arch. 2019, 42, 337–341. [Google Scholar] [CrossRef]
  47. Silva, C.A.; Hudak, A.T.; Vierling, L.A.; Valbuena, R.; Cardil, A.; Mohan, M.; de Almeida, D.R.A.; Broadbent, E.N.; Almeyda Zambrano, A.M.; Wilkinson, B.; et al. TREETOP: A Shiny-based Application and R Package for Extracting Forest Information from LiDAR Data for Ecologists and Conservationists. Methods Ecol. Evol. 2022, 13, 1164–1176. [Google Scholar] [CrossRef]
  48. Young, D.J.N.; Koontz, M.J.; Weeks, J.M. Optimizing Aerial Imagery Collection and Processing Parameters for Drone-Based Individual Tree Mapping in Structurally Complex Conifer Forests. Methods Ecol. Evol. 2022, 13, 1447–1463. [Google Scholar] [CrossRef]
  49. Maras, E.E.; Nasery, N. Investigating the Length, Area and Volume Measurement Accuracy of UAV-Based Oblique Photogrammetry Models Produced with and without Ground Control Points. Int. J. Eng. Geosci. 2023, 8, 32–51. [Google Scholar] [CrossRef]
  50. Lerma, J.L.; Navarro, S.; Cabrelles, M.; Villaverde, V. Terrestrial Laser Scanning and Close Range Photogrammetry for 3D Archaeological Documentation: The Upper Palaeolithic Cave of Parpalló as a Case Study. J. Archaeol. Sci. 2010, 37, 499–507. [Google Scholar] [CrossRef]
  51. Yakar, M.; Ulvi, A.; Yiğit, A.Y.; Alptekin, A. Discontinuity Set Extraction from 3D Point Clouds Obtained by UAV Photogrammetry in a Rockfall Site. Surv. Rev. 2023, 55, 416–428. [Google Scholar] [CrossRef]
  52. Godfrey, I.; Avard, G.; Brenes, J.P.S.; Cruz, M.M.; Meghraoui, K. Using Sniffer4D and SnifferV Portable Gas Detectors for UAS Monitoring of Degassing at the Turrialba Volcano Costa Rica. Adv. UAV 2023, 3, 54–90. [Google Scholar]
  53. Gómez, C.; Wulder, M.A.; Montes, F.; Delgado, J.A. Modeling Forest Structural Parameters in the Mediterranean Pines of Central Spain using QuickBird-2 Imagery and Classification and Regression Tree Analysis (CART). Remote Sens. 2012, 4, 135–159. [Google Scholar] [CrossRef]
  54. Morin, D.; Planells, M.; Guyon, D.; Villard, L.; Mermoz, S.; Bouvet, A.; Thevenon, H.; Dejoux, J.-F.; Le Toan, T.; Dedieu, G. Estimation and Mapping of Forest Structure Parameters from Open Access Satellite Images: Development of a Generic Method with a Study Case on Coniferous Plantation. Remote Sens. 2019, 11, 1275. [Google Scholar] [CrossRef]
  55. Jayathunga, S.; Owari, T.; Tsuyuki, S. Evaluating the Performance of Photogrammetric Products using Fixed-Wing UAV Imagery over a Mixed Conifer–Broadleaf Forest: Comparison with Airborne Laser Scanning. Remote Sens. 2018, 10, 187. [Google Scholar] [CrossRef]
  56. Gao, S.; Zhang, Z.; Cao, L. Individual Tree Structural Parameter Extraction and Volume Table Creation Based on Near-Field LiDAR Data: A Case Study in a Subtropical Planted Forest. Sensors 2021, 21, 8162. [Google Scholar] [CrossRef] [PubMed]
  57. Belmonte, A.; Sankey, T.; Biederman, J.A.; Bradford, J.; Goetz, S.J.; Kolb, T.; Woolley, T. UAV-derived Estimates of Forest Structure to Inform Ponderosa Pine Forest Restoration. Remote Sens. Ecol. Conserv. 2020, 6, 181–197. [Google Scholar] [CrossRef]
  58. Kameyama, S.; Sugiura, K. Effects of Differences in Structure from Motion Software on Image Processing of Unmanned Aerial Vehicle Photography and Estimation of Crown Area and Tree Height in Forests. Remote Sens. 2021, 13, 626. [Google Scholar] [CrossRef]
  59. Hao, Y.; Widagdo, F.R.A.; Liu, X.; Quan, Y.; Liu, Z.; Dong, L.; Li, F. Estimation and Calibration of Stem Diameter Distribution using UAV Laser Scanning Data: A Case Study for Larch (Larix olgensis) Forests in Northeast China. Remote Sens. Environ. 2022, 268, 112769. [Google Scholar] [CrossRef]
  60. Liu, K.; Shen, X.; Cao, L.; Wang, G.; Cao, F. Estimating Forest Structural Attributes using UAV-LiDAR Data in Ginkgo Plantations. ISPRS J. Photogramm. Remote Sens. 2018, 146, 465–482. [Google Scholar] [CrossRef]
  61. Navarro, A.; Young, M.; Allan, B.; Carnell, P.; Macreadie, P.; Ierodiaconou, D. The Application of Unmanned Aerial Vehicles (UAVs) to estimate Above-Ground Biomass of Mangrove Ecosystems. Remote Sens. Environ. 2020, 242, 111747. [Google Scholar] [CrossRef]
  62. Silva, C.A.; Hudak, A.T.; Vierling, L.A.; Loudermilk, E.L.; O’Brien, J.J.; Hiers, J.K.; Jack, S.B.; Gonzalez-Benecke, C.; Lee, H.; Falkowski, M.J.; et al. Imputation of Individual Longleaf Pine (Pinus palustris Mill.) Tree Attributes from Field and LiDAR Data. Can. J. Remote Sens. 2016, 42, 554–573. [Google Scholar] [CrossRef]
  63. Korpela, I.; Anttila, P.; Pitkänen, J. The Performance of a Local Maxima Method for Detecting Individual Tree Tops in Aerial Photographs. Int. J. Remote Sens. 2006, 27, 1159–1175. [Google Scholar] [CrossRef]
  64. Lindberg, E.; Hollaus, M. Comparison of Methods for Estimation of Stem Volume, Stem Number and Basal Area from Airborne Laser Scanning Data in a Hemi-Boreal Forest. Remote Sens. 2012, 4, 1004–1023. [Google Scholar] [CrossRef]
  65. Gao, T.; Gao, Z.; Sun, B.; Qin, P.; Li, Y.; Yan, Z. An Integrated Method for estimating Forest-Canopy Closure Based on UAV LiDAR Data. Remote Sens. 2022, 14, 4317. [Google Scholar] [CrossRef]
  66. Cao, Y.; Ball, J.G.C.; Coomes, D.A.; Steinmeier, L.; Knapp, N.; Wilkes, P.; Disney, M.; Calders, K.; Burt, A.; Lin, Y.; et al. Benchmarking Airborne Laser Scanning Tree Segmentation Algorithms in Broadleaf Forests Shows High Accuracy Only for Canopy Trees. Int. J. Appl. Earth Obs. Geoinf. 2023, 123, 103490. [Google Scholar] [CrossRef]
  67. Mohan, M.; Leite, R.V.; Broadbent, E.N.; Wan Mohd Jaafar, W.S.; Srinivasan, S.; Bajaj, S.; Dalla Corte, A.P.; Do Amaral, C.H.; Gopan, G.; Saad, S.N.M.; et al. Individual Tree Detection using UAV-LiDAR and UAV-SfM Data: A Tutorial for Beginners. Open Geosci. 2021, 13, 1028–1039. [Google Scholar] [CrossRef]
  68. Fu, L.; Duan, G.; Ye, Q.; Meng, X.; Luo, P.; Sharma, R.P.; Sun, H.; Wang, G.; Liu, Q. Prediction of Individual Tree Diameter using a Nonlinear Mixed-Effects Modeling Approach and Airborne LiDAR Data. Remote Sens. 2020, 12, 1066. [Google Scholar] [CrossRef]
  69. Leite, R.; Silva, C.; Mohan, M.; Cardil, A.; Almeida, D.; Carvalho, S.; Jaafar, W.; Guerra-Hernández, J.; Weiskittel, A.; Hudak, A.; et al. Individual Tree Attribute Estimation and Uniformity Assessment in Fast-Growing Eucalyptus spp. Forest Plantations using LiDAR and Linear Mixed-Effects Models. Remote Sens. 2020, 12, 3599. [Google Scholar] [CrossRef]
  70. Liang, X.; Wang, Y.; Jaakkola, A.; Kukko, A.; Kaartinen, H.; Hyyppa, J.; Honkavaara, E.; Liu, J. Forest Data Collection using Terrestrial Image-Based Point Clouds from a Handheld Camera Compared to Terrestrial and Personal Laser Scanning. IEEE Trans. Geosci. Remote Sens. 2015, 53, 5117–5132. [Google Scholar] [CrossRef]
  71. Piermattei, L.; Karel, W.; Wang, D.; Wieser, M.; Mokroš, M.; Surový, P.; Koreň, M.; Tomaštík, J.; Pfeifer, N.; Hollaus, M. Terrestrial Structure from Motion Photogrammetry for Deriving Forest Inventory Data. Remote Sens. 2019, 11, 950. [Google Scholar] [CrossRef]
  72. Sun, Y.; Jin, X.; Pukkala, T.; Li, F. Predicting Individual Tree Diameter of Larch (Larix olgensis) from UAV-LiDAR Data using Six Different Algorithms. Remote Sens. 2022, 14, 1125. [Google Scholar] [CrossRef]
  73. Hirschmugl, M.; Sobe, C.; Di Filippo, A.; Berger, V.; Kirchmeir, H.; Vandekerkhove, K. Review on the Possibilities of Mapping Old-Growth Temperate Forests by Remote Sensing in Europe. Environ. Model. Assess. 2023, 28, 761–785. [Google Scholar] [CrossRef]
  74. Qiu, Z.; Feng, Z.-K.; Wang, M.; Li, Z.; Lu, C. Application of UAV Photogrammetric System for Monitoring Ancient Tree Communities in Beijing. Forests 2018, 9, 735. [Google Scholar] [CrossRef]
  75. Holiaka, D.; Kato, H.; Yoschenko, V.; Onda, Y.; Igarashi, Y.; Nanba, K.; Diachuk, P.; Holiaka, M.; Zadorozhniuk, R.; Kashparov, V.; et al. Scots Pine Stands Biomass Assessment using 3D Data from Unmanned Aerial Vehicle Imagery in the Chernobyl Exclusion Zone. J. Environ. Manag. 2021, 295, 113319. [Google Scholar] [CrossRef] [PubMed]
  76. Zhou, X.; Zhang, X. Individual Tree Parameters Estimation for Plantation Forests Based on UAV Oblique Photography. IEEE Access 2020, 8, 96184–96198. [Google Scholar] [CrossRef]
  77. Moe, K.T.; Owari, T.; Furuya, N.; Hiroshima, T. Comparing Individual Tree Height Information Derived from Field Surveys, LiDAR and UAV-DAP for High-Value Timber Species in Northern Japan. Forests 2020, 11, 223. [Google Scholar] [CrossRef]
  78. Greenhouse Gas Inventory Office of Japan and Ministry of Environment, Japan (Ed.) National Greenhouse Gas Inventory Report of JAPAN 2023; Center for Global Environmental Research, Earth System Division, National Institute for Environmental Studies: Japan. Available online: https://www.nies.go.jp/gio/archive/nir/jqjm1000001v3c7t-att/NIR-JPN-2023-v3.0_gioweb.pdf (accessed on 3 February 2023).
  79. Tavasci, L.; Lambertini, A.; Donati, D.; Girelli, V.A.; Lattanzi, G.; Castellaro, S.; Gandolfi, S.; Borgatti, L. A Laboratory for the Integration of Geomatic and Geomechanical Data: The Rock Pinnacle “Campanile Di Val Montanaia”. Remote Sens. 2023, 15, 4854. [Google Scholar] [CrossRef]
  80. Mot, L.; Hong, S.; Charoenjit, K.; Zhang, H. Tree Height Estimation using Field Measurement and Low-Cost Unmanned Aerial Vehicle (UAV) at Phnom Kulen National Park of Cambodia. In Proceedings of the 2021 9th International Conference on Agro-Geoinformatics, Agro-Geoinformatics 2021, Shenzhen, China, 26–29 July 2021. [Google Scholar]
  81. Pourreza, M.; Moradi, F.; Khosravi, M.; Deljouei, A.; Vanderhoof, M.K. GCPs-Free Photogrammetry for estimating Tree Height and Crown Diameter in Arizona Cypress Plantation using UAV-Mounted GNSS RTK. Forests 2022, 13, 1905. [Google Scholar] [CrossRef]
  82. Mohan, M.; Silva, C.A.; Klauberg, C.; Jat, P.; Catts, G.; Cardil, A.; Hudak, A.T.; Dia, M. Individual Tree Detection from Unmanned Aerial Vehicle (UAV) Derived Canopy Height Model in an Open Canopy Mixed Conifer Forest. Forests 2017, 8, 340. [Google Scholar] [CrossRef]
  83. Ahongshangbam, J.; Röll, A.; Ellsäßer, F.; Hendrayanto; Hölscher, D. Airborne Tree Crown Detection for Predicting Spatial Heterogeneity of Canopy Transpiration in a Tropical Rainforest. Remote Sens. 2020, 12, 651. [Google Scholar] [CrossRef]
  84. Goutte, C.; Gaussier, E. A Probabilistic Interpretation of Precision, Recall and F-Score, with Implication for Evaluation. In Proceedings of the Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2005; Volume 3408, pp. 345–359. [Google Scholar]
  85. Sokolova, M.; Japkowicz, N.; Szpakowicz, S. Beyond Accuracy, F-Score and ROC: A Family of Discriminant Measures for Performance Evaluation. In AI 2006: Advances in Artificial Intelligence; AI 2006. Lecture Notes in Computer Science; Sattar, A., Kang, B., Eds.; Springer: Berlin/Heidelberg, Germany, 2006; Volume 4304, pp. 1015–1021. [Google Scholar] [CrossRef]
  86. Akaike, H. Information Theory and an Extension of the Maximum Likelihood Principle; Selected Papers of Hirotugu Akaike. Springer Series in Statistics; Parzen, E., Tanabe, K., Kitagawa, G., Eds.; Springer: New York, NY, USA, 1998; pp. 199–213. [Google Scholar] [CrossRef]
  87. Kock, N.; Lynn, G.S. Lateral Collinearity and Misleading Results in Variance-Based SEM: An Illustration and Recommendations. J. Assoc. Inf. Syst. 2012, 13, 546–580. [Google Scholar] [CrossRef]
  88. Xiao, C.; Qin, R.; Huang, X.; Li, J. A Study of using Fully Convolutional Network for Treetop Detection on Remote Sensing Data. In Proceedings of the ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Karlsruhe, Germany; 2018; Volume 4, pp. 163–169. Available online: https://isprs-annals.copernicus.org/articles/IV-1/163/2018/isprs-annals-IV-1-163-2018.pdf (accessed on 20 April 2023).
  89. Nevalainen, O.; Honkavaara, E.; Tuominen, S.; Viljanen, N.; Hakala, T.; Yu, X.; Hyyppä, J.; Saari, H.; Pölönen, I.; Imai, N.N.; et al. Individual Tree Detection and Classification with UAV-Based Photogrammetric Point Clouds and Hyperspectral Imaging. Remote Sens. 2017, 9, 185. [Google Scholar] [CrossRef]
  90. Islami, M.M.; Rusolono, T.; Setiawan, Y.; Rahadian, A.; Hudjimartsu, S.A.; Prasetyo, L.B. Height, Diameter and Tree Canopy Cover Estimation Based on Unmanned Aerial Vehicle (UAV) Imagery with Various Acquisition Height. Media Konserv. 2021, 26, 17–27. [Google Scholar] [CrossRef]
  91. White, J. Estimating the Age of Large & Veteran Trees in Britain; Forestry Commission: Edinburgh, Scotland, 1998; Available online: https://www.ancienttreeforum.org.uk/wp-content/uploads/2015/03/John-White-estimating-file-pdf.pdf (accessed on 28 April 2023).
  92. Gilmartin, E. Ancient and Veteran Trees: An Assessment Guid; The Woodland Trust: Grantham, UK, 2022.
  93. Cai, S.; Zhang, W.; Liang, X.; Wan, P.; Qi, J.; Yu, S.; Yan, G.; Shao, J. Filtering Airborne LiDAR Data Through Complementary Cloth Simulation and Progressive TIN Densification Filters. Remote Sens. 2019, 11, 1037. [Google Scholar] [CrossRef]
  94. Tang, S.; Dong, P.; Buckles, B.P. Three-Dimensional Surface Reconstruction of Tree Canopy from Lidar Point Clouds using a Region-Based Level Set Method. Int. J. Remote Sens. 2013, 34, 1373–1385. [Google Scholar] [CrossRef]
  95. Yoshida, T.; Hasegawa, M.; Taira, H.; Noguchi, M. Stand Structure and Composition of a 60-Year-Old Larch (Larix kaempferi) Plantation with Retained Hardwoods. J. For. Res. 2005, 10, 351–358. [Google Scholar] [CrossRef]
  96. Kita, K.; Fujimoto, T.; Uchiyama, K.; Kuromaru, M.; Akutsu, H. Estimated Amount of Carbon Accumulation of Hybrid Larch in Three 31-Year-Old Progeny Test Plantations. J. Wood Sci. 2009, 55, 425–434. [Google Scholar] [CrossRef]
  97. Yu, X.; Hyyppä, J.; Vastaranta, M.; Holopainen, M.; Viitala, R. Predicting Individual Tree Attributes from Airborne Laser Point Clouds Based on the Random Forests Technique. ISPRS J. Photogramm. Remote Sens. 2011, 66, 28–37. [Google Scholar] [CrossRef]
  98. Chen, Q.; Gong, P.; Baldocchi, D.; Tian, Y.Q. Estimating Basal Area and Stem Volume for Individual Trees from Lidar Data. Photogramm. Eng. Remote Sens. 2007, 73, 1355–1365. [Google Scholar] [CrossRef]
  99. Jucker, T.; Caspersen, J.; Chave, J.; Antin, C.; Barbier, N.; Bongers, F.; Dalponte, M.; van Ewijk, K.Y.; Forrester, D.I.; Haeni, M.; et al. Allometric Equations for Integrating Remote Sensing Imagery into Forest Monitoring Programmes. Glob. Chang. Biol. 2017, 23, 177–190. [Google Scholar] [CrossRef]
  100. Hulshof, C.M.; Swenson, N.G.; Weiser, M.D. Tree Height-Diameter Allometry across the United States. Ecol. Evol. 2015, 5, 1193–1204. [Google Scholar] [CrossRef]
  101. Mousavi, V.; Varshosaz, M.; Remondino, F. Evaluating Tie Points Distribution, Multiplicity and Number on the Accuracy of UAV Photogrammetry Blocks. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2021, XLIII, 39–46. [Google Scholar] [CrossRef]
  102. Snavely, N.; Seitz, S.M.; Szeliski, R. Modeling the World from Internet Photo Collections. Int. J. Comput. Vis. 2008, 80, 189–210. [Google Scholar] [CrossRef]
  103. Lowe, D.G. Distinctive Image Features from Scale-Invariant Keypoints. Int. J. Comput. Vis. 2004, 60, 91–110. [Google Scholar] [CrossRef]
  104. Barazzetti, L. Network Design in Close-Range Photogrammetry with Short Baseline Images. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, IV–2, 17–23. [Google Scholar] [CrossRef]
  105. Farella, E.M.; Torresani, A.; Remondino, F. Refining the Joint 3D Processing of Terrestrial and UAV Images using Quality Measures. Remote Sens. 2020, 12, 2873. [Google Scholar] [CrossRef]
  106. Mousavi, V.; Varshosaz, M.; Rashidi, M.; Li, W. A New Multi-Criteria Tie Point Filtering Approach to increase the Accuracy of UAV Photogrammetry Models. Drones 2022, 6, 413. [Google Scholar] [CrossRef]
  107. Izere, P. Plant Height Estimation Using RTK-GNSS Enabled Unmanned Aerial Vehicle (UAV) Photogrammetry. Master’s Thesis, University of Nebraska, Lincoln, NE, USA, 2023. [Google Scholar]
  108. Tomaštík, J.; Mokroš, M.; Surový, P.; Grznárová, A.; Merganič, J. UAV RTK/PPK Method—An Optimal Solution for Mapping Inaccessible Forested Areas? Remote Sens. 2019, 11, 721. [Google Scholar] [CrossRef]
  109. Zhao, B.; Li, J.; Wang, L.; Shi, Y. Positioning Accuracy Assessment of a Commercial RTK UAS. In Proceedings of the Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V; SPIE Defense + Commercial Sensing: 2020, 1141409; p. 8. Available online: https://www.spiedigitallibrary.org/conference-proceedings-of-spie/11414/2557899/Positioning-accuracy-assessment-of-a-commercial-RTK-UAS/10.1117/12.2557899.short?SSO=1 (accessed on 20 September 2023).
  110. Štroner, M.; Urban, R.; Seidl, J.; Reindl, T.; Brouček, J. Photogrammetry using UAV-Mounted GNSS RTK: Georeferencing Strategies without GCPs. Remote Sens. 2021, 13, 1336. [Google Scholar] [CrossRef]
  111. Tahar, K.N. An Evaluation on Different Number of Ground Control Points in Unmanned Aerial Vehicle Photogrammetric Block. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2013, XL–2, 27–29. [Google Scholar] [CrossRef]
  112. Kalacska, M.; Lucanus, O.; Arroyo-Mora, J.; Laliberté, É.; Elmer, K.; Leblanc, G.; Groves, A. Accuracy of 3D Landscape Reconstruction without Ground Control Points using Different UAS Platforms. Drones 2020, 4, 13. [Google Scholar] [CrossRef]
  113. Stott, E.; Williams, R.D.; Hoey, T.B. Ground Control Point Distribution for Accurate Kilometre-Scale Topographic Mapping using an RTK-GNSS Unmanned Aerial Vehicle and SfM Photogrammetry. Drones 2020, 4, 55. [Google Scholar] [CrossRef]
  114. Štroner, M.; Urban, R.; Reindl, T.; Seidl, J.; Brouček, J. Evaluation of the Georeferencing Accuracy of a Photogrammetric Model using a Quadrocopter with Onboard GNSS RTK. Sensors 2020, 20, 2318. [Google Scholar] [CrossRef]
  115. Martínez-Carricondo, P.; Agüera-Vega, F.; Carvajal-Ramírez, F.; Mesas-Carrascosa, F.-J.; García-Ferrer, A.; Pérez-Porras, F.-J. Assessment of UAV-Photogrammetric Mapping Accuracy Based on Variation of Ground Control Points. Int. J. Appl. Earth Obs. Geoinf. 2018, 72, 1–10. [Google Scholar] [CrossRef]
  116. Yu, J.J.; Kim, D.W.; Lee, E.J.; Son, S.W. Determining the Optimal Number of Ground Control Points for Varying Study Sites through Accuracy Evaluation of Unmanned Aerial System-Based 3D Point Clouds and Digital Surface Models. Drones 2020, 4, 49. [Google Scholar] [CrossRef]
  117. Frey, J.; Kovach, K.; Stemmler, S.; Koch, B. UAV Photogrammetry of Forests as a Vulnerable Process. A Sensitivity Analysis for a Structure from Motion RGB-Image Pipeline. Remote Sens. 2018, 10, 912. [Google Scholar] [CrossRef]
Figure 1. The study area map (coordinate system: JGD2000 Japan–19 zone XII/GSIGEO 2000 geoid): location (43°13′ N, 142°23′ E) of the larch plantation in forest management sub-compartment 87B of the University of Tokyo Hokkaido Forest (UTHF), Japan. Red dots with values represent the spatial positions of larch trees with their tree numbers. Green, pink, and yellow areas represent compartment 87, sub-compartment 87B, and the larch stand area, respectively.
Figure 2. UAV photogrammetry process in the field: (a) DJI M300 RTK UAV in the study area; (b) UAV flight plan at 80 m altitude with the base map; (c) UAV flight plan at 120 m altitude with the base map. The color legend shows the elevation of the study area above mean sea level, and each plus (+) sign with a value indicates the length of one side of the flight area; five values appear because the flight plan was pentagonal. The actual flight path included a 30 m buffer around the flight area. The four interface symbols represent save (tap to save the current settings and create a mission flight), delete selected waypoint (tap to delete the selected waypoint), start flight (tap to perform the flight mission), and clear waypoints (tap to clear all added waypoints), respectively. Locations 1 (東京大学樹木園桜公園) and 2 (樹木園) denote the cherry blossom park of the University of Tokyo Arboretum and the Arboretum, respectively.
Figure 3. Workflow of the study, including field data collection; the UAV photogrammetry process; canopy height model (CHM) generation; feature extraction of treetops, TH, and CA; manual CA delineation; DBH, V, and CST estimation; and accuracy assessment.
Figure 4. Illustration of CHM derivation from the UAV DSM and LiDAR DTM at the two flight altitudes (UAV 80 m and UAV 120 m): (a) LiDAR DTM; (b) UAV DSM at 80 m; (c) UAV CHM at 80 m; (d) UAV DSM at 120 m; (e) UAV CHM at 120 m.
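The CHM derivation illustrated in Figure 4 is, at its core, a per-cell raster difference between the DSM and DTM. The following is a minimal sketch (not the authors' processing pipeline), assuming the two rasters have already been co-registered and resampled to a common grid and loaded as NumPy arrays; the function name, nodata value, and toy arrays are illustrative.

```python
import numpy as np

def derive_chm(dsm: np.ndarray, dtm: np.ndarray, nodata: float = -9999.0) -> np.ndarray:
    """Derive a canopy height model (CHM) as DSM minus DTM.

    Assumes both rasters are co-registered on the same grid.
    Nodata cells in either input propagate to the output.
    """
    chm = dsm - dtm
    # Propagate nodata from either input.
    chm = np.where((dsm == nodata) | (dtm == nodata), nodata, chm)
    # Clamp small negative heights (matching/interpolation noise) to zero.
    valid = chm != nodata
    chm[valid & (chm < 0)] = 0.0
    return chm

# Toy 2x2 example: a ~30 m canopy surface over ~10 m terrain yields ~20 m heights.
dsm = np.array([[30.0, 31.0], [29.5, -9999.0]])
dtm = np.array([[10.0, 10.5], [30.0, 10.0]])
print(derive_chm(dsm, dtm))
```

In practice the DSM and DTM would be read from GeoTIFFs (e.g., with a raster I/O library) and the nodata value taken from the raster metadata; the clamping step reflects the common convention that ground-level cells should not carry negative canopy heights.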
Figure 5. Scatter plots of field TH against UAV TH: (a) field TH with LM low-pass filtering at UAV 80 m; (b) field TH with SM filtering at UAV 80 m; (c) field TH with SG filtering at UAV 80 m; (d) field TH with LML filtering at UAV 120 m; (e) field TH with SG filtering at UAV 120 m; (f) field TH with SG filtering at UAV 120 m. Black lines represent the trend line forced through a zero intercept; red lines represent the regression line of the data (black dots).
Figure 6. The results of the delineation of individual tree crowns: (a) Manual crown delineation at UAV 80 m with spatial positions of field trees; (b) Manual crown delineation at UAV 120 m with spatial positions of field trees; (c) SRP crown delineation at UAV 80 m with UAV treetops; (d) SRP crown delineation at UAV 120 m with UAV treetops. Yellow and orange lines in (a) and (b) represent the manual crown delineation of larch and other trees, respectively, and red dots represent the spatial locations of trees in the field. Black lines in (c) and (d) represent the SRP crown delineation of all trees, and white dots represent the SRP-detected UAV treetops.
Figure 7. The scatter plots of manual CA with SRP CA: (a) Manual CA with SM CA at UAV 80 m; (b) Manual CA with SG CA at UAV 80 m; (c) Manual CA with SM CA at UAV 120 m; (d) Manual CA with SG CA at UAV 120 m. Black lines represent the zero-intercept trend line; red lines represent the regression line fitted to the data (black dots).
Figure 8. The scatter plots of predicted DBH, V, and CST against field-estimated values: (a) Field DBH with Gaussian DBH at UAV 80 m; (b) Field V with SG V at UAV 120 m; (c) Field CST with SG CST at UAV 120 m. Black lines represent the zero-intercept trend line; red lines represent the regression line fitted to the data (black dots).
Table 1. Summary statistics of the field data.

| Field Parameter | Unit | Mean | SD * | Range |
|---|---|---|---|---|
| Tree height (TH) | m tree−1 | 35.20 | 3.27 | 25.80–42.90 |
| Tree diameter (DBH) | cm tree−1 | 60.94 | 7.14 | 45.45–79.93 |
| Basal area (BA) | m2 tree−1 | 0.30 | 0.07 | 0.16–0.50 |
| Stem volume (V) | m3 tree−1 | 3.76 | 1.10 | 1.84–7.42 |
| Carbon stock (CST) | MgC tree−1 | 1.15 | 0.34 | 0.56–2.27 |
| Tree density | stems ha−1 | 147 | | |

* SD—standard deviation.
Table 2. Specifications and parameter settings of the UAV for imagery collection.

| UAV Parameter | Setting |
|---|---|
| Model | DJI Matrice 300 RTK (Da-Jiang Innovations, Shenzhen, China) |
| Camera model | DJI Zenmuse P1 RGB (Da-Jiang Innovations, Shenzhen, China) |
| Lens specifications * | Sensor dimensions: 35.000 mm × 23.328 mm; resolution: 8192 × 5460; focal length: 35 mm; pixel size: 4.39 × 4.39 μm |
| Flight altitude | 80 m and 120 m |
| Front overlap | 90% |
| Side overlap | 90% |
| Flight time | 30 min |
| Flight take-off speed | 15 m/s |
| Average flight speed—80 m | 5 m/s |
| Average flight speed—120 m | 7 m/s |
| Ground sampling distance—80 m | 1.00 cm/pixel |
| Ground sampling distance—120 m | 1.51 cm/pixel |

* Processing report of Agisoft Metashape.
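The ground sampling distances in Table 2 follow directly from the sensor geometry (GSD = flight altitude × pixel size / focal length). A quick check against the reported values:

```python
def ground_sampling_distance(altitude_m: float,
                             pixel_size_um: float = 4.39,
                             focal_length_mm: float = 35.0) -> float:
    """GSD (cm/pixel) of a nadir-pointing camera from Table 2 specifications."""
    pixel_size_m = pixel_size_um * 1e-6
    focal_length_m = focal_length_mm * 1e-3
    return altitude_m * pixel_size_m / focal_length_m * 100.0  # m -> cm

print(round(ground_sampling_distance(80), 2))   # 1.0  cm/pixel
print(round(ground_sampling_distance(120), 2))  # 1.51 cm/pixel
```

Both computed values match the GSDs in the Agisoft Metashape processing report, confirming the camera parameters are internally consistent.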
Table 3. Individual tree detection (ITD) by combinations of filtering method, search radius, and window size at the two flight altitudes (UAV 80 m and UAV 120 m).

| UAV Altitude | Filtering Method * | Search Radius/Sigma (m) | Window Size (m)/Search Method | UAV Treetop ** |
|---|---|---|---|---|
| UAV 80 m | LML | 1 | circle | 304 |
| | | 2 | circle | 182 |
| | | 3 | circle | 135 |
| | LMG | 1 | circle | 316 |
| | | 2 | circle | 224 |
| | | 3 | circle | 184 |
| | | 4 | circle | 182 |
| | | 5 | circle | 182 |
| | SM | | FWS: 3 × 3, SWS: 3 × 3 | 242 |
| | | | FWS: 5 × 5, SWS: 3 × 3 | 151 |
| | SG | 1 | FWS: 3 × 3 | 258 |
| | | 2 | FWS: 3 × 3 | 245 |
| | | 3 | FWS: 3 × 3 | 244 |
| | | 1 | FWS: 5 × 5 | 154 |
| | | 2 | FWS: 5 × 5 | 152 |
| | | 3 | FWS: 5 × 5 | 153 |
| UAV 120 m | LML | 1 | circle | 300 |
| | | 2 | circle | 182 |
| | | 3 | circle | 131 |
| | LMG | 1 | circle | 317 |
| | | 2 | circle | 226 |
| | | 3 | circle | 190 |
| | | 4 | circle | 187 |
| | | 5 | circle | 187 |
| | SM | | FWS: 3 × 3, SWS: 3 × 3 | 243 |
| | | | FWS: 5 × 5, SWS: 3 × 3 | 149 |
| | SG | 1 | FWS: 3 × 3 | 249 |
| | | 2 | FWS: 3 × 3 | 244 |
| | | 1 | FWS: 5 × 5 | 150 |
| | | 2 | FWS: 5 × 5 | 149 |

* LML—local maxima lowpass filtering; LMG—local maxima Gaussian filtering; SM—shiny mean filtering; SG—shiny Gaussian filtering; ** UAV treetop—detected number of treetops.
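The local-maxima detection summarized in Table 3 smooths the CHM and then keeps cells that are the highest within a moving window. A minimal sketch of the Gaussian variant (LMG) using `scipy.ndimage`; the CHM values, `sigma`, window size, and the 2 m minimum-height cutoff are illustrative assumptions, not the study's exact settings:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def detect_treetops(chm, sigma=1.0, window_px=5, min_height=2.0):
    """Local-maxima treetop detection: Gaussian-smooth the CHM, then keep
    cells equal to the maximum inside a window_px x window_px neighborhood
    and taller than min_height (to suppress ground-level false maxima)."""
    smoothed = gaussian_filter(chm, sigma=sigma)
    is_local_max = smoothed == maximum_filter(smoothed, size=window_px)
    return np.argwhere(is_local_max & (smoothed > min_height))

# Hypothetical CHM with two isolated crowns
chm = np.zeros((20, 20))
chm[5, 5] = 35.0
chm[14, 12] = 33.0
tops = detect_treetops(chm)  # two treetop cells detected
```

A larger window (or stronger smoothing) merges neighboring maxima, which is why the treetop counts in Table 3 drop as the search radius and window size increase.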
Table 4. The results of ITD (UAV treetop) with tree density, TP, FP, FN, precision, recall, and F-score by different filtering methods at the two UAV flying altitudes.

| UAV Altitude | Filtering Method * | UAV Treetop | Tree Density (stems ha−1) | TP | FP | FN | Precision | Recall | F-Score |
|---|---|---|---|---|---|---|---|---|---|
| UAV 80 m | LML2 | 176 | 189.09 | 124 | 52 | 12 | 0.91 | 0.70 | 0.79 |
| | SM5 | 144 | 154.71 | 122 | 22 | 14 | 0.90 | 0.85 | 0.87 |
| | SG2 | 145 | 155.78 | 122 | 23 | 14 | 0.90 | 0.84 | 0.87 |
| UAV 120 m | LML2 | 178 | 191.24 | 119 | 59 | 17 | 0.88 | 0.67 | 0.76 |
| | SM5 | 142 | 152.56 | 116 | 26 | 20 | 0.85 | 0.82 | 0.83 |
| | SG2 | 142 | 152.56 | 115 | 27 | 21 | 0.85 | 0.81 | 0.83 |

* LML2—local maxima lowpass filtering at search radius 2 m; SM5—shiny mean filtering at search radius 5 m; SG2—shiny Gaussian filtering at search radius 2 m.
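The accuracy measures in Table 4 derive from the matched (TP), spurious (FP), and missed (FN) detections via the standard precision–recall formulas; a minimal sketch, shown for the SM5 counts at UAV 80 m:

```python
def detection_scores(tp: int, fp: int, fn: int):
    """Precision, recall, and F-score for individual tree detection."""
    precision = tp / (tp + fp)          # fraction of detections that are real trees
    recall = tp / (tp + fn)             # fraction of field trees that were detected
    f_score = 2 * precision * recall / (precision + recall)
    return precision, recall, f_score

# SM5 at UAV 80 m (Table 4): TP = 122, FP = 22, FN = 14
p, r, f = detection_scores(122, 22, 14)
print(round(f, 2))  # 0.87
```

The computed F-score reproduces the 0.87 reported for SM5 at 80 m, the best-performing combination in Table 4.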
Table 5. The results of mean TH with maximum and minimum values for different filtering methods at the two UAV flying altitudes, with field TH.

| Field and UAV Altitude | Filtering Method * | Mean TH ± SD (m) | Maximum | Minimum |
|---|---|---|---|---|
| Field | Field TH (n = 136) | 35.23 ± 3.27 | 42.90 | 25.80 |
| UAV 80 m | LML2 (n = 136) | 35.49 ± 2.81 | 40.62 | 26.71 |
| | SM5 (n = 123) | 35.63 ± 2.55 | 40.62 | 27.40 |
| | SG2 (n = 122) | 35.65 ± 2.52 | 40.62 | 27.40 |
| UAV 120 m | LML2 (n = 125) | 35.76 ± 2.89 | 41.97 | 27.18 |
| | SM5 (n = 123) | 35.08 ± 2.63 | 40.85 | 27.37 |
| | SG2 (n = 122) | 35.81 ± 2.62 | 40.86 | 27.15 |

* LML2—local maxima lowpass filtering at search radius 2 m; SM5—shiny mean filtering at search radius 5 m; SG2—shiny Gaussian filtering at search radius 2 m; n—number of trees.
Table 6. The results of CA with maximum and minimum values and CC for different filtering methods at the two UAV flying altitudes.

| UAV Altitude | Filtering Method * | Mean CA ± SD (m2) | Maximum | Minimum | CC Larch (%) | CC All (%) |
|---|---|---|---|---|---|---|
| UAV 80 m | Manual CA (n = 136) | 56.65 ± 21.27 | 144.76 | 23.89 | 74.74 | 77.44 |
| | SM5 (n = 98) | 63.72 ± 18.73 | 106.00 | 31.25 | 92.02 | 96.05 |
| | SG2 (n = 98) | 63.83 ± 18.65 | 106.00 | 31.25 | 92.77 | 96.06 |
| UAV 120 m | Manual CA (n = 136) | 56.44 ± 21.24 | 144.76 | 23.89 | 74.74 | 77.44 |
| | SM5 (n = 106) | 55.57 ± 14.33 | 100.50 | 19.50 | 78.72 | 82.29 |
| | SG2 (n = 95) | 56.64 ± 14.29 | 100.50 | 19.50 | 78.72 | 82.29 |

* SM5—shiny mean filtering at search radius 5 m; SG2—shiny Gaussian filtering at search radius 2 m; n—number of trees.
Table 7. The results of the MLR model for tree DBH, V, and CST at the two flight altitudes.

| UAV Altitude | Dependent Variable (Unit) | Independent Variables ** | Selected Model | Parameter Estimates | R2 | RMSE |
|---|---|---|---|---|---|---|
| UAV 80 m | DBH (cm) | SG_TH, ND, M_CA, C_peri | Intercept | 45.02948 *** | 0.27 | 5.64 |
| | | | SG_TH | 0.20821 * | | |
| | | | M_CA | 0.16068 *** | | |
| UAV 120 m | V (m3 tree−1) | SG_TH, M_CA, ND, SG_CA | Intercept | −0.773327 *** | 0.30 | 0.87 |
| | | | SG_TH | 0.086222 * | | |
| | | | M_CA | 0.025442 *** | | |
| | CST (MgC tree−1) | SG_TH, M_CA, ND, SG_CA | Intercept | −0.229230 *** | 0.29 | 0.24 |
| | | | SG_TH | 0.026223 * | | |
| | | | M_CA | 0.007716 *** | | |

** M_CA—manual crown area; F_TH—field tree height; C_peri—perimeter of a tree crown (m); ND—near distance, the shortest distance from the target tree to its nearest tree, or the Euclidean distance from each raster cell to the closest source (m); SM_TH—SRP mean tree height; SG_TH—SRP Gaussian tree height; significance codes—*** p < 0.001, * p < 0.05.
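Applying the selected MLR model is a direct linear combination of the Table 7 coefficients. A sketch for the DBH model at UAV 80 m; the example inputs (a 35.65 m tree with a 56.65 m2 crown) are illustrative values near the stand means, not a specific sampled tree:

```python
def predict_dbh(sg_th: float, m_ca: float) -> float:
    """Predict DBH (cm) from the UAV 80 m MLR model in Table 7:
    DBH = 45.02948 + 0.20821 * SG_TH + 0.16068 * M_CA,
    where SG_TH is the SRP Gaussian tree height (m) and
    M_CA is the manually delineated crown area (m2)."""
    return 45.02948 + 0.20821 * sg_th + 0.16068 * m_ca

print(round(predict_dbh(35.65, 56.65), 2))  # 61.55
```

For these near-mean inputs the model returns about 61.6 cm, close to the field mean DBH of 60.94 cm (Table 1), as expected for a fitted linear model evaluated near the centroid of its data.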
Karthigesu, J.; Owari, T.; Tsuyuki, S.; Hiroshima, T. UAV Photogrammetry for Estimating Stand Parameters of an Old Japanese Larch Plantation Using Different Filtering Methods at Two Flight Altitudes. Sensors 2023, 23, 9907. https://doi.org/10.3390/s23249907