Communication

Estimating Tree Diameters from an Autonomous Below-Canopy UAV with Mounted LiDAR

1 Department of Biological Sciences, National University of Singapore, 14 Science Drive 4, Singapore 117543, Singapore
2 Yale-NUS College, 16 College Avenue West, Singapore 138527, Singapore
3 Department of Electrical and Computer Engineering, National University of Singapore, Singapore 117576, Singapore
4 Peng Cheng Laboratory, 2 Xingke Road, Nanshan, Shenzhen 518066, China
* Author to whom correspondence should be addressed.
Academic Editor: Dominik Seidel
Remote Sens. 2021, 13(13), 2576; https://doi.org/10.3390/rs13132576
Received: 27 May 2021 / Revised: 19 June 2021 / Accepted: 26 June 2021 / Published: 2 July 2021
(This article belongs to the Special Issue Remote Sensing of Tropical Vegetation)

Abstract

Below-canopy UAVs hold promise for automated forest surveys because their sensors can provide detailed information on below-canopy forest structures, especially in dense forests, which may be inaccessible to above-canopy UAVs, aircraft, and satellites. We present an end-to-end autonomous system for estimating tree diameters using a below-canopy UAV in parklands. We used simultaneous localization and mapping (SLAM) and LiDAR data produced at flight time as inputs to diameter-estimation algorithms in post-processing. The SLAM path was used for initial compilation of horizontal LiDAR scans into a 2D cross-sectional map, and then optimization algorithms aligned the scans for each tree within the 2D map to achieve a precision suitable for diameter measurement. The algorithms successfully identified 12 objects, 11 of which were trees and one a lamppost. For these, the estimated diameters from the autonomous survey were highly correlated with manual ground-truthed diameters (R² = 0.92, root mean squared error = 30.6%, bias = 18.4%). Autonomous measurement was most effective for larger trees (>300 mm diameter) within 10 m of the UAV flight path, for medium trees (200–300 mm diameter) within 5 m, and for trees with regular cross sections. We conclude that fully automated below-canopy forest surveys are a promising, but still nascent, technology and suggest directions for future research.
Keywords: below-canopy survey; UAV-mounted LiDAR; simultaneous localization and mapping; tree diameter estimation

1. Introduction

Reliable and efficient methods for assessing forests’ physical structures are needed for numerous applications in forestry, ecology, and conservation [1,2,3]. The estimation of quantities such as stand volume, biomass, productivity, and basal area all depend on physical measurements of trees. Remote-sensing technologies can potentially improve both the reliability of measurements—through the use of high-density LiDAR, cameras, and other sensors—and the efficiency—through automation and the use of unmanned aerial vehicles (UAVs).
Here we focus on automated forest scanning from UAV platforms. Such platforms are capable of surveying larger forest areas than ground-based vehicles [4,5] and stationary platforms [6,7,8,9]. Satellite platforms can survey much larger areas, but at lower resolution. Recent years have seen major advances in the use of cameras [10,11] and LiDAR [12,13,14] for forest scanning, and in the software used to turn the scans into digital models from which forests’ physical structures can be measured [1,3,15]. Most UAV-based surveys of forests to date have used above-canopy UAVs [2,10,12,16,17,18,19,20,21]. These are effective in temperate and boreal forests [2,20,21], where trees are deciduous or foliage is relatively sparse, so that sensors can penetrate through the entire forest profile. In denser forests, however, sensor penetration from above-canopy UAV surveys may be low, limiting data collection to the upper forest layers [18]. This limitation is particularly severe in tropical evergreen forests, which are globally important because of their high biodiversity [22] and carbon storage [23].
The use of below-canopy UAVs for forest surveys has received more limited attention and is technically challenging, mainly because of the structural complexity of the below-canopy forest environment and the unreliability of GPS signals there. Several below-canopy UAV studies to date have relied on remotely piloted UAVs. Chisholm et al. [24] used a remotely piloted UAV with onboard horizontal LiDAR to estimate tree diameters in parklands. At least two recent studies have used remotely piloted UAVs with onboard LiDAR to build a 3D model of a forest and thence estimate tree diameters [3,25].
For below-canopy surveys to be fully autonomous, however, remote human pilots must ultimately be replaced by simultaneous localization and mapping (SLAM) technology, which has advanced rapidly in recent years [26,27,28,29,30]. Liao et al. [31] and Gao et al. [32] demonstrated navigation of an autonomous UAV in a below-canopy forest environment, but did not focus on measuring forest physical structure. Other studies have focused on the specific problem of autonomous trail following by UAVs in forests (e.g., [33]). Here, we present the first example, to our knowledge, of an end-to-end below-canopy autonomous UAV-based system for assessing a forest’s physical structure. We took data from an autonomous UAV flight through parkland [31] and estimated tree diameters using a post-processing LiDAR scan alignment algorithm that exploits the SLAM trajectory generated by the UAV at flight time.

2. Materials and Methods

2.1. UAV Survey

The methods for our UAV survey have been reported elsewhere [31]; we give a brief summary here. The hardware comprised a quadrotor UAV (measuring 1.2 m × 1.2 m × 0.5 m) with two onboard LiDAR scanners for detecting obstacles in the horizontal and vertical directions, a range finder to measure altitude, an inertial measurement unit for measuring angles and acceleration, a Pixhawk flight controller, and a Mastermind processor for high-level control. The horizontal LiDAR scanner, a Hokuyo UTM-30LX, performed 40 horizontal scans per second, and each scan comprised 962 pulses with evenly spaced beam angles over a 240° field of view. Each pulse reported a distance to target, with a distance of infinity indicating no pulse return. Henceforth we use the term “scan” to refer to one of these horizontal scans and its constituent pulse returns. The high-level software system for autonomous navigation included the construction of an occupancy grid map, SLAM, and 3D path planning.
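To make the scan geometry concrete, the following sketch (in Python; the function name and argument defaults are ours, taken from the pulse count, field of view, and range filter described in this paper, not from any released code) converts one horizontal scan into planar points in the UAV's frame:

```python
import math

def scan_to_points(distances, fov_deg=240.0, min_d=0.1, max_d=7.0):
    """Convert one horizontal scan (a list of pulse return distances, with
    math.inf meaning no return) into (x, y) points in the UAV's frame.
    Beam angles are evenly spaced over the field of view, centred on the
    UAV's heading; returns outside [min_d, max_d] are discarded, as in
    the range filter applied during data analysis."""
    n = len(distances)
    step = math.radians(fov_deg) / (n - 1)
    start = -math.radians(fov_deg) / 2.0
    points = []
    for i, d in enumerate(distances):
        if min_d <= d <= max_d:  # infinite and out-of-range returns dropped
            a = start + i * step
            points.append((d * math.cos(a), d * math.sin(a)))
    return points
```

These per-scan points are later projected into global coordinates using the UAV pose estimated by SLAM.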
The survey was conducted in April 2015 in parkland near the Ayer Rajah Expressway in Singapore (1°17′54″ N 103°46′59″ E; Figure 1). The parkland environment comprised scattered trees and palms of commonly planted native and introduced species in Singapore (including Andira inermis, Peltophorum pterocarpum, Syzygium polyanthum, Swietenia macrophylla, Xanthostemon chrysanthus, Khaya senegalensis, and Cocos nucifera) and occasional lampposts, underlain by grass and walking paths, with no bushes or undergrowth. The UAV was programmed to fly a closed loop of approximately 100 m over two minutes, at a height of ~1.2 m. The exact path was not pre-planned but autonomously charted during the flight. The path was flown only once.

2.2. Data Analysis

In post-processing, we analyzed the SLAM data from the autonomous UAV, along with the raw LiDAR data (which were input to SLAM at flight time), to estimate the diameters of trees in the study area. The data analysis comprised three main steps: use of the SLAM path to transform and collate the raw LiDAR data to produce a horizontal cross-section map of the study area, identification of clusters of LiDAR points corresponding to putative trees, and estimation of tree diameter for each cluster.
The transformation and collation of LiDAR data across the multiple scans involved projecting each scan’s LiDAR points into global coordinate space using the known orientation of the UAV at the corresponding time and the estimated location of the UAV from the SLAM algorithm. To alleviate the computational load, we did not use all 4800 horizontal LiDAR scans from the two-minute flight, but instead selected 500 scans equally spaced in time. Of the 962 points in each scan, we used only those with pulse return distances of 0.1–7.0 m; points outside this range were considered unreliable. We identified clusters of LiDAR points in the collated map using the algorithm described in Chisholm et al. [24], with each point’s weight set to the reciprocal of its distance from the UAV, with points within a distance 0.2 m of each other treated as part of the same cluster, and with the minimum cluster size (sum of point weights) equal to 0.1.
The resulting clusters of points typically did not resemble tree cross-sections because of errors in the estimated UAV positions from the SLAM algorithm (i.e., the SLAM output alone was insufficiently precise for the purposes of tree diameter estimation). To overcome this, we implemented a novel algorithm that adjusted the estimated UAV positions corresponding to the scans for each of the n clusters with the goal of aligning the collated scans to produce a more coherent shape. The tree diameter was then measured from the aligned scans. The combined alignment and measurement algorithm for a cluster (with index i) worked as follows:
(1)
First, scans with fewer than eight pulses reporting finite distances to target were discarded. Such scans contained too little information for reliable comparison to other scans, and thus for reliable UAV position estimation. To limit the computational load, we used a maximum of 16 scans, equally spaced in time among those available, for each cluster.
(2)
The two scans closest together in time were merged by adjusting the estimated UAV horizontal-plane coordinates of the second scan (x_UAV, y_UAV) so that the length of the minimal spanning tree of the merged points was minimized (if more than one pair of scans was equally close in time, the pair was chosen randomly from among the candidates). The minimization was achieved with the optimx() function in R using the L-BFGS-B method, with two fitting parameters x_UAV and y_UAV and bounds of ±ϵ on both parameters, where ϵ = 0.2 m was the estimated maximum error in the UAV position from the SLAM algorithm, based on preliminary visual inspection of the overlaid scans. The starting estimates for x_UAV and y_UAV were chosen randomly from the intervals [x̂_UAV − ϵ, x̂_UAV + ϵ] and [ŷ_UAV − ϵ, ŷ_UAV + ϵ]. The call to optimx() was repeated 100 times with different starting values for the parameter estimates, to ensure the global minimum was found in each case.
(3)
Step 2 was repeated until all scans had been merged.
(4)
A circle was fitted to the resulting merged points using Pratt’s method [34], and the cluster was accepted as a physical tree if the fitted circle had a diameter of less than 1 m, if the circular standard deviation of the points was greater than π/8 (indicating that a sufficient arc of the putative trunk had been scanned), and if the R² value was greater than 0.9.
(5)
If the cluster was accepted as a physical tree in step 4, the diameter of the fitted circle was taken as the estimated diameter of the tree (d̂_i).
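The two computational kernels of steps 2 and 4 can be sketched as follows (in Python rather than the R/optimx used in the paper; function names are ours). mst_length() is the objective that step 2 minimizes over the second scan's (x_UAV, y_UAV) offset, and pratt_circle_fit() is one standard formulation of the algebraic circle fit of Pratt [34]:

```python
import math
import numpy as np

def mst_length(pts):
    """Total length of the Euclidean minimum spanning tree over 2D points,
    via Prim's algorithm; step 2 minimizes this over the scan offset."""
    n = len(pts)
    in_tree = [False] * n
    best = [math.inf] * n
    best[0] = 0.0
    total = 0.0
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: best[i])
        total += best[u]
        in_tree[u] = True
        for v in range(n):
            if not in_tree[v]:
                d = math.dist(pts[u], pts[v])
                if d < best[v]:
                    best[v] = d
    return total

def pratt_circle_fit(pts):
    """Pratt's algebraic circle fit (SIGGRAPH 1987); returns (cx, cy, r).
    Fits A(x^2 + y^2) + Bx + Cy + D = 0 subject to the Pratt constraint
    B^2 + C^2 - 4AD = 1, via a generalized eigenvalue problem."""
    pts = np.asarray(pts, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    z = x * x + y * y
    Z = np.column_stack([z, x, y, np.ones_like(x)])
    M = Z.T @ Z / len(pts)
    Bc = np.array([[0.0, 0.0, 0.0, -2.0],
                   [0.0, 1.0, 0.0, 0.0],
                   [0.0, 0.0, 1.0, 0.0],
                   [-2.0, 0.0, 0.0, 0.0]])
    vals, vecs = np.linalg.eig(np.linalg.solve(Bc, M))
    # the Pratt solution is the eigenvector of the smallest non-negative
    # real eigenvalue; complex eigenvalues are masked out
    vals = np.where(np.abs(np.imag(vals)) < 1e-9, np.real(vals), np.inf)
    idx = int(np.argmin(np.where(vals >= -1e-9, vals, np.inf)))
    A, B, C, D = np.real(vecs[:, idx])
    cx, cy = -B / (2.0 * A), -C / (2.0 * A)
    r = math.sqrt(B * B + C * C - 4.0 * A * D) / (2.0 * abs(A))
    return cx, cy, r
```

In step 2, a bounded optimizer such as SciPy's minimize(..., method="L-BFGS-B") could play the role of R's optimx() over the two offset parameters, with mst_length() evaluated on the merged points at each trial offset.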

2.3. Manual Survey

After processing the UAV LiDAR data, we returned to the field site in September 2017, mapped out all trees and lampposts within an estimated 20 m of the UAV’s flight path, and measured their DBHs (d_i) with diametric tape. The manually measured DBHs were then compared to the UAV-measured DBHs using the coefficient of determination (R²), median absolute error (median_i |(d̂_i − d_i)/d_i|), root mean squared error (√((1/n) Σ_{i=1}^{n} ((d̂_i − d_i)/d_i)²)), and bias ((1/n) Σ_{i=1}^{n} (d̂_i − d_i)/d_i). We did not attempt to account for growth of the trees over the 29 months between the UAV survey and the manual survey, because tree growth is known to be highly variable across individuals, species, sites, and years [35], but we acknowledge this as a source of error.
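The relative-error metrics above can be computed directly; the sketch below (Python; function name ours) implements them, with R² computed as the squared Pearson correlation between estimated and observed DBH, which is one common reading of "coefficient of determination" (the paper does not specify which definition it uses):

```python
import math

def dbh_accuracy(est, obs):
    """Accuracy metrics on relative errors (d_hat - d)/d: returns
    (R^2, median absolute error, RMSE, bias). R^2 here is the squared
    Pearson correlation of estimated vs. observed DBH (an assumption)."""
    n = len(est)
    rel = [(e - o) / o for e, o in zip(est, obs)]
    me, mo = sum(est) / n, sum(obs) / n
    cov = sum((e - me) * (o - mo) for e, o in zip(est, obs))
    var_e = sum((e - me) ** 2 for e in est)
    var_o = sum((o - mo) ** 2 for o in obs)
    r2 = cov * cov / (var_e * var_o)
    srt = sorted(abs(r) for r in rel)
    med = srt[n // 2] if n % 2 else (srt[n // 2 - 1] + srt[n // 2]) / 2.0
    rmse = math.sqrt(sum(r * r for r in rel) / n)
    bias = sum(rel) / n
    return r2, med, rmse, bias
```

Note that all three error metrics are on relative (percentage) errors, so a 10 mm error on a small stem counts for more than the same error on a large one.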

3. Results

The autonomous UAV successfully traced a closed loop of length ~100 m over two minutes as reported in Liao et al. [31] (Figure 2), with a mean height of 1.26 m (excluding take-off and landing stages). The SLAM and LiDAR data are provided as Supplementary Materials (Tables S1 and S2). A total of 27 objects were identified as potential trees from the clustering algorithm of Chisholm et al. [24]. Of these, 13 were accepted by our alignment and measurement algorithm (Figure 3 shows the results of the alignment algorithm for one of the objects). In the subsequent field survey, 11 of the 13 potential objects were matched to real trees, and one was matched to a lamppost. The remaining potential object could not be matched to any real object and had a highly irregular shape—it likely resulted from LiDAR scans on humans supervising the UAV flight—and was discarded.
For the 12 confirmed objects (Table 1), there was high correspondence between the autonomous diameter estimates and the manual ground-truthed estimates (Figure 4a; R² = 0.92, median absolute error = 10.4%, root mean squared error = 30.6%, bias = 18.4%). Similar results were obtained if the lamppost was excluded (R² = 0.91, median absolute error = 13.9%, root mean squared error = 31.9%, bias = 19.5%).
Trees were more accurately detected when they were closer to the UAV. For the 50 objects (46 trees and four lampposts) in our survey area (Figure 2), the average distance to the UAV path was 8.1 m, while the average distance for accurately detected and measured objects was just 3.4 m (Figure 4b; Table 1). Larger trees tended to be more accurately measured than smaller trees, in terms of percentage error (Figure 4b; Spearman’s ρ = 0.67, p = 0.028 for the relationship of percentage error to DBH; though this relationship did not hold if the lamppost, which was narrow but accurately measured, was included). The tree with the lowest percentage error in measured DBH (1.8%) was a 740 mm tree standing 3.0 m from the UAV’s path (Tree 21; UAV-estimated DBH = 727 mm; Table 1). The tree with the lowest absolute error (10 mm) was a 308 mm tree standing 4.4 m from the UAV’s path (Tree 19; UAV-estimated DBH = 298 mm; Table 1).

4. Discussion

We have presented an end-to-end autonomous system for assessing forest physical structure from a below-canopy UAV: both the UAV flight and the post-processing to estimate tree DBHs were achieved with automated algorithms. This advances on previous work where diameters have been estimated from remotely piloted below-canopy UAV flights [3,24,25]. We have also shown here how the SLAM path generated by an autonomous UAV can facilitate analysis and physical structure assessment in a GPS-denied environment, but that imprecision in the SLAM path means that post-processing is still required for alignment of the LiDAR scans.
The correlation between UAV-estimated DBH and manually estimated DBH was higher than in a previous study of ours using a remotely piloted UAV [24] (R² = 0.92 vs. R² = 0.45), but this was mainly because the range of tree sizes was larger in the present study; the root mean squared errors were similar across the two studies (30.6% here vs. 25.1% in the previous study), and the bias was actually higher in the present study (18.4% vs. 1.2%), although this difference was likely due to small sample sizes. Our two poorest estimates were for smaller trees, and these failures illuminate current technical challenges. One of these trees (Tree 2; DBH = 232 mm; UAV-estimated DBH = 403 mm; reddest point in Figure 4b) was scanned from one side at the beginning of the flight and from the other side at the end, and the post-processing algorithm was unable to align the scans from the two sides accurately. For the other inaccurately measured small tree (Tree 7; DBH = 267 mm; UAV-estimated DBH = 410 mm), insufficient angular variance was present in the scans. Interestingly, the lamppost, which stood 1.4 m from the UAV’s path, was accurately measured despite having a narrow DBH of 87 mm (Object 15; UAV-estimated DBH = 93 mm), demonstrating in principle that even small trees can be assessed accurately provided that they are regularly shaped, not too far from the UAV, and scanned over sufficiently large angles. Autonomous flight software could address this by actively identifying blind spots and directing the UAV to scan them. To address the issue that some tree trunks have irregular, non-circular shapes, fitting algorithms could allow shapes more flexible than circles (e.g., ellipses or non-convex shapes) and estimate basal area rather than diameter.
Autonomous surveys of forest physical structure could be further improved via the integration of 3D LiDAR, optical cameras, and more sophisticated methods of point cloud analysis [1,3,4,6,7,11,12,15,17,18,21,36]. Chen et al. [3] flew a remotely piloted below-canopy UAV with mounted 3D LiDAR in a temperate pine forest and developed novel software for reconstructing tree geometry from 3D point clouds. They reported a median absolute DBH error of 1.7 cm, lower than our median error of 3.0 cm. Hyyppä et al. [25] conducted a similar study in a pine-dominated boreal forest, and also developed novel software for 3D point cloud analysis, yielding a root mean squared error of 2.2% for tree DBHs, substantially lower than our value of 30.6%. We have focused on estimation of DBH here because it is currently a demanding, expensive, and time-consuming part of manual forest surveys, and thus a suitable starting point for developing automated technology. If a 3D point cloud is available, other forest structural properties, such as volume, can also be assessed.
One outstanding challenge is autonomous navigation in forest environments that are more complex than the parkland at our study site. In real forests, objects such as resprouting trees, bushes, lianas, ferns, epiphytes, and branches pose substantial obstacles to autonomous vehicles. Some autonomous navigation studies in forests to date have focused on trail following (e.g., [33]), which presents unique challenges but is a distinct problem from comprehensively surveying large tracts of forests. Fortunately, the general goal of autonomous navigation in complex environments is a problem of general scientific and engineering interest [30], and forest ecologists can expect to benefit from ongoing developments in coming years. We foresee that gradual improvements in below-canopy UAV surveys in forests will make them increasingly useful as a complement to above-canopy surveys and analyses of satellite data [2], and will lead to more comprehensive assessments of forest physical structure, with benefits for forest ecology and conservation, particularly in evergreen tropical forests.

Supplementary Materials

The following are available online at https://www.mdpi.com/article/10.3390/rs13132576/s1, Table S1: SLAM path data (bagfile-_serial_pub.csv), Table S2: LiDAR scan data (bagfile-_laser_horizontal_scan.csv).

Author Contributions

Conceptualization: R.A.C.; methodology, F.L. and R.A.C.; manual field work, R.A.C. and M.E.R.-R.; writing—original draft preparation, R.A.C.; writing—review and editing, all authors. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the James S. McDonnell Foundation, grant number 220020470.

Data Availability Statement

Data are contained within the article or supplementary material.

Acknowledgments

We thank Kwek Yan Chong for helping with tree identifications.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Guimarães, N.; Pádua, L.; Marques, P.; Silva, N.; Peres, E.; Sousa, J.J. Forestry remote sensing from unmanned aerial vehicles: A review focusing on the data, processing and potentialities. Remote Sens. 2020, 12, 1046. [Google Scholar] [CrossRef]
  2. Kellner, J.R.; Armston, J.; Birrer, M.; Cushman, K.C.; Duncanson, L.; Eck, C.; Falleger, C.; Imbach, B.; Král, K.; Krůček, M.; et al. New opportunities for forest remote sensing through ultra-high-density drone lidar. Surv. Geophys. 2019, 40, 959–977. [Google Scholar] [CrossRef] [PubMed]
  3. Chen, S.W.; Nardari, G.V.; Lee, E.S.; Qu, C.; Liu, X.; Romero, R.A.F.; Kumar, V. Sloam: Semantic lidar odometry and mapping for forest inventory. IEEE Robot. Autom. Lett. 2020, 5, 612–619. [Google Scholar] [CrossRef]
  4. Hyyti, H.; Visala, A. Feature based modeling and mapping of tree trunks and natural terrain using 3D laser scanner measurement system. In Proceedings of the 8th IFAC Symposium on Intelligent Autonomous Vehicles, Gold Coast, Australia, 26–28 June 2013; pp. 248–255. [Google Scholar]
  5. Miettinen, M.; Öhman, M.; Visala, A.; Forsman, P. Simultaneous localisation and mapping for forest harvesters. In Proceedings of the IEEE International Conference on Robotics and Automation, Roma, Italy, 10–14 April 2007. [Google Scholar]
  6. Forsman, P.; Halme, A. 3-D mapping of natural environments with trees by means of mobile perception. IEEE Trans. Robot. 2005, 21, 482–490. [Google Scholar] [CrossRef]
  7. McDaniel, M.W.; Nishihata, T.; Brooks, C.A.; Salesses, P.; Iagnemma, K. Terrain classification and identification of tree stems using ground-based LiDAR. J. Field Robot. 2012, 29, 891–910. [Google Scholar] [CrossRef]
  8. Tsubouchi, T.; Asano, A.; Mochizuki, T.; Kandou, S.; Shiozawa, K.; Matsumoto, M.; Tomimura, S.; Nakanishi, S.; Mochizuki, A.; Chiba, Y.; et al. Forest 3D mapping and tree sizes measurement for forest management based on sensing technology for mobile robots. In Proceedings of the Field and Service Robotics, Matsushima, Japan, 16–19 July 2013; pp. 357–368. [Google Scholar]
  9. Watt, P.J.; Donoghue, D.N.M. Measuring forest structure with terrestrial laser scanning. Int. J. Remote Sens. 2005, 26, 1437–1446. [Google Scholar] [CrossRef]
  10. Jones, A.R.; Segaran, R.R.; Clarke, K.D.; Waycott, M.; Goh, W.S.H.; Gillanders, B.M. Estimating mangrove tree biomass and carbon content: A comparison of forest inventory techniques and drone imagery. Front. Mar. Sci. 2020, 6, 784. [Google Scholar] [CrossRef]
  11. Swinfield, T.; Lindsell, J.A.; Williams, J.V.; Harrison, R.D.; Gemita, E.; Schönlieb, C.B.; Coomes, D.A. Accurate measurement of tropical forest canopy heights and aboveground carbon using structure from motion. Remote Sens. 2019, 11, 928. [Google Scholar] [CrossRef]
  12. Lin, Y.; Hyyppä, J.; Jaakkola, A. Mini-UAV-borne LiDAR for fine-scale mapping. Geosci. Remote Sens. Lett. 2011, 3, 426–430. [Google Scholar] [CrossRef]
  13. Wulder, M.A.; White, J.C.; Nelson, R.F.; Næsset, E.; Ørka, H.O.; Coops, N.C.; Hilker, T.; Bater, C.W.; Gobakken, T. Lidar sampling for large-area forest characterization: A review. Remote Sens. Environ. 2012, 121, 196–209. [Google Scholar] [CrossRef]
  14. Asner, G.P.; Mascaro, J. Mapping tropical forest carbon: Calibrating plot estimates to a simple LiDAR metric. Remote Sens. Environ. 2014, 140, 614–624. [Google Scholar] [CrossRef]
  15. Iglhaut, J.; Cabo, C.; Puliti, S.; Piermattei, L.; O’Connor, J.; Rosette, J. Structure from motion photogrammetry in forestry: A review. Curr. For. Rep. 2019, 5, 155–168. [Google Scholar] [CrossRef]
  16. Cushman, K.C.; Kellner, J.R. Prediction of forest aboveground net primary production from high-resolution vertical leaf-area profiles. Ecol. Lett. 2019, 22, 538–546. [Google Scholar] [CrossRef]
  17. Wallace, L.; Lucieer, A.; Watson, C.; Turner, D. Development of a UAV-LiDAR System with Application to Forest Inventory. Remote Sens. 2012, 4, 1519–1543. [Google Scholar] [CrossRef]
  18. Zahawi, R.A.; Dandois, J.P.; Holl, K.D.; Nadwodny, D.; Reid, J.L.; Ellis, E.C. Using lightweight unmanned aerial vehicles to monitor tropical forest recovery. Biol. Conserv. 2015, 186, 287–295. [Google Scholar] [CrossRef]
  19. Patenaude, G.; Hill, R.; Milne, R.; Gaveau, D.; Briggs, B.; Dawson, T. Quantifying forest above ground carbon content using LiDAR remote sensing. Remote Sens. Environ. 2004, 93, 368–380. [Google Scholar] [CrossRef]
  20. Jaakkola, A.; Hyyppä, J.; Yu, X.; Kukko, A.; Kaartinen, H.; Liang, X.; Hyyppä, H.; Wang, Y. Autonomous collection of forest field reference--the outlook and a first step with UAV laser scanning. Remote Sens. 2017, 9, 785. [Google Scholar] [CrossRef]
  21. Krůček, M.; Král, K.; Cushman, K.; Missarov, A.; Kellner, J.R. Supervised segmentation of ultra-high-density drone lidar for large-area mapping of individual trees. Remote Sens. 2020, 12, 3260. [Google Scholar] [CrossRef]
  22. Wilson, E.O.; Peter, F.M.; Raven, P.H. Our diminishing tropical forests. In Biodiversity; Wilson, E.O., Peter, F.M., Eds.; National Academy Press: Washington, DC, USA, 1988. [Google Scholar]
  23. Sullivan, M.J.P.; Lewis, S.L.; Affum-Baffoe, K.; Castilho, C.; Costa, F.; Sanchez, A.C.; Ewango, C.E.N.; Hubau, W.; Marimon, B.; Monteagudo-Mendoza, A.; et al. Long-term thermal sensitivity of Earth’s tropical forests. Science 2020, 368, 869–874. [Google Scholar] [CrossRef]
  24. Chisholm, R.A.; Cui, J.; Lum, S.K.Y.; Chen, B.M. UAV LiDAR for below-canopy forest surveys. J. Unmanned Veh. Syst. 2013, 1, 61–68. [Google Scholar] [CrossRef]
  25. Hyyppä, E.; Hyyppä, J.; Hakala, T.; Kukko, A.; Wulder, M.A.; White, J.C.; Pyörälä, J.; Yu, X.; Wang, Y.; Virtanen, J.-P. Under-canopy UAV laser scanning for accurate forest field measurements. ISPRS J. Photogramm. Remote Sens. 2020, 164, 41–60. [Google Scholar] [CrossRef]
  26. Zaffar, M.; Ehsan, S.; Stolkin, R.; Maier, K.M. Sensors, SLAM and long-term autonomy: A review. In Proceedings of the 2018 NASA/ESA Conference on Adaptive Hardware and Systems, Edinburgh, UK, 6–9 August 2018; pp. 285–290. [Google Scholar]
  27. Li, J.; Bi, Y.; Lan, M.; Qin, H.; Shan, M.; Lin, F.; Chen, B.M. Real-time simultaneous localization and mapping for UAV: A survey. In Proceedings of the International Micro Air Vehicle Competition and Conference, Beijing, China, 17–21 October 2016; pp. 237–242. [Google Scholar]
  28. Bachrach, A.; Prentice, S.; He, R.; Roy, N. RANGE-Robust Autonomous Navigation in GPS-Denied Environments. J. Field Robot. 2011, 28, 644–666. [Google Scholar] [CrossRef]
  29. Ryding, J.; Williams, E.; Smith, M.J.; Eichhorn, M.P. Assessing handheld mobile laser scanners for forest surveys. Remote Sens. 2015, 7, 1095–1111. [Google Scholar] [CrossRef]
  30. Taheri, H.; Xia, Z.C. SLAM; definition and evolution. Eng. Appl. Artif. Intell. 2021, 97, 104032. [Google Scholar] [CrossRef]
  31. Liao, F.; Lai, S.; Hu, Y.; Cui, J.; Wang, J.L.; Teo, R.; Lin, F. 3D motion planning for UAVs in GPS-denied unknown forest environment. In Proceedings of the 2016 IEEE Intelligent Vehicles Symposium (IV), Gothenburg, Sweden, 19–22 June 2016; pp. 246–251. [Google Scholar]
  32. Gao, F.; Wu, W.; Gao, W.; Shen, S. Flying on point clouds: Online trajectory generation and autonomous navigation for quadrotors in cluttered environments. J. Field Robot. 2019, 36, 710–733. [Google Scholar] [CrossRef]
  33. Giusti, A.; Guzzi, J.; Cireşan, D.C.; He, F.-L.; Rodríguez, J.P.; Fontana, F.; Faessler, M.; Forster, C.; Schmidhuber, J.; Di Caro, G. A machine learning approach to visual perception of forest trails for mobile robots. IEEE Robot. Autom. Lett. 2015, 1, 661–667. [Google Scholar] [CrossRef]
  34. Pratt, V. Direct least-squares fitting of algebraic surfaces. SIGGRAPH Comput. Graph. 1987, 21, 145–152. [Google Scholar] [CrossRef]
  35. Condit, R.; Ashton, P.S.; Manokaran, N.; LaFrankie, J.V.; Hubbell, S.P.; Foster, R.B. Dynamics of the forest communities at Pasoh and Barro Colorado: Comparing two 50-ha plots. Philos. Trans. R. Soc. Lond. Ser. B Biol. Sci. 1999, 354, 1739–1748. [Google Scholar] [CrossRef] [PubMed]
  36. Williams, J.; Schönlieb, C.-B.; Swinfield, T.; Lee, J.; Cai, X.; Qie, L.; Coomes, D.A. 3D segmentation of trees through a flexible multiclass graph cut algorithm. IEEE Trans. Geosci. Remote Sens. 2019, 58, 754–776. [Google Scholar] [CrossRef]
Figure 1. A view of the study area with the UAV in flight.
Figure 2. Map showing horizontal cross-section of plot with SLAM trajectory (curve, with arrows showing direction of UAV, starting and ending at (x, y) = (0, 0)) and clusters of transformed and collated LiDAR scan points (indicated by sequence numbers in red).
Figure 3. An example cluster illustrating the action of the scan-matching algorithm (minimization of length of minimal spanning tree—see main text). (a) The input to the algorithm comprised 16 horizontal LiDAR scans (each color designates one scan) for cluster 22 (see Figure 2), transformed and collated using the SLAM trajectory for the UAV; (b) the output from the algorithm comprised the same 16 scans but with their x and y positions adjusted.
Figure 4. Results of DBH measurements: (a) the UAV-based DBH measurements were highly correlated with the manual measurements ( R 2 = 0.92 ) (black line is the one-to-one line); (b) larger trees (towards the top of the graph) and trees closer to the UAV (towards the left of the graph) were more accurately measured (greener colors).
Table 1. Summary data for the 12 accurately detected objects in the study area (object numbers correspond to those in Figure 2).
Object Number    DBH Manual (mm)    DBH UAV (mm)    Minimum Distance from UAV Path (m)
2                232                403             2.3
4                196                252             3.2
7                267                410             3.7
12               373                348             3.2
14               549                567             2.4
15               87                 93              1.4
18               230                262             1.9
19               308                298             4.4
21               740                727             3.0
22               428                507             3.0
23               785                758             6.5
26               208                287             5.9
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.