Article

Unmanned Ground Vehicle for Identifying Trees Infested with Bark Beetles

by Jonathan Flores, Sergio Salazar, Iván González-Hernández, Yukio Rosales-Luengas and Rogelio Lozano *
Program of Aerial and Submarine Autonomous Navigation Systems, Department of Research and Multidisciplinary Studies, Center for Research and Advanced Studies of the National Polytechnic Institute, Mexico City 07360, Mexico
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Machines 2024, 12(12), 944; https://doi.org/10.3390/machines12120944
Submission received: 29 November 2024 / Revised: 18 December 2024 / Accepted: 20 December 2024 / Published: 22 December 2024
(This article belongs to the Section Vehicle Engineering)

Abstract

This research presents an unmanned ground vehicle for identifying trees infested by bark beetles in mountain forests. The ground vehicle uses sensors for autonomous navigation and obstacle avoidance. Infested trees are identified by classifying the resin stains on the bark of unhealthy trees with a computer vision algorithm. This approach tracks bark beetle spread in forest trees using image data of the infested trees, treating resin sprouts as early indicators of infestation, in contrast to aerial monitoring, which only detects trees in advanced stages. The direction of the ground vehicle is controlled by changing the velocities of the left- and right-side wheels. A rotating LiDAR sensor is used to detect trees and avoid obstacles. The dynamic model of the vehicle is presented, and a control algorithm is proposed for path following. Moreover, the stability of the system is proven using a Lyapunov function. In order to demonstrate the performance of the control and classification algorithms, experimental results from an outdoor forest environment are presented.

1. Introduction

Forests and natural areas are essential indicators of economic, social, and environmental development throughout the world. However, various factors affect hundreds of hectares of forests every year, including forest fires, urban expansion, and pests. Government, civil, and private organizations have implemented strategies to conserve forests around the world. In China, a wall of trees has been established over decades to stop the expansion of the Gobi Desert, which threatens important agricultural areas [1]. The use of unmanned aerial vehicles (UAVs) for monitoring natural areas has increased in recent decades due to their ability to cover areas that could be considered dangerous to humans. In [2], the authors discussed the state of the art in using UAVs for remote sensing of sea anomalies in the petroleum industry. Pests also pose a risk to the environment, particularly because their small size facilitates their spread and natural predators often fail to keep them in check. For these reasons, monitoring efforts that can identify the direct or indirect spread of pests are crucial for designing strategies to mitigate their expansion and prevent deforestation. The bark beetle is a significant threat that is present today throughout the northern hemisphere [3,4,5,6,7,8,9]. This pest infests coniferous forests by taking shelter inside the bark, destroying the conduits that nourish the tree. It is particularly damaging under climate change and global warming, which accelerate its spread and enable it to deforest acres of forest within just a few weeks [10,11]. As the infestation progresses and the chlorophyll supply diminishes, the affected trees turn reddish, providing visible evidence of the infestation. This discoloration has facilitated the use of artificial vision techniques from an aerial perspective to monitor the advance of bark beetles in forests.
For example, in [3], the authors calculated and compared 12 existing vegetation indices and developed the NDRS (normalized distance red & SWIR) index using red bands to detect forest stress before and during the early stages of infestation through satellite images from Sentinel-1 (radar) and Sentinel-2 (optical), taken in the forested region of Remningstorp, Sweden. Techniques such as linear discriminant analysis (LDA) and random forest (RF) models were used to assess the separability between healthy and stressed tree pixels. The NDRS index provided accuracy ranging from 0.80 to 0.89, depending on the infestation stage. In [4], the authors used Sentinel-2 data to detect early infestation stages (green attacks) in Norway spruce monoculture forests. The authors selected 14 satellite observations from April to November 2018, focusing on 9 spectral bands and 6 vegetation indices. The random forest algorithm was used to classify forests into healthy and different infestation-stage classes. The overall accuracy of this detection model for distinguishing between healthy trees and those in the green attack stage was 78%. In [5], the authors evaluated the potential of images taken by the RapidEye (optical) and TerraSAR-X (radar) satellites to detect green attacks in spruce forests in Germany using three statistical approaches: generalized linear models (GLMs), maximum entropy (ME), and random forest (RF). This study found that the combination of RapidEye and TerraSAR-X data allowed for early detection with good accuracy (kappa = 0.74). In [6], the authors compared low-resolution MODIS and high-resolution Sentinel-2 images to evaluate their effectiveness in detecting beetle damage. MODIS and Sentinel-2 images were used to calculate the normalized difference vegetation index (NDVI) in Florida forests in July 2019. The authors found that Sentinel-2 outperforms MODIS in detecting beetle outbreaks, although even higher spatial and temporal resolutions are required for earlier detection.
In [12], the authors used specialized neural networks (generative adversarial networks, GANs) and a multispectral camera mounted on a commercial UAV to detect bark beetles in trees. Training GANs on a small dataset can lead to mode collapse; to prevent this, the authors proposed a modified conditional GAN, achieving a classification accuracy of 87.5% and a precision of 87.2%. In [13], the authors presented a collection of multispectral images of the pine beetle. The trees remained green until the beetle brood entered the pupal stage. The authors concluded by proposing multispectral image processing from unmanned aerial vehicles and suggested that aerial remote sensing complements ground surveys. Ref. [14] presented the effectiveness of using deep neural networks (DNNs) to identify bark beetle infestations in individual trees from UAS images. The study compared RGB, multispectral (MS), and hyperspectral (HS) imaging and evaluated several neural network architectures for each image type, concluding that MS and HS images perform better than RGB images. In [15], the authors obtained a set of images of the region of interest acquired by a UAV. To distinguish healthy trees from those attacked by bark beetles, DeepForest and YOLOv5 models were proposed to identify diseased trees. In addition, a method based on morphological imaging was used to map the area of the damaged region in Mexico. The authors of [16] claimed that healthy and infested trees can be distinguished at an early stage of infestation using NIR (near-infrared) vegetation indices that increase over time after infestation. In this work, a UAV with multispectral imaging was utilized for the early detection of bark beetle infestation, even in individual trees.
Although the use of satellite images can help detect beetle infestations, it still presents significant challenges: much research is needed to refine the use of these data, especially for the green attack stage [17]; the high-resolution real-time images required are costly; and imaging is limited by cloud cover [7]. As an alternative to satellites, drones are considered for the precise detection of bark beetle infestations. For example, in [18], the authors evaluated the use of custom RGB and NIR cameras mounted on low-cost rotary-wing drones (UAVs) to identify infested trees in the early stages in Krkonoše National Park, Czech Republic. This research showed that low-cost sensors mounted on drones can detect infestations with an accuracy of 78–96% at the level of individual trees, even in the early stages. In [19], the authors investigated the ability to detect early attacks (“green attack”) of the spruce bark beetle (Ips typographus) using multispectral images of tree canopies captured by drones in the forests of Remningstorp, Sweden. The authors proposed new indices to improve detection accuracy compared to existing indices, achieving identification of up to 90% of infested trees after 10 weeks.
However, images from satellites, most drones (both fixed-wing and rotary), and even combinations of both only provide information about the tree canopy [16,20,21,22], which allows for detecting tree stress but does not easily differentiate its causes (e.g., drought vs. infestation) [7]. Aerial perspectives have advantages such as covering much larger areas without obstacles (above a certain height) and without requiring flat or semi-flat terrain. However, the most significant disadvantage of the aerial perspective is that computer vision algorithms can only recognize trees in the late stages of infestation, i.e., when the tree is already dying or dead. A terrestrial perspective, on the other hand, allows for the identification of trees in the first stage of infestation, which occurs when bark-destroying insects attempt to penetrate the bark and the tree responds by producing resin. According to the “Manual for the identification, management and monitoring of bark-destroying insects” [23], a tree with more than 20 resin shoots is considered to be in the first stage of infestation. This early detection is not possible from an aerial perspective.
Further evidence of a bark beetle attack is the resin that a tree generates as a natural defense when its bark is perforated; this response is often insufficient due to the trees’ water stress during the dry season [24]. Recognizing resin-suppurating bark is a task that cannot be performed from an aerial perspective, but it could lead to early identification of the infestation, allowing action to be taken in its early stages [25].
This paper is organized as follows: Section 2 presents the mathematical model and the control strategy design for the unmanned ground vehicle. The numerical simulation results are shown in Section 3. Section 4 is devoted to the experimental platform, obstacle avoidance algorithms, and tree bark resin dot detection algorithms. Finally, the discussion and conclusions are shared in Section 5.

2. Mathematical Model and Control Design

Unmanned ground vehicles (UGVs) are mechanically simpler than other autonomous vehicles, such as aerial vehicles or submarines. Their main advantage is that they do not depend on lift or buoyancy for their movement. However, there are different challenges such as wheel traction, irregularities in the terrain, and obstacles on the road that hinder the performance of the assigned tasks. The performance of tasks by autonomous vehicles is largely attributable to the design of control algorithms that respond to disturbances, ensuring accurate trajectory tracking in both numerical simulations and real-world environments.
Figure 1 shows the servoless four-wheeled ground vehicle used in this research, which employs differential-drive forces from its left and right wheels for displacement.

2.1. Dynamical Model

Figure 2 shows the unmanned ground vehicle diagram. According to [26], it can be modeled in terms of the error with respect to the road, as follows:
$$\begin{aligned} m\ddot{x} &= F_r + F_l \\ m\ddot{y} + m\dot{x}\dot{\psi} &= F_l + F_r \\ I_z\ddot{\psi} &= L_f F_l - L_r F_r \end{aligned} \quad (1)$$
where
$$F_r = C_r\,\omega_r, \qquad F_l = C_l\,\omega_l \quad (2)$$
where $C_r$ and $C_l$ are lumped wheel constants, $\omega_r$ and $\omega_l$ are the angular velocities of the right and left wheels, $m$ is the total mass, $v$ is the longitudinal velocity, $\psi$ is the yaw angle, $R$ is the wheel radius, $L_f$ and $L_r$ are the distances from the center of mass to the front and rear wheels, and $I_z$ is the moment of inertia about the vertical axis through the center of mass. $F_r$ and $F_l$ are the right and left wheel forces, and $\tau_R$ and $\tau_L$ are the corresponding wheel torques.
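As an illustration, the following minimal Python sketch integrates the model (1)–(2) with forward Euler. The wheel constants $C_r$, $C_l$ and the yaw inertia $I_z$ are assumed placeholder values, not parameters reported by the authors.

```python
import numpy as np

# Sketch of the UGV model (1)-(2); C_r, C_l and I_z are assumed placeholders.
m = 2.0                 # total mass [kg] (Table 1)
I_z = 0.1               # yaw moment of inertia [kg m^2] (assumed)
L_f = L_r = 0.5         # center-of-mass to wheel distances [m] (Table 1, L)
C_r = C_l = 0.05        # lumped wheel constants (assumed)

def dynamics(state, omega_r, omega_l):
    """state = (x, y, psi, x_dot, y_dot, psi_dot); returns d(state)/dt."""
    x, y, psi, xd, yd, psid = state
    F_r, F_l = C_r * omega_r, C_l * omega_l     # wheel forces, Eq. (2)
    xdd = (F_r + F_l) / m                       # longitudinal dynamics
    ydd = (F_l + F_r) / m - xd * psid           # lateral dynamics with Coriolis term
    psidd = (L_f * F_l - L_r * F_r) / I_z       # yaw dynamics
    return np.array([xd, yd, psid, xdd, ydd, psidd])

# Forward-Euler integration of a short open-loop run with unequal wheel speeds.
state, dt = np.zeros(6), 0.01
for _ in range(1000):
    state = state + dt * dynamics(state, omega_r=2.0, omega_l=1.8)
print(state[:3])        # final pose (x, y, psi)
```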

2.2. Control Algorithm

In this section, the unmanned ground vehicle is treated as a second-order mechanical system subject to vehicle forces and external disturbances due to the terrain. A modified robust proportional-derivative (PD) control structure is proposed. This control technique obtains position and velocity signals from the GPS, while the embedded inertial measurement unit provides attitude and angular velocity data. The control technique is therefore easy to implement, because the measurements are obtained directly from the inertial sensors and the global positioning system of the autopilot. The goal of this section is to control the position $(x, y)$ as well as the yaw angle $\psi$ to track the desired trajectories. To this end, consider the following dynamics:
$$M(\eta)\,\ddot{\eta} + C(\eta,\dot{\eta})\,\dot{\eta} = \tau_\eta + \rho \quad (3)$$
where $\eta = (x, y, \psi)^T$ and $\rho$ is a bounded disturbance with $|\rho| \le \gamma$.
Here, $\rho = (\rho_1, \rho_2, \rho_3)^T$ represents the external perturbation affecting the trajectory of the vehicle, and $\gamma = (\gamma_1, \gamma_2, \gamma_3)^T$ is its known upper bound, representing the perturbations of the terrain. In practice, this bound is known from the physical limits of the vehicle’s interaction with the terrain. In addition, we have the following:
$$M = \begin{pmatrix} m & 0 & 0 \\ 0 & m & 0 \\ 0 & 0 & I_z \end{pmatrix}$$
M represents the inertial matrix.
$$C = \begin{pmatrix} 0 & 0 & 0 \\ 0 & m\dot{x}\dot{\psi} & 0 \\ 0 & 0 & 0 \end{pmatrix}$$
C represents the Coriolis force matrix, and
$$\tau_\eta = \begin{pmatrix} u_1 \\ u_1 \\ u_2 \end{pmatrix}$$
where $\tau_\eta$ is the control input with $u_1 = \tau_R + \tau_L$ and $u_2 = l(\tau_R - \tau_L)$, and $l$ is a constant denoting the distance between the left and right wheels of the UGV. In order to demonstrate the stability of the system, the following positive definite function is proposed:
$$V(\eta,\dot{\eta}) = \frac{1}{2}\,\dot{\eta}^T M(\eta)\,\dot{\eta} + \sum_{i=1}^{n} k_{a_i}\ln\left[\cosh(k_{p_i}\eta_i)\right] \quad (4)$$
where $\eta_i$ represents the i-th element of the vector $\eta$, and $k_{a_i}$ and $k_{p_i}$ represent the i-th diagonal elements of the positive definite matrices $K_a, K_p \in \mathbb{R}^{3\times 3}$, respectively. The goal of using this positive function is to find a control law such that the derivative of $V$ is negative definite.
Then, $V(\eta,\dot{\eta}) \ge 0$ because $M(\eta) > 0$ and $\ln[\cosh(\cdot)]$ is a positive, radially unbounded function. The time derivative of (4) is given as follows:
$$\dot{V} = \frac{1}{2}\,\dot{\eta}^T \dot{M}(\eta)\,\dot{\eta} + \dot{\eta}^T M(\eta)\,\ddot{\eta} + \sum_{i=1}^{n} k_{a_i} k_{p_i}\tanh(k_{p_i}\eta_i)\,\dot{\eta}_i \quad (5)$$
Rewriting (5) in matrix form, we have the following:
$$\dot{V}(\eta,\dot{\eta}) = \frac{1}{2}\,\dot{\eta}^T \dot{M}(\eta)\,\dot{\eta} + \dot{\eta}^T M(\eta)\,\ddot{\eta} + \dot{\eta}^T K_a K_p \tanh(K_p\,\eta) \quad (6)$$
where
$$\tanh(K_p\,\eta) = \left[\tanh(k_{p_1} x),\ \tanh(k_{p_2} y),\ \tanh(k_{p_3} \psi)\right]^T \quad (7)$$
Introducing (3) into (6) gives the following:
$$\dot{V}(\eta,\dot{\eta}) = \dot{\eta}^T\left(\frac{1}{2}\dot{M}(\eta) - C(\eta,\dot{\eta})\right)\dot{\eta} + \dot{\eta}^T\left(\tau_\eta + \rho + K_a K_p \tanh(K_p\,\eta)\right) \quad (8)$$
The time derivative of the inertia matrix and the centripetal–Coriolis force matrix [27] satisfy the following skew-symmetry relation for any vector $u$:
$$u^T\left(\frac{1}{2}\dot{M}(\eta) - C(\eta,\dot{\eta})\right)u = 0 \qquad \forall\, u \in \mathbb{R}^3 \quad (9)$$
then
$$\dot{V}(\eta,\dot{\eta}) = \dot{\eta}^T\left(\tau_\eta + \rho + K_{sp}\tanh(K_p\,\eta)\right) \quad (10)$$
Now, we introduce the saturated PD control as follows:
$$\tau_\eta = -K_{sp}\tanh(K_p\,\eta) - K_{sv}\tanh(K_v\,\dot{\eta}) - \gamma \quad (11)$$
where $K_{sp} = K_a K_p$ and $K_{sv} = K_b K_v$ are positive diagonal matrices. Introducing the control law (11) into (10), and using the bound $|\rho_i| \le \gamma_i$, gives
$$\dot{V}(\eta,\dot{\eta}) \le -\dot{\eta}^T K_{sv}\tanh(K_v\,\dot{\eta}) \le 0 \quad (12)$$
Hence, $\dot{\eta} \to 0$ after a certain time. By LaSalle’s invariance theorem, and from (3) and (11), $\eta$ remains bounded in a small region around the origin.
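A minimal Python sketch of the saturated PD law (11) follows; the gain values and the half-track length $l$ are illustrative assumptions, not the gains used in the experiments.

```python
import numpy as np

# Illustrative diagonal gains; K_sp = K_a K_p and K_sv = K_b K_v (values assumed).
K_p = np.diag([4.0, 4.0, 5.0])
K_v = np.diag([5.0, 5.0, 4.0])
K_sp = np.diag([2.0, 2.0, 2.0])
K_sv = np.diag([2.0, 2.0, 2.0])
gamma = np.array([0.1, 0.1, 0.05])   # assumed disturbance bound from (3)

def saturated_pd(eta_err, eta_err_dot):
    """Control law (11): tau = -K_sp tanh(K_p e) - K_sv tanh(K_v e_dot) - gamma,
    where e = eta - eta_d is the pose error (x, y, psi)."""
    return (-K_sp @ np.tanh(K_p @ eta_err)
            - K_sv @ np.tanh(K_v @ eta_err_dot)
            - gamma)

def wheel_torques(tau, l=0.25):
    """Invert u1 = tau_R + tau_L and u2 = l (tau_R - tau_L); l assumed 0.25 m."""
    u1, u2 = tau[0], tau[2]
    tau_R = 0.5 * (u1 + u2 / l)
    tau_L = 0.5 * (u1 - u2 / l)
    return tau_R, tau_L
```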

3. Numerical Simulations

The numerical results of the control algorithm applied to the mathematical model of the UGV are presented in this section. The numerical simulations feature circular and straight-line path references with Gaussian noise at the inputs and various initial conditions, as shown in Table 1, where the parameters correspond to the real vehicle and the control strategy gains are selected to obtain a fast transient response with smooth, overdamped behavior.
Figure 3 presents the results for tracking the circular and straight-line path references. Here, the horizontal position $(x, y)$ of the UGV is controlled by the sum of the wheel forces, and the heading angle $\psi$ by their difference. The approximation $\mathrm{sign}(\cdot) \approx \tan^{-1}(\cdot)$ is implemented to avoid chattering during the simulations and experiments.
The performance of the control algorithm applied to the unmanned ground vehicle model is demonstrated numerically on both reference routes (straight line and circle), despite uncertainties not contemplated in the dynamic model and disturbances induced by Gaussian noise.
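The chattering-free approximation mentioned above can be sketched as follows; the sharpness constant k is an assumed tuning parameter.

```python
import numpy as np

def smooth_sign(s, k=10.0):
    """Smooth approximation sign(s) ~ (2/pi) arctan(k s); larger k is sharper."""
    return (2.0 / np.pi) * np.arctan(k * s)

# The approximation stays within [-1, 1] and avoids the hard switch at s = 0.
print(smooth_sign(-0.5), smooth_sign(0.0), smooth_sign(0.5))
```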

4. Experimental Setup

The experimental platform consists of a four-wheel-drive frame with a Rocker-bogie suspension system to ensure the vehicle’s orientation stability on uneven terrain; the UGV is shown in Figure 4. Each wheel is equipped with its own motor for 4 × 4 traction, and steering is achieved through differential speed control of the left and right wheels.
The control algorithm was programmed on a controller board compatible with the open-source programmable navigation system PX4-Firmware, which has built-in sensors, such as the inertial measurement unit (IMU), magnetometer, and barometer. This autopilot is compatible with peripheral sensors that support communication protocols such as the inter-integrated circuit (I2C), universal asynchronous receiver-transmitter (UART), controller area network (CAN), serial peripheral interface (SPI), analog–digital converter (ADC), etc. All input and output signals (sensors and actuators) are filtered and processed to perform tasks in navigation control.
A centimeter-precision differential Global Navigation Satellite System (GNSS) is used for the relative positioning of the autonomous vehicle and mission planning. An omnidirectional laser scanner is used to detect the surrounding environment at a range of up to 10 m. The unmanned ground vehicle bases its autonomous navigation on the global positioning system (GPS), so it can assign a home position, assign a route following way-points, and map a virtual grid or fence. A proportional–derivative (PD) algorithm is used for the position control of the vehicle.

4.1. Obstacle Avoidance

The UGV is also equipped with an omnidirectional laser scanner to detect the surrounding environment and objects in its path. Obstacle avoidance is based on the BendyRuler algorithm, which looks for open spaces to continue moving forward and then steers back toward the final destination coordinates. This algorithm is simple and computationally inexpensive, since it does not reconstruct maps and therefore does not require a companion computer.

BendyRuler Algorithm

The BendyRuler algorithm is implemented on the controller board to facilitate vehicle navigation during autonomous missions. The vehicle detects obstacles using the environmental information obtained in real-time from the two-dimensional, 360-degree LiDAR sensor. The algorithm simulates possible paths in the form of curved arcs as potential secondary trajectory options that the vehicle could follow and then return to the main path. Each virtual path is evaluated according to certain safety, progression, and smoothness criteria to decide how close it passes detected obstacles, how far it advances toward the main path, and how physically feasible it is for the vehicle to follow the path, respectively.
The algorithm, implemented on the controller board, discards paths that could lead to potential collisions as sensed in the local environment. As the vehicle moves forward, the algorithm continuously recalculates the trajectories based on new information from the environment. The distance sensor currently used has a range of 10 m, meaning the vehicle can detect a tree up to 10 m away and select an avoidance radius between 1 and 10 m. However, choosing the maximum radius of 10 m would make it impossible to navigate through a forest where trees are less than 10 m apart. Figure 5 shows the obstacle avoidance maneuver on a real-time trajectory. The outdoor experiment was performed with a tree as an obstacle; the vehicle went around it at a radius of 1.5 m and continued on its way. Figure 6 shows the trajectory tracking on the x and y axes, showcasing tree avoidance and the subsequent resumption of the trajectory on each axis. Table 2 presents the mean standard deviations of the UGV position and velocity.
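The candidate-arc evaluation described above can be illustrated with the following simplified Python sketch. This is not the actual BendyRuler implementation; the weights, the 1.5 m rejection margin, and the candidate bearings are assumptions for illustration.

```python
import math

def score_heading(heading, goal_bearing, prev_heading, clearance_m,
                  w_safety=1.0, w_progress=1.0, w_smooth=0.5):
    """Score a candidate bearing [rad] by safety, progress, and smoothness."""
    if clearance_m < 1.5:                        # reject near-collision paths
        return -math.inf
    safety = min(clearance_m, 10.0) / 10.0       # normalized LiDAR clearance
    progress = math.cos(heading - goal_bearing)  # advance toward the waypoint
    smooth = math.cos(heading - prev_heading)    # penalize abrupt turns
    return w_safety * safety + w_progress * progress + w_smooth * smooth

def pick_heading(candidates, clearance_of, goal_bearing, prev_heading):
    """candidates: bearings [rad]; clearance_of: bearing -> free distance [m]."""
    return max(candidates, key=lambda h: score_heading(
        h, goal_bearing, prev_heading, clearance_of(h)))

# Example: candidate bearings every 15 degrees within +/-90 degrees of the goal.
candidates = [math.radians(a) for a in range(-90, 91, 15)]
best = pick_heading(candidates, lambda h: 10.0, goal_bearing=0.0, prev_heading=0.0)
```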

4.2. Identification of Resin Due to Bark Beetles

A traditional method for forest monitoring involves walking tours of the surrounding areas by local inhabitants. In this work, an unmanned ground vehicle was used for environmental monitoring to identify trees and detect resin on their bark, in order to estimate whether a tree was infested with pests such as the bark beetle. An autonomous mission was assigned to the ground vehicle to tour an area containing trees not foreseen on the route. The vehicle recorded the route on video while navigating and avoiding obstacles such as trees and rocks. The collected video was processed after the mission using the YOLOv4 computer vision algorithm to detect trees and resin stains on bark and thus estimate the presence of the bark beetle in its early stages. Figure 7 shows the tree resin dot detection, which can be used to estimate the presence of bark beetles.
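Post-mission detection on the recorded video can be sketched with OpenCV’s DNN module, assuming the trained YOLOv4 model is exported in Darknet format; the file names, class order, and thresholds below are placeholders, not the authors’ artifacts.

```python
import cv2

# Load a Darknet-format YOLOv4 model (paths are hypothetical placeholders).
net = cv2.dnn.readNetFromDarknet("yolov4-resin.cfg", "yolov4-resin.weights")
model = cv2.dnn_DetectionModel(net)
model.setInputParams(size=(416, 416), scale=1.0 / 255.0, swapRB=True)

classes = ["tree", "resin"]                  # class order assumed
cap = cv2.VideoCapture("mission_video.mp4")  # video recorded by the UGV
while True:
    ok, frame = cap.read()
    if not ok:
        break
    class_ids, scores, boxes = model.detect(frame, confThreshold=0.5,
                                            nmsThreshold=0.4)
    for cid, score, box in zip(class_ids, scores, boxes):
        x, y, w, h = box
        print(f"{classes[int(cid)]}: {float(score):.2f} at ({x}, {y}, {w}, {h})")
cap.release()
```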
Figure 8 and Figure 9 present the training accuracy and loss over 15 epochs and 270 iterations with a learning rate of 0.001. Specific values can be observed in Table 3. Of the 100 images, 70 were used for training, 10 for validation, and 20 for testing. The algorithm was able to detect resin stains on trees with an average accuracy of 87.03% in daylight conditions.
Figure 10 shows the confusion matrix representing the performance of the trained YOLOv4 model for detecting trees and resin spots on bark. The following results can be observed:
  • True positives (TPs): 8 → The model correctly detected 8 instances of the tree class.
  • False negatives (FNs): 1 → The model did not detect 1 instance of the tree class and misclassified it as resin.
  • False positives (FPs): 1 → The model misclassified 1 instance of resin as a tree.
  • True negatives (TNs): 9 → The model correctly classified 9 instances of the resin class.
The tree class has a recall of 0.89, meaning that the model correctly detects 89% of the trees. The resin class has a recall of 0.90, demonstrating better performance in detecting resin. The F1 scores of both classes indicate a balanced performance between precision and recall, being 0.89 for tree and 0.90 for resin. These metrics reflect the fact that the model demonstrates consistent performance and that it classifies both classes effectively.
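These values follow directly from the confusion-matrix counts; a quick arithmetic check:

```python
# Counts from Figure 10: TP=8, FN=1, FP=1, TN=9 (tree as the positive class).
TP, FN, FP, TN = 8, 1, 1, 9

recall_tree = TP / (TP + FN)           # 8/9  ~ 0.89
precision_tree = TP / (TP + FP)        # 8/9  ~ 0.89
f1_tree = 2 * precision_tree * recall_tree / (precision_tree + recall_tree)

recall_resin = TN / (TN + FP)          # 9/10 = 0.90 (resin as positive class)
precision_resin = TN / (TN + FN)       # 9/10 = 0.90
f1_resin = 2 * precision_resin * recall_resin / (precision_resin + recall_resin)

print(round(recall_tree, 2), round(f1_tree, 2))    # 0.89 0.89
print(round(recall_resin, 2), round(f1_resin, 2))  # 0.9 0.9
```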

5. Conclusions and Discussion

Environmental monitoring is a significant task worldwide because economic development is closely linked to the health of the environment. It is therefore important to deploy autonomous vehicles that help improve this monitoring. While aerial monitoring offers a broader perspective of a forest, the terrestrial perspective can provide more accurate information on the health of the trees in the early stages of infestation.
Unmanned ground vehicles can follow the pedestrian paths that inhabitants use, paths that are imperceptible from an aerial perspective. A UGV can travel a path or cover an area autonomously, adapting to the changes that nature induces on rural roads.
The design of robust control algorithms for autonomous navigation is essential to ensure correct path tracking, whether based on satellite positioning systems or vision, although vision-based navigation requires a higher computational cost. Vehicle navigation in an outdoor and changing environment requires the use of sensors that provide real-time information about the environment at a low computational cost.
The UGV’s navigation system is based on path following and obstacle avoidance algorithms, such as PD-Saturation and BendyRuler. The information provided by object detection (0–10 m) is used to calculate the distance to detected objects (e.g., trees) and make decisions about the route to follow. If the system detects a tree within a close range (1.5 m), it activates the avoidance algorithm to change the UGV’s path and avoid a collision.
Offline image and video post-processing offer advantages by reducing the equipment required for navigation. The UGV operator can collect the information captured by the onboard camera, and by training a convolutional neural network, trees and, if applicable, resin stains on the tree trunks encountered along the route can be detected.
Identifying trees with resin dots is crucial for monitoring the spread of the bark beetle, which is currently one of the main agents of deforestation in the northern hemisphere. The tests were carried out in a forest environment with varied lighting and terrain conditions. The performance of the system in detecting trees and resin was evaluated on images captured during the day, under different viewing angles. The resin dot detection model is robust to lighting variations caused by shadows in the environment, with an accuracy of 87.03%, although in low-light conditions the detection accuracy decreases slightly. This pest benefits from global warming and water stress in forested regions, spreading throughout all seasons of the year. The results obtained from the classification of trees and resin demonstrate good performance in detecting bark beetle infestation in trees at early stages, which is vital for safeguarding tree health. However, it would be advantageous to increase the number of positive and negative images and to continue varying the angle and lighting conditions. This would strengthen the detector and enable a more precise estimation of pest infestation levels in forests.

Author Contributions

Conceptualization and project administration, S.S. and J.F.; methodology and data curation, I.G.-H.; software and virtualization, Y.R.-L.; validation and formal analysis, R.L. All authors have read and agreed to the published version of the manuscript.

Funding

This project was funded by the Department of Research and Multidisciplinary Studies of the Center for Research and Advanced Studies of the National Polytechnic Institute (CINVESTAV-IPN).

Data Availability Statement

Bark texture images database available at https://figshare.com/account/articles/27927438 (accessed on 18 December 2024).

Acknowledgments

The authors are grateful to the National Council of Science, Humanities, and Technology (CONAHCYT) for its support.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
UAV     unmanned aerial vehicle
UGV     unmanned ground vehicle
IMU     inertial measurement unit
I2C     inter-integrated circuit
UART    universal asynchronous receiver-transmitter
CAN     controller area network
SPI     serial peripheral interface
ADC     analog-to-digital converter
LiDAR   light detection and ranging
GPS     global positioning system
GNSS    global navigation satellite system
PD      proportional–derivative

References

  1. Sternberg, T.; Rueff, H.; Middleton, N. Contraction of the Gobi desert, 2000–2012. Remote Sens. 2015, 7, 1346–1358. [Google Scholar] [CrossRef]
  2. Asadzadeh, S.; de Oliveira, W.J.; de Souza Filho, C.R. UAV-based remote sensing for the petroleum industry and environmental monitoring: State-of-the-art and perspectives. J. Pet. Sci. Eng. 2022, 208, 109633. [Google Scholar] [CrossRef]
  3. Huo, L.; Persson, H.J.; Lindberg, E. Early detection of forest stress from European spruce bark beetle attack, and a new vegetation index: Normalized distance red & SWIR (NDRS). Remote Sens. Environ. 2021, 255, 112240. [Google Scholar]
  4. Bárta, V.; Lukeš, P.; Homolová, L. Early detection of bark beetle infestation in Norway spruce forests of Central Europe using Sentinel-2. Int. J. Appl. Earth Obs. Geoinf. 2021, 100, 102335. [Google Scholar] [CrossRef]
  5. Ortiz, S.M.; Breidenbach, J.; Kändler, G. Early detection of bark beetle green attack using TerraSAR-X and RapidEye data. Remote Sens. 2013, 5, 1912–1931. [Google Scholar] [CrossRef]
  6. Gomez, D.F.; Ritger, H.M.; Pearce, C.; Eickwort, J.; Hulcr, J. Ability of remote sensing systems to detect bark beetle spots in the southeastern US. Forests 2020, 11, 1167. [Google Scholar] [CrossRef]
  7. Kautz, M.; Feurer, J.; Adler, P. Early detection of bark beetle (Ips typographus) infestations by remote sensing—A critical review of recent research. For. Ecol. Manag. 2024, 556, 121595. [Google Scholar] [CrossRef]
  8. Li, Y.; Johnson, A.J.; Gao, L.; Wu, C.; Hulcr, J. Two new invasive Ips bark beetles (Coleoptera: Curculionidae) in mainland China and their potential distribution in Asia. Pest Manag. Sci. 2021, 77, 4000–4008. [Google Scholar] [CrossRef]
  9. Fettig, C.J.; Hilszczański, J. Management strategies for bark beetles in conifer forests. In Bark Beetles; Elsevier: Amsterdam, The Netherlands, 2015; pp. 555–584. [Google Scholar]
  10. Marais, G.C.; Stratton, I.C.; Hulcr, J.; Johnson, A.J. Progress in Developing a Bark Beetle Identification Tool. bioRxiv 2024. [Google Scholar] [CrossRef]
  11. Gitau, C.; Bashford, R.; Carnegie, A.; Gurr, G. A review of semiochemicals associated with bark beetle (Coleoptera: Curculionidae: Scolytinae) pests of coniferous trees: A focus on beetle interactions with other pests and their associates. For. Ecol. Manag. 2013, 297, 1–14. [Google Scholar] [CrossRef]
  12. Victor, S.; Mykhailo, H.; Samuli, J. An intelligent system for determining the degree of tree bark beetle damage based on the use of generative-adversarial neural networks. Plant-Environ. Interact. 2024, 5, e70015. [Google Scholar] [CrossRef] [PubMed]
  13. Kanaskie, C.R.; Routhier, M.R.; Fraser, B.T.; Congalton, R.G.; Ayres, M.P.; Garnas, J.R. Early Detection of Southern Pine Beetle Attack by UAV-Collected Multispectral Imagery. Remote Sens. 2024, 16, 2608. [Google Scholar] [CrossRef]
  14. Turkulainen, E.; Honkavaara, E.; Näsi, R.; Oliveira, R.A.; Hakala, T.; Junttila, S.; Karila, K.; Koivumäki, N.; Pelto-Arvo, M.; Tuviala, J.; et al. Comparison of deep neural networks in the classification of bark beetle-induced spruce damage using UAS images. Remote Sens. 2023, 15, 4928. [Google Scholar] [CrossRef]
  15. Godinez-Garrido, G.; Gonzalez-Islas, J.C.; Gonzalez-Rosas, A.; Flores, M.U.; Miranda-Gomez, J.M.; Gutierrez-Sanchez, M.d.J. Estimation of Damaged Regions by the Bark Beetle in a Mexican Forest Using UAV Images and Deep Learning. Sustainability 2024, 16, 10731. [Google Scholar] [CrossRef]
  16. Klouček, T.; Modlinger, R.; Zikmundová, M.; Kycko, M.; Komárek, J. Early detection of bark beetle infestation using UAV-borne multispectral imagery: A case study on the spruce forest in the Czech Republic. Front. For. Glob. Chang. 2024, 7, 1215734. [Google Scholar] [CrossRef]
  17. Hollaus, M.; Vreugdenhil, M. Radar satellite imagery for detecting bark beetle outbreaks in forests. Curr. For. Rep. 2019, 5, 240–250. [Google Scholar] [CrossRef]
  18. Klouček, T.; Komárek, J.; Surovỳ, P.; Hrach, K.; Janata, P.; Vašíček, B. The use of UAV mounted sensors for precise detection of bark beetle infestation. Remote Sens. 2019, 11, 1561. [Google Scholar] [CrossRef]
  19. Huo, L.; Lindberg, E.; Bohlin, J.; Persson, H.J. Assessing the detectability of European spruce bark beetle green attack in multispectral drone images with high spatial-and temporal resolutions. Remote Sens. Environ. 2023, 287, 113484. [Google Scholar] [CrossRef]
  20. Bozzini, A.; Brugnaro, S.; Morgante, G.; Santoiemma, G.; Deganutti, L.; Finozzi, V.; Battisti, A.; Faccoli, M. Drone-based early detection of bark beetle infested spruce trees differs in endemic and epidemic populations. Front. For. Glob. Chang. 2024, 7, 1385687. [Google Scholar] [CrossRef]
  21. Paczkowski, S.; Datta, P.; Irion, H.; Paczkowska, M.; Habert, T.; Pelz, S.; Jaeger, D. Evaluation of early bark beetle infestation localization by drone-based monoterpene detection. Forests 2021, 12, 228. [Google Scholar] [CrossRef]
  22. Östersund, M. Monitoring Bark Beetle Infestation Using Remote Sensing. Master’s Thesis, Insinööritieteiden Korkeakoulu, Espoo, Finland, 2022. [Google Scholar]
  23. Cibrián, T.D.; Quiñonez, F.S.; Quiñonez, B.S.; Olivo, M.J.; Aguilar, V.J. Manual para la Identificación, Manejo y Monitoreo de Insectos Descortezadores del Pino. Guía Práctica para la Identificación y Manejo de los Descortezadores del Pino; Comisión Nacional Forestal (Conafor): Zapopan, Mexico, 2015.
  24. Ferrenberg, S.; Kane, J.M.; Mitton, J.B. Resin duct characteristics associated with tree resistance to bark beetles across lodgepole and limber pines. Oecologia 2014, 174, 1283–1292. [Google Scholar] [CrossRef] [PubMed]
  25. Kautz, M.; Peter, F.J.; Harms, L.; Kammen, S.; Delb, H. Patterns, drivers and detectability of infestation symptoms following attacks by the European spruce bark beetle. J. Pest Sci. 2023, 96, 403–414. [Google Scholar] [CrossRef]
  26. Rajamani, R. Vehicle Dynamics and Control; Springer: Berlin/Heidelberg, Germany, 2011. [Google Scholar]
  27. Lozano, R. (Ed.) Unmanned Aerial Vehicles: Embedded Control; John Wiley & Sons: Hoboken, NJ, USA, 2013. [Google Scholar]
Figure 1. Unmanned ground vehicle platform.
Figure 2. Unmanned ground vehicle diagram.
Figure 3. The UGV trajectory with straight-line and circular references, showing a soft overdamped response at the start.
Figure 4. Unmanned ground vehicle experimental setup.
Figure 5. Tree avoidance with a radius of 1.5 m during the UGV autonomous mission.
Figure 6. Trajectory tracking on the x (left) and y (right) axes, followed by tree avoidance and the subsequent resumption of the trajectory on each axis.
Figure 7. Tree bark resin dot identification.
Figure 8. Training progress accuracy.
Figure 9. Training progress loss.
Figure 10. Confusion matrix.
Table 1. Unmanned ground vehicle simulation parameters.

Parameter   Value     Parameter   Value      Parameter   Value
M           2 kg      Radius      5 m        k_vx        5
R           0.15 m    x(0)        0.1 m      k_py        4
L           0.5 m     y(0)        0.1 m      k_vy        5
x_d         8 m       ψ(0)        0.1 rad    k_pψ        5
y_d         5 m       k_px        4          k_vψ        4
Table 2. Standard error of the mean.

Parameter                                     Value   Units
Standard deviation of lateral position        0.13    m
Standard deviation of longitudinal position   0.14    m
Standard deviation of lateral velocity        0.09    m/s
Standard deviation of longitudinal velocity   0.10    m/s
Table 3. Algorithm training specific values.

Epoch   Iteration   Time Elapsed   Learn Rate   Training Loss   Validation Loss
1       1           00:00:27       0.001        9217.7          8141.7
3       50          00:06:22       0.001        308.25          338.68
6       100         00:11:23       0.001        208.03          235.77
9       150         00:16:37       0.001        145.71          225.46
12      200         00:21:17       0.001        144.14          199.41
14      250         00:25:43       0.001        122.87          190.75
15      270         00:27:29       0.001        75.623          196.79
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
