Article

Automated Detection of Araucaria angustifolia (Bertol.) Kuntze in Urban Areas Using Google Earth Images and YOLOv7x

by Mauro Alessandro Karasinski 1,*, Ramon de Sousa Leite 1, Emmanoella Costa Guaraná 2, Evandro Orfanó Figueiredo 3, Eben North Broadbent 4, Carlos Alberto Silva 5, Erica Kerolaine Mendonça dos Santos 6, Carlos Roberto Sanquetta 1 and Ana Paula Dalla Corte 1

1 BIOFIX Research Center, Federal University of Paraná (UFPR), Curitiba 80210-170, PR, Brazil
2 Forest Engineering Academic Department, Federal University of Rondônia, Rolim de Moura 76801-974, RO, Brazil
3 Embrapa Acre, Rodovia BR-364, Km 14, Rio Branco 69900-056, AC, Brazil
4 Spatial Ecology and Conservation (SPEC) Lab, School of Forest, Fisheries, and Geomatics Sciences, University of Florida, Gainesville, FL 32611, USA
5 Forest Biometrics and Remote Sensing Laboratory (Silva Lab), School of Forest, Fisheries, and Geomatics Sciences, University of Florida, P.O. Box 110410, Gainesville, FL 32611, USA
6 Forest Engineering Academic Department, Federal University of Acre, Rodovia BR 364, Km 04, Rio Branco 69920-900, AC, Brazil
* Author to whom correspondence should be addressed.
Remote Sens. 2025, 17(5), 809; https://doi.org/10.3390/rs17050809
Submission received: 3 December 2024 / Revised: 17 February 2025 / Accepted: 21 February 2025 / Published: 25 February 2025

Abstract: This study addresses the urgent need for effective methods to monitor and conserve Araucaria angustifolia, a critically endangered species of immense ecological and cultural significance in southern Brazil. Using high-resolution satellite images from Google Earth, we apply the YOLOv7x deep learning model to detect this species in two distinct urban contexts in Curitiba, Paraná: isolated trees across the urban landscape and A. angustifolia individuals within forest remnants. Data augmentation techniques, including image rotation, hue and saturation adjustments, and mosaic augmentation, were employed to increase the model’s accuracy and robustness. Through 5-fold cross-validation, the model achieved a mean Average Precision (AP) of 90.79% and an F1-score of 88.68%. Results show higher detection accuracy in forest remnants, where the homogeneous background of natural landscapes facilitated the identification of trees, than in urban areas, where complex visual elements such as building shadows posed challenges. To reduce false positives, especially misclassifications involving palm species, additional annotations were introduced, significantly improving performance in urban environments. These findings highlight the potential of integrating remote sensing with deep learning to automate large-scale forest inventories. The study also demonstrates the broader applicability of the YOLOv7x model for urban forestry planning, offering a cost-effective solution for biodiversity monitoring. The integration of predictive data with urban forest maps reveals a spatial correlation between A. angustifolia density and the presence of forest fragments, suggesting that the preservation of these areas is vital for the species’ sustainability. The model’s scalability also opens the door to future applications in ecological monitoring across larger urban areas. As urban environments continue to expand, understanding and conserving key species like A. angustifolia is critical for maintaining biodiversity, strengthening urban resilience, and addressing climate change.

1. Introduction

Araucaria angustifolia (Bertol.) Kuntze, popularly known as araucaria, pine, Paraná pine, or Brazilian pine, is the only species of the genus Araucaria that occurs naturally in Brazil [1], specifically in the Mixed Ombrophilous Forest (MOF) of the Atlantic Forest biome. Trees of this species can reach up to 30 m in height and 150 cm in diameter at breast height (DBH) and have ecological, socio-economic, and cultural relevance, especially in southern Brazil [2,3]. Its seeds are widely used in human food [4,5], in the pharmaceutical industry [6], and as food for wild animals [2,7]. Its branches have shown high energy potential [8], and its wood is of high quality [9]. The trees also act as important natural reservoirs of atmospheric carbon dioxide (CO2) [10,11,12] and of soil organic carbon [13]. The area of natural occurrence of the species in Brazil has decreased significantly in recent decades [14,15]. This area originally covered 25,379,316 ha; by 2005, however, it was estimated that only 12.6% of this original extent remained as forest remnants [16]. More recent reports indicate an even more alarming scenario, with further reductions in the species’ habitat. Intensive logging for timber, together with agricultural expansion and urbanization, has driven this decline [14,15,17,18].
To curb logging of the species, the first governmental efforts in Brazil began in 1995 with the publication of State Law No. 11,054/1995 [19]. However, only in 2014 did the species become fully protected throughout Brazil, when Federal Ordinance No. 443/2014 [20] included it in the “Official National List of Endangered Flora Species” as Endangered (EN). The International Union for Conservation of Nature (IUCN) classifies the species as Critically Endangered (CR) [21], making the monitoring of remaining trees extremely important. In this context, it is essential to understand the distribution of remaining individuals to enable the monitoring and preservation of the species, an understanding that can be achieved through forest inventories.
Several studies have estimated the density of A. angustifolia in the MOF, ranging from 158 to 777 individuals per hectare in native forest fragments [22,23,24]. In urban fragments, by contrast, observed densities range from 17.11 to 33.30 individuals per hectare [25,26,27]. These estimates were obtained through in situ data collection, which is often based on limited sample sets from which inferences are extrapolated to larger populations and/or areas. In addition, the traditional data collection process is time-consuming and expensive [28]. Given these limitations, there is growing interest in alternative methods that offer greater efficiency in terms of both cost and effort, such as remote sensing technologies, which promise to revolutionize how forest inventories are conducted.
In this context, remote sensing combined with deep learning techniques based on Convolutional Neural Networks (CNNs) has proven effective for identifying tree species [29,30,31,32]. CNNs capture the spatial and morphological characteristics of an image and require minimal pre-processing, making them widely used for pattern recognition in images. Some methods identify different classes of objects in images, while others are based on segmentation, where each pixel is categorized by location, color, intensity, texture, and size [33]. Various object detection methods have been applied to detect tree crowns, such as Faster R-CNN [30]; RetinaNet [34,35]; YOLOv2, YOLOv3, YOLOv4, YOLOv5, and YOLOv7 [29,32,36,37]; and Mask R-CNN [38]. In general, object detection is faster and more robust than other tree crown detection approaches, allowing scalability to large areas [30]. Object detection methods require high-spatial-resolution images because of the complexity of natural and built landscapes. Remotely piloted aircraft can provide the necessary resolution but are limited to small areas; for example, Ref. [37] successfully applied an enhanced version of YOLOv7 to detect individual tree crowns and extract crown widths in Metasequoia glyptostroboides forests in China. Although Unoccupied Aerial Systems (UASs) offer unparalleled flexibility and detailed imagery at a localized scale, manned aircraft extend coverage to larger areas at moderate resolution. Satellites, in turn, can cover vast regions, but freely available images do not provide the necessary landscape detail. In contrast, digital globes such as Google Earth™ have democratized geospatial science [39,40] by providing very high-resolution imagery with good geospatial accuracy [41,42,43].
The application of these technologies is crucial for understanding the spatial distribution of A. angustifolia, both in natural areas and urban contexts, where these trees also play important ecological roles. Urban green spaces play a pivotal role in maintaining the environmental balance within cities, offering a suite of ecosystem services indispensable for human well-being and urban sustainability. These services include local climate regulation, air pollution control, and protection of aquatic systems [44,45]. Trees, such as A. angustifolia, are particularly significant in sequestering atmospheric carbon dioxide (CO2), functioning as critical carbon sinks and contributing to climate change mitigation efforts. Furthermore, green areas provide essential habitats for many species, enhancing biodiversity and fostering ecological resilience [46]. They also improve urban quality of life by creating opportunities for recreation, social interaction, and psychological well-being. In the context of urban areas, where greenhouse gas emissions are a significant concern, understanding the carbon sink potential of these trees can inform urban forest practices and policies aimed at mitigating climate change. Insights gained from studying araucarias can further inform urban planning and green infrastructure design. By understanding their spatial requirements, ecosystem contributions, and maintenance needs, planners can integrate these trees more effectively into urban landscapes. Prioritizing these investments underscores the vital role of green infrastructure in addressing environmental challenges and promoting long-term urban health.
Therefore, this research develops and assesses a framework for detecting A. angustifolia in urban settings by applying YOLOv7x, a convolutional-neural-network-based method, to optical imagery available on Google Earth Pro 7.3 (Google LLC, Mountain View, CA, USA) [47]. Our objective is to investigate the effectiveness of tree detection in a complex urban setting with scattered forest remnants and isolated trees, as a way to support decision-making and municipal management of urban green areas.

2. Materials and Methods

2.1. Study Area Location

The study was conducted within the urban perimeter of the city of Curitiba (25°25′47″S, 49°16′19″W), in the state of Paraná, southern Brazil (Figure 1). Curitiba covers an area of 434.892 km², of which 78% is classified as urban [48]. The regional climate is humid subtropical mesothermal (Cfb), with no defined dry season [49]. Based on historical data from 1991 to 2020, the average temperature is 23.8 °C in the hottest month and 13.8 °C in the coldest month, with an average annual rainfall of 1591.1 mm [50]. The predominant vegetation is Mixed Ombrophilous Forest (MOF) (Figure 1), dominated by Araucaria angustifolia (Bertol.) Kuntze, which is present throughout the urban landscape in parks, squares, nurseries, street plantings, and residential areas.
To encompass the landscape diversity of the region, six neighborhoods were selected for model training: Batel, Centro, Jardim das Américas, Jardim Botânico, Rebouças, and Santa Felicidade (Figure 1). These neighborhoods differ considerably in land use and urban characteristics. Batel and Centro are predominantly commercial areas, with a concentration of large buildings and high-rise condominiums. Rebouças is a transitional area, featuring a mix of residential and commercial buildings as well as industrial zones. Santa Felicidade is a mostly residential neighborhood with low-density construction and few high-rise buildings. Jardim das Américas and Jardim Botânico, in turn, stand out for their green areas, with a large diversity of plant species and preservation areas, contrasting with the more urbanized neighborhoods. Given this diversity, we divided the data into two classes—isolated araucarias and araucarias in forest fragments—and evaluated detection performance in these two distinct contexts. This variety of urban and environmental characteristics was crucial for assembling a varied training dataset.

2.2. Pre-Training

2.2.1. Acquisition of RGB Images

The images used in the study were obtained from Google Earth (GE) through the HCMGIS plugin [51] for QGIS [52], using the Basemap Satellite section. The advantage of GE is that it provides mosaics of images from various satellites—such as SPOT5, LANDSAT, IKONOS, QUICKBIRD, GeoEye-1, WorldView-1, and WorldView-2—as well as aerial photographs, ensuring a high spatial resolution (small Ground Sample Distance, GSD). A grid of 100 m × 100 m cells (1 ha each) was generated over the study area and used to crop the orthomosaic derived from the basemap. Each image was saved with a spatial resolution resampled to 0.1 m per pixel, resulting in dimensions of 1000 × 1000 pixels, using the nearest-neighbor resampling method.
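As an illustration of this tiling step, the sketch below crops a basemap orthomosaic into 1 ha chips at 0.1 m per pixel using nearest-neighbor resampling; it assumes a GeoTIFF export in a metric CRS, and the input file name is hypothetical:

```python
# A minimal sketch of the tiling step, assuming the basemap was exported
# from QGIS as a GeoTIFF in a metric CRS; "curitiba_basemap.tif" is a
# hypothetical file name.
import rasterio
from rasterio.enums import Resampling
from rasterio.windows import from_bounds

TILE_M = 100   # tile edge in meters (100 m x 100 m = 1 ha)
PX = 1000      # output chip size in pixels (0.1 m per pixel)

with rasterio.open("curitiba_basemap.tif") as src:
    left, bottom, right, top = src.bounds
    x = left
    while x + TILE_M <= right:
        y = bottom
        while y + TILE_M <= top:
            window = from_bounds(x, y, x + TILE_M, y + TILE_M,
                                 transform=src.transform)
            # Resample the 100 m window to 1000 x 1000 px (nearest neighbor)
            chip = src.read(out_shape=(src.count, PX, PX),
                            window=window,
                            resampling=Resampling.nearest)
            # ... save `chip` as a 1000 x 1000 training image here ...
            y += TILE_M
        x += TILE_M
```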

2.2.2. Field Data Collection and Data Labeling

To validate the obtained results, the annotations of A. angustifolia in the images were compared with information from field inventories, which covered both isolated individuals and individuals in forest fragments (forest). However, given the time gap between the field inventories and the acquisition dates of the satellite images, we performed a complementary analysis using Street View images. This approach allowed us to accurately verify the presence of A. angustifolia in the studied areas, facilitating annotation and confirming the species’ occurrence.
For data labeling, the images were loaded into the labelImg software [53] to generate label files in the YOLOv7 input format. Annotation involved drawing a bounding box around each A. angustifolia individual in the image (Figure 2), recording the box’s location relative to the image that contains it. A total of 1097 images were labeled, discarding those without trees of the target species, resulting in 4660 labeled A. angustifolia individuals.
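For reference, labelImg’s YOLO format stores each box as a normalized line of the form `class x_center y_center width height`; a minimal sketch of the conversion from pixel coordinates follows (the example box coordinates are hypothetical):

```python
# A sketch of the YOLO label format written by labelImg: one line per box,
# "class x_center y_center width height", all normalized to [0, 1].
# The example pixel coordinates below are hypothetical.
def to_yolo_line(box, img_w=1000, img_h=1000, class_id=0):
    x_min, y_min, x_max, y_max = box
    bx = (x_min + x_max) / 2 / img_w   # normalized box center X
    by = (y_min + y_max) / 2 / img_h   # normalized box center Y
    w = (x_max - x_min) / img_w        # normalized box width
    h = (y_max - y_min) / img_h        # normalized box height
    return f"{class_id} {bx:.6f} {by:.6f} {w:.6f} {h:.6f}"

print(to_yolo_line((412, 388, 588, 572)))
# -> "0 0.500000 0.480000 0.176000 0.184000"
```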

2.3. Training

2.3.1. Data Customization

For training, the YOLOv7x object detection architecture was used, one of the most recent members of the YOLO family, which accepts variable input sizes. The larger the input size, the higher the computational cost. Accordingly, a default input resolution of 640 × 640 pixels was adopted, and each 1000 × 1000 pixel image was resized to this final size. Table 1 describes the configurations defined for training.

2.3.2. Data Augmentation

The accuracy of supervised deep learning models depends on the quantity and diversity of data available during training. Therefore, data augmentation was applied during training to modify the training images, producing an effectively larger and more varied dataset than the original and consequently improving the model’s performance.
Initially, random rotations were applied to the images, both clockwise and counterclockwise, to simulate greater variation in viewing geometry. Additionally, saturation and hue values were adjusted relative to the original image’s reference values, making the model robust to lighting differences at capture time and to color variation when detecting new individuals. Finally, the mosaic augmentation technique was used, combining four training images into one, which teaches the model to identify objects at a smaller scale than usual.
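These augmentations correspond to entries in YOLOv7’s hyperparameter file; the sketch below mirrors the relevant keys with the values from Table 1 (key names follow the repository’s standard hyp.*.yaml convention):

```python
# Augmentation-related hyperparameters, with values taken from Table 1.
augmentation_hyp = {
    "degrees": 0.0,    # random rotation range in degrees
    "hsv_h": 0.015,    # hue jitter (fraction of full range)
    "hsv_s": 0.7,      # saturation jitter (fraction)
    "hsv_v": 0.4,      # value/brightness jitter (fraction)
    "translate": 0.2,  # random translation (fraction of image size)
    "scale": 0.5,      # random scaling gain
    "fliplr": 0.5,     # probability of a left-right flip
    "flipud": 0.0,     # probability of an up-down flip
    "mosaic": 1.0,     # probability of 4-image mosaic augmentation
    "mixup": 0.0,      # probability of mixup augmentation
}
```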

2.3.3. Experimentation Environment

For computing, Google Colaboratory (Google Colab) was used, accessed through the Google Chrome browser. The Colab environment runs Ubuntu 22.04 LTS (Canonical Ltd., London, UK) (64-bit) and provides an Intel Xeon processor (Intel Corporation, Santa Clara, CA, USA) with two 2.3 GHz cores and 83.5 GB of RAM, together with an NVIDIA A100-SXM4 GPU (NVIDIA Corporation, Santa Clara, CA, USA) with 40 GB of memory. To optimize the training process, YOLOv7x was initialized with pre-trained weights from the MS COCO dataset [54]; the pre-trained weights were applied to the first convolutional layers of the model, yielding a processing gain. YOLOv7x was trained for 300 epochs, with images subdivided into batches of size 8 (batch_size = 8).
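For reproducibility, a representative Colab cell for this setup is sketched below; the flags follow the public YOLOv7 repository’s train.py, while the dataset YAML (data/araucaria.yaml) and run name are hypothetical placeholders:

```python
# A sketch of a Colab training cell for this configuration.
!git clone https://github.com/WongKinYiu/yolov7
%cd yolov7
!python train.py --weights yolov7x.pt \
                 --cfg cfg/training/yolov7x.yaml \
                 --data data/araucaria.yaml \
                 --img-size 640 640 \
                 --batch-size 8 \
                 --epochs 300 \
                 --device 0 \
                 --name araucaria_yolov7x
```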

2.4. Metrics

2.4.1. Intersection over Union

To evaluate the training results, several metrics were used. One of the most fundamental metrics related to object detection is Intersection over Union (IoU), also known as the Jaccard Index (Equation (1)):
$$\mathrm{IoU} = \frac{\text{Area of Overlap}}{\text{Area of Union}} \qquad (1)$$
Some terms are widely used in the training and testing of neural network models, referring to learning and predictions. True Positive (TP) is the number of A. angustifolia individuals correctly detected, i.e., the outcome the model should maximize during training. False Positive (FP) is an incorrect prediction, i.e., the model detects an A. angustifolia where none exists. False Negative (FN) is a ground-truth individual that the model failed to detect.
The TPs are typically determined using a minimum IoU threshold. The threshold defines the desired minimum spatial accuracy to consider a prediction as correct. We chose an IoU threshold of 0.5 (50%) for our model, as suggested by [55].
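As a minimal illustration of Equation (1) and the 0.5 threshold, the sketch below computes IoU for two axis-aligned boxes (the example boxes are hypothetical):

```python
# A minimal sketch of Equation (1): IoU between two axis-aligned boxes
# given as (x_min, y_min, x_max, y_max); the example boxes are hypothetical.
def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])    # intersection corners
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)  # overlap area
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter                # union area
    return inter / union if union else 0.0

# A prediction counts as a TP when IoU with a ground-truth box is >= 0.5.
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # ~0.143: below the threshold
```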

2.4.2. Recall and Precision

Recall and precision are widely used metrics to evaluate the results of machine learning experiments [56]. Recall (Equation (2)) is the proportion of ground-truth A. angustifolia individuals that the model correctly detects, while precision (Equation (3)) is the probability that a bounding box classified as A. angustifolia is, in fact, an A. angustifolia; it is calculated as the ratio between the number of correctly detected boxes and the total number of boxes the detector predicts as A. angustifolia.
$$\text{Recall} = \frac{TP}{TP + FN} \qquad (2)$$

$$\text{Precision} = \frac{TP}{TP + FP} \qquad (3)$$

2.4.3. F1-Score

To evaluate the generalization capability of the YOLOv7x model, the F1-score metric (Equation (4)) was used, which corresponds to the harmonic mean of Precision and Recall. This metric provides a balance between these two indicators and is particularly useful in cases of class imbalance:
$$\text{F1-score} = \frac{2 \times \text{Precision} \times \text{Recall}}{\text{Precision} + \text{Recall}} \qquad (4)$$
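A short sketch of Equations (2)–(4) computed from raw counts; the example values are the test-phase totals reported in Section 3.2:

```python
# A sketch of Equations (2)-(4) from TP/FP/FN counts.
def precision_recall_f1(tp, fp, fn):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

p, r, f1 = precision_recall_f1(tp=5081, fp=729, fn=254)
print(f"precision={p:.4f} recall={r:.4f} f1={f1:.4f}")
```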

2.4.4. Average Precision

The Average Precision (AP) was determined by constructing a precision/recall curve, ordering predictions by confidence score. We followed the methodology of [57], which calculates AP by averaging precision at eleven equally spaced recall points between zero and one, interpolating precision at each recall point (Equations (5) and (6)).
$$AP = \frac{1}{11} \sum_{r \in \{0,\, 0.1,\, 0.2,\, \ldots,\, 0.9,\, 1\}} p_{\text{interp}}(r) \qquad (5)$$

$$p_{\text{interp}}(r) = \max_{\tilde{r} \geq r} p(\tilde{r}) \qquad (6)$$
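A sketch of the eleven-point interpolation in Equations (5) and (6), assuming precision/recall pairs obtained by sweeping the confidence threshold over predictions sorted by confidence score:

```python
import numpy as np

# A sketch of Equations (5)-(6): 11-point interpolated Average Precision.
def ap_11_point(recalls, precisions):
    recalls = np.asarray(recalls)
    precisions = np.asarray(precisions)
    ap = 0.0
    for r in np.linspace(0.0, 1.0, 11):           # r in {0, 0.1, ..., 1}
        mask = recalls >= r
        p_interp = precisions[mask].max() if mask.any() else 0.0
        ap += p_interp / 11.0                     # average of Equation (5)
    return ap
```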

2.4.5. K-Fold Cross-Validation

For the validation stage, k-fold cross-validation was performed [58]. The dataset was randomly divided into five equal-sized subsets, and training was conducted five times (k = 5). In the first run, one subset was held out for validation (20%) and the remaining four subsets (80%) were used for training; in the second run, a different subset was held out, and so on. The procedure was repeated five times, ensuring that every sample passed through the validation set exactly once. This reduces the evaluation bias that a single train/validation split could introduce.
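A sketch of the 5-fold split, where every image lands in the validation set exactly once; `image_paths` stands in for the 1097 labeled chips, and the file names are hypothetical:

```python
from sklearn.model_selection import KFold

# Hypothetical placeholders for the 1097 labeled image chips.
image_paths = [f"chip_{i:04d}.jpg" for i in range(1097)]

kf = KFold(n_splits=5, shuffle=True, random_state=42)
for fold, (train_idx, val_idx) in enumerate(kf.split(image_paths), start=1):
    train_set = [image_paths[i] for i in train_idx]  # ~80% for training
    val_set = [image_paths[i] for i in val_idx]      # ~20% for validation
    # ... write fold-specific train/val lists and launch one training run ...
```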

2.4.6. Uncertainty

To handle uncertainties, YOLOv7x calculates the box confidence, which reflects both the certainty that a bounding box contains an A. angustifolia and the accuracy of that bounding box in terms of IoU with the ground truth. Additionally, class confidence evaluates the likelihood that the detected object belongs to the A. angustifolia class. For each predicted box, the model generates an objectness score $p_o$ (the probability that the box contains an object) and a distribution of probabilities over the possible classes $p_c$. The confidence score is the product of the objectness score and the maximum class probability (Equation (7)):

$$\text{Confidence} = p_o \times \max(p_c) \qquad (7)$$
In this case, class confidence essentially confirms the presence of the single class, and the overall confidence score is primarily determined by the box confidence.
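A sketch of Equation (7); the numeric inputs below are purely illustrative:

```python
# Equation (7): final detection confidence combines the objectness score
# with the best class probability; in the single-class setting, max(p_c)
# is simply the A. angustifolia probability.
def detection_confidence(p_obj, class_probs):
    return p_obj * max(class_probs)

print(detection_confidence(0.92, [0.97]))  # 0.8924 (illustrative values)
```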

3. Results

3.1. Training and Validation

In this study, 1097 images were used to train and evaluate the YOLOv7x model with the 5-fold cross-validation approach, with an average training time of 2.08 h per 300-epoch run. The mean Average Precision (AP) results for the detection of A. angustifolia are presented in Table 2.
The YOLOv7x object detector successfully predicted the presence of A. angustifolia in the study area (Table 2). The species was detected with a mean AP of 90.79%, and the model demonstrated good generalization capability (F1-score = 88.68%). Figure 3 shows the AP values and loss curve over the 300 training epochs; the AP was calculated at each epoch, and the weights were saved whenever a new maximum precision value was reached.

3.2. Test

In the initial phase of implementing YOLOv7x for the detection of A. angustifolia, 5081 of the 5335 ground-truth individuals were correctly detected as true positives, meaning the model recovered 95.24% of the individuals present in the images, while 254 individuals were missed (FN = 4.76%). At the same time, 729 false positives were recorded, i.e., cases where the model predicted A. angustifolia individuals that were not in fact present. Of these 729 false positives, 13.58% involved confusion with palm species (Arecaceae). To increase the model’s precision, a supplementary annotation strategy was implemented: additional data were carefully annotated specifically for palm trees and included in the training set. This refinement aimed to improve class separability, allowing the model to better discriminate between A. angustifolia and palm trees. Retraining the model on the refined dataset significantly reduced false positives (Figure 4), demonstrating the effectiveness of this approach for improving detection accuracy.
To better understand the prediction behavior of A. angustifolia, we divided the study area into forest areas (forest individuals) and non-forest areas (isolated individuals). The precision, recall, and F1-score results for both scenarios are presented in Table 3. Since the goal of the data refinement was to evaluate its impact on false positives, we considered only the precision metric when analyzing detection on the palm-refined dataset.
The uncertainty associated with the predictions is presented in Figure 4, which allowed for an accurate assessment of the model’s performance at different confidence levels across various environments.
Analyzing the prediction results by uncertainty level (Figure 4), most correctly predicted A. angustifolia individuals were detected with more than 70% confidence, both in the forest context (TP = 61.7%) and in the isolated-individual context (TP = 72.2%). Additionally, the results in Table 3 and Figure 4 suggest that YOLOv7x performed better in forest remnants than in built-up areas, which may indicate that the complexity of the background in built-up areas made learning more difficult, whereas forest remnants present a more homogeneous background texture.
Moreover, the model detected a considerable number of true positives even at low confidence, highlighting the importance of considering these low-confidence detections. In contrast, the number of false positives increases as uncertainty rises, which suggests caution when interpreting such detections. Notably, the model produced no false positives in the 90–100% confidence range.
We concluded our analysis by evaluating the density and spatial distribution of A. angustifolia in the city of Curitiba to demonstrate the applicability of the proposed method on a larger scale in the species’ natural occurrence region (Figure 5).
In the comprehensive analysis of the A. angustifolia distribution in Curitiba conducted with the YOLOv7x model, we identified a total of 97,232 individuals with approximately 90% accuracy. Figure 5 illustrates these results in detail, showing the tree distribution in urban forest fragments (Figure 5a) and the tree density per hectare (Figure 5b). The forest areas are markedly denser in A. angustifolia, as evidenced by both the Kernel Density map (Figure 5b) and the spatial distribution of the predicted trees (Figure 5c). Additionally, Figure 5d presents the uncertainty distribution of the predictions, where the highest confidence coincides with areas of higher tree density, suggesting high model reliability in areas with a greater concentration of native vegetation.

4. Discussion

In general, the performance of deep learning techniques tends to improve as the dataset size increases [59,60,61,62]. Although data augmentation was applied in our study, the limited initial amount of data may still have constrained the detection accuracy of A. angustifolia, which reached approximately 90% (Table 2). This rate is considered adequate, especially in the challenging contexts addressed by this study. For comparison, the MS COCO dataset [54] contains 328,000 images, with an average of 4000 images per category, which contrasts sharply with our dataset of 1097 images—roughly a quarter of that per-category average.
The overall performance of YOLOv7x in detecting A. angustifolia is consistent with previous studies highlighting the efficiency of convolutional neural networks in object recognition tasks [63,64,65]. The Average Precision of 90.79% suggests that the model is highly effective in identifying A. angustifolia, although the presence of false positives, especially in non-forest areas, indicates that the complexity of the urban background can challenge the model’s accuracy. This phenomenon is supported by studies such as [66], which show that urban areas often introduce visual noise that can confuse detection models. The complex urban context is exacerbated by shadows cast by buildings: because GE composites combine images from different sensors acquired at different times and sun angles, illumination is not uniform across captures, which poses an additional challenge for identifying A. angustifolia in satellite images (Figure 6).
These shadows, often projected by buildings, can partially or fully obscure tree canopies, complicating image interpretation and occasionally resulting in false negatives; shadow patterns can also resemble other objects. Furthermore, the shadows cast by the araucarias themselves can cause false positives, erroneously inflating the tree count. Bai et al. [67] highlighted that lighting and image quality are often more critical challenges than the limitations of the machine learning model itself. Image quality deteriorates further where both buildings and isolated trees cast shadows, adding complexity to the detection process.
Although image quality may affect detection, our results using YOLOv7x show remarkable effectiveness, achieving an average global F1-score of 0.8868. This performance indicates YOLOv7x’s robustness—particularly its precision and speed, essential characteristics for large-scale data analysis—and highlights the generalization and applicability of the proposed method for large-scale studies. In comparison, the study by [68] used Faster R-CNN and achieved notable F1-scores for specific species, such as Picea abies (F1-score = 0.86) and Pinus sylvestris (F1-score = 0.92), in multi-species configurations. Both studies also observed an increase in detection accuracy as new species were added to the models: in our case, the inclusion of palm trees improved the detection of A. angustifolia (Figure 4), while [68] reported improvements for Pinus sylvestris as new species were added to the model. This similarity reinforces the idea that models trained on a greater diversity of species may generalize more effectively, resulting in higher accuracy in complex environments.
After optimizing and selecting the best set of weights, the YOLOv7x model was applied to the entire urban area of Curitiba for a comprehensive analysis of the distribution and density of A. angustifolia individuals (Figure 5). The model identified a total of 97,232 A. angustifolia individuals, with an estimated accuracy of approximately 90%. This result not only underscores the model’s effectiveness in detecting the species at scale but also provides valuable insights into the population distribution of araucarias in the city, which is essential for future conservation initiatives and urban planning.
Additionally, the integration of the prediction data with the map of urban forest fragments, illustrated in Figure 5a, highlights a clear spatial relationship between araucaria density and the presence of urban forest remnants. The analysis reveals that the density of A. angustifolia individuals tends to be higher in forest fragments. This observation is crucial because it suggests that preservation policies for these fragments may be essential for maintaining or even increasing populations of A. angustifolia, a species of great ecological and cultural importance to the region. The observed correlation also points to the need to consider the structure and distribution of green urban spaces in city planning. Preserving and expanding forest fragments not only supports urban biodiversity but also amplifies environmental benefits, such as improved air quality and microclimatic regulation, vital ecosystem services provided by these areas. These areas also serve as significant carbon sinks, helping mitigate climate change by absorbing substantial amounts of carbon dioxide (CO2). Studies such as [10,11,12,13] reinforce the vital role these ecosystems play as natural carbon reservoirs, emphasizing the need for conservation strategies that recognize and value these functions.
Although the results obtained so far are promising for the detection of A. angustifolia, we acknowledge that the model’s accuracy could be further improved with the use of higher-resolution satellite imagery. Higher-resolution images could provide more detailed data, especially in urban areas with complex landscapes, helping to reduce detection errors and improve the overall accuracy of the model. Additionally, the use of UAV data, even for smaller areas, could offer superior spatial resolution and more focused coverage, allowing for more precise tree crown detection, especially in regions where satellite imagery may have limitations. We encourage future research to explore these avenues, as combining high-resolution satellite imagery and UAV data could further enhance the model’s performance and contribute to more accurate and reliable species detection. Furthermore, comparing the performance of different versions of YOLO or other object detection methods could provide additional insights into model efficiency and accuracy, potentially advancing the development of more robust approaches for tree crown detection and classification.

5. Conclusions

The combination of the species’ distinctive canopy structure and automatic image identification technology enabled the accurate and automatic identification of A. angustifolia individuals. In this research, we described the process of automatically detecting the species in GE images using a CNN-based detector (YOLOv7x) and validated the results using field data and Street View. Our results not only confirm YOLOv7x’s effectiveness in detecting A. angustifolia at scale, achieving a precision of 90.71% and an F1-score of 88.68%, but also offer valuable insights into the distribution and density of araucarias across Curitiba’s urban landscape. The integration of these predictive data with urban forest maps illustrates a clear spatial correlation between araucaria density and the presence of forest fragments. This correlation is crucial for informing conservation strategies, suggesting that preserving urban forest remnants could significantly enhance the sustainability of A. angustifolia populations in the city.
The study’s findings underscore the potential of automatic image-based inventories for urban forestry planning, offering a cost-effective solution for monitoring biodiversity and guiding conservation efforts. Furthermore, the model’s ability to scale to larger urban areas provides an opportunity for future applications in broader ecological monitoring. As urban environments continue to grow, understanding and conserving key species like A. angustifolia becomes ever more critical for maintaining biodiversity, improving urban resilience, and addressing climate change challenges.
Looking ahead, further refinement of detection accuracy, particularly in complex urban environments, and expanding the model to include other species, are vital next steps. These improvements will contribute to more comprehensive automated inventories and the development of strategic conservation policies that support both urban biodiversity and ecosystem services.

Author Contributions

Conceptualization, M.A.K. and A.P.D.C.; methodology, M.A.K., R.d.S.L. and A.P.D.C.; software, M.A.K.; validation, M.A.K., R.d.S.L. and E.K.M.d.S.; formal analysis, M.A.K. and R.d.S.L.; investigation, M.A.K.; resources, M.A.K.; data curation, M.A.K., R.d.S.L., E.K.M.d.S., C.R.S. and A.P.D.C.; writing—original draft preparation, M.A.K.; writing—review and editing, M.A.K., R.d.S.L., E.C.G., E.O.F., E.N.B., C.A.S., E.K.M.d.S., C.R.S. and A.P.D.C.; visualization, M.A.K.; supervision, M.A.K. and A.P.D.C.; project administration, M.A.K. All authors have read and agreed to the published version of the manuscript.

Funding

This study was partially financed by the Brazilian National Council for Scientific and Technological Development (CNPq) (A. Corte/N.402350/2021-9/N.305422/2021-9).

Data Availability Statement

The data presented in this study are available upon request from the corresponding author. The data are not publicly available due to privacy restrictions.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Dornelles, M.P.; Heiden, G.; Nic Lughadha, E.; Iganci, J. Quantifying and mapping angiosperm endemism in the Araucaria Forest. Bot. J. Linn. Soc. 2022, 199, 449–469. [Google Scholar] [CrossRef]
  2. Bogoni, J.A.; Muniz-Tagliari, M.; Peroni, N.; Peres, C.A. Testing the keystone plant resource role of a flagship subtropical tree species (Araucaria angustifolia) in the Brazilian Atlantic Forest. Ecol. Indic. 2020, 118, 106778. [Google Scholar] [CrossRef]
  3. IBGE. Produção da Extração Vegetal e da Silvicultura. 2022. Available online: https://www.ibge.gov.br/estatisticas/economicas/agricultura-e-pecuaria/9105-producao-da-extracao-vegetal-e-da-silvicultura.html (accessed on 10 July 2024).
  4. Godoy, R.C.B.d.; Negre, M.d.F.d.O.; Mendes, I.M.; Siqueira, G.L.d.A.d.; Helm, C.V. O Pinhão na Culinária; Embrapa: Brasilia, Brazil, 2013; Volume 1, 138p. [Google Scholar]
  5. Castrillon, R.G.; Helm, C.V.; Mathias, A.L. Araucaria angustifolia and the pinhão seed: Starch, bioactive compounds and functional activity—A bibliometric review. Ciência Rural 2023, 53, e20220048. [Google Scholar] [CrossRef]
  6. Zamarchi, F.; Vieira, I.C. Determination of paracetamol using a sensor based on green synthesis of silver nanoparticles in plant extract. J. Pharm. Biomed. Anal. 2021, 196, 113912. [Google Scholar] [CrossRef] [PubMed]
  7. Bogoni, J.A.; Peres, C.A.; Ferraz, K.M. Effects of mammal defaunation on natural ecosystem services and human well being throughout the entire Neotropical realm. Ecosyst. Serv. 2020, 45, 101173. [Google Scholar] [CrossRef]
  8. Ruiz, E.C.Z.; de Abreu Neto, R.; Behling, A.; Guimarães, F.A.R.; Filho, A.F. Bioenergetic use of Araucaria angustifolia branches. SSRN Electron. J. 2020, 153, 106212. [Google Scholar] [CrossRef]
  9. Dittmann, R.L.; de Souza, J.T.; Talgatti, M.; Baldin, T.; de Menezes, W.M. Stacking methods and lumber quality of Eucalyptus dunnii and Araucaria angustifolia after air drying. Sci. Agrar. Parana. 2017, 16, 260–264. [Google Scholar]
  10. Sanquetta, C.R.; Wojciechowski, J.; Corte, A.P.D.; Rodrigues, A.L.; Maas, G.C.B. On the use of data mining for estimating carbon storage in the trees. Carbon Balance Manag. 2013, 8, 6. [Google Scholar] [CrossRef]
  11. Rosenfield, M.F.; Souza, A.F. Forest biomass variation in Southernmost Brazil: The impact of Araucaria trees. Rev. Biol. Trop. 2014, 62, 359–372. [Google Scholar] [CrossRef]
  12. Roik, M.; Machado, S.d.A.; Figueiredo Filho, A.; Sanquetta, C.R.; Ruiz, E.C.Z. Aboveground biomass and organic carbon of native araucaria angustifolia (bertol.) Kuntze. Floresta Ambiente 2020, 27, e20180103. [Google Scholar] [CrossRef]
  13. Zinn, Y.L.; Fialho, R.C.; Silva, C.A. Soil organic carbon sequestration under Araucaria angustifolia plantations but not under exotic tree species on a mountain range. Rev. Bras. Ciênc. Solo 2024, 48, e0230146. [Google Scholar] [CrossRef]
  14. Scarano, F.R.; Ceotto, P. Brazilian Atlantic forest: Impact, vulnerability, and adaptation to climate change. Biodivers. Conserv. 2015, 24, 2319–2331. [Google Scholar] [CrossRef]
  15. Rezende, C.; Scarano, F.; Assad, E.; Joly, C.; Metzger, J.; Strassburg, B.; Tabarelli, M.; Fonseca, G.; Mittermeier, R. From hotspot to hopespot: An opportunity for the Brazilian Atlantic Forest. Perspect. Ecol. Conserv. 2018, 16, 208–214. [Google Scholar] [CrossRef]
  16. Ribeiro, M.C.; Metzger, J.P.; Martensen, A.C.; Ponzoni, F.J.; Hirota, M.M. The Brazilian Atlantic Forest: How much is left, and how is the remaining forest distributed? Implications for conservation. Biol. Conserv. 2009, 142, 1141–1153. [Google Scholar] [CrossRef]
  17. Castro, M.B.; Barbosa, A.C.M.C.; Pompeu, P.V.; Eisenlohr, P.V.; de Assis Pereira, G.; Apgaua, D.M.G.; Pires-Oliveira, J.C.; Barbosa, J.P.R.A.D.; Fontes, M.A.L.; dos Santos, R.M.; et al. Will the emblematic southern conifer Araucaria angustifolia survive to climate change in Brazil? Biodivers. Conserv. 2019, 29, 591–607. [Google Scholar] [CrossRef]
  18. Joly, C.A.; Metzger, J.P.; Tabarelli, M. Experiences from the Brazilian Atlantic Forest: Ecological findings and conservation initiatives. New Phytol. 2014, 204, 459–473. [Google Scholar] [CrossRef] [PubMed]
  19. Paraná. Legislação Estadual. 1995. Available online: https://leisestaduais.com.br/pr/lei-ordinaria-n-11054-1995-parana-dispoe-sobre-a-lei-florestal-do-estado (accessed on 14 January 2023).
  20. Ministério do Meio Ambiente (MMA). Portaria nº 443, de 17 de Dezembro de 2014. Available online: https://jbb.ibict.br/handle/1/672 (accessed on 1 January 2023).
  21. Martinelli, G.; Moraes, M.A. Livro vermelho da flora do Brasil; CNCFlora, Centro Nacional de Conservação da Flora do Rio de Janeiro: Rio de Janeiro, Brazil, 2013. [Google Scholar]
  22. Herrera, H.A.R.; Rosot, N.C.; Rosot, M.A.D.; de Oliveira, E. Análise florística e fitossociológica do componente arbóreo da Floresta Ombrófila Mista presente na Reserva Florestal EMBRAPA/EPAGRI, Caçador, SC-Brasil. Floresta 2009, 39, 485–500. [Google Scholar] [CrossRef]
  23. Ribeiro, S.B.; Longhi, S.J.; Brena, D.A.; Nascimento, A.R.T. Diversidade e classificação da comunidade arbórea da Floresta Ombrófila Mista da FLONA de São Francisco de Paula, RS. Ciênc. Florest. 2007, 17, 101–108. [Google Scholar] [CrossRef]
  24. Cubas, R.; Watzlawick, L.F.; Figueiredo Filho, A. Incremento, ingresso, mortalidade em um remanescente de Floresta Ombrófila Mista em Três Barras-SC. Ciênc. Florest. 2016, 26, 889–900. [Google Scholar] [CrossRef]
  25. Neto, R.M.R.; Kozera, C.; de Andrade, R.d.R.; Cecy, A.T.; Hummes, A.P.; Fritzsons, E.; Caldeira, M.V.W.; Maria de Nazaré, M.M.; de Souza, M.K.F. Caracterização florística e estrutural de um fragmento de Floresta Ombrófila Mista, em Curitiba, PR–Brasil. Floresta 2002, 32, 3–16. [Google Scholar] [CrossRef]
  26. Boldarini, F.R.; Gris, D.; de Moraes Conceição, L.H.S.; Godinho, L. Phytosociological characterization of an urban fragment of interior Araucaria forest—Paraná, Brazil. Floresta 2024, 54, e-92974. [Google Scholar] [CrossRef]
  27. Heidemann, A.S.; Pelissari, A.L.; Cysneiros, V.C.; Rodrigues, C.K. Avaliação da estrutura espacial em uma floresta urbana por meio da estimativa da densidade de Kernel. Contrib. Las Cienc. Soc. 2024, 17, 1–24. [Google Scholar] [CrossRef]
  28. WFCA-Melinda. The Evolution of Forest Inventory. 2021. Available online: https://forestbiometrics.org/references-articles/publications/the-evolution-of-forest-inventory/ (accessed on 25 September 2024).
  29. Puttemans, S.; Van Beeck, K.; Goedemé, T. Comparing boosted cascades to deep learning architectures for fast and robust coconut tree detection in aerial images. In Proceedings of the 13th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications, Madeira, Portugal, 27–29 January 2018; SCITEPRESS—Science and Technology Publications: Setubal, Portugal, 2018. [Google Scholar] [CrossRef]
  30. Zheng, J.; Li, W.; Xia, M.; Dong, R.; Fu, H.; Yuan, S. Large-Scale oil palm tree detection from high-resolution remote sensing images using Faster-RCNN. In Proceedings of the IGARSS 2019—2019 IEEE International Geoscience and Remote Sensing Symposium, Yokohama, Japan, 28 July–2 August 2019. [Google Scholar] [CrossRef]
  31. Ferreira, M.P.; Almeida, D.R.A.d.; Papa, D.d.A.; Minervino, J.B.S.; Veras, H.F.P.; Formighieri, A.; Santos, C.A.N.; Ferreira, M.A.D.; Figueiredo, E.O.; Ferreira, E.J.L. Individual tree detection and species classification of Amazonian palms using UAV images and deep learning. For. Ecol. Manag. 2020, 475, 118397. [Google Scholar] [CrossRef]
  32. Itakura, K.; Hosoi, F. Automatic Tree Detection from Three-Dimensional Images Reconstructed from 360° Spherical Camera Using YOLO v2. Remote Sens. 2020, 12, 988. [Google Scholar] [CrossRef]
  33. Sun, Y.; Hao, Z.; Guo, Z.; Liu, Z.; Huang, J. Detection and Mapping of Chestnut Using Deep Learning from High-Resolution UAV-Based RGB Imagery. Remote Sens. 2023, 15, 4923. [Google Scholar] [CrossRef]
  34. Selvaraj, M.G.; Vergara, A.; Montenegro, F.; Ruiz, H.A.; Safari, N.; Raymaekers, D.; Ocimati, W.; Ntamwira, J.; Tits, L.; Omondi, A.B.; et al. Detection of banana plants and their major diseases through aerial images and machine learning methods: A case study in DR Congo and Republic of Benin. ISPRS J. Photogramm. Remote Sens. 2020, 169, 110–124. [Google Scholar] [CrossRef]
  35. Culman, M.; Delalieux, S.; Tricht, K.V. Palm tree inventory from aerial images using retinanet. In Proceedings of the 2020 Mediterranean and Middle-East Geoscience and Remote Sensing Symposium (M2GARSS), Tunis, Tunisia, 9–11 March 2020. [Google Scholar] [CrossRef]
  36. Jintasuttisak, T.; Edirisinghe, E.; Elbattay, A. Deep neural network based date palm tree detection in drone imagery. Comput. Electron. Agric. 2022, 192, 106560. [Google Scholar] [CrossRef]
  37. Dong, C.; Cai, C.; Chen, S.; Xu, H.; Yang, L.; Ji, J.; Huang, S.; Hung, I.K.; Weng, Y.; Lou, X. Crown width extraction of Metasequoia glyptostroboides using improved YOLOv7 based on UAV images. Drones 2023, 7, 336. [Google Scholar] [CrossRef]
  38. Braga, J.R.G.; Peripato, V.; Dalagnol, R.; Ferreira, M.P.; Tarabalka, Y.; Aragão, L.E.O.C.; Velho, H.F.d.C.; Shiguemori, E.H.; Wagner, F.H. Tree Crown Delineation Algorithm Based on a Convolutional Neural Network. Remote Sens. 2020, 12, 1288. [Google Scholar] [CrossRef]
  39. Butler, D. Virtual globes: The web-wide world. Nature 2006, 439, 776–778. [Google Scholar] [CrossRef]
  40. Yu, L.; Gong, P. Google Earth as a virtual globe tool for Earth science applications at the global scale: Progress and perspectives. Int. J. Remote Sens. 2012, 33, 3966–3986. [Google Scholar] [CrossRef]
  41. Hou, H.; Chen, M.; Tie, Y.; Li, W. A universal landslide detection method in optical remote sensing images based on improved YOLOX. Remote Sens. 2022, 14, 4939. [Google Scholar] [CrossRef]
  42. Sepehry Amin, M.; Emami, H. True orthophoto generation using Google Earth imagery and comparison to UAV orthophoto. Sci.-Res. Q. Geogr. Data (SEPEHR) 2023, 32, 7–25. [Google Scholar]
  43. Sun, L.; Guo, H.; Chen, Z.; Yin, Z.; Feng, H.; Wu, S.; Siddique, K.H. Check dam extraction from remote sensing images using deep learning and geospatial analysis: A case study in the Yanhe River Basin of the Loess Plateau, China. J. Arid. Land 2023, 15, 34–51. [Google Scholar] [CrossRef]
  44. Aguiar, J.T.d.; Brun, F.G.K.; Higuchi, P.; Bobrowski, R. Although it lacks connectivity, isolated urban forest fragments can deliver similar amounts of ecosystem services as in protected areas. CERNE 2023, 29, e-103193. [Google Scholar] [CrossRef]
  45. Nowak, D.J.; Dwyer, J.F. Understanding the benefits and costs of urban forest ecosystems. In Urban and Community Forestry in the Northeast; Springer: Berlin/Heidelberg, Germany, 2007; pp. 25–46. [Google Scholar] [CrossRef]
  46. da Silva Santos, A.; de Souza, I.; de Souza, J.M.T.; Schaffrath, V.R.; Galvão, F.; Bohn Reckziegel, R. Urban Parks in Curitiba as Biodiversity Refuges of Montane Mixed Ombrophilous Forests. Sustainability 2023, 15, 968. [Google Scholar] [CrossRef]
  47. Google. Google Earth. Available online: https://www.google.com.br/earth/ (accessed on 20 January 2025).
  48. IBGE; Coordenação de Meio Ambiente. Áreas Urbanizadas do Brasil: 2019; Coleção Ibgeana; IBGE: Rio de Janeiro, Brazil, 2022; 30p. [Google Scholar]
  49. Alvares, C.A.; Stape, J.L.; Sentelhas, P.C.; Gonçalves, J.d.M.; Sparovek, G. Köppen’s climate classification map for Brazil. Meteorol. Z. 2013, 22, 711–728. [Google Scholar] [CrossRef]
  50. Instituto Nacional de Meteorologia. Instituto Nacional de Meteorologia, n.d. Available online: https://portal.inmet.gov.br/normais (accessed on 16 January 2025).
  51. Thang, Q.D. HCMGIS: Plugin for QGIS 3. 2021. Available online: https://plugins.qgis.org/plugins/HCMGIS/ (accessed on 20 February 2025).
  52. QGIS Development Team. QGIS Geographic Information System. Open Source Geospatial Foundation (OSGeo), 2024. Version 3.32. Available online: https://www.osgeo.org/ (accessed on 3 January 2024).
  53. Tzutalin. LabelImg. Git Code 2015. Available online: https://github.com/tzutalin/labelImg (accessed on 20 January 2023).
  54. Lin, T.Y.; Maire, M.; Belongie, S.; Hays, J.; Perona, P.; Ramanan, D.; Dollár, P.; Zitnick, C.L. Microsoft COCO: Common objects in context. In Proceedings of the Computer Vision–ECCV 2014: 13th European Conference, Proceedings, Part V, Zurich, Switzerland, 6–12 September 2014; Springer: Berlin/Heidelberg, Germany, 2014; pp. 740–755. [Google Scholar] [CrossRef]
  55. Wang, C.Y.; Bochkovskiy, A.; Liao, H.Y.M. YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Vancouver, BC, Canada, 17–24 June 2023; pp. 7464–7475. [Google Scholar] [CrossRef]
  56. Powers, D.M. Evaluation: From precision, recall and F-measure to ROC, informedness, markedness and correlation. arXiv 2020, arXiv:2010.16061. [Google Scholar]
  57. Everingham, M.; Van Gool, L.; Williams, C.K.I.; Winn, J.; Zisserman, A. The pascal visual object classes (VOC) challenge. Int. J. Comput. Vis. 2010, 88, 303–338. [Google Scholar] [CrossRef]
  58. Anthony, M.; Holden, S.B. Cross-validation for binary classification by real-valued functions: Theoretical analysis. In Proceedings of the 11th Annual Conference on Computational Learning Theory, Madison, WI, USA, 24–26 July 1998; pp. 218–229. [Google Scholar]
  59. Antoniou, A. Data Augmentation Generative Adversarial Networks. arXiv 2017, arXiv:1711.04340. [Google Scholar]
  60. Perez, L.; Wang, J. The Effectiveness of Data Augmentation in Image Classification using Deep Learning. arXiv 2017, arXiv:1712.04621. [Google Scholar]
  61. Perez Malla, C.U.; Valdes Hernandez, M.d.C.; Rachmadi, M.F.; Komura, T. Evaluation of enhanced learning techniques for segmenting ischaemic stroke lesions in brain magnetic resonance perfusion images using a convolutional neural network scheme. Front. Neuroinform. 2019, 13, 33. [Google Scholar] [CrossRef]
  62. Hao, W.; Zhili, S. Improved mosaic: Algorithms for more complex images. J. Phys. Conf. Ser. 2020, 1684, 012094. [Google Scholar] [CrossRef]
  63. Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You only look once: Unified, real-time object detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016. [Google Scholar] [CrossRef]
  64. Redmon, J.; Farhadi, A. YOLO9000: Better, faster, stronger. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 7263–7271. [Google Scholar] [CrossRef]
  65. Bochkovskiy, A.; Wang, C.Y.; Liao, H.Y.M. Yolov4: Optimal speed and accuracy of object detection. arXiv 2020, arXiv:2004.10934. [Google Scholar]
  66. Chen, X.; Xiang, S.; Liu, C.L.; Pan, C.H. Vehicle detection in satellite images by hybrid deep convolutional neural networks. IEEE Geosci. Remote Sens. Lett. 2014, 11, 1797–1801. [Google Scholar] [CrossRef]
  67. Bai, T.; Wang, L.; Yin, D.; Sun, K.; Chen, Y.; Li, W.; Li, D. Deep learning for change detection in remote sensing: A review. Geo-Spat. Inf. Sci. 2023, 26, 262–288. [Google Scholar] [CrossRef]
  68. Beloiu, M.; Heinzmann, L.; Rehush, N.; Gessler, A.; Griess, V.C. Individual tree-crown detection and species identification in heterogeneous forests using aerial RGB imagery and deep learning. Remote Sens. 2023, 15, 1463. [Google Scholar] [CrossRef]
Figure 1. Location of the study area in the city of Curitiba, Paraná, Brazil. The highlighted neighborhoods (Batel, Centro, Jardim Botânico, Jardim das Américas, Rebouças, and Santa Felicidade) were used to train and test the YOLOv7x model. The gray area indicates regions where the available images did not have the same quality as the others and, therefore, were not included in the study.
Figure 2. Components of a bounding box. (bx, by) represent the X and Y coordinates of the center of the bounding box; w represents the width and h the height of the bounding box.
Figure 3. Learning curve performance of YOLOv7x in the detection of A. angustifolia in the city of Curitiba, Paraná, Brazil.
Figure 4. Frequency distribution of detections by confidence level for individuals classified as forest and isolated individuals.
Figure 5. Overview of A. angustifolia distribution by YOLOv7x in Curitiba, Paraná. (a) Forest areas. (b) Kernel Density Map (trees/ha). (c) Predicted trees. (d) Uncertainty distribution for predicted trees.
Figure 6. Examples of prediction results: (a) Detection in the context of isolated trees. (b) Detection in forest fragments. (c) Example of a false negative caused by building shadows. (d) Example of a false positive due to confusion with palm trees. (e) Example of a false positive caused by confusion with the shadow projection of an A. angustifolia.
Table 1. YOLOv7x configuration applied to the A. angustifolia detection model training.
| Setting | Value | Setting | Value |
|---|---|---|---|
| batch | 8 | anchor_t | 4.0 |
| width | 640 | fl_gamma | 0.0 |
| height | 640 | hsv_h | 0.015 |
| channels | 3 | hsv_v | 0.4 |
| momentum | 0.937 | hsv_s | 0.7 |
| decay | 0.0005 | degrees | 0.0 |
| warmup_momentum | 0.8 | translate | 0.2 |
| warmup_bias_lr | 0.1 | scale | 0.5 |
| lr0 | 0.01 | shear | 0.0 |
| lrf | 0.1 | perspective | 0.0 |
| box | 0.05 | flipud | 0.0 |
| cls | 0.3 | fliplr | 0.5 |
| cls_pw | 1.0 | mosaic | 1.0 |
| obj | 0.7 | mixup | 0.0 |
| obj_pw | 1.0 | train_size | 878 |
| iou_t | 0.2 | validation_size | 219 |
batch—batch size; width—image width in pixels; height—image height in pixels; channels—RGB channels; momentum—how strongly the gradient history affects subsequent weight updates; decay—weight decay (L2 regularization); warmup_momentum—initial momentum during warmup; warmup_bias_lr—initial bias learning rate during warmup; lr0—initial learning rate; lrf—final learning rate fraction (final rate = lr0 × lrf); box—box loss gain; cls—class loss gain; cls_pw—class loss positive weight; obj—object loss gain; obj_pw—object loss positive weight; iou_t—IoU training threshold; anchor_t—multiple anchor threshold; fl_gamma—focal loss gamma; hsv_h—HSV-Hue augmentation; hsv_v—HSV-Value augmentation; hsv_s—HSV-Saturation augmentation; degrees—random rotation range (clockwise and counterclockwise); translate—image translation augmentation; scale—scaling factor; shear—shear augmentation; perspective—perspective augmentation; flipud—flips the image up and down; fliplr—flips the image left and right; mosaic—combination of four images; mixup—mixup augmentation; train_size—training dataset size; validation_size—validation dataset size.
Table 2. Performance of YOLOv7x in the detection of A. angustifolia: Average Precision (AP), precision, recall, and F1-score per fold in the 5-fold cross-validation process.
| Metric | 1-Fold | 2-Fold | 3-Fold | 4-Fold | 5-Fold | Average |
|---|---|---|---|---|---|---|
| Precision | 0.8963 | 0.9319 | 0.8951 | 0.9149 | 0.8972 | 0.9071 |
| Recall | 0.8355 | 0.8671 | 0.8610 | 0.8757 | 0.8989 | 0.8676 |
| F1-score | 0.8648 | 0.8983 | 0.8777 | 0.8949 | 0.8980 | 0.8868 |
| AP | 0.8883 | 0.9119 | 0.9041 | 0.9095 | 0.9257 | 0.9079 |
AP—Average Precision computed over all labels; F1-score—harmonic mean of precision and recall.
Table 3. Precision, recall, and F1-score for different scenarios of A. angustifolia detection.
| Scenario | Precision | Recall | F1-Score |
|---|---|---|---|
| Isolated | 0.8379 | 0.9432 | 0.8874 |
| Forest | 0.9129 | 0.9614 | 0.9365 |
| Isolated + Palm Refined | 0.8543 | – | – |
| Forest + Palm Refined | 0.9136 | – | – |