Article

Hierarchical Deep Learning Framework for Mapping Honey-Producing Tree Species in Dense Forest Ecosystems Using Sentinel-2 Imagery

by Athanasios Antonopoulos 1,*, Tilemachos Moumouris 2, Vasileios Tsironis 2, Athena Psalta 2, Evangelia Arapostathi 1, Antonios Tsagkarakis 1, Panayiotis Trigas 3, Paschalis Harizanis 1 and Konstantinos Karantzalos 2

1 Laboratory of Sericulture and Apiculture, Agricultural University of Athens, 11855 Athens, Greece
2 Remote Sensing Laboratory, National Technical University of Athens, Iroon Polytechneiou 9, 15780 Athens, Greece
3 Laboratory of Systematic Botany, Agricultural University of Athens, 11855 Athens, Greece
* Author to whom correspondence should be addressed.
Agronomy 2025, 15(12), 2858; https://doi.org/10.3390/agronomy15122858
Submission received: 4 November 2025 / Revised: 27 November 2025 / Accepted: 10 December 2025 / Published: 12 December 2025
(This article belongs to the Special Issue Digital Twins in Precision Agriculture)

Abstract

The sustainability of apiculture within Mediterranean forest ecosystems is contingent upon the extent and health of melliferous tree habitats. This study outlines a five-year initiative (2020–2024) aimed at mapping and monitoring four principal honey-producing tree species—pine (Pinus halepensis and Pinus nigra), Greek fir (Abies cephalonica), oak (Quercus ithaburensis subsp. macrolepis), and chestnut (Castanea sativa)—across Evia, Greece. This is achieved through the utilization of high-resolution Sentinel-2 satellite imagery in conjunction with a hierarchical deep learning framework. Distinct from prior vegetation mapping endeavors, this research introduces an innovative application of a hierarchical framework for species-level semantic segmentation of apicultural flora, employing a U-Net convolutional neural network to capture fine-scale spatial and temporal dynamics. The proposed framework first stratifies forests into broadleaf and coniferous types using Copernicus DLT data, and subsequently applies two specialized U-Net models trained on Sentinel-2 NDVI time series and DEM-derived topographic variables to (i) discriminate pine from fir within coniferous forests and (ii) distinguish oak from chestnut within broadleaf stands. This hierarchical decomposition reduces spectral confusion among structurally similar species and enables fine-scale semantic segmentation of apicultural flora. Our hierarchical framework achieves 92.1% overall accuracy, significantly outperforming traditional multiclass approaches (89.5%) and classical ML methods (76.9%). The results demonstrate the framework’s efficacy in accurately delineating species distributions, quantifying the ecological and economic impacts of the catastrophic 2021 forest fires, and projecting long-term habitat recovery trajectories. 
The integration of a novel hierarchical approach with Deep Learning-driven monitoring of climate- and disturbance-driven changes in honey-producing habitats marks a significant step towards more effective assessment and management of four major beekeeping tree species. These findings highlight the significance of such methodologies in guiding conservation, restoration, and adaptive management strategies, ultimately supporting resilient apiculture and safeguarding ecosystem services in fire-prone Mediterranean landscapes.

1. Introduction

The honeybee (Apis spp. Linnaeus (Hymenoptera: Apidae)) is recognized as a keystone species that exerts significant impacts on both agricultural production and the maintenance of natural ecosystems through its pollination services [1,2]. The ecological services provided by honeybees extend beyond direct pollination, influencing plant-pollinator network structures and contributing to ecosystem stability and biodiversity conservation [3,4]. Even so, the persistence of healthy bee populations has been shown to be inextricably linked to the availability and quality of suitable foraging habitats [5,6]. The honeybee's role as an effective pollinator has been demonstrated across a wide range of crops and wild plants, with studies showing substantial improvements in fruit set and yield following, for example, Apis mellifera pollination [7,8,9]. Research has documented that pollinator populations are declining worldwide due to multiple factors including habitat loss, climate change, pesticide use, and ecosystem fragmentation [6,10]. Considering that 87 of the leading global food crops rely on pollinators for seed production, accounting for about 35% of global crop production intended for human consumption, pollinator decline not only significantly impacts beekeeping but also negatively affects biodiversity conservation, reduces crop yields, and threatens food security [6,11].
Evia is distinguished as one of Greece’s most important beekeeping regions, recognized for its rich tradition in apiculture and exceptional production of high-quality pine honey [12,13]. Evia’s forests, which primarily consist of pine (Pinus spp.), Greek fir (Abies cephalonica), oak (Quercus spp.), and chestnut (Castanea sativa), provide essential melliferous resources that support both the productivity and unique character of Greek honey production [12,14]. Pine forests are particularly significant, as pine honey produced from the honeydew secretions of the endemic insect Marchalina hellenica represents approximately 60% of Greece’s annual honey production and over 70% of total honeydew honey production in the country [12,13]. Despite its ecological and economic importance, the beekeeping sector in Evia faces mounting threats from environmental changes and increasing disturbance frequency.
In Mediterranean regions, the frequency and severity of forest fires have increased dramatically [15,16], with the catastrophic 2021 forest fires destroying approximately 517 km2 (51,700 hectares) of northern Evia’s forests [17,18]. Such large-scale disturbances result in substantial losses of bee habitats, directly reducing the availability of floral resources and causing significant economic impacts for local beekeepers [19,20]. These effects are further compounded by recurrent droughts, land-use changes, and habitat fragmentation, all of which threaten the long-term sustainability of apiculture in fire-prone Mediterranean landscapes [5,6]. Given these challenges, the protection and systematic monitoring of melliferous habitats have become critically important for maintaining honeybee populations, supporting rural economies, and preserving essential ecosystem services [21,22]. Remote sensing technologies, particularly high-resolution, multi-temporal satellite imagery, offer powerful tools for tracking changes in habitat extent and quality over time [23,24].
The European Space Agency’s Sentinel-2 mission, with its frequent revisit capabilities and detailed spectral information, has been demonstrated to enable effective discrimination of key honey-producing tree species and detection of habitat loss due to disturbances such as forest fires [25,26,27]. Studies have shown that Sentinel-2 data can achieve high classification accuracies (>90%) for forest monitoring applications, particularly when combined with advanced machine learning approaches [24,25,27].
The integration of artificial intelligence, specifically deep learning architectures, with satellite remote sensing has emerged as a promising approach for forest monitoring and species classification [28,29]. The U-Net convolutional neural network architecture, originally developed for biomedical image segmentation [30], has been proven highly effective for various image segmentation tasks due to its encoder-decoder structure with skip connections [31,32,33]. This architecture has been successfully adapted for environmental monitoring applications, demonstrating strong performance in complex classification scenarios [34,35].
Recent studies have shown that combining structural information from airborne LiDAR with very high-resolution multispectral imagery can significantly improve tree-species mapping. Methods that exploit detailed three-dimensional canopy structure and intensity metrics from LiDAR at the individual tree-crown level have been used to enhance species discrimination and to better handle challenges such as class imbalance and label noise in training data [36]. Complementarily, approaches based on object-oriented classification of high-resolution, multi-band satellite imagery (e.g., WorldView-2) have demonstrated that segmenting sunlit tree crowns and using ensemble learning techniques such as Random Forests can yield high accuracies and highlight the benefits of object-based over pixel-based analysis for species differentiation [37]. Together, these studies illustrate the complementary strengths of structural LiDAR information and very high-resolution spectral data for improving tree-species classification.
Recent work on tree species classification with remote sensing has advanced along complementary directions of data fusion, self-/semi-supervised learning, careful data preparation, and hybrid (learning + symbolic) methods. Zhong et al. [38] provide a comprehensive review of the field, showing that hyperspectral imagery (HSI), LiDAR, very-high-resolution (VHR), and RGB data (and multimodal combinations thereof) dominate studies. They highlight that deep learning architectures and multimodal fusion have become central trends in advancing classification accuracy and scalability.
Building on the need to combine detection and labeling, Harmon et al. [39] propose a neuro-symbolic framework that jointly performs crown delineation and species classification. Their approach integrates learned representations with symbolic/structural reasoning, improving interpretability and leveraging crown-level context to enhance classification performance.
On the representation-learning side, Wang et al. [40] introduce M-SSL, a pixel-level multisource self-supervised learning approach. This method employs MAAE/MVAE encoders and depth-wise cross-attention mechanisms to leverage HSI/MSI data and contrastive/generative pretext tasks. By doing so, it reduces annotation requirements while improving the quality of downstream species maps.
Complementing these modeling advances, Ni-Meister et al. [41] emphasize the importance of careful data preparation and preprocessing for hyperspectral data. They demonstrate that steps such as denoising, dimensionality reduction, and thoughtful training/validation design materially affect classifier performance and should be prioritized alongside algorithmic improvements.
Although there is increasing interest in applications of remote sensing in apiculture, such as mapping areas suitable for apiary installation for honey production, overwintering of colonies, and optimization of other beekeeping practices [42,43,44,45,46,47], avoiding honeybee colony losses [48], and detecting and mapping potential propolis plant sources [49], there are only a handful of studies on the mapping and monitoring of beekeeping flora worldwide [50,51,52]. In the existing literature, both satellite and UAV (Unmanned Aerial Vehicle) data have been used successfully for detecting, mapping and monitoring vegetation with apicultural value. Adgaba et al. (2017) [52] identified and mapped 182 plant species, including Zizyphus spina-christi, Acacia tortilis, Acacia origina, Lavandula dentata which have significant value for beekeeping in Saudi Arabia, using satellite data processed with Hopfield Artificial Neural Network techniques.
In Greece, Papachristoforou et al. (2023) [51] combined UAV data with vegetation indices from satellite data of Google Earth Engine in order to identify and distinguish two plant species, one of apicultural importance (Thymbra capitata) and one not (Sarcopoterium spinosum) in Lemnos island, achieving overall accuracy of 98% with a Random Forest model. Fir, pine and oak forests were previously mapped for the Greek mainland and islands for 2019 by Antonopoulos et al. (2023) [50] using medium-resolution satellite data (MODIS). Compared with very-high-resolution UAV imagery, Sentinel-2 imagery has a lower spatial resolution, leading to mixed pixels that contain multiple land cover types and can degrade vegetation index accuracy; in exchange, Sentinel-2 allows much broader area coverage at frequent revisit intervals.
The novelty of this work lies in the integration of a novel hierarchical strategy over an extended period of time for the effective monitoring of four significant beekeeping tree species. By combining monthly Sentinel-2 composites with a Deep Learning (DL) hierarchical segmentation framework over 2020–2024, we generate high-temporal-resolution, species-specific maps and change analyses for the honey-producing tree habitats under study in Evia, Greece. The resulting maps and analyses, including fine-scale habitat distributions, temporal trajectories, and quantified post-fire losses, deliver directly actionable information for local beekeeping management, fire-impact assessment, and restoration prioritization. The findings underscore the urgent need for targeted habitat protection and restoration to support sustainable beekeeping and safeguard the ecosystem services upon which both natural systems and local communities depend.

2. Materials and Methods

2.1. Study Area

The island of Evia (Euboea) is the second-largest island in Greece, located off the eastern coast of Central Greece in the Aegean Sea. Geographically, Evia extends approximately between latitudes 38.3° N and 39.2° N and longitudes 22.8° E and 24.1° E, covering a total land area of about 3684 km2. The island’s topography is notably varied, ranging from low-lying coastal plains to rugged mountainous regions, with altitudes spanning from sea level up to approximately 1743 m at Mount Dirfi, its highest peak [53]. Evia’s diverse landscape forms a complex Eastern Mediterranean ecosystem. The island features a mosaic of habitats, including dense coniferous and broadleaved forests, shrublands, agricultural areas, and wetlands. The climate is predominantly Mediterranean, characterized by hot, dry summers and mild, wet winters, though significant microclimatic variation exists due to the island’s elevation gradients and proximity to the sea.
Forests cover a significant portion of Evia’s land [54]. They are ecologically and economically important, supporting local industries such as timber and resin harvesting. Notably, Evia’s forests are a vital resource for apiculture, with pine honey production being especially prominent. The health and distribution of four dominant tree species, the coniferous pines (Pinus halepensis and Pinus nigra) and Greek fir (Abies cephalonica), as well as the broadleaved oaks (Quercus ithaburensis subsp. macrolepis) and chestnut (Castanea sativa), are directly linked to the region’s honey production and biodiversity. The forested landscape of Evia is highly heterogeneous, with pure and mixed stands of the dominant species distributed across the island’s varied terrain [50]. While detailed forest inventory statistics for Evia are less extensive than for some larger European regions, local forestry records indicate that pine forests (primarily Pinus halepensis and Pinus nigra) dominate the lowland and mid-elevation zones, while Greek fir and chestnut are more prevalent at higher altitudes and humid areas. Oak species are widespread, often forming mixed stands with other broadleaves or conifers.
This environmental diversity, combined with the economic importance of forest products, makes Evia a key region for apiculture in Greece. The island’s forests are not only a source of timber and resin but are also closely linked to long-standing local beekeeping traditions. In particular, the production of pine honey, one of the most characteristic products of Greek apiculture, for which Evia is widely recognized, depends directly on the extent and condition of its pine stands. Greek fir, oak, and chestnut likewise provide important nectar and honeydew flows, contributing distinct honey types and extending the foraging season for bees, thereby supporting the diversity and resilience of local apicultural practices [55]. Taken together, these factors underscore Evia’s prominent role in national honey production and highlight the need for effective monitoring strategies for its honey-producing forests.
Given the dense and heterogeneous nature of Evia’s forests, accurately monitoring the distribution and condition of these key tree species is essential for sustaining honey yields and supporting the livelihoods of local beekeepers. The strong link between forest composition and apicultural productivity means that changes in tree species distribution, whether due to natural dynamics or anthropogenic pressures, can have immediate and significant impacts on honey production. For these reasons, Evia provides an ideal setting to develop and apply advanced remote sensing techniques tailored to the needs of the apiculture sector, with the goal of enabling more informed and sustainable management of both forest and beekeeping resources.

2.2. Hierarchical Classification Methodology

To address the inherent complexity of mapping spectrally similar tree species across the heterogeneous landscape of Evia, a hierarchical segmentation framework was adopted. This methodological choice was motivated by the need to decompose a challenging multi-class problem, where spectral overlap and landscape variability can confound direct classification, into a series of more tractable, sequential tasks. By structuring the workflow into discrete stages, each tailored to specific ecological or spectral distinctions, the approach enables the integration of specialized models and the exploitation of high-confidence, existing data products [56], thereby enhancing overall classification accuracy and robustness. The hierarchical workflow comprises the sequential steps illustrated in Figure 1.

2.2.1. Forest Area Delineation

The initial stage of the hierarchical workflow involves the precise delineation of forested areas within the region of interest by harmonizing and integrating authoritative data sources, specifically official national-level governmental forest maps and detailed forest-type maps from local forestry authorities, which include, among other categories, areas with vegetation characterized as shrubs. The resulting forest mask ensures that all subsequent analyses are spatially constrained to relevant forested pixels, minimizing the risk of misclassification in non-forested areas and providing a consistent analytical baseline.

2.2.2. Broadleaf-Coniferous Forest-Type Stratification

Within the delineated forest extent, a further stratification is performed to distinguish between the two principal ecological categories: coniferous and broadleaved forests. This step leverages the High Resolution Layer Tree Cover and Forest product from the Copernicus Land Monitoring Service (CLMS), which offers a reliable, standardized baseline for differentiating these major forest types at a continental scale [57,58]. Specifically, the Dominant Leaf Type (DLT) dataset was used to differentiate the tree cover of the ROI into broadleaf trees, coniferous trees, and non-tree-covered areas. The spatial resolution of the dataset is 10 m, and versions are provided for successive years starting from 2018. The DLT dataset is produced using temporal convolutional neural networks trained on Sentinel-2 time series, with training data sourced from LUCAS 2018, existing CLMS products, and expert interpretation of VHR and Sentinel-2 imagery. Post-classification, spatial and temporal filtering ensures high consistency across the time series while preserving real land cover changes, targeting at least 90% producer and user accuracy. The dataset continues the HRL Dominant Leaf Type time series, with improvements in models and input data over time [59,60,61]. The stratification not only reflects ecological realities but also serves to partition the dataset for subsequent, more specialized classification.
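The stratification logic reduces to a couple of raster mask operations. The sketch below assumes illustrative DLT class codes, not the official CLMS legend, which should be checked against the product documentation:

```python
import numpy as np

# Illustrative DLT codes (check the official CLMS legend before use):
# 0 = no tree cover, 1 = broadleaf, 2 = coniferous.
def stratify(dlt, forest_mask):
    """Split a forest mask into broadleaf and coniferous strata."""
    broadleaf = forest_mask & (dlt == 1)
    coniferous = forest_mask & (dlt == 2)
    return broadleaf, coniferous

dlt = np.array([[0, 1, 2],
                [1, 1, 2],
                [2, 0, 0]])
forest = np.array([[False, True, True],
                   [True, False, True],
                   [True, True, False]])
bl, cf = stratify(dlt, forest)  # disjoint broadleaf/coniferous masks
```

Because the forest mask from Section 2.2.1 is intersected with the DLT codes, the two strata are disjoint by construction, so each downstream species model sees only pixels from its own stratum.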

2.2.3. Species-Level Semantic Segmentation

Building upon the broad forest-type stratification, the final and most granular classification stage employs deep learning–based semantic segmentation. Two independent models are developed, each optimized for a specific forest type: one model is trained exclusively within the coniferous stratum to discriminate between Greek fir (Abies cephalonica) and pine (Pinus spp.), while the other operates within the broadleaf stratum to distinguish chestnut (Castanea sativa) from oak (Quercus spp.). This targeted approach allows each model to focus on subtle spectral and phenological differences relevant to its respective species pair, which would be more difficult to resolve in a single, flat multi-class model.
For the core task of species semantic segmentation, we employed a U-Net–type fully convolutional neural network [30]. The network accepts a multi-channel input tensor of size C × H × W, where C = 13 corresponds to the 12 stacked monthly spectral-index layers plus the DEM channel. The architecture follows a three-level encoder–decoder with symmetric skip connections; its detailed specification, including the number of layers, filter sizes, activation functions, and channel dimensions at each stage, is provided in Table 1.
Training is performed using binary cross-entropy loss. Background pixels are explicitly ignored by computing the loss only over pixels belonging to one of the two target species, and labels within this subset are mapped to a binary scheme for the two species in each stratum. We also experimented with a weighted cross-entropy formulation to further address imbalance, but it did not consistently improve performance over the unweighted loss under our stratified, two-class setup. Consequently, we adopted the standard (unweighted) binary cross-entropy with masked background as our final configuration.
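The masked loss described above can be sketched in NumPy. This is a minimal illustration, assuming labels 0 (background, ignored), 1 and 2 for the two species in a stratum, and per-pixel sigmoid probabilities for the second species:

```python
import numpy as np

def masked_bce(probs, labels, eps=1e-7):
    """Binary cross-entropy computed over annotated pixels only.

    labels: 0 = background (ignored), 1 = first species, 2 = second species.
    probs:  predicted probability of the *second* species per pixel.
    """
    valid = labels > 0                       # drop background pixels
    y = (labels[valid] == 2).astype(float)   # map {1, 2} -> {0, 1}
    p = np.clip(probs[valid], eps, 1 - eps)  # numerical stability
    return float(-np.mean(y * np.log(p) + (1 - y) * np.log(1 - p)))

labels = np.array([[0, 1], [2, 2]])
probs = np.array([[0.9, 0.1], [0.8, 0.7]])
loss = masked_bce(probs, labels)  # background pixel (0.9) has no effect
```

Note that the confident background prediction (0.9 at the ignored pixel) contributes nothing to the loss, which is exactly the masking behavior described in the text.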

2.3. Composite Species Map Generation

The outputs from the two species-level models are spatially integrated using the stratification masks established in the earlier steps. This hierarchical fusion yields a single, seamless composite map detailing the distribution of the four target species across the entire island, reflecting both the ecological structure of Evia’s forests and the fine-scale spatial heterogeneity of species composition.
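Assuming the two binary model outputs and the stratification masks are co-registered rasters, the fusion step can be sketched as follows (the class codes are illustrative placeholders, not the paper's legend):

```python
import numpy as np

# Illustrative class codes for the composite map:
# 0 = background, 1 = pine, 2 = fir, 3 = oak, 4 = chestnut.
def fuse(conifer_pred, broadleaf_pred, conifer_mask, broadleaf_mask):
    """Merge the two binary stratum predictions into one four-species map."""
    out = np.zeros(conifer_pred.shape, dtype=np.uint8)
    out[conifer_mask] = 1 + conifer_pred[conifer_mask]        # 0/1 -> pine/fir
    out[broadleaf_mask] = 3 + broadleaf_pred[broadleaf_mask]  # 0/1 -> oak/chestnut
    return out

conifer_pred = np.array([[0, 1], [0, 0]], dtype=np.uint8)
broadleaf_pred = np.array([[0, 0], [1, 0]], dtype=np.uint8)
conifer_mask = np.array([[True, True], [False, False]])
broadleaf_mask = np.array([[False, False], [True, True]])
species = fuse(conifer_pred, broadleaf_pred, conifer_mask, broadleaf_mask)
```

Because the strata are disjoint, each pixel receives exactly one species label, and pixels outside both masks remain background.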

2.4. Data Acquisition and Training Dataset Development

2.4.1. Satellite and Ancillary Data

The core of our analysis relies on time-series data from the European Space Agency’s Sentinel-2 mission, accessed via the CLMS Mosaic Service [57,58]. We utilized complete annual sets of monthly, cloud-free Level-2A mosaics. These products are atmospherically corrected to surface reflectance and pre-filtered for clouds and cloud shadows, providing analysis-ready data at a 10 m spatial resolution in the visible and near-infrared bands. For each month and year, only pixels classified as cloud-free and of high radiometric quality in the CLMS processing chain were retained, ensuring consistent temporal stacks suitable for phenology-based analysis.
This dense time series is critical for capturing the distinct phenological signatures that help differentiate between tree species. As an essential ancillary dataset, a Digital Elevation Model (DEM) from the Shuttle Radar Topography Mission (SRTM), with a native resolution of 30 m, was incorporated to model the strong influence of topography and elevation on species distribution patterns [62]. To ensure spatial consistency with the Sentinel-2 imagery, the DEM was reprojected to 10 m using bilinear interpolation, thereby preserving elevation values while achieving pixel-to-pixel alignment with the spectral data.
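In practice the DEM resampling would be done with a GIS library such as GDAL or rasterio; the sketch below shows only the bilinear interpolation itself on a toy grid, assuming an integer upsampling factor (30 m to 10 m corresponds to a factor of 3):

```python
import numpy as np

def bilinear_upsample(grid, factor):
    """Resample a coarse grid to a finer one (e.g., 30 m DEM -> 10 m)
    by bilinear interpolation, preserving the input value range."""
    h, w = grid.shape
    # Fractional source coordinates for each output pixel.
    ys = np.linspace(0, h - 1, h * factor)
    xs = np.linspace(0, w - 1, w * factor)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    top = grid[np.ix_(y0, x0)] * (1 - wx) + grid[np.ix_(y0, x1)] * wx
    bot = grid[np.ix_(y1, x0)] * (1 - wx) + grid[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

dem30 = np.array([[0.0, 30.0], [60.0, 90.0]])
dem10 = bilinear_upsample(dem30, 3)  # 2x2 grid -> 6x6 grid
```

Unlike nearest-neighbor resampling, the interpolated values vary smoothly between the original samples and never exceed the input range, which is why bilinear interpolation is a natural choice for a continuous field such as elevation.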

2.4.2. Ground Truth and Training Data Generation

A cornerstone of this study was the construction of a high-fidelity training dataset, in which geospatial polygons of forest stands supplied by local forestry authorities were rigorously refined by domain experts to ensure accurate and representative samples for model training. Through careful photo-interpretation of the Sentinel-2 time-series imagery, experts analyzed canopy texture, seasonal color changes, and phenological patterns to validate and adjust the polygon boundaries. A critical criterion for inclusion in the final training set was species density; we exclusively selected polygons that represented medium- to high-density stands of a single target species. This step was vital for minimizing the inclusion of mixed pixels and ensuring the models were trained on pure, unambiguous examples of each class. To provide a final layer of high-confidence validation, we conducted a targeted field campaign using a DJI Mavic 3 Multispectral UAV (DJI, Shenzhen, China).
Flights were performed over key sample areas to collect ultra-high-resolution imagery, as seen in Figure 2. The selected flight height was 100 m, and the resulting Ground Sample Distance (GSD) was 2.66 cm/px. Moreover, forward and side image overlap was set to 80%, ensuring a high-quality result. These data were processed into orthomosaics using Agisoft Metashape (https://www.agisoft.com/), providing a clear ground-truth reference (Figure 2) for validating the accuracy of our training labels.
In total, the final dataset comprised approximately 9,548 ha of pine, 6,427 ha of fir, 34,738 ha of oak, and 25,198 ha of chestnut. These annotated areas were spatially distributed to cover the full range of environmental conditions within the study area, including core stands and transition zones, thereby enhancing model generalization. The data were divided into training and validation subsets in an 80–20 split. This approach prevented data leakage by ensuring no part of a stand appeared in both sets and maintained proportional representation of each tree species across splits.
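A leakage-free split of this kind is performed at the stand (polygon) level rather than the pixel level, so that all pixels of a stand land in exactly one subset. A simplified sketch, with hypothetical stand identifiers and a fixed random seed:

```python
import random

def stand_level_split(stands, train_frac=0.8, seed=42):
    """Split annotated stands into train/val per species so that no stand
    contributes pixels to both sets (avoids spatial leakage).

    stands: list of (stand_id, species) pairs.
    """
    by_species = {}
    for stand_id, species in stands:
        by_species.setdefault(species, []).append(stand_id)
    train, val = [], []
    rng = random.Random(seed)
    for species, ids in by_species.items():
        rng.shuffle(ids)
        k = int(round(train_frac * len(ids)))  # per-species 80/20
        train += ids[:k]
        val += ids[k:]
    return set(train), set(val)

stands = [(i, "pine") for i in range(10)] + [(i + 10, "fir") for i in range(10)]
train, val = stand_level_split(stands)
```

Splitting within each species separately preserves the proportional class representation mentioned in the text, while the stand-level granularity enforces the no-leakage constraint.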

2.5. Feature Engineering and Data Preprocessing

2.5.1. Spectral Index Selection

To augment the raw spectral information and enhance the model’s ability to discriminate between species, we engineered a set of features based on well-established Spectral Indices (SIs). These indices distill spectral information into metrics directly related to vegetation health, structure, and moisture content. SIs used in this study were selected based on prior studies [27] which suggest that different phenological stages can be effectively distinguished through variations in chlorophyll content and associated changes in reflectance, particularly in the near-infrared region.
A comparative study was performed to select the most effective indices, with the final selection based on a trade-off between model performance and computational cost. The indices considered are detailed in Table 2. Their combined use provides a multi-faceted view of the vegetation, capturing information on chlorophyll content (MCARI), canopy water stress (NDMI), and overall vigor (NDVI, EVI), which are all key differentiating factors between the target species [63].
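For reference, the indices in Table 2 reduce to simple band arithmetic on surface reflectance. The Sentinel-2 band assignments in the comments are standard, but the reflectance values below are illustrative only:

```python
import numpy as np

# Sentinel-2 band mapping: blue = B2, green (~550 nm) = B3,
# red (~670 nm) = B4, red edge (~705 nm) = B5, NIR = B8, SWIR = B11.
def ndvi(nir, red):
    return (nir - red) / (nir + red)

def evi(nir, red, blue):
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

def savi(nir, red, L=0.5):
    return (1.0 + L) * (nir - red) / (nir + red + L)

def ndmi(nir, swir):
    return (nir - swir) / (nir + swir)

def mcari(r700, r670, r550):
    return ((r700 - r670) - 0.2 * (r700 - r550)) * (r700 / r670)

# Illustrative surface-reflectance values for a healthy vegetation pixel.
nir, red, blue, swir = 0.45, 0.05, 0.03, 0.20
```

All functions broadcast over NumPy arrays, so the same code computes an index image when fed whole reflectance bands instead of scalars.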

2.5.2. Input Data Preparation

To manage the computational demands of processing large-scale satellite imagery with deep learning models, we adopted a tile-based processing strategy. The entire ROI mosaic was dynamically partitioned into smaller, overlapping tiles using a sliding window. To optimize training efficiency, only tiles containing at least 300 annotated (non-background) pixels were fed to the model.
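The tiling step can be sketched as follows. The tile size and stride are illustrative assumptions (the paper does not specify them), while the 300-pixel threshold comes from the text:

```python
import numpy as np

def iter_tiles(labels, tile=64, stride=32, min_annotated=300):
    """Sliding-window tiling; yield only tile origins whose label patch
    contains enough annotated (non-background) pixels."""
    h, w = labels.shape
    for y in range(0, h - tile + 1, stride):
        for x in range(0, w - tile + 1, stride):
            patch = labels[y:y + tile, x:x + tile]
            if (patch > 0).sum() >= min_annotated:
                yield y, x

labels = np.zeros((128, 128), dtype=np.uint8)
labels[:64, :64] = 1                 # one densely annotated quadrant
kept = list(iter_tiles(labels))      # only tiles overlapping that quadrant
```

Because the stride is smaller than the tile size, adjacent tiles overlap, which increases the effective number of training samples drawn from each annotated stand.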
All input data were normalized to ensure stable neural network training. The DEM band, with elevation values ranging from −19 to 1741 m, was normalized using a hyperbolic tangent (tanh) transformation centered at 700 m with a width parameter of 150 m. This transformation compresses extreme elevations toward ±1 while retaining sensitivity around the mid-elevation transition zone, where the ranges of the target species overlap and mixed pixels are most likely.
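With the stated center (700 m) and width (150 m), the tanh normalization itself is a one-liner:

```python
import numpy as np

def normalize_dem(dem, center=700.0, width=150.0):
    """tanh-squash elevation: values near `center` map near 0 with high
    sensitivity; extreme elevations saturate toward -1/+1."""
    return np.tanh((dem - center) / width)

# Minimum, center, and maximum elevations of the study area.
dem = np.array([-19.0, 700.0, 1741.0])
z = normalize_dem(dem)
```

The output is bounded in (−1, 1) regardless of the input range, so no per-scene min/max statistics are needed, and the steepest part of the curve sits exactly at the 700 m center.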

2.6. Annual Map Updating and Temporal Refinement

Recognizing that forests are dynamic systems subject to change from events like forest fires, disease, or logging, we established a protocol for annual map updates. This procedure ensures the long-term relevance and accuracy of the proposed methodology for predicting tree loss. The first step in the annual update cycle is to incorporate forest fire data from the Copernicus Emergency Management Service (EMS). Areas identified as recently burned are masked out and excluded from the current year’s classification. For the remaining unburnt forest areas, the classification models are re-run using the new year’s Sentinel-2 mosaics. To improve temporal stability and correct for potential single-year prediction errors or anomalies, the newly generated map is fused with the map from the previous year. This temporal pooling acts as a filter, reinforcing stable classifications and smoothing out spurious pixel-level changes, which results in a more robust and reliable time-series of forest species distribution.
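One simple way to realize this temporal pooling is to keep last year's label wherever this year's prediction is uncertain, and to flag burned pixels separately. The confidence threshold and sentinel code below are assumptions for illustration, not the paper's exact fusion rule:

```python
import numpy as np

BURNED = 255  # illustrative sentinel code for fire-masked pixels

def annual_update(prev_map, new_map, new_conf, burned_mask, tau=0.8):
    """Temporal pooling: keep last year's label unless this year's
    prediction is confident; burned areas (e.g., from Copernicus EMS
    delineations) are masked out entirely."""
    out = np.where(new_conf >= tau, new_map, prev_map)
    out[burned_mask] = BURNED
    return out

prev = np.array([1, 1, 3, 3])
new = np.array([2, 1, 3, 4])
conf = np.array([0.5, 0.9, 0.95, 0.9])
burned = np.array([False, False, True, False])
updated = annual_update(prev, new, conf, burned)
```

Under this rule, a low-confidence single-year flip (the first pixel) is suppressed in favor of the previous label, while confident changes and burn masking pass through, matching the filtering behavior described in the text.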

2.7. Implementation Details

Both models were optimized using a Cross-Entropy loss function [64], which is standard for pixel-wise classification tasks. The loss was computed exclusively on valid, annotated pixels, effectively ignoring any background or non-target areas within a training tile. Hyperparameter tuning was conducted using a grid search strategy over the following ranges: initial learning rate between 1 × 10⁻⁴ and 1 × 10⁻², batch size between 8 and 32, and dropout rate between 0.0 and 0.5. Each configuration was trained with early stopping, and the model with the lowest validation loss was selected. The final configuration used an initial learning rate of 1 × 10⁻³, a batch size of 16, and a dropout rate of 0.2.
To ensure generalization and prevent overfitting, we employed an early stopping mechanism monitored on a dedicated validation set; training was halted if the validation loss did not improve over a set number of epochs (patience of P epochs), with a maximum limit of 150 epochs.
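The early-stopping rule can be sketched as a loop over per-epoch validation losses. The patience value here is illustrative, since the text leaves P unspecified:

```python
def train_with_early_stopping(epoch_val_losses, patience=5, max_epochs=150):
    """Return the epoch whose weights would be kept: training stops once
    the validation loss fails to improve for `patience` consecutive epochs."""
    best_loss, best_epoch, waited = float("inf"), -1, 0
    for epoch, loss in enumerate(epoch_val_losses[:max_epochs]):
        if loss < best_loss:
            best_loss, best_epoch, waited = loss, epoch, 0
        else:
            waited += 1
            if waited >= patience:
                break  # validation loss has plateaued
    return best_epoch, best_loss

# Simulated validation curve: improves for four epochs, then plateaus.
losses = [1.0, 0.8, 0.7, 0.65] + [0.66] * 20
best_epoch, best_loss = train_with_early_stopping(losses, patience=5)
```

Restoring the weights from the best epoch (rather than the last one) is what makes this a regularizer: the model selected is the one with the lowest validation loss, not the one trained longest.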

3. Results

3.1. Honey-Producing Trees Habitats in 2020

In 2020, extensive areas of honey-producing trees covered the island of Evia. Pine and oak forests were spread across the north and central parts of the island: pine formed a large contiguous area ideal for pine honey production, while oak occurred in mixed stands with pine. Fir was confined to the mountain tops of the central part of the island, and chestnut occurred in small patches in the northwest and southeast. Pine and oak were the most extensive forest types, covering ca. 57,000 and 48,600 ha, respectively, while fir and chestnut were the least extensive, covering ca. 10,000 and 8500 ha, respectively, providing a suitable environment for honeybees to flourish and produce abundant honey (Figure 3).

3.2. Ablation Study

3.2.1. Feature Selection for the Hierarchical Framework (HF)

To systematically evaluate the potential contribution of individual or combined spectral indices to our model’s performance, we conducted an ablation study regarding the model’s input. This investigation involved training and evaluating the model with various configurations of input features, starting with the Normalized Difference Vegetation Index (NDVI) as a baseline. Subsequent experiments incorporated additional indices, both individually and in different combinations, to assess their incremental impact. The results of this ablation study, detailing the performance metrics for each input configuration, are summarized in Table 3.
Experiments with the different combinations showed that, for this segmentation task, a simpler and more efficient approach yielded the best results. NDVI performed relatively better than the other indices, followed closely by the EVI + NDMI combination. SAVI did not contribute as much to performance as the other indices when paired with EVI or NDVI. Furthermore, NDMI proved a useful feature for modelling the spectral characteristics of the different tree species and showed promising results.
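For reference, the spectral indices compared in this ablation follow their standard definitions. The sketch below assumes Sentinel-2 surface reflectance scaled to [0, 1]; the small epsilon guarding against division by zero is our own addition.

```python
import numpy as np

EPS = 1e-8  # guards against division by zero over water/shadow pixels

def ndvi(b8, b4):                 # NIR, Red
    return (b8 - b4) / (b8 + b4 + EPS)

def evi(b8, b4, b2):              # NIR, Red, Blue
    return 2.5 * (b8 - b4) / (b8 + 6.0 * b4 - 7.5 * b2 + 1.0)

def savi(b8, b4, L=0.5):          # soil-adjusted; L is the canopy factor
    return (b8 - b4) * (1.0 + L) / (b8 + b4 + L)

def ndmi(b8, b11):                # NIR, SWIR -> canopy moisture
    return (b8 - b11) / (b8 + b11 + EPS)
```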

3.2.2. Random Forest and SVM Classifiers

To provide a further point of comparison for our methodology, we also include the results of experiments with two traditional machine learning classifiers on the same feature set used by the segmentation models (monthly spectral indices and the DEM): a Random Forest (RF) classifier and a Support Vector Machine (SVM). Following the same hierarchical approach, both classifiers achieved accuracies significantly lower than those of the U-Net architecture, as seen in Table 4.
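The hand-off from the raster inputs to these pixel-based classifiers can be illustrated as follows; the array shapes and label codes are illustrative assumptions, and the resulting matrices would be fed to scikit-learn’s RandomForestClassifier or SVC.

```python
import numpy as np

# Toy stand-in for one tile: 12 monthly NDVI layers plus a DEM layer
months, h, w = 12, 64, 64
rng = np.random.default_rng(0)
ndvi_series = rng.random((months, h, w))
dem = rng.random((h, w))

cube = np.concatenate([ndvi_series, dem[None]], axis=0)  # (13, H, W)
X = cube.reshape(cube.shape[0], -1).T                    # (H*W, 13) per-pixel features

# Labels rasterized from annotation polygons; 255 marks unlabeled pixels
labels = np.full((h, w), 255)
labels[8:32, 8:32] = 0     # e.g. pine within the coniferous stratum
labels[40:60, 40:60] = 1   # e.g. fir
y = labels.ravel()
mask = y != 255
X_train, y_train = X[mask], y[mask]  # what RF / SVM would be fitted on
```

Unlike the U-Net, which sees each tile as a spatial grid, the RF and SVM treat every pixel independently, which is one reason their accuracy lags in heterogeneous stands.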

3.2.3. U-Net and Features Performance

The U-Net architecture employed in this study exhibited robust performance in the classification of honey-producing tree species within the region of interest (ROI). Throughout our experiments, we systematically evaluated a range of spectral indices, both individually and in various combinations, including NDVI, EVI, SAVI, MCARI, and NDMI. Our findings indicate that, while alternative indices and their pairings were thoroughly tested, NDVI consistently outperformed or matched the results of other indices, both when used alone and in combination. Notably, the inclusion of additional indices did not yield significant improvements and, in some cases, led to diminished classification accuracy. As a result, we determined that using NDVI as the sole spectral input provided the most efficient and effective approach. Furthermore, integrating NDVI with the digital elevation model (DEM) of the study area proved highly effective in distinguishing between coniferous and broadleaf species. This was evidenced by the high precision and F1-Score metrics achieved on the validation set. The results underscore the suitability of our methodology for large-scale monitoring of honey-producing tree types, particularly in complex Mediterranean landscapes such as those found in Evia, Greece.
To provide a qualitative understanding of the segmentation model’s performance, we present visual examples in Figure 4, showcasing representative instances of True Positives (TP), False Positives (FP), and False Negatives (FN) for each of the four classes under study. The selected cases illustrate the model’s strengths in accurately identifying target regions (TPs), as well as common failure modes in which the model mislabels areas (FPs) or misses them entirely (FNs).
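The qualitative TP/FP/FN categories above map directly onto the quantitative per-class precision, recall, and F1 scores reported earlier. A minimal sketch of that correspondence (illustrative, not the evaluation code actually used):

```python
import numpy as np

def per_class_metrics(y_true, y_pred, cls):
    """Precision, recall and F1 for one class from pixel-wise predictions."""
    tp = np.sum((y_pred == cls) & (y_true == cls))  # correctly labeled pixels
    fp = np.sum((y_pred == cls) & (y_true != cls))  # wrongly claimed pixels
    fn = np.sum((y_pred != cls) & (y_true == cls))  # missed pixels
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1
```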
After close examination of the results, we pinpointed cases where the model was unable to segment areas correctly, impacting overall performance. As seen in Figure 5, the FIR class is not segmented correctly and represents an FN. This is likely because these areas are transition zones where PINE and FIR coexist.
Additionally, our study showed that mixed forest posed a significant challenge for both classifiers, coniferous and broadleaf alike. Due to the resolution of the input data, areas of mixed and dense forest were not segmented accurately, as seen in Figure 5a,b.

3.2.4. Multiclass Semantic Segmentation

To assess the performance of traditional segmentation methods against our framework, we include the results of experiments utilizing different numbers of classes and the same architecture in Table 5:
  • Multi-class semantic segmentation using U-Net architecture (MCSS).
  • Multi-class semantic segmentation with background class (MCSS-B).
  • Multi-class semantic segmentation confined only to forest areas (MCSS-F).
Multi-class semantic segmentation using U-Net architecture (MCSS): The straightforward four-class U-Net struggled primarily because Sentinel-2’s 10 m pixels frequently contain mixed crowns and sub-pixel mixtures of species; spectral overlap between coniferous and broadleaf canopies therefore produces label noise that a single flat model cannot reliably disentangle. In addition, our training polygons were selected for high-density, single-species stands, so the model did not generalize well to the heterogeneous or edge areas common in the ROI. Differentiating forest from non-forest areas is not possible with this setup; thus, every pixel in the original image is classified as a specific tree type, significantly hindering the model’s predictive performance.
Multi-class semantic segmentation with background class (MCSS-B): Introducing an explicit background class created a large, dominant category that absorbed many ambiguous and edge pixels, effectively acting as a “hyper-class.” This increased class imbalance and encouraged the network to favour background predictions for uncertain pixels, reducing accuracy for the true forest classes. The extra class also amplified annotation inconsistencies (what to label as background vs. sparse forest), producing more confusion at class boundaries and harming per-class segmentation performance.
Multi-class semantic segmentation confined only to forest areas (MCSS-F): Restricting segmentation to forested pixels reduced non-forest confusion but substantially shrank the available and diverse training data, which increased the risk of overfitting and sensitivity to local patch heterogeneity. Crucially, mixed stands and within-canopy mixtures remained unresolved at Sentinel-2 resolution, so many pixels in transition zones were still misclassified. Thus, while forest-confinement reduced some types of error, it did not overcome the fundamental mixed-pixel and data-constraint issues that limited reliable per-class discrimination.
Our hierarchical framework (HF) exhibits a higher computational cost (1929 s) than the single-stage U-Net variants (MCSS 1623 s; MCSS-B 1565 s; MCSS-F 1263 s). This additional time is expected: HF runs two different models (and the associated preprocessing/hand-off steps), which increases computation and I/O overhead, but it produces a clear accuracy improvement. Note that all timings were measured on an NVIDIA GeForce RTX 3060 (6 GB) and none of the compared models were tuned for runtime performance, so these numbers are indicative rather than optimized. Nonetheless, the computation remains easily tractable and is sufficient for our case, which consists of a one-off mapping of a geographically restricted area.
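The two-stage hand-off whose cost is discussed above can be sketched as a mask-guided merge of the specialized models’ outputs; the stand-in predictors and class codes below are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stage 1: Copernicus DLT stratification
# (toy codes: 0 = non-forest, 1 = coniferous, 2 = broadleaf)
dlt = rng.integers(0, 3, size=(64, 64))

def predict_coniferous(x):   # stand-in for the pine/fir U-Net
    return rng.integers(0, 2, size=x.shape[-2:])   # 0 = pine, 1 = fir

def predict_broadleaf(x):    # stand-in for the oak/chestnut U-Net
    return rng.integers(0, 2, size=x.shape[-2:])   # 0 = oak, 1 = chestnut

x = rng.random((13, 64, 64))                  # monthly NDVI series + DEM
out = np.zeros((64, 64), dtype=int)           # 0 = background
conif = np.where(predict_coniferous(x) == 0, 1, 2)  # remap to 1 = pine, 2 = fir
broad = np.where(predict_broadleaf(x) == 0, 3, 4)   # remap to 3 = oak, 4 = chestnut
out[dlt == 1] = conif[dlt == 1]  # Stage 2a: conifer stratum only
out[dlt == 2] = broad[dlt == 2]  # Stage 2b: broadleaf stratum only
```

Because each specialized model is only ever queried inside its own stratum, non-forest pixels can never receive a tree-species label, which is precisely what the flat MCSS setup could not guarantee.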

4. Discussion

4.1. Impact of Forest Fires on Beekeeping and Ecosystem Damage

The 2021 forest fires in Evia destroyed roughly 17,000 ha of pine and 5600 ha of oak forests, representing reductions of approximately 30% and 11.5%, respectively, compared with 2020 (Figure 6). These losses are particularly critical given that pine forests underpin approximately 60% of Greek honey production [50,65], primarily through the honeydew insect Marchalina hellenica, leading to reductions in both honey yields and beekeepers’ incomes. The decline in oak, which supplies essential nectar and pollen, further restricts foraging resources. Fir and chestnut forests, although unaffected by the fires, are too limited and scattered to offset these losses.
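The reported percentage losses follow directly from the 2020 extents given in Section 3.1:

```python
pine_2020, oak_2020 = 57_000, 48_600   # ha, 2020 extents (Section 3.1)
pine_lost, oak_lost = 17_000, 5_600    # ha destroyed by the 2021 fires

pine_pct = 100 * pine_lost / pine_2020   # ~29.8%, reported as ~30%
oak_pct = 100 * oak_lost / oak_2020      # ~11.5%
```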
Similar to trends observed in Spain, Chile, and North Africa, where severe droughts and high temperatures have caused sustained declines in honey yields, suppressed forest regeneration, and altered plant community structures [66,67], the Evia fires (Figure 7) illustrate how climate-induced disturbances propagate through both forest ecosystems and dependent livelihoods.
Post-fire landscape fragmentation slows the natural recovery of both vegetation and important honeydew-producing insects like Marchalina hellenica [68], as many remaining pine stands now lie beyond the optimal 3 km foraging radius for Apis mellifera. In addition, it becomes more challenging for bees to find contiguous patches of food sources, and for insect populations to re-establish. As a result, honey production in the region will remain low for years, and beekeepers will have to adapt by moving hives, providing supplementary feeding, or facing continued economic hardship. Full restoration of productive pine and oak habitats, and thus stable honey yields, may take more than a decade and will depend on active habitat recovery and careful management.

4.2. Forest Recovery, Honeydew Dynamics, and Management Implications

Pine regeneration in Mediterranean ecosystems depends on serotinous cones that release seeds following fire. Studies show that Pinus pinaster and P. pinea can germinate rapidly, but seedling survival is strongly modulated by drought severity and stand conditions [69]. Even under successful germination, mature pines capable of sustaining M. hellenica populations typically require 20–25 years, while full honeydew productivity may take > 35 years. Edge effects—now widespread after fires—reduce insect fecundity by ~22% and lower honey yield potential.
Oak forests, in contrast, recover primarily through vigorous resprouting, achieving partial canopy closure within five years but providing limited nectar for several subsequent seasons. Long-term monitoring confirms that increased temperature and drought intensify regeneration bottlenecks across Mediterranean oak and pine forests [70].
Climate change thus acts as a compound stressor, amplifying fire recurrence, delaying maturation, and altering phenological synchrony between bees, host trees, and honeydew insects [71]. Experimental studies show shortened flowering periods and lower hive weight gain in drought years, directly reflecting reduced forage availability. Moreover, Mediterranean honey bees display nutritional deficits during extended dearth periods, suggesting that resource instability—rather than just colony disease—drives long-term production declines [72].
To offset these pressures, climate-smart forest management is essential. Recent work highlights the value of selective thinning, retention forestry, and assisted regeneration for improving post-fire survival of pine seedlings and maintaining soil moisture [73]. Integrating such silvicultural strategies with AI-based forest monitoring—as demonstrated here—can help identify regeneration gaps early, guide hive relocation, and enhance landscape connectivity for pollinators.

4.3. Future Research Needs

Future research should link remote sensing–based species maps with on-ground ecological metrics such as sapling survival, soil moisture, and nectar yield. The combination of Sentinel-2 imagery, UAV surveys, and LiDAR-derived canopy structure can capture successional changes more precisely [74]. Beyond ecological mapping, incorporating economic and beekeeping data will allow predictive modeling of honey yield under multiple climate scenarios.
Operational scaling requires automating the hierarchical classification pipeline and developing decision dashboards for forest agencies and beekeepers—turning scientific insights into actionable management tools. The proposed framework can serve as the foundation of a Mediterranean Pollinator Habitat Observatory, promoting adaptive planning amid growing environmental uncertainty.

5. Conclusions

The 2021 Evia forest fires represent a case of compound ecological and economic disturbance, where forest loss, fragmentation, and climate-induced stress converged to threaten apiculture. Studies in the area employing algorithms of similar complexity report lower overall accuracy. Additionally, the trade-off between large-scale monitoring capability and accuracy metrics shows that our proposed methodology offers significant advantages.
Our hierarchical deep learning framework, combining Sentinel-2 time series and U-Net semantic segmentation, achieved > 92% accuracy in mapping honey-producing tree species, revealing alarming rates of honeybee habitat loss in one of the most important beekeeping regions of Greece, compounded by climate change and the extensive use of pesticides in agriculture. This performance surpasses traditional machine learning approaches and offers a scalable solution for annual ecosystem monitoring. By transforming spectral and phenological information into actionable maps, it supports early detection of habitat loss, post-fire regeneration tracking, and adaptive apicultural planning.
The integration of DL-driven monitoring, climate-smart forestry, and resilient beekeeping strategies represents a forward-looking model for sustainable management in fire-prone Mediterranean landscapes. Expanding this approach regionally could underpin a trans-Mediterranean early-warning network for pollinator habitat loss—helping to safeguard both forest biodiversity and rural livelihoods under accelerating climate change.

Author Contributions

Conceptualization, A.A., P.H., P.T., A.T. and K.K.; methodology, A.A., T.M., V.T., A.P. and K.K.; software, A.A., T.M., V.T. and A.P.; validation, A.A., T.M., V.T., A.P., E.A., A.T., P.T., P.H. and K.K.; formal analysis, A.A., T.M., V.T. and A.P.; investigation, A.A., T.M., V.T., A.P. and E.A.; resources, A.A., A.T. and K.K.; data curation, A.A., T.M., V.T., A.P. and E.A.; writing—original draft preparation, A.A., T.M., V.T., A.P. and E.A.; writing—review and editing, A.A., T.M., V.T., A.P., E.A., A.T., P.T., P.H. and K.K.; visualization, A.A., T.M., V.T., A.P. and E.A.; supervision, A.A., A.T., P.T., P.H. and K.K.; project administration, A.A., A.T. and K.K.; funding acquisition, A.A. and A.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Acknowledgments

The authors would like to sincerely thank the Forestry Department of Evia.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Schwartz, K.R.; Minor, H.; Magro, C.; McConnell, J.; Capani, J.; Griffin, J.; Doebel, H. The neonicotinoid imidacloprid alone alters the cognitive behavior in Apis mellifera L. and the combined exposure of imidacloprid and Varroa destructor mites synergistically contributes to trial attrition. J. Apic. Res. 2020, 60, 431–438. [Google Scholar] [CrossRef]
  2. Urban, J.M.; Broome, R. Beyond pollination: Honey bees (Apis mellifera) as zootherapy keystone species. Front. Ecol. Evol. 2018, 6, 161. [Google Scholar] [CrossRef]
  3. Worthy, S.H.; Acorn, J.H.; Frost, C.M. Honey bees (Apis mellifera) modify plant-pollinator network structure, but do not alter wild species’ interactions. PLoS ONE 2023, 18, e0287332. [Google Scholar] [CrossRef] [PubMed]
  4. Dáttilo, W.; Cruz, C.P.; Luna, P.; Ratoni, B.; Hinojosa-Díaz, I.A.; Neves, F.S.; Leponce, M.; Villalobos, F.; Guevara, R. The impact of the honeybee Apis mellifera on the organization of pollination networks is positively related with its interactive role throughout its geographic range. Diversity 2022, 14, 917. [Google Scholar] [CrossRef]
  5. Cunningham-Minnick, M.J.; Milam, J.; Fassler, A.; King, D.I. Best management practices for bee conservation in forest openings. Conserv. Sci. Pract. 2024, 6, e13231. [Google Scholar] [CrossRef]
  6. Brunet, J.; Fragoso, F.P. What are the main reasons for the worldwide decline in pollinator populations? CAB Rev. 2024, 19, 0016. [Google Scholar] [CrossRef]
  7. Massah, D.O.; Adamou, M.; Nukenine, N.E.; Kosini, D.; Yatahaï, C.M.; Tchoubou, S.A.; Mohammadou, M.; Youssoupha, O. Effects of Botanical Extracts on Foraging and Pollination Activity of Apis Mellifera (Hymenoptera: Apidae) on Glycine Max (Fabaceae) Flowers at Bini (Ngaoundere, Cameroon). Am. J. Agric. Sci. Eng. Technol. 2023, 7, 37–46. [Google Scholar] [CrossRef]
  8. Cavigliasso, P.; Bello, F.; Rivadeneira, M.F.; Monzon, N.O.; Gennari, G.P.; Basualdo, M. Pollination efficiency of managed bee species (Apis mellifera and Bombus pauloensis) in highbush blueberry (Vaccinium corymbosum) productivity. J. Hort. Res. 2020, 28, 25–34. [Google Scholar] [CrossRef]
  9. Zhang, K.; Li, Y.; Sun, K.; Bao, J.; He, C.; Hou, X. Supplementary honey bee (Apis mellifera L.) pollination enhances fruit growth rate and fruit yield in Paeonia ostii (family: Paeoniaceae). PLoS ONE 2022, 17, e0272921. [Google Scholar] [CrossRef]
  10. Tsingalia, H.M.; Mandela, H.K. Effects of precipitation and temperature on the species composition and pollinator efficiency of Ocimum kilimandscharicum flower visitors in Kakamega forest ecosystem. Open J. Environ. Biol. 2023, 8, 39–47. [Google Scholar] [CrossRef]
  11. IPBES. The Assessment Report of the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services on Pollinators, Pollination and Food Production; Potts, S.G., Imperatriz-Fonseca, V.L., Ngo, H.T., Eds.; Secretariat of the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services; IPBES: Bonn, Germany, 2016; 552p. [Google Scholar] [CrossRef]
  12. Gounari, S.; Zotos, C.E.; Dafnis, S.D.; Moschidis, G.; Papadopoulos, G.K. On the impact of critical factors to honeydew honey production: The case of Marchalina hellenica and pine honey. J. Apic. Res. 2021, 62, 383–393. [Google Scholar] [CrossRef]
  13. Dafnis, S.D.; Gounari, S.; Zotos, C.E.; Papadopoulos, G.K. The effect of cold periods on the biological cycle of Marchalina hellenica. Insects 2022, 13, 375. [Google Scholar] [CrossRef]
  14. Oğuzoğlu, Ş.; Avcı, M.; İpekdal, K. Predators of the giant pine scale, Marchalina hellenica (Gennadius 1883; Hemiptera: Marchalinidae), out of its natural range in Turkey. Open Life Sci. 2021, 16, 682–694. [Google Scholar] [CrossRef] [PubMed]
  15. Evelpidou, N.; Tzouxanioti, M.; Gavalas, T.; Spyrou, E.; Saitis, G.; Petropoulos, A.; Karkani, A. Assessment of fire effects on surface runoff erosion susceptibility: The case of the summer 2021 forest fires in Greece. Land 2022, 11, 21. [Google Scholar] [CrossRef]
  16. Oxford Analytica. August wildfires will force Greek rethink on climate. Expert Brief. 2021. [Google Scholar] [CrossRef]
  17. Alexiou, S.; Deligiannakis, G.; Pallikarakis, A.; Papanikolaou, I.; Psomiadis, E.; Reicherter, K. Comparing high accuracy t-LiDAR and UAV-SfM derived point clouds for geomorphological change detection. ISPRS Int. J. Geo-Inf. 2021, 10, 367. [Google Scholar] [CrossRef]
  18. Valkanou, K.; Karymbalis, E.; Bathrellos, G.; Skilodimou, H.; Tsanakas, K.; Papanastassiou, D.; Gaki-Papanastassiou, K. Soil loss potential assessment for natural and post-fire conditions in Evia Island, Greece. Geosciences 2022, 12, 367. [Google Scholar] [CrossRef]
  19. Tarbill, G.L.; White, A.M.; Sollmann, R. Response of pollinator taxa to fire is consistent with historic fire regimes in the Sierra Nevada and mediated through floral richness. Ecol. Evol. 2023, 13, e10761. [Google Scholar] [CrossRef]
  20. Tarbill, G.L.; White, A.M.; Sollmann, R. Floral richness drives pollinator diversity after fire in upland forests and meadows of the Sierra Nevada, California. Insect Conserv. Divers. 2024, 77, 187. [Google Scholar] [CrossRef]
  21. Grushecky, S.T.; Knopka, S.C.; Owen, S.F.; Edwards, J.W. Enhancing pollinator habitats: The role of reclaimed natural gas pipeline rights-of-way in the Appalachian region. J. Environ. Manag. 2024, 300, 123–134. [Google Scholar] [CrossRef]
  22. Escobedo-Kenefic, N.; Cardona, E.; Arizmendi, M.d.C.; Domínguez, C.A. Do forest reserves help maintain pollinator diversity and pollination services in tropical agricultural highlands? A case study using Brassica rapa as a model. Front. Bee Sci. 2024, 2, 1393431. [Google Scholar] [CrossRef]
  23. Shokati, H.; Mashal, M.; Noroozi, A.; Abkar, A.A.; Mirzaei, S.; Mohammadi-Doqozloo, Z.; Taghizadeh-Mehrjardi, R.; Khosravani, P.; Nabiollahi, K.; Scholten, T. Random Forest-Based Soil Moisture Estimation Using Sentinel-2, Landsat-8/9, and UAV-Based Hyperspectral Data. Remote Sens. 2024, 16, 1962. [Google Scholar] [CrossRef]
  24. Brauchler, M.; Stoffels, J.; Nink, S. Extension of an Open GEOBIA Framework for Spatially Explicit Forest Stratification with Sentinel-2. Remote Sens. 2022, 14, 727. [Google Scholar] [CrossRef]
  25. Antoniadis, K.; Georgopoulos, N.; Katagis, T.; Stavrakoudis, D.; Gitas, I.Z. Classification of seasonal Sentinel-2 imagery for mapping vegetation in Mediterranean ecosystems. In Proceedings of the Ninth International Conference on Remote Sensing and Geoinformation of the Environment (RSCy2023), Ayia Napa, Cyprus, 3–5 April 2023; p. 1278609. [Google Scholar] [CrossRef]
  26. Malinowski, R.; Lewiński, S.; Rybicki, M.; Gromny, E.; Jenerowicz, M.; Krupiński, M.; Nowakowski, A.; Wojtkowski, C.; Krupiński, M.; Krätzschmar, E.; et al. Automated Production of a Land Cover/Use Map of Europe Based on Sentinel-2 Imagery. Remote Sens. 2020, 12, 3523. [Google Scholar] [CrossRef]
  27. Vasilakos, C.; Kavroudakis, D.; Georganta, A. Machine Learning Classification Ensemble of Multitemporal Sentinel-2 Images: The Case of a Mixed Mediterranean Ecosystem. Remote Sens. 2020, 12, 2005. [Google Scholar] [CrossRef]
  28. Guisao-Betancur, A.; Gómez Déniz, L.; Marulanda-Tobón, A. Forest/Nonforest Segmentation Using Sentinel-1 and -2 Data Fusion in the Bajo Cauca Subregion in Colombia. Remote Sens. 2024, 16, 5. [Google Scholar] [CrossRef]
  29. Host, T.K.; Russell, M.B.; Windmuller-Campione, M.A.; Slesak, R.A.; Knight, J.F. Ash Presence and Abundance Derived from Composite Landsat and Sentinel-2 Time Series and Lidar Surface Models in Minnesota, USA. Remote Sens. 2020, 12, 1341. [Google Scholar] [CrossRef]
  30. Ronneberger, O.; Fischer, P.; Brox, T. U-Net: Convolutional Networks for Biomedical Image Segmentation. In Medical Image Computing and Computer-Assisted Intervention—MICCAI 2015; Navab, N., Hornegger, J., Wells, W.M., Frangi, A.F., Eds.; Springer International Publishing: Cham, Switzerland, 2015; pp. 234–241. [Google Scholar]
  31. Siregar, O.R.; Sasongko, P.S.; Endah, S.N. Optic disc segmentation on eye retinal image with U-Net convolutional neural network architecture. In Proceedings of the 2021 5th International Conference on Informatics and Computational Sciences (ICICoS), Semarang, Indonesia, 24–25 November 2021; IEEE: New York, NY, USA, 2021; pp. 69–74. [Google Scholar] [CrossRef]
  32. Komosar, A.; Stefanović, D.; Sladojević, S. An overview of image processing in biomedicine using U-Net convolutional neural network architecture. J. Comput. Forensic Sci. 2024, 3, 5–20. [Google Scholar] [CrossRef]
  33. D’Alessandro, V.I.; Palermo, L.; Attivissimo, F.; Di Nisio, A.; Lanzolla, A.M.L. U-Net convolutional neural network for multisource heterogeneous iris segmentation. In Proceedings of the 2023 IEEE International Symposium on Medical Measurements and Applications (MeMeA), Jeju, Republic of Korea, 14–16 June 2023; IEEE: New York, NY, USA, 2023; pp. 1–5. [Google Scholar] [CrossRef]
  34. Motyl, M.; Madej, Ł. Supervised pearlitic–ferritic steel microstructure segmentation by U-Net convolutional neural network. Arch. Civ. Mech. Eng. 2022, 22, 206. [Google Scholar] [CrossRef]
  35. Erwin, A.S.; Desiani, A.; Suprihatin, B.; Fathoni. The augmentation data of retina image for blood vessel segmentation using U-Net convolutional neural network method. Int. J. Comput. Intell. Appl. 2022, 21, 2250004. [Google Scholar] [CrossRef]
  36. Nguyen, H.M.; Demir, B.; Dalponte, M. A Weighted SVM-Based Approach to Tree Species Classification at Individual Tree Crown Level Using LiDAR Data. Remote Sens. 2019, 11, 2948. [Google Scholar] [CrossRef]
  37. Immitzer, M.; Atzberger, C.; Koukal, T. Tree Species Classification with Random Forest Using Very High Spatial Resolution 8-Band WorldView-2 Satellite Data. Remote Sens. 2012, 4, 2661–2693. [Google Scholar] [CrossRef]
  38. Zhong, L.; Dai, Z.; Fang, P.; Cao, Y.; Wang, L. A Review: Tree Species Classification Based on Remote Sensing Data and Classic Deep Learning-Based Methods. Forests 2024, 15, 852. [Google Scholar] [CrossRef]
  39. Harmon, I.; Weinstein, B.; Bohlman, S.; White, E.; Wang, D.Z. A Neuro-Symbolic Framework for Tree Crown Delineation and Tree Species Classification. Remote Sens. 2024, 16, 4365. [Google Scholar] [CrossRef]
  40. Wang, X.; Yang, N.; Liu, E.; Gu, W.; Zhang, J.; Zhao, S.; Sun, G.; Wang, J. Tree Species Classification Based on Self-Supervised Learning with Multisource Remote Sensing Images. Appl. Sci. 2023, 13, 1928. [Google Scholar] [CrossRef]
  41. Ni-Meister, W.; Albanese, A.; Lingo, F. Assessing Data Preparation and Machine Learning for Tree Species Classification Using Hyperspectral Imagery. Remote Sens. 2024, 16, 3313. [Google Scholar] [CrossRef]
  42. Kamga, G.A.F.; Bouroubi, Y.; Germain, M.; Mbom, A.M.; Chagnon, M. Expert knowledge-based modelling approach for mapping beekeeping suitability area. Ecol. Inform. 2024, 80, 102530. [Google Scholar] [CrossRef]
  43. Marnasidis, S.; Kantartzis, A.; Malesios, C.; Hatjina, F.; Arabatzis, G.; Verikouki, E. Mapping priority areas for apiculture development with the use of geographical information systems. Agriculture 2021, 11, 182. [Google Scholar] [CrossRef]
  44. Gerula, D.; Gąbka, J. The Effect of Land Cover on the Nectar Collection by Honeybee Colonies in Urban and Rural Areas. Appl. Sci. 2025, 15, 4497. [Google Scholar] [CrossRef]
  45. Amiri, F.; Shariff, A.R.B.M. Application of geographic information systems in land-use suitability evaluation for beekeeping: A case study of Vahregan watershed (Iran). Afr. J. Agric. Res. 2012, 7, 89–97. [Google Scholar] [CrossRef]
  46. Abou-Shaara, H.F. Wintering map for honey bee colonies in El-Behera governorate, Egypt by using Geographical Information System (GIS). J. Appl. Sci. Environ. Manag. 2013, 17, 403–408. [Google Scholar] [CrossRef]
  47. Abou-Shaara, H.F. GIS analysis to locate more suitable wintering areas for honey bee colonies in agricultural and desert lands. Afr. Entomol. 2021, 29, 405–413. [Google Scholar] [CrossRef]
  48. Abou-Shaara, H.F.; Al-Ghamdi, A.A.; Mohamed, A.A. A suitability map for keeping honey bees under harsh environmental conditions using geographical information system. World Appl. Sci. J. 2013, 22, 1099–1105. [Google Scholar]
  49. Abou-Shaara, H.F.; Eid, K.S. Increasing the profitability of propolis production in honey bee colonies by utilizing remote sensing techniques to spot locations of trees as potential sources of resin. Remote Sens. Lett. 2019, 10, 922–927. [Google Scholar] [CrossRef]
  50. Antonopoulos, A.; Gounari, O.; Falagas, A.; Tsagkarakis, A.; Karantzalos, K. Mapping bee-keeping forest plants from medium spatial resolution multispectral satellite data. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. 2023, 48, 1839–1845. [Google Scholar] [CrossRef]
  51. Papachristoforou, A.; Prodromou, M.; Hadjimitsis, D.; Christoforou, M. Detecting and distinguishing between apicultural plants using UAV multispectral imaging. PeerJ 2023, 11, e15065. [Google Scholar] [CrossRef]
  52. Adgaba, N.; Alghamdi, A.; Sammoud, R.; Shenkute, A.; Tadesse, Y.; Ansari, M.J.; Hepburn, C. Determining spatio-temporal distribution of bee forage species of Al-Baha region based on ground inventorying supported with GIS applications and Remote Sensed Satellite Image analysis. Saudi J. Biol. Sci. 2017, 24, 1038–1044. [Google Scholar] [CrossRef]
  53. European Environment Agency (EEA). Natura 2000 Standard Data Form–Site Code: GR2420002. Natura 2000 Database, Release 55. Available online: https://natura2000.eea.europa.eu/Natura2000/sdf/#/sdf?site=GR2420002&release=55&nav=7 (accessed on 11 August 2025).
  54. Karakizi, C. Land Cover and Crop Type Mapping at National Scale from Multitemporal High Resolution Satellite Data; National Technical University of Athens (NTUA), School of Rural, Surveying & Geomatics Engineering, Remote Sensing Laboratory: Athens, Greece, 2022. [Google Scholar]
  55. Tananaki, C.; Rodopoulou, M.-A.; Dimou, M.; Kanelis, D.; Liolios, V. The Total Phenolic Content and Antioxidant Activity of Nine Monofloral Honey Types. Appl. Sci. 2024, 14, 4329. [Google Scholar] [CrossRef]
  56. European Environment Agency. Dominant Leaf Type 2018–Present (Raster 10 m), Europe, Yearly, Nov. 2024; Copernicus Land Monitoring Service (CLMS), Ed. 01.00, Citation Identifier: copernicus_r_3035_10_m_dlt_p_2018-now_v01_r00. 2024. Available online: https://sdi.eea.europa.eu/catalogue/srv/api/records/82f93572-9888-47ef-97a1-5cac5985a26a?language=all (accessed on 10 August 2025).
  57. Miranda, E.; Mutiara, A.B.; Ernastuti; Wibowo, W.C. Forest Classification Method Based on Convolutional Neural Networks and Sentinel-2 Satellite Imagery. Int. J. Fuzzy Inf. Syst. 2019, 19, 272–282. [Google Scholar] [CrossRef]
  58. Zhang, T.; Su, J.; Xu, Z.; Luo, Y.; Li, J. Sentinel-2 Satellite Imagery for Urban Land Cover Classification by Optimized Random Forest Classifier. Appl. Sci. 2021, 11, 543. [Google Scholar] [CrossRef]
  59. Liu, P.; Ren, C.; Wang, Z.; Jia, M.; Yu, W.; Ren, H.; Xia, C. Evaluating the Potential of Sentinel-2 Time Series Imagery and Machine Learning for Tree Species Classification in a Mountainous Forest. Remote Sens. 2024, 16, 293. [Google Scholar] [CrossRef]
  60. Chowdhury, M.; Saifullah, S.M.; Rahman, M.S. Leveraging Machine Learning Algorithms and Sentinel-2 Satellite Imagery for Land Use Land Cover Classification of Bangladesh Using Google Earth Engine. In Proceedings of the 2024 27th International Conference on Computer and Information Technology (ICCIT), Cox’s Bazar, Bangladesh, 20–22 December 2024; pp. 3384–3389. [Google Scholar] [CrossRef]
Figure 1. Overview of the hierarchical classification method employed in our study. The top level shows the forest area retained after filtering with the different input layers; the second and final level comprises the broadleaf-coniferous stratification and the species-level segmentation produced by the model.
Figure 2. UAV data acquired during the on-site collection process: (a) a mixed-forest area during the winter period (Abies cephalonica and Castanea sativa); (b) an area covered by Pinus spp.
Figure 3. Classified map of honey-producing tree habitats in Evia (2020).
Figure 4. Qualitative examples of segmentation results for each class, illustrating True Positives (TP, green), False Positives (FP, yellow), and False Negatives (FN, orange). Each row corresponds to one class, with three representative cases highlighting different segmentation scenarios.
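The color coding of Figures 4 and 5 reflects a per-class comparison of predicted and ground-truth masks. A minimal sketch of that bookkeeping, using toy masks rather than the paper's data:

```python
# Per-class segmentation outcomes as in Figure 4: for one class, compare a
# predicted binary mask against the ground-truth mask and label each pixel
# TP (green), FP (yellow), or FN (orange). Masks are plain nested lists here.

def confusion_masks(pred, truth):
    """Return (tp, fp, fn) masks for one class."""
    tp = [[p and t for p, t in zip(pr, tr)] for pr, tr in zip(pred, truth)]
    fp = [[p and not t for p, t in zip(pr, tr)] for pr, tr in zip(pred, truth)]
    fn = [[(not p) and t for p, t in zip(pr, tr)] for pr, tr in zip(pred, truth)]
    return tp, fp, fn

def f1_from_masks(tp, fp, fn):
    """Pixel-wise F1 for the class (harmonic mean of precision and recall)."""
    n_tp = sum(map(sum, tp))
    n_fp = sum(map(sum, fp))
    n_fn = sum(map(sum, fn))
    return 2 * n_tp / (2 * n_tp + n_fp + n_fn)

# Hypothetical 2x3 masks:
pred  = [[1, 1, 0],
         [0, 1, 0]]
truth = [[1, 0, 0],
         [0, 1, 1]]
tp, fp, fn = confusion_masks(pred, truth)
# 2 TP, 1 FP, 1 FN -> F1 = 4/6 ~= 0.667
```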
Figure 5. Examples of problematic cases: (a) False Negative (FN) regions for the FIR class and (b) False Positive (FP) regions for the PINE class, illustrating instances where the model struggled to segment the target areas accurately. In (a), PINE trees are segmented as FIR; similarly, the region in (b) is incorrectly segmented as PINE. Because such cases cover large areas in our dataset, they have a correspondingly large impact on the results.
Figure 6. Bar chart of the temporal change, produced by model inference, in the area (ha) of each habitat type from 2020 to 2024.
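The hectare figures of Figure 6 follow from pixel counting on the inference maps: each 10 m Sentinel-2 pixel covers 100 m², i.e. 0.01 ha. A sketch with hypothetical pixel counts (not the paper's numbers):

```python
# Convert per-class pixel counts from an inference map to hectares and
# compute change between two mapped years, as displayed in Figure 6.
PIXEL_AREA_M2 = 100   # Sentinel-2 pixel: 10 m x 10 m
M2_PER_HA = 10_000

def area_ha(pixel_count):
    """Mapped class area in hectares from a pixel count."""
    return pixel_count * PIXEL_AREA_M2 / M2_PER_HA

def change_ha(counts_by_year, year_a, year_b):
    """Habitat area change (ha) between two mapped years."""
    return area_ha(counts_by_year[year_b]) - area_ha(counts_by_year[year_a])

# Hypothetical pixel counts for one habitat class:
pine_pixels = {2020: 5_200_000, 2024: 4_100_000}
# 52,000 ha in 2020 -> 41,000 ha in 2024, i.e. a loss of 11,000 ha
```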
Figure 7. Overlay of major forest fire events (2021–2024) on the classified habitats, with insets highlighting additional burn scars.
Table 1. Detailed specification of the U-Net architecture used for species-level semantic segmentation. All convolutions use padding of 1 and stride of 1 unless otherwise stated. BN = Batch Normalization, ReLU = Rectified Linear Unit.
| Stage | Operation | Input → Output Channels | Spatial Size |
|---|---|---|---|
| Input | — | 13 | H × W |
| Encoder level 0 (inc) | Conv + BN + ReLU | 13 → 64 | H × W |
| | Conv + BN + ReLU | 64 → 64 | H × W |
| | Conv + BN + ReLU | 64 → 64 | H × W |
| Down block 1 (Encoder lvl 1) | MaxPool | 64 → 64 | H/2 × W/2 |
| | Conv + BN + ReLU | 64 → 128 | H/2 × W/2 |
| | Conv + BN + ReLU | 128 → 128 | H/2 × W/2 |
| Down block 2 (Encoder lvl 2) | MaxPool | 128 → 128 | H/4 × W/4 |
| | Conv + BN + ReLU | 128 → 256 | H/4 × W/4 |
| | Conv + BN + ReLU | 256 → 256 | H/4 × W/4 |
| Down block 3 (Encoder lvl 3) | MaxPool | 256 → 256 | H/8 × W/8 |
| | Conv + BN + ReLU | 256 → 256 | H/8 × W/8 |
| | Conv + BN + ReLU | 256 → 256 | H/8 × W/8 |
| Up block 1 (Decoder lvl 3) | Upsample (bilinear) | 256 → 256 | H/4 × W/4 |
| | Concatenate (skip) | (256, 256) → 512 | H/4 × W/4 |
| | Conv + BN + ReLU | 512 → 128 | H/4 × W/4 |
| | Conv + BN + ReLU | 128 → 128 | H/4 × W/4 |
| Up block 2 (Decoder lvl 2) | Upsample (bilinear) | 128 → 128 | H/2 × W/2 |
| | Concatenate (skip) | (128, 128) → 256 | H/2 × W/2 |
| | Conv + BN + ReLU | 256 → 64 | H/2 × W/2 |
| | Conv + BN + ReLU | 64 → 64 | H/2 × W/2 |
| Up block 3 (Decoder lvl 1) | Upsample (bilinear) | 64 → 64 | H × W |
| | Concatenate (skip) | (64, 64) → 128 | H × W |
| | Conv + BN + ReLU | 128 → 64 | H × W |
| | Conv + BN + ReLU | 64 → 64 | H × W |
| Output layer | Conv | 64 → 1 | H × W |
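The channel and spatial-size bookkeeping in Table 1 can be sanity-checked with a few lines of Python. This sketch traces only the shapes implied by the table (three 2× pooling stages, skip concatenations, a final 64 → 1 convolution), not the network itself:

```python
# Trace feature-map shapes through the U-Net of Table 1: three 2x max-pool
# stages, bilinear upsampling back to full resolution, and channel
# concatenation at each skip connection.

def unet_shapes(h, w):
    """Return the channel count after each skip concatenation and the
    final output shape for an H x W input tile."""
    assert h % 8 == 0 and w % 8 == 0, "three 2x pools need sizes divisible by 8"
    enc_ch = [64, 128, 256, 256]   # encoder output channels, levels 0-3
    dec_ch = [128, 64, 64]         # conv output channels of up blocks 1-3
    cur = enc_ch[3]                # bottleneck features at H/8 x W/8
    concat_ch = []
    for skip, out in zip([enc_ch[2], enc_ch[1], enc_ch[0]], dec_ch):
        concat_ch.append(cur + skip)   # channels after skip concatenation
        cur = out                      # channels after the block's convs
    return concat_ch, (1, h, w)        # final 64 -> 1 conv restores H x W

# For a 64 x 64 tile: concatenations of 512, 256 and 128 channels,
# and a 1 x 64 x 64 output map.
```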
Table 2. Spectral vegetation indices and their formulas.
| Index | Name | Formula |
|---|---|---|
| NDVI | Normalized Difference Vegetation Index | NDVI = (NIR − RED) / (NIR + RED) |
| EVI | Enhanced Vegetation Index | EVI = G · (NIR − RED) / (NIR + C1 · RED − C2 · BLUE + L) |
| MCARI | Modified Chlorophyll Absorption Ratio Index | MCARI = [(R700 − R670) − 0.2 (R700 − R550)] · (R700 / R670) |
| SAVI | Soil-Adjusted Vegetation Index | SAVI = (NIR − RED)(1 + L) / (NIR + RED + L) |
| NDMI | Normalized Difference Moisture Index | NDMI = (NIR − SWIR) / (NIR + SWIR) |
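The Table 2 indices translate directly into code. A minimal sketch over per-band reflectances, assuming the commonly used coefficient values (EVI: G = 2.5, C1 = 6, C2 = 7.5, L = 1; SAVI: L = 0.5) — the paper's exact constants may differ:

```python
# Spectral vegetation indices of Table 2 as plain functions of band
# reflectances. Default coefficients are the widely used values and are
# an assumption here, not taken from the paper.

def ndvi(nir, red):
    return (nir - red) / (nir + red)

def evi(nir, red, blue, g=2.5, c1=6.0, c2=7.5, l=1.0):
    return g * (nir - red) / (nir + c1 * red - c2 * blue + l)

def mcari(r700, r670, r550):
    return ((r700 - r670) - 0.2 * (r700 - r550)) * (r700 / r670)

def savi(nir, red, l=0.5):
    return (nir - red) * (1 + l) / (nir + red + l)

def ndmi(nir, swir):
    return (nir - swir) / (nir + swir)

# Dense green vegetation, e.g. NIR = 0.45, RED = 0.05 -> NDVI = 0.8
```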
Table 3. Ablation study results and contribution of indices in different combinations. Selected index and corresponding results are highlighted in bold.
| Model Input | Accuracy (Coniferous) | Accuracy (Broadleaf) | F1-Score (Coniferous) | F1-Score (Broadleaf) |
|---|---|---|---|---|
| **NDVI** | **90.4** | **93.5** | **76.4** | **89.9** |
| NDVI + SAVI | 79.0 | 80.2 | 71.3 | 74.4 |
| NDVI + MCARI | 79.8 | 80.3 | 73.5 | 75.3 |
| NDVI + NDMI | 88.6 | 91.1 | 79.1 | 89.8 |
| EVI | 87.0 | 84.8 | 84.1 | 78.6 |
| EVI + SAVI | 78.6 | 74.6 | 76.2 | 77.3 |
| EVI + MCARI | 85.9 | 89.2 | 84.4 | 87.3 |
| EVI + NDMI | 90.9 | 89.4 | 74.7 | 88.7 |
Table 4. Overall accuracy and F1 comparison between classical classifiers, Random Forest (RF), Support Vector Machine (SVM) and our Hierarchical Framework (HF). Our framework is highlighted in bold.
| Classifier | Overall Accuracy (%) | F1-Score (%) | Execution Time (s) |
|---|---|---|---|
| RF | 74.6 | 70.9 | 1548 |
| SVM | 76.9 | 69.3 | 2495 |
| **HF** | **92.1** | **83.6** | **1929** |
Table 5. Overall accuracy and F1 comparison between different semantic segmentation methods, Multi-class semantic segmentation (MCSS), Multi-class semantic segmentation with background (MCSS-B), Multi-class semantic segmentation confined only to forest areas (MCSS-F), and our Hierarchical Framework (HF), highlighted in bold.
| Classifier | Overall Accuracy (%) | F1-Score (%) | Execution Time (s) |
|---|---|---|---|
| MCSS | 77.8 | 76.4 | 1623 |
| MCSS-B | 86.3 | 83.1 | 1565 |
| MCSS-F | 89.5 | 87.8 | 1263 |
| **HF** | **92.1** | **83.6** | **1929** |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Antonopoulos, A.; Moumouris, T.; Tsironis, V.; Psalta, A.; Arapostathi, E.; Tsagkarakis, A.; Trigas, P.; Harizanis, P.; Karantzalos, K. Hierarchical Deep Learning Framework for Mapping Honey-Producing Tree Species in Dense Forest Ecosystems Using Sentinel-2 Imagery. Agronomy 2025, 15, 2858. https://doi.org/10.3390/agronomy15122858
