Article

Fusion of Medium- and High-Resolution Remote Images for the Detection of Stress Levels Associated with Citrus Sooty Mould

by Enrique Moltó 1,*, Marcela Pereira-Sandoval 1, Héctor Izquierdo-Sanz 1 and Sergio Morell-Monzó 2
1 Instituto Valenciano de Investigaciones Agrarias (IVIA), Centro de Agroingeniería, Carretera CV-315, Km 10.7, 46113 Moncada, Spain
2 Escuela Politécnica Superior de Gandía, Universitat Politècnica de València (Polytechnical University of Valencia), 46730 Gandía, Spain
* Author to whom correspondence should be addressed.
Agronomy 2025, 15(6), 1342; https://doi.org/10.3390/agronomy15061342
Submission received: 30 April 2025 / Revised: 27 May 2025 / Accepted: 29 May 2025 / Published: 30 May 2025

Abstract

Citrus sooty mould caused by Capnodium spp. alters the quality of fruits on the tree and affects their productivity. Past laboratory and hand-held spectrometry tests have concluded that sooty mould exhibits a typical spectral response in the near-infrared region of the spectrum. For this reason, this study aims to develop an automatic method for remote sensing of this disease, combining 10 m spatial resolution Sentinel-2 satellite images and 0.25 m spatial resolution orthophotos to identify sooty mould infestation levels in small orchards, common in Mediterranean conditions. Citrus orchards of the Comunitat Valenciana region (Spain) underwent field inspection in 2022 during the months of minimum (August) and maximum (October) infestation. The inspectors categorised their observations according to three levels of infestation in three representative positions of each orchard. Two synthetic images condensing the monthly information were generated for both periods. A filtering algorithm was created, based on high-resolution images, to select informative pixels in the lower resolution images. The data were used to evaluate the performance of a Random Forest classifier in predicting intensity levels through cross-validation. Combining the information from medium- and high-resolution images improved the overall accuracy from 0.75 to 0.80, with mean producer’s accuracies above 0.65 and mean user’s accuracies above 0.78. Bowley–Yule skewness coefficients were +0.50 for the overall accuracy and +0.28 for the kappa index.

1. Introduction

Capnodium spp. produces dark, threadlike mycelium on substrates rich in sugars, such as honeydew. Honeydew is secreted after feeding by pests present in the region, such as aphids, whiteflies, mealybugs, and scales, allowing sooty mould to grow and spread. Excessive proliferation of sooty mould may have a significant economic impact on the industry. EU marketing standards indicate that for the fruit to be considered ‘Extra’ grade, and therefore a higher-priced product on the market, homogeneity in colouring is required [1]. For this reason, current practice is to meticulously wash the fruit before marketing it, thus increasing production costs.
Although sooty mould is toxic neither to plants nor to humans, it reduces the photosynthetic capacity of the plant [2]. This leads to reduced plant vigour, increased photosynthetic stress, and therefore lower yields and lower-quality fruit.
Traditionally, studies of infestation levels of pests and diseases have been carried out in the laboratory using field samples and conventional microscopy. The Valencian Institute for Agricultural Research (IVIA) has been a pioneer in using electronic sensors for the detection of pests and diseases of citrus [3,4,5]. These studies assessed how different pests and diseases affecting the fruit skin caused changes in how it reflected light, both in the visible and near-infrared (NIR) parts of the spectrum. This research helped develop a method to detect sooty mould on fruit with 82% accuracy using multispectral cameras.
Summy and Little [6] used colour and NIR imaging and spectrometry of leaves to detect sooty mould in several varieties of orange trees. Their studies showed that honeydew accumulation leads to significant increases in the NIR/red ratio and that honeydew-covered leaves absorb a considerable amount of light around 850 nm, probably due to melanin of the fungal cell walls.
Sims [7] conducted spectral measurements using spectrophotometers and chlorophyll meters on whitefly (Bemisia tabaci)-affected and fungus-infected leaves in cassava plantations. They used various spectral indices in the red and NIR range to identify different physiological processes linked to photosynthetic activity in the leaf and related them to infestations of different fungi.
Spectral indices used to detect photosynthetic changes include: the modified chlorophyll absorption ratio index (MCARI) [8], which is sensitive to the relative abundance of chlorophyll; the photochemical reflectance index (PRI) [9], related to light use efficiency; and the carotenoid reflectance index (CRI550) [10], linked to the concentration of carotenoids relative to chlorophyll, which can be interpreted as a measure of plant stress. Other broadband vegetation indices, such as the normalised difference vegetation index (NDVI) [11] and the enhanced two-band vegetation index (EVI2) [12], are also reported in the literature to detect photosynthetic changes and for potential use in remote sensing.
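As an illustration, the indices listed above can be computed directly from reflectance arrays. The functions below are a minimal sketch: the coefficients follow the cited definitions, and the input arrays are assumed to hold surface reflectance scaled to [0, 1].

```python
import numpy as np

def ndvi(nir, red):
    """Normalised difference vegetation index (NDVI) [11]."""
    return (nir - red) / (nir + red)

def evi2(nir, red):
    """Enhanced two-band vegetation index (EVI2) [12]."""
    return 2.5 * (nir - red) / (nir + 2.4 * red + 1.0)

def mcari(r700, r670, r550):
    """Modified chlorophyll absorption ratio index (MCARI) [8]."""
    return ((r700 - r670) - 0.2 * (r700 - r550)) * (r700 / r670)

def pri(r531, r570):
    """Photochemical reflectance index (PRI) [9]."""
    return (r531 - r570) / (r531 + r570)

# Example: a healthy canopy reflects strongly in the NIR and weakly in the red,
# yielding a high NDVI
print(ndvi(np.array([0.45]), np.array([0.08])))
```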
The development of remote sensing for crop monitoring, based on satellites and drones, offers numerous advantages. On the one hand, it offers the possibility of covering large surfaces with minimal sampling time, but also at relatively short revisit frequency. For instance, the Sentinel mission of the European Union’s Copernicus Programme can provide satellite images every 5 days [13].
However, the relatively low spatial resolution of satellite sensors presents several challenges, especially in many European regions such as the Comunitat Valenciana, where agricultural areas are highly fragmented. In these areas, agriculture is predominantly managed by smallholders, whose land is often divided into several non-contiguous orchards, typically smaller than 1 ha. As a result, boundary pixels make up a large proportion of each field and may include various sources of noise, such as fences, roads, and the reflectance of adjacent crops. Moreover, while arable crop fields tend to be larger and more homogeneous, orchards are smaller, and their trees are usually surrounded by bare soil or by natural or sown vegetation covers. Consequently, the signal captured by satellites from within orchards is often a mixture of reflectances from different plant species and exposed soil.
To address the shortcomings of standalone satellite, UAV, and ground-based datasets, researchers are increasingly adopting data-fusion techniques. By integrating imagery from diverse platforms and ground observations, data fusion leverages the spatial detail, temporal frequency, and spectral richness of each input source [14]. In this sense, Moltó [15] proposed a method for merging images with different temporal and spatial resolutions and with different degrees of spectral quality by combining images from Sentinel 2, orthophotos, and drones.
Similarly, the combined use of imagery from multiple sources with varying spatial resolutions has been explored in previous studies to enhance sooty mould detection. For example, Fletcher [16] integrated hyperspectral, Red, Green, and Blue (RGB), and NIR aerial images at spatial resolutions of 2.44 m and 0.61 m, respectively, to detect citrus sooty mould, achieving a high level of spatial detail (<1 m) and enabling the identification of affected areas in orchards of approximately 2 ha. In contrast, Olsson [17] adopted a multitemporal approach based on vegetation index time series derived from medium and coarse resolution imagery. Specifically, they utilised NDVI and the Green Normalised Difference Vegetation Index (GNDVI) obtained from 10 m SPOT data and 250 m MODIS imagery to monitor sooty mould infestation associated with Physokermes inopinatus in spruce (Picea abies) forests across Sweden. Their method demonstrated the capability to detect defoliation and discoloration patterns, successfully identifying 78% of the affected area within a 3000 km2 region. However, the system tended to overestimate damage extent by approximately 46%, highlighting the trade-off between spatial coverage and detection precision.
The aim of this study is to classify the severity of citrus sooty mould infestation using spectral indices related to photosynthetic stress, by applying image fusion techniques that combine medium spatial resolution satellite imagery with high spatial resolution orthophotos.

2. Materials and Methods

2.1. Study Area and Field Monitoring

The study area covers 180 ha and is located in Valencia, Spain (longitude 0.60872° W to 0.58186° W, latitude 39.56421° N to 39.57955° N); it is characterised by a high prevalence of citrus, fruit, and vegetable farms. Monitoring of citrus sooty mould was conducted by expert entomologists from IVIA (Valencian Institute of Agricultural Research) through field visits on 2 August and on 19 October 2022. These dates were selected to capture the initial appearance of sooty mould (August) and its peak development before harvest (October).
The experts provided georeferenced data, consisting of one to three representative observation points at various locations within each orchard. Each observation was assumed to represent an area of about 200 m2. Three levels of infestation were defined arbitrarily: 0—no visible presence, 1—incipient infestation, and 2—abundant presence.
In total, 33 orchards were surveyed, resulting in 37 observation points in August and 69 in October. While some orchards were sampled at both dates, observations were not taken from the exact same locations. This yielded a georeferenced dataset of 106 sampling points, each annotated with its corresponding infestation level. Figure 1 shows in detail a region of the study area.

2.2. Overall Image Processing

Sentinel-2 Multispectral Instrument (MSI) Level-2A satellite images (S2-MSI-L2A) were used in this study. MSI provides multispectral imagery at spatial resolutions of 10 m for visible and NIR bands, 20 m for red-edge and shortwave infrared (SWIR) bands, and 60 m for atmospheric correction bands. The mission offers a revisit frequency of approximately five days by combining data from the Sentinel-2A and Sentinel-2B satellites. Particularly relevant for vegetation monitoring are the three narrow red-edge bands (centred at 705, 740, and 783 nm), located between the red and NIR regions of the spectrum and provided at 20 m resolution. These features make Sentinel-2 especially suitable for detecting subtle changes in vegetation condition and photosynthetic activity [18].
The image processing workflow was structured into three steps:
  • Exploratory analysis: to assess the temporal evolution of spectral reflectance across all bands and various vegetation indices, to identify those more correlated with the presence of sooty mould.
  • Band selection and generation of condensed synthetic images: based on the previous step, specific bands and indices were selected to generate two synthetic multiband images representing the months when surveys were conducted (August and October).
  • Pixel selection: Pixels at 10 m resolution often include not only citrus canopies but also weeds, vegetation covers, or bare soil. A filtering process based on orthophotos with a spatial resolution of 0.25 m was applied to discard predominantly mixed pixels.

2.3. Exploratory Analysis

The period analysed was from 1 March 2022 to 22 October 2022, corresponding to the beginning of spring and the last sampling date. Time series of S2-MSI-L2A images were filtered to remove cloud pixels and cloud shadows using the Scene Classification Layer (SCL band), a quality assurance band provided with S2-MSI-L2A products that classifies each pixel into different surface types (e.g., clouds, shadows, vegetation, etc.). Additionally, pixels with NIR and red-edge reflectance values lower than 2000, potentially indicating poor-quality data, such as shadows, haze, or very dark surfaces, were also excluded.
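A minimal sketch of this filtering step, assuming numpy arrays of SCL codes and band reflectances; the exact set of SCL classes treated as invalid is our assumption, since the text only mentions clouds and cloud shadows:

```python
import numpy as np

# SCL classes discarded (assumed codes from the Sentinel-2 L2A product:
# 3 = cloud shadow, 8/9 = cloud medium/high probability, 10 = thin cirrus)
INVALID_SCL = {3, 8, 9, 10}
REFLECTANCE_FLOOR = 2000  # threshold on NIR/red-edge values, as in the text

def valid_pixel_mask(scl, nir, red_edge):
    """Boolean mask of pixels retained for the time series analysis."""
    scl_ok = ~np.isin(scl, list(INVALID_SCL))
    bright_enough = (nir >= REFLECTANCE_FLOOR) & (red_edge >= REFLECTANCE_FLOOR)
    return scl_ok & bright_enough

scl = np.array([4, 8, 4, 3])          # vegetation, cloud, vegetation, shadow
nir = np.array([3000, 3000, 1500, 3000])
re = np.array([2500, 2500, 2500, 2500])
print(valid_pixel_mask(scl, nir, re))  # [ True False False False]
```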
Furthermore, since orchard edge pixels often contain noisy information, they were filtered using field boundaries obtained from the Land Parcel Identification System (SIGPAC, Sistema de Información Geográfica de Parcelas Agrícolas), a geographic information system used in Spain to identify agricultural plots eligible for EU agricultural subsidies.
Several vegetation spectral indices were generated in a subsequent step (Table 1).
The exploratory analysis involved studying the temporal evolution of the average response of S2-MSI-L2A bands and various vegetation indices in relation to sooty mould infestation at the sampling points. Signals exhibiting a logically ordered evolution, either increasing or decreasing with respect to the infestation level, were selected.
Figure 2a–d show examples of the temporal evolution of the average signals from the NIR band, TGI, red band, and NDVI, respectively. An appropriately ordered pattern is observed in the NIR reflectance (Figure 2a), where the signal is lower for higher infestation levels from July onwards. The TGI index evolution (Figure 2b) shows that levels 1 and 2 (orange and green lines) clearly distance themselves from level 0, particularly since mid-June. However, Figure 2c,d, corresponding to the red band reflectance and NDVI, do not show any of these patterns.
After this process, B6, B7, B8A (all narrow red-edge bands), B8 (NIR), B11 (SWIR), and the spectral index TGI (red, green, blue) were selected.

2.4. Condensed, Synthetic Images Representing August and October

To condense the monthly information around the two sampling dates, two synthetic images were generated from six-band (B6, B7, B8, B8A, B11, and TGI) images acquired 14 days before and after each sampling date. All the bands were resampled by bilinear interpolation at 10 m after the cloud and cloud shadow pixel removal. The value of each pixel in the synthetic image was the median of the resulting series.
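The compositing step can be sketched as a per-pixel median over the masked time stack; the array layout used below is an assumption for illustration:

```python
import numpy as np

def monthly_composite(stack):
    """Per-pixel median over a time stack of shape (dates, bands, H, W),
    ignoring NaNs left by the cloud/shadow masking step."""
    return np.nanmedian(stack, axis=0)

# Toy stack: 3 dates, 1 band, 2 x 2 pixels; the first date is masked (NaN)
# at pixel (0, 0)
stack = np.array([
    [[[np.nan, 0.2], [0.3, 0.4]]],
    [[[0.5,   0.2], [0.3, 0.4]]],
    [[[0.7,   0.8], [0.3, 0.4]]],
])
print(monthly_composite(stack)[0])
```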

2.5. Pixel Filtering Using Orthophotos

Institut Cartogràfic Valencià (ICV) [28] delivers a comprehensive set of high-resolution (0.25 m) multispectral RGBI orthophotos on an annual basis. These orthophotos are produced using aerial imagery captured during dedicated flights over the Valencian Community during May. The images are then processed, corrected, and georeferenced to generate updated, reliable representations of the territory.
These images were used to identify S2-MSI-L2A pixels containing at least a given percentage of citrus canopy. A 0.25 m spatial resolution NDVI image of the study area was generated from the orthophotos. NDVI values above 0.1 are typically indicative of vegetation presence, and this threshold is often used to distinguish vegetated areas from bare soil or non-vegetated surfaces [29]. However, analyses of various agricultural crops have shown that the relationship between NDVI and canopy cover can differ between crop species [30]. A threshold of 0.25 was therefore set empirically to differentiate between plants and soil, after testing in different areas of the orthophotos known to contain citrus trees. Figure 3 represents the resulting binary image, with yellow pixels indicating NDVI > 0.25 and black pixels indicating NDVI ≤ 0.25. Subsequently, a morphological opening (erosion followed by dilation) was applied to isolate citrus canopies. Figure 3a shows the thresholded NDVI image, while Figure 3b,c depict the images before and after applying the morphological filter to an arbitrarily selected area marked with a red box. In Figure 3b, vegetation covers between the rows and at the field edges can be observed. Figure 3c shows that only citrus canopies remain after the morphological opening.
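This thresholding and opening sequence can be sketched with scipy.ndimage; the 3 × 3 structuring element is an assumption, as the text does not specify the element used:

```python
import numpy as np
from scipy import ndimage

NDVI_THRESHOLD = 0.25  # empirical plant/soil cut-off reported in the text

def canopy_mask(ndvi_img, opening_size=3):
    """Threshold the 0.25 m NDVI image, then apply a morphological opening
    (erosion followed by dilation) to remove thin inter-row vegetation.
    The opening_size x opening_size structuring element is an assumption."""
    binary = ndvi_img > NDVI_THRESHOLD
    structure = np.ones((opening_size, opening_size), dtype=bool)
    return ndimage.binary_opening(binary, structure=structure)
```

Applied to a toy NDVI image, an isolated high-NDVI pixel (e.g., a weed) is removed, while a compact 3 × 3 canopy patch survives the opening.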
A 10 × 10 m grid was superimposed on the image obtained in the previous step, with each grid cell corresponding to a S2-MSI-L2A pixel. Only those S2-MSI-L2A pixels containing more than 45% canopy cover were considered reliable. This threshold was arbitrarily selected after trial-and-error visual inspection of the results in different parts of the image containing citrus orchards not used in the study. Figure 4a displays the S2-MSI-L2A pixel grid overlaid on the binary image. The green box highlights the area shown in greater detail in Figure 4b. In this enlarged view, pixels with canopy cover greater than 45% are shown in orange and were selected for the next step, while pixels with less than 45% canopy cover appear in purple and were excluded from further analysis.
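The per-cell canopy-cover test can be sketched by aggregating the 0.25 m binary mask into 40 × 40 blocks, one per 10 m Sentinel-2 pixel:

```python
import numpy as np

COVER_THRESHOLD = 0.45  # minimum canopy fraction per Sentinel-2 pixel
BLOCK = 40              # 10 m / 0.25 m = 40 orthophoto pixels per side

def reliable_pixels(canopy):
    """Canopy fraction of each 40 x 40 block (one S2-MSI-L2A pixel),
    returned as a boolean grid of reliable Sentinel-2 pixels."""
    h, w = canopy.shape
    # Trim any partial blocks at the edges, then reshape into blocks
    blocks = canopy[:h - h % BLOCK, :w - w % BLOCK].reshape(
        h // BLOCK, BLOCK, w // BLOCK, BLOCK)
    fraction = blocks.mean(axis=(1, 3))
    return fraction > COVER_THRESHOLD
```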

2.6. Classification and Accuracy Estimation

Throughout this study, it was assumed that each expert observation corresponds approximately to a circular area with an 8 m radius (around 200 m2). Therefore, all pixels deemed reliable according to the criteria defined in the previous section and located within a circle of 8 m radius centred on each sampling point were assigned the corresponding observed infestation class. Based on this approach, a dataset of 146 pixels was constructed using the two synthetic multiband images (August and October).
Given the limited dataset size, an iterative cross-validation strategy was employed to evaluate the classification system’s accuracy. Specifically, 30 iterations were performed, where in each run, a random training set comprising 80% of the data was used to train the model, and the remaining 20% was reserved for validation.
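A sketch of this iterated hold-out procedure with scikit-learn; hyperparameters such as the number of trees are assumptions, and the stratified splitting is our addition to keep class proportions stable across runs:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, cohen_kappa_score

def iterated_cv(X, y, n_iter=30, test_size=0.2, seed=0):
    """Repeat a random 80/20 train/validation split n_iter times,
    collecting overall accuracy and Cohen's kappa for each run."""
    accs, kappas = [], []
    for i in range(n_iter):
        X_tr, X_te, y_tr, y_te = train_test_split(
            X, y, test_size=test_size, random_state=seed + i, stratify=y)
        clf = RandomForestClassifier(n_estimators=100, random_state=seed + i)
        clf.fit(X_tr, y_tr)
        pred = clf.predict(X_te)
        accs.append(accuracy_score(y_te, pred))
        kappas.append(cohen_kappa_score(y_te, pred))
    return np.array(accs), np.array(kappas)
```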
Figure 5 illustrates how pixels were assigned to training and validation sets in an iteration. The circular shapes in yellow and green represent field-sampled locations, with green indicating infestation level 0 and yellow indicating level 1. In each iteration, pixels selected for the training set are shown in white, while those used for validation are shown in blue.
Classifiers were constructed using the Random Forest (RF) algorithm. RF is an ensemble method that averages predictions across many decision trees, each trained on a bootstrapped subset of the data. This approach reduces variance and helps prevent overfitting, which is particularly important when working with limited data. RF also down-weights less relevant features, improving generalisation even with a small sample size. In addition, individual noisy data points are less likely to affect the overall model, as their influence is diluted across multiple trees. Since comparative analysis was beyond the scope of this research, no other classification method was employed.
For each cross-validation iteration, the confusion matrix was used to compute the overall classification accuracy and the kappa index [31]. Additionally, the average producer’s accuracy (i.e., the probability that a reference sample is correctly classified) and user’s accuracy (i.e., the probability that a pixel labelled as a certain class corresponds to that class on the ground) were calculated.
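Producer's and user's accuracies follow directly from the confusion matrix; in the sketch below, rows are assumed to hold reference classes and columns the predicted classes:

```python
import numpy as np

def producer_user_accuracy(cm):
    """cm[i, j]: samples of reference class i classified as class j.
    Producer's accuracy is the per-class recall (diagonal over row sums);
    user's accuracy is the per-class precision (diagonal over column sums)."""
    cm = np.asarray(cm, dtype=float)
    producers = np.diag(cm) / cm.sum(axis=1)
    users = np.diag(cm) / cm.sum(axis=0)
    return producers, users
```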
To further characterise the distribution of classification accuracy across iterations, the Bowley–Yule skewness coefficient [32] was computed. This coefficient is a robust, non-parametric measure of skewness based on the quartiles of the distribution, defined as Equation (1):
SkewnessBY = (Q3 + Q1 − 2Q2)/(Q3 − Q1)
where SkewnessBY is Bowley–Yule skewness coefficient, and Q1, Q2, and Q3 are the first, second (median), and third quartiles, respectively. A positive coefficient indicates a distribution skewed to the right (with a longer tail of higher values), while a negative coefficient indicates left skewness. Values close to zero mean that the distribution is nearly symmetrical.
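Equation (1) can be computed directly from the per-iteration accuracy values, for example:

```python
import numpy as np

def bowley_yule(values):
    """Quartile-based (Bowley-Yule) skewness coefficient, Equation (1)."""
    q1, q2, q3 = np.percentile(values, [25, 50, 75])
    return (q3 + q1 - 2 * q2) / (q3 - q1)

# A symmetric sample gives 0; a long right tail gives a positive value
print(bowley_yule([1, 2, 3, 4, 5]))
print(bowley_yule([1, 2, 3, 10]))
```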
To assess the effect of integrating high-resolution orthophoto data with satellite imagery, the entire classification process was repeated under two conditions: with and without the exclusion of S2-MSI-L2A pixels having less than 45% vegetation cover.

3. Results

As demonstrated below, the fusion of medium- and high-resolution imagery substantially enhances the ability to detect different levels of infestation. Across the thirty cross-validation iterations, we observed marked improvements in overall accuracy and Cohen’s kappa, together with shifts in the Bowley–Yule skewness coefficients that underscore the robustness gains afforded by multi-resolution data fusion.
Table 2 summarises the statistics of the overall accuracy and kappa values obtained with and without filtering the S2-MSI pixels with the orthophoto images. Classification using the Random Forest algorithm without image fusion yielded an average overall accuracy of 0.75 and a median value of 0.76. Based on the first (0.71) and third (0.78) quartiles, the Bowley–Yule skewness coefficient was −0.42, indicating a clear negative skewness, which suggests a higher frequency of below-average accuracy values. The average kappa index was 0.60, with a median of 0.61, and first and third quartiles of 0.53 and 0.65, respectively. This resulted in a Bowley–Yule skewness coefficient of −0.45, reflecting a clear negative skewness in the distribution of kappa values.
A general improvement in performance is observed: the average overall accuracy increases by 5 percentage points, from 0.75 (without image fusion) to 0.80 (with image fusion), while the median rises from 0.76 to 0.78. The Bowley–Yule skewness coefficient for overall accuracy is +0.50, indicating a clear positive skew, which suggests a higher frequency of above-average accuracy values.
Regarding the kappa index, which accounts for the agreement expected merely by chance, both the mean (0.67) and median (0.66) improve compared to the results without using the image fusion procedure (0.60 and 0.61, respectively). The skewness of the kappa distribution, with a Bowley–Yule coefficient of +0.28, also indicates a mild positive skew, albeit less pronounced than that observed for overall accuracy. The table also shows that the standard deviation of overall accuracy is below 0.10, reflecting a modest spread in the results, while that of the kappa index is slightly higher (0.13).
Table 3 presents the average producer’s and user’s accuracies obtained over the 30 iterations with and without the proposed image fusion methodology. The results indicate improved classification performance using the image fusion procedure, particularly in the producer’s accuracy for infestation levels 1 and 0, which reach averages of 89% and 79%, respectively. However, the producer’s accuracy for level 2 is lower in both cases (65% with image fusion and 43% without), suggesting greater classification difficulty for this class. User’s accuracies are consistently high across the levels, with values ranging from 65% to 84% without image fusion and from 78% to 88% with image fusion. This indicates a stronger agreement in the latter case between predicted and observed classes from the user’s perspective.
Table 4 shows the aggregate confusion matrix over the results of the classifiers on all test sets without image fusion over the 30 iterations. Equivalent producer’s accuracies are reported in brackets. Most of the confusion occurs between adjacent classes; nevertheless, over the 30 iterations, 2.5% of the data classified as level 0 were actually level 2 and 14.3% of the data classified as level 2 were actually level 0, which compromises the practical use of the classifier.
Table 5 shows the aggregate confusion matrix over the results of the classifiers on all test sets using the proposed fusion procedure over the 30 iterations. Again, equivalent producer’s accuracies are reported in brackets. Most of the confusion occurs between adjacent classes; over the 30 iterations, only 1.9% of the data classified as level 0 were actually level 2 and 6.5% of the data classified as level 2 were actually level 0, thus considerably improving the performance.

4. Discussion

This study demonstrates the effectiveness of fusing freely available S2-MSI-L2A imagery with high-resolution orthophotos for detecting sooty mould in citrus orchards in a typical Mediterranean fragmented landscape. Three infestation levels were defined and identified by experts through field visits in August and October 2022. The bands and indices most responsive to the presence of sooty mould were selected. Then, two synthetic images, one for each date, were generated to condense the spectral information around the sampling dates. A filtering preprocessing step was carried out by fusing these synthetic images with high-resolution images, so that only selected pixels were used to train and validate the classification algorithm. This resulted in a clearly improved performance with respect to a non-fused approach.
Unlike Fletcher [16], who worked with commercial aerial imagery and suggested that fused images perform better in areas larger than 0.2 ha, this study succeeded in applying the fusion approach in a substantial number of smaller plots (average < 0.3 ha) by adopting a pixel evaluation strategy based on 10 × 10 m grids overlapping the high-resolution images. This adaptation broadens the applicability of image fusion techniques to fragmented agricultural landscapes.
The segmentation process based on NDVI thresholding and morphological filtering was key to isolating citrus canopy pixels. This allowed only S2-MSI-L2A pixels with more than 45% canopy cover to be retained for classification, significantly improving the reliability of the dataset. The classification results using Random Forest yielded consistent overall accuracies above 0.70, with an average of 0.80 after fusion, and kappa values also improved compared to the non-fusion case. The Bowley–Yule skewness coefficients calculated for both accuracy and kappa metrics reflect a moderate to clear positive skewness after fusion (+0.50 and +0.28, respectively), suggesting that the fusion process not only improves average performance but also yields more consistently high values across iterations. In contrast, the clearly negative skewness values in the non-fused case (−0.42 for overall accuracy and −0.45 for kappa) indicate left-skewed distributions, with a higher frequency of below-average values.
Compared with other studies, such as that of Olsson [17], who reported 78% accuracy using NDVI/GNDVI but with a 46% overestimation due to confusion with non-target species, this approach achieves comparable or better accuracy while reducing misclassification risk by incorporating morphological filtering and pixel selection based on canopy coverage. Moreover, instead of relying exclusively on spectral indices, this study employed multiband time series associated with photosynthetic stress. This strategy agrees with prior laboratory-based research by Blasco [3,4] and Moltó [5], which identified the NIR band as highly effective for detecting sooty mould symptoms in fruits.
It is important to remark that there is a lack of recent scientific literature related to sooty mould detection using satellite imagery, although the latest studies have taken proximal approaches, using images from surveillance home security cameras [33] or mobile phones [34]. Nevertheless, the importance of detecting and monitoring citrus pests and diseases is undeniable and is still under research. For instance, Della Bellver et al. [35] analysed the spectral differences between healthy plots and those affected by Delottococcus aberiae, a mealybug. Similarly, Vieira et al. [36] investigated, under laboratory conditions, how the bacterium Candidatus Liberibacter alters the reflectance profile of asymptomatic citrus leaves.
Despite the above-mentioned achievements, the study presents some limitations, primarily related to the relatively small sample size (146 observations). While the Random Forest algorithm is well-suited for small datasets, it remains challenging to fully eliminate spatial autocorrelation between training and validation sets. Although this issue was partially addressed through the cross-validation strategy, it may still result in overly optimistic accuracy estimates and limit the model’s generalizability. To enhance robustness and reliability, future research should include a larger number of samples distributed across more diverse spatial and temporal conditions. For instance, a larger number of sampling points distributed over a wider area could allow for a targeted train/test split where train and test pixels came from different sampling points, thus decreasing spatial autocorrelation.
Spectral unmixing techniques could also be explored in future research. By estimating the fractional contribution of different land cover components within each S2-MSI-L2A pixel, spectral unmixing could isolate the spectral signature of the citrus canopy more precisely. This approach might prove especially useful where background interference significantly affects pixel-level spectral responses.
It should be noted that satellite imagery can highlight shifts in vegetation health but often lacks the spatial detail and contextual clues needed to pinpoint pest outbreaks. By integrating ground-based observations, weather records, historical infestation maps, and crop-type data, future work could gain a richer, multidimensional perspective that may distinguish pest-induced stress from abiotic factors.

5. Conclusions

This study presents a novel methodological approach to fuse medium spatial resolution satellite images with high spatial resolution orthophoto images. The combination of these two sources of information allowed for an improved identification of various levels of severity of citrus sooty mould infestation with respect to satellite data alone. This approach is particularly useful in highly fragmented landscapes, where medium spatial resolutions may not be sufficient to retrieve crop features. Moreover, the method has been developed using freely accessible remote images, thus providing a viable alternative to the use of higher resolution, commercial satellite images.

Author Contributions

Conceptualisation, E.M.; methodology, E.M.; software, E.M.; validation, E.M.; formal analysis, E.M. and M.P.-S.; investigation, E.M. and M.P.-S.; resources, E.M.; data curation, E.M.; writing—original draft preparation, E.M.; writing—review and editing, M.P.-S., H.I.-S. and S.M.-M.; visualisation, E.M.; supervision, E.M.; project administration, E.M. All authors have read and agreed to the published version of the manuscript.

Funding

This study was partially funded by the Agencia Valenciana de Innovación (AVI) of Generalitat Valenciana (Project INNEST/2021/250), IVIA, and the European Regional Development Fund (ERDF). Héctor Izquierdo benefits from an Instituto Nacional de Investigación y Tecnología Agraria y Alimentaria (INIA) pre-doctoral contract (reference PRE2021-100395) financed by the Spanish Ministry of Science and Innovation (MCIN/AEI/10.13039/501100011033) and the European Social Fund Plus (ESF+).

Data Availability Statement

The orthophotos and field boundaries data were obtained from the Instituto Cartográfico Valenciano in the public domain: https://geocataleg.gva.es/ (accessed on 28 May 2025). Restrictions apply to the availability of infestation-level data and could be available from the corresponding author with the permission of the farmers involved.

Acknowledgments

The authors would like to thank the Entomology unit of the IVIA for providing the sooty mould field sampling data for the development of this study.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Unión Europea. Reglamento Delegado (UE) 2021/1890 de la Comisión de 2 de Agosto de 2021 que Modifica el Reglamento de Ejecución (UE) nº 543/2011 en lo Que Atañe a las Normas de Comercialización en el Sector de las Frutas y Hortalizas. Referencia DOUE-L-2021-81443. Available online: https://www.boe.es/buscar/doc.php?id=DOUE-L-2021-81443 (accessed on 29 April 2025).
  2. Insausti, P.; Ploschuk, E.L.; Izaguirre, M.M.; Podworny, M. The effect of sunlight interception by sooty mould on chlorophyll content and photosynthesis in orange leaves (Citrus sinensis L.). Eur. J. Plant Pathol. 2015, 143, 559–565. [Google Scholar] [CrossRef]
  3. Blasco, J.; Aleixos, N.; Gómez-Sanchís, J.; Moltó, E. Citrus sorting by identification of the most common defects using multispectral computer vision. J. Food Eng. 2007, 83, 384–393. [Google Scholar] [CrossRef]
  4. Blasco, J.; Aleixos, N.; Gómez-Sanchís, J.; Moltó, E. Recognition and classification of external skin damage in citrus fruits using multispectral data and morphological features. Biosyst. Eng. 2009, 103, 137–145. [Google Scholar] [CrossRef]
  5. Moltó, E.; Blasco, J.; Gomez-Sanchis, J. Chapter 10—Analysis of hyperspectral images of citrus fruits. In Hyperspectral Imaging for Food Quality Analysis and Control; Sun, D.-W., Ed.; Academic Press: Cambridge, MA, USA, 2010; pp. 321–348. [Google Scholar]
  6. Summy, K.R.; Little, C.R. Using color infrared imagery to detect sooty mould and fungal pathogens of glasshouse-propagated plants. HortScience 2008, 43, 1485–1491. [Google Scholar] [CrossRef]
  7. Sims, N.C.; De Barro, P.J.; Newnham, G.J.; Kalyebi, A.; Macfadyen, S.; Malthus, T.J. Spectral separability and mapping potential of cassava leaf damage symptoms caused by whiteflies (Bemisia tabaci). Pest. Manag. Sci. 2017, 74, 246–255. [Google Scholar] [CrossRef]
  8. Daughtry, C.; Walthall, C.; Kim, M.; De Colstoun, E.B.; McMurtrey, J. Estimating corn leaf chlorophyll concentration from leaf and canopy reflectance. Remote Sens. Environ. 2000, 74, 229–239. [Google Scholar] [CrossRef]
  9. Gamon, J.A.; Serrano, L.; Surfus, J.S. The photochemical reflectance index: An optical indicator of photosynthetic radiation use efficiency across species, functional types, and nutrient levels. Oecologia 1997, 112, 492–501. [Google Scholar] [CrossRef]
  10. Gitelson, A.A.; Zur, Y.; Chivkunova, O.B.; Merzlyak, M.N. Assessing Carotenoid Content in Plant Leaves with Reflectance Spectroscopy. Photochem. Photobiol. 2002, 75, 272–281. [Google Scholar] [CrossRef]
  11. Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 1979, 8, 127–150. [Google Scholar] [CrossRef]
  12. Jiang, Z.; Huete, A.R.; Didan, K.; Miura, T. Development of a two-band enhanced vegetation index without a blue band. Remote Sens. Environ. 2008, 112, 3833–3845. [Google Scholar] [CrossRef]
  13. Sentinel, Copernicus. Europe’s Eyes on Earth: The EU’s Copernicus Programme. Available online: https://www.copernicus.eu/ (accessed on 29 April 2025).
  14. Allu, A.R.; Mesapam, S. Impact of remote sensing data fusion on agriculture applications: A review. Eur. J. Agron. 2025, 164, 127478. [Google Scholar] [CrossRef]
  15. Moltó, E. Fusion of Different Image Sources for Improved Monitoring of Agricultural Plots. Sensors 2022, 22, 6642. [Google Scholar] [CrossRef] [PubMed]
  16. Fletcher, R.S. Evaluating high spatial resolution imagery for detecting citrus orchards affected by sooty mould. Int. J. Remote Sens. 2005, 26, 495–502. [Google Scholar] [CrossRef]
  17. Olsson, P.-O.; Jönsson, A.M.; Eklundh, L. A new invasive insect in Sweden—Physokermes inopinatus: Tracing forest damage with satellite based remote sensing. For. Ecol. Manag. 2012, 285, 29–37. [Google Scholar] [CrossRef]
  18. Drusch, M.; Del Bello, U.; Carlier, S.; Colin, O.; Fernandez, V.; Gascon, F.; Hoersch, B.; Isola, C.; Laberinti, P.; Martimort, P.; et al. Sentinel-2: ESA’s optical high-resolution mission for GMES operational services. Remote Sens. Environ. 2012, 120, 25–36. [Google Scholar] [CrossRef]
  19. Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a Green Channel in Remote Sensing of Global Vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298. [Google Scholar] [CrossRef]
  20. Huete, A.R.; Didan, K.; Miura, T.; Rodriguez, E.P.; Gao, X.; Ferreira, L.G. Overview of the radiometric and biophysical performance of the MODIS vegetation indices. Remote Sens. Environ. 2002, 83, 195–213. [Google Scholar] [CrossRef]
  21. Gitelson, A.A.; Gritz, Y.A.; Merzlyak, M.N. Relationships between leaf chlorophyll content and spectral reflectance and algorithms for non-destructive chlorophyll assessment in higher plant leaves. J. Plant Physiol. 2003, 160, 271–282. [Google Scholar] [CrossRef]
  22. Louhaichi, M.; Borman, M.M.; Johnson, D.E. Spatially Located Platform and Aerial Photography for Documentation of Grazing Impacts on Wheat. Geocarto Int. 2001, 16, 65–70. [Google Scholar] [CrossRef]
  23. Hunt, E.R.; Daughtry, C.S.; Eitel, J.U.; Long, D.S. Remote Sensing Leaf Chlorophyll Content Using a Visible Band Index. Agron. J. 2011, 103, 1090–1099. [Google Scholar] [CrossRef]
  24. Peñuelas, J.; Gamon, J.A.; Griffin, K.L.; Field, C.B. Assessing community type, plant biomass, pigment composition, and photosynthetic efficiency of aquatic vegetation from spectral reflectance. Remote Sens. Environ. 1993, 46, 110–118. [Google Scholar] [CrossRef]
  25. Gao, B.C. NDWI—A normalized difference water index for remote sensing of vegetation liquid water from space. Remote Sens. Environ. 1996, 58, 257–266. [Google Scholar] [CrossRef]
  26. Key, C.H.; Benson, N.C. Measuring and remote sensing of burn severity. In Proceedings of the Joint Fire Science Conference and Workshop, Boise, Idaho, 15–17 June 1999; Neuenschwander, L.F., Ryan, K.C., Eds.; University of Idaho and International Association of Wildland Fire: Moscow, ID, USA, 1999; Volume II, p. 284. [Google Scholar]
  27. Peñuelas, J.; Baret, F.; Filella, I. Semi-Empirical Indices to Assess Carotenoids/Chlorophyll-a Ratio from Leaf Spectral Reflectance. Photosynthetica 1995, 31, 221–230. [Google Scholar]
  28. Institut Cartogràfic Valencià. Ortofoto de 2022 de la Comunitat Valenciana en RGBI y de 25 cm de Resolución Espacial. 2025. Available online: https://geocataleg.gva.es/ (accessed on 29 April 2025).
  29. Han, L.; Yang, G.; Yang, H.; Xu, B.; Li, Z.; Yang, X. Clustering Field-Based Maize Phenotyping of Plant-Height Growth and Canopy Spectral Dynamics Using a UAV Remote-Sensing Approach. Front. Plant Sci. 2018, 9, 1638. [Google Scholar] [CrossRef]
  30. Sebbar, B.; Moumni, A.; Lahrouni, A. Decisional tree models for land cover mapping and change detection based on phenological behaviors. Application case: Localization of non-fully-exploited agricultural surfaces in the eastern part of the Haouz plain in semi-arid central Morocco. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2020, XLIV-4/W3-2020, 365–373. [Google Scholar] [CrossRef]
  31. Cohen, J. A Coefficient of Agreement for Nominal Scales. Educ. Psychol. Meas. 1960, 20, 37–46. [Google Scholar] [CrossRef]
  32. Bowley, A.L. Elements of Statistics; Charles Scribner’s Sons: New York, NY, USA, 1920. [Google Scholar]
  33. Apacionado, B.V.; Ahamed, T. Sooty Mold Detection on Citrus Tree Canopy Using Deep Learning Algorithms. Sensors 2023, 23, 8519. [Google Scholar] [CrossRef]
  34. Siam, A.F.K.; Bishshash, P.; Nirob, M.A.S.; Mamun, S.B.; Assaduzzaman, M.; Noori, S.R.H. A comprehensive image dataset for the identification of lemon leaf diseases and computer vision applications. Data Brief 2025, 58, 111244. [Google Scholar] [CrossRef]
  35. Della Bellver, F.; Franch Gras, B.; Moletto-Lobos, I.; Guerrero Benavent, C.J.; San Bautista Primo, A.; Rubio, C.; Vermote, E.; Saunier, S. Pest Detection in Citrus Orchards Using Sentinel-2: A Case Study on Mealybug (Delottococcus aberiae) in Eastern Spain. Remote Sens. 2024, 16, 4362. [Google Scholar] [CrossRef]
  36. Vieira, J.G.; Santana, E.D.; Conceição, F.G.; Iost Filho, F.H.; de Pazini, J.B.; Rodrigues, R.; Yamamoto, P.T. Candidatus Liberibacter asiaticus infection alters the reflectance profile in asymptomatic citrus plants. Pest. Manag. Sci. 2025, 81, 1299–1306. [Google Scholar] [CrossRef]
Figure 1. Monitored fields in the study area. The black lines represent the boundaries of the selected orchards; the black dots correspond to the observation points in each orchard; and the coloured boxes correspond to the defined infestation levels (green is level 0, yellow is level 1, and red is level 2).
Figure 2. Temporal evolution according to infestation level: the X-axis represents time and the Y-axis the value of the spectral band or vegetation index. (a) Average reflectivity of the NIR band. (b) Average value of TGI. (c) Average reflectivity of the red band. (d) Average value of NDVI.
Figure 3. (a) Example of the NDVI thresholded image on several orchards. In yellow NDVI > 0.25, and in black NDVI < 0.25. (b) Image inside the red box before applying the opening. (c) Image inside the red box after applying the opening.
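The thresholding-plus-opening step illustrated in Figure 3 can be sketched as follows. This is a minimal illustration: only the 0.25 NDVI threshold comes from the article, while the 3 × 3 structuring element is an assumed choice.

```python
import numpy as np
from scipy import ndimage

def vegetation_mask(red, nir, threshold=0.25):
    """Threshold NDVI, then apply a morphological opening to drop
    isolated vegetated pixels (structuring element size is illustrative)."""
    ndvi = (nir - red) / (nir + red + 1e-9)  # guard against division by zero
    mask = ndvi > threshold
    return ndimage.binary_opening(mask, structure=np.ones((3, 3)))

# toy scene: a 3x3 vegetated patch plus one isolated noisy pixel
nir = np.full((8, 8), 0.2)
nir[1:4, 1:4] = 0.6   # patch: NDVI = 0.5
nir[6, 6] = 0.6       # isolated pixel: also NDVI = 0.5
red = np.full((8, 8), 0.2)
mask = vegetation_mask(red, nir)
# the opening keeps the 3x3 patch and removes the isolated pixel
```

The opening (an erosion followed by a dilation) is what turns panel (b) of Figure 3 into panel (c): connected canopy regions survive, while speckle smaller than the structuring element is discarded.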
Figure 4. Example of pixel selection on the superimposed images. (a) The green box delimits the enlarged area shown in the right image. (b) The 10 × 10 m grid cells in orange are considered reliable pixels, while those in purple are excluded from further analysis.
Figure 5. Example of pixel assignment in one iteration. The white grids delimit pixels belonging to a training set, and the blue grids delimit pixels in the validation set. Green circles correspond to level 0 observations (absence of sooty mould), and yellow circles correspond to level 1 (incipient presence).
Table 1. Spectral indices applied.
| Index | Acronym | Equation | Reference |
|---|---|---|---|
| Normalised Difference Vegetation Index | NDVI | (ρNIR − ρRed)/(ρNIR + ρRed) | [11] |
| Green Normalised Difference Vegetation Index | GNDVI | (ρNIR − ρGreen)/(ρNIR + ρGreen) | [19] |
| Enhanced Vegetation Index | EVI | G·(ρNIR − ρRed)/(ρNIR + C1·ρRed − C2·ρBlue + 1) | [20] |
| Enhanced Vegetation Index (2) | EVI2 | G·(ρNIR − ρRed)/(ρNIR + (C1 − C2/c)·ρRed + 1) | [12] |
| Green Chlorophyll Index | GCI | ρNIR/ρGreen − 1 | [21] |
| Green Leaf Index | GLI | (2ρGreen − ρRed − ρBlue)/(2ρGreen + ρRed + ρBlue) | [22] |
| Triangular Greenness Index | TGI | 0.5·[(λRed − λBlue)(ρRed − ρGreen) − (λRed − λGreen)(ρRed − ρBlue)] | [23] |
| Normalised Pigment Chlorophyll Ratio Index | NPCI | (ρRed − ρBlue)/(ρRed + ρBlue) | [24] |
| Normalised Difference Water Index | NDWI | (ρNIR − ρSWIR1)/(ρNIR + ρSWIR1) | [25] |
| Normalised Burn Ratio | NBR | (ρNIR − ρSWIR2)/(ρNIR + ρSWIR2) | [26] |
| Structure Intensive Pigment Index | SIPI | (ρNIR − ρBlue)/(ρNIR − ρRed) | [27] |
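A few of the indices in Table 1 can be computed from band reflectances as in the following minimal sketch. The TGI band-centre wavelengths (in nm) are the values commonly used with the cited formulation, not values taken from this article, and the TGI sign convention follows the table above.

```python
def ndvi(nir, red):
    """Normalised Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def gndvi(nir, green):
    """Green Normalised Difference Vegetation Index."""
    return (nir - green) / (nir + green)

def tgi(red, green, blue, lam_red=670.0, lam_green=550.0, lam_blue=480.0):
    """Triangular Greenness Index, with commonly used band centres in nm."""
    return 0.5 * ((lam_red - lam_blue) * (red - green)
                  - (lam_red - lam_green) * (red - blue))

def sipi(nir, red, blue):
    """Structure Intensive Pigment Index."""
    return (nir - blue) / (nir - red)

# example with illustrative scalar reflectances
print(round(ndvi(0.45, 0.05), 3))  # 0.8
```

The same functions work unchanged on NumPy arrays of reflectances, so they can be applied per pixel to the Sentinel-2 monthly composites.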
Table 2. Statistics for accuracy and kappa values over the 30 iterations without and with image fusion.
| Statistic | Accuracy (without fusion) | Kappa (without fusion) | Accuracy (with fusion) | Kappa (with fusion) |
|---|---|---|---|---|
| Minimum | 0.65 | 0.45 | 0.58 | 0.37 |
| 1st quartile | 0.71 | 0.53 | 0.76 | 0.59 |
| Median | 0.76 | 0.61 | 0.78 | 0.66 |
| Mean | 0.75 | 0.60 | 0.80 | 0.67 |
| 3rd quartile | 0.78 | 0.65 | 0.87 | 0.78 |
| Maximum | 0.87 | 0.80 | 0.97 | 0.95 |
| Sample std. dev. | 0.06 | 0.09 | 0.09 | 0.13 |
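The Bowley–Yule coefficients quoted in the abstract are the quartile skewness, (Q3 + Q1 − 2·median)/(Q3 − Q1). A minimal sketch, applied here to the rounded kappa-with-fusion quartiles of Table 2 (the article's +0.28 was presumably computed from unrounded values, so the rounded inputs give a slightly different figure):

```python
def bowley_yule(q1, median, q3):
    """Quartile (Bowley-Yule) skewness coefficient."""
    return (q3 + q1 - 2 * median) / (q3 - q1)

# kappa with image fusion, using the rounded quartiles from Table 2
print(round(bowley_yule(0.59, 0.66, 0.78), 2))  # 0.26
```

A positive coefficient means the upper quartile lies further from the median than the lower one, i.e., the cross-validation iterations are skewed towards the better scores.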
Table 3. Averages of user’s and producer’s accuracies in the 30 iterations without and with image fusion.
| Level | Producer's accuracy (without fusion) | User's accuracy (without fusion) | Producer's accuracy (with fusion) | User's accuracy (with fusion) |
|---|---|---|---|---|
| Level 0 | 0.87 | 0.84 | 0.79 | 0.88 |
| Level 1 | 0.81 | 0.72 | 0.89 | 0.78 |
| Level 2 | 0.43 | 0.65 | 0.65 | 0.82 |
Table 4. Confusion matrix of the test sets aggregated over 30 iterations without image fusion. Equivalent producer accuracies are added in parentheses.
| Predicted \ Observed | Level 0 | Level 1 | Level 2 |
|---|---|---|---|
| Level 0 | 527 (86.7%) | 66 (10.9%) | 15 (2.5%) |
| Level 1 | 63 (10.0%) | 509 (80.7%) | 59 (9.4%) |
| Level 2 | 45 (14.3%) | 135 (42.9%) | 135 (42.9%) |
Table 5. Confusion matrix of the test sets aggregated over 30 iterations with merged data. Equivalent producer accuracies are added in parentheses.
| Predicted \ Observed | Level 0 | Level 1 | Level 2 |
|---|---|---|---|
| Level 0 | 209 (79.5%) | 49 (18.6%) | 5 (1.9%) |
| Level 1 | 20 (4.8%) | 367 (88.9%) | 26 (6.3%) |
| Level 2 | 12 (6.5%) | 62 (33.3%) | 112 (60.2%) |
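The parenthesised percentages in Tables 4 and 5 can be reproduced by normalising each row of counts. A minimal sketch using the Table 5 counts, assuming (as the table layout indicates) that rows are predicted levels and columns observed levels:

```python
import numpy as np

# Table 5 counts: rows = predicted level, columns = observed level
counts = np.array([[209, 49, 5],
                   [20, 367, 26],
                   [12, 62, 112]])

# normalise each predicted-class row to percentages
row_pct = 100 * counts / counts.sum(axis=1, keepdims=True)
# row_pct.round(1) -> 79.5, 18.6, 1.9 / 4.8, 88.9, 6.3 / 6.5, 33.3, 60.2

# overall accuracy over the aggregated test sets
overall = counts.trace() / counts.sum()  # about 0.80
```

The diagonal fraction of the total (about 0.80) matches the mean with-fusion accuracy reported in Table 2.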