Article

A Colourimetric Approach to Ecological Remote Sensing: Case Study for the Rainforests of South-Eastern Australia

1
Centre for Ecosystem Science, School of Biological, Earth and Environmental Sciences, University of NSW, Sydney, NSW 2052, Australia
2
NSW Department of Planning, Industry and Environment, Newcastle, NSW 2052, Australia
3
School of Environmental and Life Sciences, University of Newcastle, Callaghan, NSW 2308, Australia
4
NSW Department of Planning, Infrastructure and Environment, Parramatta, NSW 2124, Australia
*
Author to whom correspondence should be addressed.
Remote Sens. 2021, 13(13), 2544; https://doi.org/10.3390/rs13132544
Submission received: 13 May 2021 / Revised: 18 June 2021 / Accepted: 20 June 2021 / Published: 29 June 2021
(This article belongs to the Section Ecological Remote Sensing)

Abstract
To facilitate the simplification, visualisation and communicability of satellite imagery classifications, this study applied visual analytics to validate a colourimetric approach via the direct and scalable measurement of hue angle from enhanced false colour band ratio RGB composites. A holistic visual analysis of the landscape was formalised by creating and applying an ontological image interpretation key from an ecological-colourimetric deduction for rainforests within the variegated landscapes of south-eastern Australia. A workflow based on simple one-class, one-index density slicing was developed to implement this deductive approach to mapping using freely available Sentinel-2 imagery and the supercomputing power of Google Earth Engine for general public use. A comprehensive accuracy assessment based on existing field observations showed that the hue from a new false colour blend combining two band ratio RGBs provided the best overall results, producing a 15 m classification with an overall average accuracy of 79%. Additionally, a new index based on a band ratio subtraction performed better than any existing vegetation index typically used for tropical evergreen forests, with comparable results to the false colour blend. The results emphasise the importance of the SWIR1 band in discriminating rainforests from other vegetation types. While traditional vegetation indices focus on productivity, colourimetric measurement offers versatile multivariate indicators that can encapsulate properties such as greenness, wetness and brightness as physiognomic indicators. The results confirmed the potential for the large-scale, high-resolution mapping of broadly defined vegetation types.

Graphical Abstract

1. Introduction

False colour images have typically offered a starting point for satellite imagery processing and analysis, allowing the interpreter to visualise and recognise distinguishable patterns in landscapes using data from bands in and beyond the visual spectrum, into the near-infrared and shortwave infrared ranges. Surprisingly, however, the direct measurement of colour metrics (or colourimetry) from false colour imagery has received little attention, particularly for vegetation classification. Recent machine learning approaches are still seen by many researchers as black boxes that are difficult to interpret, which is impractical when end users want to visualise, understand and communicate how mapping solutions work, how reliable mapping decisions are and how they were arrived at [1,2,3,4,5]. Indeed, visualising what machine learning approaches are actually learning, in order to explain models and enhance trust, is still an open area of research [3,6,7,8,9]. The focus of this study was to revisit colourimetric approaches, such as those from Wester et al. [10] to Pekel et al. [11], and develop them further, given the recent availability of temporal data at a high enough resolution to do so well. This study aimed to apply ecological knowledge, visual analytics and evidential reasoning [11,12] to simplify large-scale, high-resolution remote sensing classifications, keeping them feasible, scalable and more transparent and communicable for a wider public and non-expert audience through the direct and quantitative measurement of colour from false colour satellite imagery. Extensive validation across several ecoregions showed that colourimetry can provide high mapping accuracies. There should always be scope to choose the most appropriate method for an outcome, and the proposed method simply adds an option to the toolkit, with comparative strengths and weaknesses that differ from those of the other methodological options.
Satellite Imagery Interpretation (SII) can be considered the quantitative (and, thus, less subjective) multispectral analogue to the commonly known, accepted and continually evolving technique of API (Aerial Photographic Interpretation) [13,14,15,16]. API has been described as a visual problem-solving activity whereby the analyst uses knowledge and experience to derive insights from the patterns of visual cues in an image [17]. A good example of the implementation of SII is the augmented visual interpretation applied in the Food and Agriculture Organization of the United Nations’ Collect Earth software for land use and land cover assessment [18]. SII can also be seen as a remote sensing application of Visual Analytics, which has been described as “combin(ing) automated analysis techniques with interactive visualisations for an effective understanding, reasoning and decision making on the basis of very large and complex datasets” [12]. SII with a holistic visual analysis of the landscape [19,20] was formalised with the creation and application of an image interpretation key from an ecological colourimetric deduction. Image interpretation keys are intended to summarise complex information, to train inexperienced personnel in the interpretation of complex or unfamiliar topics, or to provide a reference for experienced interpreters to rapidly identify examples pertaining to specific features [21]. The interpretation key presented in this study is a type of domain ontology [22] that can be agreed upon by experts and validated by an accuracy assessment to resolve cognitive biases. A deductive colourimetric approach is thus proposed to map landscape associations that can be visually interpreted from an image by an analyst and communicated to and applied by non-experts.
A mapping case study was carried out for the rainforests of south-eastern Australia, a diverse vegetation formation within a variegated landscape, to demonstrate the utility of a deductive colourimetric approach. We show how the classification of a land cover class can be simplified by identifying the match between the ecological and false colour characteristics that make it unique within its environment during a particular season. Vegetation indices have been shown to be useful for estimating variables related to photosynthesis at the canopy or ecosystem scale, including phenology, primary productivity, net carbon fixation, gross primary productivity and processes related to plant transpiration [23]. However, rainforests in Australia vary from dry to very moist, with a wide range of structural variability [24]. Further, different unrelated vegetation types juxtaposed with rainforests can share similar structural or productivity-related values, whereas false colour RGB composites have the potential to distinguish land cover features uniquely, serving as useful indicators of physiognomic traits for vegetation. Band ratio RGBs were created and tested to increase the hue angle separability of classes by using band ratios for each RGB channel. While the use of band ratio arithmetic in RGB composites is not new—as an often-cited example, Sultan [25] used the multiplication of band ratios in just one channel to emphasise the distinction of minerals in a lithological mapping study—band ratio RGBs have never been used in vegetation studies before, and direct measurements from them have never been used to classify features.

1.1. Rationale for a Colourimetric Approach

1.1.1. Colourimetry and Interpretation

Image processing is performed to decrease the correlation between bands and enhance the contrast between colours to highlight the information of greatest interest to the analyst [10]. Commonly applied RGB combinations include ‘Natural colour’ (RGB: Red, Green, Blue), ‘Colour infrared’ (RGB: NIR, Red, Green), ‘Land/water’ (RGB: NIR, SWIR1, Red) and ‘Agricultural’ (RGB: SWIR2, NIR, Green) [26]. Further spectral distinction can be provided by the more spectrally representative multivariate visualisations provided by band ratio combinations [27]. Colour can be considered a fundamental biophysical variable [28] and has been identified as the most commonly referenced perceptual cue that can be interpreted by inference and deduction [16,17,29,30,31]. Wester et al. [10] recommended hue because it makes it easy to interpret colours that consistently represent particular features. Colour, particularly hue, has thus been considered the most perceptibly intuitive way for humans to differentiate and communicate their observations about features. The HSV colour space simulates the way humans perceive colours, where hue represents the dominant spectral component [32]. Decorrelating hue increases the separability of features by reducing the variations in chromatic modulation (Saturation) and brightness (Value) [33].
In terms of direct or colourimetric measurements, Pekel et al. [34] applied the visually intuitive metric of hue angle and showed its temporal relationship with the vegetation index NDVI in a study to monitor the vegetation extent. Other terrestrial environment studies that applied a hue angle include: [35,36]. There have been many more aquatic detection and classification studies based on the hue angle, typically for RGB combinations of the Shortwave Infrared, Near-infrared and Red bands, including: [37,38,39,40,41,42]. The technique has been proven effective at a global level with the high-resolution mapping of global surface water [11].
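As a concrete illustration of the hue angle measurement discussed above, the following Python sketch derives a hue angle in degrees from one pixel of a false colour composite using the standard-library colorsys module. The scaling of channels to [0, 1] and the reflectance values shown are illustrative assumptions, not values from the study.

```python
import colorsys

def hue_angle(r, g, b):
    """Hue angle in degrees [0, 360) for one pixel of a false colour
    composite whose channels are scaled to [0, 1]."""
    h, _, _ = colorsys.rgb_to_hsv(r, g, b)  # hue in [0, 1), as in GEE's rgbToHsv
    return h * 360.0

# 'Land/Water' composite pixel: R = NIR, G = SWIR1, B = Red (illustrative values)
orange = hue_angle(0.35, 0.20, 0.05)  # about 30 degrees, an orange hue
```

Because hue depends only on the relative magnitudes of the three channels, it is unchanged by a uniform brightness scaling of all three, which is part of what makes it comparatively robust to hill shading.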

1.1.2. Scalability

A major problem of purely statistical methods for high thematic resolution mapping is the scaling issues that can arise when features modelled by field samples at a particular scale cannot be transferred directly to different regions or scales [1,43,44,45]. Principal components analysis (PCA) has typically been used in the past on band ratio RGBs to reduce the dimensionality of multivariate data and to classify landscape features based on the size and significance of the component loadings [27]. However, principal components are influenced by the sample size and assume that variables change linearly along underlying environmental gradients, and distorted ordinations can result from nonlinearity and the inclusion of collinear variables [46]. In contrast, the hue angle is scalable, since it is not parameterised, and visually determined thresholds from SII can mitigate the need for the intensive sampling that is typically required for statistically modelled approaches.

1.1.3. Reducing Sampling Cost with Satellite Imagery Interpretation

Sufficient field training data is rarely available, as it may be expensive and logistically difficult to collect for the requirements of conventional statistical approaches to classification, as well as their accuracy assessments, across large and remote areas [47]. Other issues, such as accessibility to private lands or places that are physically difficult to reach, can sometimes make these approaches impossible. However, analysts can learn to identify many physiognomic and floristic classes in remotely sensed imagery, given a field-based knowledge of general distributions together with the free and comprehensive reference data collection from high-resolution and multispectral image archives [48] that can now be accessed and processed online with Google Earth Engine (GEE) [49]. This study was thus primarily knowledge-based and measurement-driven, in an attempt to promote simplicity and reduce the effort and cost of sampling, following recommendations to abandon traditional statistical methods, which have been found to be sensitive to scaling effects, and to place more emphasis on the visualisation and interpretation of geospatial data over a range of scales [50]. This involved using ecological and field knowledge to deduce landscape patterns in false colour images and comparing and associating freely available Very High-Resolution (VHR; at one meter or less [33]) imagery with medium-resolution satellite imagery, such as the 15 m Sentinel-2 imagery provided by GEE, to visually define classes with density sliced vegetation indices or false colour hue angles.
Most vegetation data from the past have been sampled preferentially (or purposively) and are thus considered not to fulfil the statistical assumption of independence of observations necessary for valid statistical testing and inference. While systematic, simple random and stratified random sampling better meet some of the statistical assumptions, purposive sampling typically yields data sets that cover a broader range of vegetation variability [51]. Random sampling is not efficient at assisting pattern recognition but is, rather, only a measure of probability. Following a holistic [20] and Gestalt approach [52], it is acknowledged in this study that the interpretation of part of an image depends on the interpretation of the rest of the image and, indeed, the surrounding landscape. The interpretation was largely contextual and used the aforementioned perceptual cues to associate the hues that can uniquely and consistently define rainforest by visual ecological deduction, grouping observations by qualities including proximity, similarity and continuity. SII is proposed to emulate the process of purposive sampling by allowing the interpreter to interactively take a holistic assessment of the landscape, iteratively panning around and zooming in and out from features that appear to form a pattern in multispectral imagery, and confirming assumptions colourimetrically in combination with interpretations from VHR imagery. This study presents the identification of colourimetric benchmarks that mark breaks and ranges of recognisable ecosystem features in the multispectral colour spaces of satellite imagery, explained further by example in the Methods. The cognitive deduction involved can be seen as a visually interactive process of association and elimination analogous to the presence and background learning described by Deng et al. [53].
In the same way that we can generalise that urban environments are bright and have angular geometries, and that oceanic water appears dark compared to land, we can deduce that tree floristic classes and other tree communities can be distinguished in reference imagery from nearby associations by characteristic canopy appearance in high-resolution imagery or by distinct phenologic observations from either high- or medium-resolution reference imagery [18,48].

1.2. Colourimetric Ontologies and Multidimensional Colour Blending

Ontology-driven classifications provide the identification of meaningful and communicable features [54]. In this study, false colour image classifications are driven by ontology. The RGB combinations selected for testing in this study were based on a priori knowledge and experimentation. Red and Near-Infrared bands in combination (either through a simple ratio or a normalised difference like NDVI) indicate vegetative vigour or photosynthetic activity, and the first Shortwave Infrared band (SWIR1) in either Landsat or Sentinel-2 sensors has a strong relationship with moisture, with the highest coefficient for wetness in the Tasseled Cap transformation [55,56,57]. We can thus assume that combining these three bands in an RGB composite could provide a corresponding multivariate relationship in the HSV colour space, indicating photosynthetic capacity and moisture. While simple RGBs like the ‘Land/Water’ RGB (which is composed of these three bands) provide good contrasting visualisations for interpretation, they tend to display narrow data distributions for hue angle. This could be resolved by applying image stretches like histogram equalisations; however, stretching was avoided in order to keep the solutions absolute and scalable. Band ratio RGBs were tested instead, as they usually provide wider data distributions and conveniently reduce hill shading compared with simple RGBs.
Various statistical methods previously proposed for the optimal selection of false colour RGB band combinations [58,59,60,61,62] are not scalable due to their dependence on available field samples and their extent. To avoid scaling issues, the band ratios in this study were created deductively by assigning and dividing the SWIR1 band in each RGB channel with the bands from the ‘Natural colour’ and ‘Infrared colour’ combinations to create band ratio RGBs with the expectation that they would separate features as much as possible with the wetness, structure and greenness related data provided by these combinations.
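To make the construction concrete, the following numpy sketch builds one such band ratio RGB by dividing SWIR1 by each of the ‘Natural colour’ bands. The band arrangement, the epsilon guard against division by zero and the joint scaling to [0, 1] are illustrative assumptions, not the study’s exact recipe.

```python
import numpy as np

def band_ratio_rgb(swir1, red, green, blue, eps=1e-6):
    """Illustrative band ratio RGB: SWIR1/Red, SWIR1/Green, SWIR1/Blue,
    stacked as the R, G and B channels of a composite."""
    rgb = np.stack([swir1 / (red + eps),
                    swir1 / (green + eps),
                    swir1 / (blue + eps)], axis=-1)
    # Scale jointly (one global maximum) so the relative channel
    # magnitudes, and hence the hue, are preserved
    return rgb / rgb.max()
```

Because ratios of the same numerator cancel common illumination, a composite like this tends to suppress hill shading, consistent with the motivation given above.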
Miller et al. [63] presented multidimensional blending (or transparency weighted image combination) as a strategy for distilling and communicating multiple sources of information simultaneously in a controlled manner for communication to the human analyst. This study compared mapping results from (1) traditional multispectral indices, (2) the hues from false colour band ratio RGBs and (3) a false colour band ratio RGB blend, all of which were intended for quantitative measurement rather than just communication. The RGB combinations were selected from a matrix of the possible arithmetic combinations of multiplications and divisions between two band ratio RGBs, to be further explained and justified in the Methods.

1.3. Simplicity and Transparency through One-Class, One-Index Density Slicing

For the sake of simplicity and transparency for general public use and to compare the precision of the results at the pixel level rather than a generalised and arbitrary segment level, a one-class, one-index thresholding approach was applied to extract one visibly distinguishable and ontologically communicable feature at a time directly from false colour imagery. A direct image analysis was preferred to hybrid approaches that combine image classification and environmental modelling, because it maintains the highest precision possible by not corrupting the raw data from satellite imagery with other spatial variables. The latter are generally only available at resolutions coarser than the imagery, making it impossible to accurately map fine features like riparian gullies, rivers, roads or disturbed or illegally deforested areas.
Multiple class classifications can be inefficient and suboptimal in terms of accuracy when the main goal is to classify only one or a few classes [64]. One-class classifications focus on the classes of interest, redirecting the resources and effort from other separable or potentially unrelated classes and providing opportunities to derive highly accurate classifications from small training sets. In this study, visual density slicing was applied to categorise the data in the candidate indices by dividing the range of values into quantile intervals, then assigning each interval a category colourimetrically. Examples of the simple yet robust use of density slicing to define thresholds for mapping include [65,66,67,68], and, owing to its simplicity, it is used operationally, for example by land managers classifying and mapping wildfire severity [69].
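A minimal numpy sketch of the quantile-based density slicing described above; the number of slices and the array-based workflow are illustrative assumptions.

```python
import numpy as np

def density_slice(index_img, n_slices=10):
    """Assign each pixel of a single-band index image to one of
    n_slices quantile intervals (0 .. n_slices - 1)."""
    # Quantile edges divide the value range into equally populated bins
    edges = np.quantile(index_img, np.linspace(0.0, 1.0, n_slices + 1))
    # Inner edges only; digitize maps each value to its interval index
    return np.digitize(index_img, edges[1:-1], right=True)
```

An analyst would then inspect the sliced image, merge adjacent intervals that correspond to the same visual feature and read off a single threshold for the one-class mask.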

2. Case Study: Rainforests of South-Eastern Australia

Worldwide, rainforests provide important and irreplaceable ecosystem services, including: sustaining high levels of biodiversity, storing carbon, moderating flood and drought cycles, limiting soil erosion, reducing downstream flooding, influencing rainfall patterns, providing habitat to wildlife and indigenous people and supplying many useful products upon which local communities depend [48,70]. They thus have high conservation as well as cultural values, which have been affected by anthropogenic disturbances at multiple scales [71]. The rainforests of New South Wales (NSW) in south-eastern Australia were chosen to develop and test the colourimetric approach due to their diverse characteristics and extensive range over a large, environmentally heterogeneous region of more than 18 million hectares, approximately 75% the size of the whole of the United Kingdom.

Existing Approaches to Rainforest Mapping and the Need for Detail and Automation

A review of the published literature indicates that, while there are some more recent large-scale maps of generic forest cover or biomass of up to 30 m in resolution, there have not been any automated large-scale forest type maps distinguishing rainforest extents since 2013 at less than 250 m resolution [72].
The existing maps of rainforest for large extents in Australia held within the National Vegetation Information System (NVIS) [73] and the National Forest Inventory (NFI) [74] are quite coarse at 100 m resolution. For New South Wales, mapping largely coincides with the rainforest formations described by Keith [75] and mapped by Keith and Simpson [76]. The mapping is known to have varying degrees of uncertainty, having been compiled from a variety of sources at different spatial and thematic resolutions and laboriously digitised by hand with API. That digitisation has typically been left to the varying discretion and skill of different operators across different mapping extents, with photography from different years and seasons, and has unfortunately never had comprehensive accuracy assessments conducted or documented [47]. A great deal of feature variety can exist within a 100 m pixel, and many features that are of interest to conservation efforts, including rivers and riparian rainforest features such as gullies, can be missed completely. Multispectral satellite imagery has been identified as the most accessible remotely sensed data for monitoring rainforests [48], since it is now free and can be acquired with regularity at relatively high resolutions (at 10 m with Sentinel-2 and 15 m with pan-sharpened Landsat imagery), making updating and monitoring much more feasible than costly one-off API digitisations.

3. Methods

3.1. Study Area and Knowledge Base

This study was limited to the known extent of rainforests in NSW, Australia between latitudes 28°9′ S to 37°30′ S and longitudes 149°5′ E to 153°38′ E. The rainforests of NSW are the most varied of any in Australia and are one of the most important repositories of the State’s biological diversity [75]. They are characterised by closed and continuous canopies dominated by non-eucalypt species, with high tree densities, sometimes in multiple vertical layers, with foliage cover exceeding 70%. They vary from lush subtropical forests to dry vine thickets within broad climatic limits and are associated with the patchy occurrence of suitable soils and the influence of fire and humans [77,78]. Their patch sizes vary from less than one hectare in sheltered gullies to extensive tracts and mosaics within large eucalypt tall open forests [15,79]. They are generally composed of relatively soft to somewhat leathery, horizontally oriented leaves with high specific leaf areas, and their forms and ecology contrast greatly with those of the native sclerophyll taxa, which are generally absent from rainforests [75]. The full range of mainland rainforest types as defined by Keith [75] was considered, including: (1) Subtropical Rainforests, (2) Warm Temperate Rainforests, (3) Cool Temperate Rainforests, (4) Dry Rainforests, (5) Littoral (coastal) Rainforests and (6) Western Vine Thickets. These types demonstrate substantial variations in structures, species compositions, climates, soils and biogeography, with further considerable inter-type variations.

3.2. Ecological Colourimetric Deduction and Interpretation Ontology for Rainforests

An ecological colourimetric deduction was derived from SII applied on the ‘Land/Water’ false colour RGB combination. The appearance of rainforests was generalised to be greener and glossier/brighter relative to surrounding sclerophyll dominated forests, even in the drier rainforest formations with more coriaceous leaf textures. Figure 1 shows examples from an API perspective of VHR imagery from Google Earth, demonstrating the textural, brightness and greenness contrasts between the rainforest types and native sclerophylls and how they can be confirmed in the street view. The same can be done from Google Maps.
It was postulated that false colour image hue angles should perform better than conventional productivity-related vegetation indices like NDVI. Rainforests on the east coast of Australia vary in moisture from dry to very moist, yet share the common physiognomic traits of canopy leaf glossiness and greenness and site wetness at a stand level relative to the drier native sclerophylls in the same areas, a range that most linear vegetation indices would not be able to take into account by themselves at high resolutions.
The SII inspection and association to the existing 100 m NVIS and NFI reference classifications confirmed that there is a noticeable pattern in the greenness and brightness of rainforests compared to sclerophyll forests in the ‘Natural colour’ RGB combination (Red, Green and Blue). This appears much more separable in the ‘Land/Water’ RGB combination (NIR, SWIR1 and Red), which differentiates rainforests by a range of orange hues uniquely from the other hues associated with sclerophyll vegetation. Table 1 presents a SII key that was created and used to interpret the range of expressions that rainforests can exhibit across different landscapes with the perceptual cues of hue, brightness, texture, shape/size and association.
If viewed with the same linear stretch, the orange hues are consistent for rainforests in all the ecoregions, except for the vine thickets in the Northwest slopes, which, on close inspection, were clearly over-mapped in the past due to the low (100 m) resolution, and require histogram stretches at the current extent to be visualised more clearly as orange.
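The one-class logic implied above, flagging pixels whose hue falls in the rainforest-associated orange range, can be sketched as follows. The 20–50 degree bounds are placeholders for illustration only, not the study’s calibrated thresholds.

```python
import numpy as np

def classify_rainforest(hue_deg, lo=20.0, hi=50.0):
    """One-class mask from a hue angle image (degrees): pixels whose hue
    falls within an assumed 'orange' range become candidate rainforest."""
    return (hue_deg >= lo) & (hue_deg <= hi)
```

In the workflow described later, such bounds would be read off interactively from density sliced hue images in a GIS rather than fixed in code.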

3.3. Two Stage Classification Design

A two-stage classification was undertaken to assess the scalability and regional transferability of each candidate index. The first stage focussed on four sample areas that represented the full range of ecological and climate variation across the whole study area. These sample areas were intended to reduce the effort in the first stage of image interpretation and were based on Landsat path/row tiles (Figure 2), which were selected for the consistency and convenience of their size. The northeast tile sampled primarily subtropical rainforest, while the southern tile sampled temperate, predominantly gully located rainforest. The north-western tile sampled the dry vine thickets west of the Northwest slopes, and the central tile sampled a variety of these rainforest types.
The colourimetry analysis described in the following sections was applied to the four sample tiles and then the four ecoregions using a selection of candidate indices that produced alternative image classifications. The ecoregional strata for the second stage were rationalised from visual observation of colour differences across the landscape. Figure 2 shows that the sclerophyll forests of the south coast are generally darker than those of the north coast if viewed in the ‘Land/Water’ RGB. The Tablelands and Ranges of the Great Dividing Range are known to display different physiognomic traits to the coast and to the Northwest slopes, and the index thresholds from the first stage confirmed a corresponding colourimetric difference. The strata boundaries were constructed via manual selection of existing subregions from the Interim Biogeographic Regionalisation for Australia [80].
The extent and performance ranking of candidate indices from the accuracy assessment in the first stage were used for the calibrations to guide the threshold definitions in the second stage. This method of inference avoided the use of any sample data. Instead, the available ground observations were used entirely for the accuracy assessments. Figure 3 illustrates the technical workflow that was applied between GEE and a GIS.
The two-stage approach is also intended to reduce the amount of imagery the analyst needs to download from GEE to process and analyse locally, since the image data is quite large. Once thresholds have been assessed in the first stage with the graphical user interfaces for density slicing available in a GIS (which are currently not available in GEE), the thresholds for the second stage can be refined more easily for the whole extent of each of the ecoregions in GEE with some additional numeric adjustments through trial and error.

3.4. Phenologic Imagery Selection and Processing

Following previous studies of seasonal phenology [81,82], the analysts’ and botanists’ local ecological knowledge, together with visual inspection of interannual, median-based seasonal image composites created in GEE for the coast of NSW, suggested that rainforests are more distinguishable from other vegetation types during the drier summer season. Temporal aggregation from median-based composites has been shown to significantly reduce data volumes on a per-band, per-pixel basis, reducing anomalies, clouds, shadows and abnormal pixelation, resulting in faster and easier analyses suitable for vegetation modelling with accuracy equally as high as time series data [83]. Phan et al. [84] purposively selected summer images to classify vegetation in an environment heavily affected by snow and cloud cover in the winter. Similarly, a median-based summer composite was applied in this study because of the phenological difference between rainforests and the seasonally varying native sclerophylls, which are noticeably less green and, hence, more distinguishable from rainforests in the summer. The summer composites are also conveniently the least affected by clouds and hill shade. Interannual, median-based composites were created for a range of years in order to account for the variability of a relatively steady state, from all the available Sentinel-2 TOA (Top-Of-Atmosphere reflectance) multispectral imagery for the selected months between November and February, depending on the latitude of the bioregions, and between the years 2015–2018 in order to avoid the effects of the ‘black summer’ bush fires of 2019–2020, as shown in Table 2.
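A local numpy analogue of the cloud-masked, median-based temporal compositing described above; the (time, y, x, band) array layout and the boolean cloud mask (True where cloudy) are assumptions for illustration.

```python
import numpy as np

def median_composite(stack, cloud_mask):
    """Per-pixel, per-band temporal median of an image stack shaped
    (time, y, x, band), ignoring observations flagged as cloudy."""
    # Replace cloudy observations with NaN so they drop out of the median
    masked = np.where(cloud_mask[..., None], np.nan, stack)
    return np.nanmedian(masked, axis=0)
```

In GEE itself the equivalent is a masked ImageCollection reduced with a median reducer, which is what produces the seamless mosaics described next.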
The interannual, median-based composites, together with the cloud-masking function (maskS2clouds) [49] from the GEE public example, produced a seamless, well-colour-balanced imagery mosaic. In order to reduce the noise or ‘salt-and-pepper effect’ on the resultant indices that is typical of pixel-based approaches, a low pass filter was applied to the imagery in GEE prior to classification with the convolve() function, which appeared to produce results equivalent to the focal_mean() or reduceNeighborhood() functions.
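The low pass smoothing step can be sketched locally as a simple moving average; the 3 × 3 kernel size and edge padding are assumptions standing in for GEE’s convolve()/focal_mean().

```python
import numpy as np

def focal_mean(band, radius=1):
    """Moving-average low pass filter over a (2*radius + 1)^2 window,
    with edge padding so the output matches the input shape."""
    k = 2 * radius + 1
    padded = np.pad(band, radius, mode="edge")
    out = np.zeros_like(band, dtype=float)
    # Sum shifted views of the padded image, then divide by kernel size
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + band.shape[0], dx:dx + band.shape[1]]
    return out / (k * k)
```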
A visual inspection indicated that, for all the candidate indices, some agricultural and water features were misclassified as rainforest. Therefore, it was necessary to subtract these areas with an existing forest mask [85] before calculating the areas in the Results.

3.5. Selection of Candidate Indices and False Colour Band Ratio RGB Combinations

For the sake of sensor compatibility, the selection of indices and RGB combinations for comparison was limited to those involving satellite bands available at the higher resolutions of both the Sentinel-2 and Landsat satellites. Thus, only the Blue, Green, Red, NIR, SWIR1 and SWIR2 bands were considered, because the Red-edge bands are only available for Sentinel-2, and the Thermal bands are only available for Landsat and are of a much lower resolution. Table 3 lists the vegetation indices and false colour RGB combinations that were tested and the reasons for their selection. Hue was preferred for its aforementioned communicability and because it is less affected by shadows than saturation and value (intensity) are in the typically rugged terrain of rainforests on the south-eastern coast of Australia. Hue was calculated in GEE with the Image.rgbToHsv().select('hue') function.
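For a single pixel, the hue extraction performed by an RGB-to-HSV transform such as GEE’s rgbToHsv() can be illustrated with Python’s standard colorsys module, assuming the three channels have already been stretched to the [0, 1] range:

```python
import colorsys

def hue_degrees(r, g, b):
    """Hue angle (0-360 degrees) of an RGB triple with channels in [0, 1].

    colorsys returns hue as a fraction of a full turn, so it is scaled
    to degrees to match the thresholds quoted in the text.
    """
    h, _s, _v = colorsys.rgb_to_hsv(r, g, b)
    return h * 360.0

# Pure red sits at 0 degrees, orange near 30 and pure green at 120,
# so hue alone separates these colours regardless of brightness.
```

Because hue is decorrelated from value and saturation, a shadowed and a sunlit pixel of the same colour yield the same hue angle, which is the property exploited here.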
The band ratio RGB blend was selected from a matrix of possible arithmetic combinations of multiplications and divisions between two band ratio RGBs: NIR/Red, SWIR1/Green, Red/Blue (a combination of the ‘Land/Water’ RGB and the ‘Natural colour’ RGB) and the SWIR1/Infrared Colour Band Ratio RGB described in Table 3. The former band ratio RGB was chosen because it retains the appearance of the ‘Land/Water’ RGB, which was used for the SII key for rainforests in Table 1 and as the Reference RGB visualisation in Table 3, with related HSV S and V values from the visual interpretation. Multiplications and divisions were chosen on the assumption that they would separate the relations present in the two selected RGBs further than additions and subtractions, which are more likely to produce negative or oversaturated values. The blends that were assessed are presented in Table A1 in Appendix A.
After testing the thresholds for the hue angles of each of these combinations, the combination *, /, / was selected to produce:
R,G,B: (NIR/Red)*(SWIR1/NIR), (SWIR1/Green)/(SWIR1/Red), (Red/Blue)/(SWIR1/Green),
which simplifies algebraically to:
R,G,B: SWIR1/Red, Red/Green, (Red/Blue)/(SWIR1/Green).
This combination was preferred for the visual and numeric separability it provided for rainforests and because it shifted the rainforest hue angles to the beginning of the colour wheel, from 0 degrees up to a threshold of between 6.825 and 7.95 degrees (on a range of 0 to 360) depending on the ecoregion, so that only one threshold value, rather than two, needs to be assessed.
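The arithmetic of the selected *, /, / blend, and its algebraic simplification, can be checked numerically. The reflectance values in the sketch below are invented for illustration only:

```python
def aravena_blend(blue, green, red, nir, swir1):
    """R, G, B channels of the *, /, / blend as written in the text:
    R = (NIR/Red)*(SWIR1/NIR), G = (SWIR1/Green)/(SWIR1/Red),
    B = (Red/Blue)/(SWIR1/Green)."""
    r = (nir / red) * (swir1 / nir)
    g = (swir1 / green) / (swir1 / red)
    b = (red / blue) / (swir1 / green)
    return r, g, b

def aravena_blend_simplified(blue, green, red, nir, swir1):
    """Algebraically equivalent form: NIR cancels in R and SWIR1 cancels
    in G, leaving R = SWIR1/Red and G = Red/Green."""
    return swir1 / red, red / green, (red / blue) / (swir1 / green)
```

With, e.g., Blue = 0.03, Green = 0.06, Red = 0.04, NIR = 0.35 and SWIR1 = 0.12, the two forms agree channel by channel (R = SWIR1/Red = 3.0), confirming the simplification. Note that in practice the channels would still need stretching to a common display range before the HSV transform.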

3.6. Density Slicing with Colourimetric Benchmarks

A benchmark is a standard against which the value of an indicator can be compared and judged; it can represent central tendencies or boundary (or gradient) conditions and can be a composite of measurement indicators [92]. Considering that density slicing of individual image bands can misclassify features of similar brightness to the feature of interest [67], and that the HSV colour space transformation decorrelates value and saturation from hue, colourimetric benchmarks based on the transitions of hue between rainforest and non-rainforest were used to determine the density slicing thresholds of the candidate indices.
A pseudo-colour visualisation was applied to the greyscale of each candidate index with a quantile interval stretch in a GIS and overlaid on the reference imagery (Table 1), which defined the ecological-colourimetric deduction, in which the orange hues in the ‘Land/Water’ RGB combination were assumed to distinguish rainforest. The intervals from the candidate index pseudo-colour stretch were visually matched to the reference image and classified as rainforest where the reference image was orange. Comparing each candidate index or hue angle to the same reference image in this way ensured that decisions were made consistently across the mapped area. The selection of colourimetric benchmarks was an interactive process of augmented visual interpretation, similar to that applied by Bey et al. [18]: the intervals of the candidate indices were progressively attributed, by iterative elimination, to the presence or absence of rainforest, with visual confirmation from the VHR imagery available in GEE at multiple locations across the current ecoregion. In this process, non-rainforest sclerophylls were found to follow a moisture gradient from dry to moist that can be associated with a colourimetric gradient in the Land/Water RGB combination from cyan to red (a complementary relationship on the colour wheel rather than a circular sequence). These colourimetric extremes thus provided ideal benchmarks for thresholding.
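Once a benchmark hue threshold has been settled for an ecoregion, the one-class density slice itself reduces to a single comparison per pixel. A sketch, assuming a grid of blend hue angles in degrees and a hypothetical threshold in the range quoted for the selected blend:

```python
def density_slice(hue_grid, threshold_deg):
    """One-class, one-index density slice: flag pixels whose hue angle
    falls between 0 degrees and the ecoregion's benchmark threshold."""
    return [[0.0 <= h < threshold_deg for h in row] for row in hue_grid]
```

For example, with the threshold set to 7.95 degrees, hues of 3.0 and 7.0 are classified as rainforest while 12.0 is not, which is the whole of the per-pixel decision rule.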

3.7. Accuracy Assessment Design

The accuracy of the map outputs from each algorithm was assessed using an independent field data set of floristic observations from the New South Wales vegetation database, BioNet [93]. The data set consisted of almost 35,000 in situ vegetation plots (georeferenced points) that had been assigned to plant community types, which were reduced to a binary label indicating rainforest or non-rainforest. Each point was intersected with the candidate rainforest classifications and considered correct if it fell within 30 m of the classified rainforest extent, i.e., with a 1-pixel tolerance of the 15 m resolution classification. A repeated Monte Carlo cross-validation approach was taken to estimate accuracy and confidence intervals, with a spatial blocking procedure included in the resampling to avoid the bias that can arise from clustered sample points [94]. This involved taking many random subsamples of the validation data, stratified by spatial blocks, and expressing the accuracy as a summary of the sampling distribution of the accuracy metrics calculated for each subsample. One hundred and thirteen blocks (~2000 km2 per block) were used across the study area, and the overall accuracy was calculated via a standard percentage agreement error matrix. Since rainforest samples were relatively rare in the overall dataset, the subsample was first limited to equal numbers of rainforest and non-rainforest samples, and then five points were sampled from each spatial block. One thousand iterations of the resampling were performed, and the median of the distribution was reported as the overall accuracy, with the 2.5th and 97.5th percentiles as the 95% confidence interval [94]. Each of the mapping regions (North Coast, South Coast, Tablelands and Ranges, and Northwest Slopes) was assessed separately.
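The blocked resampling scheme can be sketched as follows. This is an illustrative simplification: it omits the prior balancing of rainforest and non-rainforest samples, and the sample format, block ids and seed are invented for the example:

```python
import random
import statistics

def blocked_accuracy(samples, n_iter=1000, per_block=5, seed=0):
    """Spatially blocked Monte Carlo resampling of overall accuracy.

    samples: list of (block_id, predicted, actual) with boolean labels.
    Returns (median accuracy, 2.5th percentile, 97.5th percentile) in
    percent over n_iter draws; each draw takes up to per_block points
    from every spatial block to counter clustered sampling.
    """
    rng = random.Random(seed)
    blocks = {}
    for block_id, pred, actual in samples:
        blocks.setdefault(block_id, []).append((pred, actual))
    accuracies = []
    for _ in range(n_iter):
        hits = total = 0
        for pts in blocks.values():
            for pred, actual in rng.sample(pts, min(per_block, len(pts))):
                hits += pred == actual
                total += 1
        accuracies.append(100.0 * hits / total)
    accuracies.sort()
    return (statistics.median(accuracies),
            accuracies[int(0.025 * n_iter)],
            accuracies[int(0.975 * n_iter)])
```

Because each block contributes at most per_block points per draw, a densely sampled block cannot dominate the accuracy estimate, which is the purpose of the spatial blocking.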

4. Results

The colourimetric hue-based indices and the new Aravena ratio subtraction index consistently produced more accurate maps of rainforest than the conventional indices in all the ecoregions, with the Aravena band ratio RGB blend consistently ranking highest (Table 4). The only exception was in the Tablelands and Ranges, where NGRDI ranked higher than in the other ecoregions and the Aravena Index ranked fractionally higher than the Aravena band ratio RGB blend. All indices performed worst in the Northwest Slopes. Table A2 in Appendix A shows examples of the mapping results for the two highest-ranking indices, which performed comparably to each other in different landscape types, with the Aravena Rainforest band ratio RGB hue generally mapping slightly more rainforest than the Aravena Rainforest Index. Table A3 in Appendix A provides a comprehensive accuracy assessment for all the indices by ecoregion, with errors of commission and omission.
The area of rainforest mapped by the best-performing index in each ecoregion differed from the combined extent mapped in the existing 100 m resolution NVIS and NFI classifications (Table 5). Visual inspection of Table A2 in Appendix A indicates that finer features, such as riparian gullies and vine thickets, were picked up more consistently and in more detail in the new 15 m resolution classification.
Lastly, Table 6 ranks the non-forest agricultural areas that were misclassified as rainforest by each index. Visual observation suggested that the indices generally spill over onto agriculture with similar false colour hues. The rank order of such errors across the indices was the same for all ecoregions, with the band ratios having greater errors than the other indices, except that NGRDI had higher error rates than NDVI and EVI on the South Coast and Tablelands. The SWIR1/Infrared Colour Band Ratio RGB hue (and, to a much lesser degree, the NGRDI) also misclassified water as rainforest. Both the new Aravena Rainforest Index and the band ratio RGB blend hue displayed higher percentages than the conventional indices, with the index consistently lower than the hue, though to varying degrees across the ecoregions.

5. Discussion

While false colour interpretation has a long history, the direct measurement of false colour metrics for vegetation mapping has not previously been applied. In the past, most studies were limited to single images or small sets of images to maximise the phenological differences among forest types [57]. Detailed temporal and phenologic considerations are now more feasible with the increased accessibility of free imagery from NASA and the ESA and with the supercomputing power of platforms like GEE. The results of this study showed that high classification accuracies can be obtained from single interannual seasonal aggregate image composites, especially if band ratios are used instead of single bands and the analysis concentrates on the Red band and bands beyond the visible spectrum (in the Near and Shortwave Infrared ranges), which are hardly affected by atmospheric effects and variations in illumination.
This mapping effort has produced the highest resolution (15 m) systematic classification of rainforest extent for NSW, Australia to date, with an overall average accuracy of 79%. This region is ecologically indicative of the whole east coast of Australia. SII and the direct colourimetry of false colour band ratio RGB blends can provide feasible and effective classifications once a consistent ecological-colourimetric deduction has been defined. This is because the band ratios per channel can accentuate and distinguish features arithmetically, keeping the solutions consistent and scalable without the need for parametrisation, leaving the larger budget of field samples available to accuracy assessments and refinements.
Both the Aravena Index and the Aravena band ratio RGB blend hue yielded accurate results that were comparable to each other. Decorrelating hue from RGB composites with the HSV transformation, after accentuating the separability of features with band ratio RGBs and their blends, can produce superior results to conventional vegetation indices. The physiognomic traits of leaf greenness and glossiness and the overall stand moisture can characterise rainforests more effectively than indicators of photosynthetic capacity or productivity alone, since the same hues can generalise the wide range of rainforest types from dry to very moist, which linear productivity indices cannot. They also emphasise the importance of the SWIR1 band, which was used in all the highest-ranking candidates. While this increase in performance comes at the price of misclassifying unrelated agriculture, such errors are easily corrected by applying a land use or forest mask. The NIR band seems to reduce rainforest discrimination, since even the NGRDI performed better than the more often applied NDVI and EVI indices. The results also demonstrated the importance of ecoregionalisation: had the whole study area been thresholded once for the ecoregion whose sclerophylls have the highest productivity, the drier or less productive regions would have been under-mapped.
A lack of consensus remains in Australia as to how rainforests should be defined, and a particular point of debate seems to be the classification of communities with closed canopies of rainforest species below tall eucalypts. These ecotonal or mixed forests have been considered as either non-rainforest, seral stages of rainforest or as distinct vegetation types [95]. Perhaps some of the inaccuracy from this mapping effort is due to rainforest with non-rainforest emergents or sclerophyll-dominated forests with rainforest understoreys. Further field work would be required to validate this.
Rainforests can be considered visually distinguishable optical types [96] that can be directly measured and easily communicated with a false colour colourimetric ontology. This is due to the knowledge-based physiognomic and phenologic relations that can be visually deduced from ecological knowledge and validated with comprehensive geospatial accuracy assessments. False colour hues can therefore be considered scalable landscape metrics, and “Red in the Aravena Rainforest Band Ratio RGB Blend, with hue angle < x°, during Summer” has proven to be a simple, scalable, reusable and communicable colourimetric ontology for the rainforests of the east coast of Australia.
While some false colour combinations appear more perceptually intuitive and separable, others differentiate hue more easily in statistical terms. A combination may distinguish a particular feature more effectively by simplifying rather than diversifying the range of colours represented. Although the ‘Land/Water’ RGB may initially appear to show the widest range of colours and, hence, the best separability of classes, the band ratio RGB blend that performed best in this study only shows a spread of predominantly red to blue hues. The use of band ratio RGBs has the additional benefit of reducing atmospheric and hill shade effects. From this study and experimentation with other classes, such as water and urban areas, a set of three heuristics can be used to decide which RGB band ratio combination will best distinguish a particular feature of interest consistently throughout a region:
  • Ideal combinations separate features of interest as far apart as possible on the colour wheel. Analogous (neighbouring) colours like red and orange can be difficult to distinguish, whereas complementary or triadic colours like red and blue (as in the Aravena Rainforest Band Ratio RGB Blend) are much easier to separate.
  • The feature should ideally be distinguished from other features by either red or magenta, so that only one threshold, measured from an extreme, needs to be determined and no saturation or loss of data occurs across the colour wheel’s discontinuity: red is the first colour on the colour wheel from 0 degrees, while magenta is the last colour before 360 degrees.
  • RGBs with overly dark or bright tones are usually not preferred, as they can contain multiple hues that are difficult to discern visually.
Interpreting patterns directly from multispectral satellite imagery produces the highest fidelity end product of ecological observations by avoiding unnecessary loss of resolution or corruption of the raw image data with coarser or pre-modelled data (such as soil or precipitation data). A one-class, one-index approach can thus be robust, analytically simple, scalable, communicable, fast to process and, therefore, more cost-effective. An additional advantage is that the products can be adjusted easily, quickly and transparently during stakeholder negotiations and with TEK and LEK (Traditional and Local Ecological Knowledge) holders [97,98]. An understanding of how mapping solutions work, how reliable decisions are and how they were arrived at is particularly valuable for gaining the confidence of users during negotiations and stakeholder consultations, such as those of conservation and participatory projects. The approach should further democratise remote sensing by facilitating the transparency and communicability of ecological observations for environmental and land resource consultation, negotiation and management by decision-makers, as well as traditional/indigenous and local stakeholder knowledge exchange and participation, while promoting simplicity and feasibility through rapid processing. While the colourimetric thresholds and evidential reasoning could be considered subjective, or as requiring expertise, the SII interpretation key provided in this study, with its associations and quantitative colourimetric ontology, should provide the formality and confidence necessary to guide and inform different users explicitly and consistently, especially if some training is provided to people with local ecological knowledge, as has been shown in various participatory mapping projects [99].
The performance ranking of the candidate indices in this study was intended to show that colourimetry can provide better inputs to modern statistical approaches. How colourimetric indices can be further refined, automated or integrated into modern statistical techniques is a matter for further investigation, as is whether the most effective blend for feature separability is best found by exploring factorial combinations, as in this study, by the blending operations typically applied by image manipulation/editing software, or by statistical methods based on extensive large-scale field sampling. Suitable clustering and machine learning algorithms could include K-means++, Gaussian mixture, hidden Markov and hierarchical or tree models [100].
The results confirm the potential for the large-scale, high-resolution mapping of broadly defined vegetation types and suggest that testing for other applications, such as temporal monitoring for disturbance typing, severity mapping and recovery assessment, is well warranted. It is hoped that more efficient purposive sampling strategies will be developed in the future, designed with both the benefits and the requirements of colourimetric deductions and visual analytics in mind.

Cautions for Implementation

For the approach to be truly scalable and consistent in its communicability, it is important that the imagery be viewed, discussed and measured with the same stretch. This is not the default in a GIS, where a subset is stretched to the extent of the subset’s histogram unless a fixed linear stretch is saved and applied to all subsets. It is therefore preferable to perform the colourimetric deduction in GEE, where the whole world can be visualised consistently with the same stretch. Histogram stretching is, however, recommended for the visualisation and classification of the sample areas once the deduction is understood. As in this study, a forest mask or some post-editing will be necessary to remove incorrectly classified agricultural features. Caution must be taken when using hue: it is a circular measure with a discontinuity at 360° [100], requiring circular statistics and/or appropriate scaling for further correlation and multivariate analyses [101]. Additionally, while hue is traditionally represented by degrees between 0 and 360, some software works with hue angles between 0 and 240 or 0 and 255 for processing efficiency. Caution should also be taken when applying the interannual image composite technique in contexts where changes are known to occur. For example, in Australia, some rainforest will have been burnt during the black summer fires (2019 to 2020), and in countries where illegal logging persists, such changes will not be represented clearly. An uncertainty mask of the changes is recommended to account for this.
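The circular-statistics caution can be illustrated numerically: the arithmetic mean of hues straddling the 0°/360° discontinuity is misleading, whereas a circular (vector) mean behaves correctly. A minimal sketch using only the standard library:

```python
import math

def circular_mean_deg(angles_deg):
    """Mean of angles on the colour wheel, handling the 0/360 wrap that
    breaks the ordinary arithmetic mean of hue values.

    Each angle is treated as a unit vector; the mean direction is the
    angle of the summed vector, folded back into [0, 360).
    """
    s = sum(math.sin(math.radians(a)) for a in angles_deg)
    c = sum(math.cos(math.radians(a)) for a in angles_deg)
    return math.degrees(math.atan2(s, c)) % 360.0

# Two near-red hues, 350 and 10 degrees: the arithmetic mean is 180
# (cyan, the opposite of red), while the circular mean is ~0 (red).
```

Equivalent care is needed before feeding hue into correlation or multivariate analyses, e.g. by converting each hue to its sine and cosine components first.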

6. Conclusions

A continuing endeavour of remote sensing research is to maximise the performance of mapping workflows, as well as their simplicity and transparency, and colourimetric approaches offer potential advances in this area. Accurate results were achieved ecoregionally at minimal cost by quickly processing a huge amount of data with the free imagery and supercomputing power provided by GEE. While traditional vegetation indices focus on productivity, this study showed that separability-enhanced false colour band ratio RGB hue values can be considered scalable multivariate landscape indicators that can encapsulate multiple physiognomic properties such as greenness, wetness and brightness.
A formalised colourimetric approach to intuitive SII, with a visual interpretation key and one-class density slicing, demonstrates the effective simplification of remote sensing classifications for the purposes of accessibility, communicability, scalability, feasibility and repeatability for visually distinguishable features. A wider audience can benefit, since once the deduction has been validated and the phenologic image composites have been constructed, no programming or advanced statistical knowledge is required of the user in a GIS.
Due to the known representativeness of rainforest types in NSW, we conclude that either the Aravena Rainforest Band Ratio RGB Blend hue angle or the Aravena Rainforest Index could potentially be used to reliably map and monitor rainforests for the whole of Australia.

Author Contributions

Conceptualisation, R.A.A.; methodology, R.A.A.; software, GEE, ArcGIS; validation, M.B.L. and A.R.; formal analysis, R.A.A.; investigation, R.A.A.; data curation, A.R.; writing—original draft preparation, R.A.A.; writing—review and editing, D.A.K.; visualisation, R.A.A.; supervision, D.A.K. and M.B.L.; project administration, R.A.A.; All authors have read and agreed to the published version of the manuscript.

Funding

Australian Research Council.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Arithmetic combinations tested for the creation of the Aravena Rainforest band ratio RGB blend. All examples are stretched with a histogram equalisation. The preferred combination *, /, / is highlighted in green.
Reference Visualisation (R,G,B: NIR, SWIR1, Red):
Rainforest in Orange
Remotesensing 13 02544 i001
Operators applied between (NIR/Red) and (SWIR1/NIR), between (SWIR1/Green) and (SWIR1/Red), and between (Red/Blue) and (SWIR1/Green) | Example
*, *, * | Remotesensing 13 02544 i002
*, /, * | Remotesensing 13 02544 i003
*, *, / | Remotesensing 13 02544 i004
*, /, / | Remotesensing 13 02544 i005
/, /, / | Remotesensing 13 02544 i006
/, *, / | Remotesensing 13 02544 i007
/, /, * | Remotesensing 13 02544 i008
/, *, * | Remotesensing 13 02544 i009
Table A2. Examples comparing the two highest-ranking indices for the different rainforest types in different landscapes, with all image and index histograms equalised to the size of the Landsat tiles used for the first sampling stage and with coordinates in WGS84 decimal degrees.
Columns, left to right: (1) Ontological Reference Image, ‘Land/Water’ RGB; (2) 15 m Aravena Rainforest Index in Pseudo Colour (Remotesensing 13 02544 i010); (3) 15 m Aravena Rainforest Index Classification; (4) 15 m Aravena Rainforest Band Ratio RGB Blend; (5) 15 m Aravena Rainforest Band Ratio RGB Blend HSV Hue in Pseudo Colour (Remotesensing 13 02544 i011); (6) 15 m Aravena Rainforest Band Ratio RGB Blend HSV Hue Classification; (7) Existing 100 m Reference Classification on ‘Land/Water’ RGB.
Example 1: Subtropical Rainforests; Location: 28.485 S, 152.716 E; Scale: 1:40,000.
Remotesensing 13 02544 i012 Remotesensing 13 02544 i013 Remotesensing 13 02544 i014 Remotesensing 13 02544 i015 Remotesensing 13 02544 i016 Remotesensing 13 02544 i017 Remotesensing 13 02544 i018
Example 2: Northern Warm Temperate Rainforests; Location: 31.423 S, 152.134 E; Scale 1:60,000.
Remotesensing 13 02544 i019 Remotesensing 13 02544 i020 Remotesensing 13 02544 i021 Remotesensing 13 02544 i022 Remotesensing 13 02544 i023 Remotesensing 13 02544 i024 Remotesensing 13 02544 i025
Example 3: Southern Warm Temperate Rainforests; Location: 36.033 S, 149.880 E; Scale: 1:20,000.
Remotesensing 13 02544 i026 Remotesensing 13 02544 i027 Remotesensing 13 02544 i028 Remotesensing 13 02544 i029 Remotesensing 13 02544 i030 Remotesensing 13 02544 i031 Remotesensing 13 02544 i032
Example 4: Cool Temperate Rainforests; Location: 32.062 S, 151.481 E; Scale: 1:60,000.
Remotesensing 13 02544 i033 Remotesensing 13 02544 i034 Remotesensing 13 02544 i035 Remotesensing 13 02544 i036 Remotesensing 13 02544 i037 Remotesensing 13 02544 i038 Remotesensing 13 02544 i039
Example 5: Dry Rainforests; Location: 31.043 S, 152.214 E; Scale: 1:40,000.
Remotesensing 13 02544 i040 Remotesensing 13 02544 i041 Remotesensing 13 02544 i042 Remotesensing 13 02544 i043 Remotesensing 13 02544 i044 Remotesensing 13 02544 i045 Remotesensing 13 02544 i046
Example 6: Littoral (coastal) Rainforests; Location: 32.213 S, 152.553 E; Scale: 1:20,000.
Remotesensing 13 02544 i047 Remotesensing 13 02544 i048 Remotesensing 13 02544 i049 Remotesensing 13 02544 i050 Remotesensing 13 02544 i051 Remotesensing 13 02544 i052 Remotesensing 13 02544 i053
Example 7: Western Vine Thickets; Location: 29.661 S, 150.324 E; Scale: 1:30,000.
Remotesensing 13 02544 i054 Remotesensing 13 02544 i055 Remotesensing 13 02544 i056 Remotesensing 13 02544 i057 Remotesensing 13 02544 i058 Remotesensing 13 02544 i059 Remotesensing 13 02544 i060
Table A3. Comprehensive accuracy assessment results.
All values in %. The first four columns are indices; the last three are band ratio RGB hues.

Ecoregion and metric | NDVI | EVI2 | NGRDI | Aravena Rainforest Index | SWIR1/Natural Colour Band Ratio RGB | SWIR1/Infrared Colour Band Ratio RGB | Aravena Rainforest Band Ratio RGB Blend

North Coast
User Not Rainforest | 99.45 | 99.44 | 97.33 | 94.65 | 91.89 | 94.65 | 92.82
User Rainforest | 13.17 | 14.97 | 40.72 | 55.09 | 59.88 | 48.50 | 62.87
Producer Not Rainforest | 55.69 | 56.04 | 64.26 | 69.76 | 71.37 | 66.67 | 73.18
Producer Rainforest | 95.65 | 95.65 | 93.51 | 90.52 | 87.08 | 89.20 | 88.89
Overall accuracy | 58.33 | 58.91 | 70.34 | 75.84 | 76.55 | 72.46 | 78.49

South Coast
User Not Rainforest | 100 | 100 | 99.47 | 98.40 | 94.88 | 97.25 | 97.38
User Rainforest | 13.61 | 14.29 | 28.57 | 43.54 | 43.54 | 40.14 | 47.62
Producer Not Rainforest | 59.87 | 59.94 | 64.07 | 69.11 | 68.54 | 67.66 | 70.66
Producer Rainforest | 100 | 100 | 97.62 | 95.45 | 87.14 | 91.53 | 93.59
Overall accuracy | 62.21 | 62.28 | 68.26 | 74.34 | 72.62 | 72.24 | 75.80

Tablelands and Rangelands
User Not Rainforest | 99.05 | 99.04 | 95.92 | 95.45 | 95.19 | 96.00 | 95.15
User Rainforest | 38.18 | 38.18 | 67.27 | 76.36 | 54.55 | 65.45 | 74.55
Producer Not Rainforest | 75.36 | 75.37 | 84.92 | 88.18 | 80.15 | 84.21 | 87.80
Producer Rainforest | 95.45 | 95.45 | 89.47 | 89.80 | 85.71 | 89.47 | 89.13
Overall accuracy | 77.99 | 78.05 | 86.03 | 88.69 | 81.33 | 85.35 | 88.13

Northwest Slopes
User Not Rainforest | 100 | 100 | 100 | 98.36 | 100 | 98.46 | 96.72
User Rainforest | 0 | 0 | 10.71 | 35.71 | 17.86 | 16.07 | 44.64
Producer Not Rainforest | 54.47 | 54.47 | 57.26 | 64.71 | 58.93 | 58.41 | 67.71
Producer Rainforest | 0 | 0 | 100 | 94.74 | 100 | 88.89 | 91.83
Overall accuracy | 54.47 | 54.47 | 59.17 | 69.67 | 61.90 | 60.80 | 72.95

References

  1. Ali, I.; Greifeneder, F.; Stamenkovic, J.; Neumann, M.; Notarnicola, C. Review of Machine Learning Approaches for Biomass and Soil Moisture Retrievals from Remote Sensing Data. Remote Sens. 2015, 7, 16398–16421. [Google Scholar] [CrossRef] [Green Version]
  2. Fassnacht, F.; Latifi, H.; Stereńczak, K.; Modzelewska, A.; Lefsky, M.; Waser, L.; Straub, C.; Ghosh, A. Review of studies on tree species classification from remotely sensed data. Remote Sens. Environ. 2016, 186, 64–87. [Google Scholar] [CrossRef]
  3. Ball, J.E.; Anderson, D.T.; Chan, C.S. Comprehensive survey of deep learning in remote sensing: Theories, tools, and challenges for the community. J. Appl. Remote Sens. 2017, 11, 042609. [Google Scholar] [CrossRef] [Green Version]
  4. Janik, A.; Sankaran, K.; Ortiz, A. Interpreting Black-Box Semantic Segmentation Models in Remote Sensing Applications; Archambault, D., Nabney, I., Peltonen, J., Eds.; Machine Learning Methods in Visualisation for Big Data: Porto, Portugal, 2019. [Google Scholar]
  5. Roscher, R.; Bohn, B.; Duarte, M.F.; Garcke, J. Explain it to me—Facing remote sensing challenges in the bio- and geosciences with explainable machine learning. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2020, 3, 817–824. [Google Scholar] [CrossRef]
  6. Carvalho, D.V.; Pereira, E.M.; Cardoso, J.S. Machine Learning Interpretability: A Survey on Methods and Metrics. Electronics 2019, 8, 832. [Google Scholar] [CrossRef] [Green Version]
  7. Kovalerchuk, B.; Ahmad, M.A.; Teredesai, A. Survey of Explainable Machine Learning with Visual and Granular Methods Beyond Quasi-Explanations. Econom. Financ. Appl. 2021. [Google Scholar] [CrossRef]
  8. Campos-Taberner, M.; García-Haro, F.J.; Martínez, B.; Izquierdo-Verdiguier, E.; Atzberger, C.; Camps-Valls, G.; Gilabert, M.A. Understanding deep learning in land use classification based on Sentinel-2 time series. Sci. Rep. 2020, 10, 1–12. [Google Scholar] [CrossRef] [PubMed]
  9. Chatzimparmpas, A.; Martins, R.M.; Jusufi, I.; Kerren, A. A survey of surveys on the use of visualization for interpreting machine learning models. Inf. Vis. 2020, 19, 207–233. [Google Scholar] [CrossRef] [Green Version]
  10. Wester, K.; Lundén, B.; Bax, G. Analytically processed Landsat TM images for visual geological interpretation in the northern Scandinavian Caledonides. ISPRS J. Photogramm. Remote Sens. 1990, 45, 442–460. [Google Scholar] [CrossRef]
  11. Pekel, J.-F.; Cottam, A.; Gorelick, N.; Belward, A.S. High-resolution mapping of global surface water and its long-term changes. Nature 2016, 540, 418–422. [Google Scholar] [CrossRef]
  12. Keim, D.; Andrienko, G.; Fekete, J.D.; Görg, C.; Kohlhammer, J.; Melançon, G. Visual Analytics: Definition, Process, and Challenges. In Information Visualization; Springer: Berlin/Heidelberg, Germany, 2008. [Google Scholar]
  13. Ray, R.G. Aerial Photographs in Geologic Interpretation and Mapping, Geological Survey Professional Paper 373; United States Government Printing Office: Washington, DC, USA, 1960. [Google Scholar]
  14. Fensham, R.; Fairfax, R. Aerial photography for assessing vegetation change: A review of applications and the relevance of findings for Australian vegetation history. Aust. J. Bot. 2002, 50, 415–429. [Google Scholar] [CrossRef]
  15. Horning, N. Justification for Using Photo Interpretation Methods to Interpret Satellite Imagery Version 1.0; American Museum of Natural History, Center for Biodiversity and Conservation: New York, NY, USA, 2004. [Google Scholar]
  16. Morgan, J.L.; Gergel, S.E.; Coops, N. Aerial Photography: A Rapidly Evolving Tool for Ecological Management. Bioscience 2010, 60, 47–59. [Google Scholar] [CrossRef]
  17. White, R. Human expertise in the interpretation of remote sensing data: A cognitive task analysis of forest disturbance attribution. Int. J. Appl. Earth Obs. Geoinf. 2019, 74, 37–44. [Google Scholar] [CrossRef]
  18. Bey, A.; Díaz, A.S.-P.; Maniatis, D.; Marchi, G.; Mollicone, D.; Ricci, S.; Bastin, J.-F.; Moore, R.; Federici, S.; Rezende, M.; et al. Collect Earth: Land Use and Land Cover Assessment through Augmented Visual Interpretation. Remote Sens. 2016, 8, 807. [Google Scholar] [CrossRef] [Green Version]
  19. Smuts, J.C. Holism and Evolution; Macmillan: New York, NY, USA, 1926. [Google Scholar]
  20. Zonneveld, I.S. The land unit – A fundamental concept in landscape ecology, and its applications. Landsc. Ecol. 1989, 3, 67–86. [Google Scholar] [CrossRef]
  21. Avery, E.T.; Berlin, G.L. Fundamentals of Remote Sensing and Airphoto Interpretation; Macmillan: Stuttgart, Germany, 2003. [Google Scholar]
  22. White, R.A.; Çöltekin, A.; Hoffman, R.R. Remote Sensing and Cognition: Human Factors in Image Interpretation, 1st ed.; CRC Press: Boca Raton, FL, USA, 2018. [Google Scholar]
  23. Glenn, E.P.; Huete, A.R.; Nagler, P.L.; Nelson, S.G. Relationship Between Remotely-sensed Vegetation Indices, Canopy Attributes and Plant Physiological Processes: What Vegetation Indices Can and Cannot Tell Us About the Landscape. Sensors 2008, 8, 2136–2160. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  24. Metcalfe, D.; Green, P. Chapter 15. Rainforests and Vine Thickets. In Australian Vegetation, 3rd ed.; Keith, D., Ed.; Wiley: Hoboken, NJ, USA, 2017; p. 766. [Google Scholar]
  25. Sultan, M.; Arvidson, R.E.; Sturchio, N.C.; Guinness, E.A. Lithologic mapping in arid regions with Landsat thematic mapper data: Meatiq dome, Egypt. Geol. Soc. Am. Bull. 1987, 99, 748–762. [Google Scholar] [CrossRef]
  26. Horning, N. Selecting the Appropriate Band Combination for an RGB Image Using Landsat Imagery Version 1.0; American Museum of Natural History, Center for Biodiversity and Conservation: New York, NY, USA, 2004. [Google Scholar]
  27. Van der Meer, F.D.; Van der Werff, H.M.; Van Ruitenbeek, F.J.; Hecker, C.A.; Bakker, W.H.; Noomen, M.F.; van der Meijde, M.; Carranza, E.J.M.; de Smeth, J.B.; Woldai, T. Multi-and Hyperspectral geologic remote sensing: A review. Int. J. Appl. Earth Obs. Geoinf. 2012, 14, 112–128. [Google Scholar] [CrossRef]
  28. Jensen, J.R. Biophysical Remote Sensing. Ann. Assoc. Am. Geogr. 1983, 73, 111–132. [Google Scholar] [CrossRef]
29. Estes, J.E.; Hajic, E.J.; Tinney, L.R. Fundamentals of image analysis: Analysis of visible and thermal infrared data. In Manual of Remote Sensing; Colwell, R.N., Ed.; American Society of Photogrammetry: Falls Church, VA, USA, 1983. [Google Scholar]
30. Campbell, J.B. Introduction to Remote Sensing; Guilford Press: New York, NY, USA, 2002. [Google Scholar]
  31. Bianchetti, R.A.; MacEachren, A.M. Cognitive Themes Emerging from Air Photo Interpretation Texts Published to 1960. ISPRS Int. J. Geoinf. 2015, 4, 551–571. [Google Scholar] [CrossRef] [Green Version]
  32. Joblove, G.H.; Greenberg, D. Color spaces for computer graphics. ACM SIGGRAPH Comput. Graph. 1978, 12, 20–25. [Google Scholar] [CrossRef]
  33. Movia, A.; Beinat, A.; Sandri, T. Land use classification from VHR aerial images using invariant colour components and texture. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 41, 311–317. [Google Scholar] [CrossRef] [Green Version]
  34. Pekel, J.-F.; Ceccato, P.; Vancutsem, C.; Cressman, K.; Vanbogaert, E.; Defourny, P. Development and Application of Multi-Temporal Colorimetric Transformation to Monitor Vegetation in the Desert Locust Habitat. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2011, 4, 318–326. [Google Scholar] [CrossRef]
  35. Wu, S.; Chen, H.; Zhao, Z.; Long, H.; Song, C. An Improved Remote Sensing Image Classification Based on K-Means Using HSV Color Feature. In Proceedings of the 10th International Conference on Computational Intelligence and Security, CIS 2014, Kunming, China, 15–16 November 2014; pp. 201–204. [Google Scholar]
  36. Lessel, J.; Ceccato, P. Creating a basic customizable framework for crop detection using Landsat imagery. Int. J. Remote Sens. 2016, 37, 6097–6107. [Google Scholar] [CrossRef]
  37. Pekel, J.-F.; Vancutsem, C.; Bastin, L.; Clerici, M.; Vanbogaert, E.; Bartholomé, E.; Defourny, P. A near real-time water surface detection method based on HSV transformation of MODIS multi-spectral time series data. Remote Sens. Environ. 2014, 140, 704–716. [Google Scholar] [CrossRef] [Green Version]
  38. Bertels, L.; Smets, B.; Wolfs, D. Dynamic Water Surface Detection Algorithm Applied on PROBA-V Multispectral Data. Remote Sens. 2016, 8, 1010. [Google Scholar] [CrossRef] [Green Version]
  39. Namikawa, L.; Körting, T.; Castejon, E. Water body extraction from RapidEye images: An automated methodology based on Hue component of color transformation from RGB to HSV model. Braz. J. Cartogr. 2016, 68, 1097–1111. [Google Scholar]
  40. Woerd, H.; Wernand, M. Hue-Angle Product for Low to Medium Spatial Resolution Optical Satellite Sensors. Remote Sens. 2018, 10, 180. [Google Scholar] [CrossRef] [Green Version]
  41. Lehmann, M.K.; Nguyen, U.; Allan, M.; Van Der Woerd, H.J. Colour Classification of 1486 Lakes across a Wide Range of Optical Water Types. Remote Sens. 2018, 10, 1273. [Google Scholar] [CrossRef] [Green Version]
  42. Zhao, Y.; Shen, Q.; Wang, Q.; Yang, F.; Wang, S.; Li, J.; Zhang, F.; Yao, Y. Recognition of Water Colour Anomaly by Using Hue Angle and Sentinel 2 Image. Remote Sens. 2020, 12, 716. [Google Scholar] [CrossRef] [Green Version]
  43. Cushman, S.A.; Littell, J.; McGarigal, K. The Problem of Ecological Scaling in Spatially Complex, Nonequilibrium Ecological Systems. In Spatial Complexity, Informatics, and Wildlife Conservation; Springer: Tokyo, Japan, 2010. [Google Scholar]
  44. Tran, B.N.; Tanase, M.A.; Bennett, L.T.; Aponte, C. Evaluation of Spectral Indices for Assessing Fire Severity in Australian Temperate Forests. Remote Sens. 2018, 10, 1680. [Google Scholar] [CrossRef] [Green Version]
  45. Stoian, A.; Poulain, V.; Inglada, J.; Poughon, V.; Derksen, D. Land Cover Maps Production with High Resolution Satellite Image Time Series and Convolutional Neural Networks: Adaptations and Limits for Operational Systems. Remote Sens. 2019, 11, 1986. [Google Scholar] [CrossRef] [Green Version]
  46. Wedding, L.M.; Lepczyk, C.A.; Pittman, S.J.; Friedlander, A.M.; Jorgensen, S. Quantifying seascape structure: Extending terrestrial spatial pattern metrics to the marine realm. Mar. Ecol. Prog. Ser. 2011, 427, 219–232. [Google Scholar] [CrossRef] [Green Version]
  47. Lewis, D.; Phinn, S. Accuracy assessment of vegetation community maps generated by aerial photography interpretation: Perspective from the tropical savanna, Australia. J. Appl. Remote Sens. 2011, 5, 053565. [Google Scholar] [CrossRef] [Green Version]
48. Helmer, E.H.; Goodwin, N.R.; Souza, C.M., Jr.; Asner, G.P. Characterizing tropical forests with multispectral imagery. In Land Resources: Monitoring, Modeling and Mapping; Thenkabail, P.S., Ed.; Taylor & Francis Group: Boca Raton, FL, USA, 2015; p. 849. [Google Scholar]
  49. Gorelick, N.; Hancher, M.; Dixon, M.; Ilyushchenko, S.; Thau, D.; Moore, R. Google Earth Engine: Planetary-scale geospatial analysis for everyone. Remote Sens. Environ. 2017, 202, 18–27. [Google Scholar] [CrossRef]
  50. Jelinski, D.; Wu, J. The Modifiable Areal Unit Problem and Implications for Landscape Ecology. Landsc. Ecol. 1996, 11, 129–140. [Google Scholar] [CrossRef]
  51. Roleček, J.; Chytrý, M.; Hájek, M.; Lvončík, S.; Tichý, L. Sampling design in large-scale vegetation studies: Do not sacrifice ecological thinking to statistical purism! Folia Geobot. Phytotaxon. 2007, 42, 199–208. [Google Scholar] [CrossRef] [Green Version]
  52. Guberman, S.; Maximov, V.; Pashintsev, A. Gestalt and Image Understanding. Gestalt Theory 2012, 34, 143. [Google Scholar]
  53. Deng, X.; Li, W.; Liu, X.; Guo, Q.; Newsam, S. One-class remote sensing classification: One-class vs. binary classifiers. Int. J. Remote Sens. 2018, 39, 1890–1910. [Google Scholar] [CrossRef]
  54. Rajbhandari, S.; Aryal, J.; Osborn, J.; Lucieer, A.; Musk, R. Leveraging Machine Learning to Extend Ontology-Driven Geographic Object-Based Image Analysis (O-GEOBIA): A Case Study in Forest-Type Mapping. Remote Sens. 2019, 11, 503. [Google Scholar] [CrossRef] [Green Version]
  55. Kumar, L.; Schmidt, K.S.; Dury, S.; Skidmore, A.K. Review of hyperspectral remote sensing and vegetation Science. In Imaging Spectrometry: Basic Principles and Prospective Applications; Van Der Meer, F.D., De Jong, S.M., Eds.; Kluwer: Dordrecht, The Netherlands, 2001. [Google Scholar]
  56. Thenkabail, P.S.; Hall, J.; Lin, T.; Ashton, M.S.; Harris, D.; Enclona, E.A. Detecting floristic structure and pattern across topographic and moisture gradients in a mixed species Central African forest using IKONOS and Landsat-7 ETM+ images. Int. J. Appl. Earth Obs. Geoinf. 2003, 4, 255–270. [Google Scholar] [CrossRef]
  57. Pasquarella, V.; Holden, C.E.; Kaufman, L.; Woodcock, C.E. From imagery to ecology: Leveraging time series of all available Landsat observations to map and monitor ecosystem state and dynamics. Remote Sens. Ecol. Conserv. 2016, 2, 152–170. [Google Scholar] [CrossRef]
  58. Chavez, P.S., Jr.; Berlin, G.L.; Sowers, L.B. Statistical method for selecting Landsat MSS ratios. J. Appl. Photogr. Eng. 1982, 8, 23–30. [Google Scholar]
  59. Sheffield, C. Selecting band combinations from multispectral data. Photogramm. Eng. Remote Sens. 1985, 51, 681–687. [Google Scholar]
  60. Saha, S.K.; Kudrat, M. Selection of spectral band combination for land cover/land use classification using a brightness value overlapping index (BVOI). J. Indian Soc. Remote Sens. 1991, 19, 141–147. [Google Scholar] [CrossRef]
  61. Beauchemin, M.; Fung, K.B. On statistical band selection for image visualization. Photogramm. Eng. Remote Sens. 2001, 67, 571–574. [Google Scholar]
  62. Ming, D.; Du, J.; Zhang, X.; Liu, T. Modified average local variance for pixel-level scale selection of multiband remote sensing images and its scale effect on image classification accuracy. J. Appl. Remote Sens. 2013, 7, 073565. [Google Scholar] [CrossRef]
  63. Miller, S.D.; Lindsey, D.T.; Seaman, C.J.; Solbrig, J.E. GeoColor: A Blending Technique for Satellite Imagery. J. Atmos. Ocean. Technol. 2020, 37, 429–448. [Google Scholar] [CrossRef]
  64. Sanchez, H.C.; Boyd, D.; Foody, G. One-Class Classification for Mapping a Specific Land-Cover Class: SVDD Classification of Fenland. IEEE Trans. Geosci. Remote Sens. 2007, 45, 1061–1073. [Google Scholar] [CrossRef] [Green Version]
  65. Bennett, M.W.A. Rapid monitoring of wetland water status using density slicing. In Proceedings of the 4th Australasian Remote Sensing Conference, Adelaide, Australia, 14–18 September 1987; pp. 682–691. [Google Scholar]
  66. Frazier, P.S.; Page, K.J. Water Body Detection and Delineation with Landsat TM Data. Photogramm. Eng. Remote Sens. 2000, 66, 1461–1467. [Google Scholar]
  67. Hamandawana, H.; Eckardt, F.; Ringrose, S. The use of step-wise density slicing in classifying high-resolution panchromatic photographs. Int. J. Remote Sens. 2006, 27, 4923–4942. [Google Scholar] [CrossRef]
  68. Yang, X.; Tien, D. An automated image analysis approach for classification and mapping of woody vegetation from digital aerial photograph. World Rev. Sci. Technol. Sustain. Dev. 2010, 7, 13–23. [Google Scholar] [CrossRef] [Green Version]
  69. Brewer, C.K.; Winne, J.C.; Redmond, R.L.; Opitz, D.W.; Mangrich, M.V. Classifying and Mapping Wildfire Severity. Photogramm. Eng. Remote Sens. 2005, 71, 1311–1320. [Google Scholar] [CrossRef] [Green Version]
  70. Laurance, W.F. Emerging Threats to Tropical Forests. Ann. Mo. Bot. Gard. 2015, 100, 159–169. [Google Scholar] [CrossRef]
  71. Laurance, S. An Amazonian rainforest and its fragments as a laboratory of global change. Biol. Rev. 2017, 93, 223–247. [Google Scholar] [CrossRef]
  72. Mayaux, P.; Pekel, J.-F.; Desclée, B.; Donnay, F.; Lupi, A.; Frédéric, A.; Clerici, M.; Bodart, C.; Brink, A.; Nasi, R.; et al. State and evolution of the African rainforests between 1990 and 2010. Philos. Trans. R. Soc. Lond. Ser. B Biol. Sci. 2013, 368, 20120300. [Google Scholar] [CrossRef] [Green Version]
  73. Australian Government Department of Agriculture, Water and the Environment. NVIS (National Vegetation Information System) V5.1 ©; Australian Government Department of Agriculture, Water and the Environment: Canberra, Australia, 2018. [Google Scholar]
  74. Montreal Process Implementation Group for Australia and National Forest Inventory Steering Committee. Australia’s State of the Forests Report 2018; ABARES: Canberra, Australia, 2018. [Google Scholar]
  75. Keith, D. Ocean Shores to Desert Dunes: The Native Vegetation of New South Wales and the ACT. TAXON 2005, 54, 1120. [Google Scholar]
  76. Keith, D.; Simpson, C. Vegetation Formations and Classes of NSW (Version 3.03), VIS_ID 3848; Department of Planning, Industry and Environment: Canberra, Australia, 2018. [Google Scholar]
  77. Webb, L.J. A Physiognomic Classification of Australian Rain Forests. J. Ecol. 1959, 47, 551–570. [Google Scholar] [CrossRef] [Green Version]
  78. Bowman, D.M.J.S. Australian Rainforest: Island of Green in a Land of Fire; Cambridge University Press: Cambridge, UK, 2000. [Google Scholar]
  79. Australian Government, Department of Environment and Energy. 2017: NVIS Fact Sheet MVG 1—Rainforests and Vine Thickets; Australian Government, Department of Environment and Energy: Canberra, Australia, 2017. [Google Scholar]
  80. Department of Agriculture, Water and the Environment. Interim Biogeographic Regionalisation for Australia (IBRA v7) Subregions—States and Territories; Department of Agriculture, Water and the Environment: Canberra, Australia, 2012. [Google Scholar]
  81. Sinha, P.; Kumar, L.; Reid, N. Seasonal Variation in Land-Cover Classification Accuracy in a Diverse Region. Photogramm. Eng. Remote Sens. 2012, 78, 271–280. [Google Scholar] [CrossRef]
  82. Huete, A.R.; Didan, K.; Shimabukuro, Y.E.; Ratana, P.; Saleska, S.R.; Hutyra, L.R.; Yang, W.; Nemani, R.R.; Myneni, R. Amazon rainforests green-up with sunlight in dry season. Geophys. Res. Lett. 2006, 33. [Google Scholar] [CrossRef] [Green Version]
  83. Ruefenacht, B. Comparison of Three Landsat TM Compositing Methods: A Case Study Using Modeled Tree Canopy Cover. Photogramm. Eng. Remote Sens. 2016, 82, 199–211. [Google Scholar] [CrossRef]
  84. Phan, T.N.; Kuch, V.; Lehnert, L. Land Cover Classification using Google Earth Engine and Random Forest Classifier—The Role of Image Composition. Remote Sens. 2020, 12, 2411. [Google Scholar] [CrossRef]
  85. Fisher, A.; Day, M.; Gill, T.; Roff, A.; Danaher, T.; Flood, N. Large-Area, High-Resolution Tree Cover Mapping with Multi-Temporal SPOT5 Imagery, New South Wales, Australia. Remote Sens. 2016, 8, 515. [Google Scholar] [CrossRef] [Green Version]
86. Gillieson, D.; Lawson, T.J.; Searle, L. Applications of High Resolution Remote Sensing in Rainforest Ecology and Management. In Living in a Dynamic Tropical Forest Landscape; 2009; pp. 334–348. [Google Scholar] [CrossRef]
  87. Sesnie, S.E.; Finegan, B.; Gessler, P.E.; Thessler, S.; Bendana, Z.R.; Smith, A.M.S. The multispectral separability of Costa Rican rainforest types with support vector machines and Random Forest decision trees. Int. J. Remote Sens. 2010, 31, 2885–2909. [Google Scholar] [CrossRef]
  88. Silva, F.B.; Shimabukuro, Y.E.; Aragao, L.E.; Anderson, L.O.; Pereira, G.; Cardozo, F.; Arai, E. Large-scale heterogeneity of Amazonian phenology revealed from 26-year long AVHRR/NDVI time-series. Environ. Res. Lett. 2013, 8, 024011. [Google Scholar] [CrossRef] [Green Version]
  89. Verheggen, A.; Mayaux, P.; de Wasseige, C.; Defourny, P. Mapping Congo Basin vegetation types from 300m and 1km multi-sensor time series for carbon stocks and forest areas estimation. Biogeosciences 2012, 9, 5061–5079. [Google Scholar] [CrossRef] [Green Version]
  90. Rocha, A.V.; Shaver, G.R. Advantages of a two band EVI calculated from solar and photosynthetically active radiation fluxes. Agric. For. Meteorol. 2009, 149, 1560–1563. [Google Scholar] [CrossRef]
  91. Zhang, X.; Friedl, M.; Tan, B.; Goldberg, M.; Yu, Y. Long-Term Detection of Global Vegetation Phenology from Satellite Instruments. Phenol. Clim. Chang. 2012. [Google Scholar] [CrossRef] [Green Version]
  92. Washington-Allen, R.; West, N.; Ramsey, R.; Efroymson, R. A Protocol for Retrospective Remote Sensing: Based Ecological Monitoring of Rangelands. Rangelands 2006, 59, 19–29. [Google Scholar] [CrossRef]
  93. NSW Government (2019) NSW BioNet. Office of Environment and Heritage. 2020. Available online: http://www.bionet.nsw.gov.au/ (accessed on 15 March 2021).
  94. Lyons, M.B.; Keith, D.A.; Phinn, S.R.; Mason, T.J.; Elith, J. A comparison of resampling methods for remote sensing classification and accuracy assessment. Remote Sens. Environ. 2018, 208, 145–153. [Google Scholar] [CrossRef]
  95. Department of Environment and Conservation. Natural Resource Management Field Assessment Guidelines—Rainforest Identification Field Guide; NSW: Canberra, Australia, 2004. [Google Scholar]
  96. Ustin, S.L.; Gamon, J.A. Remote sensing of plant functional types. New Phytol. 2010, 186, 795–816. [Google Scholar] [CrossRef]
  97. Brown, M.I.; Pearce, T.; Leon, J.; Sidle, R.; Wilson, R. Using remote sensing and traditional ecological knowledge (TEK) to understand mangrove change on the Maroochy River, Queensland, Australia. Appl. Geogr. 2018, 94, 71–83. [Google Scholar] [CrossRef]
  98. Eddy, I.M.; Gergel, S.E.; Coops, N.C.; Henebry, G.M.; Levine, J.; Zerriffi, H.; Shibkov, E. Integrating remote sensing and local ecological knowledge to monitor rangeland dynamics. Ecol. Indic. 2017, 82, 106–116. [Google Scholar] [CrossRef]
  99. Koskinen, J.; Leinonen, U.; Vollrath, A.; Ortmann, A.; Lindquist, E.; D’Annunzio, R.; Pekkarinen, A.; Käyhkö, N. Participatory mapping of forest plantations with Open Foris and Google Earth Engine. ISPRS J. Photogramm. Remote Sens. 2019, 148, 63–74. [Google Scholar] [CrossRef]
  100. Chavolla, E.; Zaldivar, D.; Cuevas, E.; Perez-Cisneros, M.A. Color Spaces Advantages and Disadvantages in Image Color Clustering Segmentation. Econom. Financ. Appl. 2017. [Google Scholar] [CrossRef]
  101. Fisher, N. Statistical Analysis of Circular Data; Cambridge University Press: Cambridge, UK, 1996. [Google Scholar]
Figure 1. Examples comparing rainforest types to native sclerophylls from Google Earth and Google Earth street views for the locations marked with a red circle.
Figure 2. Sample tiles and ecoregional strata for the 2-stage classification design, with the rainforest extent from existing 100 m resolution NVIS and NFI references in yellow.
Figure 3. Workflow between GEE (Google Earth Engine) and a desktop GIS (Geographic Information System). Currently, in a participatory mapping process, for example, end users would only need to apply the GIS side of the workflow, while a technician with coding expertise would be responsible for the GEE side of the workflow (unless a GEE application were customised for the GIS functionality).
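The classification step of this workflow is a one-class, one-index density slice: pixels whose index (or hue-angle) value falls between a lower and an upper bound are labelled as the target class, everything else as background. A minimal numpy sketch of the idea (the example hue values and the 30–60° slice are illustrative placeholders, not the study's calibrated thresholds):

```python
import numpy as np

def density_slice(index_raster, lower, upper):
    """One-class density slice: 1 where the index falls inside [lower, upper], else 0."""
    return ((index_raster >= lower) & (index_raster <= upper)).astype(np.uint8)

# Illustrative hue-angle raster (degrees) and an illustrative 'rainforest' slice.
hue = np.array([[25.0, 40.0],
                [55.0, 80.0]])
mask = density_slice(hue, 30.0, 60.0)
```

In the article's workflow the thresholds would be chosen interactively in the GIS against the image interpretation key, then applied at scale in GEE.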
Table 1. Satellite image interpretation key for rainforests. All image coordinates are listed in WGS 1984 decimal degrees, and all images are stretched in the same way in the ‘Land/Water’ RGB: NIR, SWIR1 and Red.
SII Visual Interpretation Key. Columns (image cells omitted here): Example of Rainforest Type and Location; VHR GEE Imagery in ‘Natural Colour’ RGB; Sentinel 2 Image in ‘Land/Water’ RGB with the Same Linear Stretch in GEE; Interpretation Ontology of General Appearance Consistent to the Same Image Stretch; Existing 100 m Resolution Classification Reference for Rainforests in Green; Sentinel 2 Image in ‘Land/Water’ RGB Enhanced with a Histogram Equalisation at the Current Extent in a GIS.

Subtropical Rainforests (location: 28.673 S, 152.476 E; scale 1:100,000). Hue: Orange. Brightness: Average. Texture: Rough. Shape/size: Large. Association: Scattered along coastal lowlands & escarpment foothills, & may extend up escarpment gullies to altitudes of 900 m.

Northern Warm Temperate Rainforests (location: 31.200 S, 152.386 E; scale 1:100,000). Hue: Orange. Brightness: Average. Texture: Rough. Shape/size: Varying. Association: In hilly to steep terrain on coastal ranges & plateaux, & may extend above 1000 m.

Southern Warm Temperate Rainforests (location: 36.032 S, 149.897 E; scale 1:80,000). Hue: Orange. Brightness: Bright. Texture: Smooth. Shape/size: Thin & tributary. Association: Typically in deep, moist, sheltered gullies among coastal ranges & foothills.

Cool Temperate Rainforests (location: 32.067 S, 151.497 E; scale 1:100,000). Hue: Dark orange. Brightness: Duller. Texture: Rough. Shape/size: Varying. Association: High elevation above 900 m.

Dry Rainforests (location: 31.054 S, 152.237 E; scale 1:100,000). Hue: Light orange. Brightness: Bright. Texture: Smooth. Shape/size: Patchy. Association: In rough terrain surrounded by Dry Sclerophylls.

Littoral (coastal) Rainforests (location: 32.431 S, 152.523 E; scale 1:20,000). Hue: Orange. Brightness: Bright. Texture: Smooth. Shape/size: Small, patchy or elongated. Association: Next to the ocean.

Vine thickets in the Northwest Slopes (location: 29.696 S, 150.322 E; scale 1:60,000). Hue: Orange to brown. Brightness: Dull. Texture: Varying. Shape/size: Patchy, of varying sizes & shapes. Association: On flat to rolling terrain.
Table 2. Inter-annual composite temporal ranges by ecoregional strata.
Ecoregional Strata | Temporal Range for Interannual Composites
North Coast | November to December, 2015–2018
South Coast | December to January, 2015–2018
Tablelands and Ranges | November to January, 2015–2018
Northwest Slopes | January to February, 2015–2018
Table 3. Candidate indices and RGB band combinations, where # = new for this study.
Candidate indices relative to the reference RGB. Reference RGB visualisation (RGB: NIR, SWIR1, Red): rainforest in orange. Columns: Candidate Index; Equation or Hue Angle from an RGB Combination; Rationale for Testing; Example appearance (in greyscale for indices and as RGBs for false colour band ratio combinations).

NDVI. Equation: (NIR - Red)/(NIR + Red). Rationale: Commonly used vegetation index for maps with rainforests or closed tropical evergreen forests from the past [86,87,88]. Rainforest in white.

EVI 2. Equation: 2.4 × (NIR - Red)/(NIR + Red + 1). Rationale: Two-band functional equivalent to the EVI, commonly used in tropical forest studies [89]. EVI 2 has been shown to be less sensitive to background reflectance, including bright soils and non-photosynthetically active vegetation [90,91]. Rainforest in white.

NGRDI. Equation: (Green - Red)/(Green + Red). Rationale: Replacing the NIR band (in NDVI) with the Green band appears to reduce the high-value saturation produced by NDVI. Rainforest in white.

# Aravena Rainforest Index. Equation: (SWIR1/Red) - (SWIR1/Green). Rationale: A compromise between the wider spectral range and the lower resolution of the SWIR1 band, subtracting feature-related band ratios of SWIR1, where the Red and Green bands are derived from the NGRDI. Rainforest in white.

# SWIR1/Natural Colour Band Ratio RGB hue. Hue angle from band ratio R,G,B: SWIR1/Red, SWIR1/Green, SWIR1/Blue. Rationale: A distinct hue from a band ratio RGB combination, differentiating the SWIR1 band, which displays the highest water absorption, with the bands from the ‘Natural Colour’ RGB combination to include structural and greenness data and to narrow down and emphasise the moisture and brightness characteristic of rainforests compared to other forest types. Rainforest in orange.

# SWIR1/Infrared Colour Band Ratio RGB hue. Hue angle from band ratio R,G,B: SWIR1/NIR, SWIR1/Red, SWIR1/Green. Rationale: A distinct hue from a band ratio RGB differentiating the SWIR1 band with the bands from the ‘Infrared Colour’ RGB combination to include structural, greenness and wetness data. Rainforest in green.

# Aravena Rainforest Band Ratio RGB Blend hue. Hue angle from band ratio R,G,B: SWIR1/Red, Red/Green, (Red/Blue)/(SWIR1/Green). Rationale: A distinct hue from a band ratio RGB blend, selectively multiplying or dividing band ratios to produce the maximum spectral separability between rainforests and all other features in the landscape. Rainforest in red.
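The candidate indices in Table 3 are simple band arithmetic, and each hue candidate is the standard HSV hue angle of a band-ratio RGB composite. A sketch of both, assuming synthetic surface reflectances and a simple max-scaling of the ratios to [0, 1] (the study's own contrast stretch is not reproduced here):

```python
import numpy as np

def ndvi(nir, red):    return (nir - red) / (nir + red)
def evi2(nir, red):    return 2.4 * (nir - red) / (nir + red + 1)
def ngrdi(green, red): return (green - red) / (green + red)

def aravena_rf(swir1, red, green):
    # Aravena Rainforest Index: (SWIR1/Red) - (SWIR1/Green), per Table 3.
    return swir1 / red - swir1 / green

def hue_angle(r, g, b):
    """Standard HSV hue angle in degrees for one (already scaled) RGB triple."""
    mx, mn = max(r, g, b), min(r, g, b)
    if mx == mn:
        return 0.0
    if mx == r:
        h = (g - b) / (mx - mn) % 6
    elif mx == g:
        h = (b - r) / (mx - mn) + 2
    else:
        h = (r - g) / (mx - mn) + 4
    return 60.0 * h

# Synthetic reflectances for a single vegetated pixel (illustrative values only).
blue, green, red, nir, swir1 = 0.03, 0.06, 0.04, 0.45, 0.20
print(round(ndvi(nir, red), 3))
print(round(aravena_rf(swir1, red, green), 3))

# Hue of the 'SWIR1/Natural Colour' band ratio RGB (Table 3), ratios scaled to [0, 1].
ratios = np.array([swir1 / red, swir1 / green, swir1 / blue])
r, g, b = ratios / ratios.max()
print(round(hue_angle(r, g, b), 1))
```

The hue is circular (0–360°), which is why the article cites circular statistics [101] when summarising hue distributions.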
Table 4. Final accuracy assessment results (%) for the four indices and the three band ratio RGB hues; the highest score in each ecoregion is marked with an asterisk.

Ecoregion | NDVI | EVI 2 | NGRDI | Aravena Rainforest Index | SWIR1/Natural Colour Band Ratio RGB hue | SWIR1/Infrared Colour Band Ratio RGB hue | Aravena Rainforest Band Ratio RGB Blend hue
North Coast | 58.27 | 58.91 | 70.25 | 75.78 | 76.60 | 72.47 | 78.51 *
South Coast | 62.20 | 62.28 | 68.28 | 74.33 | 72.65 | 72.27 | 75.74 *
Tablelands and Ranges | 77.91 | 77.98 | 85.98 | 88.54 * | 81.37 | 85.35 | 88.02
Northwest Slopes | 54.47 | 54.47 | 59.02 | 69.77 | 61.98 | 60.83 | 72.95 *
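As a cross-check on Table 4, the "overall average accuracy of 79%" quoted in the abstract corresponds to the mean of the best-performing column, the Aravena Rainforest Band Ratio RGB Blend hue, across the four ecoregions:

```python
# Per-ecoregion accuracies (%) for the Aravena Rainforest Band Ratio RGB Blend hue,
# transcribed from Table 4.
blend = {
    "North Coast": 78.51,
    "South Coast": 75.74,
    "Tablelands and Ranges": 88.02,
    "Northwest Slopes": 72.95,
}
mean_accuracy = sum(blend.values()) / len(blend)
print(round(mean_accuracy, 1))  # prints 78.8, i.e. the ~79% reported in the abstract
```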
Table 5. Area comparison between the combined NVIS and NFI classifications and the classification from the highest-ranking index in this study by ecoregion.
Ecoregion | Total area from existing 100 m resolution combined NVIS and NFI classification (ha) | Total from this study's 15 m resolution classification (ha) | Percentage difference | Index used
North Coast | 385,902 | 395,238 | 2.4% more | Aravena Rainforest Band Ratio RGB Blend
South Coast | 120,779 | 82,564 | 31.6% less | Aravena Rainforest Band Ratio RGB Blend
Tablelands and Ranges | 198,924 | 257,819 | 29.6% more | Aravena Rainforest Index
Northwest Slopes | 57,470 | 28,726 | 50% less | Aravena Rainforest Band Ratio RGB Blend
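The "percentage difference" column in Table 5 is consistent with a signed difference taken relative to the 100 m NVIS/NFI reference area, which can be verified directly from the two area columns:

```python
def pct_diff(reference_ha, classified_ha):
    """Signed % difference of this study's area relative to the 100 m reference area."""
    return 100.0 * (classified_ha - reference_ha) / reference_ha

rows = {  # ecoregion: (NVIS/NFI reference ha, this study's 15 m classification ha)
    "North Coast": (385_902, 395_238),
    "South Coast": (120_779, 82_564),
    "Tablelands and Ranges": (198_924, 257_819),
    "Northwest Slopes": (57_470, 28_726),
}
for name, (ref, new) in rows.items():
    print(f"{name}: {pct_diff(ref, new):+.1f}%")
```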
Table 6. Ranking of agricultural areas misclassified as rainforest for all indices and hues by ecoregion.
Index or Hue | North Coast (ha / %) | South Coast (ha / %) | Tablelands and Ranges (ha / %) | Northwest Slopes (ha / %)
SWIR1/Natural Colour Band Ratio RGB hue | 218,190 / 55.2 | 95,850 / 116.1 | 104,781 / 40.6 | 39,746 / 138.4
Aravena Rainforest Band Ratio RGB Blend hue | 31,478 / 8 | 24,101 / 29.2 | 68,030 / 26.4 | 25,163 / 87.6
Aravena Rainforest Index | 17,386 / 4.4 | 15,168 / 18.4 | 53,163 / 20.6 | 20,895 / 72.7
SWIR1/Infrared Colour Band Ratio RGB hue | 7,809 / 2 | 5,735 / 6.9 | 7,399 / 2.9 | 5,654 / 19.7
EVI 2 | 3,231 / 0.8 | 739 / 0.9 | 3,623 / 1.4 | 9,495 / 33.1
NDVI | 3,046 / 0.8 | 686 / 0.8 | 3,475 / 1.3 | 9,208 / 32.1
NGRDI | 2,914 / 0.7 | 1,071 / 1.3 | 7,627 / 3 | 7,863 / 27.4
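Summing each index's misclassified agricultural area across the four ecoregions gives a single-number comparison of agricultural confusion per candidate (areas transcribed from Table 6; the shortened index names are labels for this snippet only):

```python
# Misclassified agricultural area (ha) per ecoregion, in the order:
# (North Coast, South Coast, Tablelands and Ranges, Northwest Slopes).
misclassified = {
    "SWIR1/Natural Colour hue": (218_190, 95_850, 104_781, 39_746),
    "Aravena RF Blend hue":     (31_478, 24_101, 68_030, 25_163),
    "Aravena Rainforest Index": (17_386, 15_168, 53_163, 20_895),
    "SWIR1/Infrared hue":       (7_809, 5_735, 7_399, 5_654),
    "EVI 2":                    (3_231, 739, 3_623, 9_495),
    "NDVI":                     (3_046, 686, 3_475, 9_208),
    "NGRDI":                    (2_914, 1_071, 7_627, 7_863),
}
totals = {name: sum(areas) for name, areas in misclassified.items()}
ranking = sorted(totals, key=totals.get, reverse=True)  # worst offender first
```

By total area, the SWIR1/Natural Colour hue confuses by far the most agriculture, while the simple greenness indices (NDVI, EVI 2, NGRDI) confuse the least, the trade-off being their weaker overall accuracy in Table 4.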
Share and Cite

MDPI and ACS Style

Aravena, R.A.; Lyons, M.B.; Roff, A.; Keith, D.A. A Colourimetric Approach to Ecological Remote Sensing: Case Study for the Rainforests of South-Eastern Australia. Remote Sens. 2021, 13, 2544. https://doi.org/10.3390/rs13132544
