
Aerial Mapping of Forests Affected by Pathogens Using UAVs, Hyperspectral Sensors, and Artificial Intelligence

Institute for Future Environments; Robotics and Autonomous Systems, Queensland University of Technology (QUT), 2 George St, Brisbane City, QLD 4000, Australia
Horticulture & Forestry Science, Department of Agriculture & Fisheries, Ecosciences Precinct, 41 Boggo Rd Dutton Park, QLD 4102, Australia
BioProtection Technologies, The New Zealand Institute for Plant & Food Research Limited, Gerald St, Lincoln 7608, New Zealand
Author to whom correspondence should be addressed.
Sensors 2018, 18(4), 944;
Received: 22 February 2018 / Revised: 17 March 2018 / Accepted: 21 March 2018 / Published: 22 March 2018
(This article belongs to the Special Issue UAV or Drones for Remote Sensing Applications)


The environmental and economic impacts of exotic fungal species on natural and plantation forests have been historically catastrophic. Surveillance and control actions are challenging because they are costly, time-consuming, and hazardous in remote areas. The prolonged periods of testing and observation required by site-based tests limit their ability to track the rapid proliferation of exotic pathogens and the deterioration rates of hosts. Recent remote sensing approaches offer fast, broad-scale, and affordable surveys, as well as additional indicators that can complement on-ground tests. This paper proposes a framework that consolidates site-based insights and remote sensing capabilities to detect and segment deterioration caused by fungal pathogens in natural and plantation forests. The approach is illustrated with a case study of myrtle rust (Austropuccinia psidii) on paperbark tea trees (Melaleuca quinquenervia) in New South Wales (NSW), Australia. The method integrates unmanned aerial vehicles (UAVs), hyperspectral image sensors, and data processing algorithms using machine learning. Imagery is acquired using a Headwall Nano-Hyperspec® camera, orthorectified in Headwall SpectralView®, and processed in the Python programming language using the eXtreme Gradient Boosting (XGBoost), Geospatial Data Abstraction Library (GDAL), and Scikit-learn third-party libraries. In total, 11,385 samples were extracted and labelled into five classes: two classes for deterioration status and three classes for background objects. Results reveal individual detection rates of 95% for healthy trees, 97% for deteriorated trees, and a global multiclass detection rate of 97%. The methodology is versatile enough to be applied to additional datasets taken with different image sensors and supports the processing of large datasets with freeware tools.
Keywords: Austropuccinia psidii; drones; hyperspectral camera; machine learning; Melaleuca quinquenervia; myrtle rust; non-invasive assessment; paperbark; unmanned aerial vehicles (UAV); xgboost

1. Introduction

Exotic pathogens have caused irreversible damage to flora and fauna within a range of ecosystems worldwide. Well-known outbreaks include the devastation of chestnut blight (Endothia parasitica) on American chestnut trees (Castanea dentata) in the U.S. [1,2,3], sudden oak death (Phytophthora ramorum) on oak populations (Quercus agrifolia) in Europe, California, and Oregon [4,5,6], dieback (Phytophthora cinnamomi) on hundreds of hosts globally [7,8,9], and myrtle rust (Austropuccinia psidii) on plants of the Myrtaceae family in Australia [10,11,12,13]. The effects of the latter case have raised national alerts and response programmes, given the extensive host range and the ecological and economic importance of Myrtaceae plants in the Australian environment [14,15,16,17]. As a result, various surveillance and eradication programmes have been applied in attempts to minimise the impacts invasive pathogens cause on local hosts, such as dieback in the Western Australian Jarrah forests [18], sudden oak death in the tan oak forests of the U.S. [19], and rapid ohia death (Ceratocystis fimbriata) on ohia trees (Metrosideros polymorpha) in Hawaii [20].
Modern surveillance methods to map hosts vulnerable to and affected by exotic pathogens can be classified into site-based and remote sensing methods, according to Lawley et al. [21]. Site-based approaches commonly cover small regions and collect exhaustive compositional and structural indicators of vegetation condition, with a strong focus on the biophysical attributes of single vegetation communities [22,23]. These methods, nonetheless, require deep expertise and time to conduct experimentation, data collection, and validation, which, along with the limited area they can cover, represents a challenge when assessing effects on a broad scale [24]. Research has also suggested the design of decision frameworks to monitor and control the most threatened species [17,25,26]. Although these models can determine which flora species require immediate management control, the limited amount of tangible, feasible, and broadly quantified data on vulnerable host areas [21] has resulted in a lack of support from state and federal governments [11].
The role of remote sensing methods in assessing and quantifying the impacts of invasive pathogens at broad scale has increased exponentially [27,28]. Standard approaches comprise the use of spectral and image sensors carried by satellite, manned, and unmanned aircraft technology [29]. Concerning the sensing technology itself, methods applied by research communities include the use of non-imaging spectroradiometers; fluorescence, multispectral, hyperspectral, and thermal cameras; and light detection and ranging (LiDAR) technology [30,31,32,33]. This equipment is usually employed for the calculation of spectral indexes [34,35,36,37] and regression models in the host range [38,39]. Nevertheless, these methods are mainly focused on quantification and distribution, among other physical properties of flora species.
Satellite and manned aircraft surveys have reported limitations concerning resolution, operational costs, and unfavourable climate conditions (e.g., cloudiness and hazardous winds) [40]. In contrast, the continuous development of unmanned aerial vehicle (UAV) designs, navigation systems, portable image sensors, and cutting-edge machine learning methods allows for unobtrusive, accurate, and versatile surveillance tools in precision agriculture and biosecurity [41,42,43,44]. Many studies have positioned UAVs for the collection of aerial imagery in applications such as weed, disease, and pest mapping and wildlife monitoring [21,45,46,47]. More recently, unmanned aerial systems (UASs) have been deployed in cluttered and global positioning system (GPS)-denied environments [48].
Approaches combining UAVs, hyperspectral imagery, and artificial intelligence are gaining popularity. For example, Aasen et al. [49] deployed UASs to boost vegetation monitoring efforts using hyperspectral three-dimensional (3D) imagery. Nasi et al. [50] developed techniques to assess pest damage at canopy level using spectral indexes and k-nearest neighbour (k-NN) classifiers, achieving global detection rates of 90%. Similar research focused on disease monitoring, however, has been limited. Calderon et al. [51], for instance, evaluated the early detection and quantification of verticillium wilt in olive plants using support vector machines (SVMs) and linear discriminant analysis (LDA), obtaining mixed accuracy results among the evaluated classes of infection severity (59–75%) [52]. Albetis et al. [53] presented a system to discriminate asymptomatic and symptomatic red and white vineyard cultivars affected by Flavescence doree using UAVs, multispectral imagery, and up to 20 data features, collecting contrasting results between the cultivars and maximum accuracy rates of 88%. In sum, the integration of site-based and remote sensing frameworks has boosted the capabilities of these surveillance solutions by combining data from abiotic and biotic factors with spectral responses [54]. However, this synthesis is still challenging due to the high number of singularities, data correlation issues, and validation procedures present in each case study.
Considering the importance of site-based and remote sensing methods for obtaining reliable and broader assessments of forest health, specifically for pest and fungal assessments [55], this paper presents an integrated system that classifies and maps natural and plantation forests exposed to and deteriorated by fungal pathogens using UAVs, hyperspectral sensors, and artificial intelligence. The framework is exemplified by a case study of myrtle rust on paperbark tea trees (Melaleuca quinquenervia) in a swamp ecosystem of Northeast New South Wales (NSW), Australia.

2. System Framework

A novel framework was designed for the assessment of natural and plantation forests exposed and potentially exposed to pathogens as presented in Figure 1. It comprises four sections linked to each other, denoted as Data Acquisition, Data Preparation, Training, and Prediction. The system interacts directly and indirectly with the surveyed area to acquire information, preprocess and arrange obtained data into features, fit a supervised machine learning classifier, tune the accuracy and performance indicators to process vast amounts of data, and provide prediction reports through segmented images.

2.1. Data Acquisition

The data acquisition process involves an indirect data collection campaign using an airborne system and direct ground assessments of the studied pathogen through exclusion trials, which are controlled by biosecurity experts. Airborne data is compiled using a UAV, image sensors, and a ground workstation on site to acquire data above the tree canopy. Similarly, ground data is collected through field assessment insights by biosecurity experts. Field assessments capture several factors such as growth, reproduction, and regeneration on coppiced trees exposed to the specific pathogen. A database is created by labelling, georeferencing, and correlating relevant insights for every tested plant. Details on the studied area, flight campaigns, and field assessments can be found in Section 3.1, Section 3.2 and Section 3.3.

2.1.1. UAV and Ground Station

This methodology incorporates a hexa-rotor DJI S800 EVO (DJI, Guangdong, China) UAV. The drone features high-performance brushless rotors, a total load capacity of 3.9 kg, and dimensions of 118 cm × 100 cm × 50 cm. It follows an automatic mission route using the DJI Ground Station 4.0 software, which remotely controls the route, speed, height above the ground, and overlapping values. It is worth mentioning, however, that other UAVs with similar characteristics can also be used and incorporated into the airborne data collection process of Figure 1.

2.1.2. Sensors

Spectral data is collected using a Headwall Nano-Hyperspec® hyperspectral camera (Headwall Photonics Inc., Bolton, MA, USA). This visible near infrared (VNIR) sensor provides spectral responses in up to 274 bands, a wavelength range from 385 to 1000 nm, a spectral resolution of 2.2 nm, a frame rate of 300 Hz, 640 spatial pixels, and a total storage capacity of 480 GB. The camera is mounted on a customised gimbal system that augments data quality by minimising external disturbances such as roll, pitch, and yaw oscillations, as depicted in Figure 2.

2.2. Data Preparation

Imagery is downloaded and fed into a range of software solutions that transform raw data into filtered and orthorectified spectral bands in reflectance. Using Headwall SpectralView® software, raw hypercubes from the surveyed area were automatically processed. The preprocessing operations included radiance conversion, orthorectification (through ground control points), and extraction of the white reference illumination spectrum (WRIS). The WRIS comes from the extraction of the radiance signature of a Spectralon target in an image acquired in the surveyed area. Using the orthorectified imagery and the WRIS, reflectance data and scene shading are calculated through CSIRO|Data61 Scyven 1.3.0 software [56,57]. Considering the massive amount of data contained in any orthorectified hyperspectral cube (4–12 GB), each cube is cropped to process regions of interest only (0.5–1 GB) and handle system resources efficiently, as shown in Figure 3.
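Conceptually, the reflectance conversion performed by these tools divides each pixel's radiance spectrum by the radiance signature of the white (Spectralon) reference panel captured in the same scene. A minimal NumPy sketch of that idea, assuming an ideal panel (the function and array names are illustrative, not the software's API):

```python
import numpy as np

def to_reflectance(radiance_cube, panel_signature):
    """Convert a radiance hypercube (rows, cols, bands) to reflectance by
    dividing each pixel spectrum by the radiance signature of a white
    (Spectralon) reference panel captured in the same scene."""
    # NumPy broadcasting divides every pixel's spectrum by the panel spectrum.
    return radiance_cube / panel_signature

# Toy example: a 2 x 2 scene with 3 bands; panel radiance per band.
cube = np.array([[[10.0, 20.0, 30.0], [5.0, 10.0, 15.0]],
                 [[20.0, 40.0, 60.0], [10.0, 20.0, 30.0]]])
panel = np.array([20.0, 40.0, 60.0])
refl = to_reflectance(cube, panel)
print(refl[0, 0])  # → [0.5 0.5 0.5]
```

In practice the panel signature is extracted from the pixels covering the Spectralon target, and illumination or shading corrections (as computed by Scyven) refine this simple ratio.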
An additional image that contains the localisation of the tested trees in the exclusion trial is created. From a recovered red–green–blue colour model (RGB) image of the cropped hypercube, each tree is graphically labelled using its tracked GPS coordinates in ArcGIS ArcMap 10.4. To handle all the data processing of the proposed system framework, Algorithm 1 was developed. These tasks were conducted with the Python 2.7.14 programming language and several third-party libraries for data manipulation and machine learning, including the Geospatial Data Abstraction Library (GDAL) 2.2.2 [58], eXtreme Gradient Boosting (XGBoost) 0.6 [59], Scikit-learn 0.19.1 [60], OpenCV 3.3.0 [61], and Matplotlib 2.1.0 [62].
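Before classification, the band layers of the cropped cube (as returned by, e.g., GDAL's `ReadAsArray()`, which yields a `(bands, rows, cols)` array) must be rearranged into a pixels-by-features matrix. A minimal sketch of that reshaping with illustrative dimensions:

```python
import numpy as np

# Illustrative cube: 4 bands over a 3 x 5 pixel region, in the
# (bands, rows, cols) layout that GDAL's ReadAsArray() returns.
bands, rows, cols = 4, 3, 5
cube = np.random.rand(bands, rows, cols)

# Step 3 of Algorithm 1: one row per pixel, one column per feature,
# the layout expected by scikit-learn and XGBoost.
X = cube.reshape(bands, rows * cols).T
print(X.shape)  # (15, 4)

# Row 0 of X holds the full spectrum of pixel (0, 0).
assert np.allclose(X[0], cube[:, 0, 0])
```

The inverse reshape restores the image geometry after prediction, which is how the segmented map is recovered in the prediction stage.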
Algorithm 1 Detection and mapping of vegetation alterations using spectral imagery and sets of features.
Require: orthorectified layers (bands) in reflectance I; labelled regions from field assessments L.
Data Preparation
1: Load I data.
2: S ← spectral indexes array from I.
3: X ← features array [I, S].
4: Y ← labels array from dataset L.
Training
5: D ← dataset of features X filtered to the pixels with corresponding labels in Y.
6: Split D into training data D_T and testing data D_E.
7: Fit an XGBoost classifier C using D_T.
8: R ← list of unique relevance values of the processed features X from C.
9: for all values in R do
10:   D_TF ← D_T with under-scored features filtered out.
11:   Fit C using D_TF.
12:   Append accuracy values from C into T.
13: end for
14: Fit C using the best features threshold from T.
15: Validate C with k-fold cross-validation on D_TF.    ▹ number of folds = 10
Prediction
16: P ← predicted values for each sample in X.
17: Convert the P array into a 2D orthorectified image.
18: O ← displayed/overlaid image.
19: return O
Reflectance cube bands are loaded into the program to calculate suitable spectral indexes and improve the detection rates, as mentioned in Step 2. For this approach, the most traditional indexes are calculated: the normalised difference vegetation index (NDVI) [63], the green normalised difference vegetation index (GNDVI) [64], the soil-adjusted vegetation index (SAVI) [65], and the second modified soil-adjusted vegetation index (MSAVI2) [66]. Additionally, two-dimensional (2D) smoothing kernels are applied to these indexes, following Equation (1).
K = \frac{1}{w^{2}} \begin{bmatrix} 1 & \cdots & 1 \\ \vdots & \ddots & \vdots \\ 1 & \cdots & 1 \end{bmatrix}_{w \times w}    (1)
where K is the kernel of the filter and w is the size of the window. Here, up to three kernels, for w = 3, w = 7, and w = 15, are calculated per vegetation index. All the hypercube bands in reflectance, as well as the calculated vegetation spectral indexes, are termed data features. In Step 3, an array of features is generated from all the retrieved bands I and the calculated indexes S.
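The four indexes and the smoothing of Equation (1) can be sketched as follows. The box filter is implemented here with SciPy's `uniform_filter`, which is equivalent to convolving with the 1/w² kernel of ones; the small epsilon guarding against division by zero is an addition of this sketch, not part of the paper's formulas:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def ndvi(nir, red):
    return (nir - red) / (nir + red + 1e-12)  # epsilon avoids division by zero

def gndvi(nir, green):
    return (nir - green) / (nir + green + 1e-12)

def savi(nir, red, L=0.5):
    return (nir - red) * (1.0 + L) / (nir + red + L)

def msavi2(nir, red):
    return (2.0 * nir + 1.0
            - np.sqrt((2.0 * nir + 1.0) ** 2 - 8.0 * (nir - red))) / 2.0

def smooth(index_img, w):
    # Equation (1): convolution with a w x w kernel of ones scaled by 1/w^2,
    # i.e., a moving-average (box) filter.
    return uniform_filter(index_img, size=w)

# Toy constant-reflectance scene: NIR = 0.8, red = 0.2.
nir = np.full((5, 5), 0.8)
red = np.full((5, 5), 0.2)
print(round(float(ndvi(nir, red)[0, 0]), 3))  # → 0.6
```

Each index image, plus its smoothed versions for w = 3, 7, and 15, becomes one additional data feature alongside the raw reflectance bands.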

2.3. Training and Prediction

The labelled regions from the ground-based assessments are exported from ArcMap and loaded into an array Y. In Step 5, an array D is created by filtering the features from X that have corresponding labels in Y. The filtered data is separated into training (80%) and testing (20%) arrays. In Step 7, the data is processed by a supervised XGBoost classifier. This model was chosen considering the moderate amount of labelled data (insufficient to run a deep learning model), the amount of information to be processed for a single hypercube, and the nature of the data from the exclusion trial (a ground-based test). This classifier is a cutting-edge decision tree and gradient boosting model optimised for large tree structures, excellent performance, and fast execution speeds, outperforming the detection rates of standard non-linear models such as random forests, k-NN, LDA, and SVMs [59]. Moreover, unlike other models, its input data features do not require any scaling (normalisation). Once the model is fitted, Step 8 retrieves the relevance of each processed feature (reflectance bands, spectral indexes, and transformed images) from the array X into a list R. The list is sorted to discard irrelevant features, increase the detection rates of the algorithm, avoid over-fitting, decrease the complexity of the model, and reduce computer processing loads.
Algorithm 1 executes a loop to evaluate the optimal number of ranked features that offers the best balance between accuracy and data processing load. At each instance of the loop, specific features from the training set D_T are filtered based on a threshold of relevance scores (Step 10). The XGBoost model is then fitted using the filtered training set (Step 11) to record the accuracy values of each combination. In Step 14, the best combination of accuracy and number of features is retrieved to re-fit the classifier. Finally, the fitted model is validated using k-fold cross-validation (Step 15).
In the prediction stage, unlabelled pixels are processed by the optimised classifier, and their values are displayed in the same 2D spatial layout as the orthorectified hyperspectral cube. Ultimately, classified pixels are depicted using distinguishable colours and exported in tagged image file (TIF) format, which is compatible with georeferencing software.
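Steps 16–18 amount to reshaping the per-pixel predictions back into image geometry and applying a colour palette. A minimal sketch (the class names and colours are illustrative; the paper does not name its three background classes, and the GeoTIFF export via GDAL is omitted here):

```python
import numpy as np

rows, cols = 4, 6
# Hypothetical per-pixel class predictions P for five classes (Step 16).
P = np.random.RandomState(2).randint(0, 5, size=rows * cols)

# Step 17: restore the 2D geometry of the orthorectified cube.
class_map = P.reshape(rows, cols)

# Step 18: distinguishable colour per class (illustrative palette only).
palette = np.array([[0, 200, 0],      # healthy
                    [200, 0, 0],      # affected
                    [120, 80, 40],    # background class A
                    [0, 0, 200],      # background class B
                    [128, 128, 128]]) # background class C
rgb = palette[class_map]  # (rows, cols, 3) overlay image O
print(rgb.shape)  # (4, 6, 3)
```

Writing `rgb` out with GDAL's GTiff driver, copying the geotransform and projection of the source cube, yields the georeferenced TIF described above.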

3. Experimentation Setup

3.1. Site

As displayed in Figure 4a, the experimentation occurred in a paperbark tea tree forest located near 71 Boggy Creek Rd, Bungawalbin, NSW 2469, Australia (29°04′42.9″ S 153°15′10.0″ E). Data acquisition was conducted on 25 August 2016 at 11:40 a.m. Weather observations for that day indicated a partially cloudy day, with a mean temperature of 18.8 °C, a relative humidity of 46%, west-northwest winds of 20 km/h, a pressure of 1009.5 hPa, and no precipitation [67]. As seen in Figure 4b, the site includes selected paperbark trees that are monitored to assess the effects of myrtle rust on them.

3.2. Flight Campaign

The flight campaign comprised a single mission route. The UAV was operated at a constant flight height of 20 m above the ground, with an overlap of 80%, a side lap of 50%, a mean velocity of 4.52 km/h, and a total distance of 1.43 km. The acquired hyperspectral dataset had vertical and horizontal ground sample distances (GSD) of 4.7 cm/pixel.
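As a sanity check, the across-track footprint implied by these figures follows from the GSD and the sensor's 640 spatial pixels (the swath value below is an inference from the stated numbers, not a figure reported in the paper):

```python
# Back-of-envelope survey geometry from Section 3.2 and the sensor specs.
gsd_m = 0.047           # 4.7 cm/pixel ground sample distance
across_track_px = 640   # spatial pixels of the Nano-Hyperspec line sensor

# Across-track swath width per flight line at 20 m altitude.
swath_m = gsd_m * across_track_px
print(round(swath_m, 2))  # → 30.08
```

A swath of roughly 30 m per line, combined with the 50% side lap, is consistent with covering the trial area in a single short mission.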

3.3. Field Assessments

In order to evaluate the effects of myrtle rust on paperbark trees and the flora response, a biological study in an exclusion trial was conducted on the mentioned site. Several on-ground assessments were conducted on individual trees within a replicated block design using four treatments: trees treated with fungicides (F), insecticides (I), fungicides and insecticides (F + I), and trees without any treatment. Figure 5 illustrates the treatment methods the studied trees received.
From the indicators generated, insect and disease assessments were extracted to label every tree. Overall, the assessment report showed that only the trees that received both insecticide and fungicide treatments remained healthy under direct exposure to the rust. In contrast, trees treated with insecticide only were affected by rust, and those treated with fungicide only were affected by insects. Thus, trees treated with fungicides and insecticides were labelled as healthy and the others as affected in the database.

3.4. Preprocessing

As mentioned in Section 2.2, the entire surveyed area included an exclusion trial. As a result, an orthorectified hyperspectral cube in reflectance with spatial dimensions of 2652 × 1882 pixels and a size of 5.4 GB was generated. The trial area was cropped from the original hypercube, reducing computational costs and discarding irrelevant data. Eventually, a 1.7 GB cube of 1200 × 1400 pixels, as depicted in Figure 6, was extracted.

3.5. Training and Prediction

The XGBoost classifier contains several hyper-parameters to be set. Following a grid search technique, in which the classifier performance and detection rates were tracked over a set of candidate hyper-parameters, the optimal values found for this case study were
estimators = 100, learning rate = 0.1, maximum depth = 3
where “estimators” is the number of trees, “learning rate” is the step size of each boosting step, and “maximum depth” is the maximum depth per tree that defines the complexity of the model.

4. Results and Discussion

To visualise the benefits of inserting an optimisation scheme in Step 8 of Algorithm 1, detection rates were tracked by training and running the classifier multiple times with only a set of filtered features per instance. The features were ranked with their relevance by the XGBoost classifier and sorted consequently, as illustrated in Figure 7.
The classifier can achieve global accuracy rates exceeding 97% when it processes data using only 10 to 40 features, with an optimal number of features of 24. On the other hand, the classifier barely improves its scores when the number of processed features is larger. With this capability, the proposed approach can process less data and reduce the number of calculations needed to achieve high detection values. Additionally, this boosts the capability of the algorithm to process large datasets in less time, an ideal scenario for mapping vast rural areas. The most relevant features of this case study are depicted in Table 1 and Figure 8.
The first four features for this classification task come from specific vegetation indexes and images processed by 2D kernels, specifically the NDVI, shading, and GNDVI features (Figure 8a–d). Although their illustrations show distinguishable intensities between the healthy and affected tree regions of Figure 5, these sets of features are insufficient for segmenting areas of other objects. Thus, specific reflectance wavelength bands such as 999 and 444 nm (Figure 8e,f) are also decisive. Additionally, features processed with 2D kernels obtained better relevance scores than their unprocessed counterparts. That difference was even greater for features processed with large window kernels, considering that the high amounts of noise common in raw hyperspectral imagery altered the performance of the approach. Nonetheless, these rankings do not suggest that these features can be used as global indicators to detect and map similar case studies (myrtle rust); the feature ranking table shown here is relevant to the fitted XGBoost model only, and results may differ if the same features are processed through other machine learning techniques. It is recommended, therefore, to perform individual analyses for every case study.
A total of 11,385 pixels contained in the 23 features filtered by their relevance were read again in Step 14 of Algorithm 1. The data was divided into a training array D_T with 9108 pixels and a testing array D_E with 2277 pixels. The generated confusion matrix of the classifier and its performance report are shown in Table 2 and Table 3.
In sum, most of the classes were predicted favourably. The majority of misclassifications between the “Healthy” and “Affected” classes are possibly caused by human error while labelling the regions manually in the raw imagery. Considering a weighted importance of precision and recall of 1:1, the F1 scores highlight a detection rate of 97.24% for healthy trees, 94.25% for affected trees, and an overall detection rate of 97.35%. Validation through k-fold cross-validation shows that the presented approach has an accuracy of 96.79%, with a standard deviation of 0.567%.
The performance of Algorithm 1 was tested on a computer with the following characteristics: an Intel® Core™ i7-4770 processor, a 256 GB SSD, 16 GB of RAM, Windows 7 64-bit, and an AMD Radeon™ HD 8490 graphics card. Table 4 reports the elapsed seconds for the application to accomplish the primary data processing, training, and prediction tasks.
Taking into account the dimensions of the processed hypercube (1400 × 1200) and the initial number of bands (274), a great demand on resources was observed for opening the file itself and calculating the spectral indexes, accumulating 61.7 s on average. Similarly, the feature filtering process in the training section also demanded considerable time, exceeding 50 s. On the other hand, the elapsed time for the remaining tasks of the training phase was remarkably short. Specifically, the report highlights the benefits of filtering irrelevant features by comparing the duration of fitting the classifier for the first time with the duration of re-fitting it with less yet more relevant data: from 8.74 to 0.99 s. Overall, the application spent 2 min and 51 s to evaluate and map an area of approximately 338 m².
The GSD value of 4.7 cm/pixel of the acquired hyperspectral imagery represented a minor challenge in labelling individual trees but was still problematic when specific stems or leaflets needed to be highlighted. Higher resolution could assist in achieving higher classification rates. As an illustration, the final segmented image of the optimised classifier is shown in Figure 9, where Figure 9a shows the digital labelling of every class region and Figure 9b depicts the segmentation output generated by Algorithm 1. A hypercube covering the entire flown area was also processed using the trained model, with results shown in Figure 10.
The results show that the segmentation output using XGBoost as the supervised machine learning classifier works well for this task. This classifier, as well as Algorithm 1, is not only valuable for its pixel-wise classification capability; compared to other methods, it also allows rapid convergence, does not involve many complex mathematical calculations, and filters irrelevant data. Nevertheless, it is suggested that its prediction performance be revised with new data. Like any model based on decision trees, over-fitting may occur, and misleading results might be generated. In those situations, labelling the misclassified data, aggregating it into the features database, and rerunning the algorithm is suggested.
The ability to process and classify data with small GSD values demonstrates the potential of UASs as remote sensing equipment for forest health assessments on forests and tree plantations, compared with satellite and manned aircraft surveys and with traditional estimation methods such as statistical regression models. In comparison with similar non-invasive assessment approaches using UAVs and spectral sensors, this framework does not provide general spectral indexes that can be applied with different classifiers in similar evaluations. In contrast, the presented method boosts the efficiency of the classifier by receiving feedback from the accuracy scores of every feature and transforming the input data accordingly. The more explicit the data for the classifier is, the better the classification rates are. Furthermore, it is also demonstrated that a classifier which processes and combines data from multiple spectral indexes provides better performance than analysing individual components from different source sensors.

5. Conclusions

This paper describes a pipeline methodology for the effective detection and mapping of indicators of poor health in forest and plantation trees, integrating UAS technology and artificial intelligence approaches. The techniques were illustrated with an accurate classification and segmentation task of paperbark tea trees deteriorated by myrtle rust from an exclusion trial in NSW, Australia. Here, the system achieved detection rates of 97.24% for healthy trees and 94.72% for affected trees. The algorithm obtained a multiclass detection rate of 97.35%. Data labelling is a task that demands many resources from both site-based and remote sensing methods and, due to human error, affects the accuracy and reliability of the classifier results.
The approach can be trained on various datasets from different sensors to improve on the detection rates that single solutions offer, and it is capable of processing large datasets using freeware software. The case study demonstrates an effective approach that provides rapid and accurate indicators and detects alterations of exposed areas at early stages. However, understanding disease epidemiology and the interactions between pathogens and hosts is still required for the effective use of these technologies.
Future research should discuss the potential of monitoring the evolution of affected species through time, the prediction of expansion directions and rates of the disease, and how data will contribute to improving control actions to deter their presence in unaffected areas. Technologically, future works should analyse and compare the efficacy of unsupervised algorithms to label vegetation items accurately, integrate the best approaches in the proposed pipeline, and evaluate regression models that predict data based on other biophysical information offered by site-based methods.


Acknowledgments

This work was funded by the Plant Biosecurity Cooperative Research Centre (PBCRC) 2135 project. The authors would like to acknowledge Jonathan Kok for his contributions in co-planning the experimentation phase. We also gratefully acknowledge the support of the Queensland University of Technology (QUT) Research Engineering Facility (REF) Operations team (Dirk Lessner, Dean Gilligan, Gavin Broadbent and Dmitry Bratanov), who operated the DJI S800 EVO UAV and image sensors, and performed ground referencing. We thank Gavin Broadbent for the design, manufacturing, and tuning of a customised 2-axis gimbal for the spectral cameras. We acknowledge the High-Performance Computing and Research Support Group at QUT for the computational resources and services used in this work.

Author Contributions

Felipe Gonzalez and Geoff Pegg contributed to experimentation and data collection planning. Felipe Gonzalez supervised the airborne surveys, the quality of the acquired data, and logistics. Juan Sandino designed the proposed pipeline and conducted the data processing phase. Felipe Gonzalez, Geoff Pegg, and Grant Smith provided definitions, assistance, and essential advice. Juan Sandino analysed the generated outputs, and validated and optimised the algorithm. All the authors contributed significantly to the composition and revision of the paper.

Conflicts of Interest

The authors declare no conflict of interest. The founding sponsors had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.


Abbreviations

The following abbreviations are used in this manuscript:
F + IFungicides and Insecticides
GDALGeospatial data abstraction library
GPSGlobal positioning system
GNDVIGreen normalised difference vegetation index
GSDGround sampling distance
k-NNk-nearest neighbours
LDALinear discriminant analysis
LiDARLight detection and ranging
MDPIMultidisciplinary digital publishing institute
MSAVI2Second modified soil-adjusted vegetation index
NDVINormalised difference vegetation index
NSWNew South Wales
RGBRed–green–blue colour model
SAVISoil-adjusted Vegetation Index
SVMSupport Vector Machines
TIFTagged Image File
UASUnmanned Aerial System
UAVUnmanned Aerial Vehicle
VNIRVisible Near Infrared
WRISWhite Reference Illumination Spectrum
XGBoosteXtreme Gradient Boosting
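Several of the vegetation indices listed above (NDVI, GNDVI, SAVI, MSAVI2) are computed in the processing pipeline. As a reference, their standard definitions from the cited literature can be sketched in NumPy as follows; the band arrays and reflectance values are illustrative, not the study's data:

```python
import numpy as np

def ndvi(nir, red):
    # Normalised difference vegetation index (Rouse et al.)
    return (nir - red) / (nir + red)

def gndvi(nir, green):
    # Green NDVI (Gitelson et al.): the green band replaces red
    return (nir - green) / (nir + green)

def savi(nir, red, L=0.5):
    # Soil-adjusted vegetation index (Huete); L is the soil-brightness factor
    return (nir - red) * (1.0 + L) / (nir + red + L)

def msavi2(nir, red):
    # Second modified SAVI: the soil factor is derived rather than fixed
    return (2.0 * nir + 1.0
            - np.sqrt((2.0 * nir + 1.0) ** 2 - 8.0 * (nir - red))) / 2.0

# Illustrative reflectance values in [0, 1] for a single pixel
nir, red, green = np.array([0.6]), np.array([0.1]), np.array([0.15])
indices = {"NDVI": ndvi(nir, red), "GNDVI": gndvi(nir, green),
           "SAVI": savi(nir, red), "MSAVI2": msavi2(nir, red)}
```

In practice these functions are applied per pixel over whole hypercube bands, which NumPy broadcasting handles without modification.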


References

  1. Smock, L.A.; MacGregor, C.M. Impact of the American Chestnut Blight on Aquatic Shredding Macroinvertebrates. J. N. Am. Benthol. Soc. 1988, 7, 212–221. [Google Scholar] [CrossRef]
  2. Anagnostakis, S.L. Chestnut Blight: The classical problem of an introduced pathogen. Mycologia 1987, 79, 23–37. [Google Scholar] [CrossRef]
  3. Burke, K.L. The effects of logging and disease on American chestnut. For. Ecol. Manag. 2011, 261, 1027–1033. [Google Scholar] [CrossRef]
  4. Rizzo, D.M.; Garbelotto, M.; Hansen, E.M. Phytophthora ramorum: Integrative research and management of an emerging pathogen in California and oregon forests. Annu. Rev. Phytopathol. 2005, 43, 309–335. [Google Scholar] [CrossRef] [PubMed]
  5. Frankel, S.J. Sudden oak death and Phytophthora ramorum in the USA: A management challenge. Australas. Plant Pathol. 2008, 37, 19–25. [Google Scholar] [CrossRef]
  6. Grünwald, N.J.; Garbelotto, M.; Goss, E.M.; Heungens, K.; Prospero, S. Emergence of the sudden oak death pathogen Phytophthora ramorum. Trends Microbiol. 2012, 20, 131–138. [Google Scholar] [CrossRef] [PubMed]
  7. Hardham, A.R. Phytophthora cinnamomi. Mol. Plant Pathol. 2005, 6, 589–604. [Google Scholar] [CrossRef] [PubMed]
  8. Shearer, B.L.; Crane, C.E.; Barrett, S.; Cochrane, A. Phytophthora cinnamomi invasion, a major threatening process to conservation of flora diversity in the South-West Botanical Province of Western Australia. Aust. J. Bot. 2007, 55, 225–238. [Google Scholar] [CrossRef]
  9. Burgess, T.I.; Scott, J.K.; Mcdougall, K.L.; Stukely, M.J.; Crane, C.; Dunstan, W.A.; Brigg, F.; Andjic, V.; White, D.; Rudman, T.; et al. Current and projected global distribution of Phytophthora cinnamomi, one of the world’s worst plant pathogens. Glob. Chang. Biol. 2017, 23, 1661–1674. [Google Scholar] [CrossRef] [PubMed]
  10. Pegg, G.S.; Giblin, F.R.; McTaggart, A.R.; Guymer, G.P.; Taylor, H.; Ireland, K.B.; Shivas, R.G.; Perry, S. Puccinia psidii in Queensland, Australia: Disease symptoms, distribution and impact. Plant Pathol. 2014, 63, 1005–1021. [Google Scholar] [CrossRef]
  11. Carnegie, A.J.; Kathuria, A.; Pegg, G.S.; Entwistle, P.; Nagel, M.; Giblin, F.R. Impact of the invasive rust Puccinia psidii (myrtle rust) on native Myrtaceae in natural ecosystems in Australia. Biol. Invasions 2016, 18, 127–144. [Google Scholar] [CrossRef]
  12. Howard, C.; Findlay, V.; Grant, C. Australia’s transition to management of myrtle rust. J. For. Sci. 2016, 61, 138–139. [Google Scholar] [CrossRef]
  13. Fernandez Winzer, L.; Carnegie, A.J.; Pegg, G.S.; Leishman, M.R. Impacts of the invasive fungus Austropuccinia psidii (myrtle rust) on three Australian Myrtaceae species of coastal swamp woodland. Austral Ecol. 2017, 43. [Google Scholar] [CrossRef]
  14. Dayton, L.; Higgins, E. Myrtle rust ‘biggest threat to ecosystem’. Available online: (accessed on 19 February 2018).
  15. Carnegie, A.J.; Cooper, K. Emergency response to the incursion of an exotic myrtaceous rust in Australia. Australas. Plant Pathol. 2011, 40, 346–359. [Google Scholar] [CrossRef]
  16. Carnegie, A.J. First Report of Puccinia psidii (Myrtle Rust) in Eucalyptus Plantations in Australia. Plant Dis. 2015, 99, 161. [Google Scholar] [CrossRef]
  17. Pegg, G.; Taylor, T.; Entwistle, P.; Guymer, G.; Giblin, F.; Carnegie, A. Impact of Austropuccinia psidii (myrtle rust) on Myrtaceae-rich wet sclerophyll forests in south east Queensland. PLoS ONE 2017, 12, e0188058. [Google Scholar] [CrossRef] [PubMed]
  18. Government of Western Australia. Phytophthora Dieback—Parks and Wildlife Service. Available online: (accessed on 19 February 2018).
  19. U.S. Forest Service. Sudden Oak Death (SOD)|Partnerships|PSW Research Station|Forest Service. Available online: (accessed on 19 February 2018).
  20. State of Hawaii. Department of Agriculture|How to Report Suspected Ohia Wilt/Rapid Ohia Death. Available online: (accessed on 19 February 2018).
  21. Lawley, V.; Lewis, M.; Clarke, K.; Ostendorf, B. Site-based and remote sensing methods for monitoring indicators of vegetation condition: An Australian review. Ecol. Indic. 2016, 60, 1273–1283. [Google Scholar] [CrossRef]
  22. Oliver, I.; Smith, P.L.; Lunt, I.; Parkes, D. Pre-1750 vegetation, naturalness and vegetation condition: What are the implications for biodiversity conservation? Ecol. Manag. Restor. 2002, 3, 176–178. [Google Scholar] [CrossRef]
  23. Lawley, V.; Parrott, L.; Lewis, M.; Sinclair, R.; Ostendorf, B. Self-organization and complex dynamics of regenerating vegetation in an arid ecosystem: 82 years of recovery after grazing. J. Arid Environ. 2013, 88, 156–164. [Google Scholar] [CrossRef]
  24. Ostendorf, B. Overview: Spatial information and indicators for sustainable management of natural resources. Ecol. Indic. 2011, 11, 97–102. [Google Scholar] [CrossRef]
  25. Roux, J.; Germishuizen, I.; Nadel, R.; Lee, D.J.; Wingfield, M.J.; Pegg, G.S. Risk assessment for Puccinia psidii becoming established in South Africa. Plant Pathol. 2015, 64, 1326–1335. [Google Scholar] [CrossRef]
  26. Berthon, K.; Esperon-Rodriguez, M.; Beaumont, L.; Carnegie, A.; Leishman, M. Assessment and prioritisation of plant species at risk from myrtle rust (Austropuccinia psidii) under current and future climates in Australia. Biol. Conserv. 2018, 218, 154–162. [Google Scholar] [CrossRef]
  27. Lausch, A.; Erasmi, S.; King, D.J.; Magdon, P.; Heurich, M. Understanding Forest Health with Remote Sensing, Part I: A Review of Spectral Traits, Processes and Remote-Sensing Characteristics. Remote Sens. 2016, 8, 1029. [Google Scholar] [CrossRef]
  28. Tuominen, J.; Lipping, T.; Kuosmanen, V.; Haapanen, R. Remote sensing of forest health. In Geoscience and Remote Sensing; Ho, P.G.P., Ed.; InTech: Rijeka, Croatia, 2009; Chapter 02. [Google Scholar]
  29. Lausch, A.; Erasmi, S.; King, D.J.; Magdon, P.; Heurich, M. Understanding forest health with remote sensing, Part II: A review of approaches and data models. Remote Sens. 2017, 9, 129. [Google Scholar] [CrossRef]
  30. Cui, D.; Zhang, Q.; Li, M.; Zhao, Y.; Hartman, G.L. Detection of soybean rust using a multispectral image sensor. Sens. Instrum. Food Qual. Saf. 2009, 3, 49–56. [Google Scholar] [CrossRef]
  31. Candiago, S.; Remondino, F.; De Giglio, M.; Dubbini, M.; Gattelli, M. Evaluating multispectral images and vegetation indices for precision farming applications from UAV images. Remote Sens. 2015, 7, 4026–4047. [Google Scholar] [CrossRef]
  32. Lowe, A.; Harrison, N.; French, A.P. Hyperspectral image analysis techniques for the detection and classification of the early onset of plant disease and stress. Plant Methods 2017, 13, 80. [Google Scholar] [CrossRef] [PubMed]
  33. Khanal, S.; Fulton, J.; Shearer, S. An overview of current and potential applications of thermal remote sensing in precision agriculture. Comput. Electron. Agric. 2017, 139, 22–32. [Google Scholar] [CrossRef]
  34. Devadas, R.; Lamb, D.W.; Simpfendorfer, S.; Backhouse, D. Evaluating ten spectral vegetation indices for identifying rust infection in individual wheat leaves. Precis. Agric. 2009, 10, 459–470. [Google Scholar] [CrossRef]
  35. Ashourloo, D.; Mobasheri, M.; Huete, A. Developing two spectral disease indices for detection of wheat leaf rust (Pucciniatriticina). Remote Sens. 2014, 6, 4723–4740. [Google Scholar] [CrossRef]
  36. Wang, H.; Qin, F.; Liu, Q.; Ruan, L.; Wang, R.; Ma, Z.; Li, X.; Cheng, P.; Wang, H. Identification and disease index inversion of wheat stripe rust and wheat leaf rust based on hyperspectral data at canopy level. J. Spectrosc. 2015, 2015, 1–10. [Google Scholar] [CrossRef]
  37. Heim, R.H.J.; Wright, I.J.; Chang, H.C.; Carnegie, A.J.; Pegg, G.S.; Lancaster, E.K.; Falster, D.S.; Oldeland, J. Detecting myrtle rust (Austropuccinia psidii) on lemon myrtle trees using spectral signatures and machine learning. Plant Pathol. 2018. [Google Scholar] [CrossRef]
  38. Booth, T.H.; Jovanovic, T. Assessing vulnerable areas for Puccinia psidii (eucalyptus rust) in Australia. Australas. Plant Pathol. 2012, 41, 425–429. [Google Scholar] [CrossRef]
  39. Elith, J.; Simpson, J.; Hirsch, M.; Burgman, M.A. Taxonomic uncertainty and decision making for biosecurity: spatial models for myrtle/guava rust. Australas. Plant Pathol. 2013, 42, 43–51. [Google Scholar] [CrossRef]
  40. Salami, E.; Barrado, C.; Pastor, E. UAV flight experiments applied to the remote sensing of vegetated areas. Remote Sens. 2014, 6, 11051–11081. [Google Scholar] [CrossRef] [Green Version]
  41. Glassock, R.; Hung, J.Y.; Gonzalez, L.F.; Walker, R.A. Design, modelling and measurement of a hybrid powerplant for unmanned aerial systems. Aust. J. Mech. Eng. 2008, 6, 69–78. [Google Scholar] [CrossRef]
  42. Whitney, E.; Gonzalez, L.; Periaux, J.; Sefrioui, M.; Srinivas, K. A robust evolutionary technique for inverse aerodynamic design. In Proceedings of the European Congress on Computational Methods in Applied Sciences and Engineering, Jyvaskyla, Finland, 24–28 July 2004; Volume 2, pp. 1–2. [Google Scholar]
  43. Gonzalez, L.; Whitney, E.; Srinivas, K.; Periaux, J. Multidisciplinary aircraft design and optimisation using a robust evolutionary technique with variable fidelity models. In Proceedings of the 10th AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference, Albany, NY, USA, 30 August–1 September 2004; Volume 6, pp. 3610–3624. [Google Scholar]
  44. Whitehead, K.; Hugenholtz, C.H. Remote sensing of the environment with small unmanned aircraft systems (UASs), part 1: A review of progress and challenges. J. Unmanned Veh. Syst. 2014, 2, 69–85. [Google Scholar]
  45. Gonzalez, L.; Montes, G.; Puig, E.; Johnson, S.; Mengersen, K.; Gaston, K. Unmanned Aerial Vehicles (UAVs) and artificial intelligence revolutionizing wildlife monitoring and conservation. Sensors 2016, 16, 97. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  46. Sandino, J.; Wooler, A.; Gonzalez, F. Towards the automatic detection of pre-existing termite mounds through UAS and hyperspectral imagery. Sensors 2017, 17, 2196. [Google Scholar] [CrossRef] [PubMed]
  47. Vanegas, F.; Bratanov, D.; Powell, K.; Weiss, J.; Gonzalez, F. A novel methodology for improving plant pest surveillance in vineyards and crops using UAV-based hyperspectral and spatial data. Sensors 2018, 18, 260. [Google Scholar] [CrossRef] [PubMed]
  48. Vanegas, F.; Gonzalez, F. Enabling UAV navigation with sensor and environmental uncertainty in cluttered and GPS-denied environments. Sensors 2016, 16, 666. [Google Scholar] [CrossRef] [PubMed]
  49. Aasen, H.; Burkart, A.; Bolten, A.; Bareth, G. Generating 3D hyperspectral information with lightweight UAV snapshot cameras for vegetation monitoring: From camera calibration to quality assurance. ISPRS J. Photogramm. Remote Sens. 2015, 108, 245–259. [Google Scholar] [CrossRef]
  50. Nasi, R.; Honkavaara, E.; Lyytikainen-Saarenmaa, P.; Blomqvist, M.; Litkey, P.; Hakala, T.; Viljanen, N.; Kantola, T.; Tanhuanpaa, T.; Holopainen, M. Using UAV-Based photogrammetry and hyperspectral imaging for mapping bark beetle damage at tree-level. Remote Sens. 2015, 7, 15467–15493. [Google Scholar] [CrossRef]
  51. Calderon, R.; Navas-Cortes, J.; Lucena, C.; Zarco-Tejada, P. High-resolution airborne hyperspectral and thermal imagery for early detection of Verticillium wilt of olive using fluorescence, temperature and narrow-band spectral indices. Remote Sens. Environ. 2013, 139, 231–245. [Google Scholar] [CrossRef]
  52. Calderon, R.; Navas-Cortes, J.A.; Zarco-Tejada, P.J. Early detection and quantification of verticillium wilt in olive using hyperspectral and thermal imagery over large areas. Remote Sens. 2015, 7, 5584–5610. [Google Scholar] [CrossRef]
  53. Albetis, J.; Duthoit, S.; Guttler, F.; Jacquin, A.; Goulard, M.; Poilvé, H.; Féret, J.B.; Dedieu, G. Detection of Flavescence dorée Grapevine Disease using Unmanned Aerial Vehicle (UAV) multispectral imagery. Remote Sens. 2017, 9, 308. [Google Scholar] [CrossRef]
  54. Pause, M.; Schweitzer, C.; Rosenthal, M.; Keuck, V.; Bumberger, J.; Dietrich, P.; Heurich, M.; Jung, A.; Lausch, A. In situ/remote sensing integration to assess forest health–A review. Remote Sens. 2016, 8, 471. [Google Scholar] [CrossRef]
  55. Stone, C.; Mohammed, C. Application of remote sensing technologies for assessing planted forests damaged by insect pests and fungal pathogens: A review. Curr. For. Rep. 2017, 3, 75–92. [Google Scholar] [CrossRef]
  56. Habili, N.; Oorloff, J. Scyllarus™: From Research to Commercial Software. In Proceedings of the ASWEC 24th Australasian Software Engineering Conference, Adelaide, SA, Australia, 28 September–1 October 2015; ACM Press: New York, NY, USA, 2015; Volume II, pp. 119–122. [Google Scholar]
  57. Gu, L.; Robles-Kelly, A.A.; Zhou, J. Efficient estimation of reflectance parameters from imaging spectroscopy. IEEE Trans. Image Process. 2013, 22, 3648–3663. [Google Scholar] [PubMed]
  58. GDAL Development Team. GDAL—Geospatial Data Abstraction Library, Version 2.1.0; Open Source Geospatial Foundation: Beaverton, OR, USA, 2017. [Google Scholar]
  59. Chen, T.; Guestrin, C. XGBoost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD’16), San Francisco, CA, USA, 13–17 August 2016; ACM Press: New York, NY, USA, 2016; pp. 785–794. [Google Scholar]
  60. Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V.; et al. Scikit-learn: Machine Learning in Python. J. Mach. Learn. Res. 2011, 12, 2825–2830. [Google Scholar]
  61. Bradski, G. The OpenCV library. Dr. Dobb’s J. Softw. Tools 2000, 25, 120, 122–125. [Google Scholar]
  62. Hunter, J.D. Matplotlib: A 2D graphics environment. Comput. Sci. Eng. 2007, 9, 90–95. [Google Scholar] [CrossRef]
  63. Rouse, J.W., Jr.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring vegetation systems in the great plains with Erts. NASA Spec. Publ. 1974, 351, 309–317. [Google Scholar]
  64. Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298. [Google Scholar] [CrossRef]
  65. Huete, A. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309. [Google Scholar] [CrossRef]
  66. Laosuwan, T.; Uttaruk, P. Estimating tree biomass via remote sensing, MSAVI 2, and fractional cover model. IETE Tech. Rev. 2014, 31, 362–368. [Google Scholar] [CrossRef]
  67. Australian Government. Evans Head, NSW–August 2016–Daily Weather Observations; Bureau of Meteorology: Evans Head, NSW, Australia, 2016.
Figure 1. Pipeline process for the detection and mapping of alterations in natural and plantation forests by fungal diseases.
Figure 2. Assembly of the gimbal system and the Headwall Nano-Hyperspec ® camera into the DJI S800 EVO unmanned aerial vehicle (UAV).
Figure 3. Creation of regions of interest in hyperspectral cubes.
Figure 4. Site of the case study. (a) Location and covered area of the farm in red and experiment trees in blue; (b) Overview of paperbark tea trees under examination.
Figure 5. Aerial view of individual trees through on-ground assessments.
Figure 6. Red–green–blue (RGB) colour model representation of the orthorectified cube with its extracted region of interest.
Figure 7. Performance of the classifier using different filtered features. Optimal number of features: 24.
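The feature-filtering step summarised in Figure 7 can be approximated by ranking features with a fitted boosted-tree model and re-fitting on the top k. A minimal sketch on synthetic data follows; scikit-learn's GradientBoostingClassifier stands in for XGBoost here, and all names and sizes are illustrative, not the study's dataset:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic dataset: 200 samples, 40 candidate features,
# with labels driven by only the first two features
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 40))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Fit once on all features to obtain importance scores
model = GradientBoostingClassifier(n_estimators=50, max_depth=3, random_state=0)
model.fit(X, y)

# Retain the k most relevant features, then re-fit on the reduced set
k = 24
top = np.argsort(model.feature_importances_)[::-1][:k]
filtered = GradientBoostingClassifier(n_estimators=50, max_depth=3, random_state=0)
filtered.fit(X[:, top], y)
```

Sweeping k and plotting the classifier's validation score at each value yields a curve like Figure 7, from which the optimal feature count is read off.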
Figure 8. False colour representation of the first six features by relevance. (a) Smoothed NDVI with k = 15. (b) Smoothed Shading with k = 15. (c) Smoothed GNDVI with k = 7. (d) Smoothed NDVI with k = 7. (e) Smoothed 999 nm reflectance band with k = 15. (f) Raw 444 nm reflectance band.
Figure 9. Segmentation results of the proposed approach. (a) Recovered hyperspectral image in red–green–blue (RGB) colour model; (b) Segmentation results.
Figure 10. Layer of mapping results of the study area in Google Earth.
Table 1. Ranking of the 30 most relevant features.
Table 2. Confusion matrix of the eXtreme Gradient Boosting (XGBoost) classifier.
Table 3. Classification report of the confusion matrix of Table 2.
Class	Precision (%)	Recall (%)	F-Score (%)	Support
Mean	97.32	97.32	97.35	∑ = 2277
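Confusion matrices and per-class reports of the kind shown in Tables 2 and 3 can be generated with Scikit-learn, one of the libraries used in the pipeline. A minimal sketch follows; the class labels are hypothetical examples, not the study's exact labels:

```python
from sklearn.metrics import confusion_matrix, precision_recall_fscore_support

# Hypothetical ground-truth and predicted labels for four example classes
y_true = ["healthy", "deteriorated", "healthy", "soil", "water", "healthy"]
y_pred = ["healthy", "deteriorated", "healthy", "soil", "water", "deteriorated"]
labels = ["healthy", "deteriorated", "soil", "water"]

# Confusion matrix (rows: true class, columns: predicted class), as in Table 2
cm = confusion_matrix(y_true, y_pred, labels=labels)

# Per-class precision, recall, F-score, and support, as in Table 3
precision, recall, fscore, support = precision_recall_fscore_support(
    y_true, y_pred, labels=labels)
```

The support column counts the ground-truth samples per class, so its sum matches the size of the evaluation set (2277 samples in Table 3).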
Table 4. Performance in seconds of the main tasks from Algorithm 1.
Sub-Section	Instance 1	Instance 2	Instance 3	Instance 4	Instance 5	Mean	Std. Dev.
Data preparation
Loading Hypercube	11.927	10.944	11.954	11.766	11.521	11.622	0.417
Calculating indexes	46.864	51.860	51.901	52.622	47.322	50.114	2.779
Fitting XGBoost	8.948	8.654	8.758	8.679	8.692	8.746	0.119
Features Filtering	53.236	55.433	60.364	57.253	53.446	55.946	2.962
Re-Fitting XGBoost	0.964	1.023	1.010	0.998	0.965	0.992	0.026
Predicting results	29.738	40.749	42.131	34.473	66.477	42.714	14.188
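Per-task timings such as those in Table 4 can be collected by wrapping each stage in a wall-clock timer and summarising over repeated runs. A minimal sketch follows; the timed stage here is a placeholder, not the actual pipeline task:

```python
import time
import statistics

def timed(task, *args, **kwargs):
    # Run a task and return its result with the wall-clock duration in seconds
    start = time.perf_counter()
    result = task(*args, **kwargs)
    return result, time.perf_counter() - start

# Five instances of a placeholder stage, summarised as in Table 4
durations = []
for _ in range(5):
    _, elapsed = timed(sum, range(100_000))
    durations.append(elapsed)

mean_s = statistics.mean(durations)
std_s = statistics.stdev(durations)  # sample standard deviation over instances
```

time.perf_counter is preferred over time.time for this purpose because it is monotonic and offers the highest available resolution for interval measurement.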