Automating Drone Image Processing to Map Coral Reef Substrates Using Google Earth Engine

Abstract: While coral reef ecosystems hold immense biological, ecological, and economic value, frequent anthropogenic and environmental disturbances have caused these ecosystems to decline globally. Current coral reef monitoring methods include in situ surveys and analyzing remotely sensed data from satellites. However, in situ methods are often expensive and inconsistent in terms of time and space. High-resolution satellite imagery can also be expensive to acquire and subject to environmental conditions that conceal target features. High-resolution imagery gathered from remotely piloted aircraft systems (RPAS or drones) is an inexpensive alternative; however, processing drone imagery for analysis is time-consuming and complex. This study presents the first semi-automatic workflow for drone image processing with Google Earth Engine (GEE) and free and open source software (FOSS). With this workflow, we processed 230 drone images of Heron Reef, Australia, and classified coral, sand, and rock/dead coral substrates with the Random Forest classifier. Our classification achieved an overall accuracy of 86% and mapped live coral cover with 92% accuracy. The presented methods enable efficient processing of drone imagery of any environment and can be useful when processing drone imagery for calibrating and validating satellite imagery.


Introduction
When the Great Barrier Reef Marine Park Authority released its 2019 Outlook Report, the Authority reported the greatest threat to the Great Barrier Reef was climate change [1]. Dynamical and statistical models and projections demonstrate that climate change is likely to increase the intensity of extreme weather events including tropical cyclones [2], contribute to sea level rise, and increase sea surface temperatures [3]. Each of these factors poses significant challenges for coral reef environments, and the increasing frequency of mass bleaching events brought on by warmer water temperatures alone threatens the resiliency and survival of reefs worldwide [4,5]. As coral reefs hold immense biological, ecological, and economic value in the services they provide [6], there is a need for cost-effective and efficient practices to map, monitor, and manage coral reef habitats.
Over the last two decades, technological advances have made remote sensing a cost-effective approach when compared to other monitoring methods. Coral reef monitoring programs frequently use in situ field surveys, but they are expensive and often inconsistent in terms of time, space, and scale [7]. These surveys are also limited to areas easily accessible by boat. Remotely sensed imagery collected by satellites and airborne platforms can be used in combination with in situ data to address inconsistencies and enable a higher frequency of consistent observations to effectively observe spatial and temporal changes across reefs [8,9].

The objective of this study is to present a distributable, semi-automated workflow using GEE to classify coral reef substrates in drone imagery. We present the steps for drone image pre-processing, substrate classification, and classification accuracy assessment using GEE and free and open source software (FOSS).

Study Site
Heron Reef is a coral cay within the Capricorn Bunker Group located near the Tropic of Capricorn in the southern Great Barrier Reef (23.4423°S, 151.9148°E) (Figure 1a). Heron Reef was chosen as the study site because it is a lagoonal platform reef [36] whose shallow reef flats are well suited to effective capture of reef substrate imagery by drone-based cameras. Furthermore, the availability of the University of Queensland Research Station on Heron Island and data from Reef Check Australia's annual monitoring missions of Heron Reef will enable efficient support for subsequent studies.


Data Acquisition
True color (RGB) aerial imagery over Heron Reef was collected using a DJI Phantom 4 Pro drone with its camera angled at nadir (FOV = 73.73°, pixel array = 5472 × 3648) (Figure 1b). The DJI Phantom 4 Pro was chosen because of its long flight time (~20 min), three-axis gimbal that enables smooth flight patterns for capturing imagery, and 20-megapixel RGB sensor. The drone was oriented south and flown at 20 m altitude, resulting in a ground sample distance (GSD) of 0.5 cm and individual photo footprints of approximately 20.1 × 16.7 m. The GSD enabled identification of major substrate types for classification (Figure 1c-e). Flight paths were programmed through the Litchi mission planning desktop and mobile application, and photos were captured every 40 m along the flight paths. Images were collected from 14:30 to 17:00 on February 20, 2019, to coincide with low tide and reduce the effects of sun glint and surface roughness. Depths of the study area during low tide ranged from exposed coral to 40 cm. After data collection, it was evident the drone camera had failed to take photos at some points along the flight path (Figure 1a). However, the large sample size of drone imagery collected (n = 230) still enabled a thorough representation and assessment of the selected substrates on Heron Reef.
Drone flight paths are usually designed following the standard aerial survey pattern, where the drone flies up and down adjacent flight paths to create a geometric area of coverage. However, we decided not to use this flight pattern. We programmed the drone path as shown in Figure 1a for two main reasons: (1) to cover a greater variety of habitats in more areas around Heron Reef and (2) for later use to calibrate and validate satellite imagery of Heron Reef.
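The relationship between flight altitude, field of view, and GSD described above can be checked with a back-of-envelope calculation. The sketch below assumes the stated 73.73° FOV is the diagonal field of view and that pixels are square; estimates will differ if the FOV is the horizontal angle.

```python
import math

def footprint_and_gsd(altitude_m, fov_deg, px_w, px_h):
    """Estimate the ground footprint (width, height, in metres) and
    ground sample distance (GSD) for a nadir-pointing camera,
    assuming fov_deg is the diagonal field of view and square pixels."""
    diag_px = math.hypot(px_w, px_h)
    # Ground distance spanned by the diagonal field of view
    diag_m = 2 * altitude_m * math.tan(math.radians(fov_deg / 2))
    gsd = diag_m / diag_px                     # metres per pixel
    return px_w * gsd, px_h * gsd, gsd

# Survey values: 20 m altitude, FOV = 73.73 deg, 5472 x 3648 pixel array
w, h, gsd = footprint_and_gsd(20, 73.73, 5472, 3648)
# gsd comes out near 0.0046 m (~0.5 cm), consistent with the stated GSD
```

Under these assumptions the footprint height (~16.6 m) matches the reported ~16.7 m; the exact width depends on how the FOV is defined for the lens.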


Data Processing
We divided the workflow in this study into three phases: (a) drone image pre-processing, (b) image classification, and (c) accuracy assessment (Figure 2).

Image Pre-Processing
Google Earth Engine currently requires imagery to be in GeoTIFF format before it can be ingested into the system. Drone images were therefore converted from their native JPEG format to GeoTIFF files using the Python programming language (Figure 2a). First, images were rotated 180° for correct orientation. Latitude and longitude co-ordinates were extracted from each image with ExifTool, a free and open source software program for editing image, audio, video, and PDF metadata. These co-ordinates were then converted into Easting and Northing values (EPSG:32756 [WGS 84/UTM zone 56S]) in QGIS, and the image rasters were also projected to EPSG:32756 in QGIS. TIFF world files (TFW) were created for each GeoTIFF; these are plain text files that store X and Y pixel size, image rotation terms, and the co-ordinates of an image stored as a TIFF. TFW header files treat the stored Easting and Northing values as the upper left corner of the image rather than the image center, which is where the images are typically tagged. Therefore, to ensure images were in the correct locations, 12.5541 m (half the image width) was subtracted from each Easting value and 8.3694 m (half the image height) was subtracted from each Northing value.
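The corner-offset step above can be sketched as a small Python helper. This is an illustrative sketch, not the study's exact script: the pixel size is a hypothetical value derived from the stated half-width and the 5472-pixel array, rotation is assumed to be zero, and the sign convention follows the offsets described above (both half-offsets subtracted from the tagged center coordinates).

```python
def write_tfw(tfw_path, center_easting, center_northing,
              pixel_size=0.0045885, half_width=12.5541, half_height=8.3694):
    """Write a TIFF world file (TFW) for an image tagged at its center.

    A TFW holds six lines: X pixel size, two rotation terms, negative
    Y pixel size, and the coordinates of the upper-left pixel. The EXIF
    position refers to the image center, so half the footprint is
    subtracted to shift the reference point, per the offsets above.
    """
    upper_left_e = center_easting - half_width
    upper_left_n = center_northing - half_height
    lines = [pixel_size, 0.0, 0.0, -pixel_size, upper_left_e, upper_left_n]
    with open(tfw_path, "w") as f:
        f.write("\n".join(f"{v:.6f}" for v in lines) + "\n")
    return upper_left_e, upper_left_n
```

A batch run would loop this over every extracted (easting, northing) pair, writing one TFW alongside each GeoTIFF.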



Image Classification
After pre-processing, all drone images of Heron Reef were uploaded as an image collection asset in GEE (Figure 2b). We chose rock/dead coral, sand, and live coral for substrate classification, as these substrates were most easily recognizable in the drone imagery. It should be noted that the rock/dead coral class includes epilithic algae matrix and dead hard coral. We also created a sun glint class to account for the impacts of sun glint on classifying substrates and subsequently determining their areal coverage.
The GSD of the drone images enables visual identification of major substrate types, which is acceptable to use as reference data for training the classifier and accuracy assessment [37]. Traditional methods of collecting in situ data for calibrating and validating satellite data [38] have positional accuracies far lower than the scale of drone observation, particularly when attempting to co-register with drone imagery that also has its own positional inaccuracy. Areas that best represent each class were visually identified in the drone imagery in GEE. We then used the geometry tool to manually create 30 representative polygons per class (n = 120) (Figure 3a). To ensure an even sampling of training and validation pixels across the classes, we selected 8000 random points (pixels) per class (n = 32,000 points) from within the training polygons. We merged these points into a feature collection to be mapped across the images to extract digital number statistics for each category. We then split the points into 70% training and 30% validation data.
We employed the Random Forest algorithm with 50 trees for classification in GEE (Figure 3b). Preliminary tests to fine-tune the classifier identified 50 trees as the optimum number, as this resulted in the highest training accuracy (90%) and the lowest out-of-bag error (14%). The Random Forest classifier is not sensitive to noise or outliers and has been used with high accuracy to map benthic habitats of coral reefs [39] and tidal flats [40]. This classifier was applied to the image collection, and the total area (m²) per substrate per drone image was calculated to create substrate cover maps.
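The tree-count tuning described above (pick the forest size with the lowest out-of-bag error) can be sketched with scikit-learn on synthetic data. This is an illustration only, not the study's GEE code: the features and the "toy rule" labels below are randomly generated stand-ins for per-pixel RGB digital numbers.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
# Synthetic stand-in for per-pixel RGB digital numbers and class labels
X = rng.integers(0, 256, size=(2000, 3)).astype(float)
y = (X[:, 0] > X[:, 2]).astype(int)   # toy rule: "red brighter than blue"

best = None
for n_trees in (25, 50, 100):
    clf = RandomForestClassifier(n_estimators=n_trees, oob_score=True,
                                 random_state=0).fit(X, y)
    oob_error = 1.0 - clf.oob_score_   # out-of-bag error for this forest
    if best is None or oob_error < best[1]:
        best = (n_trees, oob_error)
# best holds the tree count with the lowest out-of-bag error
```

The out-of-bag error is a built-in cross-validation estimate for bagged ensembles, which is why it can stand in for a separate validation run while tuning tree count.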


Accuracy Assessment
An accuracy assessment was employed to assess the final classification result over the entire collection of drone imagery (Figure 2c). Validation points (n = 9522 from the 30% validation data) were applied over the classification result to extract the pixel classification at each point. An error matrix was generated to compare the classifier's labels at the validation points with our reference classification. The error matrix was used to compute the overall accuracy of the classification by dividing the total number of correctly classified pixels by the total number of reference pixels sampled. Producer and user accuracy metrics were calculated by class. Producer accuracy (recall) was calculated by dividing the number of correctly classified pixels in a class by the total number of reference pixels sampled for that class. This displays how well the classifier identified classes in the training data set [41]. User accuracy (precision) was computed by dividing the total number of correctly classified pixels for a class by the total number of pixels classified as that class by the algorithm. User accuracy represents the chance that the class shown on the map represents that class in the field [41].
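The overall, producer, and user accuracy computations described above can be written out directly. The sketch below operates on an error matrix whose rows are reference classes and whose columns are predicted classes; the two-class example values are illustrative, not the study's.

```python
def accuracy_metrics(matrix):
    """Compute overall accuracy, per-class producer accuracy, and
    per-class user accuracy from a square error matrix, where
    matrix[i][j] = pixels of reference class i predicted as class j."""
    n = len(matrix)
    total = sum(sum(row) for row in matrix)
    correct = sum(matrix[i][i] for i in range(n))
    overall = correct / total
    # Producer accuracy: correct pixels / reference pixels for the class
    producer = [matrix[i][i] / sum(matrix[i]) for i in range(n)]
    # User accuracy: correct pixels / pixels predicted as the class
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    user = [matrix[j][j] / col_sums[j] for j in range(n)]
    return overall, producer, user

# Toy 2-class example: 90 + 80 correct out of 200 reference pixels
overall, prod, user = accuracy_metrics([[90, 10], [20, 80]])
# overall == 0.85; producer == [0.9, 0.8]; user == [90/110, 80/90]
```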

Code and Data Access
The code for image pre-processing, classification (including additional parameters set for Random Forest), and accuracy assessment is available online, as both a live version (https://github.com/bennkat/Drone_Image_Processing_GEE/tree/v1.0) and a static release as per this paper (https://doi.org/10.5281/zenodo.3986067).

Image Classification
Using our semi-automated workflow, we classified 230 drone images of Heron Reef, representing a total area of 96,664 m². A visual assessment of the classification results revealed that this workflow is able to distinguish between the live coral, rock/dead coral, and sand substrates present in drone imagery (Figure 4). Our workflow classified a total area of 15,240 m² of Heron Reef as live coral, 23,795 m² as sand, and 38,148 m² as rock/dead coral substrate. Predicted live coral cover percentages of Heron Reef range from less than 1% to 60% (Figure 4a). The highest estimated coral cover was observed along the reef slope and the lowest along the reef flats (Figure 4b,c).
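The per-substrate areas above follow from per-class pixel counts scaled by the pixel footprint. A sketch of the conversion, using hypothetical pixel counts rather than the study's values:

```python
def substrate_areas(pixel_counts, gsd_m=0.005):
    """Convert per-class pixel counts to areas (m^2) and percent cover,
    given a ground sample distance in metres (0.5 cm in this survey)."""
    pixel_area = gsd_m ** 2                     # m^2 covered by one pixel
    areas = {cls: n * pixel_area for cls, n in pixel_counts.items()}
    total = sum(areas.values())
    percents = {cls: 100 * a / total for cls, a in areas.items()}
    return areas, percents

# Hypothetical per-class pixel counts for a single drone image
areas, pct = substrate_areas({"coral": 4_000_000,
                              "sand": 6_000_000,
                              "rock": 3_000_000})
# areas["coral"] is 100.0 m^2; pct sums to 100%
```

Summing the per-image areas across the collection yields the reef-wide totals reported above.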



Accuracy Assessment
Comparison of validation pixels with the substrate class assignments made by the model showed an overall mapping accuracy of 86%. Accuracy calculations and the error matrix (Tables 1 and 2) demonstrate differences between the model classification and the substrate classes assigned manually. The producer and user accuracies for live coral indicate that Random Forest classified 92% of reference live coral pixels correctly and correctly classified live coral in unseen data 86% of the time (Table 1). The model also displayed high accuracy in mapping sand (Table 1). The rock/dead coral class displayed the lowest mapping accuracy, at 71% (Table 1). This results from the misclassification of rock/dead coral as live coral: while the model classified 92% of the live coral reference pixels correctly, it classified 14% of the rock/dead coral reference pixels as live coral, leading the model to overpredict live coral cover (Table 2). Visual inspection of the classification result showed that the model was unable to differentiate the spectral information of the rock/dead coral, sand, and sun glint classes in some areas around Heron Reef. Within our classification, 7% of sand and 8% of rock/dead coral reference pixels were classified as sun glint (Table 2). Additionally, 11% of the drone images had greater than 40% of their area classified as sun glint, thereby excluding actual rock/dead coral, sand, and live coral assignments. However, the model successfully distinguished sun glint from the live coral class, with less than 1% of sun glint reference pixels classified as live coral (Table 2).

Image Classification Accuracy
We produced a substrate classification of Heron Reef, Australia, with an overall mapping accuracy of 86%, mapping live coral cover with 92% accuracy. Coral cover percentages produced from this classification workflow are within the substrate cover ranges for Heron Reef published by Reef Check Australia (RCA) in 2018. The Reef Check Australia 2018 Heron Island Reef Health Report reported an average of 42% hard coral cover observed across 15 monitoring sites, with coverage ranging from absent to 73% [42]. The greatest coral coverage was observed in reef slope habitats, with less coral coverage found amongst sandy reef flats. These spatial patterns of coral cover reported by RCA were also observed within our classification result.
Discrepancies between the coral coverage estimated in our study and that observed by RCA could be attributed to monitoring site location. Of the eight reef slope sites monitored by RCA, three showed greater than 70% hard coral cover; the other five ranged from 53% to 67%. Our drone imagery may have overlapped portions of the reef slope with coral cover within the 53-67% range, therefore not matching the highest coverage percentages observed by RCA. We recognize that areas of Heron Reef may exhibit higher coral coverage than estimated in this study, but those areas may not have been represented in our drone imagery.
The presented methods provide steps for image pre-processing so that drone imagery is formatted for other classification techniques and algorithms in GEE that can be used to classify coral cover. Since the appropriateness of a classifier depends on the characteristics of the dataset being classified, users who want to implement this workflow can test the accuracies of other algorithms on their data and decide which one provides the most accurate classification results. GEE offers tools for Object-Based Image Analysis (OBIA) techniques, which have been used to map coral cover in satellite imagery with resolutions of 2-30 m [35,43]. However, the authors of [43] also incorporated bathymetry data to develop segmentation rules for classification. As live corals appear in a variety of shapes, textures, and sizes in drone imagery, users may need to include additional datasets such as bathymetry to successfully apply OBIA within this workflow. Other classification algorithms, such as support vector machines and classification and regression trees, are also available in GEE and can be used to classify coral reef habitats [39].
While the algorithm used will impact classification accuracy, environmental conditions such as sun glint can obscure classification targets and thus affect the final result [44]. Although steps were taken to minimize the presence of sun glint in the imagery used in this study, the sun glint that remained in the images had to be accounted for. The creation of a sun glint class resulted in slight inaccuracies, as the classifier confused sun glint, rock/dead coral, and sand substrates. Currently, GEE offers masking functions for atmospheric correction of satellite data [29], and sun glint correction algorithms have been developed and used on high-resolution imagery of marine environments outside of the GEE platform [45]. However, these algorithms use near-infrared (NIR) bands to identify sun glint affected pixels; as the imagery for this study was collected with a consumer-grade RGB camera, they are not applicable here. An alternative for future iterations of this workflow is to apply a mask that filters out pixels close to saturation (RGB = 255, 255, 255), or to subset each image to a spatially defined window during image pre-processing. This would remove sun glint from the imagery and may decrease misclassifications between the sun glint, rock/dead coral, and sand classes.
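The saturation mask suggested above could be prototyped as follows. This is an illustrative sketch assuming 8-bit RGB values; the threshold of 250 is a hypothetical choice, not a value from the study.

```python
def mask_sun_glint(image, threshold=250):
    """Return a copy of an RGB image (nested lists of (r, g, b) tuples)
    with near-saturated pixels replaced by None, flagging likely sun
    glint (pixels close to RGB = 255, 255, 255)."""
    return [[None if all(c >= threshold for c in px) else px
             for px in row]
            for row in image]

row = [(120, 140, 90), (255, 254, 252), (30, 80, 200)]
masked = mask_sun_glint([row])
# masked[0] == [(120, 140, 90), None, (30, 80, 200)]
```

In practice the masked pixels would be excluded from classification and from the per-substrate area totals, rather than assigned to a sun glint class.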
Other environmental factors, such as water quality, wind, and sea state, which create ripples or waves and differ in extent across coral reefs, may impact drone image quality [19,46] and hinder substrate classification. These factors should be considered during flight planning, and steps should be taken to mitigate their impact on imagery where possible. Completing flight missions during low tide and in optimal weather conditions with low wind would reduce the effects of ripples and waves that obscure target features. Nevertheless, shallow reef environments remain optimal for aerial mapping [19].
Combining spectral information procured from optical sensors with bathymetric data can make this workflow applicable to other coral reef ecosystems with varying depths. In this study, depth is not a limiting factor, as drone imagery was collected over the reef flat at low tide to minimize the impacts of water depth on classification results. As the spectral reflectance of substrates becomes more similar with increasing depth [7,14], combining bathymetric data with spectral information may increase the applicability of this workflow. The authors of [47] observed the greatest overall mapping accuracies in supervised and unsupervised benthic classifications (71% and 70%, respectively) of Glover's Atoll, Belize, after combining layers of acoustic roughness, hardness, and depth with depth-corrected optical layers from IKONOS imagery. The authors of [47] do note a limitation in the extensive field surveys required to calibrate acoustic data on an image-by-image basis. Furthermore, acoustic data can only be acquired in areas accessible by boat [18]. Bathymetric data acquired through light detection and ranging (LiDAR) sensors mounted on drones can overcome these limitations. Datasets from such methods have been used in combination with RGB drone imagery to assess coral reef bathymetry and have the potential to aid coral reef benthic classifications [48].
Assessing the accuracy of thematic maps is essential, as it informs users of the reliability of the map product and determines what the map can be used for. However, recommended and widely used assessment techniques, such as comparing classification results to ground-truthing data through a confusion matrix, pose many challenges [49]. Issues with scale [50], site accessibility, and funding [7] often mean that suitable ground-truthing data are not available. Field data were available for ground-truthing in this study but could not be used due to scale issues and potential GPS location errors, which would not allow direct comparisons to classifications of sub-centimeter pixels. Future field data acquisition campaigns for calibrating and validating classifications of drone imagery could include georeferenced quadrat sampling [23,51] in combination with underwater imagery of the substrate to reduce GPS location errors. In lieu of comparisons with field data, accuracy assessments that use the same initial source information for reference and validation data have proven a credible alternative [52]. These accuracy assessments also avoid errors that result from the mislocation of reference data relative to the classification result [53].
As the substrate classes for the training data in this study were assigned manually, over- and under-estimates of substrate cover percentages may also result from misidentification of classes due to human error when the training polygons were created. The current spatial resolution of the GEE base map (where the scale bar equals 2 m) prevents viewing drone imagery at native resolution. Since mislabeling of classes by the interpreter during classifier training can decrease the accuracy of map products [54], efforts were taken to minimize substrate misinterpretation: live coral, rock/dead coral, and sand were chosen because they were the most easily identifiable substrates. As a result, the classifier assigned substrates such as algae and rubble to the chosen classes, affecting classification accuracy. Since the main goal was to automate a workflow with FOSS to enable mapping of coral reef substrates in drone imagery, the number of substrate classes used was sufficient to achieve this goal.

Workflow Assessment
We created a straightforward and efficient semi-automatic workflow that processes drone imagery for analysis and uses the Random Forest algorithm for benthic mapping of reef ecosystems. The image pre-processing steps were created by the authors. Prior to this workflow, coding examples in GEE and published research classifying imagery with GEE used satellite imagery. The presented image classification steps consist of code previously used for classifying satellite imagery, modified and adapted for drone image collections. The accuracy assessment phase uses previously established GEE code for accuracy assessment, modified for this study's data. Our workflow pieces together image processing and image classification in GEE, making it possible to classify drone imagery with greater descriptive resolution than satellite imagery. While drone imagery has been used to evaluate benthic characteristics of marine environments [13,22,23], this is the first study to present a semi-automatic process that formats and classifies drone imagery to map coral reef substrates using FOSS and GEE.
The semi-automation of the drone imagery pre-processing, substrate classification, and accuracy assessment phases makes drone imagery analysis more widely available and reproducible. Datasets acquired from drones are often informationally dense [18] and require complicated processing. This workflow demonstrates straightforward image processing steps that can be applied to any drone-acquired image collection. Furthermore, FOSS makes drone imagery analysis available to users and organizations who lack the resources to pay for licenses of commercial products. Anyone with an internet connection and a computer can apply this workflow to their own drone imagery and share their code and datasets. This is a significant advancement, as reproducibility in remote sensing studies, especially regarding classification calibration and validation, is an obstacle [37,55].
The main limitation of this workflow is the number of programs used to convert drone images into GeoTIFF files. While these programs are FOSS, further automating this phase through Jupyter notebooks and the Python API for GEE would reduce the time and effort needed for image pre-processing, as well as the total time needed for image classification and analysis.

Future Applications
This study presents a tool that can be used to process drone imagery of any environment for image classification and analysis. Drones and consumer-grade cameras are used to monitor land use/cover [56], survey forest canopies and biodiversity, assess deforestation [57,58], and map mangrove forests [59]. Lengthy processing times, in addition to the processing power needed to analyze high-resolution imagery, are common obstacles that often detract from the time and effort saved by using drones to collect field data [60,61]. The presented methods drastically reduce image processing time through FOSS to enable analysis of drone imagery of any ecosystem, not only coral reefs.
Other applications include using this workflow to process drone imagery to calibrate and validate satellite imagery and to establish baselines for temporal variation. Monitoring coral reefs with Earth observation data addresses many of the shortcomings of current in situ monitoring programs [7]. Imagery from drones can work in combination with in situ data to more accurately inform calibration and validation of satellite imagery. Drone imagery is proving highly effective in training and validating classifications made from satellite imagery [61], and studies note the usability of their drone-derived datasets for calibrating satellite imagery [48]. This workflow provides accessible methods for rapid drone image processing to enable the use of drone imagery as first-order ground truthing for satellite data. Additionally, imagery from drones can be used to evaluate temporal variation on coral reefs [62]. Effective ecosystem assessment and monitoring depends on the ability to monitor area changes over time. This workflow can provide more descriptive spatial layers to projects like the Allen Coral Atlas Project, in addition to calibrating and validating satellite data [35]. Furthermore, using this workflow to process remotely sensed data to monitor shallow reef flats and crests will benefit existing monitoring programs, such as the Australian Institute of Marine Science Long-term Monitoring Program and Marine Monitoring Program, that focus on reef depths greater than 2 m.

Conclusions
Although the use of remotely sensed imagery from drones for ecosystem monitoring has increased, methods for drone imagery processing remain time-consuming and complex. This is the first study to present a widely accessible and reproducible semi-automatic workflow that enables rapid processing of drone imagery for image analysis using FOSS and GEE. With this workflow, we processed and classified 230 drone images of Heron Reef, Australia, and produced a substrate classification with an overall accuracy of 86%, which mapped live coral cover with 92% accuracy. The accessible and reproducible nature of this workflow opens drone image processing and assessment to anyone with an internet connection. Drone imagery of any environment, marine or terrestrial, can be processed, classified, and assessed for accuracy. Further applications of this methodology include processing drone imagery for calibrating and validating satellite imagery. This study provides inexpensive and efficient methods to process and assess drone imagery that will benefit monitoring efforts worldwide.