Drones 2018, 2(4), 35; doi:10.3390/drones2040035

Review
UAVs in Support of Algal Bloom Research: A Review of Current Applications and Future Opportunities
1 Department of Environmental Sciences, Policy and Management, University of California, Berkeley, CA 94720, USA
2 Department of Landscape Architecture and Environmental Planning, University of California, Berkeley, CA 94720, USA
3 Division of Agriculture and Natural Resources, University of California, Berkeley, CA 94720, USA
* Author to whom correspondence should be addressed.
Received: 29 August 2018 / Accepted: 12 October 2018 / Published: 17 October 2018

Abstract

Algal blooms have become major public health and ecosystem vitality concerns globally. The prevalence of blooms has increased due to warming water and additional nutrient inputs into aquatic systems. In response, various remotely-sensed methods of detection, analysis, and forecasting have been developed. Satellite imaging has proven successful in the identification of various inland and coastal blooms at large spatial and temporal scales, and airborne platforms offer higher spatial and often spectral resolution at targeted temporal frequencies. Unmanned aerial vehicles (UAVs) have recently emerged as another tool for algal bloom detection, providing users with on-demand high spatial and temporal resolution at lower costs. However, due to the challenges of processing images of water, payload costs and limitations, and a lack of standardized methods, UAV-based algal bloom studies have not gained critical traction. This literature review explores the current state of this field, and highlights opportunities that could promote its growth. By understanding the technical parameters required to identify algal blooms with airborne platforms, and comparing these capabilities to current UAV technology, managers, researchers, and public health officials can utilize UAVs to monitor and predict blooms at greater spatial and temporal precision, reducing exposure to potentially toxic events.
Keywords:
unmanned aerial vehicles (UAV); drones; algal blooms; remote sensing; phytoplankton

1. Introduction

Algal Blooms

Algal blooms are natural phenomena that occur in marine and freshwater ecosystems in response to environmental factors such as nutrient and light availability, water pH, wind intensity, and water temperatures [1]. As primary producers, algae create the foundation for aquatic food structures and higher trophic cascades and are essential to aquatic ecosystems [2,3,4]. However, proliferations of both toxic and non-toxic algal species can have deleterious effects on aquatic life by clogging fish gills, depleting the water’s oxygen levels, and killing fish and other organisms [5,6]. Harmful algal blooms (HABs) are primarily caused by dinoflagellates, diatoms, and cyanobacteria (although they are technically photosynthesizing prokaryotic bacteria, they are included as algal blooms in this study); each of these phytoplankton groups can produce toxins that are harmful to aquatic and terrestrial species [1,7,8]. For example, toxic phytoplankton species can produce paralytic, diarrhetic, amnesic, ciguatera, and neurotoxic shellfish poisoning in shellfish consumers, and can materialize as aerosols that drift from estuarine or coastal regions and trigger asthma symptoms [8]. As global water temperatures warm [9,10] and urban and agricultural runoff continues to supply water systems with nutrients [2], the frequency of algal bloom events will continue to climb. These events pose threats to aquatic ecosystems and human health and call for new management strategies with tools at scales appropriate to the events in question.
Scale is important in detecting algal blooms because phytoplankton are very heterogeneous in space and time, and require frequent and often species-specific monitoring [11]. A single bloom can occur in short time periods, such as a few days, or extended periods of time, such as several months. It is common for algal proliferations to occur seasonally in response to environmental changes in sunlight and water temperature [8]. At a climatological scale, seasonal algal production can be affected by solar irradiance, nutrient limitations, trade winds, and monsoon reversal in the mid-latitudes. Grazing and predation are also important factors that affect phytoplankton growth, particularly in late summers in high latitude regions [12]. In addition to factors that can alter the timescale of bloom events and determine the temporal frequency needed to monitor a study site, it is important to understand the spatial extent and spectral information of blooming phytoplankton. Using combined methods of microscopic species identification, bloom imaging, and in situ water quality measurements can help researchers better understand the factors and potential drivers of the blooms they study.
The increase in algal blooms within the past several decades has led researchers to use and validate new tools to monitor these events. Remote sensing has proven to be an effective tool to study algae because of optical detection capabilities, and algal blooms are one of the most commonly studied water quality topics that incorporate remote-sensing methods in current literature. Common approaches used to identify algal blooms include multispectral and hyperspectral satellite and airborne imagery analyses [11,13,14]. Recent developments in unmanned aerial vehicle (UAV) technology and a transition from military applications to environmental research purposes [15] have opened the door for UAV-based algal bloom monitoring. Although UAVs currently cannot rival the spatial extent and often spectral capacity captured by satellite or aircraft sensors, they offer higher temporal revisit times and spatial resolution, and can be modified to perform specific tasks or capture specific spectral properties [16,17]. This paper does not seek to compare UAV capabilities to satellite or airborne research, but rather highlight the opportunities that are currently available to UAV-based algal bloom studies. This potential has not yet been extensively discussed in literature across rapidly emerging case studies, and we know of no review of the application of UAVs in aquatic algal bloom research to date. Therefore, this paper reviews current literature focusing on UAVs and algal blooms, and assesses the current state of research, primary barriers to conducting these studies, and what the future holds for this research. This information will assist managers, field technicians, and researchers in understanding algal bloom detection at greater temporal and spatial resolutions to better inform monitoring strategies and forecasting methodologies.

2. Methods

Through a Google Scholar search using the search terms “unmanned aerial vehicles algal blooms”, “UAVs algal blooms”, “algal blooms drones”, “HABs drones”, “HABs UAVs”, “unmanned aerial vehicles eutrophication”, “unmanned aerial vehicles red tide”, “cyanobacteria unmanned aerial vehicles”, and “phytoplankton unmanned aerial vehicles”, we found 193 relevant papers that focused on the remote sensing of algal blooms and the application of UAVs. These included papers focusing on general environmental UAV research (n = 35), the use of satellite imagery analysis for algal bloom and water quality research (n = 39), papers on airborne missions that investigated algal proliferations and identification methods (n = 41), papers related to UAV research in aquatic habitats (n = 42), and papers that involved UAVs in general water quality research (n = 36). Thirteen papers that focused specifically on the use of UAVs for algal bloom research, and included information on the UAV platform, image capturing methodology, and results, were selected for further analysis. This search was only conducted in English, and we performed additional searches using the aforementioned terms for cross-reference on PubMed, Web of Science, and Science Direct. We selected papers that identified and quantified chlorophyll-a, cyanobacteria, phytoplankton, or algal biomass concentrations using small unmanned aerial vehicles, excluding kites, balloons, and blimps. We did not include papers that pertained primarily to underwater bloom species identification using submerged aquatic vehicles, and most of the reviewed studies focused on microalgal identification, with the exception of papers that identified their target macroalgal species as a nuisance or harm to the ecosystem. Furthermore, we excluded thesis documents that described UAV-based algal bloom research but were not published in peer-reviewed journals.
We reviewed the final 13 papers for their focal aquatic environment (i.e., coastal, freshwater, estuarine); the reported hardware, software, cameras, and spectral indices that were employed; and methodological considerations, including evidence of validation methods and reported technological or methodological challenges.

3. Results

Of the 13 papers reviewed, 10 focused on freshwater ecosystems [1,18,19,20,21,22,23,24,25,26], two occurred in coastal or estuarine environments [15,27], and one in a polar desert landscape [28] (Table 1). Of these, five focused on identifying cyanobacterial blooms [1,20,21,23,28]. Additionally, four studies used the visible spectrum to identify algal blooms [21,24,25,27], six incorporated the near-infrared (NIR) [1,19,20,23,26,28], and three utilized hyperspectral sensors [15,18,22]. Despite differences in targets and approaches, there were a number of general principles regarding technology (hardware, software), methodology (e.g., validation), and common challenges and barriers discussed further below.

3.1. Applications of Unmanned Aerial Vehicles (UAVs) in Algal Bloom Research

3.1.1. Benefits of UAVs in Algal Bloom Research

Not only do UAVs reduce the need to conduct airborne studies in manned aircraft, they also provide several other benefits for remote sensing of complex and often difficult-to-reach algal bloom-affected areas. Of the 13 papers reviewed, 10 mentioned that UAVs are affordable in that they can be more efficient than airborne or satellite imagery acquisition in both time and money [1,15,18,20,21,23,24,25,27,28], and can reduce in situ water quality sampling costs [29]. Furthermore, UAV-based algal bloom research can decrease the amount of fieldwork that is required to conduct water-quality measurements, and it can reduce the environmental and health consequences of traversing through protected, fragile, or toxic ecosystems [28]. Additionally, UAV-based spectral identification can increase the accuracy of detection of high biomass surface blooms due to increased temporal and spatial resolution and the ability to see what the naked eye cannot (via thermal imaging and hyperspectral classifications) [15,21,24,27,28]. In terms of scheduling, UAV-based algal research can be more flexible than satellite or airborne missions because users can select their own flight paths, spatial, spectral, and temporal resolutions, and revisit times, which is critical because spatial and temporal changes in phytoplankton biomass have high variability and may require revisit times as frequent as every day [15,18,21,24,25,27,30]. Repeated flights over specific blooms at high spatial resolution can help researchers understand seasonal patterns of phytoplankton dynamics or identify potential nutrient inputs into a water system, thereby helping identify contributors to algal proliferations [29]. Finally, UAVs are beneficial tools for algal bloom research because image acquisition is not prohibited by cloud cover [15,18], which can be a major limitation to satellite imagery analysis of algal proliferations [4].
However, in their current states, satellite and especially airborne sensors used to quantify chlorophyll content or algal biomass generally provide more narrow-band capabilities than do UAVs [30], with the exception of UAV-based thermal [31] or hyperspectral studies [32].

3.1.2. Factors to Consider for UAV Algal Bloom Detection

There are several important factors that users should consider when planning a UAV flight for algal bloom detection. These pertain to image acquisition, sensor identification capabilities, and algal bloom characteristics such as color, surface pattern, and species composition. Flight logistics are very important, as weather conditions, the navigation system, autopilot capabilities, and gimbal setup are crucial in extracting clean images. For example, high wind speeds can decrease the ability to view blooms on the surface of the water, so flights are recommended during low-wind periods. Also, gimbal error, UAV vibration from propeller movement, and high winds can produce blur in UAV images, so it is necessary to plan for these factors before flying [21]. Sensor type (multispectral, hyperspectral, or laser fluorosensing) determines what kind of information is captured; the narrower the wavelength bands, the more precise the spectral information. Sometimes the payload capacity of the UAV platform will limit the number or type of sensors that can be flown simultaneously, and this is important when considering the spectral indicator that will be used to extract algal information. For instance, the Normalized Difference Vegetation Index (NDVI) requires the red and near-infrared bands. This cannot be achieved using a simple red-green-blue (RGB) camera unless there is a near-infrared filter, so it is necessary to plan for image calibration of two separate cameras if a UAV platform cannot accommodate both concurrently. Finally, the visual appearance and seasonality of the algal bloom are also extremely important, as different phytoplankton species have different reflectance signals and bloom for varying lengths of time. Therefore, taking field measurements of phytoplankton spectra or analyzing phytoplankton species via microscopy, in addition to flying a UAV over an algal bloom, is recommended for validation purposes.
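As context for the band requirements mentioned above, the NDVI computation itself is simple band arithmetic on co-registered red and near-infrared reflectance images. A minimal sketch in Python with NumPy; the reflectance values are purely illustrative and not drawn from any reviewed study:

```python
import numpy as np

def ndvi(red, nir, eps=1e-9):
    """Normalized Difference Vegetation Index from co-registered
    red and near-infrared (NIR) reflectance arrays."""
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    # eps guards against division by zero over very dark pixels
    return (nir - red) / (nir + red + eps)

# Illustrative 2x2 reflectance values (not from any reviewed study)
red = np.array([[0.05, 0.10], [0.20, 0.08]])
nir = np.array([[0.40, 0.12], [0.22, 0.50]])
print(ndvi(red, nir))
```

Dense surface scums and vegetation tend toward positive NDVI, while clear water is near zero or negative, which is what makes the index usable for delineating high-biomass surface blooms.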

3.2. UAV Platforms

Our review suggests that the two main platforms used in UAV-based algal bloom research were fixed-wing and rotorcraft UAVs (Table 1). Although researchers reviewed in this study did use helicopter UAV platforms for phytoplankton biomass estimations [18,22], these appear to be less common. Fixed-wing and rotorcraft UAVs were equally used in the reviewed studies (Table 1). A large portion of the fixed-wing vehicles were eBee platforms, while the majority of multirotor platforms were DJI (www.dji.com/) products. Fixed-wing UAVs have a longer flight time than rotorcraft due to aerodynamic factors, and this enables these vehicles to survey larger blooms with extended battery power and greater potential to fly multiple sensors to measure chlorophyll a concentrations. Fixed-wing UAVs are suitable for algal blooms that affect larger spatial extents, while rotorcraft UAVs are advantageous for closer analysis to help identify phytoplankton groups due to their vertical take-off and landing abilities [21]. Rotorcraft can benefit algal bloom studies because it is often difficult to find substantial areas of flat, dry regions to safely launch and land a fixed-wing vehicle adjacent to bodies of water. Several reviewed studies that employed rotorcraft (multirotor, helicopter, octorotor) achieved a spatial extent of about 1 km2 for their study [18,21,22,25], while some researchers that used fixed-wing vehicles were able to cover 12 km2 in a single flight [26], and even up to 68 km2 of total coverage [28]. While many researchers prefer fixed-wing UAVs for algal bloom research because of the extended time capacity, others find rotorcraft UAVs to be more adept in observing small or microscopic colonies of algae due to their higher spatial resolution potential [28]. UAV platforms used for algal bloom studies may also benefit from custom waterproofing or surface water landing capabilities demonstrated in UAV-based water-quality sampling research [29,33].

3.3. Sensors and Cameras

Thirteen unique cameras or sensors were used in the primary papers analyzed (Table 1). The majority of these studies were conducted with RGB cameras [1,19,20,24,25,26,28], or RGB + infrared (IR) cameras (multispectral cameras) [1,19,20,23,26,28]. Only three papers used hyperspectral sensors [15,18,22]. These included the Fabry–Perot interferometer (FPI), which ranges from 400–1000 nm [18,22], and the AvaSpec-dual spectroradiometer, which has two sensors that range from 360–1000 nm [15]. Hyperspectral cameras are beginning to emerge as viable alternatives to RGB or multispectral cameras because they offer greater precision in regard to species or genus compositions of algal blooms, and thus are more desirable as research tools in discerning phytoplankton functional types and toxicity potential. Although hyperspectral sensors are still quite expensive, their payloads are becoming lighter and more appropriate for UAVs [21]. Such sensors include the OCI-1000, which measures in the visible and near-infrared and weighs 180 g; Headwall’s Nano-Hyperspec sensor, which ranges from 380–2500 nm and weighs 680 g; the microHSI, which ranges from 400–2400 nm and weighs 450 g–2.6 kg; and the Ocean Optics USB4000, which ranges from 350–1000 nm and weighs 190 g [34].

3.4. Ground Sampling Distance (GSD)

One of the most advantageous aspects of UAV imagery is its sub-meter spatial resolution, which satellite imagery, and often airborne imagery, cannot provide. This resolution is defined by the ground sampling distance (GSD) of an image, which is the distance between the centers of adjacent pixels measured on the ground. GSD is determined by the sensor width of the camera, the focal length of the camera, and the flight height of the UAV. The GSD is usually finer than the effective resolution that researchers encounter when processing the images, due to possible blur of the image, as well as the color, contrast, and tone of objects in the image [21]. With one exception (Shang et al. [15], which used a radiometer), the reported GSD of all reviewed studies ranged between 2.96 mm and 15 cm for flying heights between 5 m and 400 m. However, most flights were conducted between 100 m and 400 m altitude and had a GSD of around 4–7 cm (Figure 1). The expected relationship between flying height and GSD is not completely linear here due to the range of cameras reviewed.
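The relationship between sensor width, focal length, flight height, and GSD described above can be sketched as a one-line function. The camera parameters below are illustrative assumptions, not values taken from any reviewed study:

```python
def gsd_cm(sensor_width_mm, focal_length_mm, flight_height_m, image_width_px):
    """Ground sampling distance (cm/pixel) for a nadir-pointing camera.

    GSD = (sensor width * flight height) / (focal length * image width),
    with unit conversion from metres to centimetres.
    """
    return (sensor_width_mm * flight_height_m * 100.0) / (focal_length_mm * image_width_px)

# Illustrative camera: 13.2 mm sensor width, 8.8 mm focal length,
# 4000 px image width, flown at 100 m altitude
print(round(gsd_cm(13.2, 8.8, 100.0, 4000), 2))  # 3.75 (cm/pixel)
```

Halving the flight height halves the GSD for a given camera, which is why the rotorcraft studies flown at very low altitudes reported millimetre-scale GSDs.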

3.5. Algal Spectral Indices

This review identified several difference indices (band ratio algorithms) that were used to identify phytoplankton blooms in UAV imagery (Table 2). These indices include NDVI, Normalized Green Red Difference Index (NGRDI), Blue Normalized Difference Vegetation Index (BNDVI), Normalized Green Blue Difference Index (NGBDI), Green Normalized Difference Vegetation Index (GNDVI), Green Leaf Index (GLI), Excess Green (EXG), and an Algal Bloom Detection Index. Although each of these indices was developed or adapted specifically for the aquatic environment in question, the NGRDI appeared to be effective in two studies, indicating its potential for successful algal bloom detection in coastal environments [27]. Two indices used the red and NIR bands to identify algae, including the Algal Bloom Detection Index [19,20] and NDVI [23]. These appeared to perform fairly successfully in river and estuarine environments, whereas indices that incorporate the green or blue bands were utilized more frequently for lake and coastal bloom identification [1,27]. Finally, there were several algorithms that were utilized to delineate areas affected by blooms but were not explicitly described in the 13 studies reviewed. These include the adaptive cosine estimator (ACE) and spectral angle mapper (SAM) [25], as well as fast non-negative least squares solution (FNNLS), vertex component analysis (VCA), and Harsanyi–Farrand–Chang (HFC) [22]. One study developed a regression model based on the relationship between in situ chlorophyll-a and a NIR/red band ratio [26].
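Several of the visible-band indices listed above share a common normalized-difference form. The sketch below shows commonly cited formulations of NGRDI, NGBDI, GLI, and Excess Green; exact definitions can vary slightly between studies, so these should be treated as illustrative rather than as the formulas used by any particular reviewed paper:

```python
import numpy as np

def _nd(a, b, eps=1e-9):
    """Normalized difference of two bands, guarded against divide-by-zero."""
    return (a - b) / (a + b + eps)

def ngrdi(g, r):              # Normalized Green Red Difference Index
    return _nd(g, r)

def ngbdi(g, b):              # Normalized Green Blue Difference Index
    return _nd(g, b)

def gli(r, g, b, eps=1e-9):   # Green Leaf Index
    return (2 * g - r - b) / (2 * g + r + b + eps)

def exg(r, g, b):             # Excess Green
    return 2 * g - r - b

# Illustrative reflectances for a green-dominated (bloom-like) pixel
r = np.array([0.10]); g = np.array([0.30]); b = np.array([0.05])
print(ngrdi(g, r), ngbdi(g, b), gli(r, g, b), exg(r, g, b))
```

All four indices increase with green dominance, which is why they have been applied to surface blooms that appear green against darker water.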

3.6. Software

The UAV image collection and processing workflow includes flight and mission planning, image capture, mosaicking and image processing, and visualization. There are few software options that encompass this entire workflow, and many UAV scientists use a compilation of software packages. Indeed, we reviewed references to more than 10 software packages. In the reviewed papers, the most commonly-used software reported was Agisoft PhotoScan [35], which was employed for georeferencing, orthorectification, and mosaicking. One researcher found that Agisoft performed better in image stitching than DroneDeploy [23]. Other software packages that were used included Mission Planner [36] for flight planning, Pix4D [37], DroneDeploy [38], ArcGIS [39], ENVI [40], Matlab [41], AutoPano Pro [42], and PixelWrench2 [43] for further image processing needs (Table 3).

3.7. Calibration Techniques

Several of the studies reviewed used a variety of radiometric or ground-based calibration techniques to normalize their reflectance data on multiple flight days, as factors such as differences in sun angle, sun glint, and shadows [2,44] can produce various aquatic organic matter absorption curves and suspended particle backscattering values [1]. Seven studies used calibration panels with known reflectance values to calibrate spectral values on the ground to the raw digital numbers in their camera data [18,20,22,23,24,25,28]. Six studies used spectroradiometers in flight [15] or in the field to gather upwelling radiance and downwelling irradiance spectra at the same time as their UAV flights for sensor calibration [18,19,20,22,24]. Three studies explained their radiometric calibration techniques, which included Bidirectional Reflectance Distribution Function (BRDF) and radiometric block adjustments [18,22,24]. We found that only one study reviewed employed atmospheric correction using Dark Object Subtraction, which is one of the most ubiquitously used methods for atmospheric corrections in aquatic contexts [24]. Furthermore, atmospheric correction is extremely rare and often unnecessary in current UAV studies because flights are conducted at low altitudes that have very little atmospheric interference, and the small extent of each study renders atmospheric refraction and influences from the Earth’s curvature negligible [26].
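Dark Object Subtraction, mentioned above, rests on the assumption that the darkest pixels in a scene should have near-zero reflectance, so any signal they carry is attributed to atmospheric path radiance (haze) and subtracted from the entire band. A minimal sketch, with illustrative values; real implementations differ in how the dark object is chosen:

```python
import numpy as np

def dark_object_subtraction(band, percentile=0.01):
    """Subtract the darkest observed value from a band.

    A very low percentile (rather than the strict minimum) is used
    so a single noisy pixel does not set the dark level; the result
    is clipped at zero.
    """
    dark = np.percentile(band, percentile)
    return np.clip(band - dark, 0.0, None)

# Illustrative band values carrying a constant haze offset of ~0.1
band = np.array([0.10, 0.20, 0.50])
print(dark_object_subtraction(band))
```

For low-altitude UAV flights the path radiance is usually tiny, consistent with the observation above that atmospheric correction is often unnecessary in current UAV studies.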

3.8. Validation Techniques

Almost every study reviewed used a validation technique to complement their remotely-sensed algal bloom imagery analysis. These techniques included in situ water quality sampling of chlorophyll a, turbidity, total suspended matter, and surface water temperature [22,23], as well as field sampling of algal cover and spectroradiometer measurements of algal spectral reflectance [15,18,19,27]. Additionally, some researchers [1,20,28] used microscopy to identify specific algal species (Table 1).

3.9. Quantitative Analyses

While many of the reviewed studies provided qualitative assessments of algal cover [1,19,20,21,23,27,28], including thematic maps representing algal identification and presence, five studies were able to conduct quantitative analyses using spectrally calibrated data of the algal blooms in question [15,22,24,25,26]. In these studies, researchers were able to quantify high algal biomass surface concentrations by producing outputs such as absorption value maps of in-situ-validated species, such as Microcystis aeruginosa [24], or chlorophyll a concentration maps using precise reflectance spectra from radiometric UAV data [26]. Most of the studies reviewed were able to estimate the surface area or biomass of the bloom in their study region [1,15,20,21,22,25,26,27,28]. As lightweight hyperspectral sensors increase in accessibility to researchers [32,45], we expect quantitative research on algal blooms using the spectral capabilities of UAVs to expand. Currently, most UAV-based algal bloom studies provide first-level evaluations of the spatial and temporal extents of blooms.
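The empirical chlorophyll-a mapping described above typically reduces to regressing in situ samples against a band ratio and then applying the fitted model per pixel. A hedged sketch of that workflow; all numbers below are illustrative, not data from any reviewed study:

```python
import numpy as np

# Hypothetical co-located observations: a NIR/red band ratio extracted
# at sample sites, and lab-measured chlorophyll-a at those same sites
band_ratio = np.array([0.8, 1.1, 1.5, 2.0, 2.6])     # NIR/red ratio
chla_ugL   = np.array([4.0, 9.5, 16.8, 25.9, 36.7])  # chl-a, ug/L

# Fit a first-order (linear) model: chl-a = slope * ratio + intercept
slope, intercept = np.polyfit(band_ratio, chla_ugL, 1)

def predict_chla(ratio):
    """Apply the fitted model to a band-ratio value (or array)."""
    return slope * ratio + intercept

print(round(predict_chla(1.8), 1))  # estimated chl-a for a new pixel
```

In practice such models are only valid within the sampled range and for the water body and sensor calibration under which they were built, which is one reason the reviewed studies paired UAV flights with same-day in situ sampling.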

3.10. Barriers and Challenges

Although the studies reviewed demonstrated that researchers can successfully identify phytoplankton blooms using UAV platforms and multispectral or hyperspectral sensors, the general lack of studies in this field reflects the obstacles inherent in using UAVs for current environmental assessments (Table 1). In our review, we found that weather and environmental conditions posed a major problem to UAVs, as rain, high winds, clouds, and solar glint can often cause pilots to cancel their flights (thus potentially missing critical time windows for bloom detection) or face the consequences of geometrically distorted images [15,21,22,25]. Consequences of distortion include difficulty in accurately estimating bloom extent and incorrect spectral values. A subtext not explicitly mentioned in most papers reviewed was the difficulty in capturing imagery with good geolocation accuracy when ground control was limited. Many papers discussed mosaicking difficulties (e.g., [23]). Some drones are waterproof and highly stabilized, and can withstand precipitation and high winds [46], but the UAV platforms reviewed in this study could not. Other barriers that appeared in the reviewed case studies include hardware issues such as gimbal malfunctions, limitations in flight time and coverage, and inability to mount multiple sensors on one platform [1,20,21], as well as data processing errors pertaining to signal-to-noise ratio, mosaicking images, and spectrally imaging water due to the challenges of sun glint and ripple effects [18,22,23]. Additionally, some studies had difficulty identifying algal species due to coarse image resolution or laboratory analysis limitations [20,28]. Finally, the visual line of sight requirement for UAV pilots can limit algal bloom researchers from quickly conducting aerial surveys [1], as some river reaches are not straight or are heavily vegetated, and pilots cannot physically see around complex landscapes.

4. Discussion

4.1. Future Opportunities

4.1.1. Hyperspectral UAVs

Airborne platforms that use hyperspectral sensors to detect and map algal blooms provide greater spectral information than multispectral sensors and help identify specific algal species. There are thousands of species of phytoplankton; ~300 species that cause blooms or red tides; and approximately 80 species that produce toxins that can negatively affect fish and shellfish consumers [47]. Hyperspectral imagery provides more continuous spectral information on the reflectance of algae, and this is important because species have their own unique spectral signatures. Increased usage of hyperspectral sensors on UAVs will augment the capacity to discern different phytoplankton groups and will offer higher spatial and temporal resolution than aircraft or satellite measurements. This information can be integrated into a harmful algal bloom forecasting program [10]. Also, successfully-used algal indices from airborne and satellite sensors can be modified and adopted in UAV methodologies. Algal bloom researchers affiliated with the University of California, Santa Cruz, and the National Aeronautics and Space Administration (NASA) have applied different indices to extract algal bloom information, such as the Cyanobacteria index (CI) that uses wavelengths of 681 nm, 665 nm, and 709 nm, the Aphanizomenon-Microcystis index (AMI) that uses wavelengths of 640 nm, 510 nm, 642 nm, and 625 nm, and the Scattering Line Height (SLH) index that uses wavelengths of 714 nm and 654 nm [48]. In addition, researchers can create spectral library repositories, such as the one developed in 2009 for wetlands land cover classifications via hyperspectral sensors [49], for algal bloom research. Spectral libraries based on hyperspectral data can assist remote-sensing experts and biologists in identifying specific species of harmful blooms, thereby enhancing the ability of public health officials to address and predict illness outbreaks caused by these events.
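The narrow-band indices named above are arithmetic operations on a handful of wavelengths. As an illustration, a commonly cited formulation computes the CI as the negative of a "spectral shape" at 681 nm, i.e., the deviation of reflectance from a straight baseline drawn between the 665 nm and 709 nm bands; this sketch assumes that form, and the reflectance values are illustrative only:

```python
def spectral_shape(r_lam, r_minus, r_plus, lam, lam_minus, lam_plus):
    """Spectral shape SS(lambda): deviation of reflectance at lambda
    from a linear baseline between two flanking bands."""
    baseline = r_minus + (r_plus - r_minus) * (lam - lam_minus) / (lam_plus - lam_minus)
    return r_lam - baseline

def cyanobacteria_index(r665, r681, r709):
    """One published formulation: CI = -SS(681 nm), with flanking
    bands at 665 nm and 709 nm (assumed here, not taken from [48])."""
    return -spectral_shape(r681, r665, r709, 681.0, 665.0, 709.0)

# Illustrative reflectances showing a dip at 681 nm (chlorophyll
# absorption), which yields a positive CI
print(cyanobacteria_index(r665=0.020, r681=0.016, r709=0.030))
```

A positive CI flags the chlorophyll absorption dip near 681 nm; the AMI and SLH indices are analogous constructions over their own wavelength sets.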

4.1.2. Integrated Methods for Algal Bloom Research

Surface water imaging provides only partial information of algal bloom proliferation, given the vertical migration trajectories of several phytoplankton groups. Many harmful algae species can be found at depths below the surface [50], and current UAV-based spectral identification methods are limited to high surface biomass estimations. UAVs that can image and physically take samples of the water column using a dangling tube or container [33] could greatly improve algal bloom detection, while reducing the amount of fieldwork and surveys required by field technicians. This method could even enable researchers to conduct eDNA sampling to detect early warning signals of specific toxic algal species. To collect algae in situ, UAVs could be paired with autonomous surface or underwater gliders [46]. Sampling methodologies that integrate with UAV platforms, while reducing time and money dedicated to surveying and extracting in situ parameters, will greatly improve the field of UAV-based algal bloom research.

4.1.3. Standardization and Interoperability

A common theme emerged from this review: each reviewed study’s team used its own algorithm, modified drone and camera, and overall methodology (Table 1, Table 2 and Table 3), which clearly highlights a tremendous potential for flexible, customized applications of UAVs in a given research context and geographic and logistical setting. At the same time, the use of more than 10 unique cameras, two radiometers, 10 different UAV platforms, and a variety of methods across these 13 reviewed studies also demonstrates that UAV-based algal bloom research is a nascent field that lacks standardization in hardware, software, and workflow. To move forward in this field, a standardized framework [13] would reduce the time and effort spent in testing a multitude of UAV platforms, cameras, and algorithms in similar landscapes, ideally also accompanied by a central spectral library repository to facilitate knowledge sharing.
This synthesized framework would need to address the optical, geologic, distributive, temporal, and biochemical parameters of different types of phytoplankton blooms. It could be developed within specific regions where certain algal species dominate, and within localized environments, such as inland clear freshwater bodies, inland turbid freshwater bodies, coastal zones, and estuaries. As hyperspectral and Light Detection and Ranging (LiDAR) sensors become more affordable, they too can be integrated into this framework. Streamlined imaging software that can perform all image capture, processing, and visualization tasks in one program (such as an ArcGIS for UAVs) would facilitate faster processing and foster a larger online user community to help debug issues such as mosaicking or orthorectification. As these technologies and methodologies come to fruition, UAV-based algal bloom monitoring will become an accessible approach for students, industry professionals, and public health officials to utilize and prevent the spread of toxic blooms.

4.1.4. UAV Regulation

While only three of the studies reviewed mentioned regulatory impediments to UAV research [1,18,25], there are considerable obstacles to flying UAVs that may affect the application of this technology at the specific locations and times of algal blooms [51]. For example, researchers and commercial employees in the United States must obtain a pilot’s license from the Federal Aviation Administration before flying, while hobbyists can fly without a license and with few limitations. Several other nations have specific UAV regulations, including the United Kingdom, Canada, and Australia [52]. In many regions, there are no-fly zones or airspace-use restrictions that must be abided by, especially in close proximity to landing strips or airports. This posed a slight limitation to at least one of the studies reviewed [1]. Additionally, the requirement that the remote pilot in command must maintain visual line of sight of the UAV throughout the flight appeared to be a slight limitation in three studies [1,18,25].

4.1.5. Technical Challenges

Water is difficult to image due to ripple distortion, solar reflection, shadows, turbidity, and the lack of ground control points or unique landscape features in a continuous body of water. To address ripple distortion, fluid-lensing techniques [53] and structure-from-motion photogrammetry algorithms [54] are being developed to increase the clarity and dimensionality of water imagery. Because ground control points are scarce over water, it is recommended that UAV pilots include a portion of land, coastline, or a unique object in the images to improve the chances of smooth mosaicking. When no ground control is possible, imagery accuracy depends on the on-board capabilities of the UAV: the Global Positioning System (GPS) and Inertial Measurement Unit (IMU), the link between the camera and the GPS/IMU, and the stored telemetry log that records the UAV position and related information for each camera image, which can be used to correct the location of the imagery post-flight [24,55,56]. Additionally, well-anchored floating buoys with precise GPS measurements are a potential option for open-water imaging [57]. With regard to spectral value normalization of UAV-based algal bloom imagery, indices are difficult to standardize without radiometric calibration and, potentially, atmospheric correction. NDVI is commonly used for land-based studies, although it has been adapted to identify marine and freshwater algal blooms using the AVHRR (Advanced Very High Resolution Radiometer) and MODIS (Moderate Resolution Imaging Spectroradiometer) sensors. While this index has been successful in identifying blooms, it is susceptible to confounding factors such as aerosols and solar angle [58], so corrections for these variables are required to accurately compare images acquired on different days. Finally, several properties of phytoplankton blooms themselves are very difficult to assess with surface imagery.
First, algae can migrate vertically, so it is important to understand the vertical profiles of algal blooms in the water column [12]. To do this, UAVs can carry water sampling tubes or a sensor that dips into the water to measure algal concentrations at various depths [33]. Second, aerial imagery cannot quantify phytoplankton toxins [1], so remotely-sensed algal bloom studies must be paired with in situ algal sampling or microscopy.
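Returning to the geolocation issue above, the post-flight correction from a telemetry log amounts to matching each image capture time against logged GPS fixes. The sketch below illustrates this with simple linear interpolation by timestamp; the log layout and field order are illustrative assumptions, not any specific autopilot's schema.

```python
# Illustrative sketch: assign (lat, lon) to images by interpolating a
# telemetry log of timestamped GPS fixes. Log format is an assumption.
from bisect import bisect_left

def geotag(image_times, log):
    """log: list of (time_s, lat, lon) tuples sorted by time.
    Returns one (lat, lon) per image capture time."""
    times = [t for t, _, _ in log]
    tags = []
    for t in image_times:
        i = bisect_left(times, t)
        if i == 0:                       # before first fix: clamp
            tags.append((log[0][1], log[0][2]))
        elif i == len(log):              # after last fix: clamp
            tags.append((log[-1][1], log[-1][2]))
        else:                            # interpolate between bracketing fixes
            (t0, la0, lo0), (t1, la1, lo1) = log[i - 1], log[i]
            f = (t - t0) / (t1 - t0)
            tags.append((la0 + f * (la1 - la0), lo0 + f * (lo1 - lo0)))
    return tags

log = [(0.0, 38.00, -99.00), (10.0, 38.01, -99.02)]
print(geotag([5.0], log))  # midpoint between the two fixes
```

In practice, on-board systems also record altitude and attitude (roll, pitch, yaw) per image so that orthorectification software can project each frame onto the water surface, but the timestamp-matching step is the same.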

5. Conclusions

UAVs are effective tools for monitoring small-scale algal blooms due to their high temporal frequency, relatively inexpensive operating costs, and ability to carry lightweight sensors with the optical capacity to observe algae at wavelengths between 500 and 1400 nm (the visible and near-infrared range). Indeed, since 2012 there has been a rapid increase in the use of UAVs for water resource management applications in freshwater and marine ecosystems [17], and this will continue as UAV hardware, sensors, and processing software become cheaper. Our review revealed that the main limiting factors in using UAVs for algal bloom monitoring are the difficulties in imaging water (more specifically, in stitching together water images), flight issues due to weather conditions, expensive sensors, and a lack of standardization and interoperability in methodologies. As hyperspectral and LiDAR sensors fall in price, and algorithms specific to freshwater and marine environments are established, UAV-based algal bloom research will continue to flourish. Until then, multirotor and fixed-wing UAVs can be equipped with RGB cameras and filters to accomplish this research, and indices modified from NDVI, such as the NGRDI, GNDVI, and BNDVI, can be used to detect phytoplankton blooms. Users can benefit from this information, as UAV-based algal bloom studies can help prevent the destruction of aquaculture establishments, the consumption of toxic shellfish, the death of fish and other organisms, and illness among people who live near affected lakes, rivers, and tidal zones. Furthermore, identifying differently colored algae with optical sensors on a UAV can help researchers understand environmental processes such as insect emergence, nitrogen fixation and export, and carbon cycling in aquatic systems [59].
As UAV technology continues to advance, airborne methodologies will allow data collectors to integrate their detection techniques into predictive algal bloom models, preventing public health outbreaks and promoting adaptive strategies to combat the deleterious effects of prolific and toxic algal blooms.

Author Contributions

C.K., M.K. and I.D. conceived of the paper; C.K. conducted the literature review and wrote the manuscript; M.K. and I.D. edited and helped revise the manuscript.

Funding

While conducting this research, C.K. was funded by US National Science Foundation Training Grant "Data Science for the 21st Century", grant number 1450053.

Acknowledgments

The authors would like to thank the reviewers of this article, whose many useful suggestions improved it.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Van der Merwe, D.; Price, K.P. Harmful algal bloom characterization at ultra-high spatial and temporal resolution using small unmanned aircraft systems. Toxins 2015, 7, 1065–1078. [Google Scholar] [CrossRef] [PubMed]
  2. Klemas, V. Remote Sensing of Algal Blooms: An Overview with Case Studies. J. Coast. Res. 2012, 28, 34–43. [Google Scholar] [CrossRef]
  3. Lee, Z.; Marra, J.; Perry, M.J.; Kahru, M. Estimating oceanic primary productivity from ocean color remote sensing: A strategic assessment. J. Mar. Syst. 2015, 149, 50–59. [Google Scholar] [CrossRef]
  4. Blondeau-Patissier, D.; Gower, J.F.R.; Dekker, A.G.; Phinn, S.R.; Brando, V.E. A review of ocean color remote sensing methods and statistical techniques for the detection, mapping and analysis of phytoplankton blooms in coastal and open oceans. Prog. Oceanogr. 2014, 123, 123–144. [Google Scholar] [CrossRef]
  5. Hallegraeff, G.M. Harmful algal blooms: A global overview. In Manual on Harmful Marine Microalgae; Monographs on Oceanographic Methodology Series; Hallegraeff, G.M., Anderson, D.M., Cembella, A.D., Eds.; UNESCO Publishing: Paris, France, 2003; Volume 11, pp. 25–49. [Google Scholar]
  6. Kirkpatrick, B.; Fleming, L.E.; Squicciarini, D.; Backer, L.C.; Clark, R.; Abraham, W.; Benson, J.; Cheng, Y.S.; Johnson, D.; Pierce, R.; et al. Literature Review of Florida Red Tide: Implications for Human Health Effects. Harmful Algae 2004, 3, 99–115. [Google Scholar] [CrossRef]
  7. Smayda, T.J. Bloom dynamics: Physiology, behavior, trophic effects. Limnol. Oceanogr. 1997, 42, 1132–1136. [Google Scholar] [CrossRef]
  8. Moore, S.K.; Trainer, V.L.; Mantua, N.J.; Parker, M.S.; Laws, E.A.; Backer, L.C.; Fleming, L.E. Impacts of climate variability and future climate change on harmful algal blooms and human health. Environ. Health 2008, 7 (Suppl. 2), S4. [Google Scholar] [CrossRef]
  9. Gobler, C.J.; Doherty, O.M.; Hattenrath-Lehmann, T.K.; Griffith, A.W.; Kang, Y.; Litaker, R.W. Ocean warming since 1982 has expanded the niche of toxic algal blooms in the North Atlantic and North Pacific oceans. Proc. Natl. Acad. Sci. USA 2017, 114, 4975–4980. [Google Scholar] [CrossRef]
  10. Jochens, A.E.; Malone, T.C.; Stumpf, R.P.; Hickey, B.M.; Carter, M.; Morrison, R.; Dyble, J.; Jones, B.; Trainer, V.L. Integrated Ocean Observing System in Support of Forecasting Harmful Algal Blooms. Mar. Technol. Soc. J. 2010, 44, 99–121. [Google Scholar] [CrossRef]
  11. Kutser, T. Passive optical remote sensing of cyanobacteria and other intense phytoplankton blooms in coastal and inland waters. Int. J. Remote Sens. 2009, 30, 4401–4425. [Google Scholar] [CrossRef]
  12. Longhurst, A. Seasonal cycles of pelagic production and consumption. Prog. Oceanogr. 1995, 36, 77–167. [Google Scholar] [CrossRef]
  13. Shen, L.; Xu, H.; Guo, X. Satellite remote sensing of harmful algal blooms (HABs) and a potential synthesized framework. Sensors 2012, 12, 7778–7803. [Google Scholar] [CrossRef]
  14. Hook, S.J.; Myers, J.J.; Thome, K.J.; Fitzgerald, M.; Kahle, A.B. The MODIS/ASTER airborne simulator (MASTER)—A new instrument for earth science studies. Remote Sens. Environ. 2001, 76, 93–102. [Google Scholar] [CrossRef]
  15. Shang, S.; Lee, Z.; Lin, G.; Hu, C.; Shi, L.; Zhang, Y.; Li, X.; Wu, J.; Yan, J. Sensing an intense phytoplankton bloom in the western Taiwan Strait from radiometric measurements on a UAV. Remote Sens. Environ. 2017, 198, 85–94. [Google Scholar] [CrossRef]
  16. Manfreda, S.; McCabe, M.F.; Miller, P.E.; Lucas, R.; Pajuelo Madrigal, V.; Mallinis, G.; Ben Dor, E.; Helman, D.; Estes, L.; Ciraolo, G.; et al. On the Use of Unmanned Aerial Systems for Environmental Monitoring. Remote Sens. 2018, 10, 641. [Google Scholar] [CrossRef]
  17. DeBell, L.; Anderson, K.; Brazier, R.E.; King, N.; Jones, L. Water resource management at catchment scales using lightweight UAVs: Current capabilities and future perspectives. J. Unmanned Veh. Syst. 2015, 4, 7–30. [Google Scholar] [CrossRef]
  18. Honkavaara, E.; Hakala, T.; Kirjasniemi, J.; Lindfors, A.; Mäkynen, J.; Nurminen, K.; Ruokokoski, P.; Saari, H.; Markelin, L. New light-weight stereoscopic spectrometric airborne imaging technology for high-resolution environmental remote sensing: Case studies in water quality mapping. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2013, 1, W1. [Google Scholar] [CrossRef]
  19. Jang, S.W.; Yoon, H.J.; Kwak, S.N.; Sohn, B.Y.; Kim, S.G.; Kim, D.H. Algal Bloom Monitoring using UAVs Imagery. Adv. Sci. Technol. Lett. 2016, 138, 30–33. [Google Scholar]
  20. Kim, H.-M.; Yoon, H.-J.; Jang, S.W.; Kwak, S.N.; Sohn, B.Y.; Kim, S.G.; Kim, D.H. Application of Unmanned Aerial Vehicle Imagery for Algal Bloom Monitoring in River Basin. Int. J. Control Autom. 2016, 9, 203–220. [Google Scholar] [CrossRef]
  21. Lyu, P.; Malang, Y.; Liu, H.H.T.; Lai, J.; Liu, J.; Jiang, B.; Qu, M.; Anderson, S.; Lefebvre, D.D.; Wang, Y. Autonomous cyanobacterial harmful algal blooms monitoring using multirotor UAS. Int. J. Remote Sens. 2017, 38, 2818–2843. [Google Scholar] [CrossRef]
  22. Pölönen, I.; Puupponen, H.H.; Honkavaara, E.; Lindfors, A.; Saari, H.; Markelin, L.; Hakala, T.; Nurminen, K. UAV-based hyperspectral monitoring of small freshwater area. Proc. SPIE 2014, 9239, 923912. [Google Scholar]
  23. Goldberg, S.J.; Kirby, J.T.; Licht, S.C. Applications of Aerial Multi-Spectral Imagery for Algal Bloom Monitoring in Rhode Island; SURFO Technical Report No. 16-01; University of Rhode Island: South Kingstown, RI, USA, 2016; p. 28. [Google Scholar]
  24. Aguirre-Gómez, R.; Salmerón-García, O.; Gómez-Rodríguez, G.; Peralta-Higuera, A. Use of unmanned aerial vehicles and remote sensors in urban lakes studies in Mexico. Int. J. Remote Sens. 2017, 38, 2771–2779. [Google Scholar] [CrossRef]
  25. Flynn, K.F.; Chapra, S.C. Remote Sensing of Submerged Aquatic Vegetation in a Shallow Non-Turbid River Using an Unmanned Aerial Vehicle. Remote Sens. 2014, 6, 12815–12836. [Google Scholar] [CrossRef]
  26. Su, T.-C.; Chou, H.-T. Application of Multispectral Sensors Carried on Unmanned Aerial Vehicle (UAV) to Trophic State Mapping of Small Reservoirs: A Case Study of Tain-Pu Reservoir in Kinmen, Taiwan. Remote Sens. 2015, 7, 10078–10097. [Google Scholar] [CrossRef]
  27. Xu, F.; Gao, Z.; Jiang, X.; Shang, W.; Ning, J.; Song, D.; Ai, J. A UAV and S2A data-based estimation of the initial biomass of green algae in the South Yellow Sea. Mar. Pollut. Bull. 2018, 128, 408–414. [Google Scholar] [CrossRef] [PubMed]
  28. Bollard-Breen, B.; Brooks, J.D.; Jones, M.R.L.; Robertson, J.; Betschart, S.; Kung, O.; Craig Cary, S.; Lee, C.K.; Pointing, S.B. Application of an unmanned aerial vehicle in spatial mapping of terrestrial biology and human disturbance in the McMurdo Dry Valleys, East Antarctica. Polar Biol. 2015, 38, 573–578. [Google Scholar] [CrossRef]
  29. Koparan, C.; Koc, A.; Privette, C.; Sawyer, C. In Situ Water Quality Measurements Using an Unmanned Aerial Vehicle (UAV) System. Water 2018, 10, 264. [Google Scholar] [CrossRef]
  30. Kutser, T.; Metsamaa, L.; Strömbeck, N.; Vahtmäe, E. Monitoring cyanobacterial blooms by satellite remote sensing. Estuar. Coast. Shelf Sci. 2006, 67, 303–312. [Google Scholar] [CrossRef]
  31. Berni, J.A.J.; Zarco-Tejada, P.J.; Suárez Barranco, M.D.; Fereres Castiel, E. Thermal and narrow-band multispectral remote sensing for vegetation monitoring from an unmanned aerial vehicle. IEEE Trans. Geosci. Remote Sens. 2009, 47, 722–738. [Google Scholar] [CrossRef]
  32. Liu, S.; Zhang, C.; Zhang, Y.; Wang, T.; Zhao, A.; Zhou, T.; Jia, X. Miniaturized spectral imaging for environment surveillance based on UAV platform. Proc. SPIE 2017, 10461, 104611K. [Google Scholar]
  33. Chung, M.; Detweiler, C.; Hamilton, M.; Higgins, J.; Ore, J.-P.; Thompson, S. Obtaining the Thermal Structure of Lakes from the Air. Water 2015, 7, 6467–6482. [Google Scholar] [CrossRef]
  34. Rhee, D.S.; Kim, Y.D.; Kang, B.; Kim, D. Applications of unmanned aerial vehicles in fluvial remote sensing: An overview of recent achievements. KSCE J. Civ. Eng. 2018, 22, 588–602. [Google Scholar] [CrossRef]
  35. Agisoft PhotoScan. Available online: http://www.agisoft.com/ (accessed on 7 August 2018).
  36. Mission Planner. Available online: http://ardupilot.org/planner/ (accessed on 7 August 2018).
  37. Pix4D. Available online: https://pix4d.com/ (accessed on 7 August 2018).
  38. DroneDeploy. Available online: https://www.dronedeploy.com/ (accessed on 7 August 2018).
  39. ESRI Esri GIS Products. Available online: https://www.esri.com/en-us/arcgis/products/index (accessed on 7 August 2018).
  40. ENVI. Available online: https://www.harrisgeospatial.com/SoftwareTechnology/ENVI.aspx (accessed on 7 August 2018).
  41. MATLAB. Available online: https://www.mathworks.com/products/matlab.html (accessed on 7 August 2018).
  42. Kolor|Autopano. Available online: http://www.kolor.com/fr/autopano/ (accessed on 7 August 2018).
  43. Tetracam PixelWrench2. Available online: http://www.tetracam.com/Products_PixelWrench2.htm (accessed on 7 August 2018).
  44. Sathyendranath, S.; Cota, G.; Stuart, V.; Maass, H.; Platt, T. Remote sensing of phytoplankton pigments: A comparison of empirical and theoretical approaches. Int. J. Remote Sens. 2001, 22, 249–273. [Google Scholar] [CrossRef]
  45. Anderson, K.; Gaston, K.J. Lightweight unmanned aerial vehicles will revolutionize spatial ecology. Front. Ecol. Environ. 2013, 11, 138–146. [Google Scholar] [CrossRef]
  46. Jung, S.; Cho, H.; Kim, D.; Kim, K.; Han, J.I.; Myung, H. Development of Algal Bloom Removal System Using Unmanned Aerial Vehicle and Surface Vehicle. IEEE Access 2017, 5, 22166–22176. [Google Scholar] [CrossRef]
  47. Zingone, A.; Oksfeldt Enevoldsen, H. The diversity of harmful algal blooms: A challenge for science and management. Ocean Coast. Manag. 2000, 43, 725–748. [Google Scholar] [CrossRef]
  48. Kudela, R.M.; Palacios, S.L.; Austerberry, D.C.; Accorsi, E.K.; Guild, L.S.; Torres-Perez, J. Application of hyperspectral remote sensing to cyanobacterial blooms in inland waters. Remote Sens. Environ. 2015, 167, 196–205. [Google Scholar] [CrossRef]
  49. Zomer, R.J.; Trabucco, A.; Ustin, S.L. Building spectral libraries for wetlands land cover classification and hyperspectral remote sensing. J. Environ. Manag. 2009, 90, 2170–2177. [Google Scholar] [CrossRef]
  50. Hallegraeff, G.M. Ocean climate change, phytoplankton community responses, and harmful algal blooms: A formidable predictive challenge. J. Phycol. 2010, 46, 220–235. [Google Scholar] [CrossRef]
  51. Hogan, S.D.; Kelly, M.; Stark, B.; Chen, Y. Unmanned aerial systems for agriculture and natural resources. Calif. Agric. 2017, 71, 5–14. [Google Scholar] [CrossRef]
  52. Duffy, J.P.; Cunliffe, A.M.; DeBell, L.; Sandbrook, C.; Wich, S.A.; Shutler, J.D.; Myers-Smith, I.H.; Varela, M.R.; Anderson, K. Location, location, location: Considerations when using lightweight drones in challenging environments. Remote Sens. Ecol. Conserv. 2018, 4, 7–19. [Google Scholar] [CrossRef]
  53. Chirayath, V.; Earle, S.A. Drones that see through waves—Preliminary results from airborne fluid lensing for centimetre-scale aquatic conservation. Aquat. Conserv. 2016, 26, 237–250. [Google Scholar] [CrossRef]
  54. Dietrich, J.T. Riverscape mapping with helicopter-based Structure-from-Motion photogrammetry. Geomorphology 2016, 252, 144–157. [Google Scholar] [CrossRef]
  55. Brown, A.; Carter, D. Geolocation of Unmanned Aerial Vehicles in GPS-Degraded Environments. In Proceedings of the AIAA Infotech@Aerospace Conference and Exhibit, Arlington, VA, USA, 26–29 September 2005. [Google Scholar]
  56. Gerke, M.; Przybilla, H.-J. Accuracy Analysis of Photogrammetric UAV Image Blocks: Influence of Onboard RTK-GNSS and Cross Flight Patterns. Photogrammetrie Fernerkundung Geoinformation 2016, 2016, 17–30. [Google Scholar] [CrossRef]
  57. Levy, J.; Hunter, C.; Lukacazyk, T.; Franklin, E.C. Assessing the spatial distribution of coral bleaching using small unmanned aerial systems. Coral Reefs 2018, 37, 373–387. [Google Scholar] [CrossRef]
  58. Hu, C. A novel ocean color index to detect floating algae in the global oceans. Remote Sens. Environ. 2009, 113, 2118–2129. [Google Scholar] [CrossRef]
  59. Power, M.; Lowe, R.; Furey, P.; Welter, J.; Limm, M.; Finlay, J.; Bode, C.; Chang, S.; Goodrich, M.; Sculley, J. Algal mats and insect emergence in rivers under Mediterranean climates: Towards photogrammetric surveillance. Freshw. Biol. 2009, 54, 2101–2115. [Google Scholar] [CrossRef]
Figure 1. Flying height (m) vs. ground sampling distance (GSD, cm) reported in the studies reviewed, showing camera/sensor type.
Table 1. Summary of unmanned aerial vehicle (UAV) use in algal bloom research organized by hardware, sensor, validation, and challenges encountered.
| Target, Location | UAV Platform | Sensor(s) | Parameters and Indices | Validation | Challenges | Example Reference |
| --- | --- | --- | --- | --- | --- | --- |
| Lake cyanobacteria, Chapultepec Park, Mexico | Multirotor: DJI Phantom 3 | 12-megapixel red green blue (RGB) camera | Chlorophyll a at 544 nm, phycocyanin absorption at 619 nm | In situ upwelling and downwelling irradiances with hyperspectral spectroradiometer (GER-1500); cyanobacteria sampling; meteorological data | Not specified | [24] |
| Cyanobacterial mats, Antarctica | Fixed-wing: modified UAV (Kevlar fabric, Skycam UAV Ltd. airframe) "Polar Fox" | Sony NEX 5 RGB and near-infrared (NIR) cameras | Not specified | In situ spectral reflectance using multi- and hyperspectral cameras; field microscopy of cyanobacterial mats | Resolution: fixed-wing UAV imagery too coarse to see small algal colonies | [28] |
| Shallow river algae (Cladophora glomerata), Clark Fork River, Montana, USA | Multirotor: DJI (unspecified) | GoPro Hero 3 with 12-megapixel RGB camera | ACE 1, SAM 2 | In situ total suspended sediment samples, light attenuation profiles, daily streamflow | Regulation: federal flight restrictions. Weather: wind turbulence. Hardware: off-nadir images | [25] |
| Cyanobacteria biomass, Rhode Island, USA | Multirotor: 3DR X8+; Multirotor: 3DR Iris+ | Tetracam Mini-MCA6 multispectral camera; 4 MAPIR Survey 1 cameras | NDVI, GNDVI 3 | In situ water quality data (not specified) | Data processing: aligning images; mosaicking | [23] |
| Shallow lake algae, Finland | Helicopter UAV: not specified | Fabry-Perot interferometer (FPI), 500–900 nm filter | Not specified | In situ radiometric water measurements; signal-to-noise ratios | Data processing: low signal-to-noise ratio for narrow-band hyperspectral images | [18] |
| River algal blooms, Daecheong Dam, South Korea | Fixed-wing: eBee (senseFly Ltd.) | Canon S110 RGB and NIR cameras | AI 4 | In situ field spectral reflectance | Not specified | [19] |
| River algal blooms, Nakdong River, South Korea | Fixed-wing: eBee (senseFly Ltd.) | Canon PowerShot S110 RGB and NIR camera | AI | In situ water quality, microscopy analysis of phytoplankton species; field spectral reflectance | Laboratory analysis: identification of phytoplankton to species level | [20] |
| Pond cyanobacteria, Kingston, Canada | Octorotor: UAV not specified, Pixhawk autopilot | Sony ILCE-6000 mini SLR camera (RGB) with MTi-G-700 gimbal; Sony HX60V used for test flights | Not specified | Only technical validation performed | Hardware: gimbal errors | [21] |
| Lake blue-green algal distribution and water turbidity, Finland | Helicopter UAV: not specified | Fabry-Perot interferometer (FPI); CMV4000 4.2-megapixel CMOS camera | SAM, HFC 5, VCA 6, FNNLS 7 | In situ water quality measurements (temperature, conductance, turbidity, chlorophyll a, blue-green algae distribution) | Data processing: geometric and radiometric water image processing | [22] |
| Estuarine phytoplankton bloom (Phaeocystis globosa), western Taiwan Strait and Weitou Bay, China | Fixed-wing: LT-150 from TOPRS Technology Co., Ltd. | AvaSpec-Dual spectroradiometer with two sensors spanning 360–1000 nm | Not specified | In situ chlorophyll a sampling and spectroradiometer measurements via ship | Fundamental: accurate remote sensing reflectance values in coastal region. Environmental: tides, wind | [15] |
| Algal cover in Tain-Pu Reservoir, Taiwan, China | Fixed-wing: eBee (senseFly Ltd.) | Canon PowerShot S110 RGB and Canon PowerShot S110 NIR cameras | Chlorophyll a regression model | In situ total phosphorus, chlorophyll a, Secchi disk depth | Environmental: reflection of water | [26] |
| Lake cyanobacterial (Microcystis) buoyant volume, Kansas, USA | Fixed-wing: Zephyr sUAS from RitewingRC (controller: ArduPilot Mega 2.6); Multirotor: DJI F550 (controller: NAZA V2) | Canon PowerShot S100 NDVI camera; multirotor gimbal (Gaui Crane II) and real-time video (ReadymadeRC) | BNDVI 8 | In situ microscopy identification of cyanobacterial genus | Hardware: multirotor flight limits, slow speed, small coverage. Airspace: use constraints | [1] |
| Green tide algae (on P. yezoensis rafts) biomass in aquaculture zones, Yellow Sea, China | Multirotor: DJI Inspire 1 | DJI X3 RGB camera | NGRDI 9, NGBDI 10, GLI 11, EXG 12 | In situ samples of green algae attached to P. yezoensis in the aquaculture zone | Not specified | [27] |
Notes: 1 ACE: Adaptive Cosine Estimator; 2 SAM: Spectral Angle Mapper; 3 GNDVI: Green Normalized Difference Vegetation Index; 4 AI: Algal Bloom Detection Index; 5 HFC: Harsanyi–Farrand–Chang; 6 VCA: Vertex Component Analysis; 7 FNNLS: Fast Non-Negative Least Squares Solution; 8 BNDVI: Blue Normalized Difference Vegetation Index; 9 NGRDI: Normalized Green Red Difference Index; 10 NGBDI: Normalized Green Blue Difference Index; 11 GLI: Green Leaf Index; 12 EXG: Excess Green.
Table 2. Simple spectral indices for highlighting algal blooms described in the 13 studies reviewed.
| Name | Index | Reference(s) |
| --- | --- | --- |
| Normalized Difference Vegetation Index (NDVI) | (NIR − Red)/(NIR + Red) | [23] |
| Normalized Green Red Difference Index (NGRDI) | (Green − Red)/(Green + Red) | [27] |
| Blue Normalized Difference Vegetation Index (BNDVI) | (NIR − Blue)/(NIR + Blue) | [1] |
| Normalized Green Blue Difference Index (NGBDI) | (Green − Blue)/(Green + Blue) | [27] |
| Green Normalized Difference Vegetation Index (GNDVI) | (NIR − Green)/(NIR + Green) | [23] |
| Green Leaf Index (GLI) | (2 × Green − Red − Blue)/(2 × Green + Red + Blue) | [27] |
| Excess Green (EXG) | 2 × Green − Red − Blue | [27] |
| Algal Bloom Detection Index (AI) | ((R850nm − R660nm)/(R850nm + R660nm)) + ((R850nm − R625nm)/(R850nm + R625nm)) | [19,20] |
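As a minimal sketch, the simple band-ratio indices in Table 2 can be computed per pixel with NumPy. The code below assumes the bands have already been radiometrically calibrated to reflectance (see Section 4.1.5); all array and function names are illustrative.

```python
# Illustrative per-pixel computation of a few Table 2 indices.
# Inputs are NumPy arrays of reflectance (one value per pixel).
import numpy as np

def ndvi(nir, red):
    return (nir - red) / (nir + red)

def ngrdi(green, red):
    return (green - red) / (green + red)

def bndvi(nir, blue):
    return (nir - blue) / (nir + blue)

def gli(green, red, blue):
    return (2 * green - red - blue) / (2 * green + red + blue)

# Toy one-pixel example with a green-dominated, bloom-like spectrum.
red, green, blue, nir = (np.array([0.05]), np.array([0.12]),
                         np.array([0.04]), np.array([0.20]))
print(float(ndvi(nir, red)))   # positive for vegetation-like targets
```

The same functions apply unchanged to whole image rasters, since NumPy broadcasts the arithmetic across all pixels; masking out land and specular-glint pixels before thresholding an index is usually still necessary.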
Table 3. Processing software used in 13 case studies reviewed.
| Workflow | Processing Software | Uses | References |
| --- | --- | --- | --- |
| Mission and flight planning | Mission Planner | Flight planning | [1] |
| Image processing | Agisoft PhotoScan | Mosaicking, orthorectification, and georeferencing | [1,23] |
| | BAE Systems SOCET SET and GXP | Mosaicking, radiometric processing | [18] |
| | MATLAB | Image processing | [23] |
| | MATLAB | Clipping rasters | [25] |
| | Pix4D | Digital surface models, orthomosaics | [27,28] |
| | PixelWrench2 | Image processing | [23] |
| | VisualSFM method | Image orientation, radiometric processing, mosaicking | [22] |
| | DroneDeploy | Mosaicking | [23] |
| | QGIS | Georeferencing | [24] |
| | ArcGIS | Mosaicking, georeferencing | [25,28] |
| | Menci Software | Sensor orientation, image processing | [26] |
| Spatial analysis | ArcGIS | Spatial analysis | [28] |
| | ENVI | Spatial analysis | [28] |
| | Opticks | Classification | [25] |
| Visualization | QGIS | Map production | [24] |
| | ArcGIS | Map production | [1] |

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).