Article

Shallow-Water Habitat Mapping using Underwater Hyperspectral Imaging from an Unmanned Surface Vehicle: A Pilot Study

by
Aksel Alstad Mogstad
1,*,
Geir Johnsen
1,2 and
Martin Ludvigsen
3,4
1
Centre for Autonomous Marine Operations and Systems, Department of Biology, Norwegian University of Science and Technology (NTNU), Trondhjem Biological Station, NO-7491 Trondheim, Norway
2
Arctic Biology Department, University Centre in Svalbard (UNIS), P.O. Box 156, NO-9171 Longyearbyen, Norway
3
Centre for Autonomous Marine Operations and Systems, Department of Marine Technology, Norwegian University of Science and Technology (NTNU), Otto Nielsens vei 10, NO-7491 Trondheim, Norway
4
Arctic Technology Department, University Centre in Svalbard (UNIS), P.O. Box 156, NO-9171 Longyearbyen, Norway
*
Author to whom correspondence should be addressed.
Remote Sens. 2019, 11(6), 685; https://doi.org/10.3390/rs11060685
Submission received: 15 February 2019 / Revised: 15 March 2019 / Accepted: 20 March 2019 / Published: 21 March 2019
(This article belongs to the Section Ocean Remote Sensing)

Abstract

The impacts of human activity on coastal ecosystems are becoming increasingly evident across the world. Consequently, there is a growing need to map, monitor, and manage these regions in a sustainable manner. In this pilot study, we present what we believe to be a novel mapping technique for shallow-water seafloor habitats: Underwater hyperspectral imaging (UHI) from an unmanned surface vehicle (USV). A USV-based UHI survey was carried out in a sheltered bay close to Trondheim, Norway. In the survey, an area of 176 m2 was covered, and the depth of the surveyed area was approximately 1.5 m. UHI data were initially recorded at a 1-nm spectral resolution within the range of 380–800 nm, but this was reduced to 86 spectral bands within 400–700 nm (3.5-nm spectral resolution) during post-processing. The hyperspectral image acquisition was synchronized with navigation data from the USV, which permitted georeferencing and mosaicking of the imagery at a 0.5-cm spatial resolution. Six spectral classes, including coralline algae, the wrack Fucus serratus, green algal films, and invertebrates, were identified in the georeferenced imagery and chosen as targets for support vector machine (SVM) classification. Based on confusion matrix analyses, the overall classification accuracy was estimated to be 89%–91%, which suggests that USV-based UHI may serve as a useful tool for high-resolution mapping of shallow-water habitats in the future.

Graphical Abstract

1. Introduction

Coastal ecosystems vital to the health and productivity of the world’s oceans are currently facing increasing levels of anthropogenic pressure. At present, approximately 40% of the global population lives within 100 km of the coastline [1,2], and more than half of the urban population can be characterized as coastal [3]. As compared to numbers from 2010, the population living <100 km from the coastline is predicted to grow by an additional 500 million by the year 2030 [2], and considering that the coastal zone makes up less than 20% of the Earth’s land surface area [4], sustainable management of coastal regions is becoming an increasingly relevant topic worldwide.
Mapping and monitoring of the seafloor represent an essential part of marine management; however, unlike terrestrial habitats and ecosystems, most marine benthic habitats remain poorly mapped. Specifically, estimates suggest that only 5%–10% of the seafloor is mapped at a resolution comparable to that of equivalent surveys on land [5,6]. Somewhat counterintuitively, shallow-water (here defined as ≤10-m depth) benthic habitats are among the marine areas of interest where detailed mapping data are currently lacking and in demand [7]. A partial explanation for this is that mapping surveys in shallow, coastal waters may be more expensive than surveys in deeper waters due to factors such as environmental variables, navigational hazards, and a lack of appropriate sampling platforms [8]. Although space-borne and aerial remote sensing both represent valuable and cost-efficient tools for large-scale mapping of optically shallow marine environments, the spatial resolution of these techniques is currently limited to the m–dm scale at best [9,10]. This resolution has proven to be sufficient for generalized, large-scale (typically >1 km) mapping of, e.g., coral reefs [11,12,13,14,15] and seagrass meadows [16,17,18], but if detailed information from highly heterogeneous, smaller-scale (<1 km) areas of interest is required, higher resolution could be beneficial. As numerous coastal regions across the world (including coastlines of north-western Europe, the Mediterranean, north-eastern America, the Caribbean, the Persian Gulf, and south-eastern China) are heavily impacted by multiple anthropogenic drivers [19], and approximately 1% of the oceans (an area the size of India and Great Britain combined) is 0–10 m deep [20] (Figure 1), the need for shallow-water mapping techniques capable of covering different environments and spatial scales is evident. In the current paper, we introduce a technique we believe to be new for high-resolution mapping of shallow-water habitats: Underwater hyperspectral imaging (UHI) from an unmanned surface vehicle (USV).
UHI is emerging as a promising remote sensing technique for marine benthic environments. Like many well-known hyperspectral imagers deployed on airplanes and in space, such as the Compact Airborne Spectrographic Imager (CASI; ITRES Research Ltd., Calgary, Canada), the EO-1 Hyperion [21], and the Hyperspectral Imager for the Coastal Ocean (HICO) [22], the underwater hyperspectral imager is a push-broom scanner that records imagery, where each image pixel contains a contiguous light spectrum [23,24]. What separates UHI from traditional hyperspectral imaging is that the instrument is waterproof and typically deployed with an active light source. The latter increases the signal-to-noise ratio and permits imaging in the absence of ambient light. Strong attenuation of light in water [25] limits the scanning altitude (altitude is here defined as the distance from the sensor to the seafloor) at which UHI surveys can be carried out, and previous UHI surveys have been conducted at altitudes ranging from 1–10 m [26,27,28,29]. UHI operations conducted at ≤10 m altitude yield highly detailed imagery, and at an altitude of 1–2 m, mm-scale spatial resolution can be achieved [23,30].
One of the advantages of hyperspectral imagery is its high spectral resolution. The most recent underwater hyperspectral imagers are capable of recording imagery at a 0.5-nm spectral resolution within the interval of ~380–800 nm [24,31]. This range covers the spectrum of visible light (400–700 nm), as well as some near-infrared radiation. Since each hyperspectral image pixel contains a contiguous light spectrum, a UHI transect holds substantial amounts of spectral information. By comparing the spectral data in a UHI transect to known optical signatures of desired target objects, individual pixels can be assigned to predefined classes. Following this classification procedure, the distribution and abundance of objects of interest (OOIs) within the survey area can be estimated.
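As a concrete, deliberately simplified illustration of this matching idea, the sketch below assigns each pixel to the library spectrum with the smallest spectral angle. This is not the classification approach used in the present study (SVM classification is described in Section 2.4), and all names are hypothetical; it only shows how pixel spectra can be compared against known optical signatures.

```python
import numpy as np

def classify_by_spectral_angle(cube, library):
    """Assign each pixel to the library spectrum it is most similar to.

    cube:    (rows, cols, bands) hyperspectral image
    library: dict mapping class name -> (bands,) reference spectrum

    Maximizing cosine similarity is equivalent to minimizing the spectral
    angle between a pixel spectrum and a reference spectrum.
    """
    names = list(library)
    refs = np.stack([library[name] for name in names])   # (classes, bands)
    flat = cube.reshape(-1, cube.shape[-1])               # (pixels, bands)
    sims = (flat @ refs.T) / (
        np.linalg.norm(flat, axis=1, keepdims=True) * np.linalg.norm(refs, axis=1)
    )
    best = np.argmax(sims, axis=1)                        # index of the closest reference
    return np.array(names)[best].reshape(cube.shape[:2])
```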
Over the past few years, UHI technology has been tested for a variety of applications. Underwater hyperspectral imagers deployed on remotely operated vehicles (ROVs) have, for instance, been used to assess coralline algae off Svalbard [27], manganese nodules and benthic megafauna at a 4200-m depth in the southeast Pacific Ocean [26,30], and man-made materials at a wreck site in the Trondheimsfjord, Norway [29]. In addition, UHI was briefly attempted from an autonomous underwater vehicle (AUV) in a pilot survey carried out near a hydrothermal vent complex situated at a 2350-m depth on the Mid-Atlantic Ridge [28]. The rationale behind the work presented in the current paper was to evaluate the potential of UHI in combination with yet another instrument-carrying platform: The USV. Prior to our pilot study, little was known about the utility of UHI in shallow-water environments. The only known in situ data the authors are aware of come from submerged hyperspectral imagers mounted on stationary tripods [24] and mechanical cart-and-rail setups [23,32,33], and diver-operated hyperspectral imagers [34]. Although useful for certain applications, these modes of deployment arguably put some constraints on operational flexibility. In an attempt to address this issue, we here present a more dynamic shallow-water mapping option that in terms of areal coverage and spatial resolution helps to fill the gap between aerial/space-borne systems and stationary platforms. In our paper, we assess USV-based UHI in relation to biological mapping of shallow-water habitats. Supervised classification of biologically relevant groups is carried out, and the results are evaluated with respect to accuracy, limitations, and potential future applications.

2. Materials and Methods

2.1. Study Area

The survey was carried out in Hopavågen (63°35’N 9°32’E), Agdenes, Norway. Hopavågen is a sheltered bay connected to the mouth of the Trondheimsfjord through a narrow tidal channel. The bay covers an area of approximately 275,000 m2 and has a maximum depth of 32 m [35]. Over the past 25 years, Hopavågen has been subject to scientific studies related to, for example, hydrography and kelp [36,37,38], plankton ecology [39,40,41,42], and nutrient dynamics [35,43,44]. In addition, the bay’s benthic community has been characterized by means of a visual census [45]. Prominent members of Hopavågen’s benthic biota include brown macroalgae of the genera Fucus (Linnaeus, 1753) and Laminaria (J.V. Lamouroux, 1813), coralline algae, the plumose anemone Metridium senile (Linnaeus, 1761), and sea urchins such as Strongylocentrotus droebachiensis (O.F. Müller, 1776) and Echinus esculentus (Linnaeus, 1758).
In the current pilot study, an area of approximately 176 m2 situated at the southwestern part of the bay was surveyed (an overview is shown in the Results section). The depth of the surveyed area was approximately 1.5 m, but varied between ~1 and 2 m. Within the confines of the area, the substrate consisted of gravel and cobbles gradually grading into sand and silt with increasing distance from the shore. The sediment was interspersed with shell fragments of various marine invertebrates. Coralline algae (Rhodophyceae, red algae) of the genera Corallina (Linnaeus, 1758), Lithothamnion (Heydrich, 1897), and Phymatolithon (Foslie, 1898) were observed to cover a considerable portion of the rocky surfaces, whereas a thin film of green algae (Chlorophyceae) frequently covered calcareous surfaces of shell fragments. In addition to coralline and green algae, clusters of the brown macroalga (Phaeophyceae) Fucus serratus (Linnaeus, 1753) and the plumose anemone M. senile made notable contributions to the site’s benthic community.

2.2. Acquisition of Underwater Hyperspectral Imagery

Hyperspectral data from Hopavågen were obtained using the underwater hyperspectral imager, UHI-4 (4th generation underwater hyperspectral imager; Ecotone AS, Trondheim, Norway). UHI-4 is a push-broom scanner that offers an across-track spatial resolution of 1920 pixels. Its field of view (FOV) is 60° and 0.4° in the across- and along-track directions, respectively. At a 12-bit radiometric resolution, the imager covers the spectral range of 380–800 nm with a maximum spectral resolution of 0.5 nm. Recorded imagery is stored locally on an internal solid-state drive, from which it can later be exported for post-processing.
The main novelty of this pilot study was using UHI from the surface for shallow-water habitat mapping. To achieve this, UHI-4 was deployed on an OTTER USV (Maritime Robotics AS, Trondheim, Norway). The OTTER is an electric 200 × 105 × 85-cm twin hull USV with a weight of approximately 95 kg. For the purpose of the Hopavågen survey, the OTTER was equipped with a POS MV WaveMaster II combined real-time kinematic global positioning system (RTK GPS, logging positioning data) and inertial measurement unit (IMU, logging pitch, roll and heading data; Applanix Inc., Ontario, Canada) synchronized with UHI-4 for georeferencing of the hyperspectral data. The underwater hyperspectral imager was mounted on the vehicle in the nadir viewing position using a custom mounting bracket that also held a downward-facing KELDAN Video 8M CRI LED (light-emitting diode) light source (KELDAN GmbH, Brügg, Switzerland; Figure 2). The light source provided 105 W illumination through a 90° diffuser and was mounted 25 cm aft of the imager. In water, UHI-4 and the light source both protruded ~20 cm below the surface. Communication with the OTTER and UHI-4 was established wirelessly using a 4G internet connection.
The Hopavågen fieldwork was conducted in February and March of 2018. On 28 February, approximately three weeks prior to the UHI survey, four 40 × 40-cm weighted-down wooden frames were deployed at discrete, 1.5 m deep locations within the area of interest. Subsequently, the sections of the seafloor delimited by the frames were photographed to serve as ground truth. In addition to the frames, three 50 × 33-cm metal sheets that had previously been spray-painted white were deployed to highlight the approximate position of the desired survey area and to be used as reference standards for spectral pseudo-reflectance conversion.
Hyperspectral data were collected on 22 March, between 12:45 PM and 1:40 PM. Hopavågen is known to have a narrow tidal range (0.3–0.7 m) [36], and the tidal difference between the beginning and the end of the survey was estimated to be ~5 cm. The water was calm during the data acquisition, with no discernible wave action and wind speeds <2 m/s. Upon survey initiation, in situ chlorophyll a (chl a), colored dissolved organic matter (cDOM), and optical backscatter at 700 nm (a proxy for total suspended matter; TSM) were measured at 2.1 ± 0.4 µg L−1 (SD, n = 30), 1.0 ± 0.2 ppb (SD, n = 30), and 0.017 ± 0.005 m−1 (SD, n = 30), respectively, using an ECO Triplet-w (WET Labs Inc., Corvallis, USA). Furthermore, downwelling irradiance (400–700 nm) at sea level was measured at 250.5 ± 3.0 µmol photons m−2 s−1 (SD, n = 6) using a SpectraPen LM 510 spectroradiometer (Photon Systems Instruments spol. s. r. o., Drásov, Czech Republic). For the hyperspectral data collection, the OTTER USV was controlled remotely from the shore. Six partially overlapping survey tracks were programmed into the USV’s control system and executed at a speed of ~0.25 m s−1. During the survey, hyperspectral imagery was captured at a frame rate of 25 Hz and an exposure time of 40 ms. The spectral resolution of the recordings was binned down to 1 nm, whereas spatially, full resolution (1920 across-track pixels) was maintained.
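For context, a back-of-the-envelope geometry check (our own illustration, assuming a flat seafloor and the fixed ~1.3-m scanning altitude adopted for georeferencing in Section 2.3) indicates the nominal ground sampling that these acquisition settings imply:

```latex
% Nominal ground footprint under a flat-seafloor assumption (illustrative values)
\begin{align*}
  \text{swath width} &= 2h\tan(\mathrm{FOV}/2) = 2 \times 1.3~\mathrm{m} \times \tan 30^{\circ} \approx 1.5~\mathrm{m},\\
  \text{across-track sampling} &\approx 1.5~\mathrm{m} / 1920~\mathrm{pixels} \approx 0.8~\mathrm{mm~pixel^{-1}},\\
  \text{along-track spacing} &= v/f = (0.25~\mathrm{m~s^{-1}})/(25~\mathrm{Hz}) = 1~\mathrm{cm~line^{-1}}.
\end{align*}
```

These nominal values are consistent with the mm-scale resolution reported for low-altitude UHI in Section 1; the georeferenced mosaic was nevertheless gridded at 0.5 cm, as described in Section 2.3.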

2.3. Processing of UHI Data

UHI and navigation data stored together in HDF5 (Hierarchical Data Format) files were exported to an external hard drive for processing. The first two steps of the processing were radiance conversion and georeferencing. Radiance conversion implies the removal of noise inherent to the sensor and the conversion of raw digital counts into upwelling spectral radiance (Lu,λ, W m−2 sr−1 nm−1), whereas georeferencing places each pixel in a geospatial context. In the present pilot study, both steps were carried out simultaneously using the function, “Geo-correct”, in the software, Immersion from Ecotone AS. To perform georeferencing in the Immersion software, information about the sensor position (latitude, longitude), heading, pitch, roll, and altitude must be supplied. The OTTER USV’s RTK GPS and IMU logged all the aforementioned parameters except for altitude. Consequently, a fixed altitude of 1.3 m (1.5-m depth minus the 20-cm instrument protrusion) was assigned to the UHI data. It is worth noting that assigning a uniform survey altitude is an erroneous assumption, but for the purpose of this pilot study, which set out to serve as a proof of concept, it was considered a viable solution as most of the surveyed area was approximately 1.5 m deep. In the Immersion software, the altitude-adjusted data were radiance-converted at a 3.5-nm spectral resolution (to reduce file size and enhance processability) and georeferenced at a 0.5-cm spatial resolution. The resulting raster files were in BSQ (band sequential) format.
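The georeferencing itself was performed by the Immersion software, whose internal algorithm is not described here. The sketch below is only our own flat-seafloor illustration of how a nadir push-broom scan line can be projected onto map coordinates from position, heading, roll, and an assumed fixed altitude; all names are hypothetical, and pitch and refraction at the air-water interface are ignored.

```python
import numpy as np

def georeference_scanline(easting, northing, heading_deg, roll_deg,
                          altitude=1.3, n_pixels=1920, fov_deg=60.0):
    """Approximate ground coordinates of one nadir push-broom scan line.

    Assumes a flat seafloor at a fixed altitude below the sensor and
    neglects pitch and water refraction (simplifications of this sketch).
    Returns an (n_pixels, 2) array of easting/northing coordinates.
    """
    # Across-track viewing angle of each pixel, offset by the platform roll
    angles = np.linspace(-fov_deg / 2, fov_deg / 2, n_pixels) + roll_deg
    across = altitude * np.tan(np.radians(angles))   # metres, in the sensor frame

    # Rotate the across-track offsets into map coordinates. With heading
    # measured clockwise from north, the across-track axis points 90 degrees
    # to starboard of the heading.
    az = np.radians(heading_deg + 90.0)
    de = across * np.sin(az)   # east offset
    dn = across * np.cos(az)   # north offset
    return np.column_stack([easting + de, northing + dn])
```

Stacking such per-line outputs over time and gridding them would yield a georeferenced raster comparable in structure to the BSQ files described above.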
Before classification of the data was attempted, the georeferenced raster files were processed further using the software, ENVI (Environment for Visualizing Images, v. 5.4; Harris Geospatial Solutions Inc., Boulder, USA). As wavelengths <400 nm and >700 nm were considered noisy, all UHI transect lines were spectrally subset to cover only the interval of 400–700 nm. With a spectral resolution of 3.5 nm, this resulted in 86 spectral bands available for classification purposes. The six transect lines were subsequently merged together to form a continuous raster dataset using ENVI’s “Seamless mosaic” tool. For the mosaicking, an edge feathering (blending overlapping pixels) of 20 pixels was used to smooth transitions in overlaps between adjacent transect lines. To facilitate the interpretation of optical signatures present in the raster mosaic, radiance was converted into both internal average relative reflectance (IARR) and flat field reflectance (FFR) using ENVI’s “IAR Reflectance Correction” and “Flat Field Correction” tools. An IARR correction converts spectral data (e.g., Lu,λ) into spectral pseudo-reflectance by dividing the spectrum of each pixel by the mean spectrum of the entire scene [46]. Although this procedure does not produce absolute reflectance, it represents a convenient means of correction that requires no additional field data [47], and typically yields pixel spectra that in terms of shape can be related to the actual reflectance properties of the OOIs they represent. In a flat field correction, all pixel spectra are divided by the mean spectrum of an area of flat reflectance, which in the current pilot study was provided by the three white reference plates. For visualization purposes, an RGB representation (R: 638 nm, G: 549 nm, B: 470 nm) of the IARR-converted raster mosaic was exported to ArcMap (v. 10.6; Esri Inc., Redlands, USA) as a DAT (data) file (shown in the Results section).
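Both pseudo-reflectance conversions are simple normalizations by a reference spectrum. The NumPy sketch below is our own illustration of the two operations as described above, not ENVI’s implementation, and all names are hypothetical.

```python
import numpy as np

def iarr(cube):
    """Internal average relative reflectance: divide every pixel spectrum
    by the mean spectrum of the entire scene."""
    scene_mean = cube.reshape(-1, cube.shape[-1]).mean(axis=0)
    return cube / scene_mean

def flat_field(cube, flat_mask):
    """Flat field reflectance: divide every pixel spectrum by the mean
    spectrum of a spectrally flat reference area (here the white plates).

    flat_mask: boolean (rows, cols) array marking reference-plate pixels.
    """
    flat_mean = cube[flat_mask].mean(axis=0)
    return cube / flat_mean
```

The only difference between the two is the choice of reference spectrum, a point returned to in the Discussion.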

2.4. Classification of UHI Data

Supervised classification of both the IARR-converted and the FFR-converted raster mosaic was carried out in ENVI using the support vector machine (SVM) algorithm. SVM separates classes with decision surfaces that maximize the margin to the nearest training samples (the support vectors) [48,49]. The algorithm is known to be suited for complex datasets [50,51], and has previously performed well on underwater hyperspectral imagery [26]. In the current pilot study, a radial basis function (RBF) kernel was chosen for the SVM classification, as RBF-SVM can be considered a robust classifier [52,53].
Spatially corresponding IARR and FFR training data were obtained for the SVM classifier. Based on RGB visualization of the UHI data and in situ observations at the survey site, six spectral classes were chosen to serve as classification targets: Coralline algae, F. serratus, green algae (green algal films covering bright objects), invertebrates (training data were obtained from M. senile), reference plates, and a general seafloor category. These classes were chosen a priori because they were observed to be present within the confines of the surveyed area. Pixels corresponding to the different classes were identified at various locations in the raster mosaics and labelled as regions of interest (ROIs) to serve as training data. The number of training pixels chosen for each class as well as class-specific IARR/FFR signatures are presented in the Results section. For coralline algae, F. serratus, and green algae, reflectance spectra obtained in the laboratory using a JAZ spectrometer (Ocean Optics Inc., Largo, USA) according to the procedure described in Mogstad and Johnsen [27] are shown for comparison.
In addition to a training dataset, RBF-SVM classification requires specification of the parameters, γ and C, which respectively correspond to the kernel width and degree of regularization [53]. To optimize γ and C for the Hopavågen raster mosaics, a grid search cross-validation was performed on both the IARR and the FFR training data. The grid searches were carried out in Python (v. 3.6; Python Software Foundation, Wilmington, USA) using the free software machine learning library “scikit-learn” [54]. For IARR, a γ of 0.01 in combination with a C of 1000 was found to yield the highest cross-validation accuracy. For FFR, a γ of 0.1 in combination with a C of 10,000 produced the most accurate results. IARR and FFR classification results were converted to SHP (shape) files and exported to ArcMap for visualization and estimation of each class’ areal coverage.
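The exact parameter grid and fold count used in the search are not reported, so the scikit-learn sketch below should be read as an illustrative reconstruction of the tuning step rather than the script actually used; X and y stand for the training spectra and class labels, and the grid and cv value are assumptions.

```python
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

def tune_rbf_svm(X, y, cv=5):
    """Grid search over gamma and C for an RBF-SVM classifier.

    X: (n_training_pixels, n_bands) pseudo-reflectance spectra (86 bands here)
    y: (n_training_pixels,) class labels
    The grid and fold count below are illustrative assumptions; the paper only
    reports the selected values (IARR: gamma=0.01, C=1000; FFR: gamma=0.1, C=10000).
    """
    param_grid = {
        "gamma": [0.001, 0.01, 0.1, 1.0],
        "C": [10, 100, 1000, 10000],
    }
    search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=cv)
    search.fit(X, y)
    return search.best_params_, search.best_estimator_
```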

2.5. Assessment of Classification Accuracy

To assess the accuracy of the SVM classifications, a ground truth of the seafloor sections delimited by the frames deployed at the survey site was generated as precisely as possible. The frames were identified in the UHI data, and within each framed section, all pixels were manually assigned to one of the previously defined spectral classes. Pixels whose identity was uncertain were assigned to the general seafloor category. The pixel labelling process was aided by comparing an RGB representation of the UHI data to the frame photographs acquired prior to the UHI survey. Although there was a three-week time lag between the frame photography and the acquisition of UHI data, imagery from the two techniques largely appeared to agree. For the sections of the survey area corresponding to frame sites, SVM classification results were compared to the ground truth using ENVI’s “confusion matrix” tool. This process compares the identity (class) of spatially corresponding pixels in a classified image and a ground truth image, and produces an accuracy assessment. The accuracy assessment can be thought of as an estimate of the pixel-by-pixel, class-specific agreement between a ground truth image (where all pixels are considered to be assigned to the correct spectral class) and a classification image (where all pixels have been assigned a spectral class based on, for example, SVM classification). It is worth noting that the training data for the SVM classifier were obtained strictly from non-frame pixels (i.e., pixels from locations outside the four areas delimited by the deployed frames). The resulting estimates of the overall and class-specific accuracy are presented in the Results section.
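The accuracy measures reported in Section 3.3 (overall accuracy, producer and user accuracy, and the kappa coefficient) can all be derived from such a confusion matrix. The sketch below shows these standard formulas, assuming rows hold ground-truth classes and columns hold predicted classes; it is our own illustration, not ENVI’s tool.

```python
import numpy as np

def accuracy_metrics(confusion):
    """Overall, producer, and user accuracy plus Cohen's kappa from a square
    confusion matrix (rows: ground-truth classes, columns: predicted classes)."""
    confusion = np.asarray(confusion, dtype=float)
    total = confusion.sum()
    correct = np.trace(confusion)
    overall = correct / total
    producer = np.diag(confusion) / confusion.sum(axis=1)   # per ground-truth class
    user = np.diag(confusion) / confusion.sum(axis=0)       # per predicted class
    # Chance agreement expected from the row and column marginals
    expected = (confusion.sum(axis=1) * confusion.sum(axis=0)).sum() / total**2
    kappa = (overall - expected) / (1 - expected)
    return overall, producer, user, kappa
```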

3. Results

3.1. Data Quality

The photomosaic of the six UHI transect lines covered an area of approximately 176 m2, and in Figure 3, an RGB representation of the IARR-converted data is shown in the geospatial context of the survey area. As can be observed in Figure 3a, the hyperspectral data were of high spatial resolution, with features such as seaweed clusters (F. serratus), deployed frames, and white reference plates clearly visible. Furthermore, the navigation data provided by the USV’s RTK GPS and IMU proved to be highly accurate, as the different transect lines fit closely together (minor mismatches were likely a consequence of assuming a fixed survey altitude). A deficiency in the experimental dataset was, however, the lack of detailed altitude data. Although most of the surveyed area was approximately 1.5 m deep, a colour shift towards blue wavelengths with increasing distance from the shore (slightly increasing depth) can be observed in Figure 3. This reduced the spectral coherence of the dataset to some degree, but as we will show in the classification results, the classification procedure used in the present pilot study managed to produce reasonable results regardless.

3.2. Classification Results

Class-specific, mean optical signatures of the IARR and FFR training pixels used for the SVM classifications are shown in Figure 4. Figure 4a–c show that the positions of major dips and peaks in both IARR and FFR were comparable to laboratory-based reflectance estimates (JAZ spectrometer measurements). IARR and FFR spectra did, however, differ both from each other and from the laboratory-acquired reflectance data, which, as expected, showed that neither reflectance conversion technique was capable of yielding exact reflectance. It should be noted that all spectra displayed in Figure 4 have been normalized to their highest value to better illustrate shape-related differences between spectra.
Supervised classification of the six classes shown in Figure 4 yielded promising results despite the lack of georeferenced altitude data. Results from the SVM classifications of IARR and FFR data are respectively displayed in Figure 5a,b. The two classifications produced highly similar distribution patterns, and the estimated areal coverage of each spectral class is presented in Table 1. In the regions of the survey area closest to the shore, where the seafloor surface was dominated by gravel and cobbles, the majority of the image pixels were classified as coralline algae. This makes sense, as most hard surfaces at the survey site were observed to be at least partially covered by coralline algae. With increasing distance from the shore, sand littered with larger calcareous fragments replaced gravel and cobbles as the dominating substrate, and in agreement with this in situ observation, most pixels in these regions were classified as either seafloor or green algae (green algal films covering bright objects). Clusters of F. serratus and larger invertebrates (mostly M. senile) were scattered across the survey area (Figure 3a), and pixels corresponding to these features largely appeared correctly classified. Reference plates represented the final class in this pilot study’s classification attempt, and for both IARR and FFR, the positions of all three plates were sharply outlined by the SVM classifier.

3.3. Classification Accuracy

Accuracy of the SVM classifications was assessed by comparing the results displayed in Figure 5a,b to manually labelled ground truths of the four reference frames. In Figure 6, photographs of the framed areas are shown along with their corresponding ground truths and SVM classifications (both IARR and FFR). Figure 6a–d correspond to frames 1–4 highlighted in Figure 3a. For all frames, there appeared to be a considerable degree of agreement between the ground truth and the results of both classifications. Inconsistencies were, however, present, and based on visual inspection, the main disagreement between the ground truths and classification results appeared to be the size and abundance of the objects covered by green algal films. This inference was confirmed by the confusion matrices of the two classification attempts, which are shown in Table 2 and Table 3.
Based on the outputs from the confusion matrices, the overall classification accuracy, i.e., the total number of correctly classified pixels (pixels with the same identity/spectral class in both the ground truth image and the SVM classification results) divided by the total number of pixels in the scene, was estimated to be around 90% for both IARR and FFR data. Corresponding kappa coefficients were estimated to be 0.65 (IARR) and 0.62 (FFR), which suggests substantial agreement between the ground truth and the classification results [55]. Class-specific classification accuracies were also estimated to be similar for the two datasets. However, there were minor differences, with F. serratus being the main source of disagreement. Whereas F. serratus was classified with 83% producer accuracy (total number of correctly classified pixels for a given class divided by the total number of pixels within that class in the ground truth image) and 74% user accuracy (total number of correctly classified pixels for a given class divided by the total number of pixels classified as that class by the SVM classifier) in the IARR data, the equivalent numbers from the classification of the FFR data were 92% and 59%, respectively. Common to both IARR and FFR was that the class-specific producer accuracy ranged from <60% to >90%, with green algae and seafloor unambiguously representing the least and most accurately classified spectral classes, respectively. Moreover, user accuracy was consistently lower than producer accuracy for four out of the five spectral classes. The exception was the seafloor class, for which user accuracy was estimated to be ~3–5 percentage points higher than producer accuracy. The reason for the patterns observed in the accuracy estimates could have been the way the ground truth pixel-labelling process was carried out, and this hypothesis will be discussed further in the Discussion section.

4. Discussion

In the current pilot study, we have presented the results from what we believe to be the first attempt to map a shallow-water habitat using USV-based UHI. As shown in the Results section, the technique appears capable of generating detailed and useful information, even when certain data components are sub-optimal. Although the overall findings of the pilot study were promising, there were, nevertheless, issues present. In the following discussion, we will consequently address both the positive and negative aspects of the results, as well as guidelines for future applications of the technique.
With a total weight of ~120 kg (USV, mounting bracket and UHI-4 combined), the UHI-equipped OTTER USV could easily be deployed and handled by 3–4 adults. In the calm and sheltered waters of Hopavågen, the USV was capable of accurately following pre-programmed transect lines at slow speeds, in a controlled manner. An advantage of using a USV was its ability to cover extremely shallow areas. Specifically, the deepest part of the USV’s hull only protruded ~30 cm below the surface, which permitted mapping of regions that would have been hard to access by other means (e.g., by using a boat or an AUV).
When using a push-broom scanner, such as an underwater hyperspectral imager, precise spatial referencing of individual pixel rows is important if the geometrical integrity of the depicted area is to be preserved. A considerable convenience of using a USV for UHI deployment was that positioning data could be acquired electromagnetically as opposed to acoustically. The latter approach typically has to be used if the imaging platform is fully submerged (e.g., for AUVs and ROVs), since electromagnetic radiation quickly dissipates in water. Although acoustic navigation data can be of high quality, its acquisition may depend on relaying information from the platform to a research vessel through a network of deployed transponders. Not only does this process require extensive planning and resources, but it may also reduce positioning accuracy. High-resolution UHI data requires highly accurate georeferencing, and in the current pilot study, that is exactly what was provided. The OTTER USV was equipped with its own RTK GPS, and the imager’s position was logged continuously with a ~5-cm spatial accuracy. In addition, the USV’s IMU simultaneously recorded the sensor’s heading, pitch, and roll, which in combination with the navigation data permitted the georeferencing displayed in Figure 3. The undulating edges of, for example, the southwestern-most transect line in Figure 3a serve as a testament to the quality of the navigation and attitude data, as they show that even minor wave action has been accounted for. In the future, it would be interesting to try the technique in less sheltered waters to investigate its limitations with respect to environmental conditions.
In terms of georeferencing, the only deficiency of the pilot study was the lack of concurrent altitude data. Even though most of the survey area was approximately 1.5 m deep, depth still varied between ~1 and 2 m at the extremes. Since a fixed depth was assumed for the georeferencing, this implies that the results displayed in Figure 3a and Figure 5 are partially distorted in the across-track direction. Despite this, the geometrical features and proportions of the known OOIs, such as deployed frames and reference plates, largely appear correct. In addition, the positioning of OOIs present in two bordering or overlapping transect lines appears consistent, which suggests that the visualization of data is spatially representative. For future USV-based UHI surveys, real-time altitude data should, however, be considered a necessity if accurate spatial representation is an absolute requirement.
As the presented work from Hopavågen should be regarded as a pilot study, the overall data quality can be considered promising, but with room for improvement. The results show that accurate georeferencing of high-resolution hyperspectral imagery acquired using USV-based UHI is achievable when the platform is equipped with an RTK GPS and an IMU. Furthermore, Figure 4 shows that the utilized setup also is capable of recording spectral pixel values with enough signal to yield biologically interpretable results, even in near-coastal regions, where the water’s optical properties (chl a, cDOM, TSM) can be considered complex.
However, one challenge was evident in the dataset: The colour shift towards blue apparent in Figure 3a. The degree to which light is attenuated in water is wavelength-dependent, and longer wavelengths (red) are attenuated more rapidly than shorter wavelengths (blue) [56]. This implies that the perceived colour of a seafloor OOI will shift towards blue wavelengths as the distance to the observer (the underwater hyperspectral imager) increases. The wavelength-dependent attenuation of light in water is visible in Figure 3a, where shallower nearshore pixels are less blue-tinted than pixels corresponding to slightly deeper areas further from the shore. As a more specific example, Figure 7 illustrates the effect the slight variation of the survey altitude had on the perceived spectral properties of coralline algae at the depth extremes of the survey area. This effect reduced the overall quality of the data, and to spectrally account for the variable depth of the survey area in future USV-based UHI surveys, two additional parameters would have to be measured from the USV: Real-time altitude and the water’s in situ spectral attenuation coefficient. A potential way to do this would be to equip the USV with an altimeter (an instrument recording altitude) and a hyperspectral light beam attenuation meter (e.g., a VIPER Photometer; TriOS Mess- und Datentechnik GmbH, Rastede, Germany), respectively measuring real-time altitude and spectral light attenuation (an alternative to the former would be to acquire a highly detailed bathymetric map of the survey area using, e.g., LIDAR; light detection and ranging, or multi-beam echo sounding). If, in addition, the approximate spectral downwelling irradiance from the light source (e.g., the sun) is measured continuously, such a setup could potentially yield a more representative dataset in terms of spectral quality if a radiative transfer model, like the one presented by Maritorena et al. [57], is applied. It should be noted that light fields generated by the sun or multiple light sources simultaneously might be difficult to estimate accurately. Cloud cover and surface wave action may, for instance, change over the course of the survey, which is possibly one of the reasons why there are small differences in brightness between neighbouring transect lines in Figure 3a in addition to the previously mentioned blue shift. An implication of this is that removing the effect the water column has on perceived colour with absolute accuracy may prove challenging. However, for classification and mapping purposes, absolute intensity units are not necessarily essential. The important part is that the correct spectral relationship between OOIs at varying depths is restored, and this can likely be achieved, or at least substantially improved, by post-processing the data based on the georeferenced altitude, spectral light attenuation, and an approximate estimate of the in situ spectral downwelling irradiance. An interesting approach to future USV-based UHI surveys would be to record imagery during the night, and consequently rely solely on illumination from an active light source with known properties. This way, the impact of ambient light would be minimized, which potentially could make quantifying the light field an easier task.
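To make the kind of first-order compensation discussed above concrete, the sketch below applies a simple Beer-Lambert style correction along the sensor-to-seafloor path, given a per-pixel altitude and a measured spectral attenuation coefficient. It is our own simplified illustration with hypothetical names, not the radiative transfer model of Maritorena et al. [57], and it ignores the sea surface boundary and the water above the submerged sensor.

```python
import numpy as np

def depth_compensate(lu, attenuation, altitude, irradiance=None):
    """First-order spectral compensation for variable scanning altitude.

    lu:          measured upwelling radiance spectrum for one pixel (per band)
    attenuation: spectral attenuation coefficient c(lambda), m^-1 (per band)
    altitude:    sensor-to-seafloor distance for this pixel, m
    irradiance:  optional downwelling irradiance spectrum for normalisation

    A Beer-Lambert style sketch for the sensor-to-seafloor path only.
    """
    compensated = np.asarray(lu) * np.exp(np.asarray(attenuation) * altitude)
    if irradiance is not None:
        compensated = compensated / np.asarray(irradiance)
    return compensated
```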
The results displayed in Figure 5 and Figure 6 suggest that USV-based UHI has the potential to serve as a useful tool for detailed mapping surveys of shallow-water habitats in the future. For all frames (Figure 6a–d), there was an evident resemblance between the distribution patterns generated by SVM classification and the ground truth distribution pattern. Based on in situ observations and visual inspection of the RGB-visualized UHI data (Figure 3a), classification results from non-frame areas (Figure 5) also appeared reasonable. The visual interpretation was in agreement with the results from the confusion matrix analyses of the framed areas (Table 2 and Table 3), which revealed overall classification accuracies of ~90% accompanied by kappa coefficients suggesting substantial agreement. Taking into consideration the intrinsic spectral variability in the hyperspectral data caused by the slightly varying survey altitude, we would therefore characterize the classification results as satisfactory. The results of our pilot study demonstrate how powerful SVM classification can be if used properly. In the current study, an effort was made to optimize the RBF-SVM classifier’s γ and C values for both datasets, as opposed to using the values suggested by ENVI. When the values suggested by ENVI were used, the overall accuracy was nearly 10 percentage points lower than the results presented in Table 2 and Table 3. These findings emphasize the importance of fine-tuning classification parameters, and exemplify the potential impact they can have on classification results.
Although the overall classification results were encouraging, classification accuracy was not uniformly high across all spectral classes. Whereas >91% of the seafloor pixels were classified correctly, and coralline algae, F. serratus, and invertebrates were classified with producer accuracies of 73%–90%, pixels thought to represent green algal films were only classified with 56%–57% producer accuracy. For an interpretation of these results, it is important to consider that accuracy assessments were made based on comparisons with a manually labelled ground truth. The photographs of the deployed frames made identifying and labelling the main features of the framed areas easier, but they did not permit the labelling of smaller features with absolute certainty. Consequently, pixels whose identity was not considered entirely certain were binned into the general seafloor class, which implies that the said ground truth class may have contained some pixels that in reality represented other spectral classes. The results displayed in Table 2 and Table 3 support this hypothesis, in that user accuracy was estimated to be lower than producer accuracy for all classes, except for the seafloor class, where the opposite applied. Considering this, and looking back at Figure 6, it is not necessarily self-evident that all regions are more truthfully represented in the ground truth images than in the SVM classifications. A good example is the second frame (Figure 6b), in which it appears that the distribution and abundance of F. serratus could be more accurately estimated in the SVM classifications than in the ground truth. If this is the case, there is a possibility that certain accuracy estimates from the confusion matrices (Table 2 and Table 3) are in fact underestimates. Returning to the case of green algal classification accuracy, this was likely the spectral class most affected by the ground truth labelling bias, for several reasons. Firstly, green algal films were often present on small and inconspicuous calcareous surfaces, which made them more difficult to identify manually. Secondly, the perceived colour of the green algal films was influenced by the particular substrates on which they grew, making green algae a somewhat vague class spectrally. Lastly, green algae frequently grew on surfaces in tight association with coralline algae, to the point where pixel values may occasionally have been blended. What should be taken from these findings is that spectral classification is more clear-cut for some groups of marine organisms than for others. Additionally, an interesting future research topic would be further investigation of the relationship between manual ground truthing and results from the classification of underwater hyperspectral imagery.
It is worth noting that the spectral classes used in this pilot study were chosen specifically for the acquired dataset, and that the applicability of the classes of green algae, reference plate, and seafloor can be considered somewhat limited outside this particular proof of concept. The remaining three classes are, however, relevant from a management perspective. Coralline algae, F. serratus, and invertebrates (here represented by M. senile), for instance, all have equivalent counterparts defined in the Coastal and Marine Ecological Classification Standard (CMECS) [58]. Based on the presented findings, this suggests that data acquired using USV-based UHI potentially can be related to standardized frameworks, which is a possibility that should be further explored in the future.
In the final paragraph of the discussion, we will address the impact that the mode of reflectance conversion had on classification. For this pilot study, two simple conversion methods were used: IARR and FFR. In both methods, the spectrum of each pixel is divided by a predetermined reference spectrum; the only difference is whether the reference spectrum is based on the entire scene (IARR) or an area of flat reflectance (FFR). As shown in Figure 4, the different conversions yielded different signatures for equivalent classification targets. However, peaks and dips in the signatures from both IARR and FFR data were located at wavelengths similar to those of laboratory measurements, which was enough to relate the signatures to their biological origin (Figure 4a–c). Given that IARR- and FFR-conversion of underwater hyperspectral data is essentially the same technique, but with a different reference spectrum, the results from the SVM classification of the two datasets were not expected to differ significantly. As shown in Figure 5 and Figure 6 and Table 1, this was exactly the case. Although the effort of classifying two closely related datasets may seem redundant, in hindsight, these findings do in fact provide one important piece of information: Deploying a spectrally neutral reference plate at the survey site may not be necessary for future UHI surveys, which potentially makes data acquisition even less invasive and time-consuming. One of the goals of the work presented here was to map a shallow-water habitat using a new technique in a relatively simple fashion. Simplicity, robustness, and ease of use arguably represent desirable traits when it comes to mapping techniques, and in our pilot study, we believe we have shown that USV-based UHI is capable of fulfilling these criteria.

5. Conclusions

The findings from our pilot study suggest that USV-based UHI may serve as a valuable technique for shallow-water habitat mapping in the future. By deploying an underwater hyperspectral imager on a USV, we were able to acquire high-resolution, georeferenced hyperspectral imagery from a seafloor area that would have been hard to map at the same spatial resolution using other platforms. By converting the data to pseudo-reflectance and subsequently carrying out SVM classification, we were able to estimate the areal coverage of six spectral classes with an overall accuracy of ~90%. The classification results were achieved using simple means, which shows that USV-based UHI is a robust technique, capable of performing even when certain data elements are sub-optimal.
As this pilot study yielded promising SVM classification results for coralline algae, F. serratus, and invertebrates, we suggest that USV-based UHI should be attempted for biological shallow-water habitat mapping at other locations in the near future. Chennu et al. [34], for instance, recently showed the utility of diver-operated hyperspectral imaging in relation to high-resolution mapping of tropical corals, which suggests that tropical coral reefs could be an interesting survey target. Furthermore, detailed information from underwater hyperspectral imagery could be used to complement findings from aerial and space-borne imaging sensors, which ultimately could improve the management of coastal regions in a world where anthropogenic pressure continues to increase.

Author Contributions

Conceptualization, A.A.M. and G.J.; Data curation, A.A.M.; Formal analysis, A.A.M.; Funding acquisition, G.J. and M.L.; Investigation, A.A.M. and G.J.; Methodology, A.A.M. and G.J.; Project administration, A.A.M. and G.J.; Resources, G.J. and M.L.; Supervision, G.J. and M.L.; Visualization, A.A.M.; Writing—original draft, A.A.M.; Writing—review & editing, A.A.M., G.J. and M.L.

Funding

This work has been carried out at the Centre for Autonomous Marine Operations and Systems (NTNU AMOS). This work was supported by the Research Council of Norway through the Centres of Excellence funding scheme (grant no. 223254—NTNU AMOS).

Acknowledgments

We would like to thank Maritime Robotics AS for their crucial contributions with respect to integrating the underwater hyperspectral imager onto the USV, and operating the USV at the survey site. We would also like to thank Ecotone AS for their valuable help and input regarding processing of the underwater hyperspectral imagery.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. FAO. The State of World Fisheries and Aquaculture 2014; Food and Agriculture Organization: Rome, Italy, 2014. [Google Scholar]
  2. Kummu, M.; De Moel, H.; Salvucci, G.; Viviroli, D.; Ward, P.J.; Varis, O. Over the hills and further away from coast: Global geospatial patterns of human and environment over the 20th–21st centuries. Environ. Res. Lett. 2016, 11, 034010. [Google Scholar] [CrossRef]
  3. Barragán, J.M.; de Andrés, M. Analysis and trends of the world’s coastal cities and agglomerations. Ocean. Coast. Manag. 2015, 114, 11–20. [Google Scholar] [CrossRef]
  4. Crossland, C.J.; Baird, D.; Ducrotoy, J.P.; Lindeboom, H.; Buddemeier, R.W.; Dennison, W.C.; Maxwell, B.A.; Smith, S.V.; Swaney, D.P. The coastal zone—A domain of global interactions. In Coastal Fluxes in the Anthropocene, 1st ed.; Crossland, C.J., Kremer, H.H., Lindeboom, H., Crossland, J.I.M., Le Tissier, M.D.A., Eds.; Springer: Berlin/Heidelberg, Germany, 2005; pp. 1–37. [Google Scholar]
  5. Brown, C.J.; Smith, S.J.; Lawton, P.; Anderson, J.T. Benthic habitat mapping: A review of progress towards improved understanding of the spatial ecology of the seafloor using acoustic techniques. Estuar. Coast. Shelf Sci. 2011, 92, 502–520. [Google Scholar] [CrossRef]
  6. Wright, D.J.; Heyman, W.D. Introduction to the special issue: Marine and coastal GIS for geomorphology, habitat mapping, and marine reserves. Mar. Geod. 2008, 31, 223–230. [Google Scholar] [CrossRef]
  7. Gavazzi, G.M.; Madricardo, F.; Janowski, L.; Kruss, A.; Blondel, P.; Sigovini, M.; Foglini, F. Evaluation of seabed mapping methods for fine-scale classification of extremely shallow benthic habitats—Application to the Venice Lagoon, Italy. Estuar. Coast. Shelf Sci. 2016, 170, 45–60. [Google Scholar] [CrossRef]
  8. Battista, T.; O’Brien, K. Spatially prioritizing seafloor mapping for coastal and marine planning. Coast. Manag. 2015, 43, 35–51. [Google Scholar] [CrossRef]
  9. Hedley, J.D.; Roelfsema, C.M.; Chollett, I.; Harborne, A.R.; Heron, S.F.; Weeks, S.; Skirving, W.J.; Strong, A.E.; Eakin, C.M.; Christensen, T.R.L. Remote sensing of coral reefs for monitoring and management: A review. Remote Sens. 2016, 8, 118. [Google Scholar] [CrossRef]
  10. Purkis, S.J. Remote sensing tropical coral reefs: The view from above. Annu. Rev. Mar. Sci. 2018, 10, 149–168. [Google Scholar] [CrossRef]
  11. Kutser, T.; Miller, I.; Jupp, D.L.B. Mapping coral reef benthic substrates using hyperspectral space-borne images and spectral libraries. Estuar. Coast. Shelf Sci. 2006, 70, 449–460. [Google Scholar] [CrossRef]
  12. Roelfsema, C.M.; Phinn, S.R.; Jupiter, S.; Comley, J.; Albert, S. Mapping coral reefs at reef to reef-system scales, 10s-1000s km2, using object-based image analysis. Int. J. Remote Sens. 2013, 34, 6367–6388. [Google Scholar] [CrossRef]
  13. Leiper, I.A.; Phinn, S.R.; Roelfsema, C.M.; Joyce, K.E.; Dekker, A.G. Mapping coral reef benthos, substrates, and bathymetry, using compact airborne spectrographic imager (CASI) data. Remote Sens. 2014, 6, 6423–6445. [Google Scholar] [CrossRef]
  14. Garcia, R.A.; Lee, Z.; Hochberg, E.J. Hyperspectral shallow-water remote sensing with an enhanced benthic classifier. Remote Sens. 2018, 10, 147. [Google Scholar] [CrossRef]
  15. Hochberg, E.J.; Atkinson, M. Spectral discrimination of coral reef benthic communities. Coral Reefs 2000, 19, 164–171. [Google Scholar] [CrossRef]
  16. Phinn, S.R.; Roelfsema, C.M.; Dekker, A.G.; Brando, V.E.; Anstee, J.M. Mapping seagrass species, cover and biomass in shallow waters: An assessment of satellite multi-spectral and airborne hyper-spectral imaging systems in Moreton Bay (Australia). Remote Sens. Environ. 2008, 112, 3413–3425. [Google Scholar] [CrossRef]
  17. Kovacs, E.; Roelfsema, C.M.; Lyons, M.; Zhao, S.; Phinn, S.R. Seagrass habitat mapping: How do Landsat 8 OLI, Sentinel-2, ZY-3A, and Worldview-3 perform? Remote Sens. Lett. 2018, 9, 686–695. [Google Scholar] [CrossRef]
  18. Dekker, A.G.; Brando, V.E.; Anstee, J.M. Retrospective seagrass change detection in a shallow coastal tidal Australian lake. Remote Sens. Environ. 2005, 97, 415–433. [Google Scholar] [CrossRef]
  19. Halpern, B.S.; Walbridge, S.; Selkoe, K.A.; Kappel, C.V.; Micheli, F.; D’agrosa, C.; Bruno, J.F.; Casey, K.S.; Ebert, C.; Fox, H.E. A global map of human impact on marine ecosystems. Science 2008, 319, 948–952. [Google Scholar] [CrossRef]
  20. Amante, C.; Eakins, B.W. ETOPO1 1 Arc-Minute Global Relief Model: Procedures, Data Sources and Analysis; NOAA Technical Memorandum NESDIS NGDC-24; National Geophysical Data Center, NOAA: Boulder, CO, USA, 2009. [CrossRef]
  21. Pearlman, J.S.; Barry, P.S.; Segal, C.C.; Shepanski, J.; Beiso, D.; Carman, S.L. Hyperion, a space-based imaging spectrometer. IEEE Trans. Geosci. Remote Sens. 2003, 41, 1160–1173. [Google Scholar] [CrossRef]
  22. Lucke, R.L.; Corson, M.; McGlothlin, N.R.; Butcher, S.D.; Wood, D.L.; Korwan, D.R.; Li, R.R.; Snyder, W.A.; Davis, C.O.; Chen, D.T. Hyperspectral imager for the coastal ocean: Instrument description and first images. Appl. Opt. 2011, 50, 1501–1516. [Google Scholar] [CrossRef] [PubMed]
  23. Johnsen, G.; Volent, Z.; Dierssen, H.; Pettersen, R.; Ardelan, M.; Søreide, F.; Fearns, P.; Ludvigsen, M.; Moline, M. Underwater hyperspectral imagery to create biogeochemical maps of seafloor properties. In Subsea Optics and Imaging, 1st ed.; Watson, J., Zielinski, O., Eds.; Woodhead Publishing Limited: Cambridge, UK, 2013; pp. 508–535. [Google Scholar]
  24. Johnsen, G.; Ludvigsen, M.; Sørensen, A.; Aas, L.M.S. The use of underwater hyperspectral imaging deployed on remotely operated vehicles—Methods and applications. IFAC-PapersOnLine 2016, 49, 476–481. [Google Scholar] [CrossRef]
  25. Funk, C.J.; Bryant, S.B.; Heckman, P.J., Jr. Handbook of Underwater Imaging System Design (TP-303); Ocean Technology Dept., Naval Undersea Center: San Diego, CA, USA, 1972. [Google Scholar]
  26. Dumke, I.; Nornes, S.M.; Purser, A.; Marcon, Y.; Ludvigsen, M.; Ellefmo, S.L.; Johnsen, G.; Søreide, F. First hyperspectral imaging survey of the deep seafloor: High-resolution mapping of manganese nodules. Remote Sens. Environ. 2018, 209, 19–30. [Google Scholar] [CrossRef]
  27. Mogstad, A.A.; Johnsen, G. Spectral characteristics of coralline algae: A multi-instrumental approach, with emphasis on underwater hyperspectral imaging. Appl. Opt. 2017, 56, 9957–9975. [Google Scholar] [CrossRef]
  28. Sture, Ø.; Ludvigsen, M.; Søreide, F.; Aas, L.M.S. Autonomous underwater vehicles as a platform for underwater hyperspectral imaging. In Proceedings of the OCEANS 2017 MTS/IEEE, Aberdeen, Scotland, 19–22 June 2017. [Google Scholar] [CrossRef]
  29. Ødegård, Ø.; Mogstad, A.A.; Johnsen, G.; Sørensen, A.J.; Ludvigsen, M. Underwater hyperspectral imaging: A new tool for marine archaeology. Appl. Opt. 2018, 57, 3214–3223. [Google Scholar] [CrossRef]
  30. Dumke, I.; Purser, A.; Marcon, Y.; Nornes, S.M.; Johnsen, G.; Ludvigsen, M.; Søreide, F. Underwater hyperspectral imaging as an in situ taxonomic tool for deep-sea megafauna. Sci. Rep. 2018, 8, 12860. [Google Scholar] [CrossRef] [PubMed]
  31. Tegdan, J.; Ekehaug, S.; Hansen, I.M.; Aas, L.M.S.; Steen, K.J.; Pettersen, R.; Beuchel, F.; Camus, L. Underwater hyperspectral imaging for environmental mapping and monitoring of seabed habitats. In Proceedings of the OCEANS 2015, Genova, Italy, 18–21 May 2015. [Google Scholar] [CrossRef]
  32. Chennu, A.; Färber, P.; Volkenborn, N.; Al-Najjar, M.A.A.; Janssen, F.; de Beer, D.; Polerecky, L. Hyperspectral imaging of the microscale distribution and dynamics of microphytobenthos in intertidal sediments. Limnol. Oceanogr. Methods 2013, 11, 511–528. [Google Scholar] [CrossRef]
  33. Pettersen, R.; Johnsen, G.; Bruheim, P.; Andreassen, T. Development of hyperspectral imaging as a bio-optical taxonomic tool for pigmented marine organisms. Org. Divers. Evol. 2014, 14, 237–246. [Google Scholar] [CrossRef]
  34. Chennu, A.; Färber, P.; De’ath, G.; de Beer, D.; Fabricius, K.E. A diver-operated hyperspectral imaging and topographic surveying system for automated mapping of benthic habitats. Sci. Rep. 2017, 7, 7122. [Google Scholar] [CrossRef] [PubMed]
  35. Öztürk, M.; Vadstein, O.; Sakshaug, E. The effects of enhanced phytoplankton production on iron speciation and removal in mesocosm experiments in a landlocked basin of Hopavågen, Norway. Mar. Chem. 2003, 84, 3–17. [Google Scholar] [CrossRef]
  36. Van Marion, P. Ecological studies in Hopavågen, a landlocked bay at Agdenes, Sør-Trøndelag, Norway. Gunneria 1996, 71, 1–39. [Google Scholar]
  37. Maike, P.; Henry, P.Y.T. Evaluation of the use of surrogate Laminaria digitata in eco-hydraulic laboratory experiments. J. Hydrodyn. Ser. B 2014, 26, 374–383. [Google Scholar] [CrossRef]
  38. Paul, M.; Henry, P.Y.T.; Thomas, R.E. Geometrical and mechanical properties of four species of northern European brown macroalgae. Coast. Eng. 2014, 84, 73–80. [Google Scholar] [CrossRef]
  39. Sommer, F.; Hansen, T.; Feuchtmayr, H.; Santer, B.; Tokle, N.; Sommer, U. Do calanoid copepods suppress appendicularians in the coastal ocean? J. Plankton Res. 2003, 25, 869–871. [Google Scholar] [CrossRef]
  40. Stibor, H.; Vadstein, O.; Lippert, B.; Roederer, W.; Olsen, Y. Calanoid copepods and nutrient enrichment determine population dynamics of the appendicularian Oikopleura dioica: A mesocosm experiment. Mar. Ecol. Prog. Ser. 2004, 270, 209–215. [Google Scholar] [CrossRef]
  41. Vadstein, O.; Stibor, H.; Lippert, B.; Løseth, K.; Roederer, W.; Sundt-Hansen, L.; Olsen, Y. Moderate increase in the biomass of omnivorous copepods may ease grazing control of planktonic algae. Mar. Ecol. Prog. Ser. 2004, 270, 199–207. [Google Scholar] [CrossRef]
  42. Olsen, Y.; Reinertsen, H.; Sommer, U.; Vadstein, O. Responses of biological and chemical components in North East Atlantic coastal water to experimental nitrogen and phosphorus addition—A full scale ecosystem study and its relevance for management. Sci. Total Environ. 2014, 473, 262–274. [Google Scholar] [CrossRef] [PubMed]
  43. Van Nieuwerburgh, L.; Wänstrand, I.; Liu, J.; Snoeijs, P. Astaxanthin production in marine pelagic copepods grazing on two different phytoplankton diets. J. Sea Res. 2005, 53, 147–160. [Google Scholar] [CrossRef]
  44. Olsen, Y.; Agustí, S.; Andersen, T.; Duarte, C.M.; Gasol, J.M.; Gismervik, I.; Heiskanen, A.S.; Hoell, E.; Kuuppo, P.; Lignell, R. A comparative study of responses in plankton food web structure and function in contrasting European coastal waters exposed to experimental nutrient addition. Limnol. Oceanogr. 2006, 51, 488–503. [Google Scholar] [CrossRef]
  45. Teacă, A.; Ungureanu, C.; Mureșan, M. Assessment of diversity and distribution of benthic communities in Hopavågen Bay, Sletvik area (Norway). Geo Eco Marina 2017, 23, 103–119. [Google Scholar] [CrossRef]
  46. Kruse, F.A. Use of airborne imaging spectrometer data to map minerals associated with hydrothermally altered rocks in the Northern Grapevine Mountains, Nevada, and California. Remote Sens. Environ. 1988, 24, 31–51. [Google Scholar] [CrossRef]
  47. Aspinall, R.J.; Marcus, W.A.; Boardman, J.W. Considerations in collecting, processing, and analysing high spatial resolution hyperspectral data for environmental investigations. J. Geogr. Syst. 2002, 4, 15–29. [Google Scholar] [CrossRef]
  48. Cortes, C.; Vapnik, V. Support-vector networks. Mach. Learn. 1995, 20, 273–297. [Google Scholar] [CrossRef]
  49. Mountrakis, G.; Im, J.; Ogole, C. Support vector machines in remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2011, 66, 247–259. [Google Scholar] [CrossRef]
  50. Camps-Valls, G. Robust support vector method for hyperspectral data classification and knowledge discovery. IEEE Trans. Geosci. Remote Sens. 2004, 42, 1530–1542. [Google Scholar] [CrossRef]
  51. Wu, X. Top 10 algorithms in data mining. Knowl. Inf. Syst. 2008, 14, 1–37. [Google Scholar] [CrossRef]
  52. Melgani, F.; Bruzzone, L. Classification of hyperspectral remote sensing images with support vector machines. IEEE Trans. Geosci. Remote Sens. 2004, 42, 1778–1790. [Google Scholar] [CrossRef]
  53. Kavzoglu, T.; Colkesen, I. A kernel functions analysis for support vector machines for land cover classification. Int. J. Appl. Earth Obs. Geoinform. 2009, 11, 352–359. [Google Scholar] [CrossRef]
  54. Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V. Scikit-learn: Machine learning in Python. J. Mach. Learn. Res. 2011, 12, 2825–2830. [Google Scholar]
  55. Landis, J.R.; Koch, G.G. The measurement of observer agreement for categorical data. Biometrics 1977, 33, 159–174. [Google Scholar] [CrossRef] [PubMed]
  56. Morel, A. Optical properties of pure water and pure sea water. In Optical Aspects of Oceanography, 1st ed.; Jerlov, N.G., Nielsen, E.S., Eds.; Academic Press: Madison, WI, USA, 1974; pp. 1–24. [Google Scholar]
  57. Maritorena, S.; Morel, A.; Gentili, B. Diffuse reflectance of oceanic shallow waters: Influence of water depth and bottom albedo. Limnol. Oceanogr. 1994, 39, 1689–1703. [Google Scholar] [CrossRef]
  58. Federal Geographic Data Committee (FGDC). Coastal and Marine Ecological Classification Standard. Available online: www.fgdc.gov/standards/projects/cmecs-folder/CMECS_Version_06-2012_FINAL.pdf (accessed on 13 March 2019).
Figure 1. Worldwide distribution of shallow waters (≤10 m depth). The map was generated in ArcMap (v. 10.6; Esri Inc., Redlands, USA; http://desktop.arcgis.com/en/arcmap/), and is based on the ETOPO1 1 Arc-Minute Global Relief Model [20]. Projection: Mollweide, Datum: WGS 1984.
Figure 2. The OTTER USV (unmanned surface vehicle) after deployment in Hopavågen (63°35’N 9°32’E), Agdenes, Norway (a). The positions of UHI-4 (4th generation underwater hyperspectral imager) and the light source are shown with arrows. Panel (b) shows a close-up photograph of UHI-4.
Figure 3. RGB representation (R: 638 nm, G: 549 nm, B: 470 nm) of the underwater hyperspectral imagery from Hopavågen (63°35’N 9°32’E), Agdenes, Norway, with the positions of frames 1–4 highlighted (a). Panels (b) and (c) show the geographical extent and broader location, respectively, of the area in panel (a). The maps were generated in ArcMap (v. 10.6; Esri Inc., Redlands, USA; http://desktop.arcgis.com/en/arcmap/). The map in panel (b) is based on data from the Norwegian Mapping Authority, available at https://kartkatalog.geonorge.no/tema/dybdedata/8 under a CC BY 4.0 license. The map in panel (c) is based on the ETOPO1 1 Arc-Minute Global Relief Model [20]. Projection: UTM 32, Datum: WGS 1984.
Figure 4. Mean optical signatures of red coralline algae (a), the brown macroalga Fucus serratus (b), green algae (c), invertebrates (d), the reference plate (e), and the seafloor (f). All spectra have been normalized to their highest value. For each class, mean optical signatures are shown for both internal average relative reflectance (IARR) and flat field reflectance (FFR). For coralline algae, Fucus serratus, and green algae, reflectance measurements acquired in the laboratory using a JAZ spectrometer are shown for comparison.
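For readers unfamiliar with the two normalization schemes compared in Figure 4, the snippet below is a minimal sketch of how internal average relative reflectance (IARR) [46] and flat field reflectance (FFR) spectra can be derived from a hyperspectral data cube. The array shapes, the NumPy implementation, and the choice of reference-plate pixels as the flat field are illustrative assumptions and do not reproduce the processing chain used in this study.

```python
import numpy as np

def iarr(cube):
    """Internal average relative reflectance (IARR): divide every pixel
    spectrum by the scene-wide mean spectrum (cf. Kruse, 1988 [46]).
    cube has shape (rows, cols, bands)."""
    scene_mean = cube.reshape(-1, cube.shape[-1]).mean(axis=0)  # (bands,)
    return cube / scene_mean  # broadcast over rows and columns

def ffr(cube, plate_mask):
    """Flat field reflectance (FFR): divide every pixel spectrum by the mean
    spectrum of a 'flat field' target, here assumed to be the reference
    plate identified by a boolean mask of shape (rows, cols)."""
    plate_mean = cube[plate_mask].mean(axis=0)  # (bands,)
    return cube / plate_mean

# Example usage on a synthetic 100 x 50 pixel cube with 86 bands (400-700 nm)
cube = np.random.rand(100, 50, 86)
plate_mask = np.zeros((100, 50), dtype=bool)
plate_mask[:10, :10] = True  # hypothetical reference-plate pixels
iarr_cube, ffr_cube = iarr(cube), ffr(cube, plate_mask)
```

Normalizing each resulting spectrum to its maximum value, as done for the curves in Figure 4, then requires only one further division per spectrum.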
Figure 5. Results of the support vector machine (SVM) classifications of internal average relative reflectance (IARR)-converted (a) and flat field reflectance (FFR)-converted (b) underwater hyperspectral imagery from Hopavågen (63°35’N 9°32’E), Agdenes, Norway. The maps were generated in ArcMap (v. 10.6; Esri Inc., Redlands, USA; http://desktop.arcgis.com/en/arcmap/). Projection: UTM 32, Datum: WGS 1984.
Figure 6. Photographs, RGB-visualized (R: 638 nm, G: 549 nm, B: 470 nm) data from underwater hyperspectral imaging (UHI), manually labelled ground truths, and support vector machine (SVM) classifications of four framed seafloor areas (a)–(d). SVM classification results are shown for both the internal average relative reflectance (IARR) and flat field reflectance (FFR).
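As a rough illustration of the classification step behind Figures 5 and 6, the sketch below sets up a per-pixel support vector machine in scikit-learn, which is cited in the reference list [54]. The RBF kernel, the hyperparameter values, and all variable names are assumptions made for this example (kernel choices for hyperspectral SVMs are discussed in [52,53]); they are not the settings used by the authors.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

n_bands = 86  # 400-700 nm at ~3.5-nm resolution

# Placeholder training data: spectra drawn from the six spectral classes
# would normally be extracted from labelled regions of the mosaic.
X_train = np.random.rand(600, n_bands)   # (n_pixels, n_bands)
y_train = np.repeat(np.arange(6), 100)   # integer class labels, 6 classes

# RBF kernel with standardized bands is a common default for hyperspectral
# SVMs; the study's actual kernel and hyperparameters are not restated here.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X_train, y_train)

# Classify a full reflectance mosaic of shape (rows, cols, n_bands)
mosaic = np.random.rand(200, 300, n_bands)
labels = clf.predict(mosaic.reshape(-1, n_bands)).reshape(200, 300)
```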
Figure 7. An example of the impact that varying survey altitude may have on the perceived spectral properties of a given target class (here, coralline algae). Panel (a) shows internal average relative reflectance (IARR) signatures of coralline algae situated at shallow (~1-m depth) and deep (~2-m depth) locations within the current pilot study’s IARR raster mosaic (Figure 3a). Each signature is the mean of n = 25 pixels. As shown, the positions of spectral peaks and dips are comparable, but different wavelengths are attenuated disproportionately with increasing depth (water attenuates red wavelengths more strongly than blue wavelengths). Panels (b) and (c) respectively show RGB representations (R: 638 nm, G: 549 nm, B: 470 nm) of the pixels from which the shallow and deep coralline algal signatures were obtained (pink squares).
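The depth dependence illustrated in Figure 7 can be summarized by the standard exponential decay of downwelling irradiance with optical path length; this is a simplified textbook relation (cf. [56,57]), not the correction actually applied in this study:

```latex
E_d(z,\lambda) = E_d(0^{-},\lambda)\, e^{-K_d(\lambda)\, z}
```

Here, E_d(z, λ) is the downwelling irradiance at depth z and K_d(λ) is the diffuse attenuation coefficient. Because K_d is markedly larger at red than at blue–green wavelengths in relatively clear coastal water, a longer optical path between imager and seafloor suppresses the red end of a target's apparent spectrum faster than the blue end, which is the pattern seen for the shallow versus deep coralline algae signatures in Figure 7a.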
Table 1. Estimated areal coverage of six spectral classes based on support vector machine (SVM) classification of the surveyed area. The table shows estimates from classifications of both internal average relative reflectance (IARR) data and flat field reflectance (FFR) data.
| Spectral Class | IARR Areal Coverage (m²) | IARR % Coverage | FFR Areal Coverage (m²) | FFR % Coverage |
|---|---|---|---|---|
| Coralline algae | 12.37 | 7.03 | 12.42 | 7.06 |
| Fucus serratus | 13.93 | 7.92 | 13.36 | 7.60 |
| Green algae | 11.78 | 6.70 | 12.29 | 6.99 |
| * Invertebrates | 0.62 | 0.35 | 0.58 | 0.33 |
| Reference plate | 0.36 | 0.20 | 0.37 | 0.21 |
| Seafloor | 136.78 | 77.79 | 136.82 | 77.81 |
| Total | 175.84 | 100 | 175.84 | 100 |
* Dominated by the plumose anemone, Metridium senile.
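The coverage figures in Table 1 follow directly from per-class pixel counts in the classified mosaic multiplied by the ground area of a single pixel. The sketch below illustrates the arithmetic under the assumption of a nominal 0.5 cm × 0.5 cm ground pixel; the label array and class ordering are placeholders, not the study's actual data.

```python
import numpy as np

PIXEL_AREA_M2 = 0.005 * 0.005  # nominal 0.5 cm x 0.5 cm ground pixel = 2.5e-5 m2

class_names = ["Coralline algae", "Fucus serratus", "Green algae",
               "Invertebrates", "Reference plate", "Seafloor"]

# labels: 2-D array of predicted class indices from the SVM classification
labels = np.random.randint(0, 6, size=(2000, 3500))  # placeholder classification

counts = np.bincount(labels.ravel(), minlength=len(class_names))
areas_m2 = counts * PIXEL_AREA_M2
percent = 100.0 * areas_m2 / areas_m2.sum()

for name, a, p in zip(class_names, areas_m2, percent):
    print(f"{name:16s} {a:8.2f} m2 {p:6.2f} %")
```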
Table 2. Confusion matrix for support vector machine (SVM) classification of internal average relative reflectance (IARR) data from four frame-delimited seafloor areas (Figure 6a–d), and corresponding accuracy assessments.
| Predicted Spectral Class | Coralline Algae | Fucus serratus | Green Algae | Invertebrates | Seafloor | Total | Producer Accuracy (%) | User Accuracy (%) |
|---|---|---|---|---|---|---|---|---|
| Coralline algae | 1092 | 3 | 62 | 0 | 700 | 1857 | 75.89 | 58.80 |
| Fucus serratus | 0 | 928 | 0 | 0 | 334 | 1262 | 82.64 | 73.53 |
| Green algae | 24 | 0 | 407 | 0 | 440 | 871 | 56.22 | 46.73 |
| Invertebrates | 0 | 12 | 0 | 125 | 34 | 171 | 89.93 | 73.10 |
| Seafloor | 323 | 180 | 255 | 14 | 21,707 | 22,479 | 93.50 | 96.57 |
| Total | 1439 | 1123 | 724 | 139 | 23,215 | 26,640 | | |
Ground-truth classes (columns) are given in pixels.
Overall classification accuracy: 91.06%; Kappa coefficient: 0.65.
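The accuracy figures in Tables 2 and 3 follow from standard confusion-matrix arithmetic: producer accuracy is the column-wise and user accuracy the row-wise fraction of correctly classified pixels, overall accuracy is the matrix trace divided by the total pixel count, and Cohen's kappa compares the observed agreement with the agreement expected by chance [55]. The sketch below reproduces the IARR values of Table 2 from its confusion matrix; it is an independent check, not the authors' evaluation script.

```python
import numpy as np

# Rows: predicted class, columns: ground truth (Table 2, IARR).
# Order: coralline algae, Fucus serratus, green algae, invertebrates, seafloor.
cm = np.array([
    [1092,    3,   62,   0,   700],
    [   0,  928,    0,   0,   334],
    [  24,    0,  407,   0,   440],
    [   0,   12,    0, 125,    34],
    [ 323,  180,  255,  14, 21707],
])

total = cm.sum()
producer_acc = 100 * np.diag(cm) / cm.sum(axis=0)  # per ground-truth class
user_acc = 100 * np.diag(cm) / cm.sum(axis=1)      # per predicted class
overall_acc = 100 * np.trace(cm) / total           # 91.06 %

# Cohen's kappa: (observed - chance agreement) / (1 - chance agreement)
p_o = np.trace(cm) / total
p_e = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total**2
kappa = (p_o - p_e) / (1 - p_e)                    # ~0.65
```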
Table 3. Confusion matrix for support vector machine (SVM) classification of flat field reflectance (FFR) data from four frame-delimited seafloor areas (Figure 6a–d), and corresponding accuracy assessments.
| Predicted Spectral Class | Coralline Algae | Fucus serratus | Green Algae | Invertebrates | Seafloor | Total | Producer Accuracy (%) | User Accuracy (%) |
|---|---|---|---|---|---|---|---|---|
| Coralline algae | 1054 | 4 | 53 | 0 | 778 | 1889 | 73.25 | 55.80 |
| Fucus serratus | 0 | 1028 | 0 | 2 | 715 | 1745 | 91.54 | 58.91 |
| Green algae | 14 | 0 | 415 | 0 | 479 | 908 | 57.32 | 45.70 |
| Invertebrates | 0 | 0 | 0 | 123 | 25 | 148 | 88.49 | 83.11 |
| Seafloor | 371 | 91 | 256 | 14 | 21,218 | 21,950 | 91.40 | 96.67 |
| Total | 1439 | 1123 | 724 | 139 | 23,215 | 26,640 | | |
Ground-truth classes (columns) are given in pixels.
Overall classification accuracy: 89.48%; Kappa coefficient: 0.62.
