Review

sUAS Monitoring of Coastal Environments: A Review of Best Practices from Field to Lab

1 Department of Engineering, East Carolina University, Greenville, NC 27858, USA
2 Department of Geography, Planning & Environment, East Carolina University, Greenville, NC 27858, USA
3 Department of Construction Management, East Carolina University, Greenville, NC 27858, USA
* Author to whom correspondence should be addressed.
Drones 2022, 6(6), 142; https://doi.org/10.3390/drones6060142
Submission received: 12 May 2022 / Revised: 31 May 2022 / Accepted: 6 June 2022 / Published: 8 June 2022

Abstract

Coastal environments are some of the most dynamic environments in the world. As they are constantly changing, so are the technologies and techniques we use to map and monitor them. The rapid advancement of sUAS-based remote sensing calls for rigorous field and processing workflows so that more reliable and consistent sUAS projects of coastal environments are carried out. Here, we synthesize the best practices to create sUAS photo-based surveying and processing workflows that can be used and modified by coastal scientists, depending on their project objective. While we aim to simplify the complexity of these workflows, we note that the nature of this work is a craft that carefully combines art, science, and technology. sUAS LiDAR is the next advancement in mapping and monitoring coastal environments. Therefore, future work should consider synthesizing best practices to develop rigorous field and data processing workflows used for sUAS LiDAR-based projects of coastal environments.

1. Introduction

Coastal zones are land surfaces influenced by marine processes, which makes them some of the most dynamic environments in the world. Both marine (e.g., waves and tides) and atmospheric (e.g., precipitation and winds) processes create a variety of landforms, ranging from gently sloping sandy beaches to high rocky cliffs. Coastal zones are important because they support many complex ecosystems, from tidal swamps and marshes that extend from the landward limit of waves to coral reefs that extend to the seaward limit where the waves interact with the seabed. Coastal zones not only provide essential ecosystem services, such as shoreline protection, improved water quality, fisheries resources, and food and habitat for wildlife, but are also attractive to human populations for their recreational opportunities [1]. For this reason, coastal scientists, engineers, and managers prioritize mapping and monitoring these changing environments.
As our coastal environments constantly change, so do the technologies and techniques that we use to map and monitor them. Traditionally, passive sensors mounted on satellites and occupied aircraft were used to monitor coastal environments [2]. However, with the rapid advancement of small Unoccupied Aircraft Systems (sUAS), coastal monitoring is now more affordable and efficient [3,4,5,6,7], making sUAS highly attractive “on-demand remote sensing devices” [7]. The ability to choose sensors and to control temporal resolution makes sUAS excellent for mapping and monitoring small coastal areas. For example, many coastal habitats need on-demand remote sensing to capture data at certain phases of the tide, such as oyster reef beds exposed at low tide [8,9,10]. On-demand remote sensing is also needed to understand the behavior of sea life, such as rays [6], sea turtles [11,12], and whales [9]. In addition to the ability to control temporal resolution, sUAS help to reduce the time needed to collect in situ data in challenging coastal environments (e.g., by as much as a week) [4,6,7]. While sUAS have other benefits, such as high resolution imagery, dense point clouds, and the ability to capture data simultaneously with in situ measurements, their use can be especially challenging in coastal environments. These challenges include, but are not limited to, environmental conditions such as weather, sun glint on water, and turbidity [8]; the distribution and placement of Ground Control Points (GCPs) [5,6]; the phase of the tide [4,5,6,7,8]; battery capacity limiting sUAS projects to small coastal areas [6,7,13]; and technical issues related to the computing power and skill needed for image processing [5,7].
To address these challenges, this review article aims to synthesize and illustrate best practices used to collect and process the sUAS data of coastal environments. To assist with this objective, we reviewed recent review articles that focus on sUAS applications in coastal environments and include topics such as regulations, sensors, platforms, calibration, validation, and data processing. Based on the best practices identified from these review articles in addition to the current literature, we illustrate a step-by-step workflow that can be used for either conducting sUAS surveys or sUAS data processing, or both. It is anticipated that this review will assist coastal scientists, engineers, and managers by providing flexible workflows that help guide consistent and reliable sUAS projects of coastal environments.

2. Previous Reviews of sUAS Monitoring of Coastal Environments

The rapid advancement of sUAS-based remote sensing has resulted in several recent review articles with a focus on coastal environments. Two academic research databases were used in this study to locate these articles. A Scopus search using the terms ‘coastal’ and ‘drone’ returned 353 results, and a Google Scholar search using the same terms returned 33,800 results. Both database searches were then limited to review documents, which produced 16 articles from Scopus and 665 articles from Google Scholar. We examined each abstract to verify that each review article related to our overall goal of understanding current sUAS applications in coastal environments. We were interested in synthesizing and illustrating the best practices used to collect and process sUAS data in coastal environments.
It should be noted that several of the review articles focused on topics not related to our goals. Some articles from the Scopus search included only a short discussion of how sUAS can be used in coastal environments, such as for non-destructive testing of bridges [14], to benefit marine citizen science [15], or to study human behavior such as recreational fishing or visitor use of public land [16]. Other articles from Scopus reviewed a different topic altogether, such as the application of space-borne synthetic aperture radar to offshore wind resources [17]. The Google Scholar search results were similar, in that some articles reviewed a different topic, such as the application of sUAS to protected areas [18], monitoring marine environments with autonomous underwater vehicles [19], and the use of sUAS to conduct water sampling in freshwater environments [20]. These articles were removed from the literature base, after which five remained from Scopus and an additional four remained from Google Scholar. While thoroughly examining each article, we identified one further review article in the references that was not categorized as a review by the search engines [3]. All 10 of these articles focused on our interests above [3,4,5,6,7,8,9,11,12,13].

2.1. sUAS Regulations

Most of the review articles mention that a critical component of any sUAS project is the employment of a commercial pilot who is knowledgeable of the current flight restrictions and regulations placed by their respective country’s aviation authorities [3,4,5,6,7,9,11]. The authors of [6] expand on this requirement, noting that a commercial pilot should also know the three common approaches to sUAS regulations: (1) the Outright or Effective Ban, as used in countries such as Cuba to ban sUAS flights completely; (2) the Visual Line of Sight (VLOS)-dependent approach, as used by Australia and the European Union to limit the sUAS flight to be within the pilot’s VLOS; and (3) the Permissive approach, where regulations are reasonable and less restrictive, such as in Sweden. Since flight restrictions vary across the globe and continue to change as the technology advances, it is often difficult for commercial pilots to keep up with these changes [11,13]. This is especially the case for countries such as the United States that implement the VLOS-dependent approach, which requires scientific pilots to obtain a license. Once qualified as a commercial pilot for a respective country, it is good practice to periodically check the sUAS regulations to ensure no changes were implemented, especially if flights are not conducted on a regular basis (e.g., once or twice a year). Regulations for the VLOS-dependent approach can be found at each aviation authority’s website, such as the United States (https://www.faa.gov/uas/ (accessed on 7 May 2022)), Costa Rica (https://www.dgac.go.cr/ (accessed on 7 May 2022)), the European Union (https://www.easa.europa.eu/domains/civil-drones (accessed on 7 May 2022)), New Zealand (https://www.aviation.govt.nz/ (accessed on 7 May 2022)), and Japan (https://www.mlit.go.jp/en/koku/index.html (accessed on 7 May 2022)).

2.2. Cameras and Platforms

The choice of a camera sensor should depend on the sUAS project objectives, although it can be limited by the project’s budget and aircraft (multirotor or fixed-wing). The review articles cover a range of camera sensors, including basic Red, Green, Blue (RGB) sensors and more advanced sensors capable of measuring wavelengths not visible to the human eye, such as multispectral (red edge and near infrared), hyperspectral, and thermal infrared. Basic RGB camera sensors are most often used because they are suitable for most coastal applications due to their low cost, light weight, and high resolution [4,5]. The visible RGB bands of the electromagnetic spectrum are effective at capturing the behavior of marine vertebrates [6,9,11,12], as well as accurate 3D changes in coastal geomorphology derived from structure-from-motion (SfM) photogrammetry techniques. However, additional portions of the electromagnetic spectrum are needed to capture certain phenomena. For example, multispectral and hyperspectral sensors can capture the health and distribution of different wetland [3,4,5,7,8] and coral species [3,9], algal blooms, and water quality [4]. Thermal infrared cameras provide images of temperature, which are used to assess animal populations and water quality [4,6,8,9,11,13]. While the choice of sensor should depend on the sUAS project objectives, the choice of platform should depend on its ability to carry the sensor and meet the project’s quality specifications (e.g., image resolution and accuracy).
sUAS platforms are either multirotor or fixed-wing aircraft. Fixed wings are the most efficient at surveying large coastal areas due to their increased flight time, while multirotors are best for small coastal areas because their flight height and speed are easier to control [3,4,5,7]. However, when surveying several kilometers of coastline, the efficiency of fixed wings decreases while project costs increase. The higher payload and better stabilization offered by multirotors make them better suited for mounting more complex sensors, such as hyperspectral cameras [3,7]. Nonetheless, the technology is advancing, and complex sensors are now being designed specifically for fixed wings, such as the senseFly Parrot Sequoia+.

2.3. Calibration Procedures

Calibration is an important process that needs to be considered when conducting a sUAS project. Calibration is performed by comparing a test measurement with a calibration measurement standard of known accuracy. Calibration can refer to the platform, the RAW (uncompressed, which is preferred) imagery, the camera, and the final products (i.e., orthomosaics, 3D dense clouds, and Digital Elevation Models (DEMs)). In terms of the platform itself, the Inertial Measurement Unit (IMU), compass, and gimbal need to be calibrated often to ensure that the sUAS operates within the software tolerances. The IMU corrects for the platform’s yaw, pitch, and roll, allowing it to balance while moving in different directions. One review article mentions that multirotors often calibrate their IMU at startup, which can be a problem if launching from a moving platform, such as a boat [9]. This issue can often be resolved by first launching the aircraft on land using motion-boot or boat-mode calibration sequences [9]. None of the review articles mention compass or gimbal calibration, which are important for accuracy and safety. Compass calibration aligns the platform’s flight system with the Earth’s magnetic north. We prefer to calibrate our platforms’ compasses prior to each flight to help prevent unwelcome flyaways and unstable landings. The gimbal’s motors support and stabilize the camera about an axis so that it can move freely when taking images at different angles, which strengthens the image network’s geometry. We typically calibrate our gimbal after any rough handling of the platform, such as a harsh landing or even an unwanted crash. A commercial pilot can often perform these calibrations using the respective platform’s software.
In addition to platform calibration, both image and camera calibration need to be considered. Image calibration, or radiometric calibration, is necessary when using multispectral and hyperspectral cameras because the raw Digital Numbers (DN) must be converted into reflectance spectra. Radiometric calibration is also required when conducting repeat flights under different weather conditions, so that the imagery shares a common reflectance scale, whether captured with multispectral, hyperspectral, or RGB cameras. Sensor noise can also contribute to radiometric variability [7]. We note that radiometric calibration is required when comparing reflectance spectra data through time, such as when monitoring different vegetation communities. However, if the sUAS project goal is to monitor topographic changes derived from SfM techniques, the 3D point clouds are compared rather than the reflectance spectra data. While radiometric calibration is an important process to consider in many sUAS projects, only 2 out of the 10 review articles discussed radiometric calibration procedures [4,7]. These include the use of calibration targets that are distributed prior to the sUAS survey, so that the corresponding imagery (test measurement) can be compared with the calibration target (calibration measurement of standard accuracy) [4,7]. Spectroradiometers are also used in flight or in the field to collect upwelling radiance and downwelling irradiance data [4]. Some camera manufacturers, such as MicaSense, offer calibration targets, and software options for radiometric calibration include Agisoft and Pix4D.
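To make the target-based correction concrete, the following is a minimal sketch of the empirical line approach for a single band; the reflectance and DN values are hypothetical, and a separate fit is required for each band and each flight.
```python
# A minimal empirical-line sketch: fit a linear DN -> reflectance
# relationship from calibration targets with known reflectance.
# All numeric values below are illustrative, not measured data.
import numpy as np

# Lab-measured reflectance of the targets (e.g., dark and bright panels)
target_reflectance = np.array([0.05, 0.23, 0.51])
# Mean raw DN sampled from the imagery over the same targets
target_dn = np.array([4120.0, 18930.0, 41760.0])

# np.polyfit with degree 1 returns the gain (slope) and offset (intercept)
gain, offset = np.polyfit(target_dn, target_reflectance, 1)

def dn_to_reflectance(dn):
    """Apply the per-band empirical line correction."""
    return gain * dn + offset

print(dn_to_reflectance(np.array([10000.0, 30000.0])))
```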
A discussion of camera calibration is missing from all 10 review articles. Camera calibration is a process used to describe the camera parameters needed to reliably relate the 2D image coordinate system to the 3D real-world coordinate system. The camera model can be determined using a pre-calibration or self-calibration procedure. Pre-calibration is more involved because it requires capturing images, taken at many different angles, of a calibration pattern whose geometry in 2D or 3D space is precisely known [21]; this is performed prior to the bundle adjustment in the SfM workflow [21,22]. A workflow for determining a pre-calibration camera model using a 2D geometric pattern is available in MATLAB (https://www.mathworks.com/help/vision/ug/using-the-single-camera-calibrator-app.html (accessed on 7 May 2022)) and Agisoft (https://agisoft.freshdesk.com/support/solutions/articles/31000160059-lens-calibration-using-chessboard-pattern-in-metashape (accessed on 7 May 2022)). A caveat is that the geometric pattern should be captured at a distance roughly equal to the flight height, which is not suitable for long-range capture scenarios. On the other hand, self-calibration procedures are more flexible because they do not require observation of a geometric pattern and are carried out automatically during the bundle adjustment. However, this approach also has its cons, as many parameters must be estimated, which does not always yield accurate results [21]. The pre-calibration and self-calibration approaches each have their pros and cons, and, regardless of which approach is used for a sUAS project, we agree with [23] that sufficient image metadata, such as the camera make and model, ISO, shutter speed, aperture, and focal length, need to be reported to allow confidence and reproducibility in the results.
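As an illustration of the pre-calibration procedure, the following is a minimal sketch of the checkerboard routine [21] using OpenCV; the image folder and board geometry are hypothetical, and the board should be imaged at a distance close to the planned flight height.
```python
# A minimal pre-calibration sketch using OpenCV's checkerboard routine,
# in the spirit of [21,22]. Paths and board geometry are illustrative.
import glob
import cv2
import numpy as np

board = (9, 6)  # inner corners per row and column of the printed pattern
objp = np.zeros((board[0] * board[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2)

obj_pts, img_pts = [], []
for path in glob.glob("calib_images/*.JPG"):  # hypothetical image folder
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, board)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

# Estimate focal length, principal point, and distortion coefficients
rms, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts,
                                         gray.shape[::-1], None, None)
print("Reprojection RMS (px):", rms)
print("Camera matrix:\n", K)
```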
Geometric calibration (also known as georeferencing) is the process used to relate the 2D image coordinate system to a 3D real-world coordinate system. The calibration measurements of known accuracy relative to an established 3D coordinate system are referred to as GCPs. GCPs are often surveyed with Real-Time Kinematic Global Navigation Satellite Systems (RTK-GNSS) prior to or after a sUAS survey, so that the corresponding imagery (test measurement) can be matched with the 3D ground coordinates (calibration measurement of standard accuracy) in the SfM workflow. Only 1 out of the 10 review articles mentions the use of GCPs for registering the sUAS imagery to a 3D coordinate system [7]. In addition to accurate measurement of the GCPs, their number and distribution throughout the study area depend on the quality specifications of the sUAS project. The number of GCPs should be sufficient that half can be designated as control (calibration) and the other half as quality (validation) points to assess the reliability of the georeferencing [24].

2.4. Validation Procedures

Validation is the process of determining whether the test measurement meets the standard requirements for the intended sUAS project’s purpose. A total of 3 out of the 10 review articles discussed validation as the process of comparing the RTK-GNSS locations of field samples of the phenomenon to be mapped (e.g., coastal habitat, water quality, etc.) with the classified coastal habitat or predicted water quality values [4,5,7]. Validation in this context is used to assess, for example, how well a machine learning model predicts water quality on unseen data. However, before we can validate such a model, we must validate the geometric calibration results.

2.5. Literature Review Gaps

The best practices used to conduct sUAS surveys and to process sUAS photo-based data in coastal environments are missing from the review literature. Table 1 highlights the gaps: data collection, calibration, validation, data processing, and software. We intend to fill this gap by presenting two workflows to help guide coastal scientists. The following section compiles the current best practices as a series of activities necessary to complete a successful sUAS field campaign, as well as a workflow to process sUAS photo-based data.

3. sUAS Photo-Based Surveys of Coastal Environments: Best Practices

3.1. sUAS Photo-Based Surveys

Good data can easily be lost due to poor field protocols. Here, we expand on the existing literature and create a Red, Green, Blue (RGB) workflow organized in three phases using best practices (Figure 1). This workflow can be modified to map and monitor different coastal phenomena, such as coastal habitats, beach-nesting species, topography, and nearshore bathymetry.
In phase red of the workflow, the scientific pilot follows sUAS regulations while obtaining any special permissions from landowners and managers to access the survey area [25,26]. While sUAS regulations differ among nations, [26] identified three general aspects of sUAS regulations: (1) targeting regulated use of airspace; (2) imposing operational limitations; and (3) administration of flight permissions, pilot licenses, and authorization of data collection. This demonstrates the importance of knowing these rules before beginning a sUAS survey. The next step in phase red is to determine the project objective(s). In our example, the project objective is to conduct repeat annual surveys to monitor vertical changes in newly restored oyster reefs. With the project objective identified, the quality specifications of minimum vertical accuracy (mean bias) and precision (1σ or Root Mean Square Error (RMSE), assuming the data follow a normal distribution) can be justified. This requires some background knowledge of the features being mapped. In our example, intertidal oyster reefs can grow 10 to 13 cm yr−1 vertically [27], so one approach is to determine the required precision by dividing the minimum expected annual vertical change of 10 cm by two (10/2 = 5 cm). The required mean bias should be zero. This allows the minimum expected annual vertical change of 10 cm to be measured more reliably. The next step in phase red of the workflow is to determine the optimal resolution based on the minimum object to be mapped, which, in our example, is a 2 × 4 cm oyster. Traditionally, the Minimum Mapping Unit (MMU), which is the smallest feature to be mapped, is useful when considering the Ground Sampling Distance (GSD) [28]. The GSD is the ground distance between the centers of two adjacent pixels projected onto an orthogonal plane. In our example, the GSD should be at most 1 cm, so that the shape and size of an oyster can be resolved. GSD is a function of flight height, focal length, and sensor resolution, and can be calculated automatically in most flight planning software, such as DJI GS Pro (the underlying arithmetic is sketched below).
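For illustration, the following is a minimal sketch of the GSD arithmetic; the focal length, sensor width, and image width are illustrative values for a 20 MP one-inch-sensor camera, not a recommendation for a particular platform.
```python
# A minimal GSD sketch. The camera values in the example are illustrative
# (roughly a 20 MP one-inch sensor); substitute your own camera's specs.

def gsd_cm(flight_height_m: float, focal_length_mm: float,
           sensor_width_mm: float, image_width_px: int) -> float:
    """Ground Sampling Distance (cm/pixel) for a nadir image."""
    return (sensor_width_mm * flight_height_m * 100.0) / (focal_length_mm * image_width_px)

def height_for_gsd(target_gsd_cm: float, focal_length_mm: float,
                   sensor_width_mm: float, image_width_px: int) -> float:
    """Flight height (m) needed to achieve a target GSD."""
    return (target_gsd_cm * focal_length_mm * image_width_px) / (sensor_width_mm * 100.0)

# Example: the 1 cm GSD needed to resolve a 2 x 4 cm oyster
h = height_for_gsd(1.0, focal_length_mm=8.8, sensor_width_mm=13.2, image_width_px=5472)
print(f"Fly at or below ~{h:.0f} m for a 1 cm GSD")  # ~36 m with these specs
```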
Once regulations, project objective(s), quality specifications, and GSD are defined in phase red, the next phase in the workflow is phase green, where the first step is to identify the appropriate sensor. RGB sensors with fine resolution (≥20 MP) and a wide range of global shutter speed, lens aperture, and ISO settings should be considered for accurate topographic mapping [16]. An RGB sensor is a reasonable choice for our example project objective, monitoring vertical changes in oyster reefs. Currently, multispectral and hyperspectral sensors produce non-aligned bands and low-resolution images [26,29], making them less suitable for fine-resolution topographic mapping. However, if our project objective were to classify coastal vegetation at the species level or estimate water quality, multispectral or hyperspectral sensors would be preferred because they provide more spectral information for training a machine or deep learning classification model. Although vendors typically carry out sensor calibration, an ongoing assessment in the lab is also necessary [26]. The next step in phase green of the workflow is to determine the aircraft on which the sensor will be mounted. Multirotors are best suited for our example project due to their vertical takeoff and landing in complex coastal terrain, their ability to fly slowly and stop to capture imagery (reducing motion blur) and to hover at low flight heights for close data capture of small objects, and the project’s small spatial coverage (e.g., <2 ha), which suits a low battery capacity.
A field reconnaissance of the survey site is critical for safety, which is the next step in phase green. This visual ground assessment helps to identify any flight obstacles, potential takeoff and landing sites, and accessible areas with full visibility of GCPs [26]. It is also useful to take good field notes that map out these safety concerns to assist with proper placement of the GCPs and radiometric calibration targets (if comparing spectral information). Important considerations in GCP placement include the number (more GCPs tend to reduce vertical errors), an even spatial distribution that also reflects the variations in topography, and accurate and precise measurement, such as with RTK-GNSS [26]. One helpful tool for managing the placement of GCPs is the open-source PhenoFly Planning Tool [26,30]. Combining this tool with field reconnaissance notes supports careful GCP planning in the coastal zone. Additional considerations for our example project objective are the use of semi-permanent elevated GCPs on a platform [7] on the marsh side and seaward side of the oyster reefs. Additionally, the dimensions of the GCPs should be about 10 times the GSD [26,31]; with our required GSD of 1 cm, the GCPs should be no less than 10 cm across. In addition to the GCPs, radiometric calibration target(s) should be considered if comparing reflectance spectra data through time, such as when monitoring different vegetation communities.
Once the sensor, aircraft, field reconnaissance, and GCP placement are determined in phase green of the workflow, the last phase is the blue phase, which involves planning and conducting the sUAS surveys. The first step in phase blue is flight planning, which can be performed a few days before or on the day of the survey. The options for flight planning software are well covered in the literature [5,7,26]. The open-source PhenoFly Planning Tool is unique because, in addition to planning the number and placement of GCPs, the software allows for flight mission planning that considers many important parameters, such as the sensor and lens, the flight height needed to achieve the required GSD, the exposure value on a sunny or cloudy day, side and end lap, motion blur, and the flight path [30]. The choice of autopilot software, such as DJI GS Pro and DroneDeploy, should be based on its compatibility with the aircraft and on pilot control over as many of these parameters as possible. In setting the GSD, the software optimizes flying speed and height based on the aircraft and camera. To reduce motion blur, we use a capture mode that hovers and captures at each point. For homogeneous coastal areas, such as beaches and sandy bottoms in the intertidal zone, we prefer a cross flight pattern, a tilted camera, and 80% front and side overlap (this requires more images and longer flights); the photo and flight-line spacing implied by these settings is sketched below. These parameter settings help prevent too few tie points from being identified in the SfM process and improve the self-calibration process [26]. Another important consideration in this step is to set the image file format to RAW rather than JPEG, because RAW files avoid the compression that lowers image quality [28]. It is wise to save the mission in the autopilot software and adjust it as needed in the field.
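The following is a minimal sketch of how the GSD and overlap settings translate into photo spacing and flight-line spacing for a nadir camera; the pixel counts reuse the illustrative 20 MP sensor from the GSD sketch above.
```python
# A minimal sketch relating overlap settings to exposure spacing and
# flight-line spacing for a nadir camera; pixel counts are illustrative.

def footprint_m(gsd_cm: float, pixels: int) -> float:
    """Ground footprint (m) covered by one image dimension."""
    return gsd_cm / 100.0 * pixels

def spacing_m(footprint: float, overlap: float) -> float:
    """Distance (m) between exposures or flight lines at a given overlap."""
    return footprint * (1.0 - overlap)

along_track = footprint_m(1.0, 3648)   # image height drives front overlap
across_track = footprint_m(1.0, 5472)  # image width drives side overlap
print(f"Trigger every {spacing_m(along_track, 0.80):.1f} m along-track")  # ~7.3 m
print(f"Flight lines {spacing_m(across_track, 0.80):.1f} m apart")        # ~10.9 m
```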
Additional considerations in the blue phase include weather and tides, which should be checked prior to and on the day of the survey. Many aircraft cannot sustain winds over 20 mph, nor should they be flown in the rain. We prefer a calm, cloudy day to reduce shadows amongst objects on the surface. In addition, the flight needs to be timed to low tide when collecting data in the intertidal zone. Useful apps can be used to check airspace notifications, weather, and tides prior to and during flight operations. In the US, these include notices to airmen (NOTAMs, which identify where a pilot can fly), the aviation weather report (METAR), and the terminal aerodrome forecast (TAF) from the National Weather Service Aviation Weather Center at www.aviationweather.gov (accessed on 7 May 2022). Tides can be predicted far in advance with a high degree of accuracy, and tidal predictions in the US can be obtained from the National Oceanic and Atmospheric Administration’s Tide Alert app or for free at: https://tidesandcurrents.noaa.gov/ (accessed on 7 May 2022). The next steps in phase blue are to conduct the GCP survey and the flights. For safety and efficiency, a pre-flight checklist should be followed, such as the one provided in Appendix A of [26], which includes aircraft calibration procedures. In the final step of phase blue, the procedures, settings, and parameters are documented and shared with the data products and any results derived from them. We recommend following Appendix B of [26] for creating metadata associated with the sUAS survey, which are critical for reproducibility and confidence in the results.

3.2. sUAS Photo-Based Processing of Dense Point Clouds, DEMs and Orthomosaics

A sUAS photo-based workflow is created in this study by combining the best practices of several proven data processing approaches into one (Figure 2). We expand on the workflow provided by [24] (their Figure 3), focusing in more detail on the processing of sUAS photo-based data.
Step 1 of the workflow requires the use of radiometrically calibrated images when conducting repeat sUAS surveys in which the spectral data are compared through time (e.g., classifying coastal vegetation). The Empirical Line Method (ELM) is commonly used due to its accuracy and simplicity [32], and it was further simplified by [33]. However, in projects that require dense point clouds to be compared through time, radiometric calibration may be less necessary, because it is not directly related to the performance of the SfM photogrammetry used to derive dense point clouds [34]. The next step in the workflow is to determine the processing software and to remove poorly focused images. Agisoft Photoscan/Metashape (Metashape) is the leading software used by coastal scientists for geometric calibration, orthomosaic, and dense point cloud generation [4,7]; therefore, Metashape is used here to illustrate the sUAS photo-based processing workflow. In Step 2, poorly focused images are excluded because they can negatively impact image alignment. Metashape offers an automatic image quality feature that calculates a value based on the sharpness of the most focused area of an image [35]. While removal of images with a value below 0.5 is recommended, a threshold of 0.8 was found to be a more conservative cutoff [24]; a scripted version of this filter is sketched below.
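The following is a minimal sketch of this filter using the Metashape Python API, assuming the version 1.8 call names; verify them against the scripting reference [40] for your installed version.
```python
# A minimal sketch of the Step 2 filter, assuming the Metashape 1.8 Python
# API [35,40]; verify call names against your version's scripting reference.
import Metashape

chunk = Metashape.app.document.chunk
chunk.analyzePhotos(chunk.cameras)  # writes camera.meta["Image/Quality"]

THRESHOLD = 0.8  # conservative cutoff of [24]; the manual suggests 0.5 [35]
for camera in chunk.cameras:
    quality = float(camera.meta["Image/Quality"])
    if quality < THRESHOLD:
        camera.enabled = False  # excluded from alignment rather than deleted
```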
Step 3 in the processing workflow is image alignment. This uses SfM to reconstruct 3D geometry by identifying and matching common features on overlapping images into tie points. In Step 4, a bundle adjustment is carried out, which uses a least-squares global optimization approach to reduce image residuals by adjusting the camera parameters, camera orientations, and 3D point positions [36,37]. The output of this procedure is a more reliable, aligned image network, based on the camera positions estimated from the imagery alone, and a resulting sparse cloud. Step 5 involves performing a quality control assessment on the image network by checking for errors from potential mismatching of tie points [24,37]. At this step, the RMSE between the projected reconstructed tie points and their corresponding original projections detected on the photos is calculated for each photo in the network, and the results are visualized in statistical software, such as open-source R v4.1.3 (a minimal version of this check is sketched below). This helps to identify any photos with high image residuals that may need to be removed from the image network [24,37].
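The following is a minimal sketch of this per-photo check, assuming tie-point reprojection residuals have already been exported (e.g., via the script of [37]); the CSV layout and column names are hypothetical.
```python
# A minimal sketch of the Step 5 check, assuming per-photo tie-point
# reprojection residuals (in pixels) were exported beforehand; the CSV
# layout here is a hypothetical stand-in for the export of [37].
import pandas as pd

resid = pd.read_csv("tiepoint_residuals.csv")  # columns: photo, dx_px, dy_px
resid["sq_err"] = resid["dx_px"] ** 2 + resid["dy_px"] ** 2

# RMSE of reprojection residuals per photo, worst first
rmse = (resid.groupby("photo")["sq_err"].mean() ** 0.5).sort_values(ascending=False)
print(rmse.head(10))  # candidates for removal from the image network
```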
Step 6 in the workflow involves hand-marking the GCPs on the images. The GCPs are then added to the image network without setting them as control, followed by another bundle adjustment (Step 7) [24,37]. This allows the quality of the GCPs to be captured in pixels (Step 8). In Step 9, georeferencing to an established coordinate system is achieved by linking the GCPs with their 3D ground coordinates. This is followed by another bundle adjustment using all GCPs as control (Step 10). A Python script is then executed in Metashape [37] to export the GCP errors into statistical software, so that the RMSE between the estimated positions and the GCP 3D coordinates can be calculated (Step 11).
The next step in the processing workflow is camera model optimization (Step 12). A camera model can be determined by either a pre-calibration or a self-calibration procedure during the bundle adjustment. The pre-calibration procedure involves determining the intrinsic geometry and distortion camera model parameters prior to camera model optimization, such as through a common checkerboard routine [22]. In self-calibration, the camera model parameters are determined during the bundle adjustment. It should be noted that capturing imagery at various angles during the survey (e.g., different crosshatch flight patterns and gimbal pitches) strengthens the image geometry and can minimize error in the dense point clouds when self-calibrating [38]. Whether using a pre-calibration or self-calibration procedure, a suitable camera model is determined by evaluating different camera models (i.e., different combinations of camera parameters, such as focal length, principal point, etc.) to determine which provides the lowest RMSE [38]. To test a camera model, half of the GCPs are randomly selected as control when running many Monte Carlo simulations in Metashape [24], using the Python script provided by [37]. The Monte Carlo results are then compiled in the open-source sfm_georef v.3.0 [39] and brought into statistical software to estimate the RMSE, and the camera model with the lowest RMSE prevails (a minimal version of this comparison is sketched below). Then another bundle adjustment is performed, but this time all GCPs are set as control so that the reprojection RMSE, GCP image RMSE, and GCP ground RMSE are captured (Step 13). These values are important for any coastal scientist to report, as they are used in the final calculation of the reported error for the derived dense point clouds and orthomosaics.
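The following is a minimal sketch of ranking candidate camera models by their Monte Carlo check-point RMSE; the CSV layout is a hypothetical stand-in for the outputs compiled in sfm_georef v.3.0 [39].
```python
# A minimal sketch for ranking candidate camera models by Monte Carlo
# check-point RMSE. The CSV (one row per GCP per simulation run, labeled
# with the camera model tested) is a hypothetical stand-in for the
# outputs compiled in sfm_georef v.3.0 [39].
import pandas as pd

mc = pd.read_csv("monte_carlo_gcp_errors.csv")  # columns: model, run, dx_m, dy_m, dz_m
mc["sq3d"] = mc["dx_m"] ** 2 + mc["dy_m"] ** 2 + mc["dz_m"] ** 2

# 3D RMSE per candidate camera model; the lowest-RMSE model prevails
rmse = mc.groupby("model")["sq3d"].mean() ** 0.5
print(rmse.sort_values())
```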
For Step 14 in the processing workflow, Step 12 is repeated, but this time the camera model is fixed because it is already optimized, and the errors calculated in Step 13 are used in many Monte Carlo simulations, each followed by a bundle adjustment [37]. The Monte Carlo results are then compiled in sfm_georef v.3.0 and brought into statistical software to estimate the control and quality GCP RMSEs. With a high-quality image network, a dense point cloud is generated (Step 15); first, all of the GCPs are selected as control, the errors calculated in Step 13 are included, and the camera model is fixed, followed by a bundle adjustment. Dense point clouds are generated using “high” quality and “aggressive” depth filtering, followed by automatic and manual classification [24]. The workflow is concluded by generating a high-resolution and accurate DEM (Step 16) and orthomosaic (Step 17); a minimal scripted version of these final steps is sketched below.
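The following is a minimal sketch of Steps 15–17, assuming the Metashape 1.8 Python API [40]; GCP control flags, marker accuracy settings, and the optimized camera model are assumed to have been configured in the preceding steps, and call names should be verified against your installed version.
```python
# A minimal sketch of Steps 15-17, assuming the Metashape 1.8 Python API
# [40]; downscale=2 corresponds to the "high" quality setting. GCP control
# flags, error settings, and the optimized camera model are assumed to be
# configured beforehand (Steps 1-14).
import Metashape

chunk = Metashape.app.document.chunk

# Step 15: dense point cloud with "high" quality and "aggressive" depth filtering
chunk.buildDepthMaps(downscale=2, filter_mode=Metashape.AggressiveFiltering)
chunk.buildDenseCloud()
chunk.dense_cloud.classifyGroundPoints()  # automatic classification, refined manually

# Step 16: DEM from the dense cloud; Step 17: orthomosaic draped on the DEM
chunk.buildDem(source_data=Metashape.DenseCloudData)
chunk.buildOrthomosaic(surface_data=Metashape.ElevationData)
Metashape.app.document.save()
```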

4. Discussion

Our analysis of recent review articles on sUAS-based remote sensing of coastal environments demonstrates the need for workflows to help guide coastal scientists in carrying out consistent and reliable sUAS projects from field to lab. The art and science of capturing and processing accurate sUAS data involve field survey requirements and rigorous photogrammetric workflows to compensate for the potential shortcomings in sUAS-derived products [24]. Traditionally, geographers and remote sensing scientists are trained in the use of such technologies and data processing. However, with the accessibility and affordability of these on-demand remote sensing devices for real-time monitoring, coastal scientists from a variety of backgrounds are taking advantage of their benefits. While the workflows presented in Section 3.1 and Section 3.2 of this study aim to provide helpful guidance on the steps needed to carry out reliable and repeatable sUAS projects in coastal environments, we acknowledge the detail and complexity of the workflows, which is a part of the craft.
The sUAS survey workflow illustrated in this article was divided into three phases to help simplify the process. Phase 1 (red) of the workflow requires a licensed pilot who is knowledgeable of the scientific questions being addressed. Phase 1 ultimately relies on the pilot’s ability to carry out phase 2 (green), where the choice of sensor and aircraft is determined, along with a field reconnaissance to ensure the survey can be conducted efficiently and safely; otherwise, the ability to address the sUAS project objectives may be compromised. The sUAS survey workflow illustrated in this article demonstrates that carrying out sUAS projects for science involves more than simply being a commercial pilot.
Although we also attempted to help simplify a sUAS data processing workflow that results in data products that meet strict sUAS project quality specifications (see Figure 2), the need for coastal scientists with advanced technical skills persists. The 17-step processing workflow illustrates that sUAS photo-based data processing requires special attention to the image network, GCP image observations, and camera model optimization procedures to prevent propagation of errors into the resulting data products. This requires knowledge of the SfM process as well as of many different software packages.
Currently, no single software package can carry out all of the steps in the sUAS processing workflow. Instead, a combination of the Metashape SfM photogrammetry software, open-source programming languages such as Python, statistical software such as R, and sfm_georef [39] is used (see Figure 2). However, coastal scientists with minimal programming experience can use the Python script provided by [37], because only basic information, such as the file path and the Metashape version, requires re-scripting. The Python scripting associated with each version of Metashape is also documented in the software’s scripting reference [40]. For coastal scientists who do not have access to Metashape, another option is the open-source photogrammetry software MicMac [41]. Future work may consider modifying our sUAS data processing workflow to use MicMac instead of Metashape, allowing accessibility to all coastal scientists regardless of budget.
While this article focused on sUAS photo-based data collection and processing, future work is needed to create reliable sUAS LiDAR-based workflows for surveying coastal environments. Many of the reviews converge on sUAS LiDAR as the next advancement in coastal mapping and monitoring [3,4,5,7]. sUAS LiDAR-based surveys are faster than sUAS photo-based surveys, which require multiple flight paths at different angles and large image overlap (e.g., 70–80%). When compared to sUAS photo-based data, sUAS LiDAR-based data are more reliable for estimating ground elevation, vegetation height, and density in coastal marshes [42]. While sUAS photo-based surveys seem promising for measuring nearshore bathymetry, they are limited to areas with distinct visible features on the seafloor and do not perform well over homogeneous sandy bottoms; sUAS LiDAR-based surveys, on the other hand, have proven effective at measuring homogeneous sandy bottoms [43].

5. Conclusions

We assessed the current state of review articles on the use of sUAS in coastal environments [3,4,5,6,7,8,9,11,12,13]. These review articles covered a wide range of topics, including sUAS photo-based regulations, sensors, platforms, calibration, validation, software, challenges, benefits, and applications. Table 1 highlights the major gaps: data collection, calibration, validation, and processing. We expanded on these review articles to create both a sUAS survey workflow and a sUAS processing workflow, using several proven data collection and processing techniques.
sUAS photo-based surveys require more time than sUAS LiDAR-based surveys, and this poses a challenge for pilots tasked with collecting data at certain phases of the tide. The sUAS survey workflow presented in this study can be modified to fit other sUAS projects to ensure that reliable data are captured rapidly under tight time constraints. For example, the sUAS survey workflow prepares the scientific pilot with knowledge of the sUAS regulations, the project objective along with its quality specifications, the layout of the survey site, and the GCP placement. With data collected using reliable field protocols, rigorous photogrammetric workflows can be carried out to compensate for the potential shortcomings in the final sUAS-derived products. The sUAS processing workflow presented in this study is rigorous, so that accurate and reliable products are produced. Key steps include performing quality control assessments on the image network, the GCPs, and the camera calibration model.
While this article focused on sUAS photo-based surveys due to the current rapid advance of technology, future work should focus on developing rigorous field and processing workflows for using sUAS LiDAR in coastal environments. sUAS LiDAR is the next important advancement in mapping and monitoring coastal environments. We hope that this study will stimulate the application of sUAS photo-based and sUAS LiDAR-based best practices in coastal environments.

Author Contributions

Conceptualization, S.G., H.S. and Z.Z.; methodology, H.S.; writing—original draft preparation, H.S.; writing—review and editing, H.S.; illustrations, H.S.; funding acquisition, Z.Z., S.G., G.W. and H.S. All authors have read and agreed to the published version of the manuscript.

Funding

The authors would like to thank the North Carolina Department of Transportation (Award Number: RP 2020-35) (Z.Z., S.G. and G.W.), and the Department of Interior—National Park Service (Award Number: P19AC01091) (H.S.) for their assistance in supporting this research. This project is also funded, in part, by the US Coastal Research Program (USCRP) (H.S.) as administered by the US Army Corps of Engineers (USACE), Department of Defense. The authors acknowledge the USACE and USCRP’s support of their effort to strengthen coastal academic programs and address coastal community needs in the United States.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest. The content of the information provided in this publication does not necessarily reflect the position or the policy of the United States government, and no official endorsement should be inferred.

References

1. Levin, L.A.; Boesch, D.F.; Covich, A.; Dahm, C.; Erséus, C.; Ewel, K.C.; Kneib, R.T.; Moldenke, A.; Palmer, M.A.; Snelgrove, P.; et al. The function of marine critical transition zones and the importance of sediment biodiversity. Ecosystems 2001, 4, 430–451.
2. Matthews, M.W. A current review of empirical procedures of remote sensing in inland and near-coastal transitional waters. Int. J. Remote Sens. 2011, 32, 6855–6899.
3. Klemas, V.V. Coastal and environmental remote sensing from unmanned aerial vehicles: An overview. J. Coast. Res. 2015, 31, 1260–1267.
4. Kislik, C.; Dronova, I.; Kelly, M. UAVs in support of algal bloom research: A review of current applications and future opportunities. Drones 2018, 2, 35.
5. Adade, R.; Aibinu, A.M.; Ekumah, B.; Asaana, J. Unmanned Aerial Vehicle (UAV) applications in coastal zone management—A review. Environ. Monit. Assess. 2021, 193, 1–12.
6. Oleksyn, S.; Tosetto, L.; Raoult, V.; Joyce, K.E.; Williamson, J.E. Going Batty: The challenges and opportunities of using drones to monitor the behavior and habitat use of rays. Drones 2021, 5, 12.
7. Morgan, G.R.; Hodgson, M.E.; Wang, C.; Schill, S.R. Unmanned aerial remote sensing of coastal vegetation: A review. Ann. GIS 2022, 1–15.
8. Ridge, J.; Seymour, A.; Rodriguez, A.B.; Dale, J.; Newton, E.; Johnston, D.W. Advancing UAS Methods for Monitoring Coastal Environments. In Proceedings of the AGU Fall Meeting, New Orleans, LA, USA, 11–15 December 2017.
9. Johnston, D.W. Unoccupied Aircraft Systems in Marine Science and Conservation. Annu. Rev. Mar. Sci. 2019, 11, 439–463.
10. Windle, A.E.; Poulin, S.K.; Johnston, D.W.; Ridge, J.T. Rapid and accurate monitoring of intertidal oyster reef habitat using unoccupied aircraft systems and structure from motion. Remote Sens. 2019, 11, 2394.
11. Rees, A.F.; Avens, L.; Ballorain, K.; Bevan, E.; Broderick, A.C.; Carthy, R.R.; Christianen, M.J.A.; Duclos, G.T.; Heithaus, M.R.; Johnston, D.W.; et al. The potential of unmanned aerial systems for sea turtle research and conservation: A review and future directions. Endang. Species Res. 2018, 35, 81–100.
12. Schofield, G.; Esteban, N.; Katselidis, K.A.; Hays, G.C. Drones for research on sea turtles and other marine vertebrates—A review. Biol. Conserv. 2019, 238, 108214.
13. Kandrot, S.; Hayes, S.; Holloway, P. Applications of Uncrewed Aerial Vehicles (UAV) Technology to Support Integrated Coastal Zone Management and the UN Sustainable Development Goals at the Coast. Estuaries Coasts 2021, 1–20.
14. Khedmatgozar Dolati, S.S.; Caluk, N.; Mehrabi, A.; Khedmatgozar Dolati, S.S. Non-destructive testing applications for steel bridges. Appl. Sci. 2021, 11, 9757.
15. Garcia-Soto, C.; Seys, J.J.C.; Zielinski, O.; Busch, J.A.; Luna, S.I.; Baez, J.C.; Domegan, C.; Dubsky, K.; Kotynska-Zielinska, I.; Loubat, P.; et al. Marine Citizen Science: Current State in Europe and New Technological Developments. Front. Mar. Sci. 2021, 8, 621472.
16. Nowlin, M.B.; Roady, S.E.; Newton, E.; Johnston, D.W. Applying unoccupied aircraft systems to study human behavior in marine science and conservation programs. Front. Mar. Sci. 2019, 6, 567.
17. Beaucage, P.; Glazer, A.; Choisnard, J.; Yu, W.; Bernier, M.; Benoit, R.; Lafrance, G. Wind assessment in a coastal environment using synthetic aperture radar satellite imagery and a numerical weather prediction model. Can. J. Remote Sens. 2007, 33, 368–377.
18. Seier, G.; Hödl, C.; Abermann, J.; Schöttl, S.; Maringer, M.; Hofstadler, D.N.; Pröbstl-Haider, U.; Lieb, G.H. Unmanned aircraft systems for protected areas: Gadgetry or necessity? J. Nat. Conserv. 2021, 64, 126078.
19. Di Ciaccio, F.; Troisi, S. Monitoring marine environments with Autonomous Underwater Vehicles: A bibliometric analysis. Results Eng. 2021, 9, 100205.
20. Lally, H.; O’Connor, I. Can drones be used to conduct water sampling in aquatic environments? A review. Sci. Total Environ. 2019, 20, 569–575.
21. Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334.
22. Griffiths, D.; Burningham, H. Comparison of pre- and self-calibrated camera calibration models for UAS-derived nadir imagery for a SfM application. Prog. Phys. Geogr. 2019, 43, 215–235.
23. O’Connor, J.; Smith, M.J.; James, M.R. Cameras and settings for aerial surveys in the geosciences: Optimising image data. Prog. Phys. Geogr. 2017, 41, 325–344.
24. Cooper, H.M.; Wasklewicz, T.; Zhu, Z.; Lewis, W.; Lecompte, K.; Heffentrager, M.; Smaby, R.; Brady, J.; Howard, R. Evaluating the ability of multi-sensor techniques to capture topographic complexity. Sensors 2021, 21, 2105.
25. Cruzan, M.B.; Weinstein, B.G.; Grasty, M.R.; Kohrn, B.F.; Hendrickson, E.C.; Arredondo, T.M.; Thompson, P.G. Small unmanned aerial vehicles (micro-UAVs, drones) in plant ecology. Appl. Plant Sci. 2016, 4, 1600041.
26. Tmušić, G.; Manfreda, S.; Aasen, H.; James, M.R.; Gonçalves, G.; Ben-Dor, E.; Brook, A.; Polinova, M.; Arranz, J.J.; Mészáros, J.; et al. Current practices in UAS-based environmental monitoring. Remote Sens. 2020, 12, 1001.
27. Rodriguez, A.B.; Fodrie, F.J.; Ridge, J.T.; Lindquist, N.L.; Theuerkauf, E.J.; Coleman, S.E.; Grabowski, J.H.; Brodeur, M.C.; Gittman, R.K.; Keller, D.A.; et al. Oyster reefs can outpace sea-level rise. Nat. Clim. Change 2014, 4, 493–497.
28. Singh, K.K.; Frazier, A.E. A meta-analysis and review of unmanned aircraft system (UAS) imagery for terrestrial applications. Int. J. Remote Sens. 2018, 39, 5078–5098.
29. Aasen, H.; Bolten, A. Multi-temporal high-resolution imaging spectroscopy with hyperspectral 2D imagers—From theory to application. Remote Sens. Environ. 2018, 205, 374–389.
30. Roth, L.; Hund, A.; Aasen, H. PhenoFly Planning Tool: Flight planning for high-resolution optical remote sensing with unmanned aerial systems. Plant Methods 2018, 14, 116.
31. Assmann, J.J.; Kerby, J.T.; Cunliffe, A.M.; Myers-Smith, I.H. Vegetation monitoring using multispectral sensors—Best practices and lessons learned from high latitudes. J. Unmanned Veh. Syst. 2019, 7, 54–75.
32. Smith, G.M.; Milton, E.J. The use of the empirical line method to calibrate remotely sensed data to reflectance. Int. J. Remote Sens. 1999, 20, 2653–2662.
33. Iqbal, F.; Lucieer, A.; Barry, K. Simplified radiometric calibration for UAS-mounted multispectral sensor. Eur. J. Remote Sens. 2018, 51, 301–313.
34. Conte, P.; Girelli, V.A.; Mandanici, E. Structure from Motion for aerial thermal imagery at city scale: Pre-processing, camera calibration, accuracy assessment. ISPRS J. Photogramm. Remote Sens. 2018, 146, 320–333.
35. Agisoft LLC. Agisoft Metashape User Manual; Professional Edition, Version 1.8; Agisoft LLC: St. Petersburg, Russia, 2022.
36. Granshaw, S.I. Bundle adjustment methods in engineering photogrammetry. Photogramm. Rec. 1980, 56, 181–207.
37. James, M.R.; Robson, S.; Smith, M.W. 3-D uncertainty-based topographic change detection with structure-from-motion photogrammetry: Precision maps for ground control and directly georeferenced surveys. Earth Surf. Process. Landf. 2017, 42, 1769–1788.
38. Wackrow, R.; Chandler, J.H. Minimising systematic error surfaces in digital elevation models using oblique convergent imagery. Photogramm. Rec. 2011, 26, 16–31.
39. James, M.R.; Robson, S. Straightforward reconstruction of 3D surfaces and topography with a camera: Accuracy and geoscience application. J. Geophys. Res. 2012, 117, F03017.
40. Agisoft LLC. Metashape Python Reference, Release 1.8.2; Agisoft LLC: St. Petersburg, Russia, 2022.
41. Rupnik, E.; Daakir, M.; Pierrot Deseilligny, M. MicMac—A free, open-source solution for photogrammetry. Open Geospat. Data Softw. Stand. 2017, 2, 14.
42. Pinton, D.; Canestrelli, A.; Wilkinson, B.; Ifju, P.; Ortega, A. Estimating ground elevation and vegetation characteristics in coastal salt marshes using UAV-based LiDAR and digital aerial photogrammetry. Remote Sens. 2021, 13, 4506.
43. Wang, D.; Xing, S.; He, Y.; Yu, J.; Xu, Q.; Li, P. Evaluation of new lightweight UAV-borne topo-bathymetric LiDAR for shallow water bathymetry and object detection. Sensors 2022, 22, 1379.
Figure 1. Red, Green, Blue (RGB) workflow organized in three phases using best practices for conducting sUAS photo-based surveys of coastal environments.
Figure 2. A 17-step workflow using best practices for processing sUAS photo-based surveys of coastal environments.
Table 1. Major topics either reviewed or not reviewed by each of the 10 review articles identified in this study. A check mark indicates that the topic was covered. The topics least reviewed are data collection, calibration, validation, data processing, and software.
Topics compared: Regulations; Sensors; Platforms; Collection; Calibration; Validation; Processing; Software; Challenges; Benefits; Applications.
Reviews compared: Morgan et al. (2022) [7]; Adade et al. (2021) [5]; Kandrot et al. (2021) [13]; Oleksyn et al. (2021) [6]; Ridge and Johnston (2020) [8]; Johnston (2019) [9]; Schofield et al. (2019) [12]; Kislik et al. (2018) [4]; Rees et al. (2018) [11]; Klemas (2015) [3].
