Article

Sharkeye: Real-Time Autonomous Personal Shark Alerting via Aerial Surveillance

1 SMART Infrastructure Facility, University of Wollongong, Wollongong 2522, Australia
2 Centre for Sustainable Ecosystem Solutions and School of Earth, Atmospheric and Life Sciences, University of Wollongong, Wollongong 2522, Australia
3 Across the Cloud Pty Ltd., Wollongong 2500, Australia
4 Centre for Bioinformatics and Biometrics, National Institute for Applied Statistics Research Australia, University of Wollongong, Wollongong 2522, Australia
5 Sharkmate, Wollongong 2500, Australia
6 Advanced Multimedia Research Lab, University of Wollongong, Wollongong 2522, Australia
* Author to whom correspondence should be addressed.
Drones 2020, 4(2), 18; https://doi.org/10.3390/drones4020018
Submission received: 16 March 2020 / Revised: 23 April 2020 / Accepted: 24 April 2020 / Published: 4 May 2020
(This article belongs to the Special Issue Drone Technology for Wildlife and Human Management)

Abstract

While aerial shark spotting has been a standard practice for beach safety for decades, new technologies offer enhanced opportunities, ranging from drones/unmanned aerial vehicles (UAVs) that provide new viewing capabilities, to new apps that provide beachgoers with up-to-date risk analysis before entering the water. This report describes the Sharkeye platform, a first-of-its-kind project to demonstrate personal shark alerting for beachgoers in the water and on land, leveraging innovative UAV image collection, cloud-hosted machine learning detection algorithms, and reporting via smart wearables. To execute, our team developed a novel detection algorithm trained via machine learning on aerial footage of real sharks and rays collected at local beaches, hosted and deployed the algorithm in the cloud, and integrated push alerts to beachgoers in the water via a shark app running on smartwatches. The project was successfully trialed in the field in Kiama, Australia, with over 350 detection events recorded, alerting multiple smartwatches simultaneously both on land and in the water, with analysis capable of detecting shark analogues, rays, and surfers in average beach conditions, all based on ~1 h of training data in total. Additional demonstrations showed the potential of the system to enable lifeguard–swimmer communication and to create a network on demand to enable the platform. Our system was developed to provide swimmers and surfers with immediate information via smart apps, empowering lifeguards/lifesavers and beachgoers to prevent unwanted encounters with wildlife before they happen.


1. Introduction

Managing shark–human interactions is a key social and environmental challenge, requiring a balance between the responsibility to provide beachgoer safety and the responsibility to maintain shark populations and a healthy marine ecosystem. Sharks pose some limited risk to public safety [1]; however, there is a recognized social and environmental responsibility to develop strategies that effectively keep people safe while ensuring the least harm to the environment [2]. This process is still evolving, particularly in countries with yearly shark interactions. For instance, in Australia, the Shark Meshing (Bather Protection) Program NSW [3] is a hazard mitigation strategy in place since 1937 that is known to cause harm to vulnerable species [4] and is listed as a Key Threatening Process by the NSW Scientific Committee, with ecological and economic costs [5]. With a range of emerging technologies offering new potential to mitigate harm to people and sharks alike, there are now ways to manage interactions without such high costs.
Aerial surveillance has historically been used to identify sharks in beach zones and warn beachgoers in order to avoid shark incidents. Australia started one of the earliest known “shark patrols” with the Australian Aerial Patrol in the Illawarra in 1957 [6]; currently, across the country, thousands of kilometres are patrolled by airplanes and helicopters on any given weekend. For decades, aerial patrols have helped mitigate shark interactions, predominantly by alerting lifeguards/lifesavers to enact beach closures [7], and they remain essential for a range of safety and rescue purposes. However, a recent study suggested that the ability to reliably detect sharks from planes and helicopters is limited, with only 12.5% and 17.1% of shark analogues spotted by fixed-wing and helicopter observers, respectively [8]. Though the assessment of such studies has been debated [9], there is certainly agreement that new technologies could assist the expansion of aerial shark spotting and the communication of threats, leading to better outcomes for humans and sharks alike.
The recent expansion of unmanned aerial vehicles (UAVs), often known as drones, offers new abilities to enhance traditional aerial surveillance. In recent years alone, drones have been used to evaluate beaches and coastal waters [10,11,12] and to monitor and study wildlife [13,14,15], from jellyfish [16], sea turtles [17,18,19], salt-water crocodiles [20], dolphins and whales [21,22,23,24], and manatees [25] to rays [26] and sharks [27,28,29,30]. This expansion in surveillance is likely related to the market release of consumer drones with advanced camera imaging, for instance, the DJI Phantom 4 introduced in 2016. Whether rotary-wing or fixed-wing, the rapid development of commercial drones means more flying “eyes” can be deployed in more locations for conservation and management [31]. Using these methods for shark detection again shows great promise; drones are now part of several lifeguard programs across multiple Australian states [32] and are continuously being evaluated for reliability [33]. Other UAVs are also being considered [34]; a review by Bryson and Williams (2015) into the use of UAVs for marine surveys highlighted the potential of unmanned tethered platforms such as balloons with imaging, especially over fixed locations [35], and such platforms were successfully tested for shark spotting applications by Adams et al. [36]. These platforms may be able to overcome current issues facing drones: restrictions resulting from limited battery life, limited flight areas per air safety regulations (in Australia, regulated by the Civil Aviation Safety Authority (CASA) [37]), and the need for pilot training and equipment expertise. Regardless of platform, the use of piloted and, critically, autonomous UAVs in wildlife management will almost certainly expand and offer new opportunities, particularly to complement the use of conventional drones.
One critical limitation in traditional aerial surveillance is the need for human engagement with the technology, principally for visual detection. However, the recent expansion of machine learning/artificial intelligence, and in particular of Deep Learning tools [38], over the last decade has offered significant advantages for image classification as well as the detection and localization of objects in images and video sequences. While traditional image analysis relies on human-engineered signatures or features to localize objects of interest, and performance depends on how well those features are defined, Deep Learning provides an effective way to automatically learn the features of a detector (algorithm) from a large number of sample images with known locations of the objects of interest. The concept of automated identification from photos/videos has been explored for years (especially for images gathered from camera traps); with the expansion of available drones, automated detection has bloomed as a tool for wildlife monitoring [39,40]. Thus, Deep Learning detection algorithms for sharks [41,42] (as well as other marine wildlife and beachgoers) are expanding and offer superior tools to be integrated with aerial surveillance for automated detection.
Combining UAVs and machine learning with smart wearables like smartwatches creates an additional opportunity to build scalable systems for managing shark interactions with individuals. Again, a recent technology expansion (mainly in 2017) has created an era of available smartwatches with cellular communication technology, offering the potential to link surveillance with personalized alerting. Our project was designed to bridge the gap between these emerging techniques for real-time personal shark detection by integrating aerial reconnaissance, smart image recognition, and wireless wearable technology. Sharkeye had two main objectives: (1) bridge data from aerial monitoring to alert smartwatches and (2) test the system as a pilot on location. Objective 1 required technical integration from the aerial acquisition of data to the app along the architecture stack, including validating key components of hardware, software, and services (connectivity). To execute, our team developed an improved detection algorithm by training the system using machine learning based on aerial footage of real sharks, rays, and surfers collected at local beaches; determined how to host and deploy the algorithm in the cloud; and expanded a shark app to run on smartwatches and push alerts to beachgoers in the water. For Objective 2, field trials were conducted at Surf Beach, Kiama over a period of three days using methods for testing continuous aerial surveillance. To test detection accuracy, we deployed moving shark analogues as well as swimmers, surfers, and paddleboarders while recording the response on multiple smartphones and three smartwatches simultaneously. In addition to the main objectives and related work, a backup emergency communication system between a swimmer in danger and a lifeguard was also tested, as was deployment of a communication relay on the blimp to further demonstrate the potential of the Sharkeye platform.
The system was successfully deployed in a trial with over 350 detection events recorded at an accuracy of 68.7%. This is a significant achievement considering that our algorithm was developed using <1 h of training footage, highlighting the huge potential of the system given more training data to improve confidence. Overall, Sharkeye showed the potential of converging drones, the Internet of Things, machine learning, and smart wearables for human–animal management. We believe in and advocate continued investment to leverage existing technology and utilize emerging technology to tackle both shark and human safety objectives. Further investment would significantly improve accuracy and prediction and expand identification. This capability to expand to other beach- and water-based objects suggests the platform can be used to develop tools for effective and affordable beach management. A common tool that allows real-time superior shark detection as well as monitoring of beach usage, swimmer safety, pollution, etc., would help de-risk investment in R&D and deployment. Ultimately, as the Sharkeye system is platform agnostic and has the potential to assist cross-agency objectives of data collection and analysis for consumer safety, efficiency gains, market competitiveness, and even development for export, we strongly recommend resourcing funds to expand such work.

2. Materials and Methods

The Sharkeye project developed an interface between two previous initiatives: Project Aerial Inflatable Remote Shark–Human Interaction Prevention (Project AIRSHIP), a blimp-based megafauna monitoring system, and SharkMate, a predictive app based on the likelihood of shark presence. We successfully optimized the equipment and process, i.e., our platform, required to automate video collection from the blimp, send the feed to a cloud-hosted algorithm, determine shark presence, and send alerts via smartwatches. The components of the system are described below.

2.1. Aerial Imaging from UAVs

A blimp-based system, Project AIRSHIP, was used for the aerial surveillance component of the Sharkeye project. Project AIRSHIP was developed with the intent of helping mitigate the risk of shark–human interactions with zero by-catch; the blimp system was developed and used for several years before the Sharkeye project was initiated. The blimp uses a high-definition mounted camera designed to provide continuous aerial coverage of swimming and surfing areas [36].
As a first step of Sharkeye, we collected new footage of wildlife from the blimp to help train the detection algorithm. To collect new footage, the blimp-mounted camera system was tethered above the surf zone of Surf Beach in Kiama, NSW. For context on the observation area, Kiama Surf Beach is described as an east-facing embayed beach, 270 m in length, that is exposed to most swell with average wave heights of 1–1.5 m [43]. Footage of the surf zone was generally collected from 11:00 am until 4:00 pm during a six-week period (excluding weekends and public holidays) from December 2017 to January 2018. Deployment was coordinated around the work hours of the professional lifeguard service at Kiama. Once the algorithm was trained, the blimp was redeployed (May 2018) to test the effectiveness of real-time detection.
Images were collected via a live-streaming camera with 10× zoom on a 3-axis gimbal (Tarot Peeper camera with custom modifications). This setup provided a relatively stable feed even in windy conditions (>30 km/h), with footage transmitted to a ground station at a resolution of 1080p. To focus the collection of images, the direction and zoom of the camera were controlled by an operator on the ground and visually monitored on an HD screen. The blimp itself is 5 m in length and 1.5 m in diameter (Airship Solutions Pty Ltd., Australia) and served as a stable platform for the camera. The blimp enabled an 8 h collection period, limited only by the camera battery. The blimp was launched in the morning and taken down at night. When not in use, it was stored fully inflated to minimize helium usage, requiring 9000 L of helium for initial inflation. The blimp loses helium at a rate of <1% a day, requiring a small top-up every few days (100–200 L). Images of the system are provided in Figure 1.
Sharkeye utilized the demonstrated ability of blimps to detect sharks at ocean beaches. Sharks and other large marine species, including stingrays and seals, were frequently spotted and tracked by observers monitoring the blimp feed [36]. This work validated the ability of humans to observe and identify fauna from this aerial platform and provided a benchmark against which the performance of the algorithm could be compared.

2.2. Detection Algorithm Development

The project employed a Deep Neural Network (DNN) as the detection algorithm for various animals as well as people in the water (sharks, rays, and surfers). The following is a summary of the dataset development, the network architecture, and the algorithm training and evaluation.
To create the algorithm, raw footage from the blimp-mounted camera was collected over several weeks at a specific beach. This footage was brought back to the laboratory and manually screened for the presence of marine life. The spliced footage was then used to extract images for algorithm creation and testing and was divided into training and testing datasets. One of every five frames of the selected video was annotated with the locations and types of objects, and each frame was divided into 448 × 448 blocks with 50% overlap. The annotated datasets were then processed in a DNN developed by our labs.
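For illustration, this frame extraction and tiling step can be sketched as follows; this is a minimal example assuming OpenCV-readable footage, not the project’s actual preprocessing code:

```python
import cv2

BLOCK = 448          # block edge length in pixels, as described above
STRIDE = BLOCK // 2  # 50% overlap between neighbouring blocks

def extract_blocks(video_path, every_n=5):
    """Yield 448x448 blocks from every n-th frame of the footage."""
    cap = cv2.VideoCapture(video_path)
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % every_n == 0:
            h, w = frame.shape[:2]
            for y in range(0, h - BLOCK + 1, STRIDE):
                for x in range(0, w - BLOCK + 1, STRIDE):
                    yield index, (x, y), frame[y:y + BLOCK, x:x + BLOCK]
        index += 1
    cap.release()
```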
The network architecture was based on You Only Look Once (YOLO), as described by Redmon and Farhadi [44,45]. It consisted of nineteen convolutional layers and five max-pooling layers with a softmax classification output. It used 3 × 3 filters to extract features and doubled the number of channels after every pooling step, with twelve 3 × 3 convolutional layers in total. Between the 3 × 3 convolutional layers, 1 × 1 filters were used to compress the feature representation, with seven 1 × 1 convolutional layers in total. At the end of the network, three extra 3 × 3 convolutional layers with 1024 filters each, followed by a final 1 × 1 convolutional layer with the number of outputs required for detection, were added. The network was augmented with a preprocessing component that processes the video frames before they are used for training and real-time detection. A visualization of the process is outlined in Figure 2.
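A condensed PyTorch sketch of this Darknet-19-style layer pattern is given below; it illustrates the arrangement described above (3 × 3 feature extraction, channel doubling after pooling, 1 × 1 compression, and a detection head) rather than reproducing the authors’ exact network:

```python
import torch.nn as nn

def conv(in_ch, out_ch, k):
    """Convolution + batch norm + leaky ReLU, the usual Darknet unit."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, k, padding=k // 2, bias=False),
        nn.BatchNorm2d(out_ch),
        nn.LeakyReLU(0.1),
    )

def darknet19_like(num_outputs):
    # Backbone: twelve 3x3 convs, six 1x1 compression convs, five max-pools.
    layers = [conv(3, 32, 3), nn.MaxPool2d(2)]
    layers += [conv(32, 64, 3), nn.MaxPool2d(2)]
    layers += [conv(64, 128, 3), conv(128, 64, 1),
               conv(64, 128, 3), nn.MaxPool2d(2)]
    layers += [conv(128, 256, 3), conv(256, 128, 1),
               conv(128, 256, 3), nn.MaxPool2d(2)]
    layers += [conv(256, 512, 3), conv(512, 256, 1),
               conv(256, 512, 3), conv(512, 256, 1),
               conv(256, 512, 3), nn.MaxPool2d(2)]
    layers += [conv(512, 1024, 3), conv(1024, 512, 1),
               conv(512, 1024, 3), conv(1024, 512, 1),
               conv(512, 1024, 3)]
    # Detection head: three extra 3x3 layers with 1024 filters each,
    # followed by a final 1x1 layer with the detection outputs, as in the text.
    layers += [conv(1024, 1024, 3), conv(1024, 1024, 3), conv(1024, 1024, 3),
               nn.Conv2d(1024, num_outputs, 1)]
    return nn.Sequential(*layers)

# e.g. a YOLOv2-style head with 5 anchors and 3 classes (shark, ray, surfer):
# num_outputs = 5 * (5 + 3) = 40
model = darknet19_like(num_outputs=40)
```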
The commonly used mini-batch strategy [44] was used to train the network, with a batch size of 64, a weight decay of 0.0005, and a momentum of 0.9. The learning rate was set to 0.00001, the maximum number of iterations to 45,000, and the detection threshold to 0.6. For training, frequently used data augmentation techniques, including random crops, rotations, and hue, saturation, and exposure shifts, were applied [44]. Once the algorithm was created, it was tested against the testing dataset in the laboratory. Once validated, the final algorithm was ready for hosting and deployment in the cloud-based architecture. Results from the algorithm could then be pushed to the alerting components of the Sharkeye system.
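Expressed in standard PyTorch terms, these hyperparameters correspond to a configuration along the following lines (illustrative only; the original work used a Darknet-based pipeline):

```python
import torch
import torch.nn as nn

model = nn.Conv2d(3, 40, 1)  # stand-in for the detection network sketched above

# SGD with momentum and weight decay corresponds to the mini-batch strategy above.
optimizer = torch.optim.SGD(
    model.parameters(),
    lr=1e-5,             # learning rate reported in the text
    momentum=0.9,
    weight_decay=0.0005,
)

BATCH_SIZE = 64            # mini-batch size
MAX_ITERATIONS = 45_000    # training iterations
DETECTION_THRESHOLD = 0.6  # confidence threshold applied at inference
```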

2.3. Personal Alerting via an App

We expanded on the previously developed SharkMate app to provide the user interface (UI) for real-time alerting accessible to beachgoers in the water. For context, the SharkMate app was designed as a proof-of-principle demonstration, developed as a means of aggregating data on a range of risk factors surrounding shark incidents. Analysis of data from the Australian Shark Attack File (ASAF, Taronga Zoo [46]) and the International Shark Attack File (ISAF [47]), along with a range of correlation studies including Fisheries Occasional Publication No. 109, 2012 [48], found statistically significant risk factors contributing to shark incidents, such as proximity to river mouths and recent incidents or sightings. It is important to acknowledge that sharks are wild animals and their movements and actions are not completely predictable. However, historical data does present factors that contribute significantly to the risk of shark incidents. It is also important to distinguish factors that are purely correlative from those that are causal.
The app derives its location-specific data from a range of sources. Climate data was sourced from the Australian Bureau of Meteorology (BOM). Geographic data was used to map and identify the proximity of river mouths, outfalls, and seal colonies. This was combined with data such as the presence of mitigation strategies (including the presence of the blimp) and lifeguards/lifesavers. Historical incidents and sightings were sourced from the ASAF and ISAF. Real-time sightings of sharks and bait balls from across Australia were scraped from verified Twitter accounts. This data was then aggregated into a centralised database.
The app uses a series of cloud-based programming steps to provide predictions. Information from the various sources is updated every hour. Scripts retrieve information such as water temperature and the presence of lifeguards at the beach. These values are then run through a series of conditional statements, each individually scaled to reflect its respective impact on risk. The output of these conditions is a likelihood score, a number out of ten, designed to be easily compared across beaches; the higher the number, the higher the risk. Finally, after the impact of all factors has been processed into an aggregate score, this value is sent to the cloud, where it can be accessed from the SharkMate app.
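A minimal sketch of this aggregation is given below; the factor names and weights are illustrative assumptions, not the SharkMate app’s actual rules:

```python
def likelihood_score(beach):
    """Aggregate individually scaled risk factors into a score out of ten."""
    score = 0.0
    if beach["recent_sighting"]:
        score += 4.0                # recent sightings weigh heavily
    if beach["near_river_mouth"]:
        score += 2.0
    if beach["water_temp_c"] > 20:
        score += 1.5
    if not beach["lifeguards_on_duty"]:
        score += 1.5
    if beach["blimp_deployed"]:
        score -= 1.0                # active surveillance lowers risk
    return max(0.0, min(10.0, score))

print(likelihood_score({
    "recent_sighting": True, "near_river_mouth": False,
    "water_temp_c": 22.0, "lifeguards_on_duty": True,
    "blimp_deployed": True,
}))  # -> 4.5
```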
The functionality of the app was later expanded to facilitate real-time alerts from the blimp via the Apple Watch. Prior to this work, the SharkMate iPhone app (using iOS) was near completion; however, the SharkMate smartwatch app (using watchOS) needed further development. Swift 4 and Xcode 9 were used to create and program both the iPhone and Apple Watch applications. As smartwatch functionality differs from that of smartphones, the smartwatch app could not simply be converted. Push notifications were developed to alert the user to the presence of a shark, displayed on the Apple Watch app. This was achieved via manual and automated triggers of cloud functions that processed the data and pushed it over the cloud to the watch. With the app and watch validated, the components were complete to integrate the end-to-end alerting system.

2.4. Integration Across the Stack

Integration was required to connect the various pieces of hardware and discrete custom and commercial software to execute real-time notifications of automatically detected sharks in the ocean via the smartwatches. To integrate these features, our team created the interface between:
  • The camera mounted to the blimp (or the video feed from a separate drone),
  • The receiving ground station,
  • The machine learning model (algorithm),
  • The smartwatches, and
  • The SharkMate app (iOS phone/watchOS).
Several additional pieces of software were required to facilitate integration. Firstly, a program was developed to initiate processing using the algorithm; this software captured images from the live video feed and uploaded them to the cloud for processing. Secondly, once the images were processed by the algorithm, an additional script processed the outputs of the machine learning model and sent appropriate push notification messages via Amazon Web Services SNS (Simple Notification Service publish/subscribe messaging) integration with the Apple Push Notification service. The data flow is outlined in Figure 3. To test the system before deployment, the camera was removed from the blimp, and aerial shark images from the internet and from our training set were placed in front of it to simulate a live feed. This was done on a local beach to take into account environmental conditions that might be observed during field testing. Although simplistic, the test run showed the integrated components worked; alerts were triggered, and notifications were received on the smartwatches.
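A minimal sketch of these two glue scripts is shown below, assuming an S3 bucket for frames and an SNS platform endpoint bridging to the Apple Push Notification service; the names and message text are illustrative, not the project’s actual configuration:

```python
import json
import cv2
import boto3

s3 = boto3.client("s3")
sns = boto3.client("sns")

def upload_frame(frame, bucket="sharkeye-frames", key="latest.jpg"):
    """Encode a captured video frame as JPEG and upload it for inference."""
    ok, jpeg = cv2.imencode(".jpg", frame)
    if ok:
        s3.put_object(Bucket=bucket, Key=key, Body=jpeg.tobytes())

def push_alert(detections, target_arn):
    """Send an APNs push (via SNS) when the model reports a shark."""
    sharks = [d for d in detections if d["label"] == "shark"]
    if sharks:
        sns.publish(
            TargetArn=target_arn,
            MessageStructure="json",
            Message=json.dumps({
                "default": "Shark detected - please exit the water.",
                "APNS": json.dumps({"aps": {
                    "alert": "Shark detected - please exit the water.",
                    "sound": "default",
                }}),
            }),
        )
```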

2.5. Field Testing and Additional Drone-Based Hosting

Once the individual Sharkeye components had been completed and validated, the system was deployed for field testing. Initially, the blimp was inflated and the camera attached to the underside mounting. The live camera feed was then established to laptops on the beach and monitors in the surf club. As before, to provide a preliminary check of communication in the system, aerial images of sharks (via printouts) were placed in front of the camera, the feed was sent to the cloud to run the detection algorithm, and alerts were monitored on several smartphones and smartwatches.
Once the check was complete, the blimp was launched to a height of approximately 60 m via a tether to the beach. A shark analogue (plywood cutout) was then placed on the sand and successfully detected as verification that the system was working. The analogue was then deployed in the surf zone, and detection was monitored on screens, on three smartwatches, and on various phones with the app installed. Detection rates were recorded in the cloud. Alerts on multiple smartwatches were then successfully tested in the water near the shore. Images of the field testing are shown in Figure 4. For environmental context, the weather conditions on the testing days were average and partially cloudy, while the surf was higher than normal (see Supplementary Files for videos of the conditions and setup in action).
To test the system on a mobile platform, the camera feed was successfully switched to a drone operated by a certified drone pilot (DJI Phantom 4, Opterra Australia). A surfer wearing a smartwatch was deployed, and the drone was set to observe and send the feed to the cloud as with the blimp. In this case, the surfer received alerts 150 m from shore. Additionally, while filming, a stingray was observed from the drone; the team took advantage of the situation and was able to detect the native wildlife in the field, with alerts. Finally, we deployed a stand-up paddleboarder (SUP) with a smartwatch and observed detection via the system simultaneously on the beach and in the water.

3. Results and Discussion

Overall, our Sharkeye system was successfully deployed during the field trial, demonstrating that automated real-time personal shark detection is possible using existing technologies: aerial imaging, cloud-based machine learning analysis, and alerting via smartwatches.

3.1. Creating the Detection Algorithm

In order to develop the dataset to train the algorithm, the blimp system captured over 100 h of raw footage over a six-week period. As mentioned, the footage was then manually screened at the laboratory for the presence of marine life. Upon review, multiple sharks were sighted, totalling approximately 30 min of footage. Other marine animals were far more commonly observed, with stingrays spotted at least 50 times, as well as numerous seals and baitfish schools. It should be noted that during the data collection, the monitored video feed also provided a beneficial vantage point for lifeguards. One rescue was observed from the blimp during the original footage collection, and lifeguards monitoring the video feed identified sharks in real time on several occasions. On one of these occasions, the blimp operator spotted a shark in close proximity to a bodyboarder and alerted lifeguards, who evacuated swimmers from the water.
Four video sequences, one hour and seven minutes in total, containing sharks, stingrays, and surfers were chosen from the video footage collected by the blimp overlooking the beach and ocean. After annotation, this produced a dataset of 10,805 images. The dataset was then divided randomly into three subsets: 6915 images for training, 1730 images for validation, and 2160 images for off-line performance evaluation. This training set was extremely limited, particularly for sharks, being based on <30 min of footage of sharks seen in the beach area during operation of the blimp. However, even with this limited training data, the system was able to detect various objects from the images set aside for testing with confidence. Using a standard accuracy calculation ((number of correctly detected objects + number of correctly rejected non-objects)/total instances), evaluation in the lab showed accuracies of 91.67%, 94.52%, and 86.27% for sharks, stingrays, and surfers, respectively, on the testing dataset.
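Written out explicitly, the accuracy measure used for the lab evaluation is:

```python
def accuracy(correct_detections, correct_rejections, total_instances):
    """(correctly detected objects + correctly rejected non-objects)
    divided by the total number of instances."""
    return (correct_detections + correct_rejections) / total_instances
```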
The authors acknowledge the limitations of the algorithm in the context of deployment for shark spotting. The utility of the training and testing datasets was directly linked to the conditions present when the subject at hand, whether shark, ray, or surfer, was recorded. The environment and activity during the sighting, including weather, wave activity, glare (time of day), animal depth, etc., all affect the ability of the resulting algorithm to accurately detect a subject under different conditions. For the footage used for algorithm training, the conditions would be described as average and mild for the Australian summer in the Illawarra region, ranging from overcast to partially cloudy.
On reflection, developing the detection algorithm for sharks, stingrays, and surfers proved more challenging than for conventional objects, for a number of reasons. First, these objects are typically underwater, which significantly deforms their outlines and makes borders difficult to define. Second, the colour of the objects can vary with depth in the water, meaning the algorithm must be able to distinguish an object under multiple conditions. Third, when collecting video from aerial platforms, the objects can be too small to use due to the distance from the camera. Fourth, it can be challenging to collect the amount of footage required to train a deep neural network with accuracy and precision (i.e., footage of sharks at Kiama), and the scarcity of sharks makes validating a trained algorithm difficult (hence the need to deploy shark analogues for testing). With access to further training data and to sharks in the wild, in a variety of environmental conditions, we anticipate that the accuracy can be greatly improved.

3.2. Designing and Testing the Smartwatch User Interface

The user interface and design of the watch app were identified as crucial elements of the Sharkeye system. Users in the water have limited ability to control the watch due to moisture on the glass when in the surf. Because of this, the UI and experience were designed to be accessible and responsive in an aquatic environment. The smartwatch app was set up to receive the message that a shark had been spotted and that the user should exit the water. This was presented in the form of attention-grabbing actions across three senses (sight, sound, and touch) performed by the watch. Besides a visual notification, the speakers in the watch also produce a sound, and vibration was utilised to further ensure that someone wearing the watch was alerted to the potential danger. Additional alerts were simultaneously sent to the smartphone (examples of alerts are shown in Figure 5).
The overall connectivity of new wearables was an important factor in app development. The Apple Watch used in the demonstration was one of the only smartwatches at the time that was waterproof and also offered GPS and cellular capabilities; this ultimately allowed for use in the surf and provided the long-distance connectivity required for the system to receive alerts from the cloud. However, there were difficulties in translating the iPhone app to the Apple Watch app due to the network limitations of the wearable; communication with the database is sometimes limited when the user is not connected to an internet source or SIM card. The user interface is also completely different on the two devices due to their different dimensions, which required additional UI development. Certainly, the potential of the system is not limited to Apple products, and other wearables and operating systems could be used in the future.

3.3. Real-Time Aerial Shark Detection and Alerting

During the deployment trial period, 373 detection events were recorded. A detection event was defined as an automatic alert from the system to the smartwatch and/or smartphone, which was then validated by a researcher against the truth/falsity of the AI detection. These alerts occurred with a latency of less than 1 min from image collection to notification. The detection events could be broken down into detected sharks, rays, or surfers (as seen in Figure 6). To analyse the events, we examined whether an actual shark analogue, ray, or surfer was being imaged and also detected by the algorithm. The output of each detection event was manually annotated as falling into one of the categories (shark analogue, ray, surfer) as the ground truth. The data was then run through a classifier, and we compared against the ground truth both raw counts and binary presence/absence classification.
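A sketch of this event-level scoring is given below: each validated event pairs a ground-truth annotation with the algorithm’s output, from which per-class confusion counts and a binary presence/absence accuracy can both be derived. The sample events are illustrative, not the trial data:

```python
from collections import Counter

def evaluate(events):
    """Per-class confusion counts and binary presence/absence accuracy."""
    confusion = Counter((e["truth"], e["predicted"]) for e in events)
    binary_correct = sum(
        (e["truth"] != "none") == (e["predicted"] != "none") for e in events
    )
    return confusion, binary_correct / len(events)

events = [
    {"truth": "shark_analogue", "predicted": "ray"},  # misclassified, but present
    {"truth": "surfer", "predicted": "surfer"},
    {"truth": "none", "predicted": "ray"},            # wave-shadow false positive
]
confusion, presence_accuracy = evaluate(events)
print(presence_accuracy)  # -> 0.666...
```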
This arrangement successfully detected shark analogues, surfers, and rays with an accuracy of 68.7% when considering simple presence versus absence of an animal. While a suitable accuracy for a demonstration, our testing showed ample room for improvement. The system had trouble distinguishing between sharks, rays, and humans with perfect confidence and most often misclassified objects as rays. Further, at times, artefacts caused false positives: the system mistook shadows of waves, or shadows from the blimp or drone itself, for animals. However, in the context of a proof of principle, the trained algorithm performed sufficiently well in a situation different from the one it had been taught in and effectively picked up objects darker than their surroundings.
The reduction in accuracy from the training dataset to the field data could be due in part to the differing conditions between the footage collection periods. The demonstration took place in autumn, five months after the training data was collected in summer. Further, while the weather was somewhat comparable, mild and partially sunny, the swell was notably bigger during the testing period. With further training, especially using more footage from on-beach locations in varied conditions, the algorithm would improve both in its ability to pick up real objects and in its ability to distinguish between them. Further work in post-image-capture analysis could also improve the system significantly, particularly methodologies that evaluate detections in context. For instance, techniques like those used by Hodgson et al. [49] and replicated by other groups [19], where observations are related to parameters like sea visibility, glare and glitter, and sea state, would not only help in understanding the accuracy but could also potentially be used to further improve the detection algorithm.
In terms of differences between aerial platforms, there was little difference in image quality between the blimp-mounted camera and the quadcopter drone camera. The ground station used over the last few seasons at the beach affected the image quality and had some reception issues, highlighting the need for a casing to protect this equipment from corrosion by the sea air. The area of coverage is dependent on the height of the blimp (restricted to 120 m, or 400 ft). We deployed the blimp at 70 m to provide complete coverage of a beach 270 m in length to a distance of 250 m offshore, ensuring that both the flagged swimming area and the surf banks were covered. On longer beaches, the blimp may be deployed higher (to the 120 m limit) to achieve a larger area of coverage; however, this may reduce the accuracy of observers/algorithms, so the area of coverage and accuracy would need to be assessed and optimized together.
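As a back-of-envelope illustration of this height/coverage trade-off, the ground footprint of a downward-looking camera grows linearly with altitude; the field-of-view value below is an assumption for illustration only, since the actual camera was gimballed and zoomable:

```python
import math

def footprint_width(altitude_m, fov_deg=90.0):
    """Width of ground coverage for a nadir-pointing camera with the given FOV."""
    return 2 * altitude_m * math.tan(math.radians(fov_deg / 2))

print(footprint_width(70))   # ~140 m directly below at 70 m
print(footprint_width(120))  # ~240 m at the 120 m regulatory ceiling
```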

3.4. Platform for Additional Lifesaving Technology

In addition to the system for notifying surfers and swimmers of sharks and rays, our team was able to use the smartwatch app to provide a secondary emergency notification system. In an emergency, surfers and swimmers who need help can use the watch app to send a notification to lifesavers indicating that they need help, along with the location reported by the watch. To achieve this, we added a simple web API that the smartwatch app could call, which would then pass on the details via email. We integrated this functionality into the SharkMate app on the smartwatch by adding a map and alert button to an alert section of the watch app UI, which sends the alert using the web API. The alerting system is shown in Figure 7, together with the response email as displayed on a lifesaver’s watch. To demonstrate the system, two tests were carried out, one with a surfer in trouble and another with a swimmer in trouble, each ending with a mock rescue. The demonstration proved the viability of using the platform for in-water emergencies, including potential use during a shark interaction.
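A minimal sketch of such a web API is shown below: the watch app POSTs its location, and the service relays it to lifesavers by email. The endpoint, addresses, and SMTP details are illustrative assumptions, not the project’s actual service:

```python
import smtplib
from email.message import EmailMessage
from flask import Flask, request

app = Flask(__name__)

@app.route("/help", methods=["POST"])
def help_request():
    """Receive a help request from the watch and relay it by email."""
    data = request.get_json()
    msg = EmailMessage()
    msg["Subject"] = "Swimmer requesting assistance"
    msg["From"] = "alerts@example.org"
    msg["To"] = "lifesavers@example.org"
    msg.set_content(f"Help requested at lat {data['lat']}, lon {data['lon']}.")
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)
    return {"status": "sent"}

if __name__ == "__main__":
    app.run()
```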

3.5. Overcoming Deadzones—Creating an On-Demand IoT Network

The emergence of the Internet of Things (IoT) has led to new types of wireless data communication offering long-range, low-power, low-bandwidth connectivity. These technologies use non-licensed radio frequencies to communicate. With these new protocols, devices can last for years on a single battery while communicating over distances of more than 100 km. A major open-source initiative in this space is The Things Network, which offers a fully distributed IoT data infrastructure based on LoRa (short for Long Range) technology [50]. The University of Wollongong and the SMART Infrastructure Facility have deployed multiple gateways offering free access to The Things Network infrastructure in the Illawarra area [51]. An advantage of this infrastructure is that the network can easily be expanded by connecting new gateways, increasing the coverage of the radio network. Ultimately, such networks could be used to connect and collect data from sensors and other instruments across local beaches.
The use of the blimp allowed us to show the possibility of extending the network even over the ocean by integrating a nano-gateway. The nano-gateway, designed by the SMART IoT hub, is composed of a LoRa gateway module (RAK831) coupled to a Raspberry Pi Zero. The LoRa gateway module demodulates LoRa signals received from sensors, while the Raspberry Pi Zero uses 4G connectivity to push data to the cloud. The nano-gateway is battery powered, which makes it possible to expand the network into isolated areas at very low cost; embedded in the blimp, the gateway offers radio coverage over a large area. To map the signal, we used a mobile device that sends messages containing GPS coordinates through the LoRa network. When the gateway receives those signals, metadata can be extracted from the message, giving the position of the sensor on a map and the strength of the signal. Positions are displayed live at https://ttnmapper.org/, a crowd-sourced coverage map of The Things Network [52]. The mobile device was taken into the sea by a lifeguard on a paddleboard to test the coverage of the nano-gateway. As shown in Figure 8, the signal reported locations even at a distance of 1 km, the extent of the experiment. For beach surveillance, the demonstration shows the potential to quickly deploy an autonomous solution where a classical internet connection is not available.
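For illustration, extracting one coverage-map point from a Things Network uplink might look as follows; the field layout follows the v2-era TTN uplink JSON and is an assumption that may differ across stack versions:

```python
import json

def coverage_point(uplink_json):
    """One point for the coverage map: device GPS plus gateway signal metadata."""
    msg = json.loads(uplink_json)
    fields = msg["payload_fields"]           # decoded GPS payload from the device
    gateway = msg["metadata"]["gateways"][0]
    return {
        "lat": fields["latitude"],
        "lon": fields["longitude"],
        "rssi_dbm": gateway["rssi"],         # received signal strength
        "snr_db": gateway["snr"],            # signal-to-noise ratio
    }
```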

3.6. Considerations for Future Deployment

The system has the potential to improve on the more traditional form of aerial surveillance, which requires operators to analyse images and, more critically, requires lifeguards to audibly alert beachgoers. Conceptualizing future deployment to a large number of beaches, we would recommend streamlining the hardware on the beach to minimize setup requirements and cost. Critically, we would explore where it makes sense to do the processing “at the edge” (on the beach), sending only essential data or notifications via the cloud, given bandwidth constraints (which may vary from beach to beach) as well as operating costs.
To expand the algorithm to other beaches and scenarios, more video footage would need to be collected under a wide range of weather and beach conditions and used to train the algorithm. Further research on the network architecture is needed so that it can deal with varied conditions and improve detection accuracy. Generally, the network and training procedure can be easily and efficiently scaled to multiple objects; we demonstrated detection of sharks, rays, and surfers simultaneously. Scaling would again require sufficient video footage to teach the algorithm, and scaling to multiple sites would require more cloud processing resources. The system is also algorithm “agnostic”, i.e., capable of hosting new independent algorithms developed by external groups.
With regard to the coverage of the Sharkeye system and the alerting potential of the SharkMate app, future work could include expanding the set of beaches with a likelihood score (currently approximately 100 beaches, predominantly in NSW, Australia) and further research into specific risk factors. As a proof of principle, in its current form the SharkMate app has both provided a platform for users to develop a greater understanding of the local risk factors contributing to shark incidents and acted as a means of broadcasting real-time shark sightings from the blimp. Expansion of the work may see the introduction of machine learning into the risk models, as it has proved successful in the other elements of this project. Further, as wearable technology continues to improve and capabilities extend, there is potential to enhance communication between lifeguards and beachgoers through a range of mediums other than audio/text alerts. Improvements in connectivity could allow lifeguards to communicate at a significant distance through video and audio in real time through wearables, a possible enhancement for the SharkMate app. Such improvements extend beyond shark incident mitigation into other aspects of beach safety.
The alerting aspect of the Sharkeye platform could also integrate into other existing or future apps utilized for shark monitoring. The authors would like to add a word of caution: users should never depend on an app alone, SharkMate or otherwise, as the basis for deciding on the risk of being in the water. Technology will never replace common sense, particularly in areas that sharks are known to frequent and especially when detection conditions are poor.
Although tested and trained using footage from a novel blimp-based aerial platform, the system is cross-compatible with any video-based aerial surveillance method and is potentially the only automated system proven on both blimps and drones. Given the advantages and shortcomings of the various surveillance methods for shark safety [14,36], it is likely that a mosaic approach will be most effective, whereby blimps and balloons, drones, and traditional aircraft are used alongside each other to provide appropriate coverage. Therefore, by designing the system to be compatible with any method of image capture, we allow for increased uptake potential and a broader pool of applications.

4. Conclusions

Through limited training (via self-collected footage), we were able to develop a detection algorithm that could identify several distinct objects (sharks, rays, surfers, paddleboarders) and report to various smart devices. However, as described in the discussion, with more training and optimization, the algorithm’s prediction accuracy can be significantly improved. There is also the possibility of expanding to a host of other beach- and water-based objects. With the basis of the system in place, the algorithm could be extended to a variety of relevant aspects of shark management, including rapid identification of species, behavioural patterns of specific shark species, and interspecies relationships (e.g., following bait balls), potentially offering new ways of identifying (and ultimately preventing) what triggers attacks on humans.
The Sharkeye system brings Industry 4.0 to shark spotting and sits at the flourishing nexus of drones, sensors, the Internet of Things (IoT), AI, and edge and cloud computing. However, our system has the potential to go well beyond looking at specific animals; it can be used as an effective tool for identifying other threats to assist in beach management. Further data collection and algorithm development could enable a system that monitors beach usage and safety (such as swimmers in trouble), identifies rips and other conditions, and warns beachgoers, as well as monitoring pollution and water and air quality, all while simultaneously offering a platform for real-time alerting to the presence of sharks. We believe widespread deployment is truly feasible if the platform provides a range of enhancements for public safety and integrates as a tool for lifeguards/lifesavers, councils, and other stakeholders alike. This strategy would reduce investment risk and provide greater returns than resourcing discrete, piecemeal shark-specific technologies at individual beaches. We strongly caution that although automation shows promise for providing safer beaches, it should be seen as a tool for lifeguards/lifesavers rather than as a replacement. If a shark is sighted from the blimp, professionals must be notified via smartwatch alerts to verify and respond. We would also like to see detection algorithms tested in continuous and varied real-life conditions to verify their reliability; this is essential to ensure that the public has confidence in an automated system working alongside human observers.

Supplementary Materials

The following are available online at https://www.mdpi.com/2504-446X/4/2/18/s1, Video S1: Kiama Field Testing Overhead Perspective; Video S2: Kiama Field Testing Sea View Perspective; Video S3: Kiama Field Testing Side Perspective; Video S4: Kiama Field Testing Top Aerial Perspective

Author Contributions

Conceptualization, R.G., K.A., M.J.B., and W.L.; methodology, K.A., M.J.B., W.L., and R.G.; software, M.J.B., S.A., J.B., and W.L.; validation, R.G., K.A., M.J.B., and S.A.; formal analysis, W.L., M.J.B., and J.B.; investigation, R.G., K.A., M.J.B., S.A., and W.L.; resources, K.A., M.J.B., J.B., and R.G.; data curation, M.J.B., S.A., and W.L.; writing—original draft preparation, R.G.; writing—review and editing, R.G., K.A., J.B., A.R.D., S.A., and W.L.; visualization, R.G.; supervision, A.R.D. and W.L.; project administration, R.G.; funding acquisition, R.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the NSW Department of Primary Industries (DPI), Shark Management Strategy in NSW Australia.

Acknowledgments

This project has drawn heavily on expertise, techniques and equipment developed as a direct result of financial and in-kind support by a variety of organisations. Kye Adams and Andy Davis wish to acknowledge financial and in-kind support from Kiama Municipal Council, the Department of Primary Industries, the Holsworth Research Endowment, Skipp Surfboards, the Save Our Seas Foundation and the Global Challenges Program at the University of Wollongong. For assistance in digitizing many thousands of images to develop training datasets for previous algorithms, we thank Douglas Reeves and David Ruiz Garcia. We’d also like to thank Douglas Reeves and Allison Broad for assistance in field testing. Sam Aubin would like to acknowledge the support from David Powter, Ann Sudmalis MP, Hon Arthur Sinodinos AO, Matt Kean MP, Gareth Ward MP, Gavin McClure, Kate McNiven, Jean Burton, Kiama Business Chamber and Kiama Council. Wanqing Li wishes to thank the visiting PhD student, Fei (Sheldon) Li, from Northwestern Polytechnical University, China, for his direct contribution to the project. Johan Barthelemy would like to recognize the support for the Digital Living Lab, Benoit Passot and Nicolas Veerstavel for assistance with the IoT Gateway deployment, development, and related testing during the project. Matthew Berryman would like to recognize the support from the Amazon Web Services Cloud Credits for Research program. Robert Gorkin would like to thank Rylan Loemker from Opterra for assistance with drone testing and Kihan Garcia from Rise Above for drone expertise. Robert would also like to thank the Bellambi SLSC for training exposure during the project.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

  1. Yearly Worldwide Shark Attack Summary. Available online: https://www.floridamuseum.ufl.edu/shark-attacks/yearly-worldwide-summary/ (accessed on 8 March 2020).
  2. Simmons, P.; Mehmet, M.I. Shark management strategy policy considerations: Community preferences, reasoning and speculations. Mar. Policy 2018, 96, 111–119. [Google Scholar] [CrossRef]
  3. NSW. Shark Meshing (Bather Protection) Program; NSW Department of Primary Industries: Orange, Australia, 2015; p. 4. [Google Scholar]
  4. Davis, A.R.; Broad, A. Tightening the net: In search of alternatives to destructive fishing and meshing programs. Available online: http://drainmag.com/junk-ocean/ (accessed on 9 March 2020).
  5. Gibbs, L.; Warren, A. Transforming shark hazard policy: Learning from ocean-users and shark encounter in Western Australia. Mar. Policy 2015, 58, 116–124. [Google Scholar] [CrossRef] [Green Version]
  6. Mitchell, H. Report on November Meeting: Australian Aerial Patrol; Illawarra Historical Society Inc.: Wollongong, Australia, 1997; p. 3. [Google Scholar]
  7. Surf Live Saving Western Australia Shark Safety. Available online: https://www.mybeach.com.au/safety-rescue-services/beach-safety/shark-safety/ (accessed on 8 March 2020).
  8. Robbins, W.D.; Peddemors, V.M.; Kennelly, S.J. Assessment of Shark Sighting Rates by Aerial Beach Patrols; Cronulla Fisheries Research Centre of Excellence: Cronulla, Australia, 2020; p. 40. [Google Scholar]
  9. Leadbitter, D. Submission to the Senate Inquiry into Shark Mitigation and Deterrent Measures; NSW: Orange, Australia, 2017. [Google Scholar]
  10. Lowe, M.K.; Adnan, F.A.F.; Hamylton, S.M.; Carvalho, R.C.; Woodroffe, C.D. Assessing Reef-Island Shoreline Change Using UAV-Derived Orthomosaics and Digital Surface Models. Drones 2019, 3, 44. [Google Scholar] [CrossRef] [Green Version]
  11. Casella, E.; Drechsel, J.; Winter, C.; Benninghoff, M.; Rovere, A. Accuracy of sand beach topography surveying by drones and photogrammetry. Geo. Mar. Lett. 2020, 1–14. [Google Scholar] [CrossRef] [Green Version]
  12. Jiménez López, J.; Mulero-Pázmány, M. Drones for Conservation in Protected Areas: Present and Future. Drones 2019, 3, 10. [Google Scholar] [CrossRef] [Green Version]
  13. Harris, J.M.; Nelson, J.A.; Rieucau, G.; Broussard, W.P. Use of Drones in Fishery Science. Trans. Amer. Fish. Soc. 2019, 148, 687–697. [Google Scholar] [CrossRef]
  14. Colefax, A.P.; Butcher, P.A.; Kelaher, B.P. The potential for unmanned aerial vehicles (UAVs) to conduct marine fauna surveys in place of manned aircraft. Ices J. Mar. Sci. 2018, 75, 1–8. [Google Scholar] [CrossRef]
  15. Kelaher, B.P.; Colefax, A.P.; Tagliafico, A.; Bishop, M.J.; Giles, A.; Butcher, P.A. Assessing variation in assemblages of large marine fauna off ocean beaches using drones. Mar. Freshw. Res. 2019, 71, 68–77. [Google Scholar] [CrossRef] [Green Version]
  16. Raoult, V.; Gaston, T.F. Rapid biomass and size-frequency estimates of edible jellyfish populations using drones. Fish. Res. 2018, 207, 160–164. [Google Scholar] [CrossRef]
  17. Schofield, G.; Esteban, N.; Katselidis, K.A.; Hays, G.C. Drones for research on sea turtles and other marine vertebrates—A review. Biol. Conserv. 2019, 238, 108214. [Google Scholar] [CrossRef]
  18. Rees, A.; Avens, L.; Ballorain, K.; Bevan, E.; Broderick, A.; Carthy, R.; Christianen, M.; Duclos, G.; Heithaus, M.; Johnston, D.; et al. The potential of unmanned aerial systems for sea turtle research and conservation: A review and future directions. Endang. Species. Res. 2018, 35, 81–100. [Google Scholar] [CrossRef] [Green Version]
  19. Schofield, G.; Katselidis, K.A.; Lilley, M.K.S.; Reina, R.D.; Hays, G.C. Detecting elusive aspects of wildlife ecology using drones: New insights on the mating dynamics and operational sex ratios of sea turtles. Funct. Ecol. 2017, 31, 2310–2319. [Google Scholar] [CrossRef]
  20. Bevan, E.; Whiting, S.; Tucker, T.; Guinea, M.; Raith, A.; Douglas, R. Measuring behavioral responses of sea turtles, saltwater crocodiles, and crested terns to drone disturbance to define ethical operating thresholds. PLoS ONE 2018, 13. [Google Scholar] [CrossRef] [PubMed]
  21. Fettermann, T.; Fiori, L.; Bader, M.; Doshi, A.; Breen, D.; Stockin, K.A.; Bollard, B. Behaviour reactions of bottlenose dolphins (Tursiops truncatus) to multirotor Unmanned Aerial Vehicles (UAVs). Sci. Rep. 2019, 9, 1–9. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  22. Fiori, L.; Martinez, E.; Bader, M.K.-F.; Orams, M.B.; Bollard, B. Insights into the use of an unmanned aerial vehicle (UAV) to investigate the behavior of humpback whales (Megaptera novaeangliae) in Vava’u, Kingdom of Tonga. Mar. Mammal Sci. 2020, 36, 209–223. [Google Scholar] [CrossRef]
  23. Horton, T.W.; Hauser, N.; Cassel, S.; Klaus, K.F.; Fettermann, T.; Key, N. Doctor Drone: Non-invasive Measurement of Humpback Whale Vital Signs Using Unoccupied Aerial System Infrared Thermography. Front. Mar. Sci. 2019, 6. [Google Scholar] [CrossRef] [Green Version]
  24. Subhan, B.; Arafat, D.; Santoso, P.; Pahlevi, K.; Prabowo, B.; Taufik, M.; Kusumo, B.S.; Awak, K.; Khaerudi, D.; Ohoiulun, H.; et al. Development of observing dolphin population method using Small Vertical Take-off and Landing (VTOL) Unmanned Aerial System (AUV). Iop Conf. Ser. Earth Environ. Sci. 2019, 278, 012074. [Google Scholar] [CrossRef]
Figure 1. Left: Image of the inflated blimp, Project Aerial Inflatable Remote Shark–Human Interaction Prevention (Project AIRSHIP), with ground crew and tether at Kiama Surf Beach, NSW (May 2018). Right: Typical view from the blimp camera over Kiama. Inset: the image collection system with power supply, camera on gimbal, communication system, and custom mounting. The system allows operators on the beach to control the camera's direction and zoom. Once in the air, the video feed is passed to computers on the beach.
Figure 2. Outline of the algorithm training steps. First, the original image is pre-processed into specific feature areas. Second, those sections are run through You Only Look Once (YOLO)-like object detection processing (including a series of convolutional and pooling layers). Third, a series of localized detection results is obtained. Finally, these results are post-processed to form the resulting object detection for analysis.
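For readers who wish to reproduce the flavour of this detection step, the following is a minimal sketch of a YOLO-style inference pass, assuming OpenCV's DNN module and generic Darknet configuration and weight files; the file names and thresholds are placeholders, not the trained Sharkeye model.

    # Minimal YOLO-style detection sketch (placeholder model files, not
    # the Sharkeye-trained network).
    import cv2
    import numpy as np

    net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")
    layer_names = net.getUnconnectedOutLayersNames()

    def detect(frame, conf_threshold=0.5, nms_threshold=0.4):
        h, w = frame.shape[:2]
        # Pre-process: resize/normalise the frame into the network's input blob.
        blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416),
                                     swapRB=True, crop=False)
        net.setInput(blob)
        outputs = net.forward(layer_names)

        boxes, confidences, class_ids = [], [], []
        for output in outputs:
            for det in output:
                scores = det[5:]
                class_id = int(np.argmax(scores))
                confidence = float(scores[class_id])
                if confidence > conf_threshold:
                    # Darknet outputs normalised centre-x, centre-y, width, height.
                    cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
                    boxes.append([int(cx - bw / 2), int(cy - bh / 2),
                                  int(bw), int(bh)])
                    confidences.append(confidence)
                    class_ids.append(class_id)

        # Post-process: non-maximum suppression merges overlapping localized
        # detections into the final reported objects.
        keep = cv2.dnn.NMSBoxes(boxes, confidences, conf_threshold, nms_threshold)
        return [(class_ids[i], confidences[i], boxes[i])
                for i in np.array(keep).flatten()]

The non-maximum suppression call corresponds to the post-processing stage in the figure, collapsing the series of localized detections into single objects for analysis.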
Figure 3. Diagram of the data flow for the Sharkeye system. (1) The blimp-mounted camera sends video to the ground station. (2) The ground station automatically converts the videos for upload to the cloud. (3) Once uploaded to the cloud service, the data is stored, evaluated by the detection algorithm (hosted on a virtual server) via the image recognition model, and the results are prepared by a messaging script. (4) Alerts are then passed from the cloud to smartwatches and smartphones.
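The cloud-side portion of this flow (steps 3 and 4) can be summarised in a short sketch. Here run_detector and push_alert are hypothetical stand-ins for the hosted model and the messaging script, and the storage URI is illustrative; the actual services used are not specified here.

    # Sketch of the cloud-side flow: upload event -> detection -> alert.
    import json
    from typing import Dict, List

    def run_detector(video_uri: str) -> List[Dict]:
        """Stand-in for the image recognition model on the virtual server."""
        return [{"label": "shark_analogue", "confidence": 0.87, "t": 12.4}]

    def push_alert(payload: str) -> None:
        """Stand-in for the messaging script that notifies watches and phones."""
        print("ALERT ->", payload)

    def on_video_uploaded(video_uri: str) -> None:
        # Triggered when the ground station's upload lands in cloud storage.
        for det in run_detector(video_uri):
            push_alert(json.dumps({"label": det["label"],
                                   "confidence": det["confidence"],
                                   "time_offset_s": det["t"]}))

    on_video_uploaded("s3://bucket/clips/kiama_001.mp4")  # hypothetical URI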
Figure 4. Images of the testing protocol. Top Left: View from the ground of the blimp deployed. Top Middle: Shark analogue used for testing. Top Right: View of the beach from the blimp. Bottom Left: View from ground near the Kiama surf club. Bottom Right: Image of team performing testing at Kiama Surf Beach.
Figure 5. Examples of the user interface (UI) of the Sharkmate app alerts on a smartwatch and a smartphone. The 'confidence danger' measure gauged how confident the deep learning neural network was in the detection event observed.
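As an illustration only, a 'confidence danger' gauge of this kind could be derived from the network's raw detection confidence as in the sketch below; the class names and thresholds are assumptions for the example, not the values used in the Sharkmate app.

    # Illustrative mapping from detection confidence to a danger level
    # (thresholds and labels are assumed, not the app's actual values).
    def confidence_danger(label: str, confidence: float) -> str:
        if label != "shark":
            return "info"        # rays, surfers, etc. are reported, not alarming
        if confidence >= 0.75:
            return "high"
        if confidence >= 0.5:
            return "medium"
        return "low"

    print(confidence_danger("shark", 0.82))  # -> "high"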
Figure 6. Top: Detection of our shark analogue. Middle: Through a fortuitous encounter with a ray, we were able to test the system observing and detecting a real animal during the trial. Bottom: Several surfers were out enjoying the waves, and we used the opportunity to test the algorithm with success. Additionally, a stand-up paddleboarder (SUP) went out to test the algorithm. Insets: Sample screenshots of alerts displayed on the smartwatches.
Figure 7. Top left: Screenshots of the watch emergency notification system from the perspective of a person in the water. Top right: Image of team members, Robert (Surf Life Saving Australia, Bellambi) and Kye (Professional Lifeguard, Kiama City Beach), testing the notification on the lifesaver's watch when the emergency system was activated. Bottom left: Picture of a surfer in the water activating the emergency SOS. Bottom right: Demonstration of a swimmer in trouble; a team member was deployed into the surf, the SOS was activated, and a mock rescue was completed to bring the swimmer back to shore.
Figure 8. Top Left: IoT tracking on The Things Network. Top Middle: the IoT gateway packaged with a battery and secured to the blimp. Top Right: Live view of the gateway launched on the blimp and connected to the network during testing at Kiama. Bottom: Landscape view of the tracking area. The blimp can be seen at the upper left of the picture (arrowed); the small dot in the middle is a lifeguard on a paddleboard carrying the beacon (arrowed). Tracking was achieved out to a shark sonar buoy ~1 km offshore.
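For context on how such beacon positions reach shore-side software, the following is a minimal sketch of subscribing to uplinks from The Things Network over MQTT, assuming the paho-mqtt client and a TTN (v2-era) application whose payload decoder emits latitude/longitude fields; the application ID, access key, and broker host are placeholders, not the project's actual credentials.

    # Sketch: receive LoRaWAN beacon positions from The Things Network (v2 MQTT).
    import json
    import paho.mqtt.client as mqtt

    APP_ID, ACCESS_KEY = "sharkeye-app", "ttn-account-v2.XXXX"  # placeholders

    def on_message(client, userdata, msg):
        uplink = json.loads(msg.payload)
        fields = uplink.get("payload_fields", {})  # output of the TTN payload decoder
        print(f"{uplink['dev_id']}: lat={fields.get('latitude')}, "
              f"lon={fields.get('longitude')}")

    client = mqtt.Client()
    client.username_pw_set(APP_ID, ACCESS_KEY)
    client.on_message = on_message
    client.connect("eu.thethings.network", 1883)   # region broker is an assumption
    client.subscribe("+/devices/+/up")             # all device uplinks for the app
    client.loop_forever()

Because the gateway itself flew on the blimp, any LoRaWAN device within range, such as the lifeguard's paddleboard beacon in the figure, could report positions through the network on demand.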

