
IoT-Driven Bioacoustics Sensing

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Internet of Things".

Deadline for manuscript submissions: closed (10 March 2024) | Viewed by 8553

Special Issue Editors


Guest Editor
Department of Computing and Communications, Lancaster University, Lancaster LA1 4YW, UK
Interests: Internet of Things; fault-tolerant computing; software architecture; embedded software engineering; machine learning

Guest Editor
Computer Science, The British University in Dubai, Block 11, 1st and 2nd Floor, Dubai International Academic City P.O. Box 345015, Dubai, United Arab Emirates
Interests: Internet of Things; Systems of Systems engineering; requirements engineering for Systems of Systems; cyber-physical Systems of Systems; Systems Security Engineering (SSE) with a particular focus on securing smart cities

Special Issue Information

Dear Colleagues,

The Internet of Things (IoT) has brought automation and control solutions to almost every field, both technical and non-technical. In recent years, low-cost IoT devices and machine learning have been combined with acoustic sensing to monitor biodiversity. IoT-driven bioacoustics captures and classifies acoustic signals and temporal metadata as a way of identifying and studying the behaviour of species. The technology is non-invasive and is particularly useful in studies that involve remote analysis or visually inaccessible areas.

Animal and insect acoustic signals contain species-specific information that can be used to infer the character, state, and behaviour of a species in different biological contexts. For example, insects use acoustic signals to attract mates, send warnings, trigger defensive responses, and direct foragers to food sources. IoT-driven bioacoustics has applications in medical science, agriculture, biodiversity, pest management, and marine mammal acoustics, with numerous overlaps between these areas. Its multi-disciplinary nature offers the potential for new and exciting research.

This Special Issue will collect original research papers and novel insights into the state, modelling, design, implementation, sustainability, and case-study experience of IoT-driven bioacoustic techniques and systems.

Dr. Gerald Kotonya
Dr. Cornelius Ncube
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Internet of Things
  • bioacoustics
  • machine learning
  • species
  • modelling
  • design
  • sustainability
  • implementation
  • case studies

Published Papers (2 papers)


Research

16 pages, 4097 KiB  
Article
DualDiscWaveGAN-Based Data Augmentation Scheme for Animal Sound Classification
by Eunbeen Kim, Jaeuk Moon, Jonghwa Shim and Eenjun Hwang
Sensors 2023, 23(4), 2024; https://doi.org/10.3390/s23042024 - 10 Feb 2023
Cited by 3 | Viewed by 4204
Abstract
Animal sound classification (ASC) refers to the automatic identification of animal categories by sound and is useful for monitoring rare or elusive wildlife. Thus far, deep-learning-based models have shown good performance in ASC when training data are sufficient, but they suffer from severe performance degradation when data are scarce. Recently, generative adversarial networks (GANs) have shown the potential to solve this problem by generating virtual data. However, in a multi-class environment, existing GAN-based methods need to construct separate generative models for each class. Additionally, they consider only the waveform or the spectrogram of a sound, resulting in poor quality of the generated sound. To overcome these shortcomings, we propose a two-step sound augmentation scheme using a class-conditional GAN. First, common features are learned from all classes of animal sounds, and multiple classes of animal sounds are generated with a class-conditional GAN that considers both waveforms and spectrograms. Second, we select samples from the generated data based on the confidence of a pretrained ASC model to improve classification performance. Through experiments, we show that the proposed method improves the accuracy of the basic ASC model by up to 18.3%, which corresponds to a performance improvement of 13.4% over the second-best augmentation method.
(This article belongs to the Special Issue IoT-Driven Bioacoustics Sensing)
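The second step of the scheme described in the abstract above, keeping only those generated clips that a pretrained classifier assigns high confidence to in their intended class, can be sketched as follows. This is an illustrative sketch, not the paper's implementation: the `select_confident_samples` helper, the softmax inputs, and the 0.9 threshold are all assumptions.

```python
import numpy as np

def select_confident_samples(probs, labels, threshold=0.9):
    """Keep generated samples whose classifier confidence in their
    intended class meets the threshold (hypothetical criterion; the
    paper's exact selection rule may differ).

    probs:  (N, C) softmax outputs of a pretrained ASC model
    labels: (N,)   intended class index of each generated sample
    """
    # Confidence of each sample in the class it was generated for
    conf = probs[np.arange(len(labels)), labels]
    return conf >= threshold

# Toy example: classifier outputs for four generated clips (3 classes)
probs = np.array([
    [0.95, 0.03, 0.02],   # confidently class 0 -> kept
    [0.40, 0.35, 0.25],   # uncertain           -> dropped
    [0.05, 0.92, 0.03],   # confidently class 1 -> kept
    [0.10, 0.20, 0.70],   # below threshold     -> dropped
])
labels = np.array([0, 0, 1, 2])
mask = select_confident_samples(probs, labels, threshold=0.9)
print(mask.tolist())  # [True, False, True, False]
```

Only the surviving samples would then be added to the training set of the downstream classifier.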

26 pages, 2771 KiB  
Article
A Review of Automated Bioacoustics and General Acoustics Classification Research
by Leah Mutanu, Jeet Gohil, Khushi Gupta, Perpetua Wagio and Gerald Kotonya
Sensors 2022, 22(21), 8361; https://doi.org/10.3390/s22218361 - 31 Oct 2022
Cited by 4 | Viewed by 3872
Abstract
Automated bioacoustics classification has received increasing attention from the research community in recent years due to its cross-disciplinary nature and its diverse applications. These range from smart acoustic sensor networks that investigate the effects of acoustic vocalizations on species to context-aware edge devices that anticipate changes in their environment and adapt their sensing and processing accordingly. The research described here is an in-depth survey of the current state of bioacoustics classification and monitoring. The survey examines bioacoustics classification alongside general acoustics to provide a representative picture of the research landscape, reviewing 124 studies spanning eight years of research. It identifies the key application areas in bioacoustics research and the techniques used in audio transformation and feature extraction, examines the classification algorithms used in bioacoustics systems, and closes with current challenges, possible opportunities, and future directions in bioacoustics.
(This article belongs to the Special Issue IoT-Driven Bioacoustics Sensing)
