Open Access Article

On the Use of the AIRA-UAS Corpus to Evaluate Audio Processing Algorithms in Unmanned Aerial Systems

1 Instituto de Investigaciones en Matemáticas Aplicadas y en Sistemas, Universidad Nacional Autónoma de México (UNAM), Mexico City 04510, Mexico
2 Computer Science Department, Instituto Nacional de Astrofísica, Óptica y Electrónica (INAOE), Puebla 72840, Mexico
3 Computer Science Department, University of Bristol, Bristol BS8 1UB, UK
* Author to whom correspondence should be addressed.
This paper is an extended version of our paper published in Ruiz-Espitia, O.; Martinez-Carranza, J.; Rascon, C. AIRA-UAS: An Evaluation Corpus for Audio Processing in Unmanned Aerial System. In Proceedings of the 2018 International Conference on Unmanned Aircraft Systems (ICUAS), Dallas, TX, USA, 12–15 June 2018; pp. 836–845.
Sensors 2019, 19(18), 3902; https://doi.org/10.3390/s19183902
Received: 27 July 2019 / Revised: 3 September 2019 / Accepted: 6 September 2019 / Published: 10 September 2019
Audio analysis over Unmanned Aerial Systems (UAS) is of interest, as it is an essential step for on-board sound source localization and separation. This could be useful for search-and-rescue operations, as well as for the detection of unauthorized drone operations. In this paper, an analysis of the previously introduced Acoustic Interactions for Robot Audition (AIRA)-UAS corpus is presented, which is a set of recordings produced by the ego-noise of a drone performing different aerial maneuvers and by other drones flying nearby. It was found that the recordings have a very low Signal-to-Noise Ratio (SNR), that the noise is dynamic and depends on the drone’s movements, and that the noise signatures of different drones are highly correlated. Three popular filtering techniques were evaluated in this work in terms of noise reduction and signature extraction: Berouti’s Non-Linear Noise Subtraction, Adaptive Quantile-Based Noise Estimation, and Improved Minima Controlled Recursive Averaging. Although there was moderate success in noise reduction, no filter was able to keep the signature of the drone flying in parallel intact. These results are evidence of the challenge of audio processing over drones, implying that this is a field ripe for further research.
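As an illustration of the kind of filtering evaluated in the paper, the following is a minimal Python sketch of a Berouti-style non-linear spectral subtraction. The function name, STFT parameters, over-subtraction factor (alpha), spectral floor (beta), and the assumption that the first few frames contain only ego-noise are illustrative choices, not the configuration used in the paper.

import numpy as np
from scipy.signal import stft, istft

def berouti_spectral_subtraction(noisy, fs, noise_frames=10, alpha=4.0, beta=0.01):
    # Estimate the noise power spectrum from the first `noise_frames`
    # STFT frames, which are assumed to contain only drone ego-noise.
    f, t, Y = stft(noisy, fs=fs, nperseg=1024)
    noise_psd = np.mean(np.abs(Y[:, :noise_frames]) ** 2, axis=1, keepdims=True)

    # Over-subtract the noise power (factor alpha), then clamp to a
    # spectral floor (beta * noise power) to avoid negative estimates.
    clean_psd = np.maximum(np.abs(Y) ** 2 - alpha * noise_psd, beta * noise_psd)

    # Recombine the estimated magnitude with the noisy phase and invert.
    S_hat = np.sqrt(clean_psd) * np.exp(1j * np.angle(Y))
    _, enhanced = istft(S_hat, fs=fs, nperseg=1024)
    return enhanced

Roughly speaking, Adaptive Quantile-Based Noise Estimation and Improved Minima Controlled Recursive Averaging differ from this sketch mainly in how the noise power estimate is tracked over time (quantile-based and recursive-averaging estimators, respectively), rather than in the subtraction step itself.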
Keywords: corpus evaluation; AIRA-UAS; AQBNE; IMCRA; unmanned aerial systems
Rascon, C.; Ruiz-Espitia, O.; Martinez-Carranza, J. On the Use of the AIRA-UAS Corpus to Evaluate Audio Processing Algorithms in Unmanned Aerial Systems. Sensors 2019, 19, 3902.
