Sensors 2014, 14(6), 9522-9545; doi:10.3390/s140609522
Article

Audio-Visual Perception System for a Humanoid Robotic Head

Raquel Viciana-Abad 1,*, Rebeca Marfil 2, Jose M. Perez-Lorenzo 1, Juan P. Bandera 2, Adrian Romero-Garces 2 and Pedro Reche-Lopez 1
Received: 28 December 2013; in revised form: 7 May 2014 / Accepted: 20 May 2014 / Published: 28 May 2014
(This article belongs to the Special Issue State-of-the-Art Sensors Technology in Spain 2013)
Abstract: One of the main challenges in social robotics is to endow robots with the ability to direct their attention to the people with whom they interact. Several approaches follow bio-inspired mechanisms that merge audio and visual cues to localize a person using multiple sensors. However, most of these fusion mechanisms have been applied to fixed systems, such as those used in video-conference rooms, and may therefore run into difficulties when restricted to the sensors that a robot can carry. Moreover, within the scope of interactive autonomous robots, the benefits of audio-visual attention mechanisms over audio-only or visual-only approaches have rarely been evaluated in real scenarios; most tests have been carried out in controlled environments, at short distances and/or with off-line performance measurements. With the goal of demonstrating the benefit of fusing sensory information through Bayesian inference for interactive robotics, this paper presents a system that localizes a person by processing visual and audio data. The performance of this system is evaluated and compared while taking into account the technical limitations of unimodal systems. The experiments show the promise of the proposed approach for the proactive detection and tracking of speakers in a human-robot interaction framework.
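As a rough illustration of the kind of Bayesian audio-visual fusion the abstract refers to, the sketch below combines a coarse audio direction-of-arrival estimate with a more precise but narrower visual detection over a grid of azimuth bins. The angular grid, Gaussian sensor models and noise widths are illustrative assumptions, not the configuration of the system described in the paper.

```python
import numpy as np

# Candidate speaker directions in front of the robot head (degrees).
# Grid resolution and range are illustrative assumptions.
AZIMUTHS = np.linspace(-90.0, 90.0, 181)

def gaussian_likelihood(measured_deg, sigma_deg):
    """Likelihood of each azimuth bin given one noisy direction measurement."""
    lik = np.exp(-0.5 * ((AZIMUTHS - measured_deg) / sigma_deg) ** 2)
    return lik / lik.sum()

def fuse(prior, audio_deg=None, visual_deg=None,
         audio_sigma=15.0, visual_sigma=5.0):
    """Posterior over azimuth: prior times whichever sensor likelihoods are available."""
    posterior = prior.copy()
    if audio_deg is not None:                 # audio cue: coarse but omnidirectional
        posterior *= gaussian_likelihood(audio_deg, audio_sigma)
    if visual_deg is not None:                # visual cue: precise but limited field of view
        posterior *= gaussian_likelihood(visual_deg, visual_sigma)
    return posterior / posterior.sum()

if __name__ == "__main__":
    prior = np.full_like(AZIMUTHS, 1.0 / AZIMUTHS.size)   # uniform prior over directions
    posterior = fuse(prior, audio_deg=30.0, visual_deg=25.0)
    print("fused estimate: %.1f deg" % AZIMUTHS[posterior.argmax()])
```

In this toy setup, the fused posterior concentrates near the visual detection while the audio cue keeps the estimate robust when the person leaves the camera's field of view, which is the qualitative benefit of multimodal fusion that the paper evaluates.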
Keywords: multimodal perception; bio-inspired attention mechanism; human-robot interaction
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.



MDPI and ACS Style

Viciana-Abad, R.; Marfil, R.; Perez-Lorenzo, J.M.; Bandera, J.P.; Romero-Garces, A.; Reche-Lopez, P. Audio-Visual Perception System for a Humanoid Robotic Head. Sensors 2014, 14, 9522-9545.

AMA Style

Viciana-Abad R, Marfil R, Perez-Lorenzo JM, Bandera JP, Romero-Garces A, Reche-Lopez P. Audio-Visual Perception System for a Humanoid Robotic Head. Sensors. 2014; 14(6):9522-9545.

Chicago/Turabian Style

Viciana-Abad, Raquel; Marfil, Rebeca; Perez-Lorenzo, Jose M.; Bandera, Juan P.; Romero-Garces, Adrian; Reche-Lopez, Pedro. 2014. "Audio-Visual Perception System for a Humanoid Robotic Head." Sensors 14, no. 6: 9522-9545.

