Open Access: This article is freely available.
User Localization During Human-Robot Interaction
Robotics Lab, Universidad Carlos III de Madrid, Av. de la Universidad 30, 28911 Leganés, Madrid, Spain
* Author to whom correspondence should be addressed.
Received: 8 June 2012; in revised form: 6 July 2012 / Accepted: 11 July 2012 / Published: 23 July 2012
Abstract: This paper presents a user localization system based on the fusion of visual information and sound source localization, implemented on the social robot Maggie. A prerequisite for natural interaction, whether human-human or human-robot, is an adequate spatial arrangement between the interlocutors; that is, each party must be oriented toward the other and situated at an appropriate distance for the communicative process to be satisfactory. Our social robot uses a complete multimodal dialog system that manages the user-robot interaction during the communicative process, and the user localization system presented here is one of its main components. Determining the most suitable position of the robot relative to the user requires a proxemic study of human-robot interaction, which is described in this paper. The study was conducted with two groups of users: children, aged between 8 and 17, and adults. Finally, experimental results with the proposed multimodal dialog system are presented.
Keywords: sound source localization; robot audition; social robot; array-microphone; phonotaxis; proxemics; dialog system
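As an illustrative sketch only (not taken from the paper), the proxemics idea in the abstract — positioning the robot at an appropriate conversational distance — can be expressed by mapping Hall's classic proxemic zones to a target standoff distance. The zone boundaries below are common textbook approximations, not values reported by the authors:

```python
# Hall's proxemic zones (meters); boundary values are conventional
# approximations, not figures from this paper.
PROXEMIC_ZONES = {
    "intimate": (0.0, 0.45),
    "personal": (0.45, 1.2),
    "social": (1.2, 3.6),
    "public": (3.6, 7.6),
}

def classify_distance(d: float) -> str:
    """Classify a measured user distance into a proxemic zone."""
    for zone, (lo, hi) in PROXEMIC_ZONES.items():
        if lo <= d < hi:
            return zone
    return "public"

def target_distance(zone: str) -> float:
    """Return the midpoint of a zone as a simple target standoff."""
    lo, hi = PROXEMIC_ZONES[zone]
    return (lo + hi) / 2

print(classify_distance(1.0))       # a user at 1.0 m is in the personal zone
print(target_distance("social"))    # midpoint of the social zone
```

A real system, such as the one described in the paper, would feed the fused audio-visual localization estimate into this kind of decision to orient and position the robot.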
Cite This Article
MDPI and ACS Style
Alonso-Martín, F.; Gorostiza, J.F.; Malfaz, M.; Salichs, M.A. User Localization During Human-Robot Interaction. Sensors 2012, 12, 9913-9935.
AMA Style
Alonso-Martín F, Gorostiza JF, Malfaz M, Salichs MA. User Localization During Human-Robot Interaction. Sensors. 2012; 12(7):9913-9935.
Chicago/Turabian Style
Alonso-Martín, F.; Gorostiza, Javier F.; Malfaz, María; Salichs, Miguel A. 2012. "User Localization During Human-Robot Interaction." Sensors 12, no. 7: 9913-9935.