Open Access Article
Sensors 2017, 17(5), 1138; doi:10.3390/s17051138

Detecting and Classifying Human Touches in a Social Robot Through Acoustic Sensing and Machine Learning

Robotics Laboratory, Universidad Carlos III de Madrid, Av. de la Universidad 30, Leganés, 28911 Madrid, Spain
* Author to whom correspondence should be addressed.
Academic Editors: Xiaoning Jiang and Chao Zhang
Received: 15 March 2017 / Revised: 9 May 2017 / Accepted: 10 May 2017 / Published: 16 May 2017
(This article belongs to the Special Issue Acoustic Sensing and Ultrasonic Drug Delivery)

Abstract

An important aspect of Human–Robot Interaction is responding to different kinds of touch stimuli. To date, several technologies have been explored to determine how a touch is perceived by a social robot, usually by placing a large number of sensors throughout the robot’s shell. In this work, we introduce a novel approach in which the audio acquired from contact microphones located in the robot’s shell is processed using machine learning techniques to distinguish between different types of touches. The system is able to determine when the robot is touched (touch detection) and to ascertain the kind of touch performed among a set of possibilities: stroke, tap, slap, and tickle (touch classification). This proposal is cost-effective: a single microphone is enough to cover each solid part of the robot, so just a few microphones can cover the whole shell. In addition, the system is easy to install and configure, as it only requires attaching a microphone to a contact surface on the robot’s shell and plugging it into the robot’s computer. Results show high accuracy in touch gesture recognition: in the testing phase, Logistic Model Trees achieved the best performance, with an F-score of 0.81. The dataset was built with information from 25 participants performing a total of 1981 touch gestures.
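To illustrate the kind of pipeline the abstract describes, the sketch below extracts simple frame-level features from a mono audio signal and trains a tree classifier on labelled gesture frames. It is a minimal example under stated assumptions, not the authors’ implementation: the paper uses Logistic Model Trees (a Weka classifier), for which scikit-learn’s DecisionTreeClassifier stands in here, and the two features (log-energy and zero-crossing rate), the synthetic data, and the labels are illustrative placeholders.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier  # stand-in for Logistic Model Trees (Weka)
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

GESTURES = ["stroke", "tap", "slap", "tickle"]  # gesture classes from the paper

def frame_features(audio, frame_len=1024):
    """Split a mono signal into fixed-length frames and compute two simple
    features per frame: log-energy and zero-crossing rate. The real system
    would use a richer acoustic feature set."""
    n = len(audio) // frame_len
    frames = audio[: n * frame_len].reshape(n, frame_len)
    energy = np.log(np.sum(frames ** 2, axis=1) + 1e-12)
    zcr = np.mean(np.abs(np.diff(np.sign(frames), axis=1)) > 0, axis=1)
    return np.column_stack([energy, zcr])

# Synthetic placeholder data; in practice X would come from
# frame_features(contact_microphone_recording) and y from annotated gestures.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 2))
y = rng.integers(0, len(GESTURES), 2000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = DecisionTreeClassifier(max_depth=8).fit(X_tr, y_tr)
print("weighted F-score:", f1_score(y_te, clf.predict(X_te), average="weighted"))
```

With real recordings, touch detection would first gate on frames whose energy exceeds a background threshold, and only those frames would be passed to the classifier for the four-way gesture decision.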
Keywords: acoustic sensing; touch interaction; contact microphone; human–robot interaction; machine learning
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Alonso-Martín, F.; Gamboa-Montero, J.J.; Castillo, J.C.; Castro-González, Á.; Salichs, M.Á. Detecting and Classifying Human Touches in a Social Robot Through Acoustic Sensing and Machine Learning. Sensors 2017, 17, 1138.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
