Open Access Article
Sensors 2017, 17(11), 2585;

Textile Pressure Mapping Sensor for Emotional Touch Detection in Human-Robot Interaction

German Research Center for Artificial Intelligence, 67663 Kaiserslautern, Germany
Department of Computer Science, University of Kaiserslautern, 67663 Kaiserslautern, Germany
Swedish School of Textiles, University of Borås, 50190 Borås, Sweden
School of Informatics, University of Skövde, 54128 Skövde, Sweden
Institute for Clinical Science, Intervention and Technology, Karolinska Institutet, 17177 Stockholm, Sweden
Department of Biomedical Engineering, Karolinska University Hospital, 14186 Stockholm, Sweden
Author to whom correspondence should be addressed.
Received: 30 September 2017 / Revised: 1 November 2017 / Accepted: 6 November 2017 / Published: 9 November 2017
(This article belongs to the Special Issue Tactile Sensors and Sensing)


In this paper, we develop a fully textile pressure-sensing fabric that serves as a robot skin for detecting tactile human-robot interactions. The sensor covers a 20 cm × 20 cm area with 400 sensitive points, each sampled at 50 Hz. We defined seven gestures inspired by the social and emotional interactions of typical person-to-person or person-to-pet scenarios, and conducted two groups of mutually blinded experiments involving 29 participants in total. The data processing algorithm first reduces the spatial complexity of each frame to a set of frame descriptors; temporal features are then calculated from these descriptors through basic statistical representations and wavelet analysis. Various classifiers are evaluated, and the feature calculation algorithms are analyzed in detail to determine the contribution of each stage and segment. The best-performing feature-classifier combination recognizes the gestures with 93.3% accuracy for a known group of participants and 89.1% for strangers.
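The abstract outlines a two-stage feature pipeline: each pressure frame is reduced to a few scalar frame descriptors, and temporal features are then derived from those descriptor time series via basic statistics and wavelet analysis. The paper's page does not include code; the following is a minimal sketch of such a pipeline under assumed choices — total pressure, peak pressure, and contact area as descriptors, and a hand-rolled single-level Haar wavelet — which may differ from the descriptors and wavelet family actually used in the paper.

```python
import numpy as np

def frame_descriptors(frame):
    """Reduce one 20x20 pressure frame to scalar descriptors:
    total pressure, peak pressure, and active contact area."""
    peak = frame.max()
    active = (frame > 0.1 * peak) if peak > 0 else np.zeros(frame.shape, bool)
    return np.array([frame.sum(), peak, float(active.sum())])

def haar_dwt(x):
    """Single-level Haar wavelet transform -> (approximation, detail)."""
    x = x[: len(x) // 2 * 2]            # truncate to even length
    pairs = x.reshape(-1, 2)
    cA = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2.0)
    cD = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0)
    return cA, cD

def gesture_features(frames):
    """frames: array of shape (T, 20, 20), one gesture sampled at 50 Hz.
    Returns a fixed-length feature vector regardless of duration T."""
    desc = np.array([frame_descriptors(f) for f in frames])   # (T, 3)
    feats = []
    for series in desc.T:               # one time series per descriptor
        feats += [series.mean(), series.std(), series.min(), series.max()]
        cA, cD = haar_dwt(series)
        feats += [np.sum(cA ** 2), np.sum(cD ** 2)]   # sub-band energies
    return np.array(feats)

# A 100-frame (2 s) recording yields a fixed-length vector: 3 descriptors
# x (4 statistics + 2 wavelet energies) = 18 features.
rng = np.random.default_rng(0)
frames = rng.random((100, 20, 20))
print(gesture_features(frames).shape)
```

Collapsing each frame to descriptors before the temporal stage is what keeps the feature vector compact: the classifier sees a fixed-length vector rather than the raw 400-channel stream, so recordings of different durations become directly comparable.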
Keywords: tactile sensing; smart textiles; human-robot interaction


This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


MDPI and ACS Style

Zhou, B.; Altamirano, C.A.V.; Zurian, H.C.; Atefi, S.R.; Billing, E.; Martinez, F.S.; Lukowicz, P. Textile Pressure Mapping Sensor for Emotional Touch Detection in Human-Robot Interaction. Sensors 2017, 17, 2585.



Sensors EISSN 1424-8220, published by MDPI AG, Basel, Switzerland.