
AMiCUS—A Head Motion-Based Interface for Control of an Assistive Robot

Group of Sensors and Actuators, Department of Electrical Engineering and Applied Physics, Westphalian University of Applied Sciences, 45877 Gelsenkirchen, Germany
* Author to whom correspondence should be addressed.
Current address: Xsens Technologies B.V., P.O. Box 559, 7500 AN Enschede, The Netherlands.
Sensors 2019, 19(12), 2836; https://doi.org/10.3390/s19122836
Received: 23 May 2019 / Revised: 13 June 2019 / Accepted: 18 June 2019 / Published: 25 June 2019
(This article belongs to the Special Issue Assistance Robotics and Biosensors 2019)
Within this work we present AMiCUS, a Human-Robot Interface that enables tetraplegics to control a multi-degree-of-freedom robot arm in real time using head motion alone, empowering them to perform simple manipulation tasks independently. The article describes the hardware, software and signal processing of AMiCUS and presents the results of a volunteer study with 13 able-bodied subjects and 6 tetraplegics with severe head motion limitations. As part of the study, the subjects performed two different pick-and-place tasks. Usability was assessed with a questionnaire; overall performance and the main control elements were evaluated with objective measures such as completion rate and interaction time. The results show that the mapping of head motion onto robot motion is intuitive and the feedback provided is useful, enabling smooth, precise and efficient robot control and resulting in high user acceptance. Furthermore, the robot was demonstrated not to move unintentionally, giving a positive prognosis for the safety requirements in the framework of certifying a product prototype. Moreover, AMiCUS enabled every subject to control the robot arm, regardless of prior experience and degree of head motion limitation, making the system accessible to a wide range of motion-impaired users.
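As a point of reference for the head-motion-to-robot-motion mapping mentioned above, the sketch below shows one hypothetical way a proportional mapping with a deadzone could be implemented for AHRS-derived head angles. The gains, limits, axis assignments and function names are illustrative assumptions only, not the implementation described in the article.

```python
# Hypothetical sketch (not the authors' implementation): a proportional
# head-orientation-to-robot-velocity mapping with a deadzone, as one way
# an AHRS-based head interface could drive Cartesian robot motion.
# All gains, limits and names below are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class MappingConfig:
    deadzone_deg: float = 5.0   # ignore small head motions (assumed value)
    gain: float = 0.004         # commanded speed per degree beyond the deadzone
    max_speed: float = 0.10     # saturation of the commanded speed

def axis_command(angle_deg: float, cfg: MappingConfig) -> float:
    """Map one head angle (relative to a neutral pose) to one velocity component."""
    magnitude = abs(angle_deg) - cfg.deadzone_deg
    if magnitude <= 0.0:
        return 0.0
    speed = min(cfg.gain * magnitude, cfg.max_speed)
    return speed if angle_deg > 0.0 else -speed

def head_to_cartesian_velocity(pitch_deg, roll_deg, yaw_deg, cfg=MappingConfig()):
    """Return a (vx, vy, vz) command from head pitch/roll/yaw in degrees."""
    return (
        axis_command(pitch_deg, cfg),  # nod forward/back -> move along x (assumed)
        axis_command(yaw_deg, cfg),    # turn left/right  -> move along y (assumed)
        axis_command(roll_deg, cfg),   # tilt left/right  -> move along z (assumed)
    )

if __name__ == "__main__":
    # Example: a 12 degree forward nod with the head otherwise near neutral.
    print(head_to_cartesian_velocity(pitch_deg=12.0, roll_deg=1.0, yaw_deg=-2.0))
    # -> (0.028, 0.0, 0.0): only the pitch angle exceeds the deadzone.
```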
Keywords: assistive technology; human-machine interaction; motion sensors; robot control; tetraplegia; IMU; AHRS; head control; gesture recognition; real-time control
MDPI and ACS Style

Rudigkeit, N.; Gebhard, M. AMiCUS—A Head Motion-Based Interface for Control of an Assistive Robot. Sensors 2019, 19, 2836.

