Open Access Article

IMPAct: A Holistic Framework for Mixed Reality Robotic User Interface Classification and Design

1 Human-Computer Interaction (HCI), Department of Informatics, University of Hamburg, 22527 Hamburg, Germany
2 Technical Aspects of Multimodal Systems (TAMS), Department of Informatics, University of Hamburg, 22527 Hamburg, Germany
* Author to whom correspondence should be addressed.
Multimodal Technologies Interact. 2019, 3(2), 25; https://doi.org/10.3390/mti3020025
Received: 28 February 2019 / Revised: 22 March 2019 / Accepted: 5 April 2019 / Published: 11 April 2019
(This article belongs to the Special Issue Mixed Reality Interfaces)
The number of scientific publications combining robotic user interfaces and mixed reality has increased sharply during the 21st century. Counting the publications added each year on Google Scholar that contain the keywords “mixed reality” and “robot” indicates exponential growth. The interdisciplinary nature of mixed reality robotic user interfaces (MRRUI) makes them very interesting and powerful, but also challenging to design and analyze. Many individual aspects have already been given a theoretical structure, but to the best of our knowledge, no contribution combines them into a single MRRUI taxonomy. In this article, we present the results of an extensive investigation of relevant aspects from prominent classifications and taxonomies in the scientific literature. In a card sorting experiment with professionals from the field of human–computer interaction, these aspects were clustered into named groups to provide a new structure. A further categorization of these groups into four categories emerged naturally and revealed a memorable structure. The article thus provides a framework of objective, technical factors for the precise description of MRRUIs. An example demonstrates the effective use of the proposed framework for precise system description, contributing to a better understanding, design, and comparison of MRRUIs in this growing field of research.
Keywords: mixed reality; robotic user interface taxonomy; human–robot interaction; robotic user interface; holistic user interface design; user interface analysis

MDPI and ACS Style

Krupke, D.; Zhang, J.; Steinicke, F. IMPAct: A Holistic Framework for Mixed Reality Robotic User Interface Classification and Design. Multimodal Technologies Interact. 2019, 3, 25.

