Article

Towards a Taxonomy for In-Vehicle Interactions Using Wearable Smart Textiles: Insights from a User-Elicitation Study

1 Department of Computer Science and Software Engineering, Xi'an Jiaotong-Liverpool University, Suzhou 215123, China
2 Department of Chemistry, Xi'an Jiaotong-Liverpool University, Suzhou 215123, China
3 School of Electrical Engineering, Electronics and Computer Science, University of Liverpool, Liverpool L69 3BX, UK
* Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2019, 3(2), 33; https://doi.org/10.3390/mti3020033
Received: 15 March 2019 / Revised: 15 April 2019 / Accepted: 27 April 2019 / Published: 9 May 2019
Textiles are a vital and indispensable part of the clothing we use daily. They are flexible, often lightweight, and serve a wide variety of purposes. With the rapid development of small, flexible sensing materials, textiles can now be enhanced and used as input devices for interactive systems. Clothing-based wearable interfaces are well suited to in-vehicle controls: they can combine various modalities to enable simple, natural, and efficient interactions while minimizing any negative effect on driving. However, research on clothing-based wearable in-vehicle interfaces remains underexplored, so there is little understanding of how to use textile-based input for in-vehicle controls. As a first step towards filling this gap, we conducted a user-elicitation study that involved users in the process of designing in-vehicle interactions via a fabric-based wearable device. From this study, we distilled a taxonomy of wrist and touch gestures for in-vehicle interactions using a fabric-based wrist interface in a simulated driving setup. Our results help drive forward the investigation of the design space of clothing-based wearable interfaces for in-vehicle secondary interactions.
Keywords: wearable interfaces; in-vehicle interactions; fabric-based wrist interfaces; user-elicitation

MDPI and ACS Style

Nanjappan, V.; Shi, R.; Liang, H.-N.; Lau, K.K.-T.; Yue, Y.; Atkinson, K. Towards a Taxonomy for In-Vehicle Interactions Using Wearable Smart Textiles: Insights from a User-Elicitation Study. Multimodal Technol. Interact. 2019, 3, 33. https://doi.org/10.3390/mti3020033

AMA Style

Nanjappan V, Shi R, Liang H-N, Lau KK-T, Yue Y, Atkinson K. Towards a Taxonomy for In-Vehicle Interactions Using Wearable Smart Textiles: Insights from a User-Elicitation Study. Multimodal Technologies and Interaction. 2019; 3(2):33. https://doi.org/10.3390/mti3020033

Chicago/Turabian Style

Nanjappan, Vijayakumar, Rongkai Shi, Hai-Ning Liang, Kim K.-T. Lau, Yong Yue, and Katie Atkinson. 2019. "Towards a Taxonomy for In-Vehicle Interactions Using Wearable Smart Textiles: Insights from a User-Elicitation Study" Multimodal Technologies and Interaction 3, no. 2: 33. https://doi.org/10.3390/mti3020033
