Open Access Article
Sensors 2015, 15(6), 14435-14457; doi:10.3390/s150614435

Inertial Sensor-Based Touch and Shake Metaphor for Expressive Control of 3D Virtual Avatars

Graduate School of Advanced Imaging Science, Multimedia and Film, Chung-Ang University, Seoul 156-756, Korea
*
Author to whom correspondence should be addressed.
Academic Editor: Vittorio M.N. Passaro
Received: 19 March 2015 / Revised: 27 May 2015 / Accepted: 12 June 2015 / Published: 18 June 2015
(This article belongs to the Special Issue Inertial Sensors and Systems)

Abstract

In this paper, we present an inertial sensor-based touch and shake metaphor for expressive control of a 3D virtual avatar in a virtual environment. An intuitive six degrees-of-freedom wireless inertial motion sensor is used as a gesture and motion control input device with a sensor fusion algorithm. The algorithm enables user hand motions to be tracked in 3D space via magnetic, angular rate, and gravity sensors. A quaternion-based complementary filter is implemented to reduce noise and drift. An algorithm based on dynamic time warping is developed for efficient recognition of dynamic hand gestures with real-time automatic hand gesture segmentation. Our approach enables the recognition of gestures and estimates gesture variations for continuous interaction. We demonstrate the gesture expressivity using an interactive, flexible gesture mapping interface for authoring and controlling a 3D virtual avatar and its motion by tracking the user's dynamic hand gestures. This approach synthesizes stylistic variations in the 3D virtual avatar, producing motions that are not present in the motion database, using hand gesture sequences from a single inertial motion sensor.
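The abstract names two core algorithmic pieces: a complementary filter that fuses gyroscope and accelerometer/magnetometer readings, and dynamic time warping (DTW) for matching segmented hand-gesture sequences against templates. As a minimal illustrative sketch (not the paper's implementation: the actual filter is quaternion-based and the gesture features are multi-dimensional, whereas this reduces both to one dimension), the two ideas can be written as:

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """1-D complementary filter sketch: blend the gyro integral
    (accurate short-term, drifts long-term) with the accelerometer-derived
    angle (noisy short-term, drift-free long-term). `alpha` is an assumed
    blending weight, not a value from the paper."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) DTW with absolute-difference cost,
    aligning two sequences of scalar samples."""
    n, m = len(a), len(b)
    INF = float("inf")
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]
```

DTW is what lets a slow and a fast performance of the same gesture match: `dtw_distance([1, 2, 3], [1, 2, 2, 3])` is 0, since the warping path may repeat samples at no cost.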
Keywords: inertial sensors; gestural interfaces; expressive control; gesture recognition; gesture variations; interactive systems; touch and shake; virtual avatar
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Patil, S.; Chintalapalli, H.R.; Kim, D.; Chai, Y. Inertial Sensor-Based Touch and Shake Metaphor for Expressive Control of 3D Virtual Avatars. Sensors 2015, 15, 14435-14457.

Sensors EISSN 1424-8220, published by MDPI AG, Basel, Switzerland.