Open Access Article
Sensors 2019, 19(2), 299;

Navigating Virtual Environments Using Leg Poses and Smartphone Sensors

Information Technology Department, King Abdulaziz University, Jeddah 21589, Saudi Arabia
Department of Computer Science and Electrical Engineering, Singidunum University, 11000 Belgrade, Serbia
Received: 1 November 2018 / Revised: 3 January 2019 / Accepted: 10 January 2019 / Published: 13 January 2019


Realization of navigation in virtual environments remains a challenge as it involves complex operating conditions. Decomposition of such complexity is attainable by fusion of sensors and machine learning techniques. Identifying the right combination of sensory information and the appropriate machine learning technique is a vital ingredient for translating physical actions into virtual movements. The contributions of our work include: (i) synchronization of actions and movements using suitable multiple sensor units, and (ii) selection of the significant features and an appropriate algorithm to process them. This work proposes an innovative approach that allows users to move in virtual environments by simply moving their legs towards the desired direction. The necessary hardware includes only a smartphone that is strapped to the subject’s lower leg. Data from the gyroscope, accelerometer and compass sensors of the mobile device are transmitted to a PC, where the movement is accurately identified using a combination of machine learning techniques. Once the desired movement is identified, the movement of the virtual avatar in the virtual environment is realized. After pre-processing the sensor data using the box plot outliers approach, it is observed that Artificial Neural Networks provided the highest movement identification accuracy of 84.2% on the training dataset and 84.1% on the testing dataset.
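The abstract mentions pre-processing the sensor data with the box plot outliers approach before classification. The paper's own code is not reproduced here; the following is a minimal sketch of a box-plot (Tukey fence) outlier filter, assuming per-feature filtering with the conventional whisker factor k = 1.5 (the function name and the choice to drop a sample if any feature is an outlier are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def remove_boxplot_outliers(samples, k=1.5):
    """Drop rows in which any feature falls outside the box-plot
    whiskers [Q1 - k*IQR, Q3 + k*IQR] computed per feature column."""
    samples = np.asarray(samples, dtype=float)
    q1 = np.percentile(samples, 25, axis=0)
    q3 = np.percentile(samples, 75, axis=0)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    # Keep a sample only if every one of its features is inside the fences.
    mask = np.all((samples >= lo) & (samples <= hi), axis=1)
    return samples[mask]

# Example: a gyroscope-like reading of 100.0 is far outside the whiskers
# of the other values and is removed.
readings = np.array([[1.0], [1.1], [0.9], [1.0], [1.2], [100.0]])
clean = remove_boxplot_outliers(readings)
```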
Keywords: virtual reality; mobile sensors; machine learning; feature selection; movement identification


This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.



MDPI and ACS Style

Tsaramirsis, G.; Buhari, S.M.; Basheri, M.; Stojmenovic, M. Navigating Virtual Environments Using Leg Poses and Smartphone Sensors. Sensors 2019, 19, 299.




Sensors EISSN 1424-8220, published by MDPI AG, Basel, Switzerland.