Sensing Technology in Virtual Reality

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Physical Sensors".

Deadline for manuscript submissions: 31 July 2024 | Viewed by 9628

Special Issue Editors


Dr. Stefan Göbel
Guest Editor
Serious Games Research Group, Technical University of Darmstadt, Rundeturmstrasse 10, 64283 Darmstadt, Germany
Interests: serious games; standardization (serious games metadata format); RAL quality criteria; authoring; control; evaluation

Dr. Polona Caserman
Guest Editor
Serious Games Research Group, Technical University of Darmstadt, Rundeturmstrasse 10, 64283 Darmstadt, Germany
Interests: serious games; virtual reality; full-body motion reconstruction; full-body motion recognition; inverse kinematics

Special Issue Information

Dear Colleagues,

In recent years, virtual reality technology has become increasingly popular, not only in the entertainment industry but also in academic research, e.g., for education, training, simulation, and health. Although many VR setups provide off-the-shelf devices for tracking, they often focus solely on sensors for hand-based interaction, e.g., various types of controllers. As one of the main features of virtual reality is to convey an illusion of being present in the virtual environment, VR-based applications should not be limited to high-end graphics or intuitive hand interactions; instead, immersive experiences should additionally stimulate all human senses, including sound, touch, force, taste, and smell.

This Special Issue aims to display innovative work exploring the potential of sensing technology for virtual reality applications. The topics of interest include, but are not limited to, the following:

  • Innovative (immersive) virtual reality-based applications using novel sensing technologies (e.g., full-body suits, infrared trackers, inertial measurement units, olfactory interfaces and smell dispensers, data gloves, etc.);
  • Methods for motion capture, e.g., reconstruction of full-body avatars;
  • Methods for motion recognition, e.g., recognition and assessment of physical activities with haptic feedback;
  • Systematic reviews, meta-analyses, and evaluation studies or pilot trials in related applications, e.g., clinical trials in health-related applications or VR-based simulation and training for emergency and security forces, such as firefighters or police officers.

Dr. Stefan Göbel
Dr. Polona Caserman
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • virtual reality
  • sensor technology
  • game controllers
  • haptics
  • wearables
  • motion capture
  • motion recognition
  • vital parameters
  • monitoring
  • immersive environments

Published Papers (5 papers)


Research

23 pages, 5227 KiB  
Article
Characterization of Upper Extremity Kinematics Using Virtual Reality Movement Tasks and Wearable IMU Technology
by Skyler A. Barclay, Lanna N. Klausing, Tessa M. Hill, Allison L. Kinney, Timothy Reissman and Megan E. Reissman
Sensors 2024, 24(1), 233; https://doi.org/10.3390/s24010233 - 30 Dec 2023
Cited by 1 | Viewed by 865
Abstract
Task-specific training has been shown to be an effective neuromotor rehabilitation intervention; however, this repetitive approach is not always very engaging. Virtual reality (VR) systems are becoming increasingly popular in therapy due to their ability to encourage movement through customizable and immersive environments. Additionally, VR can allow for a standardization of tasks that is often lacking in upper extremity research. Here, 16 healthy participants performed upper extremity movement tasks synced to music, using a commercially available VR game known as Beat Saber. VR tasks were customized to characterize participants’ joint angles with respect to each task’s specified cardinal direction (inward, outward, upward, or downward) and relative task location (medial, lateral, high, and/or low). Movement levels were designed using three common therapeutic approaches: (1) one arm moving only (unilateral), (2) two arms moving in mirrored directions about the participant’s midline (mirrored), or (3) two arms moving in opposing directions about the participant’s midline (opposing). Movement was quantified using an XSens system, a wearable inertial measurement unit (IMU) technology. Results reveal a highly engaging and effective approach to quantifying movement strategies. Inward and outward (horizontal) tasks resulted in decreased wrist extension. Upward and downward (vertical) tasks resulted in increased shoulder flexion, wrist radial deviation, wrist ulnar deviation, and elbow flexion. Lastly, compared to the opposing level, the mirrored and unilateral movement levels often exaggerated joint angles. Virtual reality games like Beat Saber offer a repeatable and customizable upper extremity intervention that has the potential to increase motivation in therapeutic applications.
(This article belongs to the Special Issue Sensing Technology in Virtual Reality)
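
As an illustration of the computation behind IMU-based joint angle characterization, the following is a minimal Python sketch deriving a joint angle from two segment orientation quaternions, as wearable systems such as an XSens suit report them. The function name and sample orientations are illustrative assumptions, not the authors' pipeline or the XSens SDK.

```python
# A minimal sketch: joint angle from two IMU orientation quaternions.
# Sensor placement and sample data are hypothetical.
import numpy as np
from scipy.spatial.transform import Rotation as R

def joint_angle_deg(q_proximal, q_distal):
    """Angle of the distal segment relative to the proximal one.

    Both arguments are unit quaternions in (x, y, z, w) order describing
    each segment's orientation in a common world frame.
    """
    r_rel = R.from_quat(q_proximal).inv() * R.from_quat(q_distal)
    # magnitude() returns the rotation angle of the relative rotation
    return np.degrees(r_rel.magnitude())

# Example: forearm flexed 45 degrees about the x-axis relative to the upper arm.
q_upper = R.identity().as_quat()
q_fore = R.from_euler("x", 45, degrees=True).as_quat()
print(f"elbow flexion ~ {joint_angle_deg(q_upper, q_fore):.1f} deg")
```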

31 pages, 20979 KiB  
Article
Give Me a Sign: Using Data Gloves for Static Hand-Shape Recognition
by Philipp Achenbach, Sebastian Laux, Dennis Purdack, Philipp Niklas Müller and Stefan Göbel
Sensors 2023, 23(24), 9847; https://doi.org/10.3390/s23249847 - 15 Dec 2023
Cited by 1 | Viewed by 663
Abstract
Human-to-human communication via the computer is mainly carried out using a keyboard or microphone. In the field of virtual reality (VR), where the most immersive experience possible is desired, the use of a keyboard contradicts this goal, while the use of a microphone is not always desirable (e.g., silent commands during task-force training) or simply not possible (e.g., if the user has hearing loss). Data gloves help to increase immersion within VR, as they correspond to our natural interaction. At the same time, they offer the possibility of accurately capturing hand shapes, such as those used in non-verbal communication (e.g., thumbs up, okay gesture) and in sign language. In this paper, we present a hand-shape recognition system using Manus Prime X data gloves, including data acquisition, data preprocessing, and data classification, to enable non-verbal communication within VR. We investigate the impact of outlier detection and feature selection approaches in our data preprocessing on accuracy and classification time. To obtain a more generalized approach, we also studied the impact of artificial data augmentation, i.e., we created new artificial data from the recorded and filtered data to augment the training data set. With our approach, 56 different hand shapes could be distinguished with an accuracy of up to 93.28%. With a reduced set of 27 hand shapes, an accuracy of up to 95.55% could be achieved. The voting meta-classifier (VL2) proved to be the most accurate, albeit slowest, classifier. A good alternative is the random forest (RF), which achieved even better accuracy values in a few cases and was generally somewhat faster. Outlier detection proved to be an effective approach, especially for improving the classification time. Overall, we have shown that our hand-shape recognition system using data gloves is suitable for communication within VR.
(This article belongs to the Special Issue Sensing Technology in Virtual Reality)
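
To illustrate the shape of such a pipeline, here is a minimal Python sketch combining outlier filtering, feature selection, and a random forest classifier, in the spirit of the paper's preprocessing and classification stages. The synthetic data, feature count, and hyperparameters are assumptions for illustration, not the authors' exact setup.

```python
# A minimal sketch of a glove-based hand-shape classification pipeline:
# outlier filtering, feature selection, then a random forest.
# Data is synthetic; the glove feature layout is an assumption.
import numpy as np
from sklearn.ensemble import IsolationForest, RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 40))          # e.g., 40 flex/IMU features per frame
y = rng.integers(0, 27, size=500)       # 27 hand-shape labels

# Drop outlier frames before training (fit_predict returns 1 for inliers).
keep = IsolationForest(random_state=0).fit_predict(X) == 1
X, y = X[keep], y[keep]

clf = make_pipeline(SelectKBest(f_classif, k=20),
                    RandomForestClassifier(n_estimators=200, random_state=0))
clf.fit(X, y)
print("train accuracy:", clf.score(X, y))
```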

12 pages, 1132 KiB  
Article
Influence of Normal Aging and Multisensory Data Fusion on Cybersickness and Postural Adaptation in Immersive Virtual Reality
by Marie-Philippine Séba, Pauline Maillot, Sylvain Hanneton and Gilles Dietrich
Sensors 2023, 23(23), 9414; https://doi.org/10.3390/s23239414 - 26 Nov 2023
Viewed by 706
Abstract
Immersive Virtual Reality (VR) systems are expanding as sensorimotor readaptation tools for older adults. However, this purpose may be challenged by cybersickness occurrences possibly caused by sensory conflicts. This study aims to analyze the effects of aging and of multisensory data fusion processes in the brain on cybersickness and the adaptation of postural responses during exposure to immersive VR. Methods: We repeatedly exposed 75 participants, aged 21 to 86, to immersive VR while recording the trajectory of their Center of Pressure (CoP). Participants rated their cybersickness after the first and fifth exposure. Results: The repeated exposures increased cybersickness and allowed for a decrease in postural responses from the second repetition, i.e., increased stability. We did not find any significant correlation between biological age and cybersickness scores. Moreover, even though some postural responses were age-dependent, significant postural adaptation occurred independently of age. The CoP trajectory length along the anteroposterior axis and the mean velocity were the postural parameters most affected by age and repetition. Conclusions: This study suggests that cybersickness and postural adaptation to immersive VR are not age-dependent and that cybersickness is unrelated to a deficit in postural adaptation or age. Age does not seem to influence the properties of multisensory data fusion.
(This article belongs to the Special Issue Sensing Technology in Virtual Reality)
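
For readers unfamiliar with the postural parameters mentioned, the following minimal Python sketch computes an anteroposterior CoP path length and a mean CoP velocity from a sampled trajectory. The sampling rate and the synthetic sway signals are assumptions for illustration, not the study's recordings.

```python
# A minimal sketch of two common postural sway parameters:
# anteroposterior CoP path length and mean CoP velocity.
import numpy as np

fs = 100.0                                    # Hz, assumed force-plate rate
t = np.arange(0, 30, 1 / fs)                  # 30 s trial
cop_ap = 0.010 * np.sin(2 * np.pi * 0.3 * t)  # anteroposterior sway (m)
cop_ml = 0.005 * np.sin(2 * np.pi * 0.5 * t)  # mediolateral sway (m)

# Path length along the AP axis: sum of absolute sample-to-sample steps.
ap_path = np.sum(np.abs(np.diff(cop_ap)))

# Mean velocity: total 2D path length divided by trial duration.
steps = np.hypot(np.diff(cop_ap), np.diff(cop_ml))
mean_velocity = steps.sum() / (t[-1] - t[0])

print(f"AP path length: {ap_path:.3f} m, mean velocity: {mean_velocity:.4f} m/s")
```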

14 pages, 3381 KiB  
Article
Facial Motion Capture System Based on Facial Electromyogram and Electrooculogram for Immersive Social Virtual Reality Applications
by Chunghwan Kim, Ho-Seung Cha, Junghwan Kim, HwyKuen Kwak, WooJin Lee and Chang-Hwan Im
Sensors 2023, 23(7), 3580; https://doi.org/10.3390/s23073580 - 29 Mar 2023
Cited by 2 | Viewed by 2265
Abstract
With the rapid development of virtual reality (VR) technology and the market growth of social network services (SNS), VR-based SNS have been actively developed, in which 3D avatars interact with each other on behalf of the users. To provide users with more immersive experiences in a metaverse, facial recognition technologies that can reproduce the user’s facial gestures on their personal avatar are required. However, it is generally difficult to employ traditional camera-based facial tracking technology to recognize the facial expressions of VR users because a large portion of the user’s face is occluded by a VR head-mounted display (HMD). To address this issue, attempts have been made to recognize users’ facial expressions based on facial electromyogram (fEMG) recorded around the eyes. fEMG-based facial expression recognition (FER) technology requires only tiny electrodes that can be readily embedded in the HMD pad that contacts the user’s facial skin. Additionally, electrodes recording fEMG signals can simultaneously acquire electrooculogram (EOG) signals, which can be used to track the user’s eyeball movements and detect eye blinks. In this study, we implemented an fEMG- and EOG-based FER system using ten electrodes arranged around the eyes, assuming a commercial VR HMD device. Our FER system could continuously capture various facial motions, including five different lip motions and two different eyebrow motions, from fEMG signals. Unlike previous fEMG-based FER systems that simply classified discrete expressions, the proposed FER system continuously projects natural facial expressions onto the 3D avatar face using machine-learning-based regression with a new concept named the virtual blend shape weight, making it unnecessary to simultaneously record fEMG and camera images for each user. An EOG-based eye tracking system was also implemented for the detection of eye blinks and eye gaze directions using the same electrodes. These two technologies were simultaneously employed to implement a real-time facial motion capture system, which could successfully replicate the user’s facial expressions on a realistic avatar face in real time. To the best of our knowledge, the concurrent use of fEMG and EOG for facial motion capture has not been reported before.
(This article belongs to the Special Issue Sensing Technology in Virtual Reality)
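
To make the regression idea concrete, here is a minimal Python sketch mapping fEMG feature vectors to continuous avatar blend shape weights via multi-output regression. The feature representation, weight count, and choice of ridge regression are illustrative assumptions, not the authors' exact model.

```python
# A minimal sketch: regress continuous blend shape weights from fEMG
# features instead of classifying discrete expressions.
# Dimensions and the linear ground truth are synthetic assumptions.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_frames, n_channels, n_blendshapes = 2000, 10, 7  # 10 electrodes, 7 motions

X = rng.normal(size=(n_frames, n_channels))        # e.g., RMS per fEMG channel
true_map = rng.normal(size=(n_channels, n_blendshapes))
Y = np.clip(X @ true_map, 0, None)                 # target blend shape weights

model = Ridge(alpha=1.0).fit(X, Y)                 # multi-output regression
weights = np.clip(model.predict(X[:1]), 0.0, 1.0)  # clamp to a valid range
print("predicted blend shape weights:", np.round(weights, 2))
```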

18 pages, 5525 KiB  
Article
Augmented Reality Based Interactive Cooking Guide
by Isaias Majil, Mau-Tsuen Yang and Sophia Yang
Sensors 2022, 22(21), 8290; https://doi.org/10.3390/s22218290 - 28 Oct 2022
Cited by 4 | Viewed by 3712
Abstract
Cooking at home is a critical survival skill. We propose a new cooking assistance system in which a user only needs to wear an all-in-one augmented reality (AR) headset, without having to install any external sensors or devices in the kitchen. Utilizing the built-in camera and cutting-edge computer vision (CV) technology, the user can direct the AR headset to recognize available food ingredients simply by looking at them. Based on the types of recognized food ingredients, suitable recipes are suggested accordingly. A step-by-step video tutorial providing details of the selected recipe is then displayed on the AR glasses. The user can conveniently interact with the proposed system using eight kinds of natural hand gestures, without needing to touch any devices throughout the entire cooking process. Compared with the deep learning models ResNet and ResNeXt, experimental results show that YOLOv5 achieves lower accuracy for ingredient recognition, but it can locate and classify multiple ingredients in a single pass, making the scanning process easier for users. Twenty participants tested the prototype system and provided feedback via two questionnaires. Based on the analysis results, 19 of the 20 participants would recommend the proposed system to others, and all participants were overall satisfied with the prototype system.
(This article belongs to the Special Issue Sensing Technology in Virtual Reality)
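
To illustrate the one-shot detection property mentioned above, here is a minimal Python sketch running a pretrained YOLOv5 model via torch.hub. The COCO-pretrained weights stand in for the authors' custom ingredient model, and the image path is a placeholder.

```python
# A minimal sketch: detect multiple objects in one forward pass with YOLOv5.
# COCO weights stand in for a custom ingredient model; the path is a placeholder.
import torch

model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)

results = model("kitchen_counter.jpg")   # one pass, many objects located
detections = results.pandas().xyxy[0]    # boxes, confidences, class names
for _, det in detections.iterrows():
    print(f"{det['name']}: {det['confidence']:.2f}")
```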
