Special Issue "Sensing Technology in Virtual Reality"

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Physical Sensors".

Deadline for manuscript submissions: closed (20 November 2023) | Viewed by 5487

Special Issue Editors

Dr. Stefan Göbel
Serious Games Research Group, Technical University of Darmstadt, Rundeturmstrasse 10, 64283 Darmstadt, Germany
Interests: serious games; standardization (serious games metadata format); RAL quality criteria; authoring; control; evaluation
Dr. Polona Caserman
Serious Games Research Group, Technical University of Darmstadt, Rundeturmstrasse 10, 64283 Darmstadt, Germany
Interests: serious games; virtual reality; full-body motion reconstruction; full-body motion recognition; inverse kinematics

Special Issue Information

Dear Colleagues,

In recent years, virtual reality (VR) technology has become increasingly popular, not only in the entertainment industry but also in academic research, e.g., for education, training and simulation, or health. Although many VR setups provide off-the-shelf devices for tracking, they often focus solely on sensors for hand-based interaction, e.g., various types of controllers. Since one of the main features of virtual reality is to convey an illusion of being present in the virtual environment, VR-based applications should not be limited to high-end graphics or intuitive hand interactions; instead, immersive experiences should additionally stimulate all human senses, including sound, touch, force, taste, and smell.

This Special Issue aims to display innovative work exploring the potential of sensing technology for virtual reality applications. The topics of interest include, but are not limited to, the following:

  • Innovative (immersive) virtual reality-based applications using novel sensing technologies (e.g., full-body suits, infrared trackers, inertial measurement units, olfactory interfaces and smell dispensers, data gloves);
  • Methods for motion capture, e.g., reconstruction of full-body avatars;
  • Methods for motion recognition, e.g., recognition and assessment of physical activities with haptic feedback;
  • Systematic reviews, meta-analyses, and evaluation studies or pilot trials in related applications, e.g., clinical trials in health-related applications or VR-based simulation and training for emergency forces, such as firefighters or police officers.

Dr. Stefan Göbel
Dr. Polona Caserman
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • virtual reality
  • sensor technology
  • game controllers
  • haptics
  • wearables
  • motion capture
  • motion recognition
  • vital parameters
  • monitoring
  • immersive environments

Published Papers (3 papers)


Research

12 pages, 1132 KiB  
Article
Influence of Normal Aging and Multisensory Data Fusion on Cybersickness and Postural Adaptation in Immersive Virtual Reality
Sensors 2023, 23(23), 9414; https://doi.org/10.3390/s23239414 - 26 Nov 2023
Viewed by 257
Abstract
Immersive Virtual Reality (VR) systems are expanding as sensorimotor readaptation tools for older adults. However, this purpose may be challenged by cybersickness occurrences possibly caused by sensory conflicts. This study aims to analyze the effects of aging and multisensory data fusion processes in the brain on cybersickness and the adaptation of postural responses when exposed to immersive VR. Methods: We repeatedly exposed 75 participants, aged 21 to 86, to immersive VR while recording the trajectory of their Center of Pressure (CoP). Participants rated their cybersickness after the first and fifth exposures. Results: The repeated exposures increased cybersickness and allowed for a decrease in postural responses from the second repetition, i.e., increased stability. We did not find any significant correlation between biological age and cybersickness scores. On the contrary, even though some postural responses are age-dependent, a significant postural adaptation occurred independently of age. The CoP trajectory length in the anteroposterior axis and the mean velocity were the postural parameters most affected by age and repetition. Conclusions: This study suggests that cybersickness and postural adaptation to immersive VR are not age-dependent and that cybersickness is unrelated to a deficit in postural adaptation or age. Age does not seem to influence the properties of multisensory data fusion.
(This article belongs to the Special Issue Sensing Technology in Virtual Reality)
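
As a rough illustration of the postural parameters named above (not the authors' implementation), the following Python sketch computes the CoP sway-path length, its anteroposterior component, and the mean sway velocity from a recorded CoP trajectory; the array layout and the 100 Hz sampling rate are assumptions.

import numpy as np

def cop_metrics(cop_xy, fs=100.0):
    """Sway-path length and mean velocity of a Center of Pressure trajectory.

    cop_xy: (N, 2) array of CoP positions in metres (column 0 = mediolateral,
            column 1 = anteroposterior; assumed layout).
    fs: sampling rate in Hz (assumed; force plates commonly record at 100 Hz).
    """
    steps = np.diff(cop_xy, axis=0)                      # per-sample displacements
    path_length = np.sum(np.linalg.norm(steps, axis=1))  # total 2D sway path
    ap_length = np.sum(np.abs(steps[:, 1]))              # anteroposterior component only
    duration = (len(cop_xy) - 1) / fs                    # recording time in seconds
    mean_velocity = path_length / duration               # average sway velocity
    return path_length, ap_length, mean_velocity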

14 pages, 3381 KiB  
Article
Facial Motion Capture System Based on Facial Electromyogram and Electrooculogram for Immersive Social Virtual Reality Applications
Sensors 2023, 23(7), 3580; https://doi.org/10.3390/s23073580 - 29 Mar 2023
Cited by 1 | Viewed by 1602
Abstract
With the rapid development of virtual reality (VR) technology and the market growth of social network services (SNS), VR-based SNS have been actively developed, in which 3D avatars interact with each other on behalf of the users. To provide the users with more immersive experiences in a metaverse, facial recognition technologies that can reproduce the user’s facial gestures on their personal avatar are required. However, it is generally difficult to employ traditional camera-based facial tracking technology to recognize the facial expressions of VR users because a large portion of the user’s face is occluded by a VR head-mounted display (HMD). To address this issue, attempts have been made to recognize users’ facial expressions based on facial electromyogram (fEMG) recorded around the eyes. fEMG-based facial expression recognition (FER) technology requires only tiny electrodes that can be readily embedded in the HMD pad that is in contact with the user’s facial skin. Additionally, electrodes recording fEMG signals can simultaneously acquire electrooculogram (EOG) signals, which can be used to track the user’s eyeball movements and detect eye blinks. In this study, we implemented an fEMG- and EOG-based FER system using ten electrodes arranged around the eyes, assuming a commercial VR HMD device. Our FER system could continuously capture various facial motions, including five different lip motions and two different eyebrow motions, from fEMG signals. Unlike previous fEMG-based FER systems that simply classified discrete expressions, with the proposed FER system, natural facial expressions could be continuously projected onto the 3D avatar face using machine-learning-based regression with a new concept named the virtual blend shape weight, making it unnecessary to simultaneously record fEMG and camera images for each user. An EOG-based eye tracking system was also implemented for the detection of eye blinks and eye gaze directions using the same electrodes. These two technologies were simultaneously employed to implement a real-time facial motion capture system, which could successfully replicate the user’s facial expressions on a realistic avatar face in real time. To the best of our knowledge, the concurrent use of fEMG and EOG for facial motion capture has not been reported before.
(This article belongs to the Special Issue Sensing Technology in Virtual Reality)
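
To make the regression idea above concrete, here is a minimal, hypothetical Python sketch that maps windowed fEMG features to continuous blend-shape weights. The electrode count matches the ten electrodes mentioned in the abstract, but the RMS features, the ridge regressor, the window size, and the seven-dimensional weight vector (five lip plus two eyebrow motions) are illustrative assumptions, not the authors' actual pipeline.

import numpy as np
from sklearn.linear_model import Ridge

def rms_features(emg, win=200):
    """RMS of each fEMG channel over non-overlapping windows (window size assumed).
    emg: (n_samples, n_channels) raw signal."""
    n = (len(emg) // win) * win
    windows = emg[:n].reshape(-1, win, emg.shape[1])
    return np.sqrt((windows ** 2).mean(axis=1))      # (n_windows, n_channels)

# Calibration: pair fEMG features with target weights (in the paper, the targets
# come from the proposed "virtual blend shape weights"; random data stands in here).
X_train = rms_features(np.random.randn(20000, 10))   # 10 electrodes around the eyes
y_train = np.random.rand(len(X_train), 7)            # 7 facial motions (5 lip + 2 eyebrow)
model = Ridge(alpha=1.0).fit(X_train, y_train)       # multi-output linear regression

# Runtime: predict a weight vector per incoming window, clamp it to the valid
# range, and use it to drive the avatar's blend shapes.
weights = np.clip(model.predict(rms_features(np.random.randn(2000, 10))), 0.0, 1.0)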

18 pages, 5525 KiB  
Article
Augmented Reality Based Interactive Cooking Guide
Sensors 2022, 22(21), 8290; https://doi.org/10.3390/s22218290 - 28 Oct 2022
Cited by 2 | Viewed by 2583
Abstract
Cooking at home is a critical survival skill. We propose a new cooking assistance system in which a user only needs to wear an all-in-one augmented reality (AR) headset, without having to install any external sensors or devices in the kitchen. Utilizing the built-in camera and cutting-edge computer vision (CV) technology, the user can direct the AR headset to recognize available food ingredients by simply looking at them. Based on the types of the recognized food ingredients, suitable recipes are suggested accordingly. A step-by-step video tutorial providing details of the selected recipe is then displayed with the AR glasses. The user can conveniently interact with the proposed system using eight kinds of natural hand gestures, without needing to touch any devices throughout the entire cooking process. Compared with the deep learning models ResNet and ResNeXt, experimental results show that YOLOv5 achieves lower accuracy for ingredient recognition, but it can locate and classify multiple ingredients in one shot, making the scanning process easier for users. Twenty participants tested the prototype system and provided feedback via two questionnaires. Based on the analysis results, 19 of the 20 participants would recommend the proposed system to others, and all participants were overall satisfied with the prototype system.
(This article belongs to the Special Issue Sensing Technology in Virtual Reality)
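
As a hedged sketch of the one-shot detection workflow described above (not the paper's trained model), the following Python snippet loads a stock COCO-pretrained YOLOv5 model via torch.hub and lists everything it detects in a single frame. The paper fine-tunes YOLOv5 on food-ingredient classes instead; the 'yolov5s' variant and the image file name here are assumptions.

import torch

# Load a small, COCO-pretrained YOLOv5 model from the official hub repository
# (the paper's system would use weights fine-tuned on ingredient classes).
model = torch.hub.load('ultralytics/yolov5', 'yolov5s', pretrained=True)

results = model('kitchen_counter.jpg')   # one forward pass locates and classifies all objects
detections = results.pandas().xyxy[0]    # bounding boxes, confidences, class names
for _, det in detections.iterrows():
    print(f"{det['name']}: {det['confidence']:.2f} at "
          f"({det['xmin']:.0f}, {det['ymin']:.0f})")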
