Editorial

Advances in Tangible and Embodied Interaction for Virtual and Augmented Reality

by Jorge C. S. Cardoso 1,*,†, André Perrotta 1,†, Paula Alexandra Silva 1,† and Pedro Martins 2,†

1 Department of Informatics Engineering, Centre for Informatics and Systems of the University of Coimbra, University of Coimbra, 3030-790 Coimbra, Portugal
2 Department of Architecture, Faculty of Sciences and Technology, University of Coimbra, 3030-790 Coimbra, Portugal
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Electronics 2023, 12(8), 1893; https://doi.org/10.3390/electronics12081893
Submission received: 10 April 2023 / Accepted: 12 April 2023 / Published: 17 April 2023
Virtual Reality (VR) and Augmented Reality (AR) technologies have the potential to revolutionise the way we interact with digital content. Both have seen tremendous progress in recent years, enabling novel and exciting ways of interacting within virtual environments. A particularly interesting approach to interaction in VR and AR is the use of tangible user interfaces, which leverage our innate understanding of the physical world, of how our bodies move within and act upon it, and our learned ability to manipulate physical objects. This Special Issue explores recent advances in the field of tangible and embodied interaction for virtual and augmented reality, as showcased by the following papers.
  • “Design and Implementation of Two Immersive Audio and Video Communication Systems Based on Virtual Reality” [1]: This paper presents two immersive communication systems that combine VR with audio and video to create a more natural and engaging communication experience. The researchers describe the technical details of the systems, including the hardware and software used, and provide a thorough analysis of the results of user testing. They also discuss the potential applications of the systems, such as in remote conferencing and virtual collaboration.
  • “An Interactive Augmented Reality Graph Visualization for Chinese Painters” [2]: This paper describes an interactive AR system that allows Chinese painters to explore and visualise complex graphs in an intuitive and immersive way. The authors discuss the design considerations that went into creating the system, such as the use of colour and motion to convey information, and provide detailed examples of how the system can be used in practice.
  • “Situating Learning in AR Fantasy, Design Considerations for AR Game-Based Learning for Children” [3]: This paper discusses the use of AR in educational games for children and presents design considerations for creating effective and engaging learning experiences. The authors provide an overview of the current state of the field, including the benefits and challenges of using AR in education, and present a set of best practices for designing AR games for children. They also describe several case studies of AR games that have been successfully implemented in real-world educational settings.
  • “Development of a Virtual Object Weight Recognition Algorithm Based on Pseudo-Haptics and the Development of Immersion Evaluation Technology” [4]: This paper presents a virtual object weight recognition algorithm that uses pseudo-haptics to create a more realistic and immersive experience. The authors describe the technical details of the algorithm, including the software and hardware used, and provide a thorough analysis of the results of user testing. (A minimal sketch of the general pseudo-haptic weight principle is given after this list.)
  • “A 3D Image Registration Method for Laparoscopic Liver Surgery Navigation” [5]: This paper proposes a new method for 3D image registration in laparoscopic liver surgery navigation. The authors introduce a hybrid registration method that combines feature-based and intensity-based registration to improve the accuracy and robustness of the registration process. The method was tested on real patient data and showed promising results. Overall, the paper presents a new approach to improve the precision of laparoscopic AR navigation in minimally invasive abdominal surgery.
  • “Gaze-Based Interaction Intention Recognition in Virtual Reality” [6]: This paper explores the use of gaze-based interaction in VR and how it can be used to recognise user intentions. It discusses the potential of this technology to unlock intuitive new interaction schemes and proposes a classification model for recognising user intentions from gaze data. The authors conducted experiments to test the accuracy of their model, found promising results, and discuss future research directions for this technology. (A much simpler dwell-time heuristic for gaze-based selection is sketched after this list.)
  • “Personalized Virtual Reality Environments for Intervention with People with Disability” [7]: This paper discusses the use of personalised VR environments to provide rehabilitation and intervention for people with disabilities. The authors provide an overview of the current state of the field, including the benefits and challenges of using VR in rehabilitation, and describe several case studies of successful interventions using VR. They also discuss the potential implications of the technology for the field of disability services.
  • “Preoperative Virtual Reality Surgical Rehearsal of Renal Access during Percutaneous Nephrolithotomy: A Pilot Study” [8]: This paper presents a pilot study of PCNL surgical rehearsal using the Marion Surgical PCNL simulator, in which preoperative CT scans of a patient are used to create a 3D model of the renal system. An experienced surgeon planned and practised the procedure in the simulator before performing the surgery in the operating room. Preliminary results suggest that surgical rehearsal combining VR and haptic feedback strongly affects decision making during the procedure.
  • “Digital Taste in Mulsemedia Augmented Reality: Perspective on Developments and Challenges” [9]: This article reviews how AR can be used to stimulate and modulate the sensation of taste in humans using low-amplitude electrical signals. The article explores techniques from prominent research pools and proposes extensions to the already established technological architecture for taste stimulation and modulation. The goal is to integrate gustatory augmentation into the commercial market and create a viable multichannel medium for the transfer of sensory information. The article highlights benefits and limitations and proposes the use of modern technological extensions, including the Internet of Things, artificial intelligence, and machine learning.
  • “Visual Positioning System Based on 6D Object Pose Estimation Using Mobile Web” [10]: This article presents a new method for detecting 3D objects in a single image taken by a smartphone camera in an indoor space and for calculating the smartphone’s location so that users can be positioned within that space. The proposed indoor visual positioning system for mobile devices is inexpensive, as it integrates deep learning and computer vision algorithms and requires no additional infrastructure. The method uses convolutional neural networks (CNNs) and real-time pose estimation to compute the full 6D pose and determine the location and orientation of the camera. The estimated position is then mapped to a voxel address to obtain a stable user position (a minimal sketch of this quantisation step is given after this list). The proposed voxel-addressed optimisation approach with 6D camera pose estimation from RGB images outperforms current state-of-the-art methods that use RGB-depth or point cloud data, and provides users with indoor information in a 3D AR model.
  • “Effects of Using Vibrotactile Feedback on Sound Localization by Deaf and Hard-of-Hearing People in Virtual Environments” [11]: This paper proposes a haptic VR suit that helps Deaf and Hard-of-Hearing (DHH) individuals complete sound-related VR tasks efficiently. The suit receives sound information wirelessly and indicates the direction of the sound source using vibrotactile feedback (a simplified two-motor direction mapping is sketched after this list). The study suggests that different setups of the suit can significantly improve VR task completion times: mounting haptic devices at different positions on users’ bodies showed that DHH users complete a VR task significantly faster when two vibro-motors are mounted on their arms and ears rather than on their thighs. An additional study found no significant difference in task completion time between using four vibro-motors with the suit and using only two vibro-motors on the users’ ears without the suit.
  • “Virtual/Augmented Reality for Rehabilitation Applications Using Electromyography as Control/Biofeedback: Systematic Literature Review” [12]: This article is a systematic literature review that examines whether there is a standardised protocol for therapeutic applications of surface electromyography (sEMG) signals in VR and AR interfaces. The review found 40 relevant articles covering applications such as neurological motor rehabilitation and prosthesis training, and processing approaches such as artificial intelligence and direct control. The hardware used includes the Myo Armband, Delsys systems, and proprietary equipment, and the VR/AR interfaces comprise training scene models, video games, and first-person views. The review concludes that there is no consensus regarding signal processing or classification criteria and proposes that future work should aim to standardise these technologies for adoption in clinical practice.
  • “Virtual Reality for Safe Testing and Development in Collaborative Robotics: Challenges and Perspectives” [13]: This paper explores the use of extended reality (XR), specifically VR, to test and develop collaboration between humans and collaborative robots (cobots). The use of XR simulations allows for evaluating collaboration without putting humans at risk, making it useful for dangerous scenarios. XR also enables combining human behavioural data, subjective self-reports, and biosignals to measure human comfort, stress, and cognitive load during collaboration. The paper suggests that XR has the potential to change the way cobots are designed, tested, and trained in a range of applications, from industry to healthcare and space operations.
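To make the pseudo-haptic weight idea summarised for [4] concrete, the following minimal Python sketch scales the displayed hand displacement by a control-display (C/D) ratio that shrinks as virtual mass grows. The mass-to-ratio mapping, the constant k, and the function names are illustrative assumptions, not the algorithm evaluated by the authors.

```python
# Minimal illustration of pseudo-haptic weight via a control-display (C/D) ratio:
# heavier virtual objects move less on screen per unit of real hand motion,
# which users tend to perceive as added weight. The mapping below is an
# illustrative assumption, not the algorithm from [4].

def cd_ratio(virtual_mass_kg: float, k: float = 0.15) -> float:
    """Return a C/D ratio in (0, 1]; heavier objects yield smaller ratios."""
    return 1.0 / (1.0 + k * virtual_mass_kg)

def displace_virtual_hand(real_hand_delta: tuple[float, float, float],
                          virtual_mass_kg: float) -> tuple[float, float, float]:
    """Scale the tracked hand displacement (x, y, z in metres) by the C/D ratio."""
    ratio = cd_ratio(virtual_mass_kg)
    return tuple(axis * ratio for axis in real_hand_delta)

if __name__ == "__main__":
    delta = (0.10, 0.02, 0.00)                                # real hand movement
    print(displace_virtual_hand(delta, virtual_mass_kg=0.5))  # light object
    print(displace_virtual_hand(delta, virtual_mass_kg=5.0))  # heavy object
```

Because the virtual hand lags the real hand more for heavier objects, users tend to perceive added resistance even though no physical force is applied.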
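As a much simpler stand-in for the gaze-based intention model of [6], the sketch below implements a plain dwell-time heuristic: if the gaze keeps hitting the same target beyond a threshold, an interaction intention is reported. The 600 ms threshold and the DwellDetector class are assumptions for illustration; the paper’s gaze features and classifier are not reproduced here.

```python
# Dwell-time heuristic for gaze-based selection: if the gaze stays on the same
# target for longer than a threshold, report an interaction intention.
# Threshold and class name are illustrative assumptions, not taken from [6].

from dataclasses import dataclass
from typing import Optional

@dataclass
class DwellDetector:
    threshold_s: float = 0.6          # assumed dwell threshold
    _target: Optional[str] = None
    _elapsed_s: float = 0.0

    def update(self, gazed_target: Optional[str], dt_s: float) -> Optional[str]:
        """Call once per frame with the id of the currently gazed-at target
        (or None); returns the target id when the dwell threshold is reached."""
        if gazed_target != self._target:
            self._target, self._elapsed_s = gazed_target, 0.0
            return None
        if gazed_target is None:
            return None
        self._elapsed_s += dt_s
        if self._elapsed_s >= self.threshold_s:
            self._elapsed_s = 0.0     # re-arm after reporting an intention
            return gazed_target
        return None

if __name__ == "__main__":
    detector = DwellDetector()
    for frame in range(30):           # ~33 ms frames, gaze fixed on one button
        if detector.update("button_a", 0.033):
            print("interaction intention on button_a at frame", frame)
```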
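The voxel-addressing step summarised for [10] amounts to quantising the estimated camera position onto a regular grid so that frame-to-frame jitter in the pose estimate does not move the reported user position. The sketch below shows that quantisation with an assumed 0.25 m voxel size; it is not the paper’s implementation.

```python
# Quantise an estimated camera position to a voxel so that small frame-to-frame
# jitter in the 6D pose estimate does not move the reported user position.
# The 0.25 m voxel size is an assumed parameter, not taken from [10].

import math

def voxel_address(position_m: tuple[float, float, float],
                  voxel_size_m: float = 0.25) -> tuple[int, int, int]:
    """Map a metric (x, y, z) position to integer voxel coordinates."""
    return tuple(math.floor(c / voxel_size_m) for c in position_m)

def voxel_centre(voxel: tuple[int, int, int],
                 voxel_size_m: float = 0.25) -> tuple[float, float, float]:
    """Return the metric centre of a voxel, reported as the stable user position."""
    return tuple((i + 0.5) * voxel_size_m for i in voxel)

if __name__ == "__main__":
    estimate = (3.412, 0.08, -1.287)   # camera position from the pose estimator
    voxel = voxel_address(estimate)
    print(voxel, voxel_centre(voxel))
```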
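Finally, the directional cue in [11] can be pictured as a mapping from the horizontal angle of the sound source, relative to where the user is facing, to per-motor vibration intensities. The two-motor mapping below is a simplified assumption rather than the control scheme of the haptic suit described in the paper.

```python
# Map the horizontal angle of a sound source, relative to where the user is
# facing, to intensities for a left and a right vibro-motor: frontal sources
# give a weak symmetric cue, lateral sources drive the nearer motor harder.
# This two-motor mapping is an illustrative assumption, not the control
# scheme of the haptic suit in [11].

import math

def motor_intensities(source_angle_deg: float) -> tuple[float, float]:
    """Return (left, right) intensities in [0, 1]; 0 deg is straight ahead,
    +90 deg is to the user's right."""
    a = math.radians(source_angle_deg)
    right = max(0.0, math.sin(a))       # positive angles favour the right motor
    left = max(0.0, -math.sin(a))       # negative angles favour the left motor
    base = 0.2 * max(0.0, math.cos(a))  # weak shared cue for frontal sources
    return (min(1.0, left + base), min(1.0, right + base))

if __name__ == "__main__":
    for angle in (0, 45, 90, -90):
        print(angle, motor_intensities(angle))
```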
These papers demonstrate the diverse range of applications and approaches to tangible and embodied interactions in virtual and augmented reality and highlight the potential of these technologies to create more natural and engaging user experiences. As the field continues to evolve, we can expect to see even more innovative and exciting developments in this area.

Funding

This work is funded by national funds through the FCT-Foundation for Science and Technology, I.P., within the scope of the project CISUC-UID/CEC/00326/2020 and by European Social Fund, through the Regional Operational Program Centro 2020.

Acknowledgments

The editors of this Special Issue would like to thank all the authors for their contributions and all the reviewers for their careful and timely reviews that helped improve the quality of this Special Issue.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zhang, H.; Wang, J.; Li, Z.; Li, J. Design and Implementation of Two Immersive Audio and Video Communication Systems Based on Virtual Reality. Electronics 2023, 12, 1134. [Google Scholar] [CrossRef]
  2. Li, J.; Wang, Z. An Interactive Augmented Reality Graph Visualization for Chinese Painters. Electronics 2022, 11, 2367. [Google Scholar] [CrossRef]
  3. Zuo, T.; Jiang, J.; Van der Spek, E.; Birk, M.; Hu, J. Situating Learning in AR Fantasy, Design Considerations for AR Game-Based Learning for Children. Electronics 2022, 11, 2331. [Google Scholar] [CrossRef]
  4. Son, E.; Song, H.; Nam, S.; Kim, Y. Development of a Virtual Object Weight Recognition Algorithm Based on Pseudo-Haptics and the Development of Immersion Evaluation Technology. Electronics 2022, 11, 2274. [Google Scholar] [CrossRef]
  5. Li, D.; Wang, M. A 3D Image Registration Method for Laparoscopic Liver Surgery Navigation. Electronics 2022, 11, 1670. [Google Scholar] [CrossRef]
  6. Chen, X.L.; Hou, W.J. Gaze-Based Interaction Intention Recognition in Virtual Reality. Electronics 2022, 11, 1647. [Google Scholar] [CrossRef]
  7. Lagos Rodríguez, M.; García, Á.G.; Loureiro, J.P.; García, T.P. Personalized Virtual Reality Environments for Intervention with People with Disability. Electronics 2022, 11, 1586. [Google Scholar] [CrossRef]
  8. Sainsbury, B.; Wilz, O.; Ren, J.; Green, M.; Fergie, M.; Rossa, C. Preoperative Virtual Reality Surgical Rehearsal of Renal Access during Percutaneous Nephrolithotomy: A Pilot Study. Electronics 2022, 11, 1562. [Google Scholar] [CrossRef]
  9. Duggal, A.S.; Singh, R.; Gehlot, A.; Rashid, M.; Alshamrani, S.S.; AlGhamdi, A.S. Digital Taste in Mulsemedia Augmented Reality: Perspective on Developments and Challenges. Electronics 2022, 11, 1315. [Google Scholar] [CrossRef]
  10. Kim, J.Y.; Kim, I.S.; Yun, D.Y.; Jung, T.W.; Kwon, S.C.; Jung, K.D. Visual Positioning System Based on 6D Object Pose Estimation Using Mobile Web. Electronics 2022, 11, 865. [Google Scholar] [CrossRef]
  11. Mirzaei, M.; Kán, P.; Kaufmann, H. Effects of Using Vibrotactile Feedback on Sound Localization by Deaf and Hard-of-Hearing People in Virtual Environments. Electronics 2021, 10, 2794. [Google Scholar] [CrossRef]
  12. Toledo-Peral, C.L.; Vega-Martínez, G.; Mercado-Gutiérrez, J.A.; Rodríguez-Reyes, G.; Vera-Hernández, A.; Leija-Salas, L.; Gutiérrez-Martínez, J. Virtual/Augmented Reality for Rehabilitation Applications Using Electromyography as Control/Biofeedback: Systematic Literature Review. Electronics 2022, 11, 2271. [Google Scholar] [CrossRef]
  13. i Badia, S.B.; Silva, P.A.; Branco, D.; Pinto, A.; Carvalho, C.; Menezes, P.; Almeida, J.; Pilacinski, A. Virtual Reality for Safe Testing and Development in Collaborative Robotics: Challenges and Perspectives. Electronics 2022, 11, 1726. [Google Scholar] [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
