Special Issue "Applications of Virtual, Augmented, and Mixed Reality"

A special issue of Applied Sciences (ISSN 2076-3417). This special issue belongs to the section "Computing and Artificial Intelligence".

Deadline for manuscript submissions: 30 May 2020.

Special Issue Editor

Prof. Jorge Martin-Gutierrez
Guest Editor
Técnicas y Proyectos en Ingeniería y Arquitectura, Universidad de La Laguna, Santa Cruz de Tenerife, Spain
Interests: augmented reality; virtual reality; mixed reality; human–computer interaction; wearable interaction; user experience; usability

Special Issue Information

Dear Colleagues,

The term XR (extended reality), which encompasses virtual reality, augmented reality, and mixed reality, is becoming more widely known. In recent years, XR has made remarkable progress, and expectations for its use are very high. There is no doubt about the potential of this technology.

As some basic research has come to fruition, expectations for XR have increased, as have opportunities for it to be applied in different fields. These technologies offer great opportunities in education, medicine, architecture, Industry 4.0, e-commerce, gaming, healthcare, the military, emergency response, entertainment, engineering, advertising, retail, and beyond, and we may be facing a technological change as profound as the mass adoption of the PC, the internet, or the smartphone was in its time.

The applications that can be developed for smartphones, tablets, and new wearable XR devices (glasses and headsets), which free workers and users from having to hold a device, are more numerous than we can imagine; they can save time, reduce production costs, and improve quality of life.

In the manufacturing field, the use of augmented reality has been a topic of conversation for years, but actual deployment has been slow. This is changing, however, as manufacturers explore the technology in their plants and move beyond pilots and trials to wider, day-to-day use of AR. Although AR is still at an early stage in manufacturing, there is a great deal of innovation going on and a lot of movement in the industry around AR. On the other hand, XR provides opportunities in education and training that are not possible with traditional instruction methods and other technologies used in education. VR, AR, and MR allow learners to safely experience environments and virtual scenarios that would normally be too dangerous to learn in. For many academic institutions and companies, it is difficult to provide the infrastructure needed to teach or train their learners or workers. Unlike some traditional instruction methods, VR, AR, and MR applications offer consistent education and training that do not vary from instructor to instructor. These virtual technologies also afford the development of psychomotor skills through physical 3D interactions with virtual elements. This is especially important when resources for training are limited.

This Special Issue calls for interesting studies, applications, and experiences that will open up new uses of XR. In addition to research that steadily improves on existing issues, we welcome research papers that present new possibilities of VR, AR, and MR. Topics of interest include but are not limited to the following:

  • VR/AR/MR applications: manufacturing, healthcare, virtual travel, e-sports, games, cultural heritage, military, e-commerce, psychology, medicine, emergency response, entertainment, engineering, advertising, etc.
  • Brain science for VR/AR/MR
  • VR/AR/MR collaboration
  • Context awareness for VR/AR
  • Education with VR/AR/MR
  • Use of 360° video for VR
  • Display technologies for VR/AR/MR
  • Human–computer interactions in VR/AR/MR
  • Human factors in VR/AR/MR
  • Perception/presence in VR/AR/MR
  • Physiological sensing for VR/AR/MR
  • Cybersickness
  • User experience/usability in VR/AR/MR
  • Interfaces for VR/AR
  • Virtual humans/avatars in VR/AR/MR
  • Wellbeing with VR/AR/MR
  • Human behavior sensing
  • Gesture interface
  • Interactive simulation
  • New interaction design for VR/AR/MR
  • Integrated AR/VR devices and technologies
  • Issues in real-world and virtual-world integration
  • Social aspects in VR/AR/MR interaction

Prof. Jorge Martin-Gutierrez
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website and going to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Applied Sciences is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Virtual, augmented, and mixed reality
  • Interactive simulation
  • HCI (human–computer interaction)
  • Human-centered design

Published Papers (3 papers)

Research

Open Access Article
Towards Next Generation Technical Documentation in Augmented Reality Using a Context-Aware Information Manager
Appl. Sci. 2020, 10(3), 780; https://doi.org/10.3390/app10030780 - 22 Jan 2020
Abstract
Technical documentation is evolving from static content presented on paper or via digital publishing to real-time, on-demand content displayed via virtual and augmented reality (AR) devices. However, how best to provide a personalized and context-relevant presentation of technical information is still an open field of research. In particular, the systems described in the literature can manage only a limited number of modalities to convey technical information, and they do not consider the ‘people’ factor. Therefore, in this work, we present a Context-Aware Technical Information Management (CATIM) system that dynamically manages (1) what information and (2) how information is presented in an augmented reality interface. The system was successfully implemented, and we carried out a first evaluation in the real industrial scenario of the maintenance of a hydraulic valve. We also measured the time performance of the system, and results revealed that CATIM performs fast enough to support interactive AR.
(This article belongs to the Special Issue Applications of Virtual, Augmented, and Mixed Reality)
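
As an illustrative aside, the core idea the abstract describes — selecting both what technical content to show and how to render it from the current context — can be sketched as a simple rule-based mapping. The context attributes, rules, and names below are hypothetical assumptions for illustration only, not CATIM's actual architecture.

```python
from dataclasses import dataclass

# Hypothetical sketch of context-aware content selection; not CATIM's actual design.
# The context attributes (task, expertise, device) are assumptions for illustration.

@dataclass
class Context:
    task: str        # e.g. "hydraulic_valve_maintenance"
    expertise: str   # "novice" or "expert"
    device: str      # "hmd" or "tablet"

def select_presentation(ctx: Context) -> dict:
    """Decide WHAT information to show and HOW to render it in the AR interface."""
    what = "step_by_step_instructions" if ctx.expertise == "novice" else "checklist_summary"
    how = "3d_overlay" if ctx.device == "hmd" else "2d_panel"
    return {"task": ctx.task, "content": what, "modality": how}

print(select_presentation(Context("hydraulic_valve_maintenance", "novice", "hmd")))
```

A real context-aware manager would draw on richer context (environment, procedure state, user history) and a larger rule base or learned model; the sketch only illustrates the what/how separation mentioned in the abstract.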

Open Access Article
The Limited Effect of Graphic Elements in Video and Augmented Reality on Children’s Listening Comprehension
Appl. Sci. 2020, 10(2), 527; https://doi.org/10.3390/app10020527 - 10 Jan 2020
Abstract
There is currently significant interest in the use of instructional strategies in learning environments thanks to the emergence of new multimedia systems that combine text, audio, graphics, and video, such as augmented reality (AR). In this light, this study compares the effectiveness of AR and video for listening comprehension tasks. The sample consisted of thirty-two elementary school students with different reading comprehension levels. Firstly, the experience, instructions, and objectives were introduced to all the students. Next, they were divided into two groups to perform activities: one group watched an educational video story about the dog Laika and her space journey, available through the mobile app Blue Planet Tales, while the other performed an activity involving AR, in which the contents of the same story were visualized by means of the app Augment Sales. Once the activities were completed, participants answered a comprehension test. Results (p = 0.180) indicate no significant relation between lesson format and test performance, but there are differences among participants in the AR group according to their reading comprehension level. With respect to the time taken to complete the comprehension test, there is no significant difference between the two groups, but there is a difference between participants with high and low levels of comprehension. Finally, the SUS (System Usability Scale) questionnaire was used to measure the usability of the AR app on a smartphone. An average score of 77.5 out of 100 was obtained, which indicates that the app has a fairly good user-centered design.
(This article belongs to the Special Issue Applications of Virtual, Augmented, and Mixed Reality)
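
For readers unfamiliar with the scale, the SUS score reported above (77.5 out of 100) comes from the standard SUS scoring procedure: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5. A minimal sketch is shown below; the example responses are invented for illustration.

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response);
    the total is scaled by 2.5 to yield a 0-100 score.
    """
    assert len(responses) == 10
    total = 0
    for item, r in enumerate(responses, start=1):
        total += (r - 1) if item % 2 == 1 else (5 - r)
    return total * 2.5

# Invented example: one participant's responses to the ten SUS items.
print(sus_score([5, 2, 4, 1, 4, 2, 5, 1, 4, 2]))  # -> 85.0
```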

Open Access Article
The Influence of Display Parameters and Display Devices over Spatial Ability Test Answers in Virtual Reality Environments
Appl. Sci. 2020, 10(2), 526; https://doi.org/10.3390/app10020526 - 10 Jan 2020
Abstract
This manuscript analyzes the influence of display parameters and display devices on the spatial skills of users in virtual reality environments. To this end, the authors developed a virtual reality application that tests the spatial skills of the users. 240 students used an LG desktop display and 61 students used the Gear VR for the tests. Statistical data are generated as the users take the tests, and the following factors are logged by the application and evaluated in this manuscript: virtual camera type, virtual camera field of view, virtual camera rotation, contrast ratio parameters, the existence of shadows, and the device used. The probability of a correct answer was analyzed with respect to these factors using the logistic regression (logit) method. The influences and interactions of all factors were analyzed. The perspective camera, a lighter contrast ratio, no or large camera rotations, and the use of the Gear VR greatly and positively influenced the probability of correct answers on the tests. Therefore, for the assessment of spatial ability in virtual reality, the use of these parameters and this device represents the optimal user-centric human–computer interaction practice.
(This article belongs to the Special Issue Applications of Virtual, Augmented, and Mixed Reality)
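
The analysis described above — modeling the probability of a correct answer from categorical display factors — follows a standard logistic regression (logit) workflow. A minimal sketch using statsmodels is given below; the synthetic data, factor levels, and effect sizes are assumptions for illustration, not the study's actual dataset or results.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200  # invented sample size for the synthetic illustration

# Invented factor levels standing in for the logged display parameters.
camera = rng.choice(["perspective", "orthographic"], n)
device = rng.choice(["gear_vr", "desktop"], n)
contrast = rng.choice(["light", "dark"], n)

# Assumed effect directions (perspective camera, Gear VR, lighter contrast help),
# chosen only so the synthetic data resemble the reported trends.
linpred = (-0.3
           + 0.8 * (camera == "perspective")
           + 0.6 * (device == "gear_vr")
           + 0.5 * (contrast == "light"))
p_correct = 1.0 / (1.0 + np.exp(-linpred))
correct = rng.binomial(1, p_correct)

data = pd.DataFrame({"correct": correct, "camera": camera,
                     "device": device, "contrast": contrast})

# Logit model: probability of a correct answer as a function of the display factors.
model = smf.logit("correct ~ C(camera) + C(device) + C(contrast)", data=data).fit(disp=False)
print(model.summary())
```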
