Special Issue "LifeXR: Concepts, Technology and Design for Everyday XR"

A special issue of Electronics (ISSN 2079-9292). This special issue belongs to the section "Computer Science & Engineering".

Deadline for manuscript submissions: 31 December 2021.

Special Issue Editors

Prof. Dr. Dongsik Jo
Guest Editor
School of IT Convergence, University of Ulsan, Ulsan 44610, Korea
Interests: virtual/mixed reality; games
Prof. Dr. Gerard J. Kim
Guest Editor
Department of Computer Science and Engineering, Korea University, Seoul 02841, Korea
Interests: virtual/mixed reality; human–computer interaction
Dr. Jae-in Hwang
Guest Editor
Center for Imaging Media Research, Korea Institute of Science and Technology, Seoul 136-791, Korea
Interests: virtual/mixed reality; computer graphics; human–computer interaction

Special Issue Information

Dear Colleagues,

With the introduction of inexpensive lines of head-mounted displays circa 2012, the excitement surrounding virtual reality (VR), with its rosy prospects as the most promising and exciting future media form, was rekindled yet again. This revival also sparked huge investments and rapid development of related products, software, content, devices, and sensors, and even helped instigate a similar phenomenon for augmented reality (AR).

After about a decade, while VR/AR may have managed to become important media technologies in the industrial sector, the "consumer"-level mass market for VR/AR has not become a reality. This Special Issue explores the concept of "LifeXR": asking where the current state of XR (i.e., VR/AR/AVR) stands in the everyday life of the average consumer, and exploring ways to make everyday XR happen. For example, leveraging smartphones and extending mobile XR platforms to be more usable and inexpensive would be one important way to realize LifeXR. Taking advantage of recent ubiquitous 5G communication and IoT devices to secure the computational power needed to support realism and an immersive user experience on LifeXR platforms could be another. The types of content should go beyond typical areas such as games, training, and medicine, so that they can be used on a daily basis by average users; their themes should be found in the midst of our daily lives.

In essence, we ask: how can VR/AR, or LifeXR, improve the quality of human lives and make our mundane daily routines more exciting?

We invite papers that address the conceptualization of LifeXR, technologies specific to LifeXR, and design solutions for LifeXR, including but not limited to:

  •  New mobile XR/LifeXR platforms;
  •  Usability of mobile XR/LifeXR;
  •  User experience of mobile XR/LifeXR;
  •  Sickness in mobile XR/LifeXR;
  •  Interaction design for mobile XR/LifeXR;
  •  Interfaces for mobile XR/LifeXR;
  •  Tracking for mobile XR/LifeXR;
  •  Context awareness for mobile XR/LifeXR;
  •  5G for mobile XR/LifeXR;
  •  IoT for mobile XR/LifeXR;
  •  Multimode XR and LifeXR;
  •  Multitasking LifeXR;
  •  LifeXR applications;
  •  LifeXR in everyday spaces (vehicle, office, kitchen, living room, restaurant, bed, etc.).

Prof. Dr. Dongsik Jo
Prof. Dr. Gerard J. Kim
Dr. Jae-in Hwang
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Electronics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • New mobile XR/LifeXR platforms
  • Usability of mobile XR/LifeXR
  • User experience of mobile XR/LifeXR
  • Sickness in mobile XR/LifeXR
  • Interaction design for mobile XR/LifeXR
  • Interfaces for mobile XR/LifeXR
  • Tracking for mobile XR/LifeXR
  • Context awareness for mobile XR/LifeXR
  • 5G for mobile XR/LifeXR
  • IoT for mobile XR/LifeXR
  • Multimode XR and LifeXR
  • Multitasking LifeXR
  • LifeXR applications
  • LifeXR in everyday spaces (vehicle, office, kitchen, living room, restaurant, bed, etc.)

Published Papers (8 papers)


Research

Article
SEOUL AR: Designing a Mobile AR Tour Application for Seoul Sky Observatory in South Korea
Electronics 2021, 10(20), 2552; https://doi.org/10.3390/electronics10202552 - 19 Oct 2021
Abstract
Skyscrapers are symbols of local landmarks, and their prevalence is increasing across the world owing to recent advances in architectural technology. In Korea, the Lotte World Tower, now the tallest skyscraper in Seoul, was constructed in 2017; its observatory deck, Seoul Sky, is currently in operation. This study focuses on the design of Seoul AR, a mobile augmented reality (AR) tour application. Visitors can use Seoul AR when visiting the Seoul Sky Observatory, one of the representative landmarks of Seoul, and enjoy a 360° view of the entire landscape of Seoul from the observatory space. With Seoul AR, they can identify tourist attractions in Seoul through simple mission games. Users are also provided with information about the specific attraction they are viewing, as well as information on transportation, popular restaurants, shopping places, etc., to increase the satisfaction of tourists visiting the Seoul Sky Observatory. The final design was revised through heuristic evaluation, and users' satisfaction with Seoul AR was studied through surveys completed by visitors to the Seoul Sky Observatory.
(This article belongs to the Special Issue LifeXR: Concepts, Technology and Design for Everyday XR)

Article
Multi-User Drone Flight Training in Mixed Reality
Electronics 2021, 10(20), 2521; https://doi.org/10.3390/electronics10202521 - 15 Oct 2021
Abstract
The development of services and applications involving drones is promoting the growth of the unmanned-aerial-vehicle industry. Moreover, the supply of low-cost compact drones has greatly contributed to the popularization of drone flying. However, flying first-person-view (FPV) drones requires considerable experience because the remote pilot sees only the video transmitted from a camera mounted on the drone. In this paper, we propose a remote training system for FPV drone flying in mixed reality, with which beginners inexperienced in FPV drone flight control can practice under the guidance of remote experts.
(This article belongs to the Special Issue LifeXR: Concepts, Technology and Design for Everyday XR)

Article
Integration of Extended Reality and a High-Fidelity Simulator in Team-Based Simulations for Emergency Scenarios
Electronics 2021, 10(17), 2170; https://doi.org/10.3390/electronics10172170 - 06 Sep 2021
Abstract
Wearable devices such as smart glasses are considered promising assistive tools for information exchange in healthcare settings. We aimed to evaluate the usability and feasibility of smart glasses for team-based simulations constructed using a high-fidelity simulator. Two scenarios of patients with arrhythmia were developed to establish a procedure for interprofessional interactions via smart glasses using 15 h of simulation training. Three to four participants formed a team and played the role of remote supporter or bedside trainee with smart glasses. Usability, attitudes towards the interprofessional healthcare team, and learning satisfaction were assessed. On a 5-point Likert scale, from 1 (strongly disagree) to 5 (strongly agree), 31 participants reported that the smart glasses were easy to use (3.61 ± 0.95), that they felt confident during use (3.90 ± 0.87), that they responded positively to long-term use (3.26 ± 0.89), and that they experienced low levels of physical discomfort (1.96 ± 1.06). Learning satisfaction was high (4.65 ± 0.55), and most (84%) participants found the experience favorable. Key challenges included an unstable internet connection, poor resolution and display, and physical discomfort while using the smart glasses with accessories. We determined the feasibility and acceptability of smart glasses for interprofessional interactions within a team-based simulation. Participants responded favorably toward a smart-glass-based simulation learning environment that would be applicable in clinical settings.
(This article belongs to the Special Issue LifeXR: Concepts, Technology and Design for Everyday XR)

Article
Production of Mobile English Language Teaching Application Based on Text Interface Using Deep Learning
Electronics 2021, 10(15), 1809; https://doi.org/10.3390/electronics10151809 - 28 Jul 2021
Abstract
This paper proposes a novel text interface using deep learning in a mobile platform environment and presents English language teaching applications created with this interface. First, an interface for handwriting text is designed with a simple structure based on the touch-based input method of mobile platform applications. This input method is easier and more convenient than the existing graphical user interface (GUI), in which menu items such as buttons are selected repeatedly or step by step. Next, an interaction that intuitively facilitates behavior and decision making from the input text is proposed. We propose an interaction technique that recognizes text handwritten on the text interface using the Extended Modified National Institute of Standards and Technology (EMNIST) dataset and a convolutional neural network (CNN) model, and connects the recognized text to a behavior. Finally, using the proposed interface, we create English language teaching applications that effectively facilitate learning alphabet writing and words through handwriting. User satisfaction with the interface during the educational process is then analyzed and verified through a survey experiment.
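The recognition step described in this abstract pairs an EMNIST-style 28×28 handwriting image with a CNN classifier. As a rough illustration of such a forward pass (not the authors' actual model; all layer sizes and the random weights below are assumptions for demonstration), the following NumPy sketch runs one convolution, ReLU, 2×2 max pooling, and a linear softmax classifier over 26 letter classes:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(image, kernel):
    """Valid 2D convolution (cross-correlation, as in most DL libraries)."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def forward(image, kernel, weights):
    """Conv -> ReLU -> 2x2 max pool -> flatten -> linear scores -> softmax."""
    feat = np.maximum(conv2d(image, kernel), 0.0)      # ReLU activation
    h, w = feat.shape
    pooled = feat[:h // 2 * 2, :w // 2 * 2].reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))
    logits = pooled.ravel() @ weights                  # linear classifier
    e = np.exp(logits - logits.max())                  # numerically stable softmax
    return e / e.sum()                                 # class probabilities

image = rng.random((28, 28))                  # stand-in for one EMNIST glyph
kernel = rng.standard_normal((3, 3))          # one untrained 3x3 filter
weights = rng.standard_normal((13 * 13, 26))  # 26 letter classes
probs = forward(image, kernel, weights)       # probability per letter
```

In a trained system, the kernel and weights would be learned from the EMNIST dataset, and the argmax of `probs` would give the recognized character.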
(This article belongs to the Special Issue LifeXR: Concepts, Technology and Design for Everyday XR)

Article
Planar-Equirectangular Image Stitching
Electronics 2021, 10(9), 1126; https://doi.org/10.3390/electronics10091126 - 10 May 2021
Abstract
360° cameras have served as a convenient tool for people to record their special moments and everyday lives. The supported panoramic view allows for an immersive experience with a virtual reality (VR) headset, adding to viewer enjoyment. Nevertheless, they cannot deliver the high angular resolution that a perspective camera can support. We put forward a solution by placing the perspective camera's planar image onto the pertinent region of interest (ROI) of the 360° camera's equirectangular image through planar-equirectangular image stitching. The proposed method includes (1) a tangent-image-based stitching pipeline to address the spherical distortion of the equirectangular image, (2) a feature matching scheme to increase the correct feature match count, (3) ROI detection to find the relevant ROI on the equirectangular image, and (4) human visual system (HVS)-based image alignment to tackle the parallax error. Qualitative and quantitative experiments on a collected dataset showed improvements of the proposed planar-equirectangular image stitching over existing approaches: (1) less distortion in the stitching result, (2) a 29.0% increase in correct matches, (3) a 5.72° ROI position error from the ground truth, and (4) a lower aggregated alignment-distortion error than existing alignment approaches. We discuss possible improvement points and future research directions.
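The tangent-image pipeline in step (1) rests on reprojecting the equirectangular sphere onto a plane tangent to it, which locally removes spherical distortion so planar features can be matched. A minimal NumPy sketch of that reprojection, assuming a unit sphere and the standard gnomonic projection formulas (the image size and tangent point are arbitrary illustrations, not the paper's parameters):

```python
import numpy as np

def equirect_to_sphere(u, v, width, height):
    """Map an equirectangular pixel (u, v) to longitude/latitude in radians."""
    lon = (u / width - 0.5) * 2.0 * np.pi   # lon in [-pi, pi]
    lat = (0.5 - v / height) * np.pi        # lat in [-pi/2, pi/2]
    return lon, lat

def gnomonic_project(lon, lat, lon0=0.0, lat0=0.0):
    """Gnomonic projection of a sphere point onto the plane tangent at
    (lon0, lat0); valid for points within 90 degrees of the tangent point."""
    cos_c = (np.sin(lat0) * np.sin(lat)
             + np.cos(lat0) * np.cos(lat) * np.cos(lon - lon0))
    x = np.cos(lat) * np.sin(lon - lon0) / cos_c
    y = (np.cos(lat0) * np.sin(lat)
         - np.sin(lat0) * np.cos(lat) * np.cos(lon - lon0)) / cos_c
    return x, y

# The tangent point itself maps to the plane origin.
lon, lat = equirect_to_sphere(960, 480, 1920, 960)  # centre of a 1920x960 image
x, y = gnomonic_project(lon, lat)
```

Stitching would warp the equirectangular ROI into this tangent plane, match features against the perspective image there, and invert the projection to composite the result back.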
(This article belongs to the Special Issue LifeXR: Concepts, Technology and Design for Everyday XR)

Article
CIRO: The Effects of Visually Diminished Real Objects on Human Perception in Handheld Augmented Reality
Electronics 2021, 10(8), 900; https://doi.org/10.3390/electronics10080900 - 09 Apr 2021
Abstract
Augmented reality (AR) scenes often inadvertently contain real-world objects that are not relevant to the main AR content, such as arbitrary passersby on the street. We refer to these real-world objects as content-irrelevant real objects (CIROs). CIROs may distract users from focusing on the AR content and bring about perceptual issues (e.g., depth distortion or physicality conflict). In a prior work, we carried out a comparative experiment investigating the effects of the degree of visual diminishment of such a CIRO on user perception of the AR content. Our findings revealed that the diminished representation had positive impacts on human perception, such as reducing distraction and increasing the presence of the AR objects in the real environment. However, in that work, the ground-truth test was staged with perfect, artifact-free diminishment. In this work, we applied an actual real-time object diminishment algorithm on the handheld AR platform, which cannot be completely artifact-free in practice, and evaluated its performance both objectively and subjectively. We found that the imperfect diminishment and visual artifacts can negatively affect the subjective user experience.
(This article belongs to the Special Issue LifeXR: Concepts, Technology and Design for Everyday XR)

Article
DeepHandsVR: Hand Interface Using Deep Learning in Immersive Virtual Reality
Electronics 2020, 9(11), 1863; https://doi.org/10.3390/electronics9111863 - 06 Nov 2020
Cited by 6
Abstract
This paper proposes a hand interface based on deep learning that provides easy and realistic hand interactions in immersive virtual reality. The proposed interface is designed to provide a real-to-virtual direct hand interface, using a controller to map a real hand gesture to a virtual hand in an easy and simple structure. In addition, a gesture-to-action interface is proposed that expresses the process from gesture to action in real time, without the graphical user interface (GUI) required by existing interactive applications. This interface captures the 3D virtual hand gesture model as a 2D image and applies an image classification training process with a deep learning model, a convolutional neural network (CNN). The key objective is to provide users with intuitive and realistic interactions that are convenient to operate in immersive virtual reality. To achieve this, an application was developed to compare and analyze the proposed interface and the existing GUI. A survey experiment was then conducted to statistically analyze and evaluate the positive effects on the sense of presence through user satisfaction with the interface experience.
(This article belongs to the Special Issue LifeXR: Concepts, Technology and Design for Everyday XR)

Article
Exploring the Effects of Scale and Color Differences on Users’ Perception for Everyday Mixed Reality (MR) Experience: Toward Comparative Analysis Using MR Devices
Electronics 2020, 9(10), 1623; https://doi.org/10.3390/electronics9101623 - 02 Oct 2020
Cited by 1
Abstract
With continued technological innovation in the field of mixed reality (MR), wearable MR devices such as head-mounted displays (HMDs) have been released and are frequently used in various fields, such as entertainment, training, education, and shopping. However, because each product has different parts and specifications in terms of design and manufacturing, virtual objects overlaid on real environments in MR are visualized differently depending on the scale and color rendered by the MR device. In this paper, we compare the effect of scale and color parameters on users' perception when using different types of MR devices, with the aim of improving MR experiences in real life. We conducted two experiments (scale and color), and our experimental study showed that subjects in the scale perception experiment clearly tended to underestimate the size of virtual objects in comparison with real objects, and to overestimate color in MR environments.
(This article belongs to the Special Issue LifeXR: Concepts, Technology and Design for Everyday XR)
