Special Issue "Advances in Mobile Augmented Reality"

A special issue of Computers (ISSN 2073-431X).

Deadline for manuscript submissions: closed (31 March 2018)

Special Issue Editor

Guest Editor
Professor M. Carmen Juan

Instituto de Automática e Informática Industrial (ai2), Departamento de Sistemas Informáticos y Computación (DSIC), Universitat Politècnica de València (UPV), 46022 València, Spain
Interests: computer graphics; specifically augmented reality (AR); advanced user interfaces and their applications to psychology and education/edutainment

Special Issue Information

Dear Colleagues,

Mobile Augmented Reality has reached an optimal moment for users to enjoy compelling experiences in the palm of their hands. Its success rests mainly on two factors: the available hardware and the available programming tools. Regarding hardware, smartphones not only have fast CPUs, large displays, 16 MP cameras, graphics acceleration, compasses, accelerometers, GPS sensors and gyroscopes, but can also include depth and motion-tracking sensors. The possibility of combining head-mounted viewers with smartphones to obtain an augmented reality platform has opened a new niche for research and the market.

The current situation raises new challenges for researchers to develop new applications that take advantage of existing technology, as well as to contribute to improving it. This involves topics such as interaction techniques, tracking and rendering methods, new development tools, new uses and applications, evaluation methodologies, and questionnaires adapted to these specific experiences.

This Special Issue will cover the latest contributions to Mobile Augmented Reality. Appropriate topics include, but are not limited to:

  • Tracking
  • Rendering and visualization techniques
  • New devices, development tools, uses and applications
  • Evaluation methods/usability evaluation
  • Human factors
Professor M. Carmen Juan

Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Computers is an international peer-reviewed open access quarterly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 350 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Mobile Augmented Reality
  • Devices
  • Development tools
  • New uses
  • Applications
  • Interaction
  • Tracking
  • Rendering
  • Visualization
  • Usability

Published Papers (6 papers)


Research


Open Access Article: User Experience in Mobile Augmented Reality: Emotions, Challenges, Opportunities and Best Practices
Received: 20 April 2018 / Revised: 9 May 2018 / Accepted: 18 May 2018 / Published: 21 May 2018
Abstract
Mobile Augmented Reality (MAR) is gaining strong momentum to become a major interactive technology that can be applied across domains and purposes. The rapid proliferation of MAR applications in global mobile application markets has been fueled by a range of freely-available MAR software development kits and content development tools, some of which enable the creation of MAR applications even without programming skills. Despite the recent advances of MAR technology and tools, there are still many challenges associated with MAR from the User Experience (UX) design perspective. In this study, we first define UX as the emotions that the user encounters while using a service, a product or an application and then explore the recent research on the topic. We present two case studies, a commercial MAR experience and our own Virtual Campus Tour MAR application, and evaluate them from the UX perspective, with a focus on emotions. Next, we synthesize the findings from previous research and the results of the case study evaluations to form sets of challenges, opportunities and best practices related to UX design of MAR applications. Based on the identified best practices, we finally present an updated version of the Virtual Campus Tour. The results can be used for improving UX design of future MAR applications, thus making them emotionally engaging.
(This article belongs to the Special Issue Advances in Mobile Augmented Reality)

Open Access Article: Mobile Educational Augmented Reality Games: A Systematic Literature Review and Two Case Studies
Received: 31 January 2018 / Revised: 28 February 2018 / Accepted: 1 March 2018 / Published: 3 March 2018
Abstract
Augmented reality (AR) has evolved from research projects into mainstream applications that cover diverse fields, such as entertainment, health, business, tourism and education. In particular, AR games, such as Pokémon Go, have contributed to introducing the AR technology to the general public. The proliferation of modern smartphones and tablets with large screens, cameras, and high processing power has ushered in mobile AR applications that can provide context-sensitive content to users whilst freeing them to explore the context. To avoid ambiguity, I define mobile AR as a type of AR where a mobile device (smartphone or tablet) is used to display and interact with virtual content that is overlaid on top of a real-time camera feed of the real world. Beyond being mere entertainment, AR and games have been shown to possess significant affordances for learning. Although previous research has done a decent job of reviewing research on educational AR applications, I identified a need for a comprehensive review on research related to educational mobile AR games (EMARGs). This paper explored the research landscape on EMARGs over the period 2012–2017 through a systematic literature review complemented by two case studies in which the author participated. After a comprehensive literature search and filtering, I analyzed 31 EMARGs from the perspectives of technology, pedagogy, and gaming. Moreover, I presented an analysis of 26 AR platforms that can be used to create mobile AR applications. I then discussed the results in depth and synthesized my interpretations into 13 guidelines for future EMARG developers.
(This article belongs to the Special Issue Advances in Mobile Augmented Reality)

Open Access Article: Users’ Perceptions Using Low-End and High-End Mobile-Rendered HMDs: A Comparative Study
Received: 10 January 2018 / Revised: 8 February 2018 / Accepted: 9 February 2018 / Published: 13 February 2018
Abstract
Currently, it is possible to combine Mobile-Rendered Head-Mounted Displays (MR HMDs) with smartphones to create Augmented Reality platforms. The differences between these types of platforms can affect the user’s experience and satisfaction. This paper presents a study that analyses the user’s perception when using the same Augmented Reality app with two MR HMDs (a low-end and a high-end one). Our study evaluates the user’s experience taking into account several factors (control, sensory, distraction, ergonomics and realism). An Augmented Reality app was developed to carry out the comparison for the two MR HMDs. The application had exactly the same visual appearance and functionality on both devices. Forty adults participated in our study. From the results, there were no statistically significant differences in the users’ experience for the different factors when using the two MR HMDs, except for the ergonomic factors, which favoured the high-end MR HMD. Even though the scores for the high-end MR HMD were higher in nearly all of the questions, both MR HMDs provided a very satisfying viewing experience with very high scores. The results were independent of gender and age. The participants rated the high-end MR HMD as the best one. Nevertheless, when asked which MR HMD they would buy, the participants chose the low-end MR HMD, taking its price into account.
(This article belongs to the Special Issue Advances in Mobile Augmented Reality)

Open Access Feature Paper Article: 6DoF Object Tracking based on 3D Scans for Augmented Reality Remote Live Support
Received: 30 November 2017 / Revised: 22 December 2017 / Accepted: 29 December 2017 / Published: 2 January 2018
Abstract
Tracking the 6DoF pose of arbitrary 3D objects is a fundamental topic in Augmented Reality (AR) research, having received a large amount of interest in recent decades. The necessity of accurate and computationally efficient object tracking is evident for a broad base of today’s AR applications. In this work, we present a fully comprehensive pipeline for 6DoF object tracking based on 3D scans of objects, covering object registration, initialization and frame-to-frame tracking, implemented to optimize the user experience and to perform well in all typical challenging conditions, such as fast motion, occlusions and illumination changes. Furthermore, we present the deployment of our tracking system in a Remote Live Support AR application with 3D object-aware registration of annotations and remote execution for delay and performance optimization. Experimental results demonstrate the tracking quality, real-time capability and the advantages of remote execution for computationally less powerful mobile devices.
(This article belongs to the Special Issue Advances in Mobile Augmented Reality)

Open Access Article: DARGS: Dynamic AR Guiding System for Indoor Environments
Received: 21 November 2017 / Revised: 14 December 2017 / Accepted: 24 December 2017 / Published: 28 December 2017
Abstract
Complex public buildings, such as airports, use various systems to guide people to a certain destination. Such approaches are usually implemented by showing a floor plan with guiding signs or color-coded lines on the floor. With a technology that supports six-degrees-of-freedom (6DoF) tracking in indoor environments, it is possible to guide people individually, thereby considering obstacles, path lengths, or pathways for handicapped people. With an augmented reality (AR) device, such as a smartphone or AR glasses, the path can be presented on top of the real environment. In this paper, we present DARGS, an algorithm that calculates a path through a complex building in real time. Typical path-planning algorithms use either shortest paths or dynamic paths for robot interaction; the human factor in a real environment is not considered. The main advantage of DARGS is the incorporation of the current field of view (FOV) of the device used in order to create a more dynamic presentation. Rather than having to search for the AR content with a small FOV, with the presented approach the user always gets a meaningful three-dimensional overlay of the path independent of the viewing direction. A detailed user study was performed to prove the applicability of the system. The results indicate that the presented system is especially helpful in the first few important seconds of the guiding process, when the user is still disoriented.
(This article belongs to the Special Issue Advances in Mobile Augmented Reality)

Review


Open Access Review: Recommendations for Integrating a P300-Based Brain Computer Interface in Virtual Reality Environments for Gaming
Received: 8 March 2018 / Revised: 11 May 2018 / Accepted: 18 May 2018 / Published: 28 May 2018
Abstract
The integration of a P300-based brain–computer interface (BCI) into virtual reality (VR) environments is promising for the video games industry. However, it faces several limitations, mainly due to hardware constraints and constraints engendered by the stimulation needed by the BCI. The main limitation is still the low transfer rate that can be achieved with current BCI technology. The goal of this paper is to review current limitations and to provide application creators with design recommendations to overcome them. We also give an overview of current VR and BCI commercial products in relation to the design of video games. An essential recommendation is to use the BCI only for non-complex and non-critical tasks in a game. Also, the BCI should be used to control actions that are naturally integrated into the virtual world. Finally, adventure and simulation games, especially if cooperative (multi-user), appear to be the best candidates for designing an effective VR game enriched by BCI technology.
(This article belongs to the Special Issue Advances in Mobile Augmented Reality)
