Topic Editors

Prof. Dr. Andrea Sanna
Dipartimento di Automatica e Informatica, Politecnico di Torino, C.so Duca degli Abruzzi 24, I-10129 Torino, Italy
Dr. Federico Manuri
Dipartimento di Automatica e Informatica, Politecnico di Torino, c.so Duca degli Abruzzi 24, I-10129 Torino, Italy
Dr. Francesco De Pace
Dipartimento di Automatica e Informatica, Politecnico di Torino, c.so Duca degli Abruzzi 24, I-10129 Torino, Italy

Augmented and Mixed Reality

Abstract submission deadline
closed (31 March 2022)
Manuscript submission deadline
closed (31 May 2022)

Topic Information

Dear Colleagues,

The aim of this article collection is to contribute to the advancement of the augmented and mixed reality fields and their impact on areas such as industry, entertainment, tourism, medicine, and architecture.

Augmented and mixed reality technologies allow researchers and practitioners to design and develop applications that augment the real world with computer-generated assets. Different paradigms can be employed: hand-held, see-through, and projected. These paradigms are supported by different devices, such as smartphones, tablets, headsets, and projectors. Although a large body of work has already been carried out and many scientific publications exist in the literature, several open problems and exciting challenges still need to be addressed.

AR and MR technologies are growing very quickly and are now part of our everyday life. Technological improvements in smartphones and tablets have boosted the spread of applications able to augment the real world; moreover, AR is one of the pillars of Industry 4.0, and new interfaces supporting human–machine and human–robot interaction could strongly benefit from augmentation. Furthermore, the integration of AR/MR technologies with other input interfaces (speech, gesture, gaze, brain, etc.) can open new scenarios, allowing users to take advantage of novel interaction paradigms.

Despite all of the exciting opportunities related to AR/MR, several problems still need to be tackled. Content generation, tracking systems, limited field of view, usability, technology acceptance, and affordability are just some of the issues to be addressed. In this context, this Topic offers a framework for integrating augmented and mixed reality with other technologies, providing technological solutions, proposing new application areas, and defining new evaluation approaches. Topics of interest include, but are not limited to:

General topics:

Indoor and outdoor tracking systems;

Content generation for AR/MR;

AR/MR for industry;

AR/MR for medicine;

AR/MR for entertainment and tourism;

AR/MR for broadcasting;

AR/MR for architecture;

AR/MR for cultural heritage;

AR/MR for the military;

AR/MR for education and training.

Particular themes:

Augmented reality solutions to support assembly, repair and maintenance tasks;

Design of adaptive AR interfaces;

Human–robot interaction supported by AR;

Integration of AR/MR with other input technologies (e.g., speech, gaze, brain, gesture);

Augmented reality-based games;

Shared augmented and virtual environments;

Application of AR to support learning;

Augmented reality in surgery;

Advanced technologies in AR;

Combining artificial intelligence and augmented reality;

Augmented reality in advertising, shopping and e-commerce;

Augmented reality frameworks/libraries;

Augmented reality for situational awareness: tactical augmented reality;

Augmented reality in sport apps;

Augmented reality as a tool to foster multi-user collaboration.

Prof. Dr. Andrea Sanna
Dr. Federico Manuri
Dr. Francesco De Pace
Topic Editors

Keywords

  • augmented reality
  • mixed reality
  • tracking systems
  • Industry 4.0
  • augmented reality smart glasses
  • artificial intelligence
  • computer vision
  • data visualization
  • education

Participating Journals

Journal Name | Impact Factor | CiteScore | Launched Year | First Decision (median) | APC
Sensors (sensors) | 3.847 | 6.4 | 2001 | 17.4 days | 2400 CHF
Future Internet (futureinternet) | - | 5.4 | 2009 | 12.7 days | 1400 CHF
Information (information) | - | 4.2 | 2010 | 18.4 days | 1400 CHF
Journal of Imaging (jimaging) | - | 4.8 | 2015 | 22 days | 1600 CHF
Multimodal Technologies and Interaction (mti) | - | 4.5 | 2017 | 21 days | 1600 CHF

Published Papers (6 papers)

Article
Practical Application of Augmented/Mixed Reality Technologies in Surgery of Abdominal Cancer Patients
J. Imaging 2022, 8(7), 183; https://doi.org/10.3390/jimaging8070183 - 30 Jun 2022
Abstract
The technology of augmented and mixed reality (AR/MR) is useful in various areas of modern surgery. We considered the use of augmented and mixed reality technologies as a method of preoperative planning and intraoperative navigation in abdominal cancer patients. Practical use of AR/MR raises a range of questions that demand suitable solutions. The difficulties and obstacles we encountered in the practical use of AR/MR are presented, along with the ways we chose to overcome them. The most demonstrative case is covered in detail. The three-dimensional anatomical model obtained from the CT scan needed to be rigidly attached to the patient’s body; therefore, an invasive approach was developed, using an orthopedic pin fixed to the pelvic bones. The pin is used both as an X-ray contrast marker and as a marker for augmented reality. This solution made it possible not only to visualize the anatomical structures of the patient and the border zone of the tumor, but also to change the position of the patient during the operation. In addition, a noninvasive (skin-based) marking method was developed that allows the application of mixed and augmented reality during the operation. Both techniques were used (8 clinical cases) for preoperative planning and intraoperative navigation, which allowed surgeons to verify the radicality of the operation, to maintain visual control of all anatomical structures near the zone of interest, and to reduce the time of surgical intervention, thereby reducing the complication rate and improving rehabilitation.
(This article belongs to the Topic Augmented and Mixed Reality)
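The registration step described above, rigidly attaching a CT-derived model to the patient via a tracked marker, amounts to chaining two rigid transforms. The paper's code is not published; the following is a minimal, self-contained sketch of that general idea, with all function names, frames, and numbers hypothetical:

```python
import numpy as np

def pose_matrix(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def register_model(points_model, T_marker_world, T_model_marker):
    """Map CT-model points (N, 3) into world coordinates by chaining the
    model-to-marker calibration with the tracked marker-to-world pose."""
    T = T_marker_world @ T_model_marker  # model frame -> world frame
    homog = np.hstack([points_model, np.ones((len(points_model), 1))])
    return (T @ homog.T).T[:, :3]

# Hypothetical example: marker translated 0.5 m along x, model aligned with marker.
T_marker_world = pose_matrix(np.eye(3), np.array([0.5, 0.0, 0.0]))
T_model_marker = pose_matrix(np.eye(3), np.zeros(3))
pts = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0]])
print(register_model(pts, T_marker_world, T_model_marker))
```

Because the pin is rigidly fixed to bone, re-reading the marker pose after the patient moves and re-applying the same chain keeps the overlay aligned.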

Article
A Multi-User Collaborative AR System for Industrial Applications
Sensors 2022, 22(4), 1319; https://doi.org/10.3390/s22041319 - 09 Feb 2022
Abstract
Augmented reality (AR) applications are increasingly used in various fields (e.g., design, maintenance, assembly, repair, and training), as AR techniques help improve efficiency and reduce costs. Moreover, collaborative AR systems extend this applicability by providing collaborative environments for different roles. In this paper, we propose a multi-user collaborative AR system (the “multi-user collaborative system”, or MUCSys) composed of three ends: MUCStudio, MUCView, and MUCServer. MUCStudio constructs industrial content through CAD model transformation, simplification, database updates, marker design, scene editing, and exportation, while MUCView handles sensor data analysis, real-time localization, scene loading, annotation editing, and virtual–real rendering. MUCServer, the bridge between MUCStudio and MUCView, provides collaboration and database services. To achieve this, we implemented algorithms for local map establishment, global map registration, optimization, and network synchronization. The system provides AR services for diverse industrial processes via three collaboration modes: remote support, collaborative annotation, and editing. Using the system, applications for cutting machines were presented to improve efficiency and reduce costs, covering cutting head design, production line sales, and cutting machine inspection. Finally, a user study was performed to evaluate the user experience of the system.
(This article belongs to the Topic Augmented and Mixed Reality)
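The server's role as a bridge relaying state between collaborating clients can be illustrated with a toy in-memory stand-in. This is not the actual MUCServer protocol, which the abstract does not specify; the class, message fields, and client names below are all hypothetical:

```python
from dataclasses import dataclass

@dataclass
class PoseUpdate:
    client_id: str   # who sent the update
    object_id: str   # which shared object/annotation changed
    position: tuple  # (x, y, z) in the shared global map frame

class SyncServer:
    """Toy relay: forwards each published update to every other connected client."""
    def __init__(self):
        self.clients = {}  # client_id -> inbox (list of pending updates)

    def connect(self, client_id):
        self.clients[client_id] = []

    def publish(self, update):
        for cid, inbox in self.clients.items():
            if cid != update.client_id:  # do not echo back to the sender
                inbox.append(update)

server = SyncServer()
server.connect("studio")
server.connect("viewer")
server.publish(PoseUpdate("studio", "valve-annotation", (1.0, 2.0, 0.5)))
print(len(server.clients["viewer"]))  # the viewer received the annotation update
```

A real deployment would replace the in-memory inboxes with network sockets and persist the shared state in the database the abstract mentions.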

Article
Multisensory Testing Framework for Advanced Driver Assistant Systems Supported by High-Quality 3D Simulation
Sensors 2021, 21(24), 8458; https://doi.org/10.3390/s21248458 - 18 Dec 2021
Abstract
ADAS and autonomous vehicle technologies are becoming increasingly complex, which increases development time and expense. This paper presents a new real-time ADAS multisensory validation system that can speed up development and implementation while lowering cost. The proposed test system integrates the high-quality 3D CARLA simulator with a real-time automation platform. We present experimental verifications of the system on several types of sensors and testing system architectures. The first, open-loop experiment demonstrates the real-time capabilities of the system based on Mobileye 6 camera detections. The second experiment runs a real-time closed-loop test of a lane-keeping algorithm (LKA) based on Mobileye 6 line detection. The last experiment simulates a Velodyne VLP-16 lidar running a free-space detection algorithm; the simulated lidar output is compared with the real lidar's performance. We show that the platform generates reproducible results and allows closed-loop operation, which, combined with real-time collection of event information, promises good scalability toward testing complex ADAS or autonomous functionalities.
(This article belongs to the Topic Augmented and Mixed Reality)
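The closed-loop LKA experiment can be pictured as a controller consuming the line-detection output and steering against it. The paper's actual control law is not given in the abstract; below is a deliberately simple proportional sketch with a toy kinematic model, where every gain and variable name is an assumption:

```python
def lane_keeping_steer(lateral_offset_m, heading_error_rad, k_p=0.4, k_h=1.2):
    """Toy proportional lane-keeping law: steer against the lateral offset and
    heading error reported by a line-detection sensor (left offset positive)."""
    return -(k_p * lateral_offset_m + k_h * heading_error_rad)

# Minimal closed-loop check: a kinematic toy model in which the steering command
# directly changes the heading rate; the lateral offset should shrink over time.
offset, heading, dt, speed = 1.0, 0.0, 0.1, 10.0
for _ in range(100):
    heading += lane_keeping_steer(offset, heading) * dt
    offset += speed * heading * dt  # small-angle lateral drift
print(round(abs(offset), 3))
```

In the paper's setup the sensor side of this loop is the Mobileye 6 line detection and the plant is the CARLA vehicle model, not the toy integrator used here.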

Article
An Augmented Reality Periscope for Submarines with Extended Visual Classification
Sensors 2021, 21(22), 7624; https://doi.org/10.3390/s21227624 - 17 Nov 2021
Cited by 1
Abstract
Submarines are considered extremely strategic for any navy due to their stealth capability. Periscopes are crucial sensors for these vessels, and rising to the surface or to periscope depth is required to identify visual contacts through this device. This maneuver involves many procedures and usually has to be fast and agile to avoid exposure. This paper presents and implements a novel architecture for real submarine periscopes developed for future Brazilian naval fleet operations. Our system consists of a probe, connected to the craft, that carries a 360° camera. The images are projected and viewed inside the vessel using traditional VR/XR devices. We also propose and implement an efficient computer vision-based MR technique to estimate and display detected vessels effectively and precisely. The vessel detection model is trained on synthetic images; to this end, we built and made available a dataset of 99,000 images. Finally, we also estimate the distances of the classified elements, showing all of the information in an AR-based interface. Although the probe is wire-connected, it allows the vessel to remain at depth, reducing its exposure and introducing a new way of conducting submarine maneuvers and operations. We validate our proposal through a user experience experiment with 19 experts in periscope operations.
(This article belongs to the Topic Augmented and Mixed Reality)
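One common way to estimate the range of a classified contact from a single camera, not necessarily the method this paper uses, is the pinhole relation between the object's known real-world size and its size in pixels. A minimal sketch, with all numbers hypothetical:

```python
def estimate_distance(real_height_m, pixel_height, focal_length_px):
    """Pinhole-camera range estimate: distance = f * H / h, where f is the
    focal length in pixels, H the object's real height, h its image height."""
    return focal_length_px * real_height_m / pixel_height

# Hypothetical numbers: a 10 m tall superstructure imaged 50 px tall by a
# camera with a 1000 px focal length sits roughly 200 m away.
print(estimate_distance(10.0, 50, 1000))  # 200.0
```

The estimate degrades as the pixel height shrinks, so far-off contacts carry larger range uncertainty; classification (knowing the vessel type, hence its typical size) is what makes the known-height assumption workable.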

Article
CultReal—A Rapid Development Platform for AR Cultural Spaces, with Fused Localization
Sensors 2021, 21(19), 6618; https://doi.org/10.3390/s21196618 - 05 Oct 2021
Cited by 3
Abstract
Virtual and augmented reality technologies have seen impressive market growth due to their potential to provide immersive experiences. However, they still struggle to enable fully fledged, consumer-ready applications that can handle complex tasks such as multi-user collaboration or time-persistent experiences. In this context, CultReal is a rapid creation and deployment platform for augmented reality (AR) that aims to revitalize cultural spaces. The platform’s content management system stores a representation of the environment, together with a database of multimedia objects that can be associated with a location. The localization component fuses data from beacons and video cameras, providing an accurate estimate of the position and orientation of the visitor’s smartphone. A mobile application running the localization component displays the augmented content, seamlessly integrated with the real world. The paper focuses on the series of steps required to compute the position and orientation of the user’s mobile device, providing a comprehensive evaluation on both virtual and real data. Pilot implementations of the system are also described, revealing the platform’s potential for rapid deployment in new cultural spaces. With these functionalities, CultReal allows for the fast development of AR solutions in any location.
(This article belongs to the Topic Augmented and Mixed Reality)
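Fusing a coarse beacon fix with a precise visual fix, as the localization component does, is classically handled by weighting each source by its confidence. The paper's actual fusion pipeline is not described in the abstract; here is a minimal inverse-variance-weighting sketch with hypothetical numbers:

```python
def fuse(est_beacon, var_beacon, est_camera, var_camera):
    """Inverse-variance weighted fusion of two independent position estimates,
    applied per axis: the lower-variance source dominates the result."""
    w_b = 1.0 / var_beacon
    w_c = 1.0 / var_camera
    return [(w_b * b + w_c * c) / (w_b + w_c) for b, c in zip(est_beacon, est_camera)]

# Hypothetical 2D fix: coarse beacon estimate (variance 4.0 m^2) pulled toward
# the much more precise visual estimate (variance 0.25 m^2).
print(fuse([10.0, 2.0], 4.0, [9.0, 1.5], 0.25))
```

A full system would track orientation as well and propagate these weights over time (e.g., with a Kalman filter), but the same confidence-weighting intuition applies.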

Article
Programming Robots by Demonstration Using Augmented Reality
Sensors 2021, 21(17), 5976; https://doi.org/10.3390/s21175976 - 06 Sep 2021
Cited by 1
Abstract
The world is experiencing the fourth industrial revolution, marked by the increasing intelligence and automation of manufacturing systems. Nevertheless, some tasks are too complex or too expensive to be fully automated; it would be more efficient if machines were able to work with humans, not only by sharing the same workspace but also as useful collaborators. A possible solution to this problem lies in human–robot interaction systems, and in understanding the applications where they can be helpful and the challenges they face. This work proposes the development of an industrial prototype of a human–machine interaction system based on augmented reality, whose objective is to enable an industrial operator without any programming experience to program a robot. The system is divided into two parts: the tracking system, which records the operator’s hand movement, and the translator system, which writes the program to be sent to the robot that will execute the task. To demonstrate the concept, the user drew geometric figures, and the robot was able to replicate the recorded path of the operator.
(This article belongs to the Topic Augmented and Mixed Reality)
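The translator system's job, turning a recorded hand path into robot instructions, can be sketched as downsampling the waypoints and emitting one move command per kept point. The command name `MOVL` and the distance threshold below are hypothetical, not the paper's actual robot language:

```python
def demonstration_to_program(waypoints, min_step=0.05):
    """Translate a recorded hand path into linear-move commands, dropping
    points closer than min_step (metres) to the last kept waypoint."""
    program, last = [], None
    for p in waypoints:
        if last is None or sum((a - b) ** 2 for a, b in zip(p, last)) ** 0.5 >= min_step:
            program.append(f"MOVL {p[0]:.3f} {p[1]:.3f} {p[2]:.3f}")
            last = p
    return program

# Hypothetical recorded path (metres): jittery near the start, then two strokes.
path = [(0.0, 0.0, 0.0), (0.01, 0.0, 0.0), (0.10, 0.0, 0.0), (0.10, 0.10, 0.0)]
for cmd in demonstration_to_program(path):
    print(cmd)
```

Filtering out sub-threshold jitter is what lets a noisy hand trace replay as a clean robot path; a production translator would also handle orientation and speed, which this sketch omits.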
