Systematic Review

Systematic Review of Multimodal Human–Computer Interaction

by Jose Daniel Azofeifa 1,*, Julieta Noguez 1, Sergio Ruiz 1, José Martín Molina-Espinosa 1, Alejandra J. Magana 2 and Bedrich Benes 3

1 Department of Engineering and Sciences, Tecnologico de Monterrey, Monterrey 64849, NL, Mexico
2 Department of Computer and Information Technology, Purdue University, West Lafayette, IN 47907, USA
3 Department of Computer Science, Purdue University, West Lafayette, IN 47907, USA
* Author to whom correspondence should be addressed.
Informatics 2022, 9(1), 13; https://doi.org/10.3390/informatics9010013
Submission received: 13 November 2021 / Revised: 6 February 2022 / Accepted: 10 February 2022 / Published: 15 February 2022
(This article belongs to the Section Human-Computer Interaction)

Abstract:
This document presents a systematic review of multimodal human–computer interaction (HCI). It shows how different types of interaction technologies (virtual reality (VR) and augmented reality (AR), force and vibration feedback devices (haptics), and tracking) are used in different domains (concepts, medicine, physics, human factors/user experience design, transportation, cultural heritage, and industry). A systematic literature search initially identified 406 articles, from which we selected the 112 research works most relevant to the content of this article. The selected articles were analyzed in depth with respect to temporal patterns, the frequency with which each technology type is used in each domain, and cluster analysis. The analysis allowed us to answer questions relevant to the next steps in multimodal HCI work: what the typical technology type is in each domain, how technology types and publication frequency have changed over time in each domain, and how papers group across metrics given their similarities. The analysis shows that VR and haptics are the most widely used technologies across all domains. While VR is used the most, haptic interaction appears in a growing number of applications, suggesting future work on applications that combine VR and haptics.

1. Introduction

We present a systematic review of multimodal human–computer interaction (HCI) with the primary objective of showing how different types of technologies are used in different subject areas, herein called domains. Various domain-specific surveys and reviews have been published recently. In particular, studies have focused on VR in manufacturing [1,2], VR in education [3], haptic interaction in medicine [4], orthopedic surgery [5], medical training [6], wearable haptics [7], VR systems in museums [8], and cultural heritage [9].
We are not aware of a systematic review covering multiple technologies used across different application areas. Most of the above-mentioned studies analyze a small number of domains and technology types in depth (e.g., VR in manufacturing, haptics in medicine). There is a need for a more extensive study that covers various domains and, rather than providing a detailed review of each, paints a larger horizontal picture. A critical research endeavor is therefore to identify combinations of multiple domains and various technologies for the development and research of HCI. To make advances toward closing this gap, this study addresses the research questions described in Section 2.
Multimodal interfaces offer efficient and expressive human–computer interaction. The term “multimodal” refers to the combination of human perception channels (vision, touch, hearing, taste, and smell) to involve various human abilities (communication, cognition, and perception) and thereby improve the user’s understanding of what is being presented computationally [10]. This is achieved by including sensory technologies such as haptic displays, virtual reality, and augmented reality. This systematic review covers multimodal human–computer interactions based on technologies such as haptic displays, virtual reality, or augmented reality, and on devices that allow specific one-directional and bidirectional interactions. The review also provides a basis for guiding researchers toward the possible intersections between the domains and technologies discussed in this work, and it traces how research and technology development in this space has unfolded over the past few years. The primary contribution of this study is an overview of what has been researched and developed to date, which serves as a guide for identifying and developing future research on multimodal interactions in several domains.

1.1. Types of Technologies

We base our categorization of displays and interactions on the previous works of Anthes et al. [11] and Hornbæk et al. [12], complemented by the information provided by the works reviewed for this systematic review. This survey focuses on various types of displays (visual 2D, immersive (VR), augmented, and haptic), types of interaction (touch, vibration, wind, temperature, audio, and gizmo), and types of tracking (object, eye, hand, head, and body). We do not focus on the type of graphics primitives and underlying data (points, meshes, voxels, etc.), on whether the data are temporal, or on the kind of visualization used; for these topics, we refer the reader to the surveys [13,14,15]. Despite recent advances in computer hardware, this survey also does not focus on the specific hardware used. The following sections describe the types of technologies that are the focus of our research in order to define the scope of our study.
Displays are devices that convey information to the user. They can be categorized as dimensionless (e.g., audio), 1D (e.g., displays for messages in the Braille alphabet), 2D, and 3D, the last of which are also called virtual reality (VR) or immersive displays [16]. While 2D displays do not require any knowledge of the user’s position, immersive systems need to track the user’s viewing position and direction to synchronize the display with the motion.
Augmented reality (AR) enables an interactive real-world experience enhanced with perceptual computer-generated information [17], allowing users to combine the real world with various kinds of computer-generated content [18].
Haptic displays convey the sense of touch through force-feedback devices, which generate forces that allow the user to apply pressure to explore virtual objects [19]. This type of display provides kinesthetic and tactile information about the virtual environment via sensors, control circuits, and actuators that vibrate or exert force [20]. For example, a haptic display can make buttons that trigger dangerous actions within a virtual environment deliberately harder to press [19].
Interactions include one-directional communication with the computer device, such as using a mouse or touch-based devices that scan the applied pressure.
Bidirectional interactions include devices that provide vibration, haptic devices that apply forces, and even wind and temperature.
A special kind of interaction uses various devices that we call gizmos. Some authors define a gizmo as a mechanical device or gadget that is used to perform a mechanical procedure [21]. These interactions rely on ad hoc hardware components, i.e., controls or command devices, to interact with the system.
The last kind of interaction is tracking, which monitors objects, humans, or parts of human bodies in space and encodes this information into electric signals that are interpreted by the computer. This systematic review considers the tracking of objects as well as of the eyes, hands, head, and body.
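The three facets above (display, interaction, and tracking) are the axes along which each reviewed paper is coded in the data tables of Section 3. As a minimal sketch of such a coding scheme (our illustration, not part of the original review protocol), a paper record could be represented as follows:

```python
from dataclasses import dataclass, field

# Controlled vocabularies taken from the facets listed in Section 1.1.
DISPLAYS = {"2D", "VR", "AR", "Haptic"}
INTERACTIONS = {"Touch", "Vibration", "Wind", "Temperature", "Audio", "Gizmo", "Tracking"}
TRACKING = {"Object", "Eye", "Hand", "Head", "Body"}

@dataclass
class PaperRecord:
    """One coded entry, mirroring the Display/Interaction/Tracking table columns."""
    authors_year: str
    displays: set = field(default_factory=set)      # subset of DISPLAYS; empty means NA
    interactions: set = field(default_factory=set)  # subset of INTERACTIONS
    tracking: set = field(default_factory=set)      # subset of TRACKING
    application: str = ""

# Example: the entry for Achibet et al. (2014) [36] from Table 2.
achibet = PaperRecord(
    authors_year="Achibet et al. (2014) [36]",
    displays={"Haptic", "VR"},
    interactions={"Gizmo", "Tracking"},
    tracking={"Hand"},
    application="Visuo-haptic grip force system",
)
```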

1.2. Domains

We based the definition of domains on the subject areas proposed by Freina and Ott [3] (e.g., medicine, physics, computer science, social science, materials science, and engineering), Vera-Baceta et al. [22] (e.g., arts and humanities, life sciences and biomedicine, physical sciences, social sciences, and technology), Garcia et al. [23] (e.g., arts and humanities, computer science, medicine, physics and astronomy, and social sciences), and on the information provided by the research publications analyzed for this systematic review. We link the use and the advances of these categories to the following domains: Concepts and overviews, Medicine, Physics, Transportation, Cultural heritage, Industry, and Human factors/User experience design (UX). It is important to mention that UX was added to the study because it can be viewed as transversal to all disciplines, and we consider that multimodal interaction enriches and improves user experience. The selection of the domains is the result of a careful analysis of a vast body of papers and is detailed in Section 2. This survey shows how each discipline utilizes various modalities and inspires future work.

2. Methods

This systematic review follows the structure and methods of the guidelines for performing systematic literature reviews described by Page et al. [24], Kitchenham and Charters [25], Xiao and Watson [26], and Torres-Carrión et al. [27]. We developed a review protocol to guide our research. In particular, the search strategy was conducted using the following steps: (1) definition of research questions; (2) search method for the identification of studies; (3) quality assessment; (4) paper inclusion and exclusion criteria; (5) data collection; and (6) data analysis, first presented as a synthesis of the manuscripts identified, followed by the responses to the research questions.

2.1. Research Questions

In order to guide our study towards the objective of determining the effort made in multimodal interaction and the next steps in multimodal HCI work, we identified the following four research questions:
  • RQ1: How has the type of technology changed over time in each domain?
  • RQ2: What is the typical technology type by domain?
  • RQ3: How has the frequency of research publications changed over time by domain?
  • RQ4: How are research publications grouped across metrics given their similarities?

2.2. Keyword Identification

We first identified keywords related to human–computer interaction, virtual reality, augmented reality, and haptic devices.
First, the selection process was based on general keywords related to the scope of this work, such as human–computer interaction, virtual reality, augmented reality, and haptics. Then, combinations of these words were used to find results in which more than one of the terms appeared together (e.g., haptic virtual reality, human–computer interaction in virtual reality). Finally, the keywords were combined with each domain name and with terms directly related to that domain to identify works in each specific domain (e.g., virtual reality in transport, virtual reality haptics in museums, human–computer interaction in manufacturing). Table 1 shows the keywords used to construct the query combinations; a sketch of this combination step appears below.
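The combination step can be mechanized. The following sketch is purely illustrative: the keyword lists are abbreviated stand-ins for the full lists in Table 1, and the exact query syntax depended on the search engine used:

```python
from itertools import combinations

# Abbreviated, illustrative subsets of the keywords in Table 1.
GENERAL = ["human-computer interaction", "virtual reality", "augmented reality", "haptics"]
DOMAIN_TERMS = {
    "transportation": ["transport", "driving", "flight"],
    "cultural heritage": ["museum", "archaeology"],
    "industry": ["manufacturing", "assembly"],
}

queries = []
# Stage 1: single general keywords.
queries += GENERAL
# Stage 2: pairwise combinations of general keywords.
queries += [" ".join(pair) for pair in combinations(GENERAL, 2)]
# Stage 3: general keywords paired with each domain name and its related terms.
for domain, terms in DOMAIN_TERMS.items():
    for kw in GENERAL:
        queries.append(f"{kw} in {domain}")
        queries += [f"{kw} in {term}" for term in terms]

print(len(queries), "queries, e.g.:", queries[-1])
```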

2.3. Study Identification Search Method

We searched for research works in which multimodal human–computer interactions were presented based on the use of haptic displays, virtual reality, or augmented reality. We used the Purdue University Library, which provides access to 676 databases and to most existing indexing sites and publishers, to identify studies relevant to the objective of this systematic review. We also used Google Scholar to find related studies.

2.4. Quality Assessment

The Purdue University Library and Google Scholar provide journal and conference ratings (their h-index). Based on these ratings, we identified research works published during the last eight years, up to July 2021, in the 20 top-ranked conferences and journals in computer graphics, HCI, haptics, and VR/AR that are related to multimodal human–computer interactions using haptic displays, VR/AR, and devices that allow specific one-directional and bidirectional interactions, possibly including tracking.

2.5. Data Collection

We focused on the types of interactions depending on the display type used in each domain. In particular, we reviewed the following: IEEE Transactions on Visualization and Computer Graphics; IEEE Transactions on Haptics; ACM CHI; IEEE Computer Graphics and Applications; IEEE Symposium on Visual Analytics Science and Technology; Joint EuroHaptics Conference; Visualization and Data Analysis; ACM Transactions on Graphics; Virtual Reality; HCI International; ACM UIST; IEEE Virtual Reality Conference; International Conference on Haptics: Perception, Devices, Mobility, and Communication; Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems; and ACM TOCHI, among other related conferences and journals. We found an initial set of 406 articles. After an exhaustive review, 112 articles were selected; we kept only the documents with the most relevant content and characteristics for analysis (see Figure 1).

2.6. Inclusion and Exclusion Criteria

While subjective, the key criteria for inclusion were user interaction and display technology. Special attention was paid to works on multimodal human–computer interactions based on the use of haptic displays, virtual reality, or augmented reality, and on the use of devices that allow specific one-directional and bidirectional interactions, possibly including tracking.
The exclusion process began with an identification phase in which terms related to multimodal human–computer interaction were searched, based on the keywords of human–computer interaction, VR/AR, and haptics, in the databases following the parameters mentioned in Section 2.4. We excluded works dealing with basic computer graphics algorithms, as well as works dealing with advances in computer hardware. Duplicates were eliminated, as were works not contributing to or not aligned with the scope of this work. Then, we read all works that still presented a possible contribution to the objective of this study; several were excluded because they lacked clear information or were unrelated to the scope of the work. Finally, the remaining works were reread and analyzed in depth, and those that would not provide as much input as expected were also excluded. In the end, 112 works were selected.
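For illustration, this staged exclusion can be viewed as a sequence of filters over the initial set. The sketch below is hypothetical: the predicate functions and field names stand in for the manual judgments described above and are not an executable substitute for them:

```python
def screen(papers, stages):
    """Apply screening stages in order, recording how many papers survive each one."""
    counts = [("identified", len(papers))]
    for name, keep in stages:
        papers = [p for p in papers if keep(p)]
        counts.append((name, len(papers)))
    return papers, counts

# Hypothetical stage predicates; in the actual review, each was a human decision.
stages = [
    ("deduplicated",       lambda p: not p["duplicate"]),
    ("in scope",           lambda p: p["in_scope"]),
    ("clear contribution", lambda p: p["contributes"]),
    ("survives reread",    lambda p: p["sufficient_input"]),
]

demo = [
    {"duplicate": False, "in_scope": True, "contributes": True, "sufficient_input": True},
    {"duplicate": True,  "in_scope": True, "contributes": True, "sufficient_input": True},
]
selected, counts = screen(demo, stages)
print(counts)  # e.g., [('identified', 2), ('deduplicated', 1), ...]
```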

2.7. Data Analysis

According to the methodology selected for our literature review [24,25,26,27], the data analysis was performed in two stages. First, all identified manuscripts were synthesized; for this process, we organized the manuscripts by domain. Then, we answered the research questions from Section 2.1.
For the synthesis of manuscripts, the initial set was classified according to the technologies and domains used. We based the subdivision into different categories on the recent survey by Freina and Ott [3], which discusses the usage of VR in education. We also used the works of Anthes et al. [11] and Hornbæk et al. [12] to define the initial interaction domains. To balance the number of papers per domain, we merged domains with very few papers (for example, geography and transportation) and split domains with too many papers. This analysis resulted in the set of domains into which all papers were classified according to our criteria.
The classification result is a map that relates the application domain to the technology used. While we could have listed the documents according to each technology, such as by display, this would result in disproportionate classes and, more importantly, would not be as interesting for people from different application areas. Therefore, we decided to use the domain as the primary classification criterion, hoping that researchers from different fields will be able to learn something about their area.
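In data terms, this classification can be read as a mapping from each domain to the set of technologies observed in it. The fragment below is a hypothetical illustration, derived by eye from the tables in Section 3.1 and not exhaustive:

```python
# Illustrative fragment of the domain -> technology map (not exhaustive).
domain_tech = {
    "Medicine":          {"Haptic", "VR", "AR", "2D"},
    "Physics":           {"Haptic", "VR", "AR", "2D"},
    "Cultural heritage": {"Haptic", "VR", "AR"},
    "Industry":          {"Haptic", "VR", "AR", "2D"},
}

def domains_using(tech):
    """Invert the map: which domains use a given technology?"""
    return [d for d, techs in domain_tech.items() if tech in techs]

print(domains_using("AR"))
```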

3. Results

This section is organized into two main parts. First, we synthesize the results of the literature review, characterizing the most relevant papers identified by following the above-specified criteria and organized by each application domain. Then, we present the analysis describing the main tendencies of the research regarding (a) the number of papers by domain and technology type published over time, (b) the number of papers in each application domain, cross-measuring them with the kinds of applications, and (c) the proximity and similarity of papers from the viewpoint of domain and type.

3.1. Synthesis

The following presents each application domain and the most relevant papers according to the above-specified criteria. In particular, we discuss the following domains: (1) Concepts and overviews, which include algorithms and methods that can be applied across different categories; (2) Medicine; (3) Physics; (4) Human factors/User experience design (UX); (5) Transportation; (6) Cultural heritage; and (7) Industry. Papers are further grouped into smaller blocks, and each block is organized chronologically in ascending order.

3.1.1. Concepts and Overviews

This section describes contemporary methods that are not domain specific and can be applied in multiple areas. It also includes reviews from all categories discussed in this paper summarized in Table 2.

Visualization

Visualization is a large domain by itself, and we refer the reader to Kehrer and Hauser [28], who presented a survey of interaction and data representation in scientific visualization. Reda et al. [29] introduced a hybrid-reality system in which they study 2D and 3D data visualization in large-scale settings using immersive technology. Olshannikova et al. [30] give an overview of methods for processing Big Data, their visualization, and their integration with AR and VR. They conclude that visualized data can significantly improve the understanding of pre-selected information and create a new interactive system for operating with the visualized objects. Moreover, human perception and cognition must be considered, and virtual and physical objects should be well integrated.

VR and AR

Freina and Ott [3] survey the use of VR in education, and the state-of-the-art report by Slater and Sanchez-Vives [31] reviews VR across domains, focusing on applications with some level of research support. Mihelj et al. [32] introduced characterizations and definitions of manipulation, interaction, and navigation in multiuser VR. Critical elements of the VR experience using a CAVE environment and a taxonomy based on the interaction (vibration, gizmo, hand, head, and body tracking) were discussed in [33]. Another work that deals with a taxonomy for VR is [11], which also focuses on the hardware and the different technical characteristics of the existing types of devices. Chavan [34] compared VR and AR in various contexts, including price, differences, similarities, and application areas, and discussed future work options, including screen resolution, eye and face tracking, and more advanced controls. The work of Rubio-Tamayo et al. [35] discusses basic concepts of VR and the relationship between the virtual and the real world and presents the ideas of representation, expressiveness, and interaction in VR.

VR and Haptics

Achibet et al. [36] introduced a virtual glove as a VR approach to the tactile experience in an immersive environment. It is a visuo-haptic system for grip force interaction; it uses pressure sensors and cameras for hand tracking, allowing interaction with virtual objects. Deng et al. reviewed the use of haptic devices and eye tracking in [37]. Pacchierotti et al. [7] discuss the taxonomy, design, and perspectives of wearable haptic systems for the hand and fingertips, reviewing such systems and the role of wearable haptics in cutaneous stimulation.

Behavioral Theories of HCI

Hekler et al. [38] explained and presented a guide for interpreting, using, and contributing to the behavioral theories of HCI. They acknowledge that this guide is superficial and that future work could go further on several topics: the best methods for evaluating behavior-change technologies in HCI research, a full understanding of the knowledge each field requires before committing to the other, the possibility of distortions arising from mistranslations of concepts between areas, and the impact of socio-cultural differences in the origin of theories on the interpretability, utility, and generalization of different behavioral approaches within an HCI context. Their ultimate goal is to call on behavioral scientists and HCI researchers to work more closely together, both in designing behavior-change technologies and in developing better theory.
Vines et al. [39] studied the concepts that should be taken into account when users participate in developing an HCI system. Their goal has been to draw attention to the plurality of participation in HCI and the problems and possibilities that this brings to future research. They seek to present a more nuanced understanding of shared control between researchers and participants. Finally, they indicate that these are exciting times because new technologies reach new audiences and new perspectives arise on what design could and should be.
A review on the introduction of hedonism in HCI was presented by Diefenbach et al. [40]. An important aspect is the conceptualization of the value of experience in terms of the product’s attributes.
Table 2. Data table of the concepts and overviews domain.

Authors/Year | Display | Interaction | Tracking | Application
Hekler et al. (2013) [38] | NA | NA | NA | Behavioral theory in HCI
Kehrer and Hauser (2013) [28] | NA | NA | NA | Survey of multifaceted scientific data visualization
Reda et al. (2013) [29] | NA | NA | NA | Hybrid-reality environments review
Vines et al. (2013) [39] | NA | NA | NA | Research of participation in HCI
Achibet et al. (2014) [36] | Haptic, VR | Gizmo, tracking | Hand | Visuo-haptic grip force system
Deng et al. (2014) [37] | Haptic, VR | NA | NA | Review of multimodality with eye tracking and haptics
Diefenbach et al. (2014) [40] | NA | NA | NA | Review of hedonism in human–computer interaction
Freina and Ott (2015) [3] | VR | NA | NA | Review of immersive VR in education
Muhanna (2015) [33] | Haptic, VR | Vibration, gizmo, tracking | Hand, head, body | CAVE system
Olshannikova et al. (2015) [30] | VR, AR | NA | NA | Overview of Big Data visualization with AR and VR
Anthes et al. (2016) [11] | VR | NA | NA | State-of-the-art of VR technology
Chavan (2016) [34] | VR, AR | NA | NA | Review with a comparison between AR and VR
Slater and Sanchez-Vives (2016) [31] | VR | NA | NA | State-of-the-art of immersive VR
Pacchierotti et al. (2017) [7] | Haptic | NA | NA | Review of haptic systems for the fingertip and hand taxonomy
Rubio-Tamayo et al. (2017) [35] | VR | NA | NA | Review of immersive environments and VR

3.1.2. Medicine

Human–computer interaction in medical applications is closely related to robotics and, compared to other areas, has specific requirements such as high precision, fast feedback, intuitiveness, and realism. Ruthenbeck and Reynolds [41] presented the state of the art in the use of VR and haptic devices for medical training, and Vaughan et al. [5] provided a review of visuo-haptic training simulators for orthopedic surgery (see the summary in Table 3).

Surgery

An early work of Talasaz and Patel [42] discusses haptic teleoperation for locating tumors in minimally invasive robot-assisted surgery. They use an on-screen display to show the location and a hand-controlled haptic interface to guide the robot’s assistance. Díaz et al. [43] presented a haptic system for surgical drilling assistance, composed of a haptic pedal and a visual interface; the system transmits vibration and audio feedback to the user during the interaction. Jeon and Harders [44] introduced a system for palpating tumors through the use of AR and haptic devices. A comparison of visuo-haptic learning environments for performing surgeries with traditional approaches was presented in the work of Esteban et al. [45]. The authors conclude that introducing the sense of touch into surgery simulators through haptic devices is an essential addition. Ruffaldi et al. [46] introduced a visuo-haptic system for performing ultrasonography by using VR and a manual haptic device together with hand and head tracking.

Training

Several works addressed simulations and training for medical purposes (see also the review [4]). One of them is the work of Fortmeier et al. [47], which presents a visuo-haptic simulation framework for needle insertion capable of simulating patients’ breathing movements. A VR training simulator for cardiac life support was studied by Khanal et al. [48]. In addition to immersive visualization, their system uses haptic devices and audio to provide timely feedback for error detection and corrective feedback for proper technique. They concluded that VR-based training could effectively complement conventional training methods. Hamza-Lup et al. [49] surveyed the use of visuo-haptic simulation in the surgical training process as well as the features, APIs, and frameworks of the haptic devices used in this type of training. They described a methodology for simulating a laparoscopic surgical procedure using a visuo-haptic interactive application. Finally, Pan et al. [50] demonstrated a VR system combined with haptic devices for laparoscopic rectal surgery.

Rehabilitation

Rose et al. reviewed VR for rehabilitation in [51]. They report that immersion improves performance and accuracy in navigation tasks but can also induce postural instability, and they mention that the potential of VR for rehabilitation is not fully explored. Andaluz et al. [52] presented a system for upper-limb rehabilitation that couples VR with a haptic device for force and vibration feedback, in addition to a method for tracking the hand and fingers. Won et al. [53] reviewed immersive VR systems for the rehabilitation of pediatric pain by categorizing the qualities and practical aspects of VR. They concluded that, beyond the applications and effectiveness of VR for pain treatment, it is necessary to understand the impact on the quality of life of pediatric patients.

Dentistry

Wang et al. [54] surveyed virtual multisensory feedback systems for dental training. They summarize the components, functions, and unique characteristics of several methods and discuss the technical challenges behind these systems. Wang et al. [55] evaluated a VR dental simulator for the drilling operation. Using haptics, audio, and VR, they proposed adding extra haptic support that simulates the fingers to perform complex tasks and improving the graphic representation of the virtual environment.
Table 3. Data table of the Medicine domain.

Authors/Year | Display | Interaction | Tracking | Application
Talasaz and Patel (2013) [42] | Haptic | Gizmo, tracking | Hand | Haptic teleoperation system
Díaz et al. (2014) [43] | Haptic | Vibration, audio, gizmo, tracking | Hand, body | Haptic system for surgery drilling assistance
Esteban et al. (2014) [45] | Haptic, VR | Gizmo, tracking | Hand, body | Visuo-haptic surgical learning environment
Jeon and Harders (2014) [44] | Haptic, AR | NA | NA | Haptic AR tumor palpation theory
Khanal et al. (2014) [48] | Haptic, VR | Audio, gizmo, tracking | Hand | Visuo-haptic cardiac life support simulator
Fortmeier et al. (2015) [47] | Haptic, VR | Gizmo, tracking | Hand | Visuo-haptic respiratory motion simulation
Pan et al. (2015) [50] | Haptic, VR | Gizmo, tracking | Hand | Visuo-haptic laparoscopic rectum surgery training
Ruffaldi et al. (2015) [46] | Haptic, VR | Gizmo, tracking | Hand, head | Visuo-haptic ultrasonography system
Ruthenbeck and Reynolds (2015) [41] | Haptic, VR | NA | NA | State of the art of VR for medical training
Wang et al. (2015) [55] | Haptic, VR | Audio, gizmo, tracking | Hand | Visuo-haptic dental simulation on drilling operation
Andaluz et al. (2016) [52] | Haptic, VR | Vibration, gizmo, tracking | Hand, head | Visuo-haptic upper limb rehabilitation system
Escobar-Castillejos et al. (2016) [4] | Haptic, VR, AR, 2D | Vibration, audio, gizmo, tracking | Object, eye, hand, head, body | Medical training simulators review
Vaughan et al. (2016) [5] | Haptic, VR | NA | NA | Review of VR training simulators for orthopedic surgery
Wang et al. (2016) [54] | Haptic, VR | NA | NA | Survey of multisensory feedback VR on a dental training system
Won et al. (2017) [53] | VR | NA | NA | Review of immersive VR for pediatric pain
Rose et al. (2018) [51] | Haptic, VR, 2D | NA | NA | Review of immersive VR for rehabilitation
Hamza-Lup et al. (2019) [49] | Haptic, VR | NA | NA | Survey of surgical training with visuo-haptic simulation

3.1.3. Physics

This area includes several works dealing with HCI systems that use physics and related fields. Kucukyilmaz et al. [56] presented an experimental study of a system that enables intuitive communication between partners during remote haptic collaboration. The results suggest that human–computer communication can be improved by adding a decision-making process in which the computer infers the intentions of the human operator and dynamically adjusts the controls of the interacting parties to ensure more intuitive interactions. Table 4 summarizes this section.

Surfaces

Donalek et al. [57] present a platform for data visualization using collaborative VR, in which a VR headset and a tracking device place the hand in the virtual environment. Kim and Kwon [58] proposed a geometry-based haptic rendering method for 2D images; their focus is on estimating the haptic information of the structure of objects contained in 2D images while preserving the image. Kokubun et al. [59] described a visuo-haptic system that represents normal and shear forces on a mobile device through pseudo-haptic interaction and a subsequent tactile interface, evoking the haptic sensation without the use of haptic devices. Nakamura and Yamamoto [60] described a prototype of a visuo-haptic system with multitouch surface interaction that uses direct electrostatic stimulation as feedback. Using haptic gloves on a multitouch screen, they report that rendering experiments for dynamic objects revealed a problem known as “object stiction”, which is exclusive to multi-touch haptic systems and is caused by the non-directional nature of electrostatic stimulation that appears when an object is pinched and dragged at the same time. Visuo-haptic simulation of friction has been studied by Yuksel et al. [61,62] and by Neri et al. [63]; visual and visuo-haptic simulations were compared to physical simulation, and the learning gain was considered.

Object Grasping

Prachyabrued and Borst [64] researched visual feedback to understand the signals that improve behavior when manipulating virtual objects with the fingers. Eight visual feedback techniques were compared and evaluated with respect to improving performance or the subjective experience; among them, audio, gizmos, and hand tracking were used. The authors concluded that future work should combine other techniques such as haptic or heuristic release feedback. Madan et al. [65] presented work toward a more in-depth understanding of recognizing patterns of collaborative haptic interaction in dyadic joint object manipulation.

Fluid Mechanics

Wang and Wang [66] proposed a hybrid model for haptic interaction with fluids based on solid–fluid interaction. In addition to evaluating the efficiency of the hybrid model, comparative experiments and result analyses are presented. The authors mention that future work should add special effects to the haptic interaction with the fluid, such as turbulence, filtration, bubbles, and acoustic phenomena, to improve telepresence.

Electromagnetism

Walsh et al. [67] compared physical simulations of systems with pulleys to visuo-haptic simulations, and the same group [68] used visuo-haptic simulations to study the learning of electric charges and magnetic fields. This topic was also considered by Shaikh et al. [69]. The authors conclude that visuo-haptic simulation has at least the same learning gain as simulation without haptic feedback.

Dynamic Systems

Amirkhani and Nahvi [70] designed and implemented an interactive visuo-haptic laboratory for students to experience the theory discussed in class. They discovered that the interactive virtual laboratory compensates for the lack of a real laboratory and note that more experiments involving dynamics and vibration could be included.

Astrophysics

Another example of the use of simulations in education is the work of Lindgren et al. [71], which studied gravity and planetary movement in an immersive, interactive mixed-reality simulation. They compared learning and perceptions about science against students who used a desktop version of the same simulation, concluding that the students who used the immersive interactive full-body simulation showed higher learning gains and more positive attitudes towards the simulation experience and the learning environment.

Molecular Physics

Edwards et al. [72] showcase an immersive visuo-haptic system for learning organic chemistry. In particular, learners manipulate hydrocarbon molecules by using vibrations, a haptic glove with hand tracking, and a VR headset with head tracking. They show how an immersive learning experience integrates several learning approaches, including multimedia and multisensory instruction.
Table 4. Data table of the Physics domain.

Authors/Year | Display | Interaction | Tracking | Application
Kucukyilmaz et al. (2013) [56] | Haptic, VR | Vibration, gizmo, tracking | Hand | Visuo-haptic shared ball board game
Donalek et al. (2014) [57] | VR | Gizmo, tracking | Hand, head | VR data visualization
Kim and Kwon (2014) [58] | Haptic, 2D | Gizmo, tracking | Hand | Haptic interaction with 2D images
Kokubun et al. (2014) [59] | Haptic, VR | Touchpad, tracking | Hand | Haptic stiffness on touchscreen
Nakamura and Yamamoto (2014) [60] | Haptic, VR | Gizmo, tracking | Hand | Visuo-haptic multi-finger surface system
Prachyabrued and Borst (2014) [64] | VR | Audio, gizmo, tracking | Hand | VR hand grasping research
Wang and Wang (2014) [66] | Haptic, VR | Gizmo, tracking | Hand | Visuo-haptic rowing canoe simulation
Madan et al. (2015) [65] | Haptic, 2D | Gizmo, tracking | Hand | Visuo-haptic collaborative transport in a maze
Amirkhani and Nahvi (2016) [70] | Haptic, VR | Gizmo, tracking | Hand | Visuo-haptic physics experiments
Lindgren et al. (2016) [71] | VR, AR | Gizmo, tracking | Body | Planetary motion system in mixed reality
Magana et al. (2017) [68] | Haptic, VR, 2D | Vibration, gizmo, tracking | Hand | Visuo-haptic learning of electricity and magnetism
Shaikh et al. (2017) [69] | Haptic, VR | Gizmo, tracking | Hand | Visuo-haptic learning of electricity and magnetism
Yuksel et al. (2017) [61] | Haptic, VR | Gizmo, tracking | Hand | Visuo-haptic learning of the concept of friction
Edwards et al. (2018) [72] | Haptic, VR | Vibration, gizmo, tracking | Hand, head | Visuo-haptic system for learning organic chemistry
Neri et al. (2018) [63] | Haptic, VR | Gizmo, tracking | Hand | Visuo-haptic experiment to improve understanding of friction
Walsh et al. (2018) [67] | Haptic, VR | Gizmo, tracking | Hand | Visuo-haptic experiment on conceptual understanding of forces acting in trusses
Yuksel et al. (2019) [62] | Haptic, VR | Gizmo, tracking | Hand | Visuo-haptic experiment to explore the effects of visual and haptic feedback on learning friction

3.1.4. Human factors/User Experience Design (UX)

This area includes several works dealing with HCI in human factors/user experience design and related fields, as summarized in Table 5.

User Factors

Okamoto et al. [73] reviewed the psychophysical dimensions related to tactile perception, concluding that the tactile perception of materials is composed of five dimensions. They also mention that promoting studies on the perceptual mechanisms of each tactile dimension will confirm the classification. Kober and Neuper [74] analyzed how personality and presence in VR depend on the level of immersion by means of VR navigation tasks and questionnaires. Cavrag et al. [75] used a visuo-haptic interaction system for the treatment of arachnophobia (fear of spiders); their work compares and discusses the modeling of 3D objects and test scenarios. An analysis of placelessness, spacelessness, and formlessness in the perception of virtual possessions was presented in [76]. The authors reflected on and synthesized findings from five field studies investigating people’s practices in digital environments and their attitudes toward virtual possessions.
Social interaction in immersive environments was studied by Bombari et al. [77]. The perceived realism of virtual humans can be improved by adding features that make participants feel that the virtual humans understand them better, including theory of mind, verbal behavior, and a physical appearance congruent with their characteristics. Ahmed et al. [78] evaluated haptic technologies that simulate affective touch in VR. They conclude that, regardless of the agent’s expression, force feedback feels more natural than vibration alone. Kyriakou et al. [79] examined the attributes of virtual human behavior that can increase the plausibility of crowd simulation and affect user experience in the virtual environment.

Product Factors

The role of haptic feedback in the integration of intentions in shared execution tasks has been studied by Groten et al. [80]. In several experiments, two users moved an object together using audio and a haptic device with haptic hand tracking. They conclude that mutual haptic feedback is a valuable channel for integrating intentions in haptic tasks when shared decision making is required. Aras et al. [81] presented a quantitative evaluation of the effectiveness of two visualization techniques with haptic interaction for manipulating 3D objects in virtual environments. Their study serves as the basis for more advanced studies of visuo-haptic coupling and its impact on the mental/cognitive workload.
Hamam et al. [82] introduced a taxonomy for classifying the quality-of-experience parameters of visuo-haptic environments; an experiment with a visuo-haptic ball-balancing system is then presented to test the model. Achibet et al. [83] presented a haptic feedback device that uses an elastic arm to increase interaction and perception in virtual environments, and they showed cases illustrating the capabilities of the system. Fittkau et al. [84] introduced a model for exploring a visualization metaphor called software cities through VR. An evaluation of the gestures created for the use of the system is presented, and a possible assessment of the system with other VR and tracking devices is proposed.
Moran et al. [85] showed a tool to improve the visual analysis of Big Data with interactive VR, enabling visualization and interaction that can facilitate the understanding and representation of extensive data. They conclude that VR also serves as a data visualization platform, enabling more efficient user interaction with patterns and visual analysis when working in a geospatial domain. The work of Atienza et al. [86] presented a VR interaction technique using head gaze. They found that using the hands and feet to navigate and control the environment degrades the level of immersion; in addition, users prefer reliability over intuition in the system, and intelligent navigation guides may be the next significant improvement. Carvalheiro et al. [87] proposed a haptic interaction system for VR based on a combination of tracking devices and a real-to-virtual mapping system for the redirection of users. Y.-S. Chen et al. [88] used an augmented and connectable haptic device to improve control in immersive virtual situations in which the user can receive audio, visual, wind, thermal, and force feedback. Matsumoto et al. [89] showed an environment that efficiently directs a user within a visuo-haptic setting by actively using tactile signals to modify spatial perception. M. Kim et al. [90] proposed a system that uses a tracking device as a haptic interface for immersive interaction in VR, with a hand-held haptic device providing vibration and heat feedback. Lee et al. [91] presented a maze-based immersive virtual environment that seeks to provide users with a greater sense of presence and experience through the virtual scene and immersive interaction, using a VR headset and a tracking device for the feet. The work of Maereg et al. [92] presents a vibrotactile haptic device for perceiving rigidity during interaction with virtual objects. Piumsomboon et al. [93] presented three gaze-based interaction techniques inspired by natural eye movements for VR immersion. More recently, Reski and Alissandrakis [94] presented a comparison of various input technologies for interactive VR environments. They identified trends in the preference for visual representations, but the results regarding physical controls in scenarios that stimulate exploration without a time limit were inconclusive.
Table 5. Data table of the human factors/user experience design (UX) domain.

Authors/Year | Display | Interaction | Tracking | Application
Groten et al. (2013) [80] | Haptic, 2D | Audio, gizmo, tracking | Hand | Visuo-haptic 2D shared ball tracking
Kober and Neuper (2013) [74] | VR | Gizmo | NA | Analysis of personality and presence in VR
Okamoto et al. (2013) [73] | Haptic | NA | NA | Review of psychophysical dimensions of tactile perception of textures
Aras et al. (2014) [81] | Haptic, VR, 2D | Gizmo, tracking | Hand | Visuo-haptic system to transport virtual objects in 2D and 3D
Cavrag et al. (2014) [75] | Haptic, VR | Gizmo, tracking | Hand | Visuo-haptic system to interact with spiders
Hamam et al. (2014) [82] | Haptic, VR | Gizmo, tracking | Hand | Visuo-haptic balance ball system
Odom et al. (2014) [76] | VR | NA | NA | Analysis of qualities of virtual possessions
Achibet et al. (2015) [83] | Haptic, VR | Gizmo, tracking | Hand | Haptic elastic arm for virtual interaction
Bombari et al. (2015) [77] | VR | NA | NA | Analysis of social interactions through immersive virtual environments
Fittkau et al. (2015) [84] | VR | Gizmo, tracking | Hand | VR software architecture view system
Moran et al. (2015) [85] | VR | Gizmo, tracking | Hand | VR big data visualization system
Ahmed et al. (2016) [78] | Haptic, VR | Vibration, audio, gizmo, tracking | Hand, head | Haptic affective touch in VR analysis
Atienza et al. (2016) [86] | VR | Gizmo, tracking | Hand, head | VR interaction technique using head gaze
Carvalheiro et al. (2016) [87] | Haptic, VR | Audio, gizmo, tracking | Object, hand | Visuo-haptic interactive real-to-virtual mapping system
Chen et al. (2016) [88] | Haptic, VR | Vibration, wind, temperature, audio, gizmo, tracking | Body | Visuo-haptic immersive system
Matsumoto et al. (2016) [89] | Haptic, VR | Gizmo, tracking | Hand | Visuo-haptic walking corridor
Kim et al. (2017) [90] | Haptic, VR | Vibration, temperature, audio, gizmo, tracking | Hand | Visuo-haptic immersive hand interaction system
Kyriakou et al. (2017) [79] | VR | Audio, gizmo, tracking | Hand, head, body | VR environments for simulation of crowd interactions
Lee et al. (2017) [91] | VR | Gizmo, tracking | Body | VR walking simulation
Maereg et al. (2017) [92] | Haptic, VR | Vibration, gizmo, tracking | Hand | Visuo-haptic stiffness interaction system
Piumsomboon et al. (2017) [93] | VR | Gizmo, tracking | Eye, head | Eye tracker for image visualization in VR analysis
Reski and Alissandrakis (2019) [94] | VR | Gizmo, tracking | Hand | Comparison of several VR devices for data exploration

3.1.5. Transportation

This section describes systems that use multimodal interaction and visualization for transport, and the papers are summarized in Table 6.

Driving

Grane and Bengtsson [95] studied how visual and tactile interfaces affect drivers’ performance and how visuo-haptic feedback could reduce the effects of driver distraction. They discovered that haptic support could reduce the impact of visual load without adding cognitive load. Kemeny [96] analyzed the challenges of driving simulation with VR and provided the main points that must be taken into account when carrying out this type of simulation, including the perception of movement, distance, acceleration, and speed; driving simulation with high-end 3D vision and tracking has been a standard technology since the advent of VR.
A haptic-multimodal interaction system with cooperative guidance, control, and cognitive automation was presented by Altendorf et al. [97]. Their case study compared the haptic device and haptic hand tracking in a virtual driving simulator. Mars et al. [98] studied human–machine cooperation when driving in a simulated environment with different degrees of shared control and haptic support. The authors stated that more studies should be carried out to determine how their results generalize to other shared-control designs and different situations. Wang et al. [99] presented a shared control model for lane tracking through driver interaction with a haptic guide. Their results suggested that the more the driver relies on the haptic-guided steering, the lower the effort. Stamer et al. [100] presented a glove-based study of tactile and force feedback to support car driving in virtual simulations. They showed that visuo-haptic feedback brings essential advantages to virtual interactions.

Flight

Aslandere et al. [101] used a flight simulator interaction system in VR. The immersive system was equipped with audio, manual control through a virtual button, hand and head tracking, and simulated scenarios. They tested various configurations and concluded that virtual interaction with a manual button depends heavily on the hand avatar, that participants interacted more efficiently with a less abstract virtual hand, and that the best results were obtained when a button’s collision volume equaled its visual volume.
Li and Zhou [102] showed a VR flight simulator that supports real-time multiuser interaction; the exhibit is exciting and attractive as a new, efficient form of scientific dissemination. Marayong et al. [103] presented a modification of the cockpit volumetric status display of NASA’s next-generation air transportation system, an advanced software tool for managing flights in real time from the cockpit. This study integrates force feedback into the cockpit visualization framework and evaluates its effectiveness in performing two tasks: object selection and route manipulation. Oberhauser and Dreyer [104] presented a VR flight simulator that combines the flexibility of a desktop flight simulator with a level of immersion close to that of a full flight simulator. Their results show that the system provides reliable information on interaction with the human–machine interface, making it a low-cost, trustworthy addition to the early development process of in-cockpit interaction technologies. Regarding evaluations of human function, Valentino et al. [105] developed a VR flight simulator with simple flight dynamics and limited terrain and objects. The simulator provides a compelling perspective of flying, although the authors mention that it is not complete and powerful because of its limited flight dynamics.
Table 6. Data table of the transportation domain.

Authors/Year | Display | Interaction | Tracking | Application
Grane and Bengtsson (2013) [95] | Haptic, VR | Vibration, gizmo | NA | Visuo-haptic interfaces affect driver performance analysis
Altendorf et al. (2014) [97] | Haptic, VR, 2D | Gizmo, tracking | Hand, body | Visuo-haptic driving simulator
Kemeny (2014) [96] | VR | NA | NA | VR driving simulators analysis
Mars et al. (2014) [98] | Haptic, VR | Gizmo, tracking | Hand, body | Visuo-haptic driving simulator
Aslandere et al. (2015) [101] | VR | Audio, gizmo, tracking | Hand, head | Virtual hand button interaction on VR flight simulator
Li and Zhou (2016) [102] | VR | Gizmo, tracking | Hand, head | Virtual experience on an aircraft carrier simulator
Marayong et al. (2017) [103] | Haptic, 2D | Gizmo, tracking | Hand | Haptic cockpit air transportation system
Oberhauser and Dreyer (2017) [104] | VR | Touchpad, gizmo, tracking | Eye, hand, head | VR flight simulator
Valentino et al. (2017) [105] | VR | Gizmo, tracking | Head | VR flight simulator
Wang et al. (2018) [99] | Haptic, VR | Audio, gizmo | NA | Haptic driving guidance system
Stamer et al. (2020) [100] | Haptic, VR | Vibration, gizmo, tracking | Hand | Visuo-haptic driving simulator benefits analysis

3.1.6. Cultural Heritage

Here, we discuss the use of multimodal interaction and visualization in the context of cultural heritage. See the summary in Table 7.

Museum

Chen et al. [106] presented an AR multimedia system that does not require the user to operate any designated hardware device such as a keyboard, mouse, or touch screen. Computer vision retrieves the user’s input signal from an aerial camera, enabling various tasks with virtual objects such as mapping textures, text, and audio. Dima et al. [107] developed a haptic interaction system providing the illusion of touching museum artifacts. They concluded that non-digital prototypes conveyed more crucial sensory information, whereas digital prototypes offer the possibility of adding interactive elements that could improve interaction. Papaefthymiou et al. [108] presented a virtual museum environment that can be observed through a cardboard-style headset and controlled with different devices. Jung et al. [8] investigated the impact of VR and AR on the visitor’s overall experience in the museum context; only a few studies have been conducted in AR environments compared to VR environments. They indicate that VR and AR can be valuable tools for improving tourists’ experience by motivating the intent to visit the actual destination. Kersten et al. [109] developed a virtual museum with two options: (a) interactive software and (b) a VR system, the HTC Vive. They collected data about interaction in the exhibition and showed different animations explaining the changes in the building’s construction over the centuries. Tsai et al. [110] presented an AR museum information guide application. Their results show that its usage conforms to usability standards and provides a positive experience during a visit.
Carrozzino et al. [111] investigated the possible positive effects that the use of avatars can provide to a virtual cultural experience, proposing a virtual museum with three different alternatives (panel, audio, and virtual guide), and comparing the results in terms of engagement and understanding of the proposed content.

Archaeology

Gaugne et al. [112] contributed to the multidomain research of archaeology and VR, concluding that VR could improve archaeology through modeling and the collaborative analysis of archaeological objects. Pietroni and Adami [113] discussed fundamental concepts about the potential of virtual reconstructions of cultural sites. They mention that a virtual reconstruction should offer different visualizations, 3D models, narration, behaviors, and interaction tools, and they concluded that it is necessary to design the content of virtual applications carefully and to ensure excellent communication. Barbieri et al. [114] described the development of a VR exhibit for the interactive exploration of archaeological artifacts. Moreover, they address various technical issues related to the design of virtual museum exhibits based on standard technologies.

Tourism

Younes et al. [115] used VR and AR to showcase the Roman theater of Byblos and discuss potential strategies for implementing this approach in other scenarios. Bekele et al. [9] mention that technologies such as AR and VR enable a user-centered presentation and make cultural heritage digitally accessible, mainly when physical access is restricted. Finally, their work outlines future research directions for AR and VR, focused on interaction interfaces, and suggests the implications for cultural heritage.
Table 7. Data table of the cultural heritage domain.

Authors/Year | Display | Interaction | Tracking | Application
Chen et al. (2014) [106] | AR | Audio, tracking | Hand | AR museum guidance system
Dima et al. (2014) [107] | Haptic, AR | Gizmo, tracking | Hand | Haptic interaction that creates the illusion of touching museum artifacts
Gaugne et al. (2014) [112] | Haptic, VR | Audio, gizmo, tracking | Hand, head, body | Visuo-haptic interaction in buildings and chambers
Pietroni and Adami (2014) [113] | VR | Audio, tracking | Body | VR museum interactive system
Papaefthymiou et al. (2015) [108] | VR | Touchpad, audio, gizmo, tracking | Head, body | VR museum application
Jung et al. (2016) [8] | VR, AR | NA | NA | VR, AR visitor experiences in museum analysis
Kersten et al. (2017) [109] | VR | Gizmo, tracking | Hand, head, body | VR museum system
Tsai et al. (2017) [110] | AR | Touchpad, audio, gizmo | NA | AR museum tour guide system
Younes et al. (2017) [115] | VR, AR | Gizmo, tracking | Hand | See cultural buildings in AR and VR
Barbieri et al. (2018) [114] | VR | Touchpad, audio | NA | VR exhibition for archaeological museums
Bekele et al. (2018) [9] | Haptic, VR, AR | NA | NA | Survey of AR, VR, and MR for cultural heritage
Carrozzino et al. (2018) [111] | VR | Audio, gizmo | NA | VR analysis to evaluate virtual guidance in museums

3.1.7. Industry

The last content section describes the usage of multimodal interaction and visualization in industrial domains. Berg and Vance [2] surveyed VR in industry, concluding that VR has grown over the past twenty years and that its knowledge base has expanded significantly in this domain (see the summary in Table 8).

Manufacturing

Perret et al. [116] present work on implementing an interactive simulation with haptic feedback. They describe the challenges of movement integration, collision detection, and changes to assembly constraints; they give an idea of the maturity of the technology and conclude that one of the main future challenges will be the introduction of deformable objects, since modeling this type of object is essential for simulating gripping tasks. Qiu et al. [117] presented a real-time virtual human model for performing assembly tasks. They also analyzed driving errors to help users choose a suitable motion capture system.
Xia et al. [118] presented a comprehensive review of VR and haptic issues for the assembly of products composed of rigid parts and soft wires; they surveyed new ideas and recent advancements in the area. Gonzalez-Badillo et al. [119] studied the development and key features of a visuo-haptic system for planning and evaluating assemblies, intended as a tool for training, design analysis, and route planning. The results demonstrated that it can be used effectively to simulate, evaluate, plan, and automatically formalize the assembly of complex models naturally and intuitively. Hamid et al. [120] studied advances in computer modeling, visualization, simulation, and product data management through the use of VR. They mention that these technologies are a viable alternative for product manufacturing and conclude that it is essential to realize that VR is not solely for visualization purposes. Vélaz et al. [121] focused on the use of VR systems to teach industrial assembly tasks and studied the influence of interaction technologies on the learning process.
Abidi et al. [122] developed a VR haptic platform that allows the management of and interaction with virtual components for assembly. The use of haptics is an effective method to improve the sense of presence in virtual environments and benefits tasks such as virtual assembly. Choi et al. [1] surveyed the use of VR in manufacturing and evaluated the application of VR technology elements in the context of developing new processes. They concluded that more research is needed to improve manufacturing competitiveness based on the dynamic integration of components, which requires the extension and constant development of related standards for dynamic integration, VR element technologies, and standards. Gavish et al. [123] assessed VR and AR training platforms for maintenance and assembly tasks, concluding that these platforms provide new forms of interaction and that users need time to learn how to use them efficiently. Grajewski et al. [124] tested different approaches to creating realistic, immersive educational simulations of workplace conditions for assembly operations with the help of haptic and VR systems. The level of realism was a crucial factor when performing immersive simulations of workstations for training purposes, and this type of simulation was identified as an excellent tool for performing training tasks. Radkowski et al. [125] studied different types of views and AR features for showing assembly instructions in AR applications; these were demonstrated to improve user confidence while performing tasks and also allowed the transfer of the learned skills to other tasks. Al-Ahmari et al. [126] presented a manufacturing assembly simulation system that uses a virtual environment to create an interactive workbench for evaluating assembly decisions and training assembly operations. It is a comprehensive system that provides visual, auditory, tactile, and force feedback. Future work includes a series of user-based evaluation studies to assess the effectiveness of their system for training; in addition, they mention that other haptic feedback mechanisms, such as friction and gravity, will be added to the environment. Wang et al. [127] presented an AR simulator that assists in completing assembly tasks and facilitates assembly, planning, and product design. They conclude that it is crucial to improve depth detection to facilitate the construction of an assembly, allowing a greater fusion of real and virtual components. Xia [128] surveyed the use of haptics for product design and manufacturing simulation, observing that many researchers have developed their own haptic interfaces to simulate the design and manufacture of products, but that most of these devices are still at the laboratory stage. Finally, Ho et al. [129] proposed and evaluated a VR training system for the assembly of hybrid medical devices. Their system integrates artificial intelligence, VR, and game concepts, and their results showed that the proposed training has significant advantages over standard VR training and conventional training.
Roldán et al. [130] proposed a system to transfer knowledge in the context of Industry 4.0, providing an immersive VR-based interface for expert operators and trainees. As future work, they aim to apply the proposed system to more realistic assemblies. Finally, they mention that a comparison of VR and AR in the industry context would be of interest to determine the future of immersive training systems.

Maintenance

Loch et al. [131] proposed a concept for haptic interaction in a virtual training system for maintenance procedures. They report that the benefits of virtual training systems, their attractiveness to students and their flexibility in presentation and interaction, are enhanced by the possibilities of haptic interaction.
Table 8. Data table of the industry domain.

| Authors/Year | Display | Interaction | Tracking | Application |
|---|---|---|---|---|
| Perret et al. (2013) [116] | Haptic, VR | NA | NA | Haptic feedback for assembly tasks analysis |
| Qiu et al. (2013) [117] | Haptic, VR | Audio, gizmo, tracking | Hand, head, body | Visuo-haptic assembly system |
| Xia et al. (2013) [118] | Haptic, VR | NA | NA | Visuo-haptic review for product assembly |
| Gonzalez-Badillo et al. (2014) [119] | Haptic, VR | Gizmo, tracking | Hand | Visuo-haptic assembly system |
| Hamid et al. (2014) [120] | VR, AR | NA | NA | Review of VR applications in manufacturing |
| Vélaz et al. (2014) [121] | Haptic, VR, 2D | Gizmo, tracking | Hand | Visuo-haptic assembling-parts learning system |
| Abidi et al. (2015) [122] | Haptic, VR | Gizmo, tracking | Hand | Visuo-haptic assembly system |
| Choi et al. (2015) [1] | VR | NA | NA | Survey of VR in manufacturing |
| Gavish et al. (2015) [123] | Haptic, VR, AR | Touchpad, vibration, audio, gizmo, tracking | Object, hand | VR, AR assembly system |
| Grajewski et al. (2015) [124] | Haptic, VR | Gizmo, tracking | Hand, head, body | Visuo-haptic assembling-parts learning simulator |
| Radkowski et al. (2015) [125] | AR | Tracking | Object | AR training system to show assembly instructions |
| Al-Ahmari et al. (2016) [126] | Haptic, VR | Audio, gizmo, tracking | Hand, head | Visuo-haptic manufacturing assembly simulator |
| Wang et al. (2016) [127] | AR | Tracking | Object, hand | AR assembly simulation |
| Xia (2016) [128] | Haptic, VR, AR | NA | NA | Haptic manufacturing simulators survey |
| Berg and Vance (2017) [2] | Haptic, VR | NA | NA | Survey of use of VR in manufacturing |
| Ho et al. (2018) [129] | VR | Touchpad, gizmo, tracking | Hand, head, body | VR assembly training system |
| Loch et al. (2018) [131] | Haptic, VR | Gizmo, tracking | Hand | Haptic interaction in a virtual training system for maintenance procedures |
| Roldán et al. (2019) [130] | VR | Audio, gizmo, tracking | Hand, head | VR system to transfer knowledge in the context of Industry 4.0 |

3.2. Analysis

This section presents the analysis of the literature guided by our four research questions from Section 2.1. We analyzed the collected articles by focusing on three main tendencies. (1) Temporal: we compared the number of papers in each area over the period 2013–2021; this analysis addresses research questions one (RQ1) and two (RQ2). (2) Frequency: we calculated the number of papers in each application domain and cross-measured them with the types of displays, interaction, tracking, and applications; this analysis addresses research question three (RQ3). (3) Cluster: each paper was considered a point in a multidimensional space, and we performed a cluster analysis to show the proximity and similarity of papers from the viewpoint of domain and type; this analysis addresses research question four (RQ4).

3.2.1. Temporal Analysis

Figure 2a shows the results of our temporal analysis of the papers published between 2013 and 2021, per domain and per year, addressing RQ1. Overall, a decrease in the amount of work based on multimodal HCI can be observed. A significant increase occurred in 2014 in all fields, and only UX continued to grow until 2017; 2018 and 2019 show a decrease. Note that the year 2021 had not concluded at the time of this submission; thus, its numbers are incomplete. Figure 2b covers the same period but shows the number of published papers per technology type, thus addressing RQ2. Haptic technology, AR, and VR increased strongly in 2014, but the number of papers published per year has decreased since then.
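For readers who want to reproduce this kind of tally, a minimal pandas sketch follows. The DataFrame layout, the column names (`year`, `domain`, `display`), and the three placeholder rows are our assumptions for illustration, not data from the review.

```python
import pandas as pd

# Hypothetical rows standing in for the reviewed papers; in practice there
# would be one row per article with its publication year, domain, and the
# display technologies it uses (all column names are assumptions).
papers = pd.DataFrame([
    {"year": 2013, "domain": "Industry", "display": "Haptic, VR"},
    {"year": 2015, "domain": "Medicine", "display": "Haptic, VR"},
    {"year": 2017, "domain": "UX",       "display": "VR"},
])

# Figure 2a: number of papers per domain per year.
per_domain = papers.groupby(["year", "domain"]).size().unstack(fill_value=0)

# Figure 2b: number of papers per technology per year; a paper that uses
# several technologies (e.g., "Haptic, VR") is counted once under each.
exploded = papers.assign(display=papers["display"].str.split(", ")).explode("display")
per_tech = exploded.groupby(["year", "display"]).size().unstack(fill_value=0)

print(per_domain, per_tech, sep="\n\n")
```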
We can only speculate about the temporal tendencies in Figure 2. One observation is that the number of papers is subdivided per domain or technology, which may exacerbate fluctuations. Another influence may be market behavior: there were several sizable acquisitions (e.g., Facebook acquired Oculus in 2014 [132]) that may shape the future of these technologies. Still, widespread adoption by users may be lacking. Moreover, device prices may be a substantial factor (an Oculus headset cost about USD 400 in 2020, while force-based haptic devices range from USD 150 to USD 20,000).

3.2.2. Frequency Analysis

The frequency analysis addresses RQ3; Figure 3 shows the absolute histogram of domains against the types of interaction used. The maximum value of 20 corresponds to papers at the intersection of VR and UX. Moreover, many papers studied UX in interaction with gizmos and varying kinds of tracking. VR has also been used frequently in Industry, Physics, and Medicine. Another frequently used technology is haptic interaction, which has been studied in the context of Medicine, Physics, UX, and Industry. We found that certain types of interaction, such as temperature and wind, are not commonly used.
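The histogram underlying Figure 3 is essentially a cross-tabulation of domain against interaction type. A minimal sketch, again with hypothetical placeholder rows rather than the review's actual data:

```python
import pandas as pd

# One (paper, interaction type) pair per row; placeholder data only.
pairs = pd.DataFrame([
    {"domain": "UX",       "interaction": "gizmo"},
    {"domain": "UX",       "interaction": "tracking"},
    {"domain": "Medicine", "interaction": "gizmo"},
])

# Absolute histogram of domains vs. interaction types (the basis of Figure 3).
histogram = pd.crosstab(pairs["domain"], pairs["interaction"])
print(histogram)
```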

3.2.3. Cluster Analysis

This step investigates the proximity and similarity of the papers from the viewpoint of the domain and application described in Section 1, thus addressing RQ4. Let us recall that we have the following domains: Concepts and overviews (basic algorithms) (Section 3.1.1), Medicine (Section 3.1.2), Physics (Section 3.1.3), Human factors/User experience design (UX) (Section 3.1.4), Transportation (Section 3.1.5), Cultural heritage (Section 3.1.6), and Industry (Section 3.1.7). We denote a domain by $d$ and use a subscript to identify each one:
$$D = \{ d_b, d_m, d_p, d_u, d_c, d_t, d_i \},$$
where $d_b$ denotes Concepts and overviews, $d_m$ Medicine, $d_p$ Physics, $d_u$ Human factors/User experience design, $d_c$ Cultural heritage, $d_t$ Transportation, and $d_i$ Industry.
Each domain $d \in D$ can use zero, one, or more types of interactions. In order to compare papers across domains, we define a metric that assigns each paper a value depending on the types of interaction it uses. For a paper $p_k$ from domain $d_k$, $k \in \{b, m, p, u, c, t, i\}$, the value of the paper is set as follows:
$$p_k(j) = p_k[\text{haptics}, \text{VR}, \text{AR}, \text{2D}, \ldots],$$
where each component of $p_k$ is either one or zero depending on whether the paper uses the corresponding type of interaction.
The distance between two papers $p_a$ and $p_b$ in the $n$-dimensional space is then calculated using the Euclidean $L_2$ norm:
$$L_2(p_a, p_b) = \sqrt{\sum_{j=0}^{n} \left( p_a(j) - p_b(j) \right)^2}.$$
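A minimal sketch of this encoding and the resulting distance follows; the list of interaction categories below is illustrative, standing in for the full set used in the review.

```python
import numpy as np

# Illustrative interaction categories spanning the n-dimensional space.
CATEGORIES = ["haptics", "VR", "AR", "2D", "tracking"]

def encode(interactions):
    """Binary vector p_k: component j is 1 iff the paper uses category j."""
    return np.array([1.0 if c in interactions else 0.0 for c in CATEGORIES])

def l2(p_a, p_b):
    """Euclidean L2 distance between two encoded papers."""
    return float(np.sqrt(np.sum((p_a - p_b) ** 2)))

# Two hypothetical papers: a visuo-haptic system and a pure VR application.
print(l2(encode({"haptics", "VR", "tracking"}), encode({"VR"})))  # ~1.414
```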
With a distance between two papers defined, we can perform cluster analysis in the $n$-dimensional space. We applied the k-means algorithm, which clusters data into a predefined number of clusters. While the number of clusters is unknown a priori, it can be determined by using the elbow method, as shown in Figure 4. The idea is to run the algorithm with an increasing number of clusters and measure the compactness of each cluster. Initially, all papers are in a single sparse cluster; as the number of clusters increases, the clusters become smaller and more compact, eventually resulting in each paper forming its own cluster. A good number of clusters lies at the “elbow” of the graph, which represents a compromise between the number of papers in each cluster (higher is better) and the compactness of each cluster (more compact is better). The ratio is expressed as the distortion score. We found the number of clusters to be k = 3. Figure 4 also shows the choice of clusters for k = 2 and k = 4, demonstrating that k = 3 is a good choice.
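One way to reproduce the elbow search is with scikit-learn's k-means; the sketch below uses a random binary matrix as a stand-in for the encoded paper vectors (112 papers, 5 interaction categories are our assumptions for the shape).

```python
import numpy as np
from sklearn.cluster import KMeans

# Placeholder data: one binary interaction vector per paper.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(112, 5)).astype(float)

# Elbow method: fit k-means for increasing k and record the distortion
# score (within-cluster sum of squared distances, i.e., inertia).
for k in range(1, 9):
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
    print(f"k={k}: distortion={km.inertia_:.1f}")
# The bend ("elbow") of this curve suggests the cluster count;
# the analysis above settles on k = 3.
```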
The results of k-means cluster the papers according to their distance, but it is not obvious what each cluster includes and why.
The initial space is five-dimensional, and we used the T-distributed Stochastic Neighbor Embedding (t-SNE) algorithm to project it to 2D. The algorithm attempts to keep points that are close to each other in the higher-dimensional space also close in 2D. The algorithm has several parameters, and we used the following: perplexity = 100, exaggeration = 1, and 29 PCA components. The results of the t-SNE algorithm are shown in Figure 5, where the three clusters from k-means are identified with different colors; each data point corresponds to one paper, and we also show each paper’s authors and year of publication. Moreover, we visualized the cluster subdivision as a dendrogram in Figure 6.
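A sketch of the projection and dendrogram steps, with the same placeholder matrix: sklearn's `early_exaggeration` is the closest stand-in for the reported exaggeration parameter, the PCA step is capped at the data's five dimensions (the reported 29 components exceed a five-dimensional input), and Ward linkage for the dendrogram is our assumption, since the paper does not state the linkage criterion.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE
from scipy.cluster.hierarchy import linkage, dendrogram

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(112, 5)).astype(float)  # placeholder vectors

# PCA preprocessing, then t-SNE down to 2D. perplexity = 100 follows the
# text; early_exaggeration approximates the reported exaggeration = 1.
X_pca = PCA(n_components=5).fit_transform(X)
embedding = TSNE(n_components=2, perplexity=100,
                 early_exaggeration=1.0, random_state=0).fit_transform(X_pca)

# Hierarchical clustering for the dendrogram of Figure 6 (structure only;
# pass no_plot=False with matplotlib available to actually draw it).
tree = dendrogram(linkage(X, method="ward"), no_plot=True)

print(embedding.shape)  # (112, 2): one 2D point per paper
```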

4. Discussion

Rubio-Tamayo et al. [35] stated that VR and technologies associated with the virtuality continuum are “emerging media”, referring to VR as a concept and proposing models that link it to other domains, such as UX. While those authors follow the “bidirectional communication theory” approach of Marko [133], we focus on how multimodal HCI technology links to the particular domains of Concepts and overviews, Medicine, Physics, Transportation, Cultural heritage, and Industry, in addition to UX, in order to highlight the technology configurations typical of each domain and to suggest future lines of work based on the characteristics of these configurations. These possible future work options, in turn, stem from the answers to RQ1 and RQ2:
Research Question 1—How has the type of technology changed over time in each domain?
As can be observed throughout Section 3, haptics and VR are the main technology types used across all domains over time. AR sees greater use in the Industry and Cultural Heritage domains than elsewhere; in domains such as Transportation and UX, its use is almost nil. In 2013, the Medicine and UX domains relied more heavily on haptics, the Physics domain on VR, and Industry and Transportation used haptics and VR in similar measure, while the Concepts and Overviews and Cultural Heritage domains presented no works. In 2014, Medicine showed the highest use of haptics; Physics, UX, Transportation, and Industry showed the highest use of VR; and Concepts and Overviews and Cultural Heritage used haptics and VR equally. Medicine, Industry, and, above all, Cultural Heritage also worked with AR that year. In 2015, Physics showed greater use of haptics; Concepts and Overviews, UX, Transportation, Cultural Heritage, and Industry showed greater use of VR; and Medicine split evenly between haptics and VR, with Concepts and Overviews and Industry the only domains presenting AR works. In 2016, VR dominated in Concepts and Overviews, Physics, UX, and Transportation; Medicine split evenly between haptics and VR; and Industry used haptics, VR, and AR in equal measure, with AR works also appearing in Concepts and Overviews, Medicine, Physics, and Cultural Heritage. In 2017, VR was used most in Medicine, UX, and Transportation; Concepts and Overviews, Physics, and Industry split evenly between haptics and VR; and Cultural Heritage split evenly between VR and AR. In subsequent years, the number of works decreased in all domains, but VR remained the leading technology, supported in some cases by haptics or AR.
This synthesis aims to provide an overview of each domain’s technological state by noting changes in technology type for each. An example of a technology-type configuration changing over time is found in the work of Pacchierotti et al. [7], which reviews the progression of haptic systems for the fingertip and the hand from stationary to wearable devices over ten years. Typical technology-type configurations for a specific domain change not only as new applications appear but also as the technology available to approach a particular domain problem evolves. In this review, we find differences in the frequency of works by domain and technology type, as shown in Section 2.
Research Question 2—What is the typical technology type by domain?
Rubio-Tamayo et al. [35] affirm that “it is necessary to determine what we want to develop as an experience and how to connect it in a more multisensorial experience”. Aiming to develop this concept, we propose an updated review to report how experts working in each domain design applications, choosing specific types of technology when they develop the kind of experience that is appropriate for their target users and, hence, the relevance of the selected technology type. We found that, for the works reviewed in Section 3.2.2, the most used technology type is VR across all domains (Concepts and overviews, Physics, Transportation, Cultural heritage, Industry, and UX), except in Medicine where haptics is the most used, followed by VR.
Examples of VR and technologies associated with the virtuality continuum as emerging media are found in specific configurations according to the studied domain and time of application. For instance, Rose et al. [51] reviewed several studies on the impact of VR and haptic feedback in healthcare, noting that viewing mediums acquire immersive properties as technology advances: from computer monitors to panoramic TVs to Head-Mounted Displays (HMDs). Similarly, Hamza-Lup et al. [49] surveyed visuo-haptic systems for surgical training. When the studied domain changes, so does the configuration of technological tools. For example, Stamer et al. [100] explored the use of VR with a haptic glove; they report the design of an application that uses a visuo-haptic system in a configuration that responds to its context and, thus, differs in technology type from the typical healthcare visuo-haptic application. We aim to show that the term visuo-haptic covers different tool sets and technology types, depending on the domain and period. The applications cited in the Medicine and Transportation research show different visuo-haptic conceptualizations. This temporal analysis provides data to answer RQ3:
Research Question 3—How has the frequency of research publications changed over time by domain?
For the set of reviewed works between 2013 and 2021, Figure 2 shows a decreasing frequency in all domains since 2017. Some authors [134,135] attribute this decrease to various causes: increasingly complex applications can face greater barriers to adoption, and higher resolution is needed, which implies more development effort and, therefore, greater investment. Despite this, they expect these barriers to diminish over time and a positive evolution in the coming years. Furthermore, we noted that the lack of widespread adoption of haptic, AR, and VR technologies could contribute to this decrease, as shown in Section 3.2.1. Specifically, the following appears to apply:
  • The Concepts and overviews and UX domains present no new works meeting our inclusion criteria since 2018, and the Cultural heritage domain presents none since 2019.
  • Only one work from 2020, in the Transportation domain, met our inclusion criteria; no new works meeting the criteria were found for the other domains.
Finally, the frequency of works by domain does not provide enough insight into the relation between technology type and domain. Section 3.2.3 and Figure 5 show the three clusters identified by k-means and the distribution of papers from each domain, revealing that the reviewed works share a significant number of technology and interaction types across domains. Specifically, for the works reviewed, the following was observed:
  • In all domains, we found works with applications of haptics and VR.
  • The most used technology type is VR in all domains (Concepts and overviews, Physics, Transportation, Cultural heritage, Industry, and UX), except in Medicine where haptics is the most commonly used.
  • The second most used technology type is haptics in most domains (Concepts and overviews, Physics, Transportation, Industry, and UX), except in Medicine, where VR is the second most used, and Cultural heritage, where AR is the second most used.
  • We did not find AR applications for the Transportation or UX domains, nor did we find 2D applications for the Cultural heritage domain. Both technology types, 2D and AR, are the least used in all domains.
  • The most used interaction types for all domains are gizmo and tracking.
This information is the basis for answering RQ4:
Research Question 4—How are research publications grouped across metrics given their similarities?
Figure 5 and Table 9 show the existence of three clusters and how each domain contributes to each of them. The smallest is the green cluster, which groups works from closely related domains: Concepts and overviews, Medicine, and, to a greater extent, Physics, UX, and Transportation, while leaving out Cultural heritage and Industry.
The medium-sized cluster is the red one. It involves the fewest domains, drawing mainly on Concepts and overviews and Medicine and, to a lesser extent, Transportation, and excluding the other four domains studied in this work.
Finally, the largest cluster is shown in blue. It includes Physics and, to a greater extent, UX, Cultural heritage, and Industry, leaving out the Concepts and overviews, Medicine, and Transportation domains.
Table 10 shows the total number of works by domain. There is a greater number of works related to UX, Industry, Physics, and Medicine and, to a lesser extent, Concepts and overviews, Cultural heritage, and Transportation. The table also shows that each domain’s maximum number of works was published between 2013 and 2017, within the studied period. It further lists the most commonly used display types (VR and haptic prevailing) and the most commonly used interactions (gizmo and tracking prevailing). These findings may open the door to novel works that exploit combinations of different display and interaction types within any domain, or combination of domains, on a solid knowledge base.

5. Conclusions, Limitations, and Future Work

We presented a systematic review of multimodal human–computer interaction showing how different technologies are used in various domains. We defined the initial set of domains calibrated to provide balanced numbers of papers in each area. We then studied the works from the viewpoint of temporal patterns, frequency of usage in technology types in different domains, and cluster analysis by using paper metrics. This analysis allowed us to answer relevant questions when searching for the next steps in work related to multimodal HCI.
We studied the typical technology type, how technology type and frequency have changed over time in each domain, and how papers are grouped across metrics given their similarities. We determined that VR and haptics are the most widely used in all domains. While VR is the most used, haptic interaction is present in an increasing number of applications, suggesting future work on applications that combine VR and haptics. We can only speculate about the implications for future development; still, it seems that VR will become more common in many areas, and haptic technologies should follow a logical expansion.
We found the following limitations to the technology type and domain approach. (1) The clustering process operates over seven domains and four types of interaction, resulting in three clusters with low density. (2) The clustering uses domains and types of interaction but not the application type. (3) The presented review reports the applications used in a particular domain but does not indicate how well they are used. We also recognize that not all training technologies were included in this review. For example, technologies such as type and screen for medical training were not included in the medical domain, nor was task-specific virtual reality training for rehabilitation. This was due to the key search terms and the defined inclusion and exclusion criteria.
There are many possible avenues for future work: (1) designing and proposing a metric that reports the effectiveness of the technology used in each domain; (2) analyzing and describing the process behind the fluctuations in frequency shown in the temporal analysis (Section 3.2.1) to explain the decrease in works per year in every domain; (3) exploring the clustering by domains further in search of more compact clusters.

Author Contributions

Conceptualization, all authors; methodology, J.D.A., J.N., and B.B.; validation, J.N., S.R., J.M.M.-E., A.J.M., and B.B.; formal analysis, all authors; research, J.D.A.; writing—original draft preparation, all authors; writing—review and editing, all authors; supervision, J.N., S.R., J.M.M.-E., A.J.M., and B.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the “Fund for Energy Sustainability CONACyT-SENER” (Fondo Sectorial CONACyT-Secretaría de Energía-Sustentabilidad Energética) under Grant 266632, “Bi-national laboratory for the intelligent management of energy sustainability and technological education”, and by CONACyT under Grant 847335, as well as by Tecnologico de Monterrey, Campus Ciudad de México, through a scholarship tuition grant. This research was funded in part by the National Science Foundation under Grant #10001387, Functional Proceduralization of 3D Geometric Models.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Acknowledgments

We would like to thank Vicerrectoría de Investigación y Posgrado, the Research Group of Product Innovation, and the Cyber Learning and Data Science Laboratory of Tecnologico de Monterrey.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Choi, S.; Jung, K.; Noh, S.D. Virtual reality applications in manufacturing industries: Past research, present findings, and future directions. Concurr. Eng. 2015, 23, 40–63.
  2. Berg, L.P.; Vance, J.M. Industry use of virtual reality in product design and manufacturing: A survey. Virtual Real. 2017, 21, 1–17.
  3. Freina, L.; Ott, M. A literature review on immersive virtual reality in education: State of the art and perspectives. In The International Scientific Conference eLearning and Software for Education; “Carol I” National Defence University: Bucharest, Romania, 2015; Volume 1, p. 133.
  4. Escobar-Castillejos, D.; Noguez, J.; Neri, L.; Magana, A.; Benes, B. A Review of Simulators with Haptic Devices for Medical Training. J. Med. Syst. 2016, 40, 1–22.
  5. Vaughan, N.; Dubey, V.N.; Wainwright, T.W.; Middleton, R.G. A review of virtual reality based training simulators for orthopaedic surgery. Med. Eng. Phys. 2016, 38, 59–71.
  6. Escobar-Castillejos, D.; Noguez, J.; Bello, F.; Neri, L.; Magana, A.J.; Benes, B. A Review of Training and Guidance Systems in Medical Surgery. Appl. Sci. 2020, 10, 5752.
  7. Pacchierotti, C.; Sinclair, S.; Solazzi, M.; Frisoli, A.; Hayward, V.; Prattichizzo, D. Wearable Haptic Systems for the Fingertip and the Hand: Taxonomy, Review, and Perspectives. IEEE Trans. Haptics 2017, 10, 580–600.
  8. Jung, T.; Tom Dieck, M.C.; Lee, H.; Chung, N. Effects of virtual reality and augmented reality on visitor experiences in museum. In Information and Communication Technologies in Tourism 2016; Springer: Bilbao, Spain, 2016; pp. 621–635.
  9. Bekele, M.K.; Pierdicca, R.; Frontoni, E.; Malinverni, E.S.; Gain, J. A survey of augmented, virtual, and mixed reality for cultural heritage. J. Comput. Cult. Herit. 2018, 11, 1–36.
  10. Obrenovic, Z.; Starcevic, D. Modeling multimodal human-computer interaction. Computer 2004, 37, 65–72.
  11. Anthes, C.; García-Hernández, R.J.; Wiedemann, M.; Kranzlmuller, D. State of the art of virtual reality technology. In Proceedings of the 2016 IEEE Aerospace Conference, Big Sky, MT, USA, 5–12 March 2016; pp. 1–19.
  12. Hornbæk, K.; Mottelson, A.; Knibbe, J.; Vogel, D. What Do We Mean by “Interaction”? An Analysis of 35 Years of CHI. ACM Trans. Comput.-Hum. Interact. 2019, 26, 1–30.
  13. Elvins, T.T. A survey of algorithms for volume visualization. ACM Siggraph Comput. Graph. 1992, 26, 194–201.
  14. Kucher, K.; Kerren, A. Text visualization techniques: Taxonomy, visual survey, and community insights. In Proceedings of the 2015 IEEE Pacific Visualization Symposium (PacificVis), Hangzhou, China, 14–17 April 2015; pp. 117–121.
  15. Liu, S.; Cui, W.; Wu, Y.; Liu, M. A survey on information visualization: Recent advances and challenges. Vis. Comput. 2014, 30, 1373–1393.
  16. Desai, P.R.; Desai, P.N.; Ajmera, K.D.; Mehta, K. A review paper on oculus rift-a virtual reality headset. arXiv 2014, arXiv:1408.1173.
  17. Garzón, J. An Overview of Twenty-Five Years of Augmented Reality in Education. Multimodal Technol. Interact. 2021, 5, 37.
  18. Marques, B.; Alves, J.a.; Neves, M.; Justo, I.; Santos, A.; Rainho, R.; Maio, R.; Costa, D.; Ferreira, C.; Dias, P.; et al. Interaction with Virtual Content Using Augmented Reality: A User Study in Assembly Procedures. Proc. ACM Hum.-Comput. Interact. 2020, 4, 1–17.
  19. Faeth, A.; Harding, C. Emergent Effects in Multimodal Feedback from Virtual Buttons. ACM Trans. Comput.-Hum. Interact. 2014, 21, 1–23.
  20. Sreelakshmi, M.; Subash, T. Haptic technology: A comprehensive review on its applications and future prospects. Mater. Today Proc. 2017, 4, 4182–4187.
  21. Leff, B.; Finucane, T.E. Gizmo idolatry. JAMA 2008, 299, 1830–1832.
  22. Vera-Baceta, M.A.; Thelwall, M.; Kousha, K. Web of Science and Scopus language coverage. Scientometrics 2019, 121, 1803–1813.
  23. García, J.A.; Rodriguez-Sánchez, R.; Fdez-Valdivia, J. Ranking of the subject areas of Scopus. J. Am. Soc. Inf. Sci. Technol. 2011, 62, 2013–2023.
  24. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ 2021, 372, 105906.
  25. Kitchenham, B.; Charters, S. Guidelines for Performing Systematic Literature Reviews in Software Engineering; Technical Report, Ver. 2.3; Keele University and Durham University Joint Report: Keele, UK; Durham, UK, 2007.
  26. Xiao, Y.; Watson, M. Guidance on conducting a systematic literature review. J. Plan. Educ. Res. 2019, 39, 93–112.
  27. Torres-Carrión, P.V.; González-González, C.S.; Aciar, S.; Rodríguez-Morales, G. Methodology for systematic literature review applied to engineering and education. In Proceedings of the 2018 IEEE Global Engineering Education Conference (EDUCON), Santa Cruz de Tenerife, Spain, 17–20 April 2018; pp. 1364–1373.
  28. Kehrer, J.; Hauser, H. Visualization and Visual Analysis of Multifaceted Scientific Data: A Survey. IEEE Trans. Vis. Comput. Graph. 2013, 19, 495–513.
  29. Reda, K.; Febretti, A.; Knoll, A.; Aurisano, J.; Leigh, J.; Johnson, A.; Papka, M.E.; Hereld, M. Visualizing Large, Heterogeneous Data in Hybrid-Reality Environments. IEEE Comput. Graph. Appl. 2013, 33, 38–48.
  30. Olshannikova, E.; Ometov, A.; Koucheryavy, Y.; Olsson, T. Visualizing Big Data with augmented and virtual reality: Challenges and research agenda. J. Big Data 2015, 2, 22.
  31. Slater, M.; Sanchez-Vives, M.V. Enhancing Our Lives with Immersive Virtual Reality. Front. Robot. AI 2016, 3, 74.
  32. Mihelj, M.; Novak, D.; Begus, S. Interaction with a Virtual Environment. In Virtual Reality Technology and Applications; Springer: Dordrecht, The Netherlands, 2014; pp. 205–211.
  33. Muhanna, M.A. Virtual reality and the CAVE: Taxonomy, interaction challenges and research directions. J. King Saud Univ. Comput. Inf. Sci. 2015, 27, 344–361.
  34. Chavan, S.R. Augmented reality vs. virtual reality: Differences and similarities. Int. J. Adv. Res. Comput. Eng. Technol. 2016, 5, 1947–1952.
  35. Rubio-Tamayo, J.L.; Gertrudix Barrio, M.; Garcia Garcia, F. Immersive Environments and Virtual Reality: Systematic Review and Advances in Communication, Interaction and Simulation. Multimodal Technol. Interact. 2017, 1, 21.
  36. Achibet, M.; Marchal, M.; Argelaguet, F.; Lécuyer, A. The Virtual Mitten: A novel interaction paradigm for visuo-haptic manipulation of objects using grip force. In Proceedings of the 2014 IEEE Symposium on 3D User Interfaces (3DUI), Minneapolis, MN, USA, 29–30 March 2014; pp. 59–66.
  37. Deng, S.; Kirkby, J.A.; Chang, J.; Zhang, J.J. Multimodality with eye tracking and haptics: A new horizon for serious games? Int. J. Serious Games 2014, 1, 17–34.
  38. Hekler, E.B.; Klasnja, P.; Froehlich, J.E.; Buman, M.P. Mind the Theoretical Gap: Interpreting, Using, and Developing Behavioral Theory in HCI Research. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’13, Paris, France, 27 April–2 May 2013; ACM: New York, NY, USA, 2013; pp. 3307–3316.
  39. Vines, J.; Clarke, R.; Wright, P.; McCarthy, J.; Olivier, P. Configuring Participation: On How We Involve People in Design. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’13, Paris, France, 27 April–2 May 2013; ACM: New York, NY, USA, 2013; pp. 429–438.
  40. Diefenbach, S.; Kolb, N.; Hassenzahl, M. The ‘hedonic’ in human-computer interaction: History, contributions, and future research directions. In Proceedings of the 2014 Conference on Designing Interactive Systems, Vancouver, BC, Canada, 21–25 June 2014; ACM: New York, NY, USA, 2014; pp. 305–314.
  41. Ruthenbeck, G.S.; Reynolds, K.J. Virtual reality for medical training: The state-of-the-art. J. Simul. 2015, 9, 16–26.
  42. Talasaz, A.; Patel, R.V. Integration of Force Reflection with Tactile Sensing for Minimally Invasive Robotics-Assisted Tumor Localization. IEEE Trans. Haptics 2013, 6, 217–228.
  43. Díaz, I.; Gil, J.J.; Louredo, M. A haptic pedal for surgery assistance. Comput. Methods Programs Biomed. 2014, 116, 97–104.
  44. Jeon, S.; Harders, M. Haptic Tumor Augmentation: Exploring Multi-Point Interaction. IEEE Trans. Haptics 2014, 7, 477–485.
  45. Esteban, G.; Fernández, C.; Conde, M.A.; García-Peñalvo, F.J. Playing with SHULE: Surgical Haptic Learning Environment. In Proceedings of the Second International Conference on Technological Ecosystems for Enhancing Multiculturality, TEEM ’14, Salamanca, Spain, 1–3 October 2014; ACM: New York, NY, USA, 2014; pp. 247–253.
  46. Ruffaldi, E.; Brizzi, F.; Filippeschi, A.; Avizzano, C.A. Co-located haptic interaction for virtual USG exploration. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 1548–1551.
  47. Fortmeier, D.; Wilms, M.; Mastmeyer, A.; Handels, H. Direct Visuo-Haptic 4D Volume Rendering Using Respiratory Motion Models. IEEE Trans. Haptics 2015, 8, 371–383.
  48. Khanal, P.; Vankipuram, A.; Ashby, A.; Vankipuram, M.; Gupta, A.; Drumm-Gurnee, D.; Josey, K.; Tinker, L.; Smith, M. Collaborative virtual reality based advanced cardiac life support training simulator using virtual reality principles. J. Biomed. Informat. 2014, 51, 49–59.
  49. Hamza-Lup, F.G.; Bogdan, C.M.; Popovici, D.M.; Costea, O.D. A Survey of Visuo-Haptic Simulation in Surgical Training. arXiv 2019, arXiv:1903.03272.
  50. Pan, J.J.; Chang, J.; Yang, X.; Liang, H.; Zhang, J.J.; Qureshi, T.; Howell, R.; Hickish, T. Virtual reality training and assessment in laparoscopic rectum surgery. Int. J. Med. Robot. Comput. Assist. Surg. 2015, 11, 194–209.
  51. Rose, T.; Nam, C.S.; Chen, K.B. Immersion of virtual reality for rehabilitation—Review. Appl. Ergon. 2018, 69, 153–161.
  52. Andaluz, V.H.; Salazar, P.J.; Escudero, M.; Bustamante, C.; Silva, M.; Quevedo, W.; Sánchez, J.S.; Espinosa, E.G.; Rivas, D. Virtual reality integration with force feedback in upper limb rehabilitation. In International Symposium on Visual Computing; Springer: Las Vegas, NV, USA, 2016; pp. 259–268.
  53. Won, A.S.; Bailey, J.; Bailenson, J.; Tataru, C.; Yoon, I.A.; Golianu, B. Immersive Virtual Reality for Pediatric Pain. Children 2017, 4, 52.
  54. Wang, D.; Li, T.; Zhang, Y.; Hou, J. Survey on multisensory feedback virtual reality dental training systems. Eur. J. Dent. Educ. 2016, 20, 248–260.
  55. Wang, D.; Zhao, S.; Li, T.; Zhang, Y.; Wang, X. Preliminary evaluation of a virtual reality dental simulation system on drilling operation. Bio-Med. Mater. Eng. 2015, 26, S747–S756.
  56. Kucukyilmaz, A.; Sezgin, T.M.; Basdogan, C. Intention Recognition for Dynamic Role Exchange in Haptic Collaboration. IEEE Trans. Haptics 2013, 6, 58–68.
  57. Donalek, C.; Djorgovski, S.G.; Davidoff, S.; Cioc, A.; Wang, A.; Longo, G.; Norris, J.S.; Zhang, J.; Lawler, E.; Yeh, S.; et al. Immersive and Collaborative Data Visualization Using Virtual Reality Platforms. arXiv 2014, arXiv:1410.7670.
  58. Kim, S.C.; Kwon, D.S. Haptic interaction with objects in a picture based on pose estimation. Multimed. Tools Appl. 2014, 72, 2041–2062.
  59. Kokubun, A.; Ban, Y.; Narumi, T.; Tanikawa, T.; Hirose, M. Representing normal and shearing forces on the mobile device with visuo-haptic interaction and a rear touch interface. In Proceedings of the 2014 IEEE Haptics Symposium (HAPTICS), Houston, TX, USA, 23–26 February 2014; pp. 415–420.
  60. Nakamura, T.; Yamamoto, A. Multi-finger surface visuo-haptic rendering using electrostatic stimulation with force-direction sensing gloves. In Proceedings of the 2014 IEEE Haptics Symposium (HAPTICS), Houston, TX, USA, 23–26 February 2014; pp. 489–491.
  61. Yuksel, T.; Walsh, Y.; Krs, V.; Benes, B.; Ngambeki, I.B.; Berger, E.J.; Magana, A.J. Exploration of affordances of visuo-haptic simulations to learn the concept of friction. In Proceedings of the 2017 IEEE Frontiers in Education Conference (FIE), Indianapolis, IN, USA, 18–21 October 2017; pp. 1–9.
  62. Yuksel, T.; Walsh, Y.; Magana, A.J.; Nova, N.; Krs, V.; Ngambeki, I.; Berger, E.J.; Benes, B. Visuohaptic experiments: Exploring the effects of visual and haptic feedback on students’ learning of friction concepts. Comput. Appl. Eng. Educ. 2019, 27, 1376–1401.
  63. Neri, L.; Magana, A.J.; Noguez, J.; Walsh, Y.; Gonzalez-Nucamendi, A.; Robledo-Rella, V.; Benes, B. Visuo-haptic Simulations to Improve Students’ Understanding of Friction Concepts. In Proceedings of the 2018 IEEE Frontiers in Education Conference (FIE), San Jose, CA, USA, 3–6 October 2018; pp. 1–6.
  64. Prachyabrued, M.; Borst, C.W. Visual feedback for virtual grasping. In Proceedings of the 2014 IEEE Symposium on 3D User Interfaces (3DUI), Minneapolis, MN, USA, 29–30 March 2014; pp. 19–26.
  65. Madan, C.E.; Kucukyilmaz, A.; Sezgin, T.M.; Basdogan, C. Recognition of Haptic Interaction Patterns in Dyadic Joint Object Manipulation. IEEE Trans. Haptics 2015, 8, 54–66.
  66. Wang, Z.; Wang, Y. Haptic Interaction with Fluid Based on Smooth Particles and Finite Elements. In Computational Science and Its Applications; ICCSA 2014 Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2014; pp. 808–823.
  67. Walsh, Y.; Magana, A.J.; Quintana, J.; Krs, V.; Coutinho, G.; Berger, E.; Ngambeki, I.B.; Efendy, E.; Benes, B. Designing a Visuohaptic Simulation to Promote Graphical Representations and Conceptual Understanding of Structural Analysis. In Proceedings of the 2018 IEEE Frontiers in Education Conference (FIE), San Jose, CA, USA, 3–6 October 2018; pp. 1–7.
  68. Magana, A.; Sanchez, K.; Shaikh, U.; Jones, G.; Tan, H.; Guayaquil, A.; Benes, B. Exploring Multimedia Principles for Supporting Conceptual Learning of Electricity and Magnetism with Visuohaptic Simulations. Comput. Educ. J. 2017, 8, 9–23.
  69. Shaikh, U.A.S.; Magana, A.J.; Neri, L.; Escobar-Castillejos, D.; Noguez, J.; Benes, B. Undergraduate students’ conceptual interpretation and perceptions of haptic-enabled learning experiences. Int. J. Educ. Technol. High. Educ. 2017, 14, 15.
  70. Amirkhani, S.; Nahvi, A. Design and implementation of an interactive virtual control laboratory using haptic interface for undergraduate engineering students. Comput. Appl. Eng. Educ. 2016, 24, 508–518.
  71. Lindgren, R.; Tscholl, M.; Wang, S.; Johnson, E. Enhancing learning and engagement through embodied interaction within a mixed reality simulation. Comput. Educ. 2016, 95, 174–187.
  72. Edwards, B.I.; Bielawski, K.S.; Prada, R.; Cheok, A.D. Haptic virtual reality and immersive learning for enhanced organic chemistry instruction. Virtual Real. 2018, 23, 363–373.
  73. Okamoto, S.; Nagano, H.; Yamada, Y. Psychophysical Dimensions of Tactile Perception of Textures. IEEE Trans. Haptics 2013, 6, 81–93.
  74. Kober, S.E.; Neuper, C. Personality and Presence in Virtual Reality: Does Their Relationship Depend on the Used Presence Measure? Int. J. Hum.-Comput. Interact. 2013, 29, 13–25.
  75. Cavrag, M.; Larivière, G.; Cretu, A.M.; Bouchard, S. Interaction with virtual spiders for eliciting disgust in the treatment of phobias. In Proceedings of the 2014 IEEE International Symposium on Haptic, Audio and Visual Environments and Games (HAVE) Proceedings, Richardson, TX, USA, 10–11 October 2014; pp. 29–34.
  76. Odom, W.; Zimmerman, J.; Forlizzi, J. Placelessness, Spacelessness, and Formlessness: Experiential Qualities of Virtual Possessions. In Proceedings of the 2014 Conference on Designing Interactive Systems, DIS ’14, Vancouver, BC, Canada, 21–25 June 2014; ACM: New York, NY, USA, 2014; pp. 985–994.
  77. Bombari, D.; Schmid Mast, M.; Canadas, E.; Bachmann, M. Studying social interactions through immersive virtual environment technology: Virtues, pitfalls, and future challenges. Front. Psychol. 2015, 6, 869.
  78. Ahmed, I.; Harjunen, V.; Jacucci, G.; Hoggan, E.; Ravaja, N.; Spapé, M.M. Reach out and touch me: Effects of four distinct haptic technologies on affective touch in virtual reality. In Proceedings of the 18th ACM International Conference on Multimodal Interaction, Tokyo, Japan, 12–16 November 2016; ACM: New York, NY, USA, 2016; pp. 341–348.
  79. Kyriakou, M.; Pan, X.; Chrysanthou, Y. Interaction with virtual crowd in Immersive and semi-Immersive Virtual Reality systems. Comput. Animat. Virtual Worlds 2017, 28, e1729.
  80. Groten, R.; Feth, D.; Klatzky, R.L.; Peer, A. The role of haptic feedback for the integration of intentions in shared task execution. IEEE Trans. Haptics 2013, 6, 94–105.
  81. Aras, R.; Shen, Y.; Noor, A. Quantitative assessment of the effectiveness of using display techniques with a haptic device for manipulating 3D objects in virtual environments. Adv. Eng. Softw. 2014, 76, 43–47.
  82. Hamam, A.; Saddik, A.E.; Alja’am, J. A Quality of Experience Model for Haptic Virtual Environments. ACM Trans. Multimedia Comput. Commun. Appl. 2014, 10, 28:1–28:23.
  83. Achibet, M.; Girard, A.; Talvas, A.; Marchal, M.; Lécuyer, A. Elastic-Arm: Human-scale passive haptic feedback for augmenting interaction and perception in virtual environments. In Proceedings of the 2015 IEEE Virtual Reality (VR), Arles, France, 23–27 March 2015; pp. 63–68.
  84. Fittkau, F.; Krause, A.; Hasselbring, W. Exploring software cities in virtual reality. In Proceedings of the 2015 IEEE 3rd Working Conference on Software Visualization (VISSOFT), Bremen, Germany, 27–28 September 2015; pp. 130–134.
  85. Moran, A.; Gadepally, V.; Hubbell, M.; Kepner, J. Improving Big Data visual analytics with interactive virtual reality. In Proceedings of the 2015 IEEE High Performance Extreme Computing Conference (HPEC), Waltham, MA, USA, 15–17 September 2015; pp. 1–6.
  86. Atienza, R.; Blonna, R.; Saludares, M.I.; Casimiro, J.; Fuentes, V. Interaction techniques using head gaze for virtual reality. In Proceedings of the 2016 IEEE Region 10 Symposium (TENSYMP), Bali, Indonesia, 9–11 May 2016; pp. 110–114.
  87. Carvalheiro, C.; Nóbrega, R.; da Silva, H.; Rodrigues, R. User redirection and direct haptics in virtual environments. In Proceedings of the 24th ACM International Conference on Multimedia, Amsterdam, The Netherlands, 23–27 October 2016; ACM: New York, NY, USA, 2016; pp. 1146–1155.
  88. Chen, Y.S.; Han, P.H.; Hsiao, J.C.; Lee, K.C.; Hsieh, C.E.; Lu, K.Y.; Chou, C.H.; Hung, Y.P. SoEs: Attachable Augmented Haptic on Gaming Controller for Immersive Interaction. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology, Tokyo, Japan, 16–19 October 2016; ACM: New York, NY, USA, 2016; pp. 71–72.
  89. Matsumoto, K.; Ban, Y.; Narumi, T.; Yanase, Y.; Tanikawa, T.; Hirose, M. Unlimited Corridor: Redirected Walking Techniques Using Visuo Haptic Interaction. In Proceedings of the ACM SIGGRAPH 2016 Emerging Technologies, SIGGRAPH ’16, Macao, China, 5–8 December 2016; ACM: New York, NY, USA, 2016.
  90. Kim, M.; Jeon, C.; Kim, J. A Study on Immersion and Presence of a Portable Hand Haptic System for Immersive Virtual Reality. Sensors 2017, 17, 1141.
  91. Lee, J.; Kim, M.; Kim, J. A Study on Immersion and VR Sickness in Walking Interaction for Immersive Virtual Reality Applications. Symmetry 2017, 9, 78.
  92. Maereg, A.T.; Nagar, A.; Reid, D.; Secco, E.L. Wearable vibrotactile haptic device for stiffness discrimination during virtual interactions. Front. Robot. AI 2017, 4, 42.
  93. Piumsomboon, T.; Lee, G.; Lindeman, R.W.; Billinghurst, M. Exploring natural eye-gaze-based interaction for immersive virtual reality. In Proceedings of the 2017 IEEE Symposium on 3D User Interfaces (3DUI), Los Angeles, CA, USA, 18–19 March 2017; pp. 36–39.
  94. Reski, N.; Alissandrakis, A. Open data exploration in virtual reality: A comparative study of input technology. Virtual Real. 2019, 24, 1–22.
  95. Grane, C.; Bengtsson, P. Driving performance during visual and haptic menu selection with in-vehicle rotary device. Transp. Res. Part F Traffic Psychol. Behav. 2013, 18, 123–135.
  96. Kemeny, A. From driving simulation to virtual reality. In Proceedings of the 2014 Virtual Reality International Conference, Laval, France, 9–11 April 2014; pp. 1–5.
  97. Altendorf, E.; Baltzer, M.; Heesen, M.; Kienle, M.; Weissgerber, T.; Flemisch, F. H-Mode: A haptic-multimodal interaction concept for cooperative guidance and control of partially and highly automated vehicles. In Handbook of Driver Assistance Systems: Basic Information, Components and Systems for Active Safety and Comfort; Springer: Berlin/Heidelberg, Germany, 2014; pp. 1–16.
  98. Mars, F.; Deroo, M.; Hoc, J. Analysis of Human-Machine Cooperation When Driving with Different Degrees of Haptic Shared Control. IEEE Trans. Haptics 2014, 7, 324–333.
  99. Wang, Z.; Zheng, R.; Kaizuka, T.; Nakano, K. Driver-automation shared control: Modeling driver behavior by taking account of reliance on haptic guidance steering. In Proceedings of the 2018 IEEE Intelligent Vehicles Symposium (IV), Changshu, China, 26–30 June 2018; pp. 144–149.
  100. Stamer, M.; Michaels, J.; Tümler, J. Investigating the Benefits of Haptic Feedback During In-Car Interactions in Virtual Reality. In International Conference on Human-Computer Interaction; Springer: Berlin/Heidelberg, Germany, 2020; pp. 404–416.
  101. Aslandere, T.; Dreyer, D.; Pankratz, F. Virtual hand-button interaction in a generic virtual reality flight simulator. In Proceedings of the 2015 IEEE Aerospace Conference, Big Sky, MT, USA, 7–14 March 2015; pp. 1–8.
  102. Li, L.; Zhou, J. Virtual reality technology based developmental designs of multiplayer-interaction-supporting exhibits of science museums: Taking the exhibit of “virtual experience on an aircraft carrier” in China science and technology museum as an example. In Proceedings of the 15th ACM SIGGRAPH Conference on Virtual-Reality Continuum and Its Applications in Industry, Zhuhai, China, 3–4 December 2016; Volume 1, pp. 409–412.
  103. Marayong, P.; Strybel, T.Z.; Robles, J.; O’Connor, R.; Vu, K.P.L.; Battiste, V. Force-Feedback Integration with NASA’s Next Generation Air Transportation System Cockpit Situation Display. J. Air Transp. 2017, 25, 17–26.
  104. Oberhauser, M.; Dreyer, D. A virtual reality flight simulator for human factors engineering. Cogn. Technol. Work. 2017, 19, 263–277.
  105. Valentino, K.; Christian, K.; Joelianto, E. Virtual reality flight simulator. Internetworking Indones. J. 2017, 9, 21–25.
  106. Chen, C.Y.; Chang, B.R.; Huang, P.S. Multimedia augmented reality information system for museum guidance. Pers. Ubiquitous Comput. 2014, 18, 315–322.
  107. Dima, M.; Hurcombe, L.; Wright, M. Touching the Past: Haptic Augmented Reality for Museum Artefacts. In Virtual, Augmented and Mixed Reality. Applications of Virtual and Augmented Reality; Shumaker, R., Lackey, S., Eds.; Springer International Publishing: Cham, Switzerland, 2014; pp. 3–14.
  108. Papaefthymiou, M.; Plelis, K.; Mavromatis, D.; Papagiannakis, G. Mobile Virtual Reality Featuring a Six Degrees of Freedom Interaction Paradigm in a Virtual Museum Application; Institute of Computer Science: London, UK, 2015; Available online: https://fdocument.org/document/mobile-virtual-reality-featuring-a-six-degrees-of-freedom-google-cardboard.html (accessed on 15 August 2021).
  109. Kersten, T.P.; Tschirschwitz, F.; Deggim, S. Development of a virtual museum including a 4D presentation of building history in virtual reality. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 361.
  110. Tsai, T.H.; Shen, C.Y.; Lin, Z.S.; Liu, H.R.; Chiou, W.K. Exploring location-based augmented reality experience in museums. In International Conference on Universal Access in Human-Computer Interaction; Springer: Berlin/Heidelberg, Germany, 2017; pp. 199–209.
  111. Carrozzino, M.; Colombo, M.; Tecchia, F.; Evangelista, C.; Bergamasco, M. Comparing different storytelling approaches for virtual guides in digital immersive museums. In International Conference on Augmented Reality, Virtual Reality and Computer Graphics; Springer: Berlin/Heidelberg, Germany, 2018; pp. 292–302.
  112. Gaugne, R.; Gouranton, V.; Dumont, G.; Chauffaut, A.; Arnaldi, B. Immersia, an open immersive infrastructure: Doing archaeology in virtual reality. Archeol. Calc. 2014, 5, 1–10.
  113. Pietroni, E.; Adami, A. Interacting with virtual reconstructions in museums: The Etruscanning Project. J. Comput. Cult. Herit. 2014, 7, 1–29.
  114. Barbieri, L.; Bruno, F.; Muzzupappa, M. User-centered design of a virtual reality exhibit for archaeological museums. Int. J. Interact. Des. Manuf. 2018, 12, 561–571.
  115. Younes, G.; Kahil, R.; Jallad, M.; Asmar, D.; Elhajj, I.; Turkiyyah, G.; Al-Harithy, H. Virtual and augmented reality for rich interaction with cultural heritage sites: A case study from the Roman Theater at Byblos. Digit. Appl. Archaeol. Cult. Herit. 2017, 5, 1–9.
  116. Perret, J.; Kneschke, C.; Vance, J.; Dumont, G. Interactive assembly simulation with haptic feedback. Assem. Autom. 2013, 33, 214–220.
  117. Qiu, S.; Fan, X.; Wu, D.; He, Q.; Zhou, D. Virtual human modeling for interactive assembly and disassembly operation in virtual reality environment. Int. J. Adv. Manuf. Technol. 2013, 69, 2355–2372.
  118. Xia, P.; Lopes, A.M.; Restivo, M.T. A review of virtual reality and haptics for product assembly: From rigid parts to soft cables. Assem. Autom. 2013, 33, 157–164.
  119. Gonzalez-Badillo, G.; Medellin-Castillo, H.; Lim, T.; Ritchie, J.; Garbaya, S. The development of a physics and constraint-based haptic virtual assembly system. Assem. Autom. 2014, 34, 41–55.
  120. Hamid, N.S.S.; Aziz, F.A.; Azizi, A. Virtual reality applications in manufacturing system. In Proceedings of the 2014 Science and Information Conference, London, UK, 27–29 August 2014; pp. 1034–1037.
  121. Vélaz, Y.; Arce, J.R.; Gutiérrez, T.; Lozano-Rodero, A.; Suescun, A. The influence of interaction technology on the learning of assembly tasks using virtual reality. J. Comput. Inf. Sci. Eng. 2014, 14, 041007.
  122. Abidi, M.H.; Ahmad, A.; Darmoul, S.; Al-Ahmari, A.M. Haptics assisted virtual assembly. IFAC-PapersOnLine 2015, 48, 100–105.
  123. Gavish, N.; Gutiérrez, T.; Webel, S.; Rodríguez, J.; Peveri, M.; Bockholt, U.; Tecchia, F. Evaluating virtual reality and augmented reality training for industrial maintenance and assembly tasks. Interact. Learn. Environ. 2015, 23, 778–798.
  124. Grajewski, D.; Górski, F.; Hamrol, A.; Zawadzki, P. Immersive and haptic educational simulations of assembly workplace conditions. Procedia Comput. Sci. 2015, 75, 359–368.
  125. Radkowski, R.; Herrema, J.; Oliver, J. Augmented Reality-Based Manual Assembly Support with Visual Features for Different Degrees of Difficulty. Int. J. Hum.-Comput. Interact. 2015, 31, 337–349.
  126. Al-Ahmari, A.M.; Abidi, M.H.; Ahmad, A.; Darmoul, S. Development of a virtual manufacturing assembly simulation system. Adv. Mech. Eng. 2016, 8, 1687814016639824.
  127. Wang, X.; Ong, S.; Nee, A. Real-virtual components interaction for assembly simulation and planning. Robot. Comput.-Integr. Manuf. 2016, 41, 102–114.
  128. Xia, P. Haptics for product design and manufacturing simulation. IEEE Trans. Haptics 2016, 9, 358–375.
  129. Ho, N.; Wong, P.M.; Chua, M.; Chui, C.K. Virtual reality training for assembly of hybrid medical devices. Multimed. Tools Appl. 2018, 77, 30651–30682.
  130. Roldán, J.J.; Crespo, E.; Martin-Barrio, A.; Peña-Tapia, E.; Barrientos, A. A training system for Industry 4.0 operators in complex assemblies based on virtual reality and process mining. Robot. Comput.-Integr. Manuf. 2019, 59, 305–316.
  131. Loch, F.; Ziegler, U.; Vogel-Heuser, B. Integrating Haptic Interaction into a Virtual Training System for Manual Procedures in Industrial Environments. IFAC-PapersOnLine 2018, 51, 60–65.
  132. Gurman, M. Facebook’s Oculus Is Developing a New Quest VR Headset. 2020.
  133. Marko, H. The Bidirectional Communication Theory—A Generalization of Information Theory. IEEE Trans. Commun. 1973, 21, 1345–1351.
  134. Laurell, C.; Sandström, C.; Berthold, A.; Larsson, D. Exploring barriers to adoption of Virtual Reality through Social Media Analytics and Machine Learning—An assessment of technology, network, price and trialability. J. Bus. Res. 2019, 100, 469–474.
  135. Heinonen, M. Adoption of VR and AR technologies in the enterprise. In Proceedings of the ISPIM Innovation Conference—Innovation, The Name of The Game, Stockholm, Sweden, 17–20 June 2018.
Figure 1. Paper selection steps.
Figure 2. Temporal analysis charts. (a) Number of papers per domain per year. (b) Number of papers per technology per year.
Figure 3. Frequency of types of interaction per specific domain.
Figure 4. Elbow method applied.
Figure 5. T-distributed Stochastic Neighbor Embedding (t-SNE) visualization of the relationships among papers. The image shows the three clusters identified by the dendrogram and the distribution of papers from each domain.
Figure 6. Dendrogram of the cluster subdivision.
Table 1. Keywords by Domain.

| Domain | Keywords |
|---|---|
| Concepts and overviews | Human–computer interaction, virtual reality, augmented reality, haptic, visualization, and behavioral theories |
| Medicine | Human–computer interaction, virtual reality, augmented reality, haptic, medicine, surgery, training, rehabilitation, and dentistry |
| Physics | Human–computer interaction, virtual reality, augmented reality, haptic, physics, surfaces, object grasping, fluid mechanics, electromagnetism, dynamic systems, astrophysics, and molecular physics |
| Transportation | Human–computer interaction, virtual reality, augmented reality, haptic, transportation, driving, and flight |
| Cultural heritage | Human–computer interaction, virtual reality, augmented reality, haptic, cultural heritage, museum, archaeology, and tourism |
| Industry | Human–computer interaction, virtual reality, augmented reality, haptic, industry, and manufacturing |
| Human factors/User experience design | Human–computer interaction, virtual reality, augmented reality, haptic, user experience, user factors, and product factors |
Table 9. Number of works per domain in each cluster.

| Domain | Cluster 1 (Green) | Cluster 2 (Red) | Cluster 3 (Blue) |
|---|---|---|---|
| Concepts and overviews | 2 | 13 | 0 |
| Medicine | 3 | 14 | 0 |
| Physics | 9 | 0 | 8 |
| Human factors/User experience design | 8 | 0 | 14 |
| Cultural heritage | 0 | 0 | 12 |
| Transportation | 7 | 4 | 0 |
| Industry | 0 | 0 | 18 |
| Cluster sum | 29 | 31 | 52 |
Table 10. Total number of reviewed papers by domain, with the most used display types (DT) and interaction types (IT).

| Domain | Total Works | Year (Number) of Maximum Works | DT 1 | DT 2 | IT 1 | IT 2 |
|---|---|---|---|---|---|---|
| Concepts and overviews | 15 | 2013 (4) | VR | Haptic | Gizmo and tracking (tied) | – |
| Medicine | 17 | 2015 (5) | Haptic | VR | Gizmo and tracking (tied) | – |
| Physics | 17 | 2014 (6) | VR | Haptic | Tracking | Gizmo |
| Human factors/User experience design | 22 | 2017 (6) | VR | Haptic | Gizmo | Tracking |
| Cultural heritage | 12 | 2014 (4) | VR | AR | Gizmo and tracking (tied) | – |
| Transportation | 11 | 2017 (3) | VR | Haptic | Gizmo | Tracking |
| Industry | 18 | 2015 (5) | VR | Haptic | Tracking | Gizmo |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
