Review

A Review of Recent Literature on Audio-Based Pseudo-Haptics

1 maxSIMhealth Group, Ontario Tech University, Oshawa, ON L1G 0C5, Canada
2 School of Information Technology, Carleton University, Ottawa, ON K1S 5B6, Canada
* Author to whom correspondence should be addressed.
Appl. Sci. 2024, 14(14), 6020; https://doi.org/10.3390/app14146020
Submission received: 16 May 2024 / Revised: 20 June 2024 / Accepted: 25 June 2024 / Published: 10 July 2024
(This article belongs to the Special Issue Applied Audio Interaction)

Abstract
Immersive virtual learning environments (iVLEs), particularly serious games and virtual simulations, typically ignore psychomotor skills development, partly due to the difficulty and cost associated with accurately replicating touch. Simulating touch, also known as haptics, requires specialized haptic devices that are not widely accessible at the consumer level. Using visual (and/or auditory) cues, pseudo-haptics aims to mimic touch sensations without haptic devices. Although pseudo-haptics has predominantly focused on visual cues, a 2019 review by Collins and Kapralos on pseudo-haptics emphasized the role of auditory cues and cross-modal interactions. Since then, notably during the COVID-19 pandemic’s shift to remote learning, great advancements have been made. Here, we build upon the work of Collins and Kapralos with a narrative review on audio-based pseudo-haptics. This narrative review explores 17 articles obtained from the Google Scholar, RefSeek, Scopus, and PubMed databases, with the aim of providing a comprehensive summary of the progress in this field since 2019. Pseudo-haptics presents a viable alternative for simulating various aspects of touch, including weight, stiffness, roughness, dampness, force, and glossiness, when haptic devices are unavailable, enhancing immersion and offering the potential to improve psychomotor skill training within iVLEs.

1. Introduction

The technologies of video games, virtual worlds, and social networks have become collectively known as immersive technologies [1] and can blur the boundary between the physical and virtual worlds while immersing the user [2]. Although historically, the strongest demand for immersive technologies has come from the entertainment industry (video game players have generally been early adopters), immersive technologies have found wider applications in industries as diverse as health care, education, the military, and real estate [3]. A growing area of immersive technologies is education/training in the form of immersive virtual learning environments (iVLEs), an umbrella term for innovations such as virtual simulations and serious games. These offer learners an engaging, cost-effective, and safe training environment. The use of iVLEs shifted from a “backburner training tool to a first-choice strategy for ensuring individual, team, and system readiness” as we moved to remote learning to facilitate lockdowns/stay-at-home orders during the COVID-19 pandemic [4]. There are many benefits to remote learning, including the removal of geographical boundaries, flexibility (for educators and learners), and being a more environmentally friendly option (reduced driving to/from educational institutions), and thus remote learning is likely to continue to increase. According to Straits Research, the global virtual learning (e-learning) market was valued at USD 215 billion in 2021 and is projected to reach USD 645 billion by 2030, a compound annual growth rate (CAGR) of 13% over this period [5]. Within the umbrella of virtual learning, the global serious games market was valued at USD 9 billion in 2022 and is projected to reach USD 32.73 billion by 2030 (a CAGR of 18.41%), given the growing demand for interactive learning experiences [6].
Often with many iVLEs, particularly virtual simulations and serious games, the goal is to immerse the user in a scenario that faithfully recreates reality and responds to their actions accordingly. To accomplish this, knowledge of the human senses is imperative. The human senses have been defined as the physiological capacities that provide data for perception [7], allowing humans to establish relations with the physical world [8]. Over 2000 years ago, the philosopher Aristotle, one of the first to systematically examine the human senses, classified the five traditionally recognized and widely accepted senses as sight (vision), hearing (audition), smell (olfaction), taste (gustation), and touch (somatosensation), based on the biological (“sensory”) receptors (specialized neurons) of the human body, and more specifically, the eyes, ears, nose, tongue, and skin, respectively, that respond to specific sensory information [8]. Detection of sensory information by sensory receptors results in a sensation, and the way that this sensory information is organized, interpreted, and consciously experienced is known as perception [9]. Although the five senses have become “conventional wisdom,” many scientists and philosophers have argued that this idea is outdated and inaccurate, and we now know that we can detect stimuli beyond these five traditional senses (e.g., temperature, pain, and balance, amongst others) [10]. However, while humans do perceive through other modalities beyond the five traditional senses, these are often considered extensions or derivatives of the primary senses rather than separate entities [11].
Of the five senses, it has been argued that sight is the most important and most complex. There exists much more research on sight than on the other sensory modalities [12], and the public believes that sight is the most important of the senses despite the limited empirical data to support this [13]. The sense of touch “lies at the heart of our experience of ourselves and the world, it often remains unspoken”, and our reliance on it often goes unnoticed [14]. From the start of human history, touch has helped humans in most of their primordial tasks, becoming an integral part of human nature [15]. Touch serves as the primary means whereby humans interact with their environment [16] and has been described as the sense that cannot be fooled or deceived [17].

1.1. Haptics

The simulation of the sense of touch falls under the field of haptics. As Adilkhanov et al. [18] describe, the word “haptics” (from the Greek word haptikos, “able to come into contact with”) refers to the capability to sense a natural or synthetic mechanical environment through touch. In other words, haptics can be described as the means whereby information is conveyed through touch [19]. Haptics comprises both kinesthetic and cutaneous (tactile) feedback. Kinesthetic feedback refers to the feeling of motion and relates to sensations originating in muscles, tendons, and joints [20], while cutaneous feedback refers to stimuli detected by low-threshold mechanoreceptors under the skin within the contact area [21]. In other words, given the diverse mechanical receptors in human skin, the haptic system allows us to distinguish the shape, texture, and other surface properties of objects when we interact with the real world [22]. In contrast to the visual and auditory systems, capable of providing highly precise spatial and temporal information, respectively, the touch system is particularly effective at processing the material characteristics of surfaces and objects [23]. Haptics encompasses the sensory feedback received through tactile sensations, proprioception (awareness of body position and movement), and kinesthetic feedback (sensations of motion or strain in muscles, tendons, and joints) during physical interactions with objects or environments [24,25,26]. In educational and skill acquisition contexts, understanding haptics is crucial for learning tasks that involve physical manipulation, such as fine motor skills, hand–eye coordination, and spatial awareness [27]. A complete discussion regarding the perception of touch is beyond the scope of this paper. However, Lederman and Klatzky [28] provide a thorough tutorial on touch within the context of a fully active human observer, written specifically for those outside the discipline interested in an introduction to the field.
Accurately simulating touch within virtual environments is difficult, as it requires a realistic perception of pressure, temperature, position, movement, texture, and vibration through actuators embedded in high-end haptic devices that are cost-prohibitive for many applications. Moreover, in contrast to the other senses, touch is not localized to specific body parts (e.g., the eyes or the nose) but is rather distributed across the entire body, through the skin and in the joints, muscles, and tendons [29]. Given the many difficulties associated with simulating the sense of touch in the virtual domain and the limited availability of cost-effective devices to do so, haptics is often ignored, particularly in iVLEs, which predominantly focus on the cognitive and affective learning domains, often ignoring psychomotor skills development altogether as a result.

1.2. Haptic Devices

A haptic device can provide users with a sense of touch [16]. More specifically, a haptic device provides touch feedback that is perceived by applying tactile and/or kinesthetic stimuli to sense and/or manipulate objects with a user input device, allowing the user to interact with virtual or tangible objects [30]. Devices can engage kinesthetic and/or cutaneous feedback to provide users with the feeling of touch [18]. Culbertson et al. [29] divide haptic devices into three major categories: (i) graspable, (ii) wearable, and (iii) touchable. Graspable haptic devices typically rely on kinesthetic (force) feedback, are grounded (e.g., attached to a desk/table), and allow the user to push on them using an end effector held by the user. Wearable haptic devices typically employ tactile (cutaneous) feedback, are mounted on some part of the body (e.g., the hands, in the form of a haptic data glove), and provide feedback directly to the skin via mechanoreceptors. Wearable devices are attractive for mobile applications where the user is required to move within their environment [29]. With touchable haptic devices, the user is able to manipulate the entire surface (e.g., a touch-based monitor/display). As Huang et al. [22] point out, compared to conventional (graspable/grounded) haptic devices, which are cumbersome, lightweight and miniaturized haptic feedback interfaces, including portable, wearable, or even skin-integrated formats, are more suitable for the next generation of human–machine interfaces (including those used in virtual and augmented reality) and electronics. Wearable haptic devices, particularly those that are skin-integrated, are thoroughly reviewed by Huang et al. [22].
Existing haptic solutions can be cumbersome and complex, requiring robotic mechanisms comprised of several actuators, sensors, and processing units [31]. Having such complex systems leads to cost-prohibitive solutions, particularly when high fidelity is required [32]. Simple haptic feedback is widely available at the consumer level through devices that include actuators that provide simple vibration feedback (e.g., video game controller “rumble packs” or mobile phone vibrations). As consumer interest in haptics grows, the application of lower-fidelity consumer-level haptic devices capable of providing force feedback beyond simple vibrations will become more widespread, given their decreasing cost [33]. However, the currently available consumer-level haptic devices cannot provide the higher level of fidelity and the range of motion required to simulate many tasks realistically [34]. Nevertheless, these lower-end devices are proving to be effective in entertainment, design, and educational applications, where higher levels of fidelity may not necessarily be required.
In the absence of adequate haptic devices, haptic sensations can still be invoked by leveraging cross-modal illusions [33]. This emerging field is known as pseudo-haptics and is grounded in the work of Aldridge [35], who showed that the visual representation of a virtual object can affect the integration of touch-based feedback. Pseudo-haptics has been employed to convey various touch-based properties, including friction [36], stiffness [37], mass [38], and texture [39]. Most pseudo-haptics work has focused on visual cues rather than auditory ones (see [40,41]). This requires the use of a display to present the visual information and therefore will not work with interfaces that do not include a display or with visually impaired users [42]. However, sound (and a combination of sound and visual cues) can also be used to induce the illusion of touch. Collins and Kapralos [33] provided an overview of the literature on pseudo-haptics with an emphasis on cross-modal integration and perception across the three domains most common to virtual environments: visuals, sound, and haptics. Although the Collins and Kapralos review is only five years old, during this time the field of pseudo-haptics, and immersive technologies in general, has advanced significantly, particularly considering the vital role immersive technologies played in education across all levels and domains given the abrupt shift to remote learning to facilitate COVID-19 lockdowns [43].
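To make the visual pseudo-haptic mechanism concrete, the sketch below illustrates the classic control/display (C/D) ratio manipulation: slowing the on-screen motion relative to the physical input makes a region feel more resistant or "heavier". This is a minimal sketch; the zone boundaries and gain values are illustrative assumptions, not parameters from any cited study.

```python
# Minimal sketch of the control/display (C/D) ratio technique: the cursor
# covers less screen distance per unit of hand motion inside a "heavy"
# zone, which users tend to perceive as friction or weight. The gain
# value below is an illustrative assumption.

def display_motion(physical_dx: float, in_heavy_zone: bool,
                   cd_ratio_heavy: float = 0.4) -> float:
    """Map physical input displacement to on-screen displacement."""
    gain = cd_ratio_heavy if in_heavy_zone else 1.0
    return physical_dx * gain

for dx, heavy in [(10.0, False), (10.0, True)]:
    print(f"input={dx} px, heavy_zone={heavy} -> display={display_motion(dx, heavy)} px")
```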

1.3. Review Details

In this paper, we provide an overview of multimodal cue pseudo-haptics, with an emphasis on the inclusion of auditory cues, focusing on advancements made after the Collins and Kapralos 2019 review [33]. Our focus on the auditory sensory modality is motivated by the limited research into the role that audio can play in pseudo-haptics, in contrast to the significant amount of research investigating visual-based pseudo-haptics. Furthermore, several reviews focusing on visual-based pseudo-haptics are available, including the review by Lécuyer [44] and the more recent review by Ujitoko and Ban [40].
Our review follows the structure of a narrative review, as outlined by Oxman et al. [45]. A narrative review serves to examine the literature comprehensively, offering a broad summary of the field. This approach is especially beneficial for readers new to the subject, providing them with valuable insights [46]. When crafting this review, we adhered to the principles delineated by Green et al. [47], ensuring alignment with established guidelines for preparing a narrative literature review intended for publication in a peer-reviewed journal. The search was carried out using the Google Scholar, RefSeek, PubMed, and Scopus databases between 23 December 2023 and 5 June 2024. The research query was: (“sound” OR “audio” OR “auditory”) AND (“visual” OR “graphical”) AND (“multimodal interaction” OR “cross-modal interaction”) AND (“pseudo haptics” OR “pseudo-haptics” OR “pseudohaptics”), which initially yielded 391 articles. A more recent search revealed one additional article. While vision-based pseudo-haptics is an important and relevant area of work, we have restricted our scope to audio-based work in this review. After initially reviewing the abstracts of these 391 articles to assess relevance, 52 articles were identified. Following a detailed examination of these 52 articles, 17 were found to be pertinent to our review. A summary of our search process, outlining the flow of information through the phases of our review, is provided in the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flow diagram [48] shown in Figure 1.
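For reproducibility, the boolean query above can be assembled programmatically before being pasted into each database's advanced-search field. The snippet below simply reconstructs the quoted query string; no particular database API is assumed.

```python
# Reconstruct the exact boolean query reported above from its term groups.
audio_terms = ['"sound"', '"audio"', '"auditory"']
visual_terms = ['"visual"', '"graphical"']
interaction_terms = ['"multimodal interaction"', '"cross-modal interaction"']
topic_terms = ['"pseudo haptics"', '"pseudo-haptics"', '"pseudohaptics"']

def group(terms):
    """Wrap a list of terms in parentheses joined with OR."""
    return "(" + " OR ".join(terms) + ")"

query = " AND ".join(group(t) for t in
                     [audio_terms, visual_terms, interaction_terms, topic_terms])
print(query)
```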
The remainder of this paper is organized as follows. Section 2 (the following section) summarizes our review. To provide context and to better motivate our audio-based pseudo-haptics discussion, we begin our review with an overview of audio–haptic interactions, and more specifically, the interaction between auditory and haptic cues whereby the haptic cues are provided using some form of haptic device. The discussion then moves to audio-based pseudo-haptics, beginning with a discussion on audio-only pseudo-haptics followed by a discussion of audio-visual pseudo-haptics. In Section 3, a discussion of the review results is presented in addition to a discussion of the potential of pseudo-haptics in surgical training. Finally, concluding remarks are presented in Section 4.

2. The Review

2.1. Audio–Haptic Interactions

Unlike visual–auditory and visual–haptic interactions, audio–haptic interactions remain significantly underexplored [33]. Collins and Kapralos [33] emphasized that auditory cues can effectively convey information about object size, properties, and texture, particularly through the sounds produced when interacting with the material. Prior to the COVID-19 pandemic, our own work focused on multimodal interactions, beginning with a series of experiments that examined the interactions between sound and visuals within virtual environments and revealed that sound can have significant effects on both visual fidelity perception and task performance, although these effects are subjective [49]. This work was expanded to examine audio–haptic interactions, and more specifically, the effect of sound on haptic fidelity perception within the context of a virtual drilling task; that is, drilling to a pre-defined depth within a virtual block of wood using a virtual drill [50]. Results revealed that sound related to the task (i.e., contextual sound) led to higher haptic fidelity perception. However, a follow-up experiment that examined accuracy in a virtual drilling task found that neither contextual nor non-contextual sound showed any significant difference in drilling accuracy, although accuracy was higher in the presence of sound (contextual or non-contextual) than with no sound at all [50]. It was concluded that further work is necessary to account for the experimental shortcomings, including the lack of a dynamic sound stimulus (e.g., the drilling sound remained constant throughout the entire drilling process) [50].
Lu et al. [51] conducted a study in which they compared recorded sounds and sounds created by participants interacting with different surfaces, testing the realism of these sounds and their impact on participant perception. Eighteen participants were seated at a table and engaged with three sound conditions: (i) recorded sound, (ii) sound generated by the participant as they interacted with the surface, and (iii) white noise to block ambient sounds. In all three conditions, the participants wore headphones. Black curtains and a foam board were used to eliminate visual cues from the texture. The task of the participants was to interact with a textured surface by dragging a tool across it. The tool, equipped with a position and force sensor, recorded both the sound and the user’s motion during the interaction. The participants were then asked to rate 30 conditions (10 textures, each played three times in a random order). Results indicated that while sound alone (Condition i) affected participants’ judgments of the roughness and hardness of the texture, the sound produced as the participant interacted with the textured surface (Condition ii) was most effective at capturing the surface texture’s roughness and hardness compared to haptic cues alone (Condition iii), while slipperiness was primarily determined by haptic cues.
As Chan et al. [52] pointed out, physical interaction with real-world objects elicits haptic feedback, but also causes mechanical excitations that induce vibrations, often leading to audible sounds. They devised a method to synthesize realistic sound and vibrotactile stimuli from geometric representations and object material properties at three different scale levels (macro, meso, and micro) in a virtual environment. Micro-texture surfaces were flat with either a glossy or matte texture, whereas meso- and macro-texture surfaces had single-ridge, bump, sine wave, and sawtooth patterns. Vibrotactile actuators were used to provide the haptic feedback, which was generated by tracking the user’s hand pose. Eight participants were tasked with interacting with various surface textures under three feedback conditions: (i) audio only, (ii) haptic only, and (iii) an audio–haptic combination. The results indicated that the participants were able to identify textures most accurately in the presence of both auditory and haptic cues. The authors concluded that realistic feedback synthesized from virtual objects can enhance virtual environments through visual, auditory, and haptic interactions.
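As an illustration of the geometry-to-signal idea underlying this approach (a sketch, not Chan et al.'s actual algorithm), the following generates a vibrotactile/audio signal by scanning a one-dimensional surface profile at the hand's speed; the sample rate, hand speed, and texture wavelength are assumed values.

```python
import numpy as np

# Illustrative sketch: a 1-D surface profile (here a meso-scale sine-wave
# ridge pattern) is scanned at the hand's speed, and the vibrotactile/audio
# signal is taken as the rate of change of surface height under the
# virtual fingertip. All constants are assumptions.
fs = 8000                      # output sample rate (Hz)
speed = 0.05                   # hand speed (m/s)
duration = 1.0
t = np.arange(int(fs * duration)) / fs
x = speed * t                  # fingertip position along the surface (m)

ridge_wavelength = 0.002       # 2 mm sine-wave texture
height = 1e-4 * np.sin(2 * np.pi * x / ridge_wavelength)

# Faster motion over the same geometry yields higher-frequency vibration,
# which is the core geometry-to-signal idea described above.
vibration = np.gradient(height, 1 / fs)
print(f"dominant vibration frequency ~ {speed / ridge_wavelength:.0f} Hz")
```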
Motivated by the limitations in the perception of graphical information by the visually impaired and the need to substitute the sense of sight with other senses, Maćkowski et al. [53] describe a tablet-based method that presents visual information in the form of a “tactile picture” that incorporates haptic and audio cues to provide contextually selected information. Tactile pictures incorporate raised lines and elevated surfaces that are detectable under the user’s fingers. Although there are guidelines/recommendations regarding the size, shape, and spacing of the lines and elevated surfaces, there are shortcomings with the approach. To help overcome these shortcomings, Maćkowski et al. [53] include auditory cues in the form of speech-based descriptions of the tactile picture (printed with a braille printer placed on the tablet screen) to complement the tactile cues. A user study found that the accompanying auditory descriptions improve the understanding and interpretation of the information presented by the tactile picture. Results also revealed that these descriptions should be 5 to 10 s in duration; when this duration was exceeded, the message was incomprehensible [53].
This section focused on studies that examined the interaction between auditory and haptic feedback using a form of haptic device. Sound can significantly enhance the perception of haptic feedback by providing additional cues that complement the tactile sensations. As described above, auditory stimuli can amplify the realism and intensity of haptic feedback, making virtual interactions feel more authentic. For instance, the sound of an object being manipulated in a virtual environment can reinforce the perceived texture, weight, and resistance of that object, leading to a more cohesive sensory experience (see [33]). This synergy between auditory and haptic feedback is crucial for applications ranging from virtual reality simulations to teleoperation systems, where the fidelity of sensory inputs can affect user performance and satisfaction. Moreover, the temporal synchronization between sound and haptic cues plays a pivotal role: any discrepancy can disrupt the user’s sense of presence and immersion. Therefore, understanding and leveraging the interplay between sound and haptic perception is essential for developing more effective and engaging virtual environments. The influence of sound on haptic perception in virtual environments is a nuanced area of research that underscores the multisensory integration fundamental to immersive experiences. In the subsequent section, we explore the use of auditory feedback to induce pseudo-haptic sensations.
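The synchronization point above can be made concrete with a minimal check: compare the onset times of a haptic pulse and its accompanying sound and flag asynchronies beyond a tolerance window. The 50 ms tolerance below is an assumed placeholder, not a threshold reported in the studies reviewed here.

```python
# Minimal audio-haptic synchrony check; the tolerance is an assumption.
TOLERANCE_S = 0.050

def check_sync(haptic_onset_s: float, audio_onset_s: float) -> bool:
    """Return True when the audio-haptic onset offset is within tolerance."""
    offset = abs(audio_onset_s - haptic_onset_s)
    if offset > TOLERANCE_S:
        print(f"warning: {offset * 1000:.0f} ms asynchrony may break immersion")
        return False
    return True

check_sync(1.200, 1.290)  # 90 ms apart -> flagged
```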

2.2. Audio-Based Pseudo-Haptics

Bosman et al. [54] conducted a qualitative study to explore cross-modal illusions, particularly inducing haptic feedback using auditory cues. A within-subject study was conducted to explore participants’ experience when presented with different auditory combinations while interacting with a block in VR. The study consisted of two sessions: (i) a usability test and (ii) a focus group session. As part of the usability test, 23 participants were tasked with exploring nine virtual rooms, each featuring distinct auditory stimuli, and engaged with tactile blocks (suspended in the air) by rubbing them. A Leap Motion Controller (optical hand tracker) allowed the participants to interact with the virtual objects without physically touching the blocks. The purpose of the usability test was to select “information-rich” participants for the focus group session, based on data collected through a questionnaire, observation, and an interview. In the virtual environment, the participants interacted with these blocks, which responded to touch by returning to their original position. Three auditory stimuli were used during these interactions: (i) pure texture: audio recordings of real-world textures being rubbed, providing a direct tactile experience; (ii) atonal abstract: pitch-bent continuous notes played using sine, triangle, sawtooth, or square waveforms, varying based on hand movement velocity; and (iii) tonal abstract: synthesized notes across all waveforms played across different octaves, simulating minute textural interactions for a pseudo-haptic experience. These auditory strategies aimed to enhance the immersive nature of the virtual reality (VR) simulation and create intuitive mappings between audio and tactile sensations. Participant behaviors were video-recorded, and at the end of the session, participants were asked to complete a questionnaire and participate in a semistructured interview. During the focus group sessions, 10 of the 23 participants, selected based on their usability test scores, engaged in a moderately structured discussion guided by pre-defined questions. These questions aimed to probe the participants’ experience and the sensations they felt while in the virtual environment. The results indicated that believable, real-world-like auditory and visual feedback, along with a consistent and immersive VR environment, were necessary to induce a pseudo-haptic sensation.
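A minimal sketch of the kind of mapping used in the "atonal abstract" condition, where hand-movement velocity bends the pitch of a continuous tone; the base frequency and bend factor below are assumptions, not values from Bosman et al. [54].

```python
# Hypothetical velocity-to-pitch mapping in the spirit of the "atonal
# abstract" condition; constants are illustrative assumptions.
def bent_pitch(hand_speed: float, base_hz: float = 220.0,
               bend_hz_per_unit: float = 300.0) -> float:
    """Oscillator frequency (Hz) for a given hand speed (m/s)."""
    return base_hz + bend_hz_per_unit * hand_speed

for speed in (0.0, 0.2, 0.5):  # m/s, illustrative
    print(f"hand speed {speed} m/s -> tone at {bent_pitch(speed):.0f} Hz")
```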
Kaneko et al. [42] presented a novel method to vary the sensation of heaviness by modifying the auditory feedback presented to users in response to user input. Their method manipulated the response delay, frequency, and loudness of a sound. A total of 131 participants were asked to rate the heaviness of two squares displayed on their screen. Clicking a button associated with each of the two squares played one of the sound conditions. The sound conditions included two frequencies, (i) 200 Hz and (ii) 400 Hz, accompanied by one of five delay conditions: (i) 0 ms, (ii) 100 ms, (iii) 200 ms, (iv) 300 ms, and (v) 500 ms. The output sound, consisting of one frequency and one delay condition, was randomly assigned to each of the two buttons. A control condition without any sound was also included. Eleven sound conditions (two frequencies × five delays + one no-sound condition) were considered, each repeated four times for a total of 44 trials. After the sound conditions were played from the two squares, the participants were asked to compare the two squares and rate which one they perceived to be heavier on a 5-point Likert scale (1 being that the square on the left was heavier and 5 being that the square on the right was heavier). Results showed significant differences between the two frequencies, with lower frequencies being perceived as heavier. Moreover, there was a significant effect of delay time, with participants perceiving a stronger heaviness sensation with increased delay. However, no significant interaction was found between frequency and delay. In a similar experiment, Kaneko et al. [42] also compared the perceived heaviness of two square buttons, although they now manipulated the sound’s loudness instead of its frequency. They found that louder sounds were perceived as significantly heavier. Interestingly, the tail end of the sound appeared to play a critical role in perceived heaviness: in comparison to the onset of the sound, the tail end yielded significant results. The researchers concluded that, by changing the visual and/or auditory stimuli, haptic feedback can be simulated without the use of expensive haptic devices. This series of studies confirmed the effectiveness of their method, indicating that the heaviness sensation conventionally induced by modulating visual feedback can also be created by modulating auditory feedback.
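The stimulus manipulation can be sketched as follows: a button press triggers a tone whose frequency, onset delay, and amplitude are varied. Only the 200/400 Hz frequencies and the 0–500 ms delays come from the study description; the sine waveform, tone duration, and gain values are assumptions.

```python
import numpy as np

# Sketch of the parameters manipulated by Kaneko et al.: frequency, onset
# delay, and loudness of a button-press tone. Waveform, duration, and
# gain values are assumptions.
fs = 44100

def button_sound(freq_hz: float, delay_ms: float, gain: float,
                 tone_dur_s: float = 0.2) -> np.ndarray:
    """Return a delayed sine tone representing one sound condition."""
    silence = np.zeros(int(fs * delay_ms / 1000))
    t = np.arange(int(fs * tone_dur_s)) / fs
    tone = gain * np.sin(2 * np.pi * freq_hz * t)
    return np.concatenate([silence, tone])

# Per the reported findings, the lower-frequency, longer-delay, louder
# variant should tend to be judged "heavier" than its counterpart.
light = button_sound(freq_hz=400, delay_ms=0,   gain=0.3)
heavy = button_sound(freq_hz=200, delay_ms=500, gain=0.9)
```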
Zhexuan and Zhong [55] explored the relationship between various sound parameters, and more specifically, pitch, waveform shape (square, sawtooth, triangle, and sine), attenuation duration, and artificial harmonics (sound generated by a second oscillator to enrich the sound produced), and material sensation. The study comprised seven experiments. Experiments 1 and 2 consisted of trials to determine the range and the specific parameters used for the remaining five experiments. In Experiment 1, the participants were asked to listen to different tracks to determine the range of parameters required for the remaining experiments, and those who succeeded were asked to judge whether the 22 sounds heard were percussive or not. Experiment 2 required the participants to describe the material they heard in the audio. Experiment 3 examined the effect of pitch on weight perception, where the participants were asked to rate the weight of six sounds with different pitches on a 5-point Likert scale. Experiment 4 examined different waveforms at various pitches and their effect on roughness, where participants listened to 30 sounds (five waveforms evaluated at six different pitches) and ranked the “prickliness” of each sound on a 5-point Likert scale. Experiment 5 examined the effect of attenuation duration at different pitches on dampness. More specifically, participants evaluated 54 sounds (nine sounds with different attenuation times at six different pitches) and ranked dampness on a 5-point Likert scale. Experiment 6 explored the effect of combining different pitches and artificial harmonics on the perception of glossiness. Here, 11 sounds with different pitches and artificial harmonics were evaluated and ranked for glossiness on a 5-point Likert scale. The purpose of Experiment 7 was to test the feasibility of the results of the prior experiments; participants were asked to match a sound to a picture of a material and explain their reasoning. In total, 80 participants were recruited for Experiments 1 and 3–6, while 100 participants were recruited for Experiments 2 and 7. The results of Experiment 3 indicated that low-pitched sounds were perceived as heavier than higher-pitched sounds. Experiment 4 indicated that pink noise (sound that carries the same energy across each octave) maintained its perceived roughness at different pitches. Moreover, the relative roughness of the square and sawtooth waveforms appeared to decrease as pitch increased, whereas the perceived roughness of the triangle and sine waveforms remained relatively constant regardless of pitch. The results of Experiment 5 revealed a correlation between attenuation time, pitch, and dampness sensation, with lower frequencies and shorter attenuation durations increasing the perception of dampness. Experiment 6 showed that pitch influenced the perception of glossiness, and although participants had trouble identifying changes in the artificial harmonics (sound enrichment), they believed that increasing the artificial harmonic pitch enhanced perceived glossiness. Experiment 7 tested the validity of the previous six experiments: 100 participants were tasked with matching sound effects to pictures of real materials based on the perceived sensation. A 53% accuracy and a 78% comprehension rate were observed, leaving room for improvement.
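A sketch of the waveform-shape and attenuation parameters explored in these experiments: the exponential decay envelope below stands in for the attenuation-duration manipulation and is an assumption, as are the specific pitch and decay values.

```python
import numpy as np
from scipy import signal

# Sketch of the sound parameters explored by Zhexuan and Zhong: waveform
# shape and attenuation (decay) time at a given pitch. The exponential
# envelope is an assumed stand-in for their attenuation manipulation.
fs = 44100

def percussive_tone(shape: str, freq_hz: float, decay_s: float) -> np.ndarray:
    """Return a decaying tone of the given waveform shape."""
    t = np.arange(int(fs * decay_s * 3)) / fs
    phase = 2 * np.pi * freq_hz * t
    wave = {
        "sine": np.sin(phase),
        "square": signal.square(phase),
        "sawtooth": signal.sawtooth(phase),
        "triangle": signal.sawtooth(phase, width=0.5),
    }[shape]
    return wave * np.exp(-t / decay_s)  # shorter decay_s reads as "damper"

# e.g., per Experiment 5, a low pitch with a short attenuation time tended
# to increase perceived dampness.
damp = percussive_tone("sine", freq_hz=150, decay_s=0.05)
```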
Focusing on the influence of sound on visual cues using head-mounted displays (HMDs), Malpica et al. [56] were interested in material perception in virtual reality, with a particular focus on how sound might impact the perception of virtual material textures. In one experiment, they sought to determine whether the presence of a collision sound could alter the perceived appearance of a material in virtual environments, focusing on what they called low-level perceptual traits (soft/hard, glossy/matte, and rough/smooth) and high-level descriptors such as “plastic-like” or “fabric-like”. Their results suggested that the effect of sound in material identification tasks was more relevant when using low-fidelity visual stimuli.
Driven by the COVID-19 pandemic and the resulting move to remote learning, our own earlier sound–haptic interaction work served as a motivator to shift our research focus to audio-based pseudo-haptics as a cost-effective method to facilitate psychomotor skills development without the use of any haptic devices. Our first study, conducted entirely online during the COVID-19 lockdowns, examined whether a virtual drilling task could be appropriately simulated using combinations of sound and kinesthetic stimuli obtained with the movement of a basic 2D mouse in the absence of a haptic device [57]. Drilling is a common task used across many domains, including medicine (e.g., dentistry and orthopedic surgery). Participants were asked to drill to a pre-defined depth in a virtual block of wood under various auditory conditions using a standard computer mouse or keyboard (i.e., commonly available computer hardware and devices). Figure 2 provides a view of the visual scene presented to the participants. Although only a pilot study based on the results of just 13 participants, with low statistical power, the results showed that sound feedback alone is not enough to provide adequate drilling feedback. However, when coupled with kinesthetic feedback using a simple computer mouse, sound can convey pseudo-haptic cues and simulate a simple virtual drilling task [57].
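To illustrate how mouse-based kinesthetic input and sound can be combined in such a task, the sketch below maps mouse drag distance to drill depth and raises the feedback tone's pitch as the bit approaches the target depth. This is a sketch under assumed constants, not the actual study software.

```python
# Hypothetical audio-based pseudo-haptic drilling loop: mouse drag depth
# drives a virtual drill, and the feedback tone's pitch rises with
# progress toward the target depth. All constants are assumptions.
TARGET_DEPTH_MM = 20.0
MM_PER_PIXEL = 0.1  # assumed mapping from mouse movement to drill depth

def drill_feedback(drag_pixels: float):
    """Return (depth, feedback tone frequency, done flag) for a mouse drag."""
    depth = min(drag_pixels * MM_PER_PIXEL, TARGET_DEPTH_MM)
    progress = depth / TARGET_DEPTH_MM
    freq_hz = 300 + 400 * progress  # pitch rises with depth (assumed mapping)
    done = depth >= TARGET_DEPTH_MM
    return depth, freq_hz, done

for px in (50, 150, 250):
    print(drill_feedback(px))
```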

2.3. Visual–Auditory-Based Pseudo-Haptics

Speicher et al. [58] examined menu control using either a 2D panel or a pseudo-haptic interface based on physical metaphors (buttons/knobs, switches, and sliders) in VR, using a Leap Motion Controller mounted on an HMD for hand tracking. The 2D menu interface served as a control based on common touchscreen and mobile user interfaces, implemented using standard controls without depth. The pseudo-haptic user interface presented menu controls with pseudo-haptic feedback to enhance users’ experience when interacting with button, slider, and switch controllers (i.e., pushing a protruding button/knob or pulling a lever) using auditory and visual feedback. A within-subject experiment was conducted to compare the two interfaces with respect to their influence on user experience, workload, sickness, and immersion. Results showed that the pseudo-haptic user interface was preferred in all aspects. Although 2D menu control interfaces are simple, widely accepted, and familiar to many given their widespread use in smartphones and desktop computers, the authors conclude that pseudo-haptic-based controls are the better option for VR menu interfaces.
Eckhoff et al. [59] created an augmented reality (AR) simulation that enabled 12 participants to experience their own hand burning using only visual and auditory cues. Physiological data (in particular, heart rate (HR) and galvanic skin response (GSR)) and responses to a self-reported questionnaire examining the participants’ anxiety levels, presence, and immersion were collected. Half of the participants reported that they experienced a heat sensation. Although the anxiety questionnaire results did not show any significance, participants experienced a significant increase in skin conductance when observing their hand burning in the simulation. Moreover, those who reported experiencing the heat sensation exhibited a higher skin conductance response.
Haruna et al. [60] compared three modalities (sound (hearing), vibration (touch), and light (vision)) for providing pseudo-haptic information (grasping force) when in contact with an object. They built a prototype of a remote machine and implemented visual–haptic feedback under conditions similar to real-life operation. A loudspeaker and a vibration motor were attached to the gripper control interface, while an LED was attached to the tip of the robot gripper. A study with 20 participants was conducted that involved grasping and carrying objects for one minute using each of the different feedback modalities; the control condition consisted of no feedback. Results indicated that although all modalities reduced the grasping force compared to the control, light (visual-based pseudo-haptics) achieved the most significant reduction in grasping force, possibly because participants found the auditory feedback noisy. Despite this, auditory feedback was found to increase information flow to the brain, suggesting an impact on task performance.
Kang et al. [61] conducted a study to investigate the effect of visual and auditory cues on the perception of roughness and stiffness when interacting with virtual objects in active and passive manners. Forty participants were recruited and underwent active-touch and passive-touch tasks with the objects while wearing a mixed-reality (MR) headset. Active touch refers to haptic feedback generated by the participant touching an object, whereas passive touch is the haptic feedback generated by being touched by an object. The active-touch task involved rotating a virtual cylinder to assess roughness perception, while the passive-touch task required sensing the movement of a vibratory cube to evaluate stiffness perception. The study implemented various auditory and visual cues, and the ordering of the required tasks was counterbalanced. Visual cues included motion speed (either slow or fast), while auditory cues included frequency manipulation (no change, low-pass, or high-pass filtering). Under active touch, both visual and auditory cues influenced roughness perception, with objects displaying slower motion and low-frequency-accentuated sounds leading to higher perceived roughness ratings. Moreover, post hoc tests for auditory cues indicated significant differences between the three frequency conditions considered. In the passive-touch task, only auditory cues had an effect, with low-frequency-accentuated sounds in particular perceived as stiffer. Post hoc tests revealed significant differences between the high and medium sound levels and between the high and low sound levels. The manipulation of visual and auditory cues can thus influence participants’ perception of the physical characteristics of virtual objects, suggesting that multisensory information processing can enable the perception of haptic sensations.
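The auditory-cue manipulation can be sketched by rendering the same contact sound unfiltered, low-pass filtered, or high-pass filtered; the 1 kHz cutoff, filter order, and noise-burst stand-in below are assumptions rather than parameters from Kang et al. [61].

```python
import numpy as np
from scipy import signal

# Sketch of the frequency manipulation: the same contact sound rendered
# with no filtering, a low-pass filter, or a high-pass filter. Cutoff,
# filter order, and the noise-burst stimulus are assumptions.
fs = 44100
rng = np.random.default_rng(0)
contact = rng.standard_normal(fs // 2)  # noise burst standing in for a contact sound

def filtered_cue(x: np.ndarray, mode: str, cutoff_hz: float = 1000.0) -> np.ndarray:
    """Return the cue unfiltered ('none'), low-passed ('low'), or high-passed ('high')."""
    if mode == "none":
        return x
    b, a = signal.butter(4, cutoff_hz / (fs / 2), btype="low" if mode == "low" else "high")
    return signal.lfilter(b, a, x)

# Per the reported results, low-frequency-accentuated ("low") versions were
# associated with higher perceived roughness/stiffness.
cues = {m: filtered_cue(contact, m) for m in ("none", "low", "high")}
```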
Puértolas Bálint et al. [62] developed a simulation to enable medical students to train during the COVID-19 pandemic by performing procedures on a virtual patient. Using a Leap Motion Controller mounted on an HMD, they tracked and estimated the user’s body position in real time and used the body’s gestures as input to control the user’s avatar. To further deepen immersion, they integrated percussion sounds to enable the perception of the outline of the virtual patient’s internal organs (e.g., heart, lungs, and liver) and to determine the shape and size of the organs. Additionally, they included normal organic sounds, such as heart percussion sounds and sounds typically emanating from the chest cavity. When the user’s hand collides with an organ, a percussion sound plays, and the intensity of this sound varies according to the collision speed. To enhance realism, they incorporated visual-based pseudo-haptics to create the impression of the sense of touch when the user interacted with the virtual patient’s organs. Although no formal experimentation was conducted, the authors believe this approach has significant potential for teaching students during the pandemic.
Desnoyers-Stewart et al. [63] explored the embodied social touch illusion, which provides the ability to “touch” another person in a virtual environment, using pseudo-haptics in VR. In their prototype, each of two users had a corresponding particle-based avatar and was able to interact with the other user via these avatars; more specifically, the avatars were able to hold hands or high-five each other. All of the haptic interactions were conducted using pseudo-haptics and included: (i) proximity interactions, such as visual warmth, where the particles comprising the avatar become redder to indicate warmth; particle attraction, where the particles of the two avatars are attracted to each other the closer the avatars get; and a tremolo flute, an auditory response that plays as the avatars get closer together; (ii) contact interactions, where visual feedback (small fireworks and sparks) and auditory feedback (a ping sound) are presented when the two avatars make contact; and (iii) resistance, which prevents the avatars from passing through each other by incorporating physics colliders and audio force feedback, more specifically, a hum that grows louder with increased force. It was observed that although pseudo-haptics cannot replace ordinary touch, it was able to provide a subtle alternative for representing the perception of touch during social interaction in a virtual environment.
Kurzweg et al. [64] examined the simulation of a vibrating virtual object using visual and auditory cues in an AR environment. The task of the 18 participants was to touch a virtual white plate to activate different conditions. These conditions included: (i) visual (different levels of blurring of the virtual object), (ii) auditory (different auditory frequencies), and (iii) audiovisual (a combination of the two stimuli in ascending order). When the virtual object’s edge was blurred to either 0.4 cm or 0.6 cm and accompanied with sound at a frequency of 265 Hz or 966 Hz, participants perceived a vibrating object.
Lee et al. [65] examined improving the immersion, attractiveness, and intuitiveness of VR, AR, and the metaverse using cross-modal (audio and visual)-driven pseudo-haptics for fundamental tasks such as clicking, scrolling, object manipulation, and zooming in and out. Ten participants took part in a study comparing conventional (visual and auditory) and multimodal (visual, audio, and haptic) feedback when conducting virtual activities in AR, evaluated along six workload dimensions (mental, physical, temporal, performance, effort, and frustration). It was found that cross-modal-driven pseudo-haptic interaction resulted in higher performance on all the measured parameters, i.e., intuitiveness, attractiveness, and immersion.

3. Discussion

With the rapid advancement of technology, immersive technologies are experiencing unprecedented popularity. These technologies not only serve entertainment purposes but also find extensive applications in training in the form of immersive virtual learning environments (iVLEs), serious games and virtual simulations in particular. With respect to iVLEs, a notable gap exists in their focus, with most emphasizing cognitive and affective skill development due to the challenge of simulating touch, inherent in psychomotor skills, which requires specialized and cost-prohibitive haptic devices that are not widely accessible at the consumer level. To address this limitation, the field of pseudo-haptics has emerged.
Pseudo-haptics leverages alternative sensory modalities, primarily vision, to replicate the tactile experience without relying on physical haptic devices. This approach has been successfully applied across various domains [66] to provide, for example, sensations of weight, stiffness, roughness, reliefs, slipperiness, and stickiness [67,68,69]. Various reviews have considered visual-based pseudo-haptics, including Crandall and Karadoğan [70], who explored best design practices to develop haptic simulations for learning, Bermejo and Hui [71], who focused on haptic technology for mobile AR, Bouzbib et al. [72], who focused on haptic solutions in VR, and Seinfeld et al. [36], who focused on user representation in virtual environments. These specific reviews concluded that by manipulating visual cues, the users were able to perceive alteration in the sense of touch, such as changes in weight, force, friction, and stiffness. A similar conclusion was also reached by Hatzfeld and Kern [73], who examined haptics as an interaction modality. A thorough review is also provided by Ujitoko and Ban [40], who discuss the design of visual-based pseudo-haptic applications for a variety of applications, including training, assistance, and entertainment.
Although the majority of pseudo-haptics work has focused on visual cues, Collins and Kapralos [33] underscored the potential of inducing pseudo-haptics using sound or a combination of auditory and visual cues in their review published in 2019. Expanding on this insight, our narrative review delved into the existing literature exploring audio-based or audiovisual-based pseudo-haptics, analyzing 17 relevant studies. These studies were conducted across various levels of realism and across a range of platforms, including PC, AR, VR, and mixed reality (MR). A summary of the findings of our review is presented in Table 1. Overall, the results strongly support the efficacy of audio-based and audiovisual-based pseudo-haptics. The results also suggest that manipulating factors such as frequency, pitch, loudness, delay, and attenuation time can alter perceptions of weight, stiffness, roughness, dampness, force, and glossiness. Playing recorded audio representations of textures successfully conveyed the impression of different surfaces. Furthermore, several studies indicate that integrating sound and/or visual stimuli to simulate haptic sensations enhances immersion, realism, and overall performance. In their review, Lim et al. [74] examined weight perception in VR using multimodal illusions, particularly visual–audio interaction. They found that the auditory perception of weight was influenced by changes in pitch, different footstep sounds, loudness, and a faster beat to increase force magnitude, as well as lower frequency. Although Lim et al. [74] focused on perceived weight changes, their overall findings align with those observed in this review. Bermejo and Hui [71] described how sound can influence gesture perception, indicating that users can learn gestures associated with specific sounds and kinesthetic force in AR.
In essence, the integration of sound and visual stimuli holds immense promise in enhancing immersion, realism, and overall performance in iVLEs, marking a pivotal step towards bridging the gap in psychomotor skill development. Further research and innovation in this field are poised to unlock even greater potential in revolutionizing the way we interact with virtual environments and train for real-world tasks. With the goal of making pseudo-haptics available to a larger audience of both scientists and practitioners, Pusch and Lécuyer [75] devised a generic yet practical set of system design recommendations meant to facilitate the advancement in the field of pseudo-haptics for user interface researchers and practitioners. As Meyer et al. [76] point out, the success of haptic simulations seems to be influenced by the type of parameter being simulated and the amount of deviation between the physical haptic input and virtual audiovisual input, and since these parameters’ deviation thresholds are not clear, a set of “best practices” for sensory simulations in VR applications would be helpful.

Applications to Medical Training

The integration of pseudo-haptics in medical, and more specifically, surgical training holds significant promise, particularly in the realm of orthopedic bone drilling. This skill is fundamental across various surgical procedures, encompassing tasks such as the reduction of complex fractures, bone plating, and spinal adjustments. Orthopedic bone plating, for instance, involves the application of a metal plate to stabilize fractured bones, crucial for promoting bone healing and restoring structural integrity [77]. Notably, one application is in the reduction of complex radial fractures, where precise drilling and fixation are paramount for optimal outcomes.
During bone plating surgery, surgeons must navigate many factors simultaneously, including the anatomy of the radial bone and its surrounding structures, such as nerves, blood vessels, tendons, and neighboring bones [78]. A misstep in drilling can have grave consequences, especially if the second cortex of the bone is breached, potentially leading to injury of vital structures like the radial nerve and artery. This highlights the criticality of precision and control during surgical drilling to safeguard patient safety and minimize postoperative complications [79].
Research by Dubrowski and Backstein [80] revealed insights into the psychomotor control differences between novice and experienced surgeons during bone drilling. Their studies highlighted the impact of distracting noise on surgical performance, particularly in integrating auditory feedback associated with drilling sounds. Novices exhibited higher plunge depths compared to intermediate trainees and experienced surgeons, indicating the importance of auditory cues in guiding drilling motions [81]. Furthermore, experts demonstrated the ability to rapidly integrate haptic and auditory information to anticipate and control drilling actions effectively. They used a feedback loop, mapping haptic and auditory cues from the first cortex to anticipate drilling cessation upon reaching the second cortex, thereby minimizing plunge depth and mitigating patient risk [81]. The findings suggest that the incorporation of pseudo-haptics can enhance the development of sensory mappings crucial for surgical expertise. By simulating real-world surgical scenarios with immersive sensory feedback, pseudo-haptics may accelerate the acquisition of surgical skills and improve procedural outcomes. Thus, the integration of pseudo-haptics holds immense potential in augmenting surgical training, particularly in orthopedic bone drilling, by providing a dynamic learning environment that mimics the complexities of surgical practice.
Expanding upon the role of pseudo-haptics in surgical training, it is essential to consider the broader implications and potential applications across various surgical disciplines. While orthopedic bone drilling serves as a prime example, pseudo-haptics can also be leveraged in other surgical contexts, including laparoscopic procedures, neurosurgery, and cardiovascular interventions. In laparoscopic surgery, for instance, pseudo-haptic feedback can enhance trainees’ ability to manipulate surgical instruments and perform delicate maneuvers within the confined space of the abdomen. By simulating the tactile sensations of tissue manipulation and organ interaction, pseudo-haptics can bridge the gap between traditional laparoscopic training simulators and real surgical scenarios, allowing trainees to develop proficiency in minimally invasive techniques. Similarly, in neurosurgery, where precision and spatial awareness are paramount, pseudo-haptics can provide trainees with realistic feedback on the manipulation of delicate neural structures and the placement of surgical instruments. By integrating haptic and auditory cues, trainees can refine their motor skills and develop a nuanced understanding of neuroanatomy, ultimately improving patient outcomes and reducing the risk of intraoperative complications. In cardiovascular interventions, such as coronary artery bypass grafting and percutaneous coronary interventions, pseudo-haptic feedback can enhance trainees’ ability to navigate complex vascular anatomy and perform intricate procedures with precision. By simulating the tactile sensations of catheter manipulation and vessel engagement, pseudo-haptics can improve trainees’ spatial orientation and procedural dexterity, leading to safer and more effective interventions.
Overall, the integration of pseudo-haptics in surgical training may represent a paradigm shift in medical education, offering trainees a novel and economic approach to acquiring surgical skills in a safe, standardized, and immersive environment. Using pseudo-haptics to simulate the tactile sensations and auditory feedback encountered during real surgical procedures, pseudo-haptics can accelerate the learning curve, improve procedural proficiency, and ultimately enhance patient outcomes across a wide range of surgical specialties.

4. Conclusions

This overview examined the recent use of haptics-based applications in virtual environments, with a specific emphasis on audio-based pseudo-haptics. Pseudo-haptics uses auditory and/or visual cues to simulate the perception of touch, a challenging sense to replicate in virtual environments without dedicated haptic devices. The manipulation of auditory cues has been shown throughout the literature to result in perceptual adjustments in haptic attributes such as material texture, weight, roughness, dampness, and stiffness. Consistent with decades of research into cross-modal phenomena in the physical world, the use of audio in the virtual world has been shown repeatedly to influence haptic perception.
While the focus of much past pseudo-haptics research has been on the use of visuals, sound is a growing area of important research, showing that other sensory inputs also influence our haptic perception. However, there is still considerable work to be done. Many questions remain as to how best to leverage sound to compensate for the loss or reduced fidelity of haptic stimulation. There is little work, for instance, on differences across user populations (ages, genders, cultures, sensory abilities, and so on). Understanding these differences could help in designing more universally effective systems or allow personalization along different parameters. Moreover, what are the optimal characteristics of audio cues (e.g., frequency, amplitude, temporal dynamics) that best enhance the perception of different aspects of pseudo-haptics? Identifying these parameters can aid in creating more realistic and convincing tactile illusions. We also know little about the critical timing thresholds for synchronizing audio cues with visual and proprioceptive feedback to maximize the illusion of haptic feedback. Furthermore, all of the studies presented took place over short time periods. How do users adapt to audio-based pseudo-haptics over extended periods, and what are the long-term effects on their sensory perception and motor skills? Longitudinal studies could reveal potential benefits or drawbacks of prolonged use.
The surge in popularity of immersive technologies, particularly driven by the shift to virtual learning during the COVID-19 pandemic, has seen the widespread adoption of pseudo-haptics across diverse fields. In areas such as medical training, where the acquisition of tactile skills is paramount, pseudo-haptics has emerged as a valuable tool. While it cannot completely substitute for intricate haptic feedback, pseudo-haptics serves as a viable alternative when sophisticated haptic devices are unavailable.

Author Contributions

Conceptualization, S.A., B.K., A.D. and K.C.; methodology, S.A., B.K., A.D. and K.C.; formal analysis, S.A., B.K., A.D. and K.C.; investigation, S.A., B.K., A.D. and K.C.; writing—original draft preparation, S.A. and B.K.; writing—review and editing, S.A., B.K., A.D. and K.C.; supervision, B.K. and A.D.; project administration, B.K. and A.D.; funding acquisition, B.K. and A.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Natural Sciences and Engineering Research Council of Canada (NSERC) in the form of a Discovery grant to B. Kapralos, the Ontario Tech University Research Excellence Chair to B. Kapralos, and through an Ontario Graduate Scholarship to S. Abdo.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Wortley, D. The Future of Serious Games and Immersive Technologies and Their Impact on Society. In Trends and Applications of Serious Gaming and Social Media; Baek, Y., Ko, R., Marsh, T., Eds.; Gaming Media and Social Effects; Springer: Singapore, 2014; pp. 1–14. ISBN 978-981-4560-25-2. [Google Scholar]
  2. Suh, A.; Prophet, J. The State of Immersive Technology Research: A Literature Analysis. Comput. Hum. Behav. 2018, 86, 77–90. [Google Scholar] [CrossRef]
  3. Hall, S.; Takahashi, R. Augmented and Virtual Reality: The Promise and Peril of Immersive Technologies. In Proceedings of the World Economic Forum, Davos-Klosters, Switzerland, 17–20 January 2017; Volume 2. [Google Scholar]
  4. Brydges, R.; Campbell, D.M.; Beavers, L.; Khodadoust, N.; Iantomasi, P.; Sampson, K.; Goffi, A.; Caparica Santos, F.N.; Petrosoniak, A. Lessons Learned in Preparing for and Responding to the Early Stages of the COVID-19 Pandemic: One Simulation’s Program Experience Adapting to the New Normal. Adv. Simul. 2020, 5, 8. [Google Scholar] [CrossRef]
  5. Straits Research. E-Learning Market to Reach USD 645 Billion in Market Size by 2030, Growing at a CAGR of 13%: Straits Research. Available online: https://www.infoprolearning.com/blog/elearning-localization-your-secret-code-to-reach-the-global-audience/ (accessed on 24 June 2024).
  6. Zion Market Research. Serious Games Market Size, Share, Growth Report 2030. Available online: https://www.zionmarketresearch.com/report/serious-games-market (accessed on 24 June 2024).
  7. Mather, G. Foundations of Sensation and Perception; Psychology Press: London, UK, 2016; ISBN 978-1-317-37255-4. [Google Scholar]
  8. Dubois, D.; Cance, C.; Coler, M.; Paté, A.; Guastavino, C. Sensory Experiences: Exploring Meaning and the Senses; John Benjamins Publishing Company: Amsterdam, The Netherlands, 2021; Volume 24. [Google Scholar]
  9. Spielman, M.R.; Dumper, K.; Jenkins, W.; Lacombe, A.; Lovett, M.D.; Perlmutter, M. Psychology, H5P ed.; BCcampus: Victoria, BC, Canada, 2021. [Google Scholar]
  10. Wilson, K.A. How Many Senses? Multisensory Perception Beyond the Five Senses. In Sabah Ülkesi; IGMG: Cologne, Germany, 2021; pp. 76–79. Available online: https://philpapers.org/rec/WILMP-8 (accessed on 24 June 2024).
  11. Purves, A.C. Touch and the Ancient Senses; Bradley, M., Butler, S., Eds.; The Senses in Antiquity; Routledge: Abingdon, UK; New York, NY, USA, 2018; ISBN 978-1-84465-871-8. [Google Scholar]
  12. Hutmacher, F. Why Is There So Much More Research on Vision Than on Any Other Sensory Modality? Front. Psychol. 2019, 10, 2246. [Google Scholar] [CrossRef]
  13. Enoch, J.; McDonald, L.; Jones, L.; Jones, P.R.; Crabb, D.P. Evaluating Whether Sight Is the Most Valued Sense. JAMA Ophthalmol. 2019, 137, 1317. [Google Scholar] [CrossRef]
  14. Classen, C. The Deepest Sense: A Cultural History of Touch; University of Illinois Press: Champaign, IL, USA, 2012; ISBN 978-0-252-03493-0. [Google Scholar]
  15. El Rassi, I.; El Rassi, J.-M. A Review of Haptic Feedback in Tele-Operated Robotic Surgery. J. Med. Eng. Technol. 2020, 44, 247–254. [Google Scholar] [CrossRef]
  16. See, A.R.; Choco, J.A.G.; Chandramohan, K. Touch, Texture and Haptic Feedback: A Review on How We Feel the World around Us. Appl. Sci. 2022, 12, 4686. [Google Scholar] [CrossRef]
  17. Gallace, A.; Spence, C. In Touch with the Future: The Sense of Touch from Cognitive Neuroscience to Virtual Reality; Oxford University Press: Oxford, UK, 2014; ISBN 978-0-19-964446-9. [Google Scholar]
  18. Adilkhanov, A.; Rubagotti, M.; Kappassov, Z. Haptic Devices: Wearability-Based Taxonomy and Literature Review. IEEE Access 2022, 10, 91923–91947. [Google Scholar] [CrossRef]
  19. Hannaford, B.; Okamura, A.M. Haptics. In Springer Handbook of Robotics; Siciliano, B., Khatib, O., Eds.; Springer Handbooks; Springer International Publishing: Cham, Switzerland, 2016; pp. 1063–1084. ISBN 978-3-319-32550-7. [Google Scholar]
  20. Oakley, I.; McGee, M.R.; Brewster, S.; Gray, P. Putting the Feel in ‘look and Feel’. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Hague, The Netherlands, 1–6 April 2000; ACM: New York, NY, USA, 2000; pp. 415–422. [Google Scholar]
  21. Pacchierotti, C.; Prattichizzo, D.; Kuchenbecker, K.J. Cutaneous Feedback of Fingertip Deformation and Vibration for Palpation in Robotic Surgery. IEEE Trans. Biomed. Eng. 2016, 63, 278–287. [Google Scholar] [CrossRef] [PubMed]
  22. Huang, Y.; Yao, K.; Li, J.; Li, D.; Jia, H.; Liu, Y.; Yiu, C.K.; Park, W.; Yu, X. Recent Advances in Multi-Mode Haptic Feedback Technologies towards Wearable Interfaces. Mater. Today Phys. 2022, 22, 100602. [Google Scholar] [CrossRef]
  23. Lederman, S.; Klatzky, R.L. Human Haptics. In Encyclopedia of Neuroscience; Academic Press: Cambridge, MA, USA, 2009; Volume 5. [Google Scholar]
  24. Johansson, R.S.; Flanagan, J.R. Sensory Control of Object Manipulation. In Sensorimotor Control of Grasping: Physiology and Pathophysiology; Cambridge University Press: Cambridge, UK, 2009; pp. 141–160. [Google Scholar]
  25. Klatzky, R.; Lederman, S. The Haptic Identification of Everyday Life Objects. In Touching for Knowing: Cognitive Psychology of Haptic Manual Perception; John Benjamins Publishing Company: Amsterdam, The Netherlands, 2003. [Google Scholar]
  26. Goodwin, A.W.; Wheat, H.E. Sensory Signals in Neural Populations Underlying Tactile Perception and Manipulation. Annu. Rev. Neurosci. 2004, 27, 53–77. [Google Scholar] [CrossRef]
  27. Goble, D.J.; Brown, S.H. Upper Limb Asymmetries in the Matching of Proprioceptive Versus Visual Targets. J. Neurophysiol. 2008, 99, 3063–3074. [Google Scholar] [CrossRef] [PubMed]
  28. Lederman, S.J.; Klatzky, R.L. Haptic Perception: A Tutorial. Atten. Percept. Psychophys. 2009, 71, 1439–1459. [Google Scholar] [CrossRef]
  29. Culbertson, H.; Schorr, S.B.; Okamura, A.M. Haptics: The Present and Future of Artificial Touch Sensation. Annu. Rev. Control Robot. Auton. Syst. 2018, 1, 385–409. [Google Scholar] [CrossRef]
  30. Jafari, N.; Adams, K.D.; Tavakoli, M. Haptics to Improve Task Performance in People with Disabilities: A Review of Previous Studies and a Guide to Future Research with Children with Disabilities. J. Rehabil. Assist. Technol. Eng. 2016, 3, 205566831666814. [Google Scholar] [CrossRef] [PubMed]
  31. Panariello, D.; Caporaso, T.; Grazioso, S.; Di Gironimo, G.; Lanzotti, A.; Knopp, S.; Pelliccia, L.; Lorenz, M.; Klimant, P. Using the KUKA LBR Iiwa Robot as Haptic Device for Virtual Reality Training of Hip Replacement Surgery. In Proceedings of the 2019 Third IEEE International Conference on Robotic Computing (IRC), Naples, Italy, 25–27 February 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 449–450. [Google Scholar]
  32. Junput, B.; Wei, X.; Jamone, L. Feel It on Your Fingers: Dataglove with Vibrotactile Feedback for Virtual Reality and Telerobotics. In Towards Autonomous Robotic Systems; Althoefer, K., Konstantinova, J., Zhang, K., Eds.; Lecture Notes in Computer Science; Springer International Publishing: Cham, Switzerland, 2019; Volume 11649, pp. 375–385. ISBN 978-3-030-23806-3. [Google Scholar]
  33. Collins, K.; Kapralos, B. Pseudo-Haptics: Leveraging Cross-Modal Perception in Virtual Environments. Senses Soc. 2019, 14, 313–329. [Google Scholar] [CrossRef]
  34. D’Abbraccio, J.; Massari, L.; Prasanna, S.; Baldini, L.; Sorgini, F.; Airò Farulla, G.; Bulletti, A.; Mazzoni, M.; Capineri, L.; Menciassi, A.; et al. Haptic Glove and Platform with Gestural Control for Neuromorphic Tactile Sensory Feedback in Medical Telepresence. Sensors 2019, 19, 641. [Google Scholar] [CrossRef] [PubMed]
  35. Aldridge, R.J.; Carr, K.; England, R.; Meech, J.F.; Solomonides, T. Getting a Grasp on Virtual Reality. In Proceedings of the Conference Companion on Human Factors in Computing Systems Common Ground—CHI ’96; ACM Press: Vancouver, BC, Canada, 1996; pp. 229–230. [Google Scholar]
  36. Seinfeld, S.; Feuchtner, T.; Maselli, A.; Müller, J. User Representations in Human-Computer Interaction. Hum.–Comput. Interact. 2021, 36, 400–438. [Google Scholar] [CrossRef]
  37. Argelaguet Sanz, F.; Jáuregui, D.A.G.; Marchal, M.; Lécuyer, A. Elastic Images: Perceiving Local Elasticity of Images through a Novel Pseudo-Haptic Deformation Effect. ACM Trans. Appl. Percept. 2013, 10, 1–14. [Google Scholar] [CrossRef]
  38. Yu, R.; Bowman, D.A. Pseudo-Haptic Display of Mass and Mass Distribution During Object Rotation in Virtual Reality. IEEE Trans. Vis. Comput. Graph. 2020, 26, 2094–2103. [Google Scholar] [CrossRef]
  39. Lécuyer, A.; Burkhardt, J.-M.; Tan, C.-H. A Study of the Modification of the Speed and Size of the Cursor for Simulating Pseudo-Haptic Bumps and Holes. ACM Trans. Appl. Percept. 2008, 5, 1–21. [Google Scholar] [CrossRef]
  40. Ujitoko, Y.; Ban, Y. Survey of Pseudo-Haptics: Haptic Feedback Design and Application Proposals. IEEE Trans. Haptics 2021, 14, 699–711. [Google Scholar] [CrossRef] [PubMed]
  41. Lécuyer, A.; Coquillart, S.; Kheddar, A.; Richard, P.; Coiffet, P. Pseudo-Haptic Feedback: Can Isometric Input Devices Simulate Force Feedback? In Proceedings of the IEEE Virtual Reality 2000 (Cat. No.00CB37048), New Brunswick, NJ, USA, 18–22 March 2000; IEEE: Piscataway, NJ, USA, 2000; pp. 83–90. [Google Scholar]
  42. Kaneko, S.; Yokosaka, T.; Kajimoto, H.; Kawabe, T. A Pseudo-Haptic Method Using Auditory Feedback: The Role of Delay, Frequency, and Loudness of Auditory Feedback in Response to a User’s Button Click in Causing a Sensation of Heaviness. IEEE Access 2022, 10, 50008–50022. [Google Scholar] [CrossRef]
  43. Garlinska, M.; Osial, M.; Proniewska, K.; Pregowska, A. The Influence of Emerging Technologies on Distance Education. Electronics 2023, 12, 1550. [Google Scholar] [CrossRef]
  44. Lécuyer, A. Simulating Haptic Feedback Using Vision: A Survey of Research and Applications of Pseudo-Haptic Feedback. Presence Teleoper. Virtual Environ. 2009, 18, 39–53. [Google Scholar] [CrossRef]
  45. Oxman, A.D. Users’ Guides to the Medical Literature: VI. How to Use an Overview. JAMA 1994, 272, 1367. [Google Scholar] [CrossRef] [PubMed]
  46. Grant, M.J.; Booth, A. A Typology of Reviews: An Analysis of 14 Review Types and Associated Methodologies. Health Inf. Libr. J. 2009, 26, 91–108. [Google Scholar] [CrossRef] [PubMed]
  47. Green, B.N.; Johnson, C.D.; Adams, A. Writing Narrative Literature Reviews for Peer-Reviewed Journals: Secrets of the Trade. J. Chiropr. Med. 2006, 5, 101–117. [Google Scholar] [CrossRef] [PubMed]
  48. Haddaway, N.R.; Page, M.J.; Pritchard, C.C.; McGuinness, L.A. PRISMA2020: An R Package and Shiny App for Producing PRISMA 2020-compliant Flow Diagrams, with Interactivity for Optimised Digital Transparency and Open Synthesis. Campbell Syst. Rev. 2022, 18, e1230. [Google Scholar] [CrossRef] [PubMed]
  49. Kapralos, B.; Moussa, F.; Collins, K.; Dubrowski, A. Fidelity and Multimodal Interactions. In Instructional Techniques to Facilitate Learning and Motivation of Serious Games; Wouters, P., Van Oostendorp, H., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 79–101. ISBN 978-3-319-39296-7. [Google Scholar]
  50. Melaisi, M.; Rojas, D.; Kapralos, B.; Uribe-Quevedo, A.; Collins, K. Multimodal Interaction of Contextual and Non-Contextual Sound and Haptics in Virtual Simulations. Informatics 2018, 5, 43. [Google Scholar] [CrossRef]
  51. Lu, S.; Chen, Y.; Culbertson, H. Towards Multisensory Perception: Modeling and Rendering Sounds of Tool-Surface Interactions. IEEE Trans. Haptics 2020, 13, 94–101. [Google Scholar] [CrossRef]
  52. Chan, S.; Tymms, C.; Colonnese, N. Hasti: Haptic and Audio Synthesis for Texture Interactions. In Proceedings of the 2021 IEEE World Haptics Conference (WHC), Montreal, QC, Canada, 6–9 July 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 733–738. [Google Scholar]
  53. Maćkowski, M.; Brzoza, P.; Spinczyk, D. An Alternative Method of Audio-Tactile Presentation of Graphical Information in Mathematics Adapted to the Needs of Blind. Int. J. Hum.-Comput. Stud. 2023, 179, 103122. [Google Scholar] [CrossRef]
  54. Bosman, I.D.V.; De Beer, K.; Bothma, T.J.D. Creating Pseudo-Tactile Feedback in Virtual Reality Using Shared Crossmodal Properties of Audio and Tactile Feedback. S. Afr. Comput. J. 2021, 33, 1–21. [Google Scholar] [CrossRef]
  55. Zhexuan, W.; Zhong, W. Research on a Method of Conveying Material Sensations through Sound Effects. J. New Music Res. 2022, 51, 121–141. [Google Scholar] [CrossRef]
  56. Malpica, S.; Serrano, A.; Allue, M.; Bedia, M.G.; Masia, B. Crossmodal Perception in Virtual Reality. Multimed. Tools Appl. 2020, 79, 3311–3331. [Google Scholar] [CrossRef]
  57. Ning, G.; Grant, B.; Kapralos, B.; Quevedo, A.; Collins, K.; Kanev, K.; Dubrowski, A. Understanding Virtual Drilling Perception Using Sound, and Kinesthetic Cues Obtained with a Mouse and Keyboard. J. Multimodal User Interfaces 2023, 17, 165. [Google Scholar] [CrossRef]
  58. Speicher, M.; Ehrlich, J.; Gentile, V.; Degraen, D.; Sorce, S.; Krüger, A. Pseudo-Haptic Controls for Mid-Air Finger-Based Menu Interaction. In Proceedings of the Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; ACM: New York, NY, USA, 2019; pp. 1–6. [Google Scholar]
  59. Eckhoff, D.; Cassinelli, A.; Liu, T.; Sandor, C. Psychophysical Effects of Experiencing Burning Hands in Augmented Reality. In Virtual Reality and Augmented Reality; Bourdot, P., Interrante, V., Kopper, R., Olivier, A.-H., Saito, H., Zachmann, G., Eds.; Lecture Notes in Computer Science; Springer International Publishing: Cham, Switzerland, 2020; Volume 12499, pp. 83–95. ISBN 978-3-030-62654-9. [Google Scholar]
  60. Haruna, M.; Noboru, K.; Ogino, M.; Koike-Akino, T. Comparison of Three Feedback Modalities for Haptics Sensation in Remote Machine Manipulation. IEEE Robot. Autom. Lett. 2021, 6, 5040–5047. [Google Scholar] [CrossRef]
  61. Kang, N.; Sah, Y.J.; Lee, S. Effects of Visual and Auditory Cues on Haptic Illusions for Active and Passive Touches in Mixed Reality. Int. J. Hum.-Comput. Stud. 2021, 150, 102613. [Google Scholar] [CrossRef]
  62. Puértolas Bálint, L.A.; Althoefer, K.; Perez Macias, L.H. Virtual Reality Percussion Simulator for Medical Student Training. In Proceedings of the 2021 IEEE 6th International Forum on Research and Technology for Society and Industry (RTSI), Naples, Italy, 6–9 September 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 295–299. [Google Scholar]
  63. Desnoyers-Stewart, J.; Stepanova, E.R.; Liu, P.; Kitson, A.; Pennefather, P.P.; Ryzhov, V.; Riecke, B.E. Embodied Telepresent Connection (ETC): Exploring Virtual Social Touch Through Pseudohaptics. In Proceedings of the Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, Hamburg, Germany, 23–28 April 2023; ACM: New York, NY, USA, 2023; pp. 1–7. [Google Scholar]
  64. Kurzweg, M.; Letter, M.; Wolf, K. Vibrollusion: Creating a Vibrotactile Illusion Induced by Audiovisual Touch Feedback. In Proceedings of the 22nd International Conference on Mobile and Ubiquitous Multimedia, Vienna, Austria, 3–6 December 2023; ACM: New York, NY, USA, 2023; pp. 185–197. [Google Scholar]
  65. Lee, D.S.; Lee, K.C.; Kim, H.J.; Kim, S. Pseudo-Haptic Feedback Design for Virtual Activities in Human Computer Interface. In Virtual, Augmented and Mixed Reality; Chen, J.Y.C., Fragomeni, G., Eds.; Lecture Notes in Computer Science; Springer Nature: Cham, Switzerland, 2023; Volume 14027, pp. 253–265. ISBN 978-3-031-35633-9. [Google Scholar]
  66. Lécuyer, A.; Burkhardt, J.-M.; Etienne, L. Feeling Bumps and Holes without a Haptic Interface: The Perception of Pseudo-Haptic Textures. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vienna, Austria, 24–29 April 2004; ACM: New York, NY, USA, 2004; pp. 239–246. [Google Scholar]
  67. Moosavi, M.S.; Raimbaud, P.; Guillet, C.; Plouzeau, J.; Merienne, F. Weight Perception Analysis Using Pseudo-Haptic Feedback Based on Physical Work Evaluation. Front. Virtual Real. 2023, 4, 973083. [Google Scholar] [CrossRef]
  68. Mori, S.; Kataoka, Y.; Hashiguchi, S. Exploring Pseudo-Weight in Augmented Reality Extended Displays. In Proceedings of the 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Christchurch, New Zealand, 12–16 March 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 703–710. [Google Scholar]
  69. Costes, A.; Argelaguet, F.; Danieau, F.; Guillotel, P.; Lécuyer, A. Touchy: A Visual Approach for Simulating Haptic Effects on Touchscreens. Front. ICT 2019, 6, 1. [Google Scholar] [CrossRef]
  70. Crandall, R.; Karadoğan, E. Designing Pedagogically Effective Haptic Systems for Learning: A Review. Appl. Sci. 2021, 11, 6245. [Google Scholar] [CrossRef]
  71. Bermejo, C.; Hui, P. A Survey on Haptic Technologies for Mobile Augmented Reality. ACM Comput. Surv. 2021, 54, 1–35. [Google Scholar] [CrossRef]
  72. Bouzbib, E.; Bailly, G.; Haliyo, S.; Frey, P. “Can I Touch This?”: Survey of Virtual Reality Interactions via Haptic Solutions: Revue de Littérature Des Interactions En Réalité Virtuelle Par Le Biais de Solutions Haptiques. In Proceedings of the 32e Conférence Francophone sur l’Interaction Homme-Machine, Virtual Event, 13 April 2021; ACM: New York, NY, USA, 2021; pp. 1–16. [Google Scholar]
  73. Hatzfeld, C.; Kern, T.A. Haptics as an Interaction Modality. In Engineering Haptic Devices; Kern, T.A., Hatzfeld, C., Abbasimoshaei, A., Eds.; Springer Series on Touch and Haptic Systems; Springer International Publishing: Cham, Switzerland, 2023; pp. 35–108. ISBN 978-3-031-04535-6. [Google Scholar]
  74. Lim, W.N.; Yap, K.M.; Lee, Y.; Wee, C.; Yen, C.C. A Systematic Review of Weight Perception in Virtual Reality: Techniques, Challenges, and Road Ahead. IEEE Access 2021, 9, 163253–163283. [Google Scholar] [CrossRef]
  75. Pusch, A.; Lécuyer, A. Pseudo-Haptics: From the Theoretical Foundations to Practical System Design Guidelines. In Proceedings of the 13th International Conference on Multimodal Interfaces, Alicante, Spain, 14–18 November 2011; ACM: New York, NY, USA, 2011; pp. 57–64. [Google Scholar]
  76. Meyer, U.; Becker, J.; Draheim, S.; von Luck, K. Sensory Simulation in the Use of Haptic Proxies: Best Practices? In Proceedings of the Conference on Human Factors in Computing Systems, Online, 8–13 May 2021. [Google Scholar]
  77. Kaipel, M.; Majewski, M.; Regazzoni, P. Double-Plate Fixation in Lateral Clavicle Fractures—A New Strategy. J. Trauma Inj. Infect. Crit. Care 2010, 69, 896–900. [Google Scholar] [CrossRef] [PubMed]
  78. Lindenhovius, A.L.C.; Felsch, Q.; Ring, D.; Kloen, P. The Long-Term Outcome of Open Reduction and Internal Fixation of Stable Displaced Isolated Partial Articular Fractures of the Radial Head. J. Trauma Inj. Infect. Crit. Care 2009, 67, 143–146. [Google Scholar] [CrossRef] [PubMed]
  79. Houston, J.; Chiang, A.; Haleem, S.; Bernard, J.; Bishop, T.; Lui, D.F. Reproducibility and Reliability Analysis of the Luk Distal Radius and Ulna Classification for European Patients with Adolescent Idiopathic Scoliosis. J. Child. Orthop. 2021, 15, 166–170. [Google Scholar] [CrossRef] [PubMed]
  80. Dubrowski, A.; Backstein, D. The Contributions of Kinesiology to Surgical Education. J. Bone Jt. Surg.-Am. Vol. 2004, 86, 2778–2781. [Google Scholar] [CrossRef]
  81. Praamsma, M.; Carnahan, H.; Backstein, D.; Veillette, C.J.H.; Gonzalez, D.; Dubrowski, A. Drilling Sounds Are Used by Surgeons and Intermediate Residents, but Not Novice Orthopedic Trainees, to Guide Drilling Motions. Can. J. Surg. J. Can. Chir. 2008, 51, 442–446. [Google Scholar]
Figure 1. Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flow diagram summarizing the review process.
Figure 2. Visual scene in the pseudo-haptic study of [57]. (a) Top view; (b) front view.
Table 1. Summary of articles considered in this review.
Article | Brief Summary
Melaisi et al. [50]
  • Contextual sound led to higher perceived haptic fidelity.
  • Accuracy was higher with sound (contextual or non-contextual) than without sound.
Lu et al. [51]
  • Sound alone influenced judgments of texture roughness and hardness.
  • Sound produced during interaction with the texture most effectively captured surface roughness and hardness.
  • Haptic cues alone were less effective for roughness and hardness but primarily determined slipperiness.
Chan et al. [52]
  • Participants identified textures most accurately with both auditory and haptic cues.
  • Realistic feedback from virtual objects enhances virtual environments through combined visual, auditory, and haptic interactions.
Maćkowski et al. [53]
  • Accompanying auditory descriptions improve the understanding and interpretation of tactile pictures.
  • Optimal duration for these descriptions is 5 to 10 s.
  • Descriptions longer than 10 s become incomprehensible.
Bosman et al. [54]
  • Believable, real-world-like auditory and visual feedback are necessary for inducing pseudo-haptic sensations.
  • A consistent and immersive virtual reality (VR) environment is also required for this effect.
Kaneko et al. [42]
  • Lower frequencies were perceived as heavier.
  • Increased delay times led to stronger heaviness sensations.
  • Louder sounds were perceived as heavier.
  • Sound offset (end of the sound) was more critical in heaviness perception than sound onset.
  • Haptic feedback can be simulated through auditory feedback without expensive haptic devices (see the sketch following Table 1).
Zhexuan and Zhong [55]
  • Low-pitched sounds are perceived as heavier.
  • Pink noise maintains roughness across pitches.
  • Square and sawtooth waveform roughness decreases with higher pitch; triangle and sine waveform roughness stays constant.
  • Lower frequencies and shorter attenuation increase dampness perception.
  • Higher artificial harmonic pitch enhances glossiness perception.
Malpica et al. [56]
  • Sound had a greater impact on material identification when using low-fidelity visual stimuli.
Ning et al. [57]
  • Sound feedback alone was insufficient for adequate drilling feedback.
  • Combining sound with kinesthetic feedback from a computer mouse could convey pseudo-haptic cues and simulate a virtual drilling task.
Speicher et al. [58]
  • Despite the simplicity and familiarity of 2D menu control interfaces, the study concluded that pseudo-haptic-based controls are superior for VR menu interfaces.
Eckhoff et al. [59]
  • Although anxiety questionnaire results were not significant, participants showed a significant increase in skin conductance when viewing their hand burning in the simulation.
  • Participants who reported feeling the heat sensation also had a higher skin conductance response.
Haruna et al. [60]
  • All modalities (sound, vibration, and light) reduced grasping force compared to the control, with light achieving the most significant reduction, possibly due to participants finding auditory feedback noisy.
  • Auditory feedback increased information flow to the brain, suggesting an impact on task performance despite the noise.
Kang et al. [61]
  • Under active touch, both visual and auditory cues influenced roughness perception, with slower motion speeds and low-frequency accentuated sounds leading to higher perceived roughness ratings.
  • In passive touch, auditory-only cues, particularly low-frequency accentuated sounds, were perceived as stiffer.
  • Manipulating visual and auditory cues can influence participants’ perception of virtual object characteristics, suggesting multisensory processing and enabling the perception of haptic stimuli.
Puértolas Bálint et al. [62]
  • Including multimodal (sound, visual, and pseudo-haptic) interaction enhanced realism and immersion in surgical simulation.
Desnoyers-Stewart et al. [63]
  • While pseudo-haptics cannot replace real touch, it can provide a subtle alternative to represent touch perception during social interaction in VR.
Kurzweg et al. [64]
  • When the virtual object’s edge was blurred to 0.4 cm or 0.6 cm and a sound was played at specific frequencies (265 Hz or 966 Hz), participants perceived a vibrating object.
Lee et al. [65]
  • Cross-modal-driven pseudo-haptic interaction led to higher performance across all parameters, including intuitiveness, attractiveness, and immersion.
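As one illustration of how these findings might be operationalized, the sketch below maps a virtual object’s mass to auditory cue parameters in the directions reported by Kaneko et al. [42] and Zhexuan and Zhong [55] (lower pitch, louder level, and longer onset delay read as heavier). The linear interpolation, the function name, and all numeric ranges are illustrative assumptions, not values taken from those studies.

```python
def weight_cue_parameters(mass_kg: float,
                          mass_min: float = 0.1,
                          mass_max: float = 10.0) -> dict:
    """Map a virtual object's mass to auditory cue parameters.

    The direction of each mapping follows the reviewed findings (lower
    pitch, louder level, and longer onset delay are perceived as heavier);
    the linear interpolation and all numeric ranges are assumptions.
    """
    # Normalize mass to [0, 1], clamped to the assumed mass range.
    t = max(0.0, min(1.0, (mass_kg - mass_min) / (mass_max - mass_min)))

    return {
        # Heavier -> lower pitch: 880 Hz down to 110 Hz (assumed range).
        "frequency_hz": 880.0 - t * (880.0 - 110.0),
        # Heavier -> louder: -20 dBFS up to 0 dBFS (assumed range).
        "level_dbfs": -20.0 + t * 20.0,
        # Heavier -> more delayed onset: 0 ms up to 150 ms (assumed range).
        "onset_delay_ms": t * 150.0,
    }

# Example: a 5 kg virtual object yields a mid-range cue.
print(weight_cue_parameters(5.0))
```

Inverting any one of these mappings (e.g., raising pitch with increasing mass) would be expected to weaken or reverse the heaviness illusion, offering one way such a mapping could be validated experimentally.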