Search Results (87)

Search Parameters:
Keywords = tactile display

25 pages, 5055 KiB  
Article
FlickPose: A Hand Tracking-Based Text Input System for Mobile Users Wearing Smart Glasses
by Ryo Yuasa and Katashi Nagao
Appl. Sci. 2025, 15(15), 8122; https://doi.org/10.3390/app15158122 - 22 Jul 2025
Abstract
With the growing use of head-mounted displays (HMDs) such as smart glasses, text input remains a challenge, especially in mobile environments. Conventional methods such as physical keyboards, voice recognition, and virtual keyboards each have limitations: physical keyboards lack portability, voice input raises privacy concerns, and virtual keyboards struggle with accuracy due to a lack of tactile feedback. FlickPose is a novel text input system designed for smart glasses and mobile HMD users, integrating flick-based input and hand pose recognition. It offers two key selection methods: the touch-panel method, where users tap a floating UI panel to select characters, and the raycast method, where users point a virtual ray from their wrist and confirm input via a pinch motion. FlickPose uses five left-hand poses to select characters. A machine learning model trained for hand pose recognition outperforms Random Forest and LightGBM models in accuracy and consistency. FlickPose was tested against the standard virtual keyboard of the Meta Quest 3 in three tasks (hiragana, alphanumeric, and kanji input). Results showed that the raycast method had the lowest error rate, reducing unintended key presses; the touch-panel method produced more deletions, likely due to misjudgments in key selection; and frequent HMD users preferred raycast because it maintained input accuracy while allowing them to monitor their text. A key feature of FlickPose is adaptive tracking, which keeps the keyboard following the user's movement. While further refinements in hand pose recognition are needed, the system provides an efficient, mobile-friendly alternative for HMD text input. Future research will explore real-world application compatibility and improve usability in dynamic environments.
(This article belongs to the Special Issue Extended Reality (XR) and User Experience (UX) Technologies)
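The two-step selection the abstract describes (a left-hand pose picks a key group, then a flick direction picks the character within it) can be sketched as a toy lookup table. The pose labels and character assignments below are hypothetical illustrations, not the authors' actual layout.

```python
# Hypothetical two-step lookup: a left-hand pose selects a key group,
# then a flick direction selects the character within that group.
# The pose names and character layout are illustrative only.
KEYMAP = {
    "pose_a":  {"center": "a",  "left": "i",  "up": "u",  "right": "e",  "down": "o"},
    "pose_ka": {"center": "ka", "left": "ki", "up": "ku", "right": "ke", "down": "ko"},
}

def select_char(pose: str, flick: str) -> str:
    """Resolve a (hand pose, flick direction) pair to a character."""
    return KEYMAP[pose][flick]

print(select_char("pose_ka", "up"))
```

Factoring input into a coarse choice (pose) and a fine choice (flick) is what lets five hand poses cover a full hiragana syllabary.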

21 pages, 1877 KiB  
Article
Touching Emotions: How Touch Shapes Facial Emotional Processing Among Adolescents and Young Adults
by Letizia Della Longa and Teresa Farroni
Int. J. Environ. Res. Public Health 2025, 22(7), 1112; https://doi.org/10.3390/ijerph22071112 - 15 Jul 2025
Abstract
Emotion recognition is an essential social ability that continues to develop across adolescence, a period of critical socio-emotional changes. In the present study, we examine how signals from different sensory modalities, specifically touch and facial expressions, are integrated into a holistic understanding of another’s feelings. Adolescents (n = 30) and young adults (n = 30) were presented with dynamic faces displaying either a positive (happy) or a negative (sad) expression. Crucially, the facial expressions were preceded by a tactile stimulation, either positive or negative. Across two experiments, we used different tactile primes, both in the first-person experience of touch (experiment 1) and in the vicarious experience of touch (experiment 2). We measured accuracy and reaction times to investigate whether tactile stimuli affect facial emotional processing. In both experiments, results indicate that adolescents were more sensitive than adults to the influence of tactile primes, suggesting that sensory cues modulate adolescents’ accuracy and speed in evaluating emotional facial expressions. The present findings offer valuable insights into how tactile experiences might shape and support emotional development and interpersonal social interactions.
(This article belongs to the Section Behavioral and Mental Health)

18 pages, 5364 KiB  
Article
Stimulus Optimization for Softness Perception on a Friction-Variable Tactile Texture Display
by Ami Chihara, Shogo Okamoto and Ai Kurita
Sci 2025, 7(3), 96; https://doi.org/10.3390/sci7030096 - 2 Jul 2025
Abstract
Surface texture displays are touch panels that provide tactile feedback. Presenting softness sensations on such rigid surfaces remains a challenge, and effective methods are not yet established. This study explores how low-frequency frictional modulation during finger sliding can evoke the perception of softness. We examined multimodal optimization: whether the optimal tactile parameters vary depending on the type of visually presented fabric. Videos of draping cloth were shown beneath the panel, while the spatial wavelength of the frictional modulation and the finger sliding speed were optimized using response surface methodology. The optimal spatial wavelength did not significantly differ across fabric types: towel (16.8 mm), cotton (16.5 mm), leather (17.1 mm), and suede (15.4 mm), with an overall range of 15–18 mm. In contrast, the optimal sliding speed varied significantly by fabric: towel (144 mm/s), cotton (118 mm/s), leather (167 mm/s), and suede (96 mm/s). These results suggest that frictional variation with a fixed spatial wavelength may serve as a general strategy for presenting softness. The findings contribute to advancing tactile rendering techniques for hard touch surfaces.
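The response-surface optimization mentioned in this abstract can be illustrated with a minimal one-dimensional sketch: fit a quadratic to (wavelength, softness-rating) observations and take the vertex as the optimum. The data values and function name below are invented for illustration; the paper's actual method optimizes two parameters jointly.

```python
import numpy as np

def optimal_wavelength(wavelengths, ratings):
    """Fit a quadratic response curve to softness ratings and
    return the wavelength at its maximum (the vertex)."""
    # Least-squares fit of rating ~ a*w**2 + b*w + c
    a, b, c = np.polyfit(wavelengths, ratings, deg=2)
    if a >= 0:
        raise ValueError("no interior maximum: fitted curve is not concave")
    return -b / (2 * a)  # vertex of the parabola

# Illustrative data with ratings peaking near the 15-18 mm range
w = np.array([10.0, 13.0, 16.0, 19.0, 22.0])
r = np.array([2.1, 3.4, 4.0, 3.6, 2.3])
print(round(optimal_wavelength(w, r), 1))
```

Response surface methodology generalizes this idea to several stimulus parameters at once, which is how both wavelength and sliding speed can be optimized per fabric.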

11 pages, 4677 KiB  
Article
Development of Multimodal Stimulator for Studying Human Tactile Perception and Cognitive Functions: Preliminary Results
by Soon-Cheol Chung, Jinsu An, Kyu-Beom Kim, Mi-Hyun Choi and Hyung-Sik Kim
Appl. Sci. 2025, 15(13), 7184; https://doi.org/10.3390/app15137184 - 26 Jun 2025
Abstract
Humans mostly perceive tactile sensations in daily life as a combination of warmth, vibration, and pressure. To understand these complex tactile perception and cognitive processes, in this study we aimed to develop a multimodal stimulator and investigate the resulting changes in neuronal activity. An actuator that can display warmth (W), vibration (V), and pressure (P) on the distal region of the index finger was developed. Preliminary experiments were conducted with nine subjects. Electroencephalograms were measured for six tactile stimuli, three single stimuli (W, V, and P) and three combined stimuli (W + V, V + P, and W + V + P), and event-related desynchronization/synchronization (ERD/S) analysis was performed. The actuator can present all stimulus types at the same location and control the stimulation parameters quantitatively. In all experiments, an ERD in the α and β bands was observed in the C3 area about 0.5 s after stimulation, followed by an ERS. The change in the peak-to-peak value was largest for warmth and smallest for pressure. In contrast, the duration of the ERD was shortest for W and longest for P. When stimuli were presented simultaneously, the ERD lengthened in both the α and β bands, and in the β band the ERD peak grew larger. The developed system was confirmed to be capable of providing valid tactile stimulation, inducing appropriate neuronal activation, and enabling multimodal tactile research.
(This article belongs to the Section Electrical, Electronics and Communications Engineering)

16 pages, 1471 KiB  
Article
Interpersonal Synchrony Affects the Full-Body Illusion
by Hiromu Ogawa, Hirotaka Uchitomi and Yoshihiro Miyake
Appl. Sci. 2025, 15(12), 6870; https://doi.org/10.3390/app15126870 - 18 Jun 2025
Abstract
The full-body illusion (FBI) is a phenomenon where individuals experience body perception not in their physical body but in an external virtual body. Previous studies have shown that the relationship between the self and the virtual body influences the occurrence and intensity of the FBI. However, the influence of interpersonal factors on the FBI has not been explored. This study investigated the effect of interpersonal synchrony on body perception through an evaluation experiment involving the FBI. Specifically, the participant and an experimenter clapped together while their movements were recorded by a video camera placed behind the participant and displayed to them via a head-mounted display (HMD). This setup presented synchronous visuotactile stimuli, aligning the visual feedback with the tactile sensations in the participant’s hands, to induce the FBI. The experimenter’s clapping rhythm was manipulated to be either synchronous or asynchronous with the participant’s rhythm, thus controlling the state of movement synchronization between the participant and the experimenter. The impact on the participant’s body perception was then assessed through subjective reports. The results indicated that when the clapping rhythm was synchronized with the other person, there was a significant reduction in touch referral to the participant’s virtual body. Additionally, there was a trend toward a reduction in ownership. This study demonstrated for the first time that interpersonal synchrony affects body perception.
(This article belongs to the Special Issue Virtual and Augmented Reality: Theory, Methods, and Applications)

11 pages, 1079 KiB  
Technical Note
Visuohaptic Feedback in Robotic-Assisted Spine Surgery for Pedicle Screw Placement
by Giuseppe Loggia, Fedan Avrumova and Darren R. Lebl
J. Clin. Med. 2025, 14(11), 3804; https://doi.org/10.3390/jcm14113804 - 29 May 2025
Abstract
Introduction: Robotic-assisted (RA) spine surgery enhances pedicle screw placement accuracy through real-time navigation and trajectory guidance. However, the absence of the direct haptic feedback of traditional freehand instrumentation remains a concern for some surgeons, particularly in minimally invasive (MIS) procedures where direct visual confirmation is limited. During RA spine surgery, navigation systems display three-dimensional data, but factors such as registration errors, intraoperative motion, and anatomical variability may compromise accuracy. This technical note describes a visuohaptic intraoperative phenomenon observed during RA spine surgery, its underlying mechanical principles, and its utility. During pedicle screw insertion with a slow-speed automated drill in RA spine procedures, a subtle, rhythmic variation in resistance has been observed both visually on the navigation interface and haptically through the handheld drill. This intraoperative pattern is referred to in this report as a cyclical insertional torque (CIT) pattern and has been noted across multiple cases. The CIT pattern is hypothesized to result from localized stick–slip dynamics, where alternating phases of resistance and release at the bone–screw interface generate periodic torque fluctuations. The pattern is most pronounced at low insertion speeds and diminishes with increasing drill velocity. CIT is a newly described intraoperative observation that may provide visuohaptic feedback during pedicle screw insertion in RA spine surgery. Through slow-speed automated drilling, CIT offers a cue for bone engagement, which could support intraoperative awareness in scenarios where tactile feedback is reduced or visual confirmation is indirect. While CIT may enhance surgeon confidence during screw advancement, its clinical relevance, reproducibility, and impact on placement accuracy have yet to be validated.
(This article belongs to the Special Issue Advances in Spine Surgery: Best Practices and Future Directions)

17 pages, 1039 KiB  
Article
Limited Short-Term Effects of Tactile Stimulation on the Welfare of Newborn Nellore Calves
by Mariana Parra Cerezo, Victor Brusin, Pedro Henrique Esteves Trindade, Adalinda Hernández, Jens Jung, Charlotte Berg and Mateus José Rodrigues Paranhos da Costa
Vet. Sci. 2025, 12(4), 393; https://doi.org/10.3390/vetsci12040393 - 21 Apr 2025
Abstract
This study aimed to evaluate the effects of tactile stimulation on calf welfare. A total of 54 Nellore calves were assessed, with 28 receiving tactile stimulation for ~60 s (WTS) and 26 serving as untreated controls (NTS). Five body movements and seven facial expressions were scored. Heart rates (HRs) were recorded in three situations: when the calves were placed in lateral recumbency (HR1), during identification procedures (HR2), and after completion of the identification procedures (HR3). The differences between HR3 and HR1, as well as between HR3 and HR2, were calculated. Initial and weaning weights were recorded, and average daily gain (ADG) and weaning weights adjusted to 240 days were determined. Tactile stimulation significantly influenced “head movements”, “third eyelid” exposure, “eye-opening”, and “strained nostrils”. Except for “strained nostrils”, WTS calves exhibited higher scores in these behavioral categories. Treatment also influenced the difference between HR3 and HR2 (p < 0.05) and showed a trend for HR3 and for the difference between HR3 and HR1 (p < 0.06). A qualitative behavior assessment (QBA) was applied using facial expressions. Two main principal components were identified: PC1, explaining 63.01% of the data variance and reflecting the calves’ emotionality, and PC2, explaining 19.88% and reflecting excitability. Most WTS calves displayed positive emotional states and high excitability, whereas most NTS calves exhibited the opposite. Treatment did not significantly impact the PC1 and PC2 indices or long-term performance indicators (p > 0.05). We conclude that tactile stimulation of newborn Nellore calves during their initial handling has the potential to enhance their short-term welfare, but only to a limited extent.
(This article belongs to the Section Anatomy, Histology and Pathology)

16 pages, 21667 KiB  
Article
MateREAL Touch: Handheld Haptic Texture Display with Real Rolling Materials
by Katsuya Maezono, Hikaru Nagano, Yuichi Tazaki and Yasuyoshi Yokokohji
Electronics 2025, 14(7), 1250; https://doi.org/10.3390/electronics14071250 - 21 Mar 2025
Abstract
This paper presents the development of “MateREAL Touch”, a tactile display system that reproduces the sensation of stroking various material textures. The system can store up to 30 material samples, which are connected via a continuous piece of tape. While the user is not touching the device, the material switches seamlessly; during touch, the tape moves in sync with the user’s finger, dynamically replicating the feeling of stroking. Additionally, the device simulates transitions between contact and non-contact states by adjusting the grip mechanism based on the virtual interaction. As a fundamental performance assessment, the material switching time was measured. In addition, a discrimination task compared users’ ability to distinguish eight materials under static and dynamic touch conditions in both real and virtual environments. The results showed comparable discrimination accuracy, demonstrating the system’s effectiveness in reproducing real-world material textures in VR. These findings confirm the system’s ability to enable realistic texture perception in virtual environments.
(This article belongs to the Special Issue Haptic Systems and the Tactile Internet: Design and Applications)

14 pages, 981 KiB  
Article
Sensory Perception During Partial Pseudo-Haptics Applied to Adjacent Fingers
by Satoshi Saga and Kotaro Sakae
Multimodal Technol. Interact. 2025, 9(3), 19; https://doi.org/10.3390/mti9030019 - 26 Feb 2025
Abstract
Pseudo-haptics, the phenomenon of creating a simulated tactile sensation by introducing a discrepancy between a voluntary movement and its visual feedback, is well known. Typically, when inducing pseudo-haptics, the same control-display ratio (C/D ratio) is applied to all effectors. However, with the aim of expanding the range of illusions that can be presented with pseudo-haptics, we investigated how perceived sensations change when partial pseudo-haptics are applied to adjacent body parts. Specifically, we investigated the correlation between finger states and the magnitude of the illusory perception during both quasi-static and dynamic movements, and identified the finger that experienced discomfort during dynamic movements with pseudo-haptics. Our findings revealed that, first, the magnitude of the illusion varied with the contact state of adjacent fingers; second, the illusion was more pronounced during dynamic movements than during quasi-static movements; and third, regardless of the finger receiving the pseudo-haptic stimulus, the discomfort was primarily experienced in the finger exhibiting an overall inhibitory movement. The findings contribute to the practical application of pseudo-haptics as a virtual haptic display technology.
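The control-display (C/D) ratio manipulation underlying pseudo-haptics can be sketched in a few lines: displayed motion is real motion scaled per effector, and applying different ratios to adjacent fingers gives the "partial" condition the abstract describes. The per-finger values and function name below are illustrative assumptions, not the authors' implementation.

```python
def displayed_positions(physical_deltas, cd_ratios):
    """Map each finger's physical movement to its displayed movement.

    A C/D ratio below 1.0 shrinks the visual motion relative to the
    real motion, which users tend to interpret as added resistance.
    Per-finger ratios produce partial pseudo-haptics.
    """
    if len(physical_deltas) != len(cd_ratios):
        raise ValueError("one ratio per finger required")
    return [d * r for d, r in zip(physical_deltas, cd_ratios)]

# Index finger rendered at half speed, middle finger unaltered
# (hypothetical values for illustration).
moved = [10.0, 10.0]   # mm of real fingertip travel
ratios = [0.5, 1.0]    # partial pseudo-haptic condition
print(displayed_positions(moved, ratios))
```

The study's question is precisely what such a mismatch between adjacent entries of this mapping feels like, and in which finger the discomfort is localized.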

19 pages, 5533 KiB  
Article
An Innovative Coded Language for Transferring Data via a Haptic Thermal Interface
by Yosef Y. Shani and Simon Lineykin
Bioengineering 2025, 12(2), 209; https://doi.org/10.3390/bioengineering12020209 - 19 Feb 2025
Abstract
The objective of this research was to develop a coded language, similar to Morse code or Braille, delivered via a haptic thermal interface. The method relies on the human thermal sense to receive and decode messages and is intended as an alternative or complementary channel for scenarios in which conventional channels are not applicable or not sufficient (e.g., communication with the handicapped or in noisy/silent environments). For the method to be effective, it must include a large variety of short, recognizable cues. Hence, we designed twenty-two temporally short (<3 s) cues, each composed of a sequence of thermal pulses, i.e., a combination of warm and/or cool pulses with several levels of intensity. The thermal cues were generated using specially designed equipment in a laboratory environment and displayed in random order to eleven independent participants. The participants identified all 22 cues with 95% accuracy, and 16 of them with 98.3% accuracy. These results reflect extraordinary reliability, indicating that the method can be used to create an effective new communication capability, either as a single-modality thermal interface or combined with tactile sensing to form a full haptic multisensory interface. This report presents the process of testing and evaluating the proposed set of thermal cues and lays out directions for possible implementation and further investigation.
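The abstract does not specify how the 22 cues are constructed, but the combinatorics of short pulse sequences are easy to illustrate. As a purely hypothetical code design (not the authors'), sequences of one or two pulses drawn from {warm, cool} × {low, high} already yield 20 distinct cues, close to the 22 used in the study:

```python
from itertools import product

# A hypothetical pulse alphabet: polarity x intensity (4 symbols).
pulses = [(p, i) for p in ("warm", "cool") for i in ("low", "high")]

# All cues made of one or two thermal pulses.
cues = [(p,) for p in pulses] + list(product(pulses, repeat=2))
print(len(cues))
```

Adding a third intensity level or three-pulse sequences grows the cue set rapidly, which is why a small pulse vocabulary suffices for a usable code.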

25 pages, 9799 KiB  
Article
A Diamond Approach to Develop Virtual Object Interaction: Fusing Augmented Reality and Kinesthetic Haptics
by Alma Rodriguez-Ramirez, Osslan Osiris Vergara Villegas, Manuel Nandayapa, Francesco Garcia-Luna and María Cristina Guevara Neri
Multimodal Technol. Interact. 2025, 9(2), 15; https://doi.org/10.3390/mti9020015 - 13 Feb 2025
Abstract
Using the senses is essential to interacting with objects in real-world environments. However, not all the senses are available when interacting with virtual objects in virtual environments. This paper presents a diamond methodology to fuse two technologies to represent the senses of sight and touch when interacting with a virtual object. The sense of sight is represented through augmented reality, and the sense of touch is represented through kinesthetic haptics. The diamond methodology is centered on the user experience and comprises five general stages: (i) experience design, (ii) sensory representation, (iii) development, (iv) display, and (v) fusion. The first stage is the expected, proposed, or needed user experience. Then, each technology takes its homologous activities from the second to the fourth stage, diverging from each other along their development. Finally, the technologies converge to the fifth stage for fusion in the user experience. The diamond methodology was tested by generating a user's dual sensation when interacting with the elasticity of a tension virtual spring. The user can simultaneously perceive the visual and tactile change of the virtual spring during the interaction, representing the object's deformation. The experimental results demonstrated that an interactive experience can be felt and seen in augmented reality following the diamond methodology.

24 pages, 20196 KiB  
Article
Inclusive Museum Engagement: Multisensory Storytelling of Cagli Warriors’ Journey and the Via Flamina Landscape Through Interactive Tactile Experiences and Digital Replicas
by Paolo Clini, Romina Nespeca, Umberto Ferretti, Federica Galazzi and Monica Bernacchia
Heritage 2025, 8(2), 61; https://doi.org/10.3390/heritage8020061 - 6 Feb 2025
Cited by 3
Abstract
This paper presents a case study from the Archaeological and Via Flaminia Museum in Cagli (Italy), developed within the ERASMUS+ Next-Museum project, which explores inclusive approaches through the digital transformation of small museums and their connection to the surrounding territory. A key goal was to “return” bronze statuettes to the museum, symbolically compensating the community for their absence. The initiative integrates accessibility and multisensory storytelling following “Design for All” principles. Three installations were implemented: tactile replicas of the statuettes produced through 3D printing, a sensorized table for interactive storytelling, and a story map displayed on a touchscreen for exploring local archaeological heritage. The design prioritized inclusivity, particularly for visitors with visual impairments, while addressing practical constraints such as the need for a mobile and flexible setup within a limited budget. Verification and validation tests were conducted with visually impaired participants during the pre-opening phase, and the installations were later evaluated using the User Experience Questionnaire, complemented by qualitative feedback. These evaluations highlight the potential of phygital experiences to foster engagement with cultural heritage while addressing technological and design challenges.

16 pages, 3014 KiB  
Article
Cross-Modal Interaction Between Perception and Vision of Grasping a Slanted Handrail to Reproduce the Sensation of Walking on a Slope in Virtual Reality
by Yuto Ohashi, Monica Perusquía-Hernández, Kiyoshi Kiyokawa and Nobuchika Sakata
Sensors 2025, 25(3), 938; https://doi.org/10.3390/s25030938 - 4 Feb 2025
Cited by 1
Abstract
Numerous studies have explored the perception of horizontal movements, including research on Redirected Walking (RDW). However, replicating the sensation of vertical movement remains a recurring challenge. Many conventional methods rely on physically mimicking steps or slopes, which can be hazardous and induce fear, especially when head-mounted displays (HMDs) obstruct the user’s field of vision. Our primary objective was to reproduce the sensation of ascending a slope while the user traverses a flat surface. This effect is achieved by giving the user the haptic sensation of gripping a tilted handrail similar to those commonly found on ramps or escalators. To achieve this, we developed a walker-type handrail device capable of tilting across a wide range of angles. We induced a cross-modal effect to enhance the perception of walking up a slope by combining haptic feedback from the hardware with an HMD-driven visual simulation of an upward-sloping scene. The results indicated that the condition with tactile presentation significantly alleviated fear and enhanced the sensation of walking uphill compared to the condition without tactile presentation.
(This article belongs to the Special Issue Sensors for Object Detection, Pose Estimation, and 3D Reconstruction)

13 pages, 5359 KiB  
Article
Displaying Tactile Sensation by SMA-Driven Vibration and Controlled Temperature for Cutaneous Sensation Assessment
by Tomohiro Nozawa, Renke Liu and Hideyuki Sawada
Actuators 2024, 13(11), 463; https://doi.org/10.3390/act13110463 - 18 Nov 2024
Abstract
In this paper, we propose a novel tactile display that can present vibration patterns and thermal stimuli simultaneously. The vibration actuator employs a shape memory alloy (SMA) wire to generate micro-vibration with frequency control up to 300 Hz. The micro-vibration is conducted to a tactile pin that amplifies it sufficiently to be recognized by the user. The thermal stimulation unit, on the other hand, consists of four Peltier elements with heatsinks for heat radiation. Four vibration actuators and a thermal unit are arranged in a flat plane measuring 20 mm × 20 mm, on which a user places the tip of an index finger to feel the presented vibratory stimuli under different temperature conditions. We conducted an experiment with nine subjects to evaluate the performance of the proposed tactile display and to investigate the effects of temperature on the recognition of tactile sensation. The results demonstrated that the proposed device is feasible for the quantitative diagnosis of tactile sensation. In addition, we verified that the sensitivity of tactile sensation decreased with colder stimuli.
(This article belongs to the Special Issue Innovative Actuators Based on Shape Memory Alloys)

17 pages, 4004 KiB  
Article
Designing a Tactile Document UI for 2D Refreshable Tactile Displays: Towards Accessible Document Layouts for Blind People
by Sara Alzalabny, Omar Moured, Karin Müller, Thorsten Schwarz, Bastian Rapp and Rainer Stiefelhagen
Multimodal Technol. Interact. 2024, 8(11), 102; https://doi.org/10.3390/mti8110102 - 8 Nov 2024
Cited by 3
Abstract
Understanding document layouts is vital for enhancing document exploration and information retrieval for sighted individuals. For blind and visually impaired people, however, accessing layout information is challenging with typical assistive technologies such as screen readers. In this paper, we examine the potential benefits of presenting documents on two-dimensional (2D) refreshable tactile displays. These displays enable the tactile perception of 2D data and offer the advantage of dynamic and interactive functionality. Despite their potential, the development of user interfaces (UIs) for such displays has not advanced significantly. Thus, we propose the design of an intelligent tactile user interface (TUI), incorporating touch and audio feedback to represent documents in a tactile format. Our exploratory study evaluating this approach showed that participants were satisfied with the experience of directly viewing documents in their true form rather than relying on screen-reading interpretations. Participants also offered recommendations for incorporating additional features and refining the approach in future iterations. To facilitate further research and development, we have made our dataset and models publicly available.
