Article

Interaction with Tactile Paving in a Virtual Reality Environment: Simulation of an Urban Environment for People with Visual Impairments

by Nikolaos Tzimos 1, Iordanis Kyriazidis 2, George Voutsakelis 1, Sotirios Kontogiannis 3 and George Kokkonis 2,*

1 Department of Business Administration, University of Western Macedonia, 51100 Grevena, Greece
2 Department of Information and Electronic Engineering, International Hellenic University, 57400 Thessaloniki, Greece
3 Department of Mathematics, University of Ioannina, 45110 Ioannina, Greece
* Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2025, 9(7), 71; https://doi.org/10.3390/mti9070071
Submission received: 30 April 2025 / Revised: 10 July 2025 / Accepted: 10 July 2025 / Published: 14 July 2025

Abstract

Blindness and low vision are increasingly serious public health issues that affect a significant percentage of the population worldwide. Vision plays a crucial role in spatial navigation and daily activities, and its reduction or loss creates numerous challenges for an individual. Assistive technology can enhance mobility and navigation in outdoor environments. In the field of orientation and mobility training, technologies with haptic interaction can assist individuals with visual impairments in learning how to navigate safely and effectively using the sense of touch. This paper presents a virtual reality platform designed to support the development of navigation techniques within a safe yet realistic environment, expanding upon existing research in the field. Following extensive optimization, we present a visual representation that accurately simulates various 3D tile textures using graphics replicating real tactile surfaces. We conducted a user interaction study in a virtual environment consisting of 3D navigation tiles enhanced with tactile textures, placed as they would be in a real-world scenario, to assess user performance and experience. This study also assesses the usability and user experience of the platform. We hope that the findings will contribute to the development of new universal navigation techniques for people with visual impairments.

1. Introduction

The population with visual impairments has increased over the past two decades. Visual impairment affects hundreds of millions of people globally and manifests in various forms and severities. Vision loss significantly impacts individuals’ quality of life and daily functioning. A 2015 World Health Organization (WHO) study reported a reduction in blindness prevalence from 0.75% in 1990 to 0.48% in 2015, with moderate to severe vision loss decreasing from 3.83% to 2.90% [1]. A 2020 WHO report estimates that approximately 2.2 billion people worldwide have a vision impairment [1]. The projections for the future are concerning: the number of people affected by these conditions is expected to reach nearly 115 million by 2050. Contributing factors include population growth and aging; an increasing diabetes prevalence, one of the leading causes of blindness; urbanization with poor environmental conditions; and limited access to healthcare, resulting in delays in diagnosis and treatment.
For individuals with visual impairments, independence and self-confidence in performing daily activities are significantly affected. Blindness affects both the psychological well-being and the practical ability of individuals to function independently. The main areas affected include
  • Independence (daily activities, mobility, use of public transportation);
  • Self-confidence (reduced self-esteem, decreased confidence in their abilities);
  • Emotional impact (influence on social relationships, isolation, social prejudice and stereotypes, social activity, grouping).
A decline in quality of life and productivity, particularly among older adults, imposes significant economic costs on society, including increased expenditures for long-term care [2]. Most studies emphasize that a reduced quality of life primarily concerns elderly populations. The study by Tore Bonsaksen et al. expands the literature by showing that variations in quality of life occur across adults of all age groups [3]. Moreover, the findings suggest that these differences appear mainly at the upper end of the quality-of-life scale, as individuals with visual impairments tend to adapt comfortably to intermediate levels. The results indicate that visual impairment hinders individuals from attaining the highest levels of their perceived ideal quality of life. The extent to which these effects are felt depends on how well a society has created the appropriate conditions for people with vision impairments to live socially active, meaningful, and fulfilling lives.
Factors such as functional independence, community inclusion, and satisfaction with daily life and employment can improve quality of life. A related study conducted in Taiwan [4] examines how visual impairment influences social participation, intending to develop interventions that promote inclusion and well-being among individuals with visual disabilities. Overall, the study underscores the need for policies that enhance access to employment and support independent living for individuals with visual impairments.
This research investigates the recognition and use of tactile paving in virtual reality, addressing the mobility and navigation challenges faced by visually impaired individuals in urban environments. Specifically, the difficulties they face are centered on
  • Outdoor navigation (safety while moving, risk of traffic accidents, navigating routes without assistance);
  • Public transportation usage (inability to orient themselves at stations, difficulty accessing vehicles);
  • Sensory-guided routes (tactile paving and other guidance technologies, use of tools such as GPS and navigation devices);
  • Indoor orientation (difficulty navigating enclosed environments, need for voice guidance systems).
In recent years, modern positioning and object detection systems have been developed. These systems utilize a combination of radio frequency technologies, sensors, and vision-based methods, including GPS, WPS, RFID, cellular-based systems, UWB, WLAN, and Bluetooth. In their study, Hui Liu et al. [5] review these technologies and conclude that new hybrid positioning algorithms are needed to integrate the strengths of the existing methods. Many major cities have embraced the concept of accessible tourism, which emphasizes that the built environment should be usable by individuals with diverse abilities [6]. This approach encourages individuals with disabilities to travel to these cities without experiencing feelings of isolation, fear, or insecurity. Tactile ground surface indicators are an effective tool in these cities to ensure safe mobility for blind individuals. These indicators were first implemented in Okayama, Japan, in 1967. They are mostly embossed square tiles that indicate a direction, a change in direction, hazard zones, and service points [6,7]. Architecturally, these are textured surfaces made from durable materials such as steel, granite, or hard rubber and feature contrasting colors like red and yellow to ensure they are easily perceptible and recognizable by touch. The tiles’ design and strategic placement in urban environments are critical considerations. However, it has been observed that while tactile surfaces aid individuals with visual impairments, they may pose obstacles for wheelchair users or elderly individuals [6].
Mobility and navigation in urban environments remain major challenges for individuals with visual impairments. However, with adequate support, targeted training, and the integration of modern technologies, there is considerable potential to enhance their independence and rebuild self-confidence. Orientation and Mobility (O&M) training is a key factor in enabling individuals to move independently in any environment, whether familiar or unfamiliar [2,8].
Navigation aids, such as canes, guide dogs, and assistive technologies, are crucial in enhancing mobility and independence for individuals with visual impairments [9]. However, each of these aids, along with Orientation and Mobility (O&M) training techniques, presents certain limitations and drawbacks. Their effectiveness varies depending on individual preferences, the type of visual impairment, and the specific navigation strategies employed. This highlights the need for consistent and universal navigation aids [10]. A significant concern is that these tools can expose users, particularly trainees, to risks such as falls, injuries, or unintended contact with people or objects [2]. A study involving students with visual impairments investigated the challenges encountered during Orientation and Mobility training [8]. Commonly reported challenges include difficulty recognizing obstacles while in motion, an increased risk of accidents, and limitations in self-protective behaviors.
This study explores how tactile paving can be used in virtual reality to help people with visual impairments move around and navigate more easily in urban environments. We propose a guided virtual reality experience, specifically designed for people who are blind, which uses haptic feedback technology. In a safe environment, participants can practice and train, while both quantitative and qualitative data are collected in real time.

2. Related Work

Although research specifically addressing the simulation and interaction with tactile paving in urban environments for individuals with visual impairments remains limited, several studies have explored related domains using virtual reality (VR) and augmented reality (AR) technologies to improve navigation and mobility. In their study, Fabiana Sofia Ricci et al. [2] examine the use of virtual reality as a tool for Orientation and Mobility training among individuals with visual impairments. The virtual environment designed in Unity and the use of the Oculus Quest VR headset offer interactive training through haptic, visual, and auditory feedback. The study focuses on simulating glaucoma, during which users demonstrated decreased performance in Orientation and Mobility assessments. Additionally, participants reported high levels of engagement and immersion in the VR environment. The platform proved effective in raising awareness about the challenges of visual impairments.
In their study, Alice Lo Valvo et al. present ARIANNA+ [10], a navigation system that uses computer vision [11], machine learning [12], and augmented reality (AR) to support the mobility of individuals with visual impairments. Essentially, ARIANNA+ is an enhanced version of the original ARIANNA system [13], which assists users in following predefined routes in both indoor and outdoor environments. Testing demonstrated that the system achieves a high accuracy in guiding users and recognizing landmarks using machine learning techniques. Leveraging technologies such as ARKit, SceneKit, convolutional neural networks (CNNs), and optical flow tracking, ARIANNA+ delivers an innovative and accessible navigation solution operable on a standard smartphone. Similar research deals with the development of autonomous navigation systems, presenting SEVN (Sidewalk Environment for Visual Navigation), an innovative navigation system based on reinforcement learning (RL) [14]. The study investigates the system’s ability to accurately reach designated destinations using multimodal input data. Experimental results showed a 74.9% success rate, indicating a promising performance but also highlighting the need for further research to improve generalization and efficiency. Many navigation applications use various spatial landmarks to help users orient themselves and navigate safely. However, there is a lack of systematic categorization and integration of these landmarks within navigation systems. A related study addresses this gap by categorizing the most frequently used landmarks, based on interviews with individuals who are blind [15]. The landmarks are classified into those detectable by touch (e.g., tactile floors, stairs, building corners) and those that confirm the user is on the correct path (e.g., auditory signals, traffic light signals). The study concludes that incorporating these landmarks into mapping algorithms and route planning can significantly enhance navigation for individuals with visual impairments. To further enhance and facilitate this integration, specific geometric features and constraints were established for each type of landmark. For instance, tactile flooring is modeled as a 400×400 mm rectangle, while pedestrian crossings are represented as rectangular areas connecting the BEVs (Built Environment Vertices) on either side of the street [15].
In a related experiment, nine distinct geometric patterns were tested for their ability to convey haptic properties within a virtual 3D environment [16]. Using the H3D software (version 2.4.0) and the Touch Haptic Device (Phantom Omni), the study examined users’ ability to recognize and differentiate between the nine patterns. Users could distinguish the different textures, with varying recognition rates for each. The VirtuNav system is an innovative navigation system that offers independence and autonomy [17,18]. It combines virtual reality (VR), haptic feedback, audio cues, and advanced navigation algorithms to assist users in exploring indoor spaces. The system reconstructs 2D floorplans into immersive 3D environments using the X3D graphical format to generate detailed scenes and models. Obstacle recognition in the scene is achieved using image-processing algorithms. Participants using the Novint Falcon 3-degrees-of-freedom (DoF) Haptic Device demonstrated improved spatial awareness over time, with concurrent reductions in collisions during repeated trials. The transfer of knowledge from the virtual to the real environment was successful. The article by Suayder M. Costa et al. proposes an innovative navigation system combining visual attention and haptic guidance [19]. This research focuses on a machine learning framework designed to detect optimal navigation paths using weakly supervised learning, requiring minimal human intervention. Haptic feedback is achieved through vibrations with accurate guidance, even in complex environments. Another noteworthy innovation is virtual paving [20], which supports independent navigation through non-visual sensory feedback. Its design guidelines are informed by interviews and usability tests with blind individuals, aiming to facilitate smooth and efficient walking experiences. Feedback is delivered through a wearable, everyday-use backpack. Trial evaluations indicated that participants were able to navigate smoothly along primary pathways measuring 2.1 m in width. The research of Matthew S. K. Yeo [7] combines hazard detection for robots moving through public spaces with visual and haptic tools. The approach enhances conventional tactile paving to convey additional spatial hazard information. This experimental tactile paving design enables robots to interpret embedded information using a custom TSM haptic detection system in combination with a graph neural network (GNN). Experimental results demonstrated a 71.6% improvement in proactive hazard detection capabilities.
Despite significant advances in virtual navigation tools for individuals with visual impairments, existing approaches often present limitations regarding realism, standardization, and evaluation methodology. For instance, the work of Lahav et al. [21,22] focuses primarily on cognitive mapping through tactile representations of spatial layouts but lacks direct engagement with realistic path-following tasks. Similarly, the system proposed by Kreimeier and Götzelmann [23], which incorporates real-world proxy objects into VR environments, offers limited scalability and flexibility due to its dependence on physical setups. The study by Zhao et al. [24] introduces haptic and auditory feedback via a simulated white cane but does not integrate structured urban elements such as tactile paving surfaces. In this context, our work attempts to fill this gap by proposing a fully digital and configurable system grounded in official tactile paving standards (Greek Government Gazette 2/2022) [25], enabling immersive, multimodal training within realistic urban scenarios. Moreover, it incorporates a quantitative assessment framework (AS1, AS2, AS3) and allows users to adjust key haptic parameters to suit their perceptual preferences.

3. Materials and Methods

3.1. Short Description

This section presents the implementation of an application designed to train and familiarize blind individuals with navigating pathways using virtual 3D haptic tactile paving tiles. The section begins by introducing the hardware and software components, along with the experimental conditions and key design considerations. The focus then shifts to the environment’s design, detailing the haptic elements and the tasks assigned to participants within the platform. Next, the experimental simulation is presented along with the hypotheses underlying the experiment and the statistical methods used to test them. The virtual reality platform is intended to equip specialists with a tool for teaching visually impaired individuals to navigate in an informed, effective, and safe manner, while enhancing their engagement through an immersive virtual experience.

3.2. Software and Device

To create the virtual environment, the Unity platform was selected from among the available options. It is a popular, free software development tool primarily used for creating games and applications with interactive 3D and 2D graphics [26,27]. It offers robust tools for graphics development, user interaction, and scene management. Unity supports a wide range of platforms, including PCs, consoles, mobile devices, and VR systems, enabling developers to deploy applications across diverse environments. The platform uses the C# programming language and provides extensive libraries and tools for developing, designing, and integrating graphics and interactive elements.
The haptic device used to interact with the virtual environment is the “Touch”, developed by 3D Systems (Figure 1). The “Touch” is an electromechanical device designed to provide kinesthetic feedback by simulating the sensation of physical objects through resistance to the user’s movements. This enables users to “feel” different textures and shapes and to interact with virtual objects via a contact and deformation algorithm [28]. The device is commonly used in VR applications, medical simulations, training scenarios, and fields that demand precise manipulation of virtual models, helping reduce product design and development costs [29]. The device features a stylus with three degrees of freedom (DOF) of motion, which provides resistance or “feedback” to mimic the sensation of physical objects, such as touching hard or soft surfaces. Previous studies have evaluated the device in human–machine interaction contexts, particularly for haptic guidance applications [28]. A key design consideration for the application was identifying the device’s optimal operating zone and effective working area [30]. The device’s structural features and capabilities aligned well with the application’s interaction requirements. It supports 3-DOF force feedback and 6-DOF positional sensing [31].

3.3. Experimental Design

Tactile paving surfaces play a crucial role in assisting individuals with visual impairments by helping them orient themselves and detect potential hazards. However, significant variations in the technical implementations across countries may lead to confusion for users. The literature review identified diverse implementation approaches of tactile flooring at bus stops in various countries [6]. Tactile paving tiles are designed to provide orientation and safety information to visually impaired individuals through touch, either via the feet or a cane. The primary pattern types include linear grooves or ridges, which indicate the direction of movement and are typically installed along pathways and corridors leading to key locations. Another common pattern consists of round domes, which signal caution or stopping points. These are usually placed before turns or intersections to prompt increased environmental awareness near platform edges or information points. Studies evaluating the effectiveness of tactile paving at pedestrian crossings for visually impaired individuals have shown that well-designed tactile guides positively impact safety and navigation [33]. In our work, the 3D modeling of the guidance tiles followed standards set by Greek legislation, specifically Government Gazette No. 2, dated December 7, 2022, Article 7 [25]. Tile patterns (Figure 2 and Table 1) were selected based on their prescribed textures, dimensions, and color specifications. In the virtual interaction environment, regulations were adhered to concerning tile size, the distance from the pedestrian free-walk zone and building line, and the width of the blind guidance path. At points where the direction changes perpendicularly, three Type A tiles are placed perpendicular to the pathway to serve as a warning. The table below presents the tile types designed and used in our application. Type A is further categorized into Type A1 and Type A2, depending on whether it is used for directional guidance (aligned with movement) or as a turning warning (perpendicular to movement).
Designing interactive systems remains a complex endeavor, particularly due to the diversity of users, which necessitates interfaces that are intuitive, accessible, and adaptable to varying levels of user experience. In his study, Guerra A. [34] proposes a model that integrates technological innovation with sustainable design principles, drawing on craft-based methodologies and strongly emphasizing interdisciplinary collaboration.
This study investigates whether blind and blindfolded users can recognize tactile guidance tiles accurately. Additionally, we assess users’ ability to navigate a predefined path, designed according to Greek legislation on tactile guidance systems, using only their sense of touch. Specific haptic parameters were assigned to each guidance tile to support tactile differentiation. Table 2 presents three predefined combinations of haptic parameters (Tile Presets 1, 2, and 3), which are used to render different textures on virtual tiles within the virtual reality environment. Each combination is defined by four parameters: tile stiffness, damping, static friction, and dynamic friction. Stiffness and damping have a fixed value of 1 across all presets, as they are considered sufficient for conveying the basic sense of solidity of the tile. In contrast, the static and dynamic friction parameters vary from 0.25 to 0.75 to offer a range of haptic experiences: Preset 1 provides a smooth and easier-to-touch texture, while Preset 3 generates strong friction and thus a rougher texture. These combinations allow users to select the one that best matches their haptic sensitivity, enhancing the readability of the tiles and navigation efficiency. The values range from 0 to 1, facilitating a comparison and evaluation of tactile feedback across different tile types while also ensuring a standardized haptic experience according to the Haptics Direct Unity plugin manual [35].
Stiffness: Stiffness shows how “hard” a surface feels when you touch it through a haptic device. It is calculated with the formula F = Kx, where F is the force, x is the displacement, and K is the spring constant. If stiffness is set to zero (0), then no other haptic sensation will be felt.
Damping: Damping reduces the bounciness or elasticity of a surface. Its value ranges from 0 (no damping—very springy surface) to 1 (maximum damping—minimal bounce). It helps reduce the rebound effect.
Static Friction: This defines how hard it is to start sliding your finger on the surface from a complete stop. A value of 0 means no friction at all, while a value of 1 means the maximum static friction the device can simulate.
Dynamic Friction: This shows how hard it is to keep sliding your finger once it has already started moving. Again, 0 means no friction, and 1 is the maximum dynamic friction the device can render.
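To illustrate how these normalized parameters can be organized in code, the following C# sketch encodes the presets of Table 2 as a small value type. The type name, the intermediate friction value for Preset 2, and the clamping to [0,1] are illustrative assumptions; the actual Haptics Direct plugin exposes its own material component.

using UnityEngine;

// Hypothetical container for the four haptic parameters of Table 2.
// This is not the Haptics Direct API; it only mirrors the manual's
// normalized [0,1] value ranges described above.
public struct TilePreset
{
    public float Stiffness;       // 0..1; scales the spring force F = Kx
    public float Damping;         // 0..1; suppresses surface rebound
    public float StaticFriction;  // 0..1; resistance to starting a slide
    public float DynamicFriction; // 0..1; resistance while sliding

    public TilePreset(float stiffness, float damping, float staticFriction, float dynamicFriction)
    {
        // All values are clamped to the normalized range used by the plugin manual [35].
        Stiffness = Mathf.Clamp01(stiffness);
        Damping = Mathf.Clamp01(damping);
        StaticFriction = Mathf.Clamp01(staticFriction);
        DynamicFriction = Mathf.Clamp01(dynamicFriction);
    }

    // Presets of Table 2: stiffness and damping fixed at 1; friction rising
    // from a smooth (Preset 1) to a rough (Preset 3) texture. The 0.50
    // value for Preset 2 is an assumption based on the stated 0.25-0.75 range.
    public static readonly TilePreset Preset1 = new TilePreset(1f, 1f, 0.25f, 0.25f);
    public static readonly TilePreset Preset2 = new TilePreset(1f, 1f, 0.50f, 0.50f);
    public static readonly TilePreset Preset3 = new TilePreset(1f, 1f, 0.75f, 0.75f);
}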
Table 3 refers to the predefined collider settings, that is, the “collision body” used to detect contact between the controller and the virtual tiles in the virtual reality environment. The terms “small” (Collider Preset 1), “medium” (Collider Preset 2), and “large” (Collider Preset 3) describe the size of the collider surface used to record the user’s interaction with the tile’s texture. Specifically, the values are defined as follows:
Small: diameter approximately 1 cm;
Medium: diameter approximately 2.5 cm;
Large: diameter approximately 4 cm.
Participants in the application were able to select the appropriate collider based on their preferences and tactile sensitivity, thereby enhancing the personalization of the experience.
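For concreteness, a minimal sketch of how these preset sizes could be mapped onto a Unity SphereCollider follows; the component and enum names are hypothetical. Since Unity units correspond to metres, the quoted diameters translate into radii of 0.005, 0.0125, and 0.02.

using UnityEngine;

// Illustrative mapping of the Table 3 collider presets onto a SphereCollider.
public class ColliderPresetSelector : MonoBehaviour
{
    public enum ColliderPreset { Small, Medium, Large }

    public void Apply(ColliderPreset preset)
    {
        var sphere = GetComponent<SphereCollider>();
        switch (preset)
        {
            case ColliderPreset.Small:  sphere.radius = 0.0050f; break; // ~1 cm diameter
            case ColliderPreset.Medium: sphere.radius = 0.0125f; break; // ~2.5 cm diameter
            case ColliderPreset.Large:  sphere.radius = 0.0200f; break; // ~4 cm diameter
        }
    }
}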
Moreover, our aim is to design virtual spaces with tactile interaction that meet the accessibility standards required for blind users. This implies that the integration of both tactile and auditory feedback in a harmonized manner is essential for enhancing the user experience of individuals who are blind. This aligns with the findings of Sara Alzalabny et al. [36], who combine tactile and auditory feedback in user interface design. In a multimodal system, coordinating the integration of interaction modes is vital to ensure no conflicts arise. This research investigates the key aspects of multimodal interaction, including haptic interaction, offering insights into their practical application in real-world environments [37].
To achieve a more realistic experience, we added audio messages to the application, complementing the haptic and visual information. Audio feedback is crucial in enriching the user experience in haptic applications. In our implementation, audio cues inform users of ongoing actions, alert them to errors, and provide a confirmation of successful interactions.

3.4. Experimental Conditions

The experiment involved four scenarios (one training and three experimental) and was conducted with three categories of participants: individuals with total blindness, individuals with partial vision, and individuals with normal vision who were blindfolded throughout the procedure. The presence of differing sensory abilities among these groups is a critical factor in interpreting the results. Research such as that by Flamine Alary [38] has shown that individuals with blindness demonstrate a superior tactile ability in certain discrimination tasks compared to sighted individuals, due to neuroplasticity and enhanced processing of tactile stimuli. At the same time, other studies have confirmed the improved tactile acuity of blind individuals, particularly in tasks involving detailed surface and pattern recognition [39]. This difference highlights the need for tactile interfaces and applications to be designed inclusively, so that they are understandable and functional for users with different visual profiles, without relying solely on a hypothetical “average” experience of the sighted user. In the context of the present study, although participants with normal vision were blindfolded, we acknowledge that this cannot fully simulate the actual navigation experience of a person with visual impairment, an important consideration in the interpretation of the results.
This study involved 19 participants in total, comprising 9 men and 10 women. Because ages were recorded as ranges rather than exact values, the mean and standard deviation were estimated from assumed representative ages within each range. The age distribution of the participants and the assumption made for each range are presented in Table 4.
Based on these assumptions, the overall mean age was 37.42 years, with a standard deviation of 8.8 years.
Four participants, two males and two females, had partial blindness; two male participants had total blindness; and thirteen (five males and eight females) were sighted individuals. The six blind participants were selected in collaboration with the “Association of the Blind of Western Macedonia, Greece” and the “Panhellenic Association of the Blind of Central Macedonia, Greece.” Two participants had complete vision loss (100%), which had progressively worsened over the years. Among the four participants with partial vision loss, one had a 95% impairment since birth, while the others experienced vision losses of 87%, 90%, and 95%, respectively, starting in early childhood. The diagnosed conditions were related to glaucoma and retinitis pigmentosa.
The sample of 13 participants was selected through non-random sampling from the researcher’s circle of acquaintances. This selection was based on practical and research criteria. Specifically, some of the participants had previously participated in similar research processes, which contributed to their increased familiarity with the tactile device used, as well as with the broader issue of awareness addressed by this study. This experience was considered useful as it reduced the time needed to familiarize participants with the process and contributed to the smooth running of the experiment. Although this method does not ensure the representativeness of the sample, the data collected are considered indicative and appropriate for the needs of this pilot/exploratory study.
The experiment took place in a quiet, controlled laboratory environment. Each participant was seated at an individual workstation equipped with a Touch haptic feedback device and a laptop running the interactive application (Figure 3).
Before the start of the procedure, participants were fully briefed on this study’s objectives and provided informed consent in accordance with ethical research guidelines. Participants then completed an initial tutorial task designed to familiarize them with the virtual environment, the haptic device, the various tactile tile textures, and the audio messages. During interaction with the tile patterns, users were given the opportunity to adjust the Tile Preset (Table 2) and Collider Preset settings (Table 3), enabling them to select the combination that best facilitated haptic texture perception. In this phase, users were invited to freely explore the tile surfaces and the surrounding virtual environment, with audio messages offering guidance for scene navigation. Subsequently, participants completed a five-step movement test in which they were asked to identify the five tile types (Table 1). Each tile type appeared once per trial in a randomized sequence along the haptic five-step path. The application recorded both the accuracy of tile recognition and the total duration of user interaction. In the subsequent condition, participants navigated two urban scenarios, one with low difficulty (Scenario 2) and one with moderate difficulty (Scenario 3), featuring structured layouts composed of the previously encountered tile types (Figure 4a,b). During navigation along the route of Scenario 2 (Figure 4a), participants had direct control over their movement using a keyboard, proceeding along a predefined path. The design of the application did not allow for deviations; any attempt to stray from the designated route triggered an error sound and was recorded as a mistake. In contrast, in Scenario 3 (Figure 4b), route navigation was controlled by the application operator, who followed the verbal instructions of the user as the latter interacted with the Touch device. At this stage, the design allowed movement outside the tactile path, with the application recording the percentage of on-path movement time relative to total interaction time.
Several studies suggest that the primary challenges of wayfinding are rooted in cognitive processes [40]. Orientation in navigation requires cognitive processes such as determining one’s current position relative to a reference point, selecting a route toward a destination, maintaining that route, and recognizing the endpoint upon arrival. Participants explored the virtual environment using the haptic device, which provided vibrotactile feedback as they moved along or deviated from the designated path. Distinct vibration patterns and audio cues aided participants in perceiving the movement direction, staying on course, and making corrections when necessary, thereby delivering essential spatial information throughout the task. Accordingly, the haptic device functioned as an input mechanism for navigation and an output device for delivering vibrotactile stimulation. The application tracked the number of errors (i.e., deviations from the path), movement duration along the path, and total interaction time (Figure 5a,b).
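A hedged sketch of the logging behavior described above is given below: a Unity component that counts deviations from the tactile path, plays the error sound, and accumulates on-path versus total interaction time. The tag name “PathTile” and the field layout are illustrative assumptions, not the authors’ actual implementation.

using UnityEngine;

// Counts path deviations and accumulates on-path vs. total interaction
// time for the cursor/controller object this component is attached to.
public class PathDeviationLogger : MonoBehaviour
{
    public AudioSource audioSource;
    public AudioClip errorClip;            // error sound, as in Scenario 2

    public int ErrorCount { get; private set; }
    public float OnPathTime { get; private set; }
    public float TotalTime { get; private set; }

    private int tilesUnderCursor;          // path tiles currently overlapped

    private void Update()
    {
        TotalTime += Time.deltaTime;                        // total interaction time
        if (tilesUnderCursor > 0) OnPathTime += Time.deltaTime; // time on the tactile path
    }

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("PathTile")) tilesUnderCursor++;
    }

    private void OnTriggerExit(Collider other)
    {
        if (!other.CompareTag("PathTile")) return;
        tilesUnderCursor--;
        if (tilesUnderCursor == 0)                          // left the path entirely
        {
            ErrorCount++;                                   // deviation logged as an error
            audioSource.PlayOneShot(errorClip);             // audible warning
        }
    }
}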
The design of the experimental procedure of Scenarios 2 and 3 was based on a realistic urban navigation scenario, as individuals with visual impairments might experience it. The platform was developed to simulate a realistic training session, which had a maximum duration of 30 min, to minimize potential discomfort caused by prolonged exposure to a virtual reality environment. Certain specifications were established during the experimental phase to achieve this goal. In the initial task, which involved pattern recognition, participants were given a time limit of 5 min for familiarization, after which their performance was recorded. The remaining time (up to 25 min) was used to navigate the two routes of Scenarios 2 and 3 with increasing difficulty (Figure 4a,b). Successful completion of the first route was not a prerequisite for proceeding to the second. In any case, if the participant experienced fatigue or discomfort, they were allowed to terminate the interaction at any time.
Participant performance was evaluated based on the total interaction time, tile recognition accuracy, error count, and duration spent navigating along the intended path. Finally, all participants, regardless of whether they completed the experimental procedure, provided feedback on the application by completing a post-experiment questionnaire (Appendix A).

4. Results

This study’s primary objective was to gain a deeper understanding of how blind users perceive haptic technology through interaction with objects in three-dimensional virtual environments. The following analysis presents the results of participants’ interactions across the three experimental scenarios and the questionnaire findings. The findings highlight the participants’ ability to distinguish various textural patterns on pathway virtual tiles within a 3D environment, with the ultimate goal of transferring this perceptual experience to urban navigation in the real world. User performance and evaluation scores were analyzed independently. According to the experimental design, participants interacted with the virtual haptic tiles in three different scenarios (Scenario 1, Scenario 2, Scenario 3). A mathematical model was developed for each scenario to calculate navigation ability, based on the experimental data. The goal of the mathematical formula is to reflect the user’s ability to correctly identify textures, navigate accurately and quickly, and maintain precision in their path without making errors.
Upon completion of the experiment, participants answered a series of 20 questions, 12 based on the USE questionnaire and 8 based on the UEQ. The USE (Usefulness, Satisfaction, and Ease of Use) consists of four thematic sections (Usefulness, Ease of Use, Ease of Learning, and Satisfaction), each comprising three questions. It measures usefulness, ease of use, learning curve, and satisfaction, focusing on the functional aspects of the system [41]. The UEQ (User Experience Questionnaire), consisting of eight questions, examines more emotional dimensions, such as attractiveness, innovation, and the overall user experience with the system. Combining the two questionnaires helped us gain a complete picture of the practicality and the emotional interaction of the users [42].

4.1. Analysis of the Experimental Procedure Results

During the first scenario, the participants were asked to recognize each of the five tile patterns of Figure 2. We calculate the overall Ability Score (AS1) for each participant, based on the recognition rate (A) and the interaction time (T1). Using the mathematical Equation (1), we defined a performance function that rewards high recognition accuracy (A) and penalizes long interaction times (T1).
Ability Score (AS1) = A × (T1min/T1)^b
  • A: normalized recognition rate, with values ranging between [0,1];
  • T1: total interaction time in Scenario 1;
  • T1min: shortest interaction time among all participants;
  • b: weighting factor (we set b = 1).
A participant exhibiting a high recognition rate but a slow interaction speed will demonstrate low overall proficiency. Conversely, a participant with a moderate recognition rate who interacts rapidly will also not achieve a high Ability Score. The highest Ability Score is attained when a high recognition rate is combined with rapid interaction.
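A direct transcription of Equation (1) into C#, assuming interaction times in seconds:

// Equation (1): AS1 = A * (T1min / T1)^b.
// Rewards high recognition accuracy and penalizes long interaction times.
public static double AbilityScore1(double recognitionRate, double interactionTime,
                                   double fastestTime, double b = 1.0)
{
    // recognitionRate is the normalized accuracy A in [0,1];
    // fastestTime is the shortest interaction time among all participants.
    return recognitionRate * System.Math.Pow(fastestTime / interactionTime, b);
}

For example, with b = 1, a participant with A = 0.8 who took twice the fastest time receives AS1 = 0.8 × 0.5 = 0.4.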
In the second scenario (Scenario 2), the participants were asked to virtually navigate along a predefined path in Unity composed of various virtual tactile tiles (Figure 4a). For Scenario 2, we calculated the overall Ability Score 2 (AS2) for each participant based on two criteria: whether the navigation task was completed (C2) and the interaction time (T2). Errors in navigation, such as collisions and deviations from the tactile path, led to longer completion times, which in turn directly affected the navigation Ability Score (AS2).
Ability Score 2 is based on mathematical Equation (2). The performance function AS2 penalizes errors and delays, assigns zero performance in the case of incomplete execution, and rewards fast, accurate, and complete task completion. This specific metric was designed by Yokoyama et al. [43] to evaluate navigation performance by considering both the success and the speed of completion.
Ability Score 2 (AS2) = C2 × (T2min/T2)
  • C2: task completion status, with values of 1 for completion and 0 for non-completion;
  • T2: total interaction time in Scenario 2;
  • T2min: fastest interaction time among participants who completed the task.
Similarly, for Scenario 3, we calculated the overall Ability Score (AS3) for each participant when they navigated through the virtual path of Scenario 3, shown in Figure 4b.
Ability Score 3 (AS3) = C3 × (T3min/T3)
  • C3: task completion status, with values 1 for completion and 0 for non-completion;
  • T3: total interaction time in Scenario 3;
  • T3min: fastest interaction time among participants who completed the task.
If participants failed to complete a task, their ability score was set to zero. For completed tasks, the ability index was determined by the completion time.
We calculated the average navigation ability ASmean in Equation (4) and the overall performance AStotal in Equation (5) for each participant.
ASmean = (AS1 + AS2 + AS3)/3
AStotal = AS1 + AS2 + AS3
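The completion-gated navigation scores and the aggregates of Equations (2)–(5) reduce to a few lines of C#; the class layout below is illustrative:

// Sketch of Equations (2)-(5): completion-gated navigation scores and
// the per-participant aggregates. Field names are illustrative.
public sealed class ParticipantScores
{
    public double AS1, AS2, AS3;

    // AS2 and AS3 share the same form: zero when the task was not
    // completed, otherwise the ratio of the fastest completion time
    // among finishers to this participant's time.
    public static double NavigationScore(bool completed, double time, double fastestTime)
        => completed ? fastestTime / time : 0.0;

    public double ASmean => (AS1 + AS2 + AS3) / 3.0;   // Equation (4)
    public double AStotal => AS1 + AS2 + AS3;          // Equation (5)
}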
In Table 5, the average, standard deviation, and minimum and maximum values of AS1, AS2, and AS3 for all user categories are presented.
To examine whether there were performance differences between blind participants (6 individuals) and sighted participants (13 individuals), we used the Mann–Whitney U test [44,45]. This is a statistical method that compares two independent groups without requiring the data to follow any specific distribution (such as the normal distribution). The test was applied to the variables AS1 (recognition), AS2 (navigation in Scenario 2), and AS3 (navigation in Scenario 3) to determine whether their values differed significantly between the two groups.
The Mann–Whitney U test ranks all the values and compares the sum of their ranks, focusing on the overall difference in values between the groups. It is suitable for cases with a small number of participants, such as this study, and provides reliable results without requiring the strict assumptions of the t-test. The analysis can be easily performed in SPSS version 29.0 [46] by setting the group variable as whether the participant is blind or not and the dependent variables as AS1, AS2, and AS3. The methodology and calculations followed were based on scientific sources and established practices, as described in the literature [44,45,46]. We calculated the p-values (e.g., p < 0.05) to determine whether the differences were statistically significant, and we computed the effect size (r) to assess the magnitude of the difference between the two groups.
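For readers who want to reproduce the analysis outside SPSS, the sketch below implements a two-sided Mann–Whitney U test with the normal approximation (average ranks for ties, no continuity or tie-variance correction) and the effect size r = |z|/√N. It is an illustrative re-implementation under those assumptions, not a validated statistical routine and not necessarily the exact procedure SPSS applies to small samples.

using System;
using System.Linq;

public static class MannWhitney
{
    public static (double U, double p, double r) Test(double[] g1, double[] g2)
    {
        int n1 = g1.Length, n2 = g2.Length, n = n1 + n2;
        var pooled = g1.Select(v => (v, grp: 1))
                       .Concat(g2.Select(v => (v, grp: 2)))
                       .OrderBy(t => t.v).ToArray();

        // Assign average ranks to runs of tied values (ranks are 1-based).
        var ranks = new double[n];
        for (int i = 0; i < n; )
        {
            int j = i;
            while (j + 1 < n && pooled[j + 1].v == pooled[i].v) j++;
            double avg = (i + j + 2) / 2.0;
            for (int k = i; k <= j; k++) ranks[k] = avg;
            i = j + 1;
        }

        double r1 = Enumerable.Range(0, n).Where(i => pooled[i].grp == 1).Sum(i => ranks[i]);
        double u1 = n1 * n2 + n1 * (n1 + 1) / 2.0 - r1;
        double u = Math.Min(u1, n1 * n2 - u1);         // report the smaller U

        double mu = n1 * n2 / 2.0;                     // mean of U under H0
        double sigma = Math.Sqrt(n1 * n2 * (n + 1) / 12.0);
        double z = (u - mu) / sigma;
        double p = 2.0 * NormalCdf(-Math.Abs(z));      // two-tailed p-value
        double effect = Math.Abs(z) / Math.Sqrt(n);    // effect size r
        return (u, p, effect);
    }

    // Standard normal CDF via the Abramowitz-Stegun approximation.
    private static double NormalCdf(double x)
    {
        double t = 1.0 / (1.0 + 0.2316419 * Math.Abs(x));
        double d = 0.3989423 * Math.Exp(-x * x / 2.0);
        double upper = d * t * (0.3193815 + t * (-0.3565638
                     + t * (1.781478 + t * (-1.821256 + t * 1.330274))));
        return x >= 0 ? 1.0 - upper : upper;
    }
}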
The results of the Mann–Whitney U test for AS1, AS2, and AS3 (blind vs. sighted) are presented in Table 6.
In all cases, the p-values were significantly greater than the significance level of 0.05, indicating that no statistically significant difference in performance emerged between blind and sighted participants in any of the scenarios [45]. At the same time, the effect size (r) values were very low (r < 0.3 in all cases), which indicates a negligible difference between the two groups [45]. This finding shows that user performance in the specific virtual environment was not significantly affected by the presence or absence of vision, which is likely due to the nature of the task relying mainly on tactile and auditory information.

4.1.1. Scenario Comparison

We categorize the scenarios into two groups. The first group consists of Scenario 1, which assesses tile recognition ability. The second group consists of Scenario 2 and Scenario 3, which focus on evaluating navigation ability. During the analysis of the results, the minimum completion time per scenario across all users (66, 193, and 188 s, respectively) served as the basis for normalization in calculating AS1, AS2, and AS3.
In Scenario 1, a lower mean performance is observed compared to Scenarios 2 and 3, indicating that tile recognition may be more demanding or may require a different set of skills than navigation. This finding can be explained by the fact that recognition tasks demand greater precision and clarity in responses, whereas in navigation tasks, users were able to make faster decisions by relying on pre-existing tile sequence rules, thus requiring less interaction time. Moreover, Scenario 1 exhibits lower performance variability (standard deviation = 0.18), suggesting greater homogeneity among users. In contrast, Scenario 2 presents the highest variability (standard deviation = 0.26), indicating a wider range of navigation abilities. This variation is attributed to the adoption of diverse navigation strategies by users: individuals with visual impairments employed techniques they already use in real-world environments, while sighted users, encountering these tactile patterns for the first time, developed new interaction strategies.
The average Ability Score in Scenario 2 (AS2) is higher than in Scenario 3, indicating that users performed better in the navigation task of Scenario 2. This outcome can be attributed to the reduced complexity of Scenario 2, as its environment consisted solely of turns, without intersections or other complex navigation elements. However, it was observed that several users encountered difficulties distinguishing between turn warning tiles (Type A2) and hazard warning tiles (Type D). Additionally, the larger standard deviation observed in Scenario 2 suggests a greater variability in user performance, which may be associated with different levels of familiarity with the navigation system. This variability was also influenced by the amount of time each user dedicated to familiarizing themselves with the device and the overall application environment. Navigation times varied significantly, with some users exhibiting either considerably higher or notably lower times compared to the average, potentially indicating difficulties or errors during navigation.

4.1.2. User Performance Analysis

Before proceeding to the individual analysis of user performance, we aim to establish a categorization of navigation ability levels according to the following methodology:
  • High ability: User_AS_mean ≥ 0.75;
  • Moderate ability: 0.50 ≤ User_AS_mean < 0.75;
  • Low ability: User_AS_mean < 0.50.
Overall, no user managed to reach the 0.75 threshold that we defined for “high ability”. Five users demonstrated moderate ability, while the majority of users (74%) exhibited low navigation and recognition ability. Only one user achieved a performance of ASmean = 0.72, almost reaching the “high ability” threshold (≥0.75). The low values of ASmean indicate potential difficulties in using the system or incorrect decisions during the execution of the scenarios.
Furthermore, it is observed that the choices of “Tiles Preset” and “Collider Preset” of Table 2 and Table 3 influence the navigation Ability Score AS. Specifically, “Tiles Preset 1” and “Collider Preset 1” are associated with higher navigation ability, while users who worked with “Tiles Preset 2” and “Collider Preset 2” demonstrated lower performance.
The performances of blind and sighted users across the three distinct scenarios (AS1, AS2, AS3) reveal substantial differences both in mean performance and in performance variability between the two groups. In Scenario 1, blind users exhibited a lower mean AS1 (0.25) compared to sighted users (0.34), accompanied by an exceptionally low standard deviation (0.03), indicating systematic barriers but also a high degree of internal consistency within the group. This is because sighted users could see the shape and layout of the tiles during the training interaction, before the experiment began, whereas blind users could only touch them through the “Touch” device, without a visual image. In contrast, in Scenario 3 (AS3), blind users outperformed sighted users (mean AS3 of 0.68 vs. 0.30). Scenario 2 (AS2) demonstrated intermediate results (mean AS2 of 0.52 for blind users versus 0.44 for sighted users). Overall, the results highlight the importance of designing user interfaces to be accessible for individuals with different abilities, both cognitive and sensory, and emphasize the need for scenarios to be adapted to the needs of each user group.

4.2. Analysis of the Results of the Evaluation Questionnaire

All participants, comprising 10 women and 9 men, responded to the evaluation form provided at the conclusion of the experiment. The majority of participants (52.6%) were aged between 36 and 45 years, and a significant proportion (56.2%) were employed in the public sector. For the analysis of the responses, a total of 19 participants were considered. A Likert-type questionnaire, with a scale ranging from one to five, was used for the 12 usability questions (USE) (Appendix A.1), while a seven-point Likert-type scale was employed for the 8 user experience-related questions (UEQ) (Appendix A.2). This method allowed for a comprehensive assessment of both the usability and user experience of the system.
In order to normalize and facilitate the comparison of the results, we calculated the average score percentage (RATE) for each question, as shown in Equations (6) and (7). This approach allowed for a standardized representation of the participants’ evaluations, making it easier to interpret and compare the responses across different questions and Likert scales.
RATE = (SUM × 100)/MAXSUM
MAXSUM = 19 (participants) × 5 (maximum score of the five-point USE scale) or 19 × 7 (maximum score of the seven-point UEQ scale)
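As a worked example of Equations (6) and (7) in C#:

// Equations (6)-(7): convert a summed Likert score to a percentage.
// maxScale is 5 for the USE items and 7 for the UEQ items.
public static double Rate(int sumOfResponses, int participants, int maxScale)
{
    double maxSum = participants * maxScale;   // e.g., 19 x 5 = 95 for USE
    return sumOfResponses * 100.0 / maxSum;
}

For instance, a hypothetical USE item summing to 86 across the 19 participants yields RATE = 86 × 100/95 ≈ 90.5%.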
The usability evaluation yielded overall very positive results, with all percentages exceeding 85%. The system’s utility was rated at 88.8%, confirming that it effectively meets users’ needs. Ease of use received a score of 85.6%, while ease of learning reached 86%, indicating that the system is largely user-friendly and does not require significant effort for comprehension. The highest score was observed in user satisfaction (90.5%), suggesting that the overall user experience was exceptionally positive. Despite the excellent results, small improvements in ease of use could further enhance the system’s overall efficiency and user experience (Table 7).
The analysis of the user experience of the UEQ in Table 8 revealed an overall highly positive evaluation by the participants. The individual dimensions of the experience, such as supportiveness (88.7%), ease of use (84.2%), efficiency (85.7%), and clarity (83.5%), demonstrated high levels of acceptance, indicating that the system is highly functional and user-friendly. Particularly positive evaluations were recorded in the dimensions of innovation and engagement, with the characteristics “Interesting” (94%), “Inventive” (91.7%), and “Leading edge” (92.5%) receiving the highest percentages, suggesting that our application was regarded as both pioneering and attractive. Despite the generally high satisfaction levels, it is notable that clarity received the lowest (yet still highly positive) score, potentially indicating an area for further improvement. Overall, the findings highlight an extremely positive assessment of the experience by users, with a particular emphasis on innovation, engagement, and support.

5. Discussion

This study highlights the significant potential of virtual reality (VR) and haptic feedback. The quantitative results regarding mobility skills (AS) and the qualitative data from the questionnaires revealed both the advantages and the limitations of the proposed platform. Specifically, the application was evaluated with high scores in usability (USE) and user experience (UEQ), receiving particularly positive responses concerning its innovation, ease of use, and functionality.
However, the user experience revealed critical areas that require further technical improvements. One of the main factors highlighted was the need for more training time and familiarization with the device and the virtual environment, as several participants reported difficulties in understanding the navigation rules. For example, an adaptive training module could be integrated, allowing gradual practice with increasing complexity and real-time feedback, in order to reduce the cognitive load and increase users’ confidence.
A significant technical issue concerned the differentiation of the various types of tactile tiles, particularly between types B and C, as well as A2 and D. Users had difficulty recognizing the patterns through texture alone, indicating the need for more pronounced differences in materials or surfaces. One possible solution would be the use of differentiated materials with varying roughness or the implementation of multidimensional tactile signals, such as combining different vibration frequencies and patterns. For example, adding additional vibration patterns that vary in intensity and duration, or using micro-elevations on specific tiles, could help achieve clear differentiation.
Additionally, users suggested the incorporation of auditory stimuli, such as natural ambient sounds (e.g., footsteps, the sound of water or birds), which would contribute to better spatial awareness and enhance the realism of the experience. At the same time, the use of sudden and intense vibrations when encountering obstacles could improve the sense of safety and provide immediate warnings of potential dangers, thus offering a more comprehensive and direct haptic feedback.
An additional important issue was the ergonomics of the Touch device, which caused fatigue after prolonged use, mainly due to the need to hold it in the air. This physical strain is a common observation, indicating the need for lighter or supportive design elements. For example, adopting a design that allows the device to be placed on a stable base or developing a wireless and more ergonomic controller could reduce fatigue. Alternatively, as suggested by a blind participant, direct interaction with the palm or other areas of the hand, without the need to hold the device, might offer greater comfort and ease of use, especially for users with limited endurance.
Finally, the preference of blind users for the tactile paths was primarily based on the absence of obstacles and less on their texture, highlighting the need to redesign the paths to better meet their real needs and preferences. This may mean that an emphasis should be placed more on clarity and ease of navigation rather than on the complexity of tactile signals. Future studies should focus on investigating the long-term impact of the platform on mobility in real environments, as well as on further ergonomic optimization of the tactile devices. Expanding the user sample and developing personalized training approaches will be critical for the universal adoption of the platform as a tool for education and accessibility.

6. Conclusions

Research on tactile paving patterns in virtual reality remains in its nascent stages. Nevertheless, future findings and conclusions derived from such investigations are expected to contribute meaningfully to sustainable development and the enhancement of everyday life, particularly for individuals who are blind or experience significant visual impairments. The present study examined the ability to recognize and distinguish between various tactile paving patterns. A series of experiments involving interactions with virtual three-dimensional models revealed key perceptual characteristics among users. Participants engaged with five distinct paving designs, simulating navigational scenarios within urban environments. The perceptual evaluations indicated that substantial potential exists for improving the design and tactile properties of the virtual haptic tiles, intending to maximize both functionality and user safety.

Author Contributions

Conceptualization, N.T., I.K. and G.K.; methodology, N.T.; software, I.K.; validation, N.T., I.K. and G.K.; formal analysis, N.T.; investigation, N.T.; data curation, G.V. and N.T.; writing—original draft preparation, N.T.; writing—review and editing, G.K. and S.K.; visualization, I.K.; supervision, G.K.; project administration, G.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

This study was conducted in accordance with the Declaration of Helsinki and approved by the Ethics Committee of IHU (protocol code 91\25-2-2025, approval date 25 February 2025).

Informed Consent Statement

Informed consent was obtained from all subjects involved in this study. Written informed consent has been obtained from the patient(s) to publish this paper.

Data Availability Statement

Data are provided on request.

Acknowledgments

The authors would like to express their sincere gratitude to all the individuals who participated in this study, especially the members of the “Association of the Blind of Western Macedonia, Greece” and the “Panhellenic Association of the Blind of Central Macedonia, Greece”.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Appendix A.1. USE Questionnaire

All USE items were rated on a five-point Likert scale from 1 (Strongly Disagree) to 5 (Strongly Agree).
Usefulness
  • Does it help me to be more effective?
  • Is it useful?
  • Does it give me more control over the activities in my life?
Ease of Use
  • Is it easy to use?
  • Is it user-friendly?
  • Can I recover from mistakes quickly and easily?
Ease of Learning
  • Did I learn to use it quickly?
  • Can I easily remember how to use it?
  • Is it easy to learn to use it?
Satisfaction
  • Do I feel I need to have it?
  • Would I recommend it to a friend?
  • Is it fun to use?

Appendix A.2. UEQ (Short Version)

UEQ Option A | Likert Scale (1 to 7) | UEQ Option B
Obstructive | ◦ 1 ◦ 2 ◦ 3 ◦ 4 ◦ 5 ◦ 6 ◦ 7 | Supportive
Complicated | ◦ 1 ◦ 2 ◦ 3 ◦ 4 ◦ 5 ◦ 6 ◦ 7 | Easy
Inefficient | ◦ 1 ◦ 2 ◦ 3 ◦ 4 ◦ 5 ◦ 6 ◦ 7 | Efficient
Clear | ◦ 1 ◦ 2 ◦ 3 ◦ 4 ◦ 5 ◦ 6 ◦ 7 | Confusing
Boring | ◦ 1 ◦ 2 ◦ 3 ◦ 4 ◦ 5 ◦ 6 ◦ 7 | Exciting
Not interesting | ◦ 1 ◦ 2 ◦ 3 ◦ 4 ◦ 5 ◦ 6 ◦ 7 | Interesting
Conventional | ◦ 1 ◦ 2 ◦ 3 ◦ 4 ◦ 5 ◦ 6 ◦ 7 | Inventive
Usual | ◦ 1 ◦ 2 ◦ 3 ◦ 4 ◦ 5 ◦ 6 ◦ 7 | Leading edge

References

  1. World Health Organization. Available online: https://www.who.int/ (accessed on 30 April 2025).
  2. Ricci, F.S.; Boldini, A.; Beheshti, M.; Rizzo, J.R.; Porfiri, M. A virtual reality platform to simulate orientation and mobility training for the visually impaired. Virtual Real. 2022, 27, 797–814. [Google Scholar] [CrossRef]
  3. Bonsaksen, T.; Brunes, A.; Heir, T. Quality of life in people with visual impairment compared with the general population. J. Public Health 2025, 33, 23–31. [Google Scholar] [CrossRef]
  4. Kim, H.-m.; Son, S.-m. Impacts of Daily Life and Job Satisfaction on Social Participation of Persons with Visual Impairment; Wiley: Hoboken, NJ, USA, 2023. [Google Scholar] [CrossRef]
  5. Liu, H.; Darabi, H.; Banerjee, P.; Liu, J. Survey of Wireless Indoor Positioning Techniques and Systems. IEEE Trans. Syst. Man Cybern. C 2007, 37, 1067–1080. [Google Scholar] [CrossRef]
  6. Rosa, M.P.; De Mello, G.S.; Morato, S. Tactile Paving Surfaces at Bus Stops: The Need for Homogeneous Technical Solutions for Accessible Tourism. J. Access. Des. All 2021, 11, 259–294. [Google Scholar] [CrossRef]
  7. Yeo, M.S.K.; Pey, J.J.J.; Elara, M.R. Passive Auto-Tactile Heuristic (PATH) Tiles: Novel Robot-Inclusive Tactile Paving Hazard Alert System. Buildings 2023, 13, 2504. [Google Scholar] [CrossRef]
  8. Muhammad, J.; Aftab, M.J.; Bano, S.; Iram, U. Challenges Encountered by Students with Visual Impairment in Accessing Orientation and Mobility Training. Ann. Hum. Soc. Sci. 2024, 5, 514–523. [Google Scholar] [CrossRef]
  9. Emerson, R.W.; McCarthy, T. Orientation and Mobility for Students with Visual Impairments: Priorities for Research. Int. Rev. Res. Ment. Retard. 2014, 46, 253–280. [Google Scholar] [CrossRef]
  10. Lo Valvo, A.; Croce, D.; Garlisi, D.; Giuliano, F.; Giarré, L.; Tinnirello, I. A Navigation and Augmented Reality System for Visually Impaired People. Sensors 2021, 21, 3061. [Google Scholar] [CrossRef]
  11. Navarro-Guerrero, N.; Toprak, S.; Josifovski, J.; Jamone, L. Visuo-haptic object perception for robots: An overview. Auton. Robots 2023, 47, 377–403. [Google Scholar] [CrossRef]
  12. Hu, Z.; Lin, L.; Lin, W.; Xu, Y.; Xia, X.; Peng, Z.; Sun, Z.; Wang, Z. Machine Learning for Tactile Perception: Advancements, Challenges, and Opportunities. Adv. Intell. Syst. 2023, 5, 2200371. [Google Scholar] [CrossRef]
  13. Croce, D.; Giarre, L.; La Rosa, F.G.; Montana, E.; Tinnirello, I. Enhancing tracking performance in a smartphone-based navigation system for visually impaired people. In Proceedings of the 24th Mediterranean Conference on Control and Automation (MED), Athens, Greece, 21–24 June 2016. [Google Scholar] [CrossRef]
  14. Weiss, M.; Chamorro, S.; Girgis, R.; Luck, M.; Kahou, S.E.; Cohen, J.P.; Nowrouzezahrai, D.; Precup, D.; Golemo, F.; Pal, C. Navigation Agents for the Visually Impaired: A Sidewalk Simulator and Experiments. In Proceedings of the Conference on Robot Learning (CoRL), Osaka, Japan, 30 October–1 November 2019; pp. 1314–1327. [Google Scholar] [CrossRef]
  15. Wang, M.; Dommes, A.; Renaudin, V.; Zhu, N. Analysis of Spatial Landmarks for Seamless Urban Navigation of Visually Impaired People. IEEE J. Indoor Seamless Position. Navig. 2023, 1, 93–103. [Google Scholar] [CrossRef]
  16. Tzimos, N.; Voutsakelis, G.; Kontogiannis, S.; Kokkonis, G. Evaluation of Haptic Textures for Tangible Interfaces for the Tactile Internet. Electronics 2024, 13, 3775. [Google Scholar] [CrossRef]
  17. Todd, C.; Mallya, S.; Majeed, S.; Rojas, J.; Naylor, K. Haptic-Audio Simulator for Visually Impaired Indoor Exploration. J. Assist. Technol. 2015, 9, 71–85. [Google Scholar] [CrossRef]
  18. Todd, C.; Mallya, S.; Majeed, S.; Rojas, J.; Naylor, K. VirtuNav: A Virtual Reality Indoor Navigation Simulator with Haptic and Audio Feedback for the Visually Impaired. In Proceedings of the IEEE Symposium on Computational Intelligence in Robotic Rehabilitation and Assistive Technologies (CIR2AT), Orlando, FL, USA, 9–12 December 2014; pp. 1–8. [Google Scholar] [CrossRef]
  19. Costa, S.M.; Damaceno, R.J.P.; Morimitsu, H.; Cesar, R.M., Jr. Tactile Path Guidance via Weakly Supervised Visual Attention. In Proceedings of the 4th Annual Workshop on the Future of Urban Accessibility (Urban Access), Online, 24 October 2024. [Google Scholar]
  20. Xu, S.; Yang, C.; Ge, W.; Yu, C.; Shi, Y. Virtual Paving: Rendering a Smooth Path for People with Visual Impairment through Vibrotactile and Audio Feedback. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2020, 4, 1–25. [Google Scholar] [CrossRef]
  21. Lahav, O.; Schloerb, D.; Kumar, S.; Srinivasan, M. Virtual Environment for People Who Are Blind—A Usability Study. J. Assist. Technol. 2012, 6, 38–52. [Google Scholar] [CrossRef] [PubMed]
  22. Lahav, O.; Schloerb, D.; Kumar, S.; Srinivasan, M. A virtual map to support people who are blind to navigate through real spaces. J. Spec. Educ. 2011, 26, 41–56. [Google Scholar] [CrossRef]
  23. Kreimeier, J.; Götzelmann, T. Real World VR Proxies to Support Blind People in Mobility Training. In Proceedings of the Mensch und Computer 2018, Dresden, Germany, 2–5 September 2018. [Google Scholar] [CrossRef]
  24. Zhao, Y.; Bennett, C.L.; Benko, H.; Cutrell, E.; Holz, C.; Morris, M.R.; Sinclair, M. Enabling People with Visual Impairments to Navigate Virtual Reality with a Haptic and Auditory Cane Simulation. In Proceedings of the CHI 2018—Conference on Human Factors in Computing Systems, Montréal, QC, Canada, 21–26 April 2018; pp. 1–14. [Google Scholar] [CrossRef]
  25. Greek Government (Ministry of Environment and Energy). Gazette of the Government of the Hellenic Republic; Greek Government (Ministry of Environment and Energy): Athens, Greece, 2022; Volume 7. [Google Scholar]
  26. Dickson, P.E.; Block, J.E.; Echevarria, G.N.; Keenan, K.C. An Experience-based Comparison of Unity and Unreal for a Stand-alone 3D Game Development Course. In Proceedings of the International Conference on Innovation and Technology in Computer Science Education (ITiCSE), Bologna, Italy, 3–5 July 2017. [Google Scholar] [CrossRef]
  27. Hussain, A.; Shakeel, H.; Hussain, F.; Uddin, N.; Ghouri, T.L. Unity Game Development Engine: A Technical Survey. Univ. Sindh J. Inf. Commun. Technol. 2020, 4, 73–81. [Google Scholar]
  28. Silva, A.; Domínguez Ramírez, O.A.; Vega, V.P.; Ordaz Oliver, J.P. PHANToM OMNI Haptic Device: Kinematic and Manipulability. In Proceedings of the Electronics, Robotics and Automotive Mechanics Conference (CERMA), Cuernavaca, Mexico, 22–25 September 2009; pp. 193–198. [Google Scholar] [CrossRef]
  29. Teklemariam, H.G.; Das, A.K. A Case Study of PHANToM OMNI Force Feedback Device for Virtual Product Design. Int. J. Interact. Des. Manuf. 2015, 9, 881–892. [Google Scholar] [CrossRef]
  30. San Martín, J.; Trivino, G. A Study of the Manipulability of the PHANToM OMNI Haptic Interface. In Proceedings of the Third Workshop on Virtual Reality Interactions and Physical Simulations (VRIPHYS), Madrid, Spain, 6–7 November 2006; pp. 127–128. [Google Scholar] [CrossRef]
  31. Isaksson, M.; Horan, B.; Nahavandi, S. Low-Cost 5-DOF Haptic Stylus Interaction Using Two Phantom Omni Devices. In Proceedings of the EuroHaptics (International Conference on Haptics: Neuroscience, Devices, Modeling, and Applications), Tampere, Finland, 13–15 June 2012; pp. 139–149. [Google Scholar] [CrossRef]
  32. 3D Systems Haptic Devices. Available online: https://www.3dsystems.com/haptics-devices/touch (accessed on 30 April 2025).
  33. Hu, S.; Ma, Y.; Dong, X.; Zhang, W. Evaluating the Effectiveness of Street-Crossing Tactile Paving for People with Visual Impairment Using a Structural Equation Model. In Proceedings of the 20th COTA International Conference of Transportation Professionals (CICTP 2020), Xi’an, China, 14–16 August 2020. [Google Scholar] [CrossRef]
  34. Guerra, A. Craft-Based Methodologies in Human–Computer Interaction: Exploring Interdisciplinary Design Approaches. Multimodal Technol. Interact. 2025, 9, 13. [Google Scholar] [CrossRef]
  35. 3DSystems. Available online: https://www.3dsystems.com/ (accessed on 30 April 2025).
  36. Alzalabny, S.; Moured, O.; Müller, K.; Schwarz, T.; Rapp, B. Designing a Tactile Document UI for 2D Refreshable Tactile Displays: Towards Accessible Document Layouts for Blind People. Multimodal Technol. Interact. 2024, 8, 102. [Google Scholar] [CrossRef]
  37. Dritsas, E.; Trigka, M.; Troussas, C.; Mylonas, P. Multimodal Interaction, Interfaces, and Communication: A Survey. Multimodal Technol. Interact. 2025, 9, 6. [Google Scholar] [CrossRef]
  38. Alary, F.; Duquette, M.; Goldstein, R.; Chapman, C.E.; Voss, P.; La Buissonnière-Ariza, V.; Lepore, F. Tactile acuity in the blind: A closer look reveals superiority over the sighted in some but not all cutaneous tasks. Neuropsychologia 2009, 47, 2037–2043. [Google Scholar] [CrossRef] [PubMed]
  39. Goldreich, D.; Kanics, I.M. Tactile Acuity is Enhanced in Blindness. J. Neurosci. 2003, 23, 3439–3445. [Google Scholar] [CrossRef]
  40. Heuten, W.; Henze, N.; Boll, S.; Pielot, M. Tactile Wayfinder: A Non-Visual Support System for Wayfinding. In Proceedings of the 5th Nordic Conference on Human–Computer Interaction (NordiCHI 2008), Lund, Sweden, 20–22 October 2008. [Google Scholar] [CrossRef]
  41. Lund, A.M. Measuring Usability with the USE Questionnaire. Usability Interface 2001, 8, 3–6. Available online: https://garyperlman.com/quest/quest.cgi?form=USE (accessed on 29 June 2025).
  42. Laugwitz, B.; Held, T.; Schrepp, M. Construction and Evaluation of a User Experience Questionnaire. In HCI and Usability for Education and Work, Proceedings of the 4th Symposium of the Workgroup Human-Computer Interaction and Usability Engineering of the Austrian Computer Society, USAB 2008, Graz, Austria, 20–21 November 2008; Springer: Berlin/Heidelberg, Germany, 2008; pp. 63–76. [Google Scholar] [CrossRef]
  43. Yokoyama, N.; Ha, S.; Batra, D. Success Weighted by Completion Time: A Dynamics-Aware Evaluation Criteria for Embodied Navigation. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic, 27 September–1 October 2021; pp. 1562–1569. [Google Scholar] [CrossRef]
  44. Mann, H.B.; Whitney, D.R. On a test of whether one of two random variables is stochastically larger than the other. Ann. Math. Stat. 1947, 18, 50–60. [Google Scholar] [CrossRef]
  45. MacFarland, T.W.; Yates, J.M. Introduction to Nonparametric Statistics for the Biological Sciences Using R; Springer: Cham, Switzerland, 2016; ISBN 978-3-319-30633-9. [Google Scholar] [CrossRef]
  46. Nachar, N. The Mann-Whitney U: A Test for Assessing Whether Two Independent Samples Come from the Same Distribution. Tutor. Quant. Methods Psychol. 2008, 4, 13–20. [Google Scholar] [CrossRef]
Figure 1. "Touch Haptic Device", developed by 3D Systems [32].
Figure 2. Paving tile patterns.
Figure 3. The Touch device and a computer monitor.
Figure 4. Scenario 2 with a low degree of difficulty (a) and Scenario 3 with a higher degree of difficulty (b).
Figure 5. Movement along the route of Scenario 2 (a) and movement along the route of Scenario 3 (b).
Table 1. Paving tile types.

Type | Function | Description | Place
A1 (40 × 40) | Directional (parallel to the axis of movement) | Wide and thin stripes | Guides people with visual impairments along their route
A2 (40 × 40) | Imminent turn (perpendicular to the axis of movement) | Turn warning | At points of perpendicular course change
B (40 × 40) | Warning (square grid, arranged diagonally to the movement) | Scalloped with strong scales | At the beginning and end of inclined planes, at all door levels, along all landings
C (40 × 40) | Direction change (square grid, arranged parallel to the movement) | Scalloped with denser and less pronounced scales | At points where type-A paths change direction
D (40 × 40) | Service (parallel to the axis of movement) | Narrow and dense stripes | Public transport stops, telephone booths, special tactile signs for people with visual disabilities, etc.
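For readers who wish to reproduce the tile layout programmatically, the taxonomy of Table 1 can be encoded as a small data structure. The sketch below is illustrative only; the class and field names are our own and do not reflect the platform's internal implementation.

```python
from dataclasses import dataclass
from enum import Enum, auto

class TileFunction(Enum):
    DIRECTIONAL = auto()       # guides movement along the path axis
    TURN_WARNING = auto()      # warns of an imminent perpendicular turn
    HAZARD_WARNING = auto()    # marks ramps, landings, and level changes
    DIRECTION_CHANGE = auto()  # marks points where type-A paths change direction
    SERVICE = auto()           # marks stops, phone booths, tactile signs

@dataclass
class PavingTile:
    tile_type: str                 # "A1", "A2", "B", "C", or "D"
    size_cm: tuple[int, int]       # all tiles in Table 1 are 40 x 40 cm
    function: TileFunction
    texture: str                   # surface pattern felt through the stylus

# The five tile types from Table 1, encoded for programmatic scene layout.
TILE_CATALOG = [
    PavingTile("A1", (40, 40), TileFunction.DIRECTIONAL, "wide and thin stripes"),
    PavingTile("A2", (40, 40), TileFunction.TURN_WARNING, "turn warning"),
    PavingTile("B",  (40, 40), TileFunction.HAZARD_WARNING, "scalloped, strong scales"),
    PavingTile("C",  (40, 40), TileFunction.DIRECTION_CHANGE, "scalloped, denser scales"),
    PavingTile("D",  (40, 40), TileFunction.SERVICE, "narrow and dense stripes"),
]
```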
Table 2. Tile presets.

Haptic Parameter | Tile Preset 1 | Tile Preset 2 | Tile Preset 3
Tile Stiffness | 1 | 1 | 1
Damping | 1 | 1 | 1
Static Friction | 0.25 | 0.5 | 0.75
Dynamic Friction | 0.25 | 0.5 | 0.75
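The friction presets in Table 2 follow the usual haptic-rendering convention in which static friction resists stylus motion up to a threshold and dynamic friction applies once the stylus slides. The minimal Python sketch below illustrates how these coefficients could shape the resistance felt at the stylus tip under a simple Coulomb friction model; the function name and the normal-force handling are assumptions for illustration, not the platform's actual haptic loop.

```python
def friction_force(tangential_speed: float, normal_force: float,
                   static_mu: float, dynamic_mu: float,
                   speed_eps: float = 1e-3) -> float:
    """Coulomb-style friction magnitude opposing stylus motion (newtons).

    Below speed_eps the contact is treated as sticking, so resistance is
    capped by the static coefficient; once sliding, the dynamic
    coefficient applies instead.
    """
    if abs(tangential_speed) < speed_eps:
        return static_mu * normal_force   # sticking: up to the static limit
    return dynamic_mu * normal_force      # sliding: kinetic friction

# With equal static and dynamic coefficients, as in Table 2, Preset 3
# (0.75) produces the strongest drag for the same 1 N contact force:
for name, mu in [("Preset 1", 0.25), ("Preset 2", 0.5), ("Preset 3", 0.75)]:
    print(name, friction_force(0.0, 1.0, mu, mu), "N")
```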
Table 3. Collider presets.

Parameter | Collider Preset 1 | Collider Preset 2 | Collider Preset 3
Size | Small | Medium | Large
Table 4. Age data of participants.

Age Range | Number of Participants | Age-Range Midpoint (Years)
18–25 | 3 | (18 + 25)/2 = 21.5
26–35 | 3 | (26 + 35)/2 = 30.5
36–45 | 10 | (36 + 45)/2 = 40.5
46+ | 3 | 50 (an estimate)
Table 5. Experimental data for the three scenarios.

Scenario | Average (All/Blind/Not Blind) | Standard Deviation (All/Blind/Not Blind) | Minimum (All/Blind/Not Blind) | Maximum (All/Blind/Not Blind)
AS1 | 0.32 / 0.25 / 0.34 | 0.18 / 0.03 / 0.20 | 0.08 / 0.20 / 0.08 | 0.65 / 0.29 / 0.65
AS2 | 0.47 / 0.52 / 0.44 | 0.26 / 0.14 / 0.30 | 0 / 0.29 / 0 | 1 / 0.70 / 1
AS3 | 0.63 / 0.68 / 0.30 | 0.22 / 0.31 / 0.16 | 0 / 0 / 0 | 1 / 1 / 0.56
Table 6. Results of the Mann–Whitney U test.

Scenario | U Statistic | p-Value | Effect Size (r) | Comments
AS1 | 37.0 | 0.898 | 0.04 | No statistically significant difference
AS2 | 44.0 | 0.693 | 0.10 | No statistically significant difference
AS3 | 40.0 | 0.965 | 0.02 | No statistically significant difference
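Results of the form shown in Table 6 can be reproduced from raw per-participant scores with a standard Mann–Whitney U test [44,46]. The sketch below shows the computation in Python, including the effect size r obtained from the normal approximation of U; the sample arrays are placeholders, not the study's data, and the approximation omits a tie correction.

```python
import math
from scipy.stats import mannwhitneyu

def mann_whitney_with_effect_size(group_a, group_b):
    """Two-sided Mann-Whitney U test plus effect size r = |Z| / sqrt(N)."""
    u, p = mannwhitneyu(group_a, group_b, alternative="two-sided")
    n1, n2 = len(group_a), len(group_b)
    # Normal approximation of U (no tie correction) to obtain Z, then r.
    mean_u = n1 * n2 / 2
    sd_u = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u - mean_u) / sd_u
    r = abs(z) / math.sqrt(n1 + n2)
    return u, p, r

# Placeholder scores for two groups, NOT the study's data:
u, p, r = mann_whitney_with_effect_size([0.20, 0.25, 0.29],
                                        [0.08, 0.34, 0.65, 0.40])
print(f"U = {u:.1f}, p = {p:.3f}, r = {r:.2f}")
```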
Table 7. Usability evaluation.

Usability Dimension | Percentage
Usefulness | 88.8%
Ease of Use | 85.6%
Ease of Learning | 86%
Satisfaction | 90.5%
Table 8. UEQ—User Experience Questionnaire.

UEQ Item | Percentage
Supportive | 88.7%
Easy | 84.2%
Efficient | 85.7%
Clear | 83.5%
Exciting | 86.5%
Interesting | 94%
Inventive | 91.7%
Leading edge | 92.5%
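Percentage scores such as those in Tables 7 and 8 can be obtained by rescaling mean Likert ratings onto a 0–100% range. The exact normalization used here is not stated, so the sketch below assumes a linear mapping from the scale minimum to the scale maximum; it applies equally to the 5-point USE items and the 7-point UEQ items, and the example ratings are hypothetical.

```python
def likert_to_percentage(ratings, scale_min=1, scale_max=5):
    """Linearly rescale a mean Likert rating to 0-100%.

    A mean equal to scale_min maps to 0%; a mean equal to
    scale_max maps to 100%.
    """
    mean = sum(ratings) / len(ratings)
    return 100 * (mean - scale_min) / (scale_max - scale_min)

# Hypothetical ratings, not the study's data:
print(f"{likert_to_percentage([5, 4, 5, 4, 5], 1, 5):.1f}%")  # USE, 5-point
print(f"{likert_to_percentage([6, 7, 6, 7, 6], 1, 7):.1f}%")  # UEQ, 7-point
```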
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
