Article

A Versatile Tool for Haptic Feedback Design Towards Enhancing User Experience in Virtual Reality Applications

Faculty of Technical Sciences, University of Novi Sad, 21000 Novi Sad, Serbia
*
Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(10), 5419; https://doi.org/10.3390/app15105419
Submission received: 20 March 2025 / Revised: 19 April 2025 / Accepted: 23 April 2025 / Published: 13 May 2025

Abstract
Fifteen years of extensive experience teaching VR system development have taught us that haptic feedback must be integrated into VR systems more thoughtfully, alongside the already realistic high-fidelity visual and audio feedback. The third generation of students is enhancing VR interactive experiences by incorporating haptic feedback through traditional, proven, commercially available gamepad controllers. Insights gained through this process contributed to the development of a versatile Unity custom editor tool, which is the focus of this article. The tool supports a wide range of use cases, enabling the visual, parametric, and descriptive creation of reusable haptic effects. To enhance productivity in commercial development, it supports the creation of haptic and haptic/audio stimulus libraries, which can be further expanded and combined following object-oriented principles. Additionally, the tool allows designers to define specific areas within the virtual space where these stimuli can be experienced, depending on the virtual object the avatar holds and the activities they perform. This intuitive platform supports the design of reusable haptic effects through a graphical editor, audio conversion, programmatic scripting, and AI-powered guidance. The sophistication and usability of the tool have been demonstrated through several student VR projects across various application areas.

1. Introduction

Virtual reality (VR) systems are continuously advancing, offering users increasingly realistic and immersive experiences through the stimulation of various senses. To increase the immersion, presence, and user experience, it is necessary to create a complementary multisensory experience in VR, allowing users to feel more physically engaged and deeply absorbed in the virtual world [1]. Haptic feedback, which stimulates the sense of touch, can significantly contribute to this. Numerous studies have demonstrated that the introduction of haptic feedback positively impacts both the subjective perception of presence and influence, as well as objective, task-related performance in immersive extended reality (XR) environments. Reported benefits include faster task completion times [2,3,4], improved accuracy [4], enhanced user experience [5,6], more natural interaction [7,8,9], greater perceived realism [10], increased enjoyment [10], deeper immersion [10,11,12], a stronger sense of embodiment and ownership [13], and even the perception of improved task performance, including increased perceived performance and reduced perceived difficulty [10,11].
A systematic analysis of papers published from 2010 to 2024 found that haptic feedback enhances realism, modulates emotional responses, and influences behaviors such as compliance, decision-making, and social interaction in VR, which represents one of the most sophisticated aspects of interaction [14]. Moreover, research has shown that haptic feedback can be highly beneficial in both increasing the accessibility of existing applications for people with impaired vision and hearing, and in enabling the creation of new applications specifically designed for them. For example, gamepad vibration could be used for the development of audio-haptic maps [15,16], more inclusive games [17,18,19], or specialized applications aimed at improving reality perception for individuals with sensory impairments [20]. These findings confirm the capability of haptic feedback to effectively stimulate the sense of touch.

1.1. Challenges and Opportunities for Haptic Feedback Implementation in VR

Since 1992, hundreds of haptic devices have been invented for use across various application areas [21], and since 2015 alone, dozens have been specifically developed for use in VR [22]. In parallel, haptic rendering has been established as an architectural element of VR systems since 2004 [23]. Despite this, historically and currently, VR experiences have focused primarily on visual and auditory stimulation [1,10,24], while haptic feedback remains underutilized.
The broader adoption of haptic feedback is hindered by several factors, including the complexity of the human somatosensory system [22], the high cost of specialized hardware, the technical expertise required for programming, and a general lack of industry standardization [1]. As a result, advanced haptic systems remain largely confined to specialized settings, such as research laboratories and industrial applications, many of which are still in the experimental or prototype phase [1].
In the commercial VR landscape, haptic feedback in VR is provided through various devices, such as gloves, vests, and belts, but most often through native VR controllers. The use of native controllers or hand and body tracking has often been the preferred option for movement and interaction in the virtual world for practical reasons, such as a greater sense of presence, avoiding cybersickness, fine motor control, and physical engagement.
Although traditional gamepads offer lower haptic feedback capabilities compared to these higher-end haptic devices, an increasing number of them are now equipped with actuators comparable to those found in VR controllers. At the same time, they remain significantly more affordable, making them a highly accessible platform for haptic interaction in VR. However, despite these advancements, the use of gamepad haptic feedback remains inconsistent across not only VR applications but also in video games. Many platforms fail to fully exploit the potential of this technology, leaving a significant gap in immersive and interactive design.

1.2. Gamepad as an Input and Haptic Device in VR

Recent research shows that even a classic gamepad controller can serve as a control device in VR and has its advantages; the gamepad may therefore be a viable control interface even for more complex human–system interaction tasks. Cardoso [25] found that the gamepad control method is faster and more comfortable than Leap Motion-based and gaze-directed locomotion techniques in VR worlds. The gamepad exhibited fewer tracking errors, resulted in lower workloads, and achieved a slightly higher usability score, as measured by the System Usability Scale. In an analysis of different interaction and locomotion methods used in immersive environments, including point of interest, gamepad, teleport, and room-scale, the gamepad obtained medium values across all three evaluated aspects (presence, cybersickness, and usability), standing out as the best alternative for interacting with static 3D interfaces [26]. In research on immersion and cybersickness in walking interaction for immersive VR applications, no difference in immersion was observed between the gamepad and the hand interface; only marching in place provided better immersion. Similar results were found in research evaluating virtual locomotion tasks, comparing the gamepad with finger-distance, finger-number, and finger-tapping gestures. The gamepad showed the greatest ease of learning and use, the highest average locomotion speed, the best subjective accuracy, and the least fatigue, task completion time, and collisions compared to the other methods [27].
In a study comparing 3D virtual world navigation using a multi-touch device and a gamepad, experienced users performed significantly better with the gamepad than with the multi-touch device, in contrast to casual users, who were often unfamiliar with gamepads. Overall, most users reported favoring the gamepad for 3D navigation [28]. This preference is further supported by research comparing gamepads and natural user interfaces (NUIs), where participants described gamepads as more precise, comfortable, familiar, and successful gaming controllers. Notably, gamepad sessions were much longer on average than NUI sessions, and participants frequently mentioned their prior experience with gamepad controllers. Anecdotal and empirical investigations continue to report that gamers are often more comfortable with traditional gamepad systems and perceive them as more natural [29]. Likewise, in comparisons between gamepads and naturally mapped VR controllers, several users reported a preference for gamepads due to their familiarity with the device [30].
The gamepad is the most commercially accepted, most accessible, and most widely used input device capable of providing haptic feedback, incorporating vibrotactile haptic actuators, most commonly in the grips. Given its widespread availability, users' familiarity and experience with it, and its ease of integration with various platforms, it should prove useful for providing haptic feedback in VR. This is especially important as haptic actuator technologies in gamepads are constantly evolving and are capable of providing much more sophisticated and realistic haptic effects, such as those found in the Sony DualSense controller (Sony Interactive Entertainment, Tokyo, Japan) [31] and the Nintendo Switch with HD Rumble (Nintendo Co., Ltd., Kyoto, Japan) [32].
However, implementing haptic feedback on a gamepad brings certain challenges. Vibrotactile haptic feedback on gamepads has long been used in video games, demonstrating positive impacts on player engagement and experience [33]. Still, this functionality is often underutilized, typically appearing only as simple vibration effects whose role is mainly emphasis. Most major console platforms have their own way of implementing haptic feedback and clear guidelines for its use cases, which is why platform-specific titles generally use this functionality to a greater extent than multiplatform games [34].
Beyond the extra time and budget required, the limited presence of haptic feedback may stem from insufficient knowledge, awareness, or user experience with it, as well as from the lack of tools in commercial video game studios for its rapid implementation. As part of the research in [35], the three most popular game engines currently used in the industry were analyzed, along with various assets, add-ons, and packages for them. In their original form, without plugins, all three engines lack a sufficient set of ready-made functionalities for the effective design and testing of haptic feedback. In general haptics practice, haptic feedback is mostly designed and implemented directly in code or with various visual tools for designing haptic effects [36]. However, research has demonstrated that the use of a prototyping tool with a graphical user interface (GUI) significantly simplifies the process of designing and testing haptic effects, especially in contrast to creating these effects directly through code [37].
Similar conclusions were drawn during our teaching of several university courses over the past 15 years, involving students from both programming and non-programming backgrounds in the development of VR and other interactive applications. In these courses, students develop video games, edutainment content, simulations, and various interactive experiences using popular game engines. Throughout the courses, students progress from developing entirely imaginary virtual worlds with minimal connection to reality to creating much more realistic and immersive ones. We noticed that in their projects, students most easily achieve realism and immersion through visual elements, followed by audio, while haptics caused them the most problems. A reluctance to implement haptics was noticeable: most students implemented very simple haptic feedback, such as constant vibration at a fixed intensity, while others chose not to implement it at all.
For the above reasons, there was a need to develop a dedicated haptic feedback design tool with a multimodal approach to the design of haptic effects. This paper describes one such tool, developed as a plugin for the Unity engine and used for the design of gamepad haptic feedback. The insights gained through the teaching process and through observing students as they designed and developed interactive solutions contributed to the tool's creation and feature selection. The proposed tool streamlines the design of reusable haptic effects through a graphical editor, audio conversion, programmatic scripting, and AI-assisted guidance. It facilitates the creation of haptic and haptic/audio stimulus libraries, which can be expanded and combined. Moreover, users can define specific areas within a virtual environment where these stimuli can be felt, based on various factors, settings, and scenarios.
We decided to focus on the gamepad as the initial target device for the implementation and testing of this tool because it is the most commercially accepted, accessible, simultaneously usable across multiple platforms and types of interactive experiences, and most people already have one at home. Gamepads are traditional, standardized, and have a fairly consistent control layout, so users know what to expect. They have been evolving and improving for decades, and are characterized by ergonomic design, comfort, and a control layout that users are generally satisfied with [38].
The gamepad is the easiest to connect, adjust, and test, as it is universally recognized across different platforms. It is also the most accessible and affordable, especially for laboratories and educational settings. Most students already have one at home, making it easier not only to implement but also to scale testing and experiments. Our students showed the greatest interest in working with the gamepad and gravitated toward it most strongly. We believe this is partly due to familiarity and habit, as they have the most experience with gamepads and have used them the longest.
Finally, according to the Steam platform, from 2018 to 2024, daily average controller use has tripled [39]. As reported in the market research by Straits Research, the wireless gamepad market has experienced substantial growth due to the development of the gaming industry and is expected to grow at a rate of 6.17% annually, nearly doubling in value by 2031 compared to 2022 [40]. Therefore, the gamepad will remain a reliable and effective input device, and it is predictable that its potential for providing sophisticated haptic feedback will continue to grow. Based on all the provided benefits, the gamepad stands out as a strong contender for promoting the democratization of haptic feedback.
The experimental design involved the use of the tool by master’s students from the last three generations in the multimedia and computer games department, along with conducting observations, informal interviews, and discussions. Although it is difficult to determine whether its greatest contribution lies in design and development in general or in education, feedback from users of this tool and the applications developed using it both confirm that our research is heading in the right direction.
The paper is divided into five sections. The second section reflects on Materials and Methods, including the developed tool and its key features and capabilities. Results and Discussion are presented in sections three and four, respectively. The fifth section concludes the paper.

2. Materials and Methods

Before implementing the proposed tool, a comprehensive study was conducted to analyze the three most popular game engines currently used in the industry, along with various assets, add-ons, packages, and external software tools intended for them. Based on these insights, the fundamental requirements and desired features for the tool were established, leveraging the strengths of existing solutions while addressing their limitations as much as possible. This research, along with the programmatic foundation for implementing the tool, is described in detail in [35]. In addition to the insights gained from that analysis, the tool’s design and development consistently focused on the context dependence of haptic stimuli and the interaction between sensory modalities. As a result, particular attention was given to functionalities that enable audio–video–haptic synchronization, facilitating more seamless multisensory integration and contributing to a realistic, immersive, and intuitive experience in virtual environments.
The Unity engine (version 2022.3.15f1) was selected for the implementation of this tool due to its code-centric approach and the customization possibilities available through editor scripting. These features allow the creation of various custom editors, tools, and workflows directly within the Unity Editor, thereby expanding its set of functionalities, enabling more efficient work, facilitating instant testing in the editor, and more. The tool can be easily adapted for the Godot engine due to the similarities between Unity and Godot. Its design and philosophy can also be applied to other game engines, such as Unreal Engine, or even outside of any game engine, although these adaptations would naturally require more substantial modifications than the Godot adaptation. Notably, Unity's and Godot's limited built-in feature set for gamepad vibration, especially when compared to Unreal Engine, further influenced this choice [35].
The only requirement for the use of this tool is to integrate Unity’s Input System package [41] into the project, which is already the standard for processing user input in Unity projects. The Input System provides a unified interface for different input devices, enabling seamless functionality of this tool with both console controllers and generic controllers with vibration capabilities.
Most gamepad controllers, such as those from Microsoft Xbox (Microsoft Corp., Redmond, WA, USA), use Eccentric Rotating Mass (ERM) actuators that generate vibration by spinning an off-balance weight. These motors allow only basic control over vibration intensity by adjusting the rotation speed, while the frequency and amplitude are determined by the motor's speed and the size of the weight. The weight size in the left and right motors often differs to create a wider range of sensations. In contrast, Nintendo Switch controllers use Linear Resonant Actuators (LRAs), while the Sony DualSense uses Voice Coil Actuators (VCAs), both of which offer more advanced haptic feedback with finer control over frequency and amplitude. However, this level of control is generally available only to licensed developers through the official development tools. Outside of these, vibration is still controlled in the same way as with ERM actuators, by simply setting intensity values (0.0–1.0, or 0–100%) for each motor at any given moment. Despite this, modern actuators deliver smoother and more nuanced feedback than ERMs. Additionally, the Sony DualSense controller supports an alternative method of haptic control via audio signals, a technique similar to the one used in Sony's proprietary development tools.

2.1. Key Features of the Haptic Feedback Design Tool

While game engines already support the visualization of visual and audio effects, this tool adds the ability to visualize haptic effects, further enhancing understanding, work efficiency, and synchronization between effects designed for different senses. Figure 1 provides a complete overview of the Unity Editor workspace with the haptic effects editor, including the visual representation of the scene with 3D objects, an overview of the project assets, and the visualization of haptic and audio effects.
Haptic effects are first created as HapticEffect assets in the Unity Editor through the Assets/Create submenu within the Project window (Figure 2a). These lightweight assets are saved within the project (Figure 2b) and can be easily copied into other Unity projects and reused, similar to other assets, such as 3D models and audio effects. The primary component of the tool described in this paper is a custom visual editor for creating and previewing haptic effects, i.e., HapticEffect assets (Figure 2c). Through its streamlined and intuitive graphical user interface, users of the tool can design haptic effects by adjusting various properties, such as effect duration, looping, motor vibration intensities, and changes in intensity over time and distance from the effect’s source in the virtual world. These properties are configured within a custom Inspector interface, utilizing input controls chosen to provide the greatest ease of use and efficiency for the tool user.
A particularly significant feature of the tool is the use of animation curves to represent changes in the intensity of the vibration produced by the device’s vibration motors over time. This allows highly intuitive visual representation of the effect, precise control over intensity and timing, and facilitates rapid experimentation, iterative design, and fine-tuning based on user feedback during preview sessions. Unity’s Curve Editor provides critical functionality that enhances the tool’s usability and user experience, including timeline and grid lines with snapping and zooming capabilities (Figure 3a), a variety of editing tools (Figure 3b), built-in and user-defined curve presets (Figure 3c), and clipboard functionality for copying, cutting, and pasting keys, i.e., control points or entire curves (Figure 3d). In addition to the default built-in curve presets, the curve editor is supplemented with more than 40 additional curve presets, systematically organized into several categories, such as easing functions, noise patterns, and waveforms.
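To illustrate the underlying idea, a keyframed intensity curve can be sampled at playback time roughly as follows (a minimal Python sketch for clarity; in the tool itself, Unity's AnimationCurve handles curve evaluation natively, including non-linear tangents):

```python
def evaluate_curve(keys, t):
    """Linearly interpolate a motor-intensity curve at time t.

    keys: list of (time, intensity) pairs sorted by time, with
    intensities in the 0.0-1.0 range used by gamepad motors.
    """
    if t <= keys[0][0]:
        return keys[0][1]
    if t >= keys[-1][0]:
        return keys[-1][1]
    for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            alpha = (t - t0) / (t1 - t0)
            return v0 + alpha * (v1 - v0)

# Example: a short "ramp up, then fade out" effect
effect = [(0.0, 0.0), (0.1, 1.0), (0.5, 0.0)]
evaluate_curve(effect, 0.05)  # halfway up the ramp -> 0.5
```

During playback, the curve would be sampled every frame and the resulting value sent to the corresponding motor.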
In order to offer tool users greater flexibility in creating and modifying animation curves, an additional Animation Curve Generator tool was developed (Figure 4a). While static curve presets provide convenient, pre-defined curves, the Curve Generator allows tool users to fine-tune and generate custom curves tailored to specific needs by selecting the curve type, the number of keys, and adjusting relevant parameters for the selected curve type, such as frequency, amplitude, phase, offset, etc. It supports various curve types, including linear functions, easing functions, waveforms (sine, sawtooth, square, etc.), and noise patterns (Perlin, Gaussian, Brownian, etc.), among others. This enables tool users to create highly specific haptic effects that may not be achievable with curve presets alone, giving them greater control over the nuances of the feedback. Curves generated with this tool can be copied and pasted directly into any curve field within the Unity Editor, providing seamless integration. The Animation Curve Generator tool was intentionally developed as a separate editor window, because animation curves have a wide range of use cases, such as controlling character animations, object and camera movement patterns, or even sound modulation. As such, it can function as a standalone tool, providing versatility for custom curves in various applications and improving the general engine workflow.
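The kind of parametric curve generation described above can be sketched as follows (an illustrative Python approximation; the parameter names mirror those listed for the Curve Generator, but the function itself is an assumption, not the tool's code):

```python
import math

def generate_waveform_keys(kind, num_keys, frequency=1.0, amplitude=1.0,
                           phase=0.0, offset=0.0, duration=1.0):
    """Generate (time, intensity) keyframes for a periodic curve.

    Supports a few of the waveform types mentioned in the text;
    intensities are clamped to the [0, 1] motor range.
    """
    keys = []
    for i in range(num_keys):
        t = duration * i / (num_keys - 1)
        x = 2 * math.pi * frequency * t + phase
        if kind == "sine":
            v = math.sin(x)
        elif kind == "square":
            v = 1.0 if math.sin(x) >= 0 else -1.0
        elif kind == "sawtooth":
            frac = (frequency * t + phase / (2 * math.pi)) % 1.0
            v = 2.0 * frac - 1.0
        else:
            raise ValueError(f"unsupported curve type: {kind}")
        v = offset + amplitude * v
        keys.append((t, max(0.0, min(1.0, v))))
    return keys

# A 2 Hz sine pulse centred at half intensity
keys = generate_waveform_keys("sine", 9, frequency=2.0,
                              amplitude=0.5, offset=0.5, duration=1.0)
```

Noise patterns (Perlin, Gaussian, Brownian) would follow the same pattern of sampling a generator function at each keyframe time.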
The AI Haptic Advisor provides AI-generated recommendations for designing haptic effects based on the specified use case and additional descriptions (Figure 4b). It also supports the automatic creation of new haptic effect files based on these recommendations, as well as the evaluation and automatic modification of existing effects. After a new effect is generated or an existing one is modified, the Project window automatically navigates to the location where the effect is saved, highlights it visually, and displays it in the Inspector for immediate preview and eventual further adjustments. The recommendations provided by the AI Haptic Advisor tool are comprehensive and include formal design justifications to help tool users understand the alignment between the proposed haptic feedback effect and its intended use case. This clarifies the reasoning behind the recommendation and offers educational value, guiding users in grasping the principles of effective haptic design and its application in various contexts. Technically, the AI Haptic Advisor is implemented as a Unity-integrated interface that communicates with external AI models via API calls. The prompts are specifically tailored for the haptic design use case and are dynamically generated to include contextual information, recent project data (e.g., effect types, user preferences), and relevant design guidelines. The returned output is parsed, validated, and structured into Unity-compatible haptic effect assets. All AI-generated content is additionally post-processed to ensure compatibility with the haptic engine and prevent unsupported configurations. The AI Haptic Advisor tool has demonstrated its functionality and effectiveness in its development version by utilizing the APIs of existing AI models, particularly ChatGPT (version GPT-3.5) and, more recently, DeepSeek (version R1).
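The validation and post-processing step applied to AI-generated output can be illustrated schematically (a hypothetical Python sketch; the field names `duration`, `loop`, `left_curve`, and `right_curve` are assumptions for illustration, not the tool's actual schema):

```python
def validate_ai_effect(raw):
    """Clamp and sanity-check an AI-generated haptic effect description
    before converting it into an asset, preventing unsupported
    configurations (negative times, out-of-range intensities, etc.).
    """
    # Keep the duration within a plausible range (assumed limits)
    effect = {
        "duration": min(max(float(raw.get("duration", 0.5)), 0.01), 30.0),
        "loop": bool(raw.get("loop", False)),
    }
    for motor in ("left_curve", "right_curve"):
        keys = raw.get(motor, [(0.0, 0.0)])
        # Clamp times into [0, duration], intensities into [0, 1],
        # and sort keyframes chronologically
        cleaned = sorted(
            (max(0.0, min(effect["duration"], float(t))),
             max(0.0, min(1.0, float(v))))
            for t, v in keys
        )
        effect[motor] = cleaned
    return effect
```

In the actual tool, a step of this kind would sit between parsing the model's response and writing the HapticEffect asset.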
Both the Animation Curve Generator and the AI Haptic Advisor tools are freely movable and dockable within the Unity Editor workspace, as shown in Figure 1 (top right corner), where they are docked as tabs next to the Inspector tab. This design enhances usability by allowing tool users to customize their workspace layout to suit their preferences. Alternatively, these tools can be accessed via the Tools option in the menu bar of the Unity Editor.
Expanding the possibilities for haptic effect creation and empowering synchronization with other feedback systems, the tool enables users to create haptic effects from audio files, with animation curves automatically generated to match the audio clip’s waveform. Audio clips are converted into haptic effects by transforming the audio signals from the left and right channels of the audio clip into haptic curves that are then played back through the left and right controller motors, respectively. The conversion process is based on the principles outlined in the following patent [42]. Clearly, it is still possible to subsequently refine these effects by adjusting the curves in the Curve Editor. Therefore, tool users have a great degree of freedom and creativity in approaching the process of creating realistic haptic feedback for various scenarios and events.
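The per-channel conversion principle can be sketched as a windowed-RMS envelope extraction (an illustrative Python approximation of the general idea, not the patented algorithm cited above):

```python
def audio_to_haptic_curve(samples, sample_rate, window_s=0.02):
    """Convert one audio channel into a coarse intensity envelope.

    Computes a windowed RMS of the signal and normalizes it to [0, 1],
    yielding (time, intensity) keyframes for one motor curve.
    """
    window = max(1, int(sample_rate * window_s))
    keys = []
    for start in range(0, len(samples), window):
        chunk = samples[start:start + window]
        rms = (sum(s * s for s in chunk) / len(chunk)) ** 0.5
        keys.append((start / sample_rate, rms))
    peak = max(v for _, v in keys)
    if peak > 0:
        keys = [(t, v / peak) for t, v in keys]
    return keys

# Stereo clips are processed per channel: the left channel drives the
# left motor's curve and the right channel the right motor's curve.
```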
The preview section displays animation curves of motor vibration intensities and allows for direct playback of haptic effects within the editor, without the need to run the project and navigate to specific in-game scenarios in which the effect should be played. While this functionality is available when a gamepad is connected to the development computer, it does not naturally extend to mobile devices due to their lack of a direct connection to the computer. Therefore, it is necessary to develop a dedicated playback application for mobile devices. This application would allow real-time preview of haptic effects by receiving haptic effect data from the Unity Editor, bypassing the need to build the project, deploy it on the device, and navigate to specific scenarios in runtime. The development of such an application would be straightforward, as the playback logic is already implemented within the tool, requiring only the addition of communication logic. Since the application could be installed on multiple mobile devices, it would allow for simultaneous comparative testing of haptic effects across gamepads and different mobile platforms, aiding in their adaptation for each device.
When haptic effects are generated from audio clips, a normalized audio waveform appears behind the curves and the audio clip can be previewed together with the haptic effect, facilitating precise editing and synchronization of haptic and audio feedback. During the preview, a playhead marker tracks playback progress along the effect curves, supporting precise real-time adjustments. Users of the tool can independently toggle playback of the audio clip and haptic curves during the preview for focused testing and evaluation (Figure 2c). To help tool users achieve seamless looping of the haptic effect without noticeable transitions, the Preview Options section features a looped preview playback function that continuously repeats the haptic effect until stopped. This allows tool users to adjust the start and end points of the effect, as well as fine-tune its pattern and intensity to prevent fatigue and discomfort during continuous playback. Additionally, this section displays a list of all connected devices, allowing users of the tool to select one, multiple, or all devices for preview. This functionality is particularly useful for comparative testing across different gamepads (e.g., Microsoft Xbox One and Sony DualSense), especially those with diverse actuator technologies, and for ensuring a more consistent experience across platforms. It also simplifies testing and communication within development teams. These preview options are displayed in Figure 2d.
By default, the preview section is docked at the bottom of the Inspector window (Figure 2c), which is consistent with the rest of the Unity Editor, as is the case with models, textures, materials, audio files, and all other assets that support visual and interactive previews. However, it can also be converted into a floating window and freely repositioned across the screen, along with the Curve Editor window, enabling tool users to further adapt the layout of the working area according to their needs and preferences (Figure 5). This also provides a more detailed view of the curves and data, particularly with the use of larger monitors or multiple monitor setups.
Furthermore, the tool supports programmatically generating predefined haptic effects without creating HapticEffect objects in the Editor. This functionality is achieved through a series of methods that enable the reproduction of various template vibrations, including constant, pulsating, and wave-based patterns. For each runtime reproduction of any effect, whether created as a HapticEffect asset or generated programmatically, tool users can specify the source location, adjust how the distance from the source affects the haptic effect’s intensity, and select the target gamepads.
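The template vibration patterns can be modeled as functions of playback time (hypothetical Python signatures for illustration; the tool's actual C# API may differ):

```python
import math

def constant(intensity):
    """A steady vibration at a fixed intensity."""
    return lambda t: intensity

def pulsating(intensity, pulses_per_second, duty=0.5):
    """On/off pulses: vibrate for the first `duty` fraction of each period."""
    period = 1.0 / pulses_per_second
    return lambda t: intensity if (t % period) < duty * period else 0.0

def wave(intensity, frequency):
    """Rectified sine wave, so values stay in the 0.0-1.0 motor range."""
    return lambda t: intensity * abs(math.sin(2 * math.pi * frequency * t))

# In a per-frame update loop, the sampled value pattern(t) for the
# current playback time t would be sent to the selected motors.
pattern = pulsating(0.8, pulses_per_second=2.0)
```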
When the intensity of a haptic effect varies based on the user’s avatar’s distance from the effect’s source in the virtual world (Figure 6a), a sphere gizmo is displayed around each object in the scene that acts as the source of the effect, i.e., contains the haptic effect in any of its attached components (Figure 6b). This sphere gizmo represents the perception radius where the effect can be felt, and it oscillates according to the haptic effect curve, visually simulating how the effect would feel based on its intensity fluctuations. Above the perception radius sphere gizmo, a label showing the haptic effect’s name is displayed (Figure 6c). Together, these elements facilitate the level design process, scene navigation, and debugging in the editor.
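The distance-based modulation can be sketched as follows (an illustrative Python sketch; the names and the linear default falloff are assumptions, since the tool lets users shape the falloff with curves):

```python
def attenuated_intensity(base_intensity, distance, radius,
                         falloff=lambda d: 1.0 - d):
    """Scale a haptic effect's intensity by the avatar's distance from
    the effect's source.

    `falloff` maps normalized distance (0 at the source, 1 at the
    perception radius) to a gain factor; linear falloff by default.
    """
    if distance >= radius:
        return 0.0  # outside the perception radius, nothing is felt
    normalized = distance / radius
    return base_intensity * max(0.0, min(1.0, falloff(normalized)))
```

The perception radius in this sketch corresponds to the sphere gizmo shown around each effect source in the scene view.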
In addition to assigning haptic effects to physical objects, haptic feedback can also be applied to specific interactable volumes within the virtual world. Haptic Zones are prefabricated objects that can be added to the scene, designed to trigger haptic effects upon interaction and modulate them based on scene events and configured properties. They are easily configurable and can be customized to suit a wide range of interaction types, from basic collision detection to more advanced proximity-based feedback. Tool users can freely translate, rotate, scale, and change the shape of the zone collider. Each zone can be assigned an unlimited number of haptic events, each triggering a specific haptic effect. These events are based on the zone’s collider interaction methods (e.g., entering or exiting the zone), with a defined set of objects that can trigger them. Additionally, the tool user can define whether and how the interacting object’s position, direction, and speed within the zone influence the assigned haptic effect for each event (Figure 7a).
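A haptic zone's event model, collider-based triggers restricted to a defined set of interacting objects, can be illustrated with a minimal Python sketch. The class shape, tag strings, and effect names here are hypothetical, not the tool's actual components:

```python
class HapticZone:
    """Sketch of a haptic zone: events bound to collider interactions
    ('enter'/'exit'), each restricted to a set of triggering object tags."""

    def __init__(self, name):
        self.name = name
        self.events = []  # list of (trigger, allowed_tags, effect_name)

    def add_event(self, trigger, allowed_tags, effect_name):
        self.events.append((trigger, set(allowed_tags), effect_name))

    def on_trigger(self, trigger, obj_tag):
        """Return names of the haptic effects to play for this interaction."""
        return [effect for t, tags, effect in self.events
                if t == trigger and obj_tag in tags]
```

An object entering the zone would fire only the events whose trigger type and allowed-object set match, so the same zone can react differently to different held objects.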
This enables the delivery of different stimuli to the user depending on their role, the virtual object their avatar holds, and the activities they perform in the virtual world. For example, in a cannon operating simulation, the user will experience different haptic effects when interacting with the haptic zone at the muzzle opening of the cannon, depending on whether they are loading a cannonball or gunpowder into the barrel, or ramming them into the barrel with a rammer. On the other hand, a distinct haptic effect would be triggered when the cannon is fired and the cannonball exits the haptic zone. A haptic zone could also be created at the fuse, providing subtle haptic sensations to the user during the spark and ignition of the fuse (Figure 7b).
Haptic zones can be positioned within each other, allowing for the creation of areas with combined or different vibration intensities and patterns. Figure 8 shows an example of a motorcycle riding simulation, where the user’s avatar rides over different types of terrain surfaces. Specifically, riding over finer, tightly packed cobblestones will result in more continuous and subtle vibrations, while riding over larger stones will lead to uneven, sharper vibrations as the tires hit the large stones and fall into the gaps between them. When only one wheel of a motorcycle is within the haptic zone, a different haptic stimulus will be perceived compared to when the entire motorcycle, with both wheels, is within the zone. Also, haptic feedback could depend on different properties of the object interacting with the haptic zone, so in the case of wider and/or larger wheels, the vibration caused by uneven ground would be less noticeable.
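When zones are nested, the intensities of all zones containing the interacting object must be resolved into a single motor value. Two plausible policies, a strongest-wins maximum and a clamped sum, are sketched below in Python (the policy names and the clamping to [0, 1] are illustrative assumptions):

```python
def combine_zone_intensities(active_intensities, mode="max"):
    """Resolve overlapping (nested) zone intensities into one motor value.
    'max' lets the strongest zone dominate; 'sum' stacks zones, clamped to 1."""
    if not active_intensities:
        return 0.0
    if mode == "max":
        return max(active_intensities)
    return min(1.0, sum(active_intensities))
```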
In the Unity Editor’s scene view, haptic zones are represented by a zone collider gizmo, accompanied by a label that displays the zone’s name (Figure 7b and Figure 8). The color and opacity of the gizmo, as well as the label’s position and distance from the collider, can be adjusted through the Inspector for each haptic zone individually (Figure 7a). This visual representation enhances scene navigation, allowing developers to quickly locate and modify haptic zones. In addition to its adaptability, the use of haptic zones also promotes cleaner, more maintainable code, reduces duplication, and increases work efficiency through flexibility and reusability.
To further increase workflow efficiency and accelerate implementation, the tool includes a library of over 20 preset haptic effects designed to fit the most common usage scenarios (Figure 2b). Tool users can utilize these presets as they are, adjust them, or create new effects based on them. This library is fully customizable to meet the needs of the tool user and each specific project. It is possible to add new effects, modify existing ones, delete those no longer needed, and organize the library using folders. For example, in the development of driving simulations, base preset effects can be created for vibrations such as starting a car, driving over different surfaces, or encountering road bumps. These presets can then be adapted for individual vehicle models or various surface types by adjusting property values programmatically or through a visual editor.
The user interface of the tool is divided into clearly labeled sections with descriptive titles and property names. Additional explanations and instructions are provided in help boxes (Figure 9a) and tooltips (Figure 9b). For all actions, clear feedback messages are displayed in the console at the bottom of the editor, including detailed error messages that indicate any issues or failures, as well as informational messages to keep the tool user informed about the system’s status (Figure 9c). During development, particular emphasis was placed on robustness and the prevention of various errors, especially those related to controller connection, vibration, and overall correct behavior. All actions within the editor can be reversed using undo and redo functionality.
The reproduction unit of the tool exposes properties that can be integrated into application settings, allowing users to disable haptic feedback entirely or fine-tune its intensity both globally and per motor. This is important, as some users find vibration distracting, irritating, or unpleasant [43], and excessive exposure has been linked to Hand–Arm Vibration Syndrome (HAVS) in some cases [44].

2.2. Participants

The research was conducted informally as part of laboratory exercises in the master’s degree program in multimedia and computer games. Over the course of three generations, concluding with the winter semester of 2024, 17 students (10 male, 7 female), aged 23–25, participated in the study.
The majority of students had a background in computer science, but there were also some students from the department of animation in engineering, who generally have a stronger focus on artistic skills throughout their studies. All students reported playing video games at least once a week and were familiar with gamepad controllers. None of them reported previous experience with implementing haptic feedback.

2.3. Procedure

Students enrolled in a course related to the development of VR interactive experiences, where they were introduced to a range of devices, including gamepad controllers, mobile phones with VR headsets, Nintendo Wii controllers, Microsoft Xbox Kinect sensors, and Meta Quest 2 VR headsets (Meta Platforms, Inc., Menlo Park, CA, USA). It is important to note that at the beginning of the semester, as part of the introductory lectures, students had the opportunity and were encouraged to try out and test features of different existing interactive experiences on each of these devices, including video games, simulations, edutainment content, and other types of applications. Throughout the course, students gained hands-on experience by working on assignments and projects with all the mentioned devices, including implementing haptic feedback, and their work was graded accordingly.
As part of the final project for the course, students were tasked with creating interactive applications that, among other requirements, needed to contain a realistic and immersive virtual environment providing adequate stimuli for all senses, including high-quality visuals, immersive audio and haptic support. For the implementation of haptic feedback, they had the option to choose any of the previously mentioned devices that were capable of providing it. However, the majority of students gravitated towards working with gamepad controllers, primarily due to familiarity and habit, as they had the most experience with gamepads and had used them the longest. Students who chose to implement their projects using a gamepad as a haptic feedback device were included in this research.
Students worked on their projects in the laboratory during defined time sessions, and throughout the process, they were encouraged to iteratively test the solutions they developed by having other colleagues test them. Projects were developed on computers with the Unity Editor installed, and build versions were deployed to the selected target platform, including computers, mobile phones, and VR headsets. Standard gamepad controllers with haptic actuators, which could be connected to both a computer and the selected target platform, were used.

2.4. Data Collection

Qualitative data were collected both at the beginning and at the end of the semester, in the form of informal interviews with open-ended discussions and observations during the project work. Although no formal quantitative testing was conducted, this approach was selected for its ability to provide better contextual understanding, greater engagement and focus, and reduced anxiety for the respondents. It also allowed for iterative research and the opportunity to ask additional questions. Research has shown that such conversations foster easier communication and often yield more naturalistic data [45]. The discussions took place during lessons, practical sessions in which participants worked on the project, and the defenses of task checkpoints and the final project. In these sessions, students articulated their thoughts, comments, and reflections, which were observed and noted. To give the discussions basic structure and achieve greater productivity, a list of guiding questions was prepared in advance and used internally to steer the qualitative discussions.
At the beginning of the semester, an initial informal open-ended discussion was conducted to gauge students’ preliminary knowledge, experience, and expectations of haptic feedback, including their confidence in implementing it in projects. This pre-survey provided a baseline understanding of their initial views on the role of haptics in immersive experiences and its potential integration into their work. The questions in the beginning-of-semester survey were:
  1. How familiar are you with haptic feedback in interactive experiences?
  2. How important do you believe haptic feedback is compared to visual and audio feedback in interactive experiences?
  3. How confident do you feel about implementing haptic feedback in your projects?
At the end of the semester, after completing their projects, students took part in a more detailed survey, reflecting on their experience with haptic feedback and their use of the haptic feedback design tool. The list of questions was divided into two categories: general questions about the use of haptic feedback and questions specific to the proposed haptic feedback design tool. Some of the questions were adapted from the Witmer and Singer Presence Questionnaire for Virtual Environments [46] and modified to better suit this context. The questions were as follows:
  • General questions about the use of haptic feedback:
    1. How familiar are you with the use of haptic feedback in interactive experiences?
    2. How comfortable do you find the haptic feedback?
    3. How significant is the difference in user experience and immersion with and without haptic feedback?
    4. How much did your experiences in the virtual environment seem natural and consistent with your real-world experiences?
    5. How much does haptic feedback enhance your connection to characters, objects, and actions within the virtual world?
    6. How effective do you find haptic feedback compared to visual and audio feedback in interactive experiences?
    7. How satisfied are you with the use of haptic feedback in existing games, simulations, VR apps, and other interactive experiences you’ve tried?
    8. How likely are you to use haptic feedback in future projects?
    9. How challenging do you find implementing haptic feedback in interactive experiences, compared to other feedback mechanisms (e.g., visual or audio)?
  • Questions specific to the proposed haptic feedback design tool:
    10. How much time did it take you to learn how to use the haptic feedback design tool effectively?
    11. How easy and intuitive did you find the tool to use?
    12. How satisfied are you with your overall experience using the haptic feedback design tool, including its features, functionality, interface, and responsiveness?
    13. How closely does the haptic feedback perceived on the device align with your expectations and design in the tool?
    14. How much did the haptic feedback design tool help you in implementing your ideas during the project?
    15. How likely would you use the haptic feedback design tool in future projects?
    16. How confident do you feel in implementing haptic feedback in future projects after using this tool?
The discussion procedure during the lab sessions and project defenses only allowed for an informal interview format to address these questions. Throughout these discussions, the initial guiding idea that haptic stimulation must be included as an enrichment of the visual and audio experience was consistently kept in mind. Therefore, it was ensured that Question 2 from the beginning-of-semester survey, as well as Question 6 from the end-of-semester survey, were not excluded, since they directly assess awareness of the importance of haptic feedback and its development over the semester. The same applies to Question 12 from the end-of-semester survey, which reflects the general user experience with the proposed tool. Lecturers listened to students’ responses during the discussions and ranked them using a 5-point Likert scale, where 1 represented the lowest or most negative response and 5 represented the highest or most positive response.
During the project work, the overall user experience with the proposed gamepad haptic feedback design tool was monitored, including factors such as ease of use, speed in mastering the tool’s functionality, independence in work, workflow, versatility, and the use of various tool features. Particular attention was given to how users assessed the alignment between the intended haptic effect design and the perceived effect on the target device. Finally, any additional comments or observations by teaching staff were also noted.

2.5. Data Analysis

The qualitative data collected during informal discussions were subjected to thematic analysis. Word frequency analysis was performed to identify recurring key words, phrases, patterns, and themes regarding the effectiveness of haptic feedback and students’ experiences with the tool. Comparisons between the pre- and post-surveys were made to assess changes in students’ perceptions and confidence in using haptic feedback. All responses and observations were summarized and are presented in the Results and Discussion sections of the paper.

3. Results

At the beginning of the semester, students generally had a low level of familiarity and experience with haptic feedback. Those who were familiar with the term typically associated it with the vibration of a mobile phone or gamepad, while a few recognized it due to Sony’s DualSense gamepad controller. Most students did not consider haptic feedback to be crucial for the experience and immersion of interactive experiences. However, those who had experienced the haptic feedback of the Sony DualSense controller were impressed by the synchronization and sophistication of the vibrations.
A small group of students with experience in PC gaming expressed frustration with controller vibrations, often reporting that they would turn them off if possible. Additionally, some students voiced concerns about the quality and consistency of controller vibrations in video games, complaining about the inconsistent experience within a single game and across different games, as well as crude or basic effects that could be unpleasant or distracting to players.
Since the students had no prior experience implementing haptic feedback, their confidence in doing so in their own projects was low. While they generally understood that haptic feedback was likely implemented by programmatically adjusting vibration intensity value over time, they lacked a clear understanding of how to implement more complex haptic effects.

3.1. Student Practical Experiences with the Haptic Feedback Design Tool

During practical work with the proposed tool on assignments and projects, students first approached the process of designing haptic feedback by verbally describing their ideas in terms of intensity, rhythm, and sensation, often using metaphorical or associative comparisons (e.g., “like a cat purring”, “like a heartbeat”). They also used specific examples such as “turn signals”, “parking sensors”, or “pressing a key on a virtual keyboard on a mobile phone”. Most students mimicked the desired effects vocally (e.g., “vrrr”, “bzz-bzz”) or by tapping on a solid surface or their body to communicate the type of vibration they wanted to create. One student even recorded the sound of a pen tapping on a table using a headphone microphone, then used a specialized feature of the tool to generate a haptic effect from that audio file. After that, he fine-tuned it using curves to achieve the desired sensation. In general, students often referred to existing audio or visual feedback as a guide for designing synchronized haptic feedback. Typically, curve presets and existing haptic effects were used as a starting point for creating custom haptic effects, while for certain, simpler stimuli, some presets could be used as is.
The tool proved intuitive: all students quickly became proficient with it and had no problems with navigation and use. Interestingly, some started by creating more complex effects, experimenting with multiple functionalities of the tool, which meant their first effect took longer to finish. Others began with simpler effects, gradually progressing to more complex ones. Students worked independently, relying primarily on the tool’s experimental design capabilities for haptic feedback creation. However, when they needed additional ideas, they first consulted the integrated AI Haptic Advisor before searching the internet or using external AI chats. As the primary reasons for this, they cited its easy accessibility, convenient location within the editor, and good integration with the workflow of the tool’s other functionalities. On a couple of occasions, the AI advisor took longer than ten seconds to return a response, which several students pointed out. The AI advisor also did not return identical answers to repeated queries, but the variation and randomness were minimal, which is expected given how generative AI models work.
Students preferred designing haptic effects through the tool’s graphical interface rather than by writing code. Some of the students with programming backgrounds tended to modulate effects programmatically during runtime based on different variables and events to achieve greater variations. In contrast, those with more experience in visual programming showed a preference for duplicating and modifying effects using the graphical interface. During the iterative peer testing process, students frequently provided constructive feedback and creative ideas to one another, which fostered additional engagement and collaboration in the laboratory.

3.2. Outcomes and Examples of Student Projects

All students successfully completed course projects which incorporated haptic feedback to different degrees. These projects covered a range of interactive experiences, including video games, simulations, and applications across fields like entertainment and education. One student developed a video game featuring haptic feedback for realistic cannon handling, in which the entire operating procedure was presented in detailed and historically accurate form (Figure 6c and Figure 7). The player is provided with distinct haptic feedback depending on which object the avatar is holding and which action it is currently performing. For example, when loading a cannon, the player experiences different haptic effects depending on the current task. Loading gunpowder or cannonballs into the barrel provides distinct feedback, as does using a rammer tool to push them down. When setting and lighting the fuse, subtle feedback is provided, while firing the cannon triggers deep, rumbling feedback from the gamepad’s low-frequency motor to reflect the power of the shot. The high-frequency motor of the gamepad generates a milder vibration to represent the cannon’s structure shaking. During aiming and moving the cannon, subtle vibrations simulate the motion and inertial drift of the cannon. The gamepad motor used for the vibration depends on the direction in which the cannon is moving. To ensure a consistent experience, during implementation the student had to apply a different intensity modifier for each motor, as the left and right actuators have different weights, resulting in a different haptic experience at the same motor intensity. If the cannon overheats due to rapid firing, pulsating vibrations provide feedback, alerting the player to stop firing to prevent damage to the cannon. As the cannon cools, the intensity of the pulsating vibration gradually weakens, signaling when it will be safe to resume firing.
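The per-motor intensity modifier mentioned above can be sketched as a simple calibration step. The scale factors below are illustrative placeholders, not measured values for any specific gamepad; the direction and magnitude of the correction would be tuned per device:

```python
# Illustrative (not measured) per-motor calibration: the two ERM actuators
# have different masses, so the same input intensity feels different.
LOW_FREQ_SCALE = 1.0   # low-frequency (left) motor used as the reference
HIGH_FREQ_SCALE = 0.8  # assumed correction for the other motor

def calibrated_motor_speeds(low, high):
    """Apply per-motor modifiers so both actuators feel comparable,
    clamping each result to the valid [0, 1] intensity range."""
    return (min(1.0, low * LOW_FREQ_SCALE), min(1.0, high * HIGH_FREQ_SCALE))
```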
Another student created a motorcycle driving simulator with varying haptic feedback based on the terrain and vehicle properties, such as wheel size and width (Figure 8). Although he successfully used haptic feedback to simulate the vibrations experienced on a motorcycle when riding over various surfaces, he was unable to create distinct, satisfactory feedback for avatar falls on different surface types. As a result, the same effect was applied regardless of the surface the avatar fell on. Interestingly, the haptic effect used for fall feedback was created by converting a combination of audio recordings he captured using the Foley technique, including the sound of a bag of flour being thrown onto the floor and the rolling of an almost completely deflated ball.
It is important to note that the tool’s usability extended beyond the disciplines of software development education. Students recognized its potential for professions where vibration plays a beneficial role, such as mechanical engineering, construction, and medicine. These ideas and applications were inspired by their personal interests, as well as activities and challenges related to the professions of their friends, parents, and others in their circle.
For example, in the field of mechanical engineering, students developed projects where gamepad vibrations were used to represent vibrations caused by engine imbalances, simulate engine behavior in different operating modes, and simulate mechanical shocks. In the field of construction, one student created a simulation of a construction site where various tools and work machines could be operated. The haptic sensations varied depending on whether the avatar was using the tools and machines with or without protective equipment. Figure 10 shows a simulation of a jackhammer being used to break concrete without specialized protective gloves. This project also has applications in occupational safety and health, as it can be used for monitoring the vibration exposure of workers operating different types of machines in various environments, in order to assess the risks of vibration-related conditions such as Hand–Arm Vibration Syndrome (HAVS) [47] and identify preventive measures.
One student developed an augmented reality surgical simulation, where users control a virtual scalpel using a gamepad to perform dissection tasks and learn about human anatomy. In this example, the resistance of different tissues during dissection and surgical interventions is represented through different haptic stimuli, providing feedback on how deeply the scalpel has cut (Figure 11). The scalpel is controlled along the horizontal axis using the left stick of the gamepad, while the height control, i.e., cutting depth, is controlled via the right stick. Cutting pressure is regulated by the right trigger, bumper buttons are used for switching between instruments (scalpel, scissors, forceps, etc.), and camera adjustments are performed using the D-pad. As the scalpel traverses different tissue layers, haptic feedback dynamically adjusts based on the scalpel’s position and movement within the tissue and applied pressure. The haptic feedback generated during tissue interaction is also influenced by the density and texture of the tissue. Denser tissues, such as muscle and bone, produce a stronger and sharper haptic feedback, whereas fatty tissues provide a weaker, softer, and “squishier” feedback. Transitions between haptic feedback of different levels are smooth, allowing users to feel the difference in texture and resistance. When retracting the scalpel from the tissue, haptic feedback decreases with an inverse curve reflecting the reduction in resistance as the scalpel exits the tissue. This transition is smoothly blended with the feedback of the overlaying tissue layer and a slight drag resistance effect is applied. When the scalpel is positioned within multiple tissue types simultaneously, the resulting haptic feedback is determined by its overall position, movement, tissue properties, and the specific action being performed. Prototype curves for haptic effects based on the density and texture of the tissue are shown in Figure 11b.
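The mapping from tissue properties to feedback strength, and the smooth blending between layers, can be sketched as follows. This is a hedged Python illustration: the linear density-times-pressure model and lerp-based blending are simplifying assumptions, not the student's actual implementation:

```python
def tissue_intensity(density, pressure):
    """Denser tissue under higher pressure yields stronger feedback,
    clamped to the valid [0, 1] motor range."""
    return min(1.0, density * pressure)

def blend(current, target, t):
    """Linear interpolation used to smooth the transition between the
    feedback of adjacent tissue layers (t in [0, 1])."""
    return current + (target - current) * t
```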
Such simulations foster education and training, allowing users to experience potentially dangerous environments and situations without physical risk. They also overcome the challenge of finding appropriate equipment and examples for demonstrations, giving students the opportunity to feel haptic stimuli and gain information and experiences that are valuable for learning. During the research, it was observed that the gamepad provided a particularly immersive experience when the virtual object resembled the shape of the gamepad, and the positioning of the user’s hands during the activity aligned with the position of their hands holding the gamepad. This was evident in scenarios like the motorcycle simulation, where users grip the handlebars, or the jackhammer simulation, where users hold the handles of the tool.

3.3. Student Feedback and Reflections

At the end of the semester, students’ familiarity with the use and importance of haptic feedback had significantly increased. Much of this was attributed to the introductory lectures, where they had the opportunity to experience carefully selected interactive applications on different platforms. Most students considered haptic feedback an essential element in their projects, though one student found it distracting in certain scenarios. Students generally acknowledged that most of them had considered haptic feedback unnecessary at the beginning of the semester, but attributed this to their much more limited experience with it at that time. This aligns with the assertion that the majority of people may not have enough awareness of the benefits of haptic information, and thus may not fully appreciate the contributions it can make to the VR experience [48].
Most students reported feeling comfortable using haptic feedback, although some expressed a preference for reducing its intensity in certain situations. The general consensus was that, while visual and audio feedback remained more important, well-synchronized haptic feedback enhanced the overall experience. Based on their feedback and the overall findings, it is evident that high-quality haptic effects that are realistic and aligned to events and other feedback systems enhance the transmission of the atmosphere and increase the user’s sense of presence in the virtual world. Conversely, haptic effects that are not synchronized with events, or are inadequate in any sense, can have the opposite effect. They may negatively affect the experience, reduce user immersion, or even cause frustration.
As shown in Figure 12, data from informal interviews conducted at the beginning and end of the semester indicate that visual elements were consistently described as the most important, with only a slight increase in perceived importance (approximately 1%). Audio feedback saw a more notable rise of 9.7%, while haptic feedback showed the most significant increase, with a 21.8% change in perceived importance.
Figure 13 presents the overall user experience ratings of the haptic feedback design tool, based on informal interviews and discussions with seventeen students across three generations, consisting of five, six, and six students, respectively. The first generation showed particularly positive reactions, with all students providing extremely positive feedback about the tool. It is likely that the high ratings from the first generation reflect not only their satisfaction but also their enthusiasm and appreciation for having access to such a tool, rather than being a fully objective evaluation. In subsequent generations, while the ratings and comments remained positive, they appeared more realistic. This shift may be attributed to students’ higher expectations, as well as the increased complexity of their project scenarios and the more sophisticated haptic effects they aimed to implement. In the second generation, one student expressed particular dissatisfaction with the lack of functionality for nesting haptic zones, which required them to manually program the desired behavior in their project. This limitation was addressed and resolved in the version of the tool used by the next generation.
All students showed positive experiences regarding the tool, its functionality, ease of use, and usability. They stated that it was easier for them to understand the theory, principles, and importance of haptic feedback for the user experience because they could experience and feel it with examples. Many mentioned that they now pay more attention to haptic feedback in commercial games and have started noticing potential places for the integration or improvement of haptic feedback.
Two students reported that they initially felt intimidated by the number of functionalities offered by the tool, but later found it simple and logical to use. The flexibility of the tool was greatly appreciated, with comments noting that it “contains everything you need”. They were especially pleased with the ability to quickly create complex haptic effects without writing code but valued the option to programmatically create and modify effects during runtime.
Students noted that, in some cases, the stimulus experienced on gamepads did not completely align with the design in the editor and their expectations, especially for effects involving subtle intensity changes or abrupt, sharp pulses. This discrepancy is due to the limitations of eccentric rotating mass (ERM) actuators, which are predominant in most gamepad controllers. These actuators, due to their mechanical operation, require a certain amount of time to accelerate to or decelerate from a specific speed, affecting the accuracy of fine or rapid haptic effects.
Regarding the complexity of implementing haptic feedback compared to other types of feedback, the common conclusion was that it is more challenging to implement. This difficulty was primarily due to the lack of available tutorials and materials, as well as high-quality pre-made assets, which are much more commonly found for other forms of feedback. However, students appreciated the availability of a library with ready-made effects and curve presets, as well as the ability to save and transfer these effects between projects.
In summary, students agreed that, with their newfound experience, implementing haptic feedback in future projects would be much easier, and their confidence had grown. They expressed an intention to incorporate haptic feedback wherever appropriate and showed interest in using the tools they had worked with during the course in their future projects. It is worth mentioning that some students expressed interest in developing additional tools to improve the efficiency of work in game engines, as well as tools that could be connected to the haptic design tool introduced in the class.

4. Discussion

Haptic feedback can make interactions with virtual environments and objects more realistic, immersive, and interactive. This is especially important for training and skill development, such as in medical simulations, where haptic feedback can simulate the texture and resistance of tissues, as well as the forces needed to perform specific actions. In driving simulators, haptic feedback can replicate vibrations experienced when driving different types of vehicles on various surfaces, encountering road bumps, feeling tire traction, and forces during acceleration or braking.
The following section specifically addresses the significance and advantages of this tool in the education of IT professionals, with the goal of ensuring that future VR applications offer an improved user experience through better haptic feedback; the tool is, of course, equally usable in professional production environments. Based on experience using the described tool across several university courses, this section highlights key insights and recommendations for educators seeking to introduce and improve the teaching of haptic feedback in their own settings. Since the tool was designed and implemented with educational needs in mind, it offers numerous benefits for the teaching process and its participants, both instructors and students. Gamepads and mobile devices are the most accessible and widely used devices capable of providing haptic feedback, both among users and within university laboratories.
First and foremost, the visual creation of haptic effects through a streamlined graphical user interface with live preview within the editor significantly increases the efficiency of development work, teaching, and lab activities. Implementing haptic effects in code, especially more complex ones, is not intuitive and requires writing new code for each pattern. Additionally, when effects are implemented in code, the project must be run and a specific runtime scenario reached before the effects can be tested. Figure 14a shows the code implementation of a simple haptic effect with a linear intensity increase, while Figure 14b shows the same effect represented as an animation curve within the visual editor. The visual approach enables users to easily visualize, create, and test haptic effects without writing additional code for each new effect, as the playback code is generic and consistent regardless of the effect being played.
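To make the contrast concrete, a purely code-based implementation of a linear intensity ramp (in the spirit of Figure 14a) could look like the following sketch written against Unity's Input System. The class and parameter names are illustrative assumptions, not the tool's actual API; the point is that every new pattern would require similar bespoke code.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.InputSystem;

// Illustrative sketch (not the tool's code): a hand-written linear intensity
// ramp using Unity's Input System. Each new pattern needs similar bespoke
// code, which the visual editor replaces with a reusable curve asset.
public class LinearRampHaptic : MonoBehaviour
{
    public float duration = 1.0f;  // ramp length in seconds
    public float peak = 1.0f;      // target motor speed in [0, 1]

    public IEnumerator PlayRamp()
    {
        Gamepad pad = Gamepad.current;
        if (pad == null) yield break;                 // no controller connected

        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            float intensity = Mathf.Lerp(0f, peak, t / duration);
            pad.SetMotorSpeeds(intensity, intensity); // low- and high-frequency motors
            yield return null;                        // advance one frame
        }
        pad.SetMotorSpeeds(0f, 0f);                   // always reset the motors
    }
}
```

Such a script would be started with `StartCoroutine(PlayRamp())` and then re-written for every new effect shape, whereas the curve-based workflow only swaps the asset.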
A comprehensive preview system makes it possible to demonstrate different aspects of a haptic effect independently, and thanks to the robustness of its implementation and its error prevention, controllers can be disconnected from one computer and connected to another without side effects and without restarting the project or the Unity Editor. The live preview feature also facilitates communication between lecturers and students, as well as among members of a development team. It is much easier to demonstrate and explain certain concepts when people can feel and hear the vibrations rather than rely solely on theoretical oral explanations [49]. Because an effect can be played on all connected gamepad controllers simultaneously, the instructor can hand controllers out to students in the classroom and then play a demonstrational preview of the effects being designed, allowing everyone to feel them in real time while various design concepts are taught.
The design of the tool itself also encourages independent research, experimentation, and creativity, fostering greater student engagement through interaction and discussion. The teacher can assign tasks that require students to create specific effects based on descriptions, purposes, and use cases, and students can work iteratively, discuss, and express creativity. Depending on the number of controllers available in the classroom, students can work independently or split into groups and work as a team. Teamwork can be combined with individual work regardless of the number of controllers, in order to apply and test different teaching methods. Although the tool was developed in an academic environment, it is equally approachable in high school settings and below, since creating haptic effects requires only a “point and click” approach. Of course, some experience with programming and game engines is necessary to apply the created effects in projects.
Thanks to the format in which effects are saved, students can send them via email, upload them to online storage, or transfer them to a flash drive and continue working on them at home. Since the tool is universally compatible, students can test effects at home using any gamepad controller with vibration motors that connects to a computer. In the case of teamwork, effects created on one team member’s computer can easily be transferred to the computers of the other members. The saving method also allows students to send their work to professors or assistants for review, so they can receive feedback entirely remotely, which is especially important for online classes.
As a result of the tool’s multimodal approach to creating haptic effects, students can easily understand the advantages and limitations of each method, the scenarios in which they are applicable, and the most efficient techniques. For instance, while some basic vibrations can be easily created using either a graphical user interface or code, designing specific custom effects is much simpler by drawing curves within a graphical user interface than by writing code.
Since the tool integrates directly into the Unity engine, no additional software installation is needed, which is particularly convenient for computer labs that often struggle with limited free storage space. This approach avoids purchasing paid software and saves administrators time, as they do not have to invest additional effort in installation and licensing. Students can quickly add the package to the Unity engine on the spot when needed, both in the lab and at home.
Full ownership of and access to the code allow for corrections, iterative adaptations, and upgrades based on needs, teaching experiences, and comments and suggestions from students and lecturers. As no third-party software is used, there is no risk of instability, unavailability, or unexpected changes to the tool. In addition, complete access enables teachers to demonstrate and explain the architecture and implementation of the tool in order to improve programming skills and teach concepts such as scripting custom editors, tools, and plugins for Unity and other game engines. Students can therefore interlink subjects and develop greater appreciation for concepts they have learned before and their applicability to real-life problems.
Given its wide range of features for the rapid implementation of haptic feedback, the utility and usability of this tool are not confined to industry or education alone. Its main purpose is to create and reproduce haptic effects, regardless of the type of software in which it is used. Thanks to its functionalities and capabilities, the tool is highly effective in the general context of designing and developing VR applications, video games, and other interactive experiences. It allows for the quick integration of haptic feedback, increasing the level of immersion and making haptic feedback supported in a larger number of applications and available to more users.

4.1. Comparison with Existing Tools

The aforementioned advantages of the visual editor with live preview, compared to writing code, make the haptic feedback design tool more convenient than the basic vibration control functions built into Unity [50] and Godot (version 4.4.1) [51] engines. In their original form, and without the use of additional paid third-party assets, these engines lack the functionality for visualization, visual creation, and live preview of haptic effects.
Unreal Engine (version 5.5) natively offers the most comprehensive and advanced built-in system for implementing gamepad vibration among all game engines. According to its documentation [52], an arbitrary number of channels can be created when designing a haptic effect asset, with each channel capable of playing a different effect defined by a curve and assigned to specific vibration motors. These effects can be played on a specific device either once or in a loop, stopped, and dynamically added to any actor during live gameplay. Per-device overrides are supported, allowing different feedback configurations for each platform. Additionally, effects can be reproduced from a specific source within the game world, with their intensity varying based on the distance between the source and the player. However, it is not possible to freely define the distance-to-intensity curve for these effects. Unreal Engine includes an experimental feature for audio-based vibration, which is currently only supported on the Sony PS5 DualSense controller.
The current version of the tool proposed in this paper does not support the creation of haptic feedback channels in the same way as Unreal Engine, but it does allow for the simultaneous playback of multiple haptic effects, automatically layering and blending them based on their properties. Furthermore, while it does not include a per-device override system identical to that of Unreal Engine, similar functionality can be achieved by duplicating an effect and modifying the copy to suit a specific platform. These are features that will be given further attention in future versions of the tool.
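The paper does not specify the exact layering rule, but one plausible sketch of blending concurrently playing effects is to take, per motor, the maximum of all active samples so that stronger stimuli dominate. The `IActiveEffect` interface and `EffectMixer` class below are hypothetical stand-ins for the tool's real types.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.InputSystem;

// Hypothetical interface for a currently playing effect; the tool's real
// types and blending rule may differ.
public interface IActiveEffect
{
    float SampleLow(float time);   // low-frequency motor sample in [0, 1]
    float SampleHigh(float time);  // high-frequency motor sample in [0, 1]
}

public static class EffectMixer
{
    // One plausible blending rule: per motor, take the maximum across all
    // concurrently playing effects, then drive the gamepad once per frame.
    public static void Mix(IEnumerable<IActiveEffect> effects, float time)
    {
        float low = 0f, high = 0f;
        foreach (var e in effects)
        {
            low  = Mathf.Max(low,  Mathf.Clamp01(e.SampleLow(time)));
            high = Mathf.Max(high, Mathf.Clamp01(e.SampleHigh(time)));
        }
        Gamepad.current?.SetMotorSpeeds(low, high);
    }
}
```

A max-based mix avoids clipping that naive summation would cause when several strong effects overlap; weighted or additive schemes are equally possible.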
Although Unreal Engine provides a visual editor for haptic effects with a preview feature, it lacks a visual representation of the effect during preview, unless a specific effect is opened in the Force Feedback Effect editor [52]. Additionally, there is no playhead to track the preview progress, no option to loop the preview, and users cannot toggle the different components of the haptic effect individually during the preview. Furthermore, there is no option to select one or more specific devices for the preview (Figure 14c).
Meta Haptics Studio (version 1.3.3) [53] enables the design of haptic effects that can be integrated with the Meta Haptics SDK for Unity, Unreal Engine, and native applications. It supports the translation of audio clips into haptic feedback and offers a wide range of tools for editing. However, it is limited to native VR controllers designed for Meta VR headsets, for which it is specifically created, and does not support gamepads.
Interhaptics offers the Haptic Composer tool (version 1.2.0) [54], which is used to design haptic effects that are then integrated into projects via the APIs and tools provided by the Interhaptics SDK and its plugins for Unity and Unreal Engine. Haptic Composer is a comprehensive software tool for creating haptic effects: it supports a wide array of devices and platforms, offers a feature-rich editor with in-app preview capabilities, includes haptic effect presets, and allows for audio-to-haptics translation.
However, it is important to note that both Meta Haptics Studio and Interhaptics Haptic Composer are standalone tools that are installed and used independently of the game engine. In comparison with the tool described in this paper, these solutions provide more advanced visual representations of haptic effects and expose a wider range of parameters for configuring them. On the other hand, this comes at the cost of a constant need to switch between environments and divide attention between the design tool and the game engine. It also introduces a steeper learning curve and requires additional time to become familiar with the tools’ functionalities and documentation. The haptic feedback design tool presented in this paper is integrated into the game engine, lightweight, and easy to learn, as it is built on concepts already familiar to users of the engine. Additionally, the tool is generally easy to extend, allowing its design approach to be applied to other haptic devices with minimal device-specific code required to initiate and control vibration playback.
What distinguishes the tool described in this paper are the Haptic Zones and the AI Haptic Advisor, features not available in any of the aforementioned game engines or tools. Haptic Zones are implemented to natively support a broad range of use cases, enabling rapid and flexible implementation of context-aware feedback through the combination of four elements: the avatar, the object being held, the activity being performed, and the virtual space. The novelty of the AI Haptic Advisor lies in its integration of natural language-based design input with real-time generation and validation of haptic assets within the game engine environment. This streamlines the haptic design workflow by adopting a “tell us what you need” principle.
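As an illustration of the Haptic Zone concept, a minimal sketch could tie a trigger volume to a context-dependent effect. `HapticEffect` and `HapticPlayer` are hypothetical stand-ins for the tool's actual types, and the held-object/activity lookup is elided.

```csharp
using UnityEngine;

// Illustrative sketch of the Haptic Zone idea: a trigger volume that plays a
// context-dependent effect while the avatar stays inside it. HapticEffect and
// HapticPlayer are hypothetical stand-ins for the tool's real API.
[RequireComponent(typeof(Collider))]
public class HapticZoneSketch : MonoBehaviour
{
    public HapticEffect effectWhileInside;  // e.g., selected per held object/activity

    void OnTriggerStay(Collider other)
    {
        // React only to the player's avatar being inside the zone.
        if (other.CompareTag("Player"))
            HapticPlayer.Play(effectWhileInside);  // hypothetical playback call
    }
}
```

In the actual tool, the played effect would additionally depend on the object the avatar holds and the activity being performed, as described above.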

4.2. Educational Framework Evaluation

To evaluate and improve the educational framework surrounding the haptic effects design tool, it is essential to consider its alignment with established learning theories.
The Revised Bloom’s Taxonomy categorizes educational goals and objectives from lower-order thinking skills, such as remembering, understanding, and applying, to higher-order skills, such as analyzing, evaluating, and creating [55]. The tool supports this taxonomy by enabling students to remember basic haptic concepts, understand their impact on perceived sensation and their relationship with user experience, apply their knowledge in the design process and practical development scenarios, analyze peer feedback during collaborative discussions, evaluate the effectiveness of different haptic designs and their use in various application scenarios, and create innovative haptic patterns that enhance immersion and user experience in interactive applications.
Fink’s Taxonomy of Significant Learning [56] complements this perspective by also emphasizing deep learning and higher-order thinking, while highlighting integration, the human dimension, and caring, elements that foster deeper personal and social transformation and enhance students’ understanding and application of haptic design in real-world scenarios. In the context of integration, the proposed haptic feedback design tool enables designers to interlink haptic feedback design principles with other aspects of interactive design, such as visual and audio feedback, interaction mechanics, and user experience. Caring is promoted by empowering designers to rapidly create more immersive and interactive applications and to focus on the impact of haptic feedback on user engagement and overall experience. Through the human dimension, designers can reflect on how their haptic feedback designs influence users’ behavior, emotions, thoughts, and interactions, and focus on understanding users’ needs. These components can guide students towards building responsibility and empathy, motivating them to create designs that are not only functional but also meaningful and inclusive for diverse populations, including individuals with impairments.
The tool’s practical and reflective nature aligns well with both Constructivist Learning Theory [57] and Kolb’s Experiential Learning Theory [58], as it emphasizes active experimentation and learning through hands-on experience. Users of this tool iteratively create, preview, and modify haptic effects, which enables them to engage in reflective observation and abstract conceptualization. This process allows them to construct an understanding of haptic design principles by directly interacting with sensory feedback.

4.3. Use Cases/Scenarios

This tool is used in several courses related to human-computer interaction, entertainment, and interactive systems, including gaming and VR. Particular focus on haptic feedback is provided in courses such as Virtual Reality Systems and Game Development Process, as it plays a crucial role in these fields. These courses include specific sections aimed at teaching students the fundamental principles, concepts, and techniques for designing, implementing, and applying haptic feedback in various interactive experiences. The tool has also proved valuable in the Game Engine Architecture course, where its functionality, design, and implementation are used to demonstrate the purpose and importance of scripting custom editor tools in game engines.
Within the laboratory environment, students have access to desktop personal computers that have the Unity engine installed, as well as a certain number of gamepad controllers with vibration motors. Currently, the main constraint is the insufficient number of gamepad controllers, resulting in multiple students sharing a single controller. While this might restrict students from getting the full hands-on experience simultaneously, it can be turned into a learning moment. Students can share devices and work in teams, which promotes collaboration, communication, and discussion between them.
Based on the experience gained during the research, an example of the tool’s use in education was created. As illustrated in Table 1, the haptic feedback design tool supports a variety of use cases and scenarios for teaching and learning, both in the lab and remotely. These range from designing basic reusable haptic effects to creating more advanced effects synchronized with application scenarios and other feedback systems. This flexibility enables the tool to be adapted to various teaching levels, from introductory to advanced courses.

4.4. Limitations

The study described in this paper has some limitations that need to be mentioned. It focused on haptic feedback and the proposed haptic feedback design tool, and was conducted qualitatively through informal interviews. Participants were students enrolled in the VR systems course, some of whom had limited programming experience and limited experience working with game engines.
This choice was deliberate, as the aim of the study was to evaluate students’ initial interaction and first experience with the tool, in order to develop a version of the tool suitable for more rigorous future evaluation. The results of this study serve as a preliminary evaluation, providing a foundation for a long-term study that will involve these same students once they gain practical industry experience. The first generation of students who participated in this study already have two years of industry experience, so it is reasonable to start a long-term study as early as the beginning of next year, during which thorough quantitative and qualitative research would be conducted. Another reason for this choice is the planned adaptation of the tool for other game engines, which should further expand the possibilities for its evaluation and strengthen the study’s outcomes.

5. Conclusions

With the accelerated development of haptic technologies and their significant impact on enhancing immersion and user experience in VR, video games, and other interactive applications, the demand for integrating haptic feedback into these experiences is steadily growing. On the other hand, throughout years of teaching VR, game development, and interactive systems, we have observed that while students easily achieve immersion through visuals and audio, haptic feedback remains the most challenging aspect due to limited awareness, experience, and the basic support offered by most game engines.
This paper presents a custom tool for creating gamepad haptic effects within the Unity engine. The developed tool supports a wide range of use cases, enabling the design of reusable haptic effects through a graphical editor, audio conversion, programmatic scripting, and AI-powered guidance. Additionally, the tool allows defining specific areas within the virtual space where different haptic stimuli can be felt, based on the role and activity of the user’s avatar in the virtual world. These haptic zones, along with functionalities for synchronizing haptic and audio feedback, contribute to the methodology for developing haptic authoring tools and designing haptic feedback. These elements emphasize the contextual dependence of haptic feedback, encouraging its synchronization with audio and visual stimuli, as well as with space and activities in virtual reality. This synchronization is crucial for achieving multisensory synthesis, which plays an essential role in realism and user engagement. The tool also provides libraries of haptic effect presets, which can be further expanded and combined. The gamepad was selected as a haptic device due to its widespread use, familiarity, standardized design, ease of use, and connectivity across all gaming platforms, with its growing popularity and advancing haptic actuator technology further strengthening its appeal.
Three generations of master’s students used the tool to complete a project assignment in the field of VR and interactive systems. Informal interviews and discussions were conducted at the beginning and end of the semester, focusing on haptic feedback, its use and significance, and the tools for its implementation. The tool has proven highly effective not only in developing virtual and interactive experiences, where it enables the rapid implementation of haptic feedback, but also in meeting the specific needs of educational environments. It has made it easier for students to see and understand the importance of haptic feedback in VR development and in enhancing user experience, while also helping teachers convey various haptic concepts. Through the qualitative informal interviews, students expressed positive impressions about working with the tool and stated that they would use it again in development. They particularly praised the ability to quickly create complex haptic effects without programming, while emphasizing the usefulness of having multiple, combinable ways to create effects. The ability to live-preview the created effects had a clearly positive impact on engagement in laboratory exercises, teamwork, and participation in discussions.
Students have shown initiative in applying the tool in developing various interactive experiences in different areas of application, such as entertainment, simulations in construction, mechanical engineering, medicine, etc. The perceived haptic stimuli have proven to be particularly immersive in applications where the virtualized object closely matches the form and shape of the gamepad. This challenges the conventional perception of a gamepad controller as solely an entertainment device, showcasing its broader functionality and usability. As an additional benefit, the tool has sparked student interest in creating development tools and engines, as they recognized the importance of such resources in the field.
This work also contributes to the methodology of teaching students how to create VR/AR and interactive systems in general, as well as how to design and implement haptic feedback, promoting iterative testing, discussion, and teamwork. Through practical experience with carefully selected examples of existing commercial interactive applications and by working on projects with various haptic devices, students create haptic feedback, starting with rudimentary forms designed to confirm the system’s receipt of a command and progressing to more sophisticated feedback aimed at enhancing the experience and conveying deeper meaning. Across three generations of students, the tool’s applicability for educational purposes has been demonstrated, successfully covering the haptic rendering component of the VR system architecture.
The preliminary study described in this paper aimed to examine students’ initial experience with the tool, gather early feedback to guide future development, and raise their awareness of haptic feedback and its significance in interactive system design. Building on the foundation established by this preliminary evaluation, a long-term study is planned. This study will involve industry professionals, enabling more comprehensive validation and assessment of the tool’s usability and contribution to professional use.
In the future, the tool is planned to be further improved and adapted for other game development engines, particularly Godot, as it is free, open-source, and shares similarities with Unity, especially in its support for C# scripting.
Additionally, future work will involve expanding the tool’s support for a broader range of haptic feedback devices and conducting a comparison to determine their abilities and limitations in terms of providing haptic feedback. Given the working principle of the tool and its method of creating haptic effects, it is realistic to expect that this solution can be applied to a variety of haptic devices with minimal adaptation, requiring only adjustments to support device-specific functionalities necessary to initiate vibration reproduction. For instance, the tool is applicable to Meta Quest 2 VR controllers, where the haptic curves assigned to the left and right gamepad motors are mapped to the left and right VR controllers, respectively. Further research in this area is planned, including the use of a haptic glove, which is currently being procured. Support for mobile device vibration, with an in situ preview directly on the device, would allow for efficient creation and testing of haptic feedback for mobile platforms. This approach would aim to achieve more consistent user experiences across applications on mobile devices and other platforms. Such functionality could increase the prevalence of haptic feedback on mobile devices and open up a variety of creative usage scenarios.
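For example, the described mapping of a gamepad motor curve onto a Quest controller could be sketched with Unity's XR input API as follows; `rightMotorCurve` is an assumed `AnimationCurve` authored in the tool, not part of its published API.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Illustrative sketch: replaying the curve designed for the gamepad's right
// motor on the right VR controller via Unity's XR haptics API.
public class XrHapticBridge : MonoBehaviour
{
    public AnimationCurve rightMotorCurve;  // assumed: curve authored in the tool
    private float elapsed;

    void Update()
    {
        elapsed += Time.deltaTime;
        float amplitude = Mathf.Clamp01(rightMotorCurve.Evaluate(elapsed));
        InputDevice right = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (right.isValid)
            right.SendHapticImpulse(0, amplitude, Time.deltaTime); // channel 0
    }
}
```

The same pattern, with `XRNode.LeftHand`, would replay the left-motor curve on the left controller, keeping the authored asset unchanged across device types.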
To further improve the AI Haptic Advisor tool, specialized AI models could be developed to better handle queries specific to the tool’s purpose, improving resilience to errors and inadequate inputs from the users of the tool. Additionally, running AI models locally on the tool user’s computer could improve reliability, stability, and response speed. Due to its purpose-built nature with full ownership and access to the code, all future corrections, modifications, adaptations, and upgrades can be implemented without restrictions.
It is essential to recognize and remember that the foundation of progress lies in education. A professional who understands the importance and contribution of haptic feedback during education will be well-equipped to develop VR applications that offer the best possible user experience. Consequently, users of these systems will expect an even better experience in the next generation of VR applications, thus raising the bar for immersion and user experience across the broader landscape of VR applications and related platforms. Moreover, the fact that the tool is not limited to educational use alone further reinforces this perspective.

Author Contributions

Conceptualization, V.B. and D.I.; methodology, V.B.; software, V.B.; validation, D.I. and V.B.; formal analysis, V.B.; investigation, V.B. and D.I.; resources, D.I.; data curation, V.B.; writing—original draft preparation, V.B.; writing—review and editing, V.B. and D.I.; visualization, V.B.; supervision, D.I.; project administration, D.I. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data supporting the findings of this study are available within the paper. For additional information, please contact the corresponding author.

Acknowledgments

This research has been supported by the Ministry of Science, Technological Development and Innovation (Contract No. 451-03-137/2025-03/200156) and the Faculty of Technical Sciences, University of Novi Sad through project “Scientific and Artistic Research Work of Researchers in Teaching and Associate Positions at the Faculty of Technical Sciences, University of Novi Sad 2025” (No. 01-50/295).

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Gao, Y.; Spence, C. Enhancing Presence, Immersion, and Interaction in Multisensory Experiences Through Touch and Haptic Feedback. Virtual Worlds 2025, 4, 3.
  2. Van Damme, S.; Legrand, N.; Heyse, J.; De Backere, F.; De Turck, F.; Vega, M.T. Effects of Haptic Feedback on User Perception and Performance in Interactive Projected Augmented Reality. In Proceedings of the 1st Workshop on Interactive eXtended Reality, Lisboa, Portugal, 14 October 2022; ACM: New York, NY, USA, 2022; pp. 11–18.
  3. Kreimeier, J.; Hammer, S.; Friedmann, D.; Karg, P.; Bühner, C.; Bankel, L.; Götzelmann, T. Evaluation of Different Types of Haptic Feedback Influencing the Task-Based Presence and Performance in Virtual Reality. In Proceedings of the 12th ACM International Conference on PErvasive Technologies Related to Assistive Environments, New York, NY, USA, 5 June 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 289–298.
  4. Collaço, E.; Kira, E.; Sallaberry, L.H.; Queiroz, A.C.M.; Machado, M.A.A.M.; Crivello Jr., O.; Tori, R. Immersion and Haptic Feedback Impacts on Dental Anesthesia Technical Skills Virtual Reality Training. J. Dent. Educ. 2021, 85, 589–598.
  5. Wang, P.; Bai, X.; Billinghurst, M.; Zhang, S.; Han, D.; Sun, M.; Wang, Z.; Lv, H.; Han, S. Haptic Feedback Helps Me? A VR-SAR Remote Collaborative System with Tangible Interaction. Int. J. Hum.–Comput. Interact. 2020, 36, 1242–1257.
  6. Culbertson, H.; Schorr, S.; Okamura, A. Haptics: The Present and Future of Artificial Touch Sensation. Annu. Rev. Control Robot. Auton. Syst. 2018, 1, 385–409.
  7. A Touch of Virtual Reality. Nat. Mach. Intell. 2023, 5, 557.
  8. Kim, J.J.; Wang, Y.; Wang, H.; Lee, S.; Yokota, T.; Someya, T. Skin Electronics: Next-Generation Device Platform for Virtual and Augmented Reality. Adv. Funct. Mater. 2021, 31, 2009602.
  9. Pezent, E.; O’Malley, M.K.; Israr, A.; Samad, M.; Robinson, S.; Agarwal, P.; Benko, H.; Colonnese, N. Explorations of Wrist Haptic Feedback for AR/VR Interactions with Tasbi. In Proceedings of the Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 1–4.
  10. Tanaçar, N.; Mughrabi, M.; Batmaz, A.; Leonardis, D.; Sarac, M. The Impact of Haptic Feedback During Sudden, Rapid Virtual Interactions. In Proceedings of the IEEE World Haptics Conference (WHC), Delft, The Netherlands, 10–13 July 2023; p. 70.
  11. Brasen, P.W.; Christoffersen, M.; Kraus, M. Effects of Vibrotactile Feedback in Commercial Virtual Reality Systems. In Interactivity, Game Creation, Design, Learning, and Innovation; Brooks, A.L., Brooks, E., Sylla, C., Eds.; Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering; Springer International Publishing: Cham, Switzerland, 2019; Volume 265, pp. 219–224. ISBN 978-3-030-06133-3.
  12. Sallnäs, E.-L.; Rassmus-Gröhn, K.; Sjöström, C. Supporting Presence in Collaborative Environments by Haptic Force Feedback. ACM Trans. Comput.-Hum. Interact. 2000, 7, 461–476.
  13. Richard, G.; Pietrzak, T.; Argelaguet, F.; Lécuyer, A.; Casiez, G. Studying the Role of Haptic Feedback on Virtual Embodiment in a Drawing Task. Front. Virtual Real. 2021, 1, 573167.
  14. Jacucci, G.; Bellucci, A.; Ahmed, I.; Harjunen, V.; Spape, M.; Ravaja, N. Haptics in Social Interaction with Agents and Avatars in Virtual Reality: A Systematic Review. Virtual Real. 2024, 28, 170.
  15. Kaplan, H.; Pyayt, A. Fully Digital Audio Haptic Maps for Individuals with Blindness. Disabilities 2024, 4, 64–78.
  16. Schmitz, B.; Ertl, T. Making Digital Maps Accessible Using Vibrations. In Computers Helping People with Special Needs; Miesenberger, K., Klaus, J., Zagler, W., Karshmer, A., Eds.; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2010; Volume 6179, pp. 100–107. ISBN 978-3-642-14096-9.
  17. Nordvall, M. The Sightlence Game: Designing a Haptic Computer Game Interface. In Proceedings of the DiGRA ’13—2013 DiGRA International Conference: DeFragging Game Studies, Atlanta, GA, USA, 26–29 August 2013.
  18. Gutschmidt, R.; Schiewe, M.; Zinke, F.; Jürgensen, H. Haptic Emulation of Games: Haptic Sudoku for the Blind. In Proceedings of the 3rd International Conference on PErvasive Technologies Related to Assistive Environments, Samos, Greece, 23–25 June 2010; Association for Computing Machinery: New York, NY, USA, 2010; pp. 1–8.
  19. Kaewprapan, W. Game Development for the Visually Impaired; Sulaiman, H.A., Othman, M.A., Othman, M.F.I., Rahim, Y.A., Pee, N.C., Eds.; Springer International Publishing: Cham, Switzerland, 2016; Volume 362, pp. 1309–1315.
  20. Trifănică, V.; Moldoveanu, A.; Butean, A.; Butean, D. Gamepad Vibration Methods to Help Blind People Perceive Colors. In Proceedings of the 12th Romanian Human-Computer Interaction Conference (RoCHI 2015), Bucharest, Romania, 24–25 September 2015; pp. 37–40.
  21. Seifi, H.; Fazlollahi, F.; Oppermann, M.; Sastrillo, J.A.; Ip, J.; Agrawal, A.; Park, G.; Kuchenbecker, K.J.; MacLean, K.E. Haptipedia: Accelerating Haptic Device Discovery to Support Interaction & Engineering Design. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 1–12.
  22. Wee, C.; Yap, K.M.; Lim, W.N. Haptic Interfaces for Virtual Reality: Challenges and Research Directions. IEEE Access 2021, 9, 112145–112162. [Google Scholar] [CrossRef]
23. Salisbury, K.; Conti, F.; Barbagli, F. Haptic Rendering: Introductory Concepts. IEEE Comput. Graph. Appl. 2004, 24, 24–32. [Google Scholar] [CrossRef]
  24. Gibbs, J.K.; Gillies, M.; Pan, X. A Comparison of the Effects of Haptic and Visual Feedback on Presence in Virtual Reality. Int. J. Hum.-Comput. Stud. 2022, 157, 102717. [Google Scholar] [CrossRef]
25. Cardoso, J.C.S. Comparison of Gesture, Gamepad, and Gaze-Based Locomotion for VR Worlds. In Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, Munich, Germany, 2 November 2016; ACM: New York, NY, USA, 2016; pp. 319–320. [Google Scholar]
  26. Mayor, J.; Raya, L.; Sanchez, A. A Comparative Study of Virtual Reality Methods of Interaction and Locomotion Based on Presence, Cybersickness, and Usability. IEEE Trans. Emerg. Top. Comput. 2021, 9, 1542–1553. [Google Scholar] [CrossRef]
  27. Zhao, J.; An, R.; Xu, R.; Lin, B. Comparing Hand Gestures and a Gamepad Interface for Locomotion in Virtual Environments. Int. J. Hum.-Comput. Stud. 2022, 166, 102868. [Google Scholar] [CrossRef]
  28. Ortega, F.R.; Williams, A.S.; Tarre, K.; Barreto, A.; Rishe, N. 3D Travel Comparison Study between Multi-Touch and GamePad. Int. J. Hum.–Comput. Interact. 2020, 36, 1699–1713. [Google Scholar] [CrossRef]
  29. Bowman, N.D.; Pietschmann, D.; Liebold, B. The Golden (Hands) Rule: Exploring User Experiences with Gamepad and Natural-User Interfaces in Popular Video Games. J. Gaming Virtual Worlds 2017, 9, 71–85. [Google Scholar] [CrossRef]
  30. Ali, M.; Cardona-Rivera, R.E. Comparing Gamepad and Naturally-Mapped Controller Effects on Perceived Virtual Reality Experiences. In Proceedings of the ACM Symposium on Applied Perception, Virtual Event, 12 September 2020; ACM: New York, NY, USA, 2020; pp. 1–10. [Google Scholar]
  31. Konishi, Y. Sony Group Portal—R&D—Stories—Haptics. Available online: https://www.sony.com/en/SonyInfo/technology/stories/Haptics/ (accessed on 22 September 2023).
  32. Kuchera, B. How the Nintendo Switch’s HD Rumble Makes Tumbleseed Feel Real. Polygon. 2017. Available online: https://www.polygon.com/2017/5/1/15499328/tumbleseed-hd-rumble-nintendo-switch (accessed on 15 March 2025).
  33. Tarigan, J.T.; Sikoko, A.S.; Selvida, D. Developing an Efficient Vibrotactile Stimuli in Computer Game. In Proceedings of the 2024 8th International Conference on Electrical, Telecommunication and Computer Engineering (ELTICOM), Medan, Indonesia, 21–22 November 2024; pp. 126–129. [Google Scholar]
  34. Pichlmair, M.; Johansen, M. Designing Game Feel: A Survey. IEEE Trans. Games 2022, 14, 138–152. [Google Scholar] [CrossRef]
  35. Bursać, V.; Ivetić, D.; Kupusinac, A. Program Model for a Visual Editor of Gamepad Haptic Effects. In Proceedings of the 14th International Conference on Applied Internet and Information Technologies (AIIT 2024), Zrenjanin, Serbia, 8 November 2024; University of Novi Sad, Technical Faculty “Mihajlo Pupin”, Zrenjanin, Republic of Serbia: Zrenjanin, Serbia, 2024; pp. 173–180. [Google Scholar]
  36. Terenti, M.; Vatavu, R.-D. VIREO: Web-Based Graphical Authoring of Vibrotactile Feedback for Interactions with Mobile and Wearable Devices. Int. J. Hum.–Comput. Interact. 2023, 39, 4162–4180. [Google Scholar] [CrossRef]
  37. Nordvall, M.; Arvola, M.; Boström, E.; Danielsson, H.; Overkamp, T. VibEd: A Prototyping Tool for Haptic Game Interfaces. In Proceedings of the iConference 2016 Proceedings, Philadelphia, PA, USA, 20–23 March 2016; iSchools: Philadelphia, PA, USA, 2016. [Google Scholar]
38. Merdenyan, B.; Petrie, H. User Reviews of Gamepad Controllers: A Source of User Requirements and User Experience. In Proceedings of the 2015 Annual Symposium on Computer-Human Interaction in Play; Association for Computing Machinery: New York, NY, USA, 2015. [Google Scholar]
39. Valve Corporation. An Update on Steam Input and Controller Support—Steam News. Available online: https://store.steampowered.com/news/app/593110/view/4142827237888316811 (accessed on 3 March 2025).
40. Straits Research. Wireless Gamepad Market Size, Share & Trends, Growth Demand Report, 2031. Available online: https://straitsresearch.com/report/wireless-gamepad-market (accessed on 3 March 2025).
41. Unity Technologies. Input System Manual, Version 1.8.2. Available online: https://docs.unity3d.com/Packages/com.unity.inputsystem@1.8/manual/index.html (accessed on 27 June 2024).
  42. Cruz-Hernandez, J.M.; Ullrich, C.J. Sound to Haptic Effect Conversion System Using Mapping. U.S. Patent 10,339,772 B2, 2 July 2019. [Google Scholar]
  43. Brown, M.; Kehoe, A.; Kirakowski, J.; Pitt, I. Beyond the Gamepad: HCI and Game Controller Design and Evaluation. In Evaluating User Experience in Games; Springer: London, UK, 2010; pp. 197–219. ISBN 978-3-319-15984-3. [Google Scholar]
  44. Cleary, A.G.; McKendrick, H.; Sills, J.A. Hand-Arm Vibration Syndrome May Be Associated with Prolonged Use of Vibrating Computer Games. BMJ 2002, 324, 301. [Google Scholar] [CrossRef]
  45. Swain, J.; King, B. Using Informal Conversations in Qualitative Research. Int. J. Qual. Methods 2022, 21, 16094069221085056. [Google Scholar] [CrossRef]
  46. Witmer, B.G.; Singer, M.J. Measuring Presence in Virtual Environments: A Presence Questionnaire. Presence 1998, 7, 225–240. [Google Scholar] [CrossRef]
  47. Gerger, H.; Søgaard, K.; Macri, E.M.; Jackson, J.A.; Elbers, R.G.; van Rijn, R.M.; Koes, B.; Chiarotto, A.; Burdorf, A. Exposure to Hand-Arm Vibrations in the Workplace and the Occurrence of Hand-Arm Vibration Syndrome, Dupuytren’s Contracture, and Hypothenar Hammer Syndrome: A Systematic Review and Meta-Analysis. J. Occup. Environ. Hyg. 2023, 20, 257–267. [Google Scholar] [CrossRef] [PubMed]
  48. Hornsey, R.L.; Hibbard, P.B. Current Perceptions of Virtual Reality Technology. Appl. Sci. 2024, 14, 4222. [Google Scholar] [CrossRef]
  49. Schneider, O.; MacLean, K.; Swindells, C.; Booth, K. Haptic Experience Design: What Hapticians Do and Where They Need Help. Int. J. Hum.-Comput. Stud. 2017, 107, 5–21. [Google Scholar] [CrossRef]
50. Unity Technologies. Interface IDualMotorRumble, Input System 1.8.2. Available online: https://docs.unity3d.com/Packages/com.unity.inputsystem@1.8/api/UnityEngine.InputSystem.Haptics.IDualMotorRumble.html (accessed on 27 June 2024).
  51. Godot Community; Linietsky, J.; Manzur, A. Controllers, Gamepads, and Joysticks. Available online: https://docs.godotengine.org/en/stable/tutorials/inputs/controllers_gamepads_joysticks.html#vibration (accessed on 15 April 2025).
52. Epic Games. Force Feedback in Unreal Engine. Unreal Engine 5.5 Documentation, Epic Developer Community. Available online: https://dev.epicgames.com/documentation/en-us/unreal-engine/force-feedback-in-unreal-engine (accessed on 15 April 2025).
53. Meta. Haptics Overview. Meta Developers. Available online: https://developers.meta.com/horizon/resources/haptics-overview/ (accessed on 15 April 2025).
54. Interhaptics. Haptic Composer. Available online: https://doc.wyvrn.com/docs/interhaptics-sdk/haptic-composer/ (accessed on 15 April 2025).
  55. Anderson, L.W.; Krathwohl, D.R. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives; Longman: Boston, MA, USA, 2001; ISBN 978-0-321-08405-7. [Google Scholar]
  56. Fink, L.D. Creating Significant Learning Experiences: An Integrated Approach to Designing College Courses; Wiley: Hoboken, NJ, USA, 2003; ISBN 978-0-7879-7121-2. [Google Scholar]
  57. Piaget, J. To Understand Is to Invent: The Future of Education; Penguin Books: London, UK, 1976; ISBN 978-0-14-004378-5. [Google Scholar]
  58. Kolb, D.A. Experiential Learning: Experience as the Source of Learning and Development; Prentice-Hall: Hoboken, NJ, USA, 1984; ISBN 978-0-13-295261-3. [Google Scholar]
Figure 1. Unity Editor workspace, including a visual representation of the scene with the hierarchy of 3D objects, an overview of project assets, and the haptic effects editor with its visualization of haptic and audio effects.
Figure 2. (a) Creating HapticEffect assets in the Unity Editor through the Assets/Create submenu within the Project window. (b) Haptic effects saved as project assets. (c) Custom visual editor for designing and previewing a HapticEffect object. (d) Preview options, including looped preview functionality and a display of connected gamepad devices, which can be individually selected for preview (one, multiple, or all devices).
Figure 3. (a) Creating a haptic effect using the Animation Curve Editor with a timeline, grid lines, and curve presets. (b) Editing tools for animation curve keys. (c) Built-in and user-defined curve presets and preset options. (d) Clipboard functionality for copying and pasting curves.
Figure 4. Additional tools integrated within the haptic feedback design tool: (a) The Animation Curve Generator tool enables users to create and fine-tune custom animation curves for haptic effects. (b) The AI Haptic Advisor tool uses artificial intelligence to recommend optimal haptic feedback settings for specific use cases.
Figure 5. Split-screen detailed view of the Curve Editor with a timeline grid and curve presets, and a preview section with audio and/or haptic playback options. The blue curve represents haptic motor vibration intensities, while the audio waveform is shown in orange.
Figure 6. (a) Properties for adjusting how the distance from the source of a haptic effect influences its reproduction. (b) A haptic effect added within a script component on a game object. (c) The haptic effect's perception radius, represented as a sphere gizmo around the corresponding object in the scene.
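The distance-based reproduction illustrated in Figure 6 can be approximated with a short Unity script. The sketch below is illustrative only — the class and field names (`HapticSource`, `perceptionRadius`, and so on) are our own, not part of the tool's API — but the rumble call itself uses Unity's documented `IDualMotorRumble` interface [50]. It scales dual-motor intensity linearly from full strength at the source to silence at the edge of the perception radius.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Hypothetical sketch: attenuates gamepad rumble with the avatar's
// distance from the effect source, up to a configurable radius.
public class HapticSource : MonoBehaviour
{
    [SerializeField] private Transform avatar;            // listener position
    [SerializeField] private float perceptionRadius = 5f; // metres
    [SerializeField, Range(0f, 1f)] private float lowFrequencyStrength = 0.8f;
    [SerializeField, Range(0f, 1f)] private float highFrequencyStrength = 0.4f;

    private void Update()
    {
        var gamepad = Gamepad.current;
        if (gamepad == null) return;

        float distance = Vector3.Distance(avatar.position, transform.position);
        // Linear falloff: 1 at the source, 0 at the perception radius.
        float attenuation = Mathf.Clamp01(1f - distance / perceptionRadius);

        gamepad.SetMotorSpeeds(lowFrequencyStrength * attenuation,
                               highFrequencyStrength * attenuation);
    }

    private void OnDisable()
    {
        // Stop vibration when the source is deactivated.
        Gamepad.current?.ResetHaptics();
    }
}
```

A non-linear falloff could be substituted by evaluating an `AnimationCurve` over the normalized distance instead of the linear term.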
Figure 7. Haptic zones in a cannon-operating simulation, enabling different stimuli based on the virtual object the avatar holds and their actions: (a) Unity Inspector settings for the cannon muzzle haptic zone. (b) Haptic zones for the muzzle and fuse of the cannon, placed at the relevant locations on the cannon game object. Haptic zone gizmos are visible only in the Unity Editor and are not seen by the player.
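A haptic zone of the kind shown in Figure 7 maps naturally onto a Unity trigger collider. The following sketch is a simplified stand-in for the tool's zone component (the tag and intensity fields are illustrative assumptions): it starts a zone-specific rumble when a tracked object enters the collider and stops it on exit, using only standard Unity APIs.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Hypothetical sketch of a haptic zone: a trigger collider that
// starts zone-specific rumble when the tracked object (e.g., the
// held tool or a wheel) enters, and stops it on exit.
[RequireComponent(typeof(Collider))]
public class HapticZone : MonoBehaviour
{
    [SerializeField, Range(0f, 1f)] private float lowFrequency = 0.6f;
    [SerializeField, Range(0f, 1f)] private float highFrequency = 0.2f;
    [SerializeField] private string listenerTag = "HapticListener"; // illustrative tag

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag(listenerTag))
            Gamepad.current?.SetMotorSpeeds(lowFrequency, highFrequency);
    }

    private void OnTriggerExit(Collider other)
    {
        if (other.CompareTag(listenerTag))
            Gamepad.current?.ResetHaptics();
    }
}
```

In the motorcycle example of Figure 8, two such zones with different intensities would overlap at the surface boundary, producing the combined stimulus when only one wheel is inside the stony area.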
Figure 8. Different haptic zones depending on the type of surface in a motorcycle riding simulation. The user receives a specific combined haptic stimulus because only one wheel is in the zone of the surface with larger stones.
Figure 9. (a) Help box with instructions for generating a haptic effect from an audio clip. (b) Example tooltips providing additional explanations and tips for various inputs across the tool's user interface. (c) Different types of feedback messages in the console (warning, success, error, information).
Figure 10. Simulation of working with a jackhammer on a construction site. The user receives a stronger haptic stimulus because their avatar is not wearing protective gloves.
Figure 11. Surgical augmented reality simulation for learning anatomy and dissection: (a) Haptic zones marked on different tissue layers of a 3D model, with each tissue type providing distinct haptic feedback when the virtual scalpel interacts with it. (b) Prototype curves for haptic effects based on the density and texture of the tissue.
Figure 12. Change in students' perception and awareness of the importance of visual, audio, and haptic feedback in interactive experiences from the beginning to the end of the semester. Ratings were extracted from informal interviews. Haptic feedback showed the greatest increase in perceived importance, while visual feedback remained highly rated, with only a slight increase.
Figure 13. Overall user experience ratings (scale of 1–5) for the haptic feedback design tool, provided by seventeen students across three generations of the study, consisting of five, six, and six students, respectively. These ratings were extracted from informal interviews and reflect students’ subjective impressions of working with the tool.
Figure 14. (a) Code implementation of a simple haptic effect with a linear intensity increase. (b) The same haptic effect represented as an animation curve within the visual editor. (c) The Force Feedback Effect editor and preview in Unreal Engine.
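A linear intensity ramp of the kind compared in Figure 14 can be sketched in a few lines of Unity C#. The coroutine below is a hedged approximation (the duration and peak values are illustrative, and the class is not the tool's actual implementation): it raises both motor speeds linearly from zero to a peak over the effect's duration, then resets the haptics — the programmatic equivalent of a straight-line animation curve in the visual editor.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.InputSystem;

// Hypothetical sketch: a linearly increasing rumble effect,
// equivalent to a straight-line curve from 0 to peak intensity.
public class LinearRumble : MonoBehaviour
{
    [SerializeField] private float duration = 1.5f;          // seconds
    [SerializeField, Range(0f, 1f)] private float peak = 1f; // maximum intensity

    public IEnumerator PlayLinearRamp()
    {
        var gamepad = Gamepad.current;
        if (gamepad == null) yield break;

        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            float intensity = peak * (t / duration); // linear 0 → peak
            gamepad.SetMotorSpeeds(intensity, intensity);
            yield return null; // continue on the next frame
        }
        gamepad.ResetHaptics();
    }
}
```

Triggering it from gameplay code is a single call, e.g. `StartCoroutine(PlayLinearRamp())` from another MonoBehaviour method.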
Table 1. Examples of use cases and scenarios for teaching and learning with the haptic feedback design tool.
| Use Case | Scenario | Activity | Learning Outcome |
| --- | --- | --- | --- |
| Introduction to Haptic Design | Students are introduced to basic haptic concepts and parameters (intensity, duration, rhythmic patterns, frequency, latency). | Students create simple haptic effects for basic events (e.g., button presses, character movements) and adjust parameters to feel the changes. | Students grasp fundamental haptic principles and how to manipulate them. |
| Design of Advanced Haptic Effects for Mechanics and Events | Students design complex haptic feedback linked to mechanics and events. | Creating haptic patterns (e.g., an accelerating heartbeat when health is low) and environmental feedback (e.g., vibrations for different surfaces), while synchronizing with other feedback systems such as audio and visual effects. | Understand the relationship between haptic feedback, other feedback systems, and user experience to enhance immersion. |
| Collaborative Projects and Peer Review | Students work in teams to design and integrate haptic feedback into existing application prototypes. | Collaboratively design haptic effects using the tool, integrate them into the application, and participate in discussions to share and receive peer feedback for refinement. | Develop teamwork skills and the ability to evaluate, critique, and improve haptic designs. |
| Home Assignments and Remote Learning | Students continue working on haptic designs outside the lab. | Use the tool on personal devices to explore its features, refine projects, and access online resources and tutorials. | Encourage independent learning and reinforce concepts learned in the lab. |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Bursać, V.; Ivetić, D. A Versatile Tool for Haptic Feedback Design Towards Enhancing User Experience in Virtual Reality Applications. Appl. Sci. 2025, 15, 5419. https://doi.org/10.3390/app15105419
