Search Results (36)

Search Parameters:
Keywords = haptic design tool

21 pages, 24372 KiB  
Article
Streamlining Haptic Design with Micro-Collision Haptic Map Generated by Stable Diffusion
by Hongyu Liu and Zhenyu Gu
Appl. Sci. 2025, 15(13), 7174; https://doi.org/10.3390/app15137174 - 26 Jun 2025
Viewed by 301
Abstract
Rendering surface materials to provide realistic tactile sensations is a key focus in haptic interaction research. However, generating texture maps and designing corresponding haptic feedback often require expert knowledge and significant effort. To simplify the workflow, we developed a micro-collision-based tactile texture dataset for several common materials and fine-tuned the VAE model of Stable Diffusion. Our approach allows designers to generate matching visual and haptic textures from natural language prompts and enables users to receive real-time, realistic haptic feedback when interacting with virtual surfaces. We evaluated our method through a haptic design task. Professional and non-haptic designers each created one haptic design using traditional tools and another using our approach. Participants then evaluated the four resulting designs. The results showed that our method produced haptic feedback comparable to that of professionals, though slightly lower in overall and consistency scores. Importantly, professional designers using our method required less time and fewer expert resources. Non-haptic designers also achieved better outcomes with our tool. Our generative method optimizes the haptic design workflow, lowering the expertise threshold and increasing efficiency. It has the potential to support broader adoption of haptic design in interactive media and enhance multisensory experiences. Full article
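The abstract does not spell out how a micro-collision map drives vibrotactile output. As an illustrative sketch only (not the paper's implementation), one common micro-geometry model assumes the generated haptic map is a 2D height field and that vibration amplitude tracks the height change under a moving finger:

```python
def collision_amplitudes(haptic_map, path, gain=1.0):
    """Convert a 2D micro-collision (height) map into per-sample
    vibration amplitudes along a finger stroke.

    haptic_map : 2D list of surface heights in [0, 1]
    path       : sequence of (row, col) integer positions
    Each step's amplitude is proportional to the height change
    between consecutive samples (a simple micro-geometry model).
    """
    heights = [haptic_map[r][c] for r, c in path]
    return [gain * abs(b - a) for a, b in zip(heights, heights[1:])]

# A tiny synthetic "haptic map": a ridged surface with alternating columns.
hmap = [[0.0, 1.0, 0.0, 1.0] for _ in range(4)]
path = [(1, 0), (1, 1), (1, 2), (1, 3)]  # stroke across the ridges
amps = collision_amplitudes(hmap, path)  # one amplitude per step
```

In the paper's pipeline the map would come from the fine-tuned Stable Diffusion VAE rather than the synthetic ridged surface used here.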

20 pages, 3616 KiB  
Article
An RGB-D Camera-Based Wearable Device for Visually Impaired People: Enhanced Navigation with Reduced Social Stigma
by Zhiwen Li, Fred Han and Kangjie Zheng
Electronics 2025, 14(11), 2168; https://doi.org/10.3390/electronics14112168 - 27 May 2025
Viewed by 699
Abstract
This paper presents an intelligent navigation wearable device for visually impaired individuals. The system aims to improve their independent travel capabilities and reduce the negative emotional impacts associated with visible disability indicators in travel tools. It employs an RGB-D camera and an inertial measurement unit (IMU) sensor to facilitate real-time obstacle detection and recognition via advanced point cloud processing and YOLO-based target recognition techniques. An integrated intelligent interaction module identifies the core obstacle from the detected obstacles and translates this information into multidimensional auxiliary guidance. Users receive haptic feedback to navigate obstacles, indicating directional turns and distances, while auditory prompts convey the identity and distance of obstacles, enhancing spatial awareness. The intuitive vibrational guidance significantly enhances safety during obstacle avoidance, and the voice instructions promote a better understanding of the surrounding environment. The device adopts an arm-mounted design, departing from the traditional cane structure that reinforces disability labeling and social stigma. This lightweight mechanical design prioritizes user comfort and mobility, making it more user-friendly than traditional stick-type aids. Experimental results demonstrate that this system outperforms traditional white canes and ultrasonic devices in reducing collision rates, particularly for mid-air obstacles, thereby significantly improving safety in dynamic environments. Furthermore, the system’s ability to vocalize obstacle identities and distances in advance enhances spatial perception and interaction with the environment. By eliminating the cane structure, this innovative wearable design effectively minimizes social stigma, empowering visually impaired individuals to travel independently with increased confidence, ultimately contributing to an improved quality of life. Full article
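The "core obstacle" selection and distance-coded vibration described above can be sketched as follows. This is hypothetical logic for illustration only: the paper's actual selection criterion and intensity mapping are not given in the abstract, and `max_range_m` is an assumed parameter.

```python
def select_core_obstacle(detections):
    """Pick the most safety-relevant obstacle; here, simply the
    nearest detection. detections: (label, distance_m, bearing_deg)."""
    return min(detections, key=lambda d: d[1])

def vibration_intensity(distance_m, max_range_m=3.0):
    """Map obstacle distance to a 0..1 vibration intensity:
    stronger as the obstacle gets closer, zero beyond max range."""
    return max(0.0, min(1.0, 1.0 - distance_m / max_range_m))

# Detections as they might arrive from the YOLO + point cloud stage.
dets = [("chair", 2.4, -10.0), ("person", 0.9, 5.0), ("sign", 1.8, 30.0)]
core = select_core_obstacle(dets)
level = vibration_intensity(core[1])
```

The auditory channel would then vocalize `core[0]` and its distance, while `level` drives the haptic actuator.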

15 pages, 2910 KiB  
Article
Advancing Foundry Training Through Virtual Reality: A Low-Cost, Immersive Learning Environment
by Anson Fry, Ismail Fidan and Eric Wooldridge
Inventions 2025, 10(3), 38; https://doi.org/10.3390/inventions10030038 - 22 May 2025
Cited by 1 | Viewed by 545
Abstract
Metal casting foundries present hazardous working conditions, making traditional training methods costly, time-consuming, and potentially unsafe. To address these challenges, this study presents a Virtual Reality (VR) training framework developed for the Tennessee Tech University (TTU) Foundry. The objective is to enhance introductory training and safety education by providing an immersive, interactive, and risk-free environment where trainees can familiarize themselves with safety protocols, equipment handling, process workflows, and machine arrangements before engaging with real-world operations. The VR foundry environment is designed using Unreal Engine, a freely available software tool, to create a high-fidelity, interactive simulation of metal casting processes. This system enables real-time user interaction, scenario-based training, and procedural guidance, ensuring an engaging and effective learning experience. Preliminary findings and prior research indicate that VR-based training enhances learning retention, improves hazard recognition, and reduces training time compared to traditional methods. While challenges such as haptic feedback limitations and initial setup costs exist, VR’s potential in engineering education and industrial training is substantial. This work-in-progress study highlights the transformative role of VR in foundry training, contributing to the development of a safer, more efficient, and scalable workforce in the metal casting industry. Full article
(This article belongs to the Section Inventions and Innovation in Advanced Manufacturing)

33 pages, 10073 KiB  
Article
A Versatile Tool for Haptic Feedback Design Towards Enhancing User Experience in Virtual Reality Applications
by Vasilije Bursać and Dragan Ivetić
Appl. Sci. 2025, 15(10), 5419; https://doi.org/10.3390/app15105419 - 13 May 2025
Viewed by 880
Abstract
The past 15 years of extensive experience teaching VR system development have taught us that haptic feedback must be integrated into VR systems with greater sophistication, alongside the already realistic high-fidelity visual and audio feedback. The third generation of students is enhancing VR interactive experiences by incorporating haptic feedback through traditional, proven, commercially available gamepad controllers. Insights and discoveries gained through this process contributed to the development of a versatile Unity custom editor tool, which is the focus of this article. The developed tool supports a wide range of use cases, enabling the visual, parametric, and descriptive creation of reusable haptic effects. To enhance productivity in commercial development, it supports the creation of haptic and haptic/audio stimulus libraries, which can be further expanded and combined based on object-oriented principles. Additionally, the tool allows for the definition of specific areas within the virtual space where these stimuli can be experienced, depending on the virtual object the avatar holds and the activities they perform. This intuitive platform allows the design of reusable haptic effects through a graphical editor, audio conversion, programmatic scripting, and AI-powered guidance. The sophistication and usability of the tool have been demonstrated through several student VR projects across various application areas. Full article
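The tool itself is a Unity (C#) editor extension; as a language-neutral illustration of the kind of parametric, reusable rumble effect such a tool defines, here is a trapezoidal amplitude envelope sketch (all names and parameters are hypothetical, not the tool's API):

```python
def haptic_envelope(duration_s, attack_s, release_s, peak=1.0, rate_hz=100):
    """Sample a trapezoidal rumble envelope: linear attack, flat
    sustain at `peak`, linear release. Returns motor amplitudes
    in [0, 1], one per 1/rate_hz tick."""
    n = int(duration_s * rate_hz)
    samples = []
    for i in range(n):
        t = i / rate_hz
        if t < attack_s:
            a = peak * t / attack_s                     # ramp up
        elif t > duration_s - release_s:
            a = peak * (duration_s - t) / release_s     # ramp down
        else:
            a = peak                                    # sustain
        samples.append(a)
    return samples

# A reusable half-second "thud" effect at 80% strength.
env = haptic_envelope(0.5, attack_s=0.1, release_s=0.2, peak=0.8)
```

At runtime, each sample would be fed to the gamepad's rumble motor once per tick; a stimulus library is then just a named collection of such parameter sets.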

41 pages, 4809 KiB  
Review
Neurocomputational Mechanisms of Sense of Agency: Literature Review for Integrating Predictive Coding and Adaptive Control in Human–Machine Interfaces
by Anirban Dutta
Brain Sci. 2025, 15(4), 396; https://doi.org/10.3390/brainsci15040396 - 14 Apr 2025
Cited by 1 | Viewed by 1456
Abstract
Background: The sense of agency (SoA)—the subjective experience of controlling one’s own actions and their consequences—is a fundamental aspect of human cognition, volition, and motor control. Understanding how the SoA arises and is disrupted in neuropsychiatric disorders has significant implications for human–machine interface (HMI) design for neurorehabilitation. Traditional cognitive models of agency often fail to capture its full complexity, especially in dynamic and uncertain environments. Objective: This review synthesizes computational models—particularly predictive coding, Bayesian inference, and optimal control theories—to provide a unified framework for understanding the SoA in both healthy and dysfunctional brains. It aims to demonstrate how these models can inform the design of adaptive HMIs and therapeutic tools by aligning with the brain’s own inference and control mechanisms. Methods: I reviewed the foundational and contemporary literature on predictive coding, Kalman filtering, the Linear–Quadratic–Gaussian (LQG) control framework, and active inference. I explored their integration with neurophysiological mechanisms, focusing on the somato-cognitive action network (SCAN) and its role in sensorimotor integration, intention encoding, and the judgment of agency. Case studies, simulations, and XR-based rehabilitation paradigms using robotic haptics were used to illustrate theoretical concepts. Results: The SoA emerges from hierarchical inference processes that combine top–down motor intentions with bottom–up sensory feedback. Predictive coding frameworks, especially when implemented via Kalman filters and LQG control, provide a mechanistic basis for modeling motor learning, error correction, and adaptive control. Disruptions in these inference processes underlie symptoms in disorders such as functional movement disorder. 
XR-based interventions using robotic interfaces can restore the SoA by modulating sensory precision and motor predictions through adaptive feedback and suggestion. Computer simulations demonstrate how internal models and hypnotic suggestions influence state estimation, motor execution, and the recovery of agency. Conclusions: Predictive coding and active inference offer a powerful computational framework for understanding and enhancing the SoA in health and disease. The SCAN system serves as a neural hub for integrating motor plans with cognitive and affective processes. Future work should explore the real-time modulation of agency via biofeedback, simulation, and SCAN-targeted non-invasive brain stimulation. Full article
(This article belongs to the Special Issue New Insights into Movement Generation: Sensorimotor Processes)
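The Kalman-filter account of agency that the review builds on reduces to a scalar predict/update cycle. A minimal sketch with illustrative noise values: `u` stands in for the efference copy and `r` for (inverse) sensory precision; none of the numbers come from the article.

```python
def kalman_step(x_est, p_est, u, z, q=0.01, r=0.1):
    """One predict/update cycle of a scalar Kalman filter.
    x_est, p_est : prior state estimate and its variance
    u : motor command (predicted state change, the efference copy)
    z : sensory observation
    q, r : process and sensory noise variances; larger r means less
           precise sensory evidence, so the prediction dominates."""
    # Predict: the forward model advances the state with the motor command.
    x_pred = x_est + u
    p_pred = p_est + q
    # Update: weigh the prediction against sensory feedback.
    k = p_pred / (p_pred + r)          # Kalman gain
    x_new = x_pred + k * (z - x_pred)  # prediction-error correction
    p_new = (1 - k) * p_pred
    return x_new, p_new

# Intended movement of +1.0; the sensed outcome reads only 0.8.
x, p = kalman_step(0.0, 1.0, u=1.0, z=0.8)
```

Raising `r` shrinks the Kalman gain, so the estimate stays closer to the motor prediction: the precision-weighting mechanism the review invokes for hypnotic suggestion and adaptive XR feedback.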

18 pages, 11128 KiB  
Article
Implementing Augmented Reality Models in the Classroom Environment Using Merge Cubes: A Quantitative Study of the Effects on Students’ Cognitive Load and Motivation
by Raphael Fehrmann
Educ. Sci. 2025, 15(4), 414; https://doi.org/10.3390/educsci15040414 - 26 Mar 2025
Viewed by 1324
Abstract
The present study investigates the extent to which the use of Merge Cubes as haptic AR tools in the classroom—realized in construction technology lessons at a vocational college as an exemplary case—influences the cognitive load and motivation of learners. A quasi-experimental field study was conducted using a questionnaire in a pre-post design including a control group at a vocational college in Germany (North Rhine-Westphalia). During the intervention phase, the students in the experimental group worked with materials such as textbooks and worksheets that were specifically expanded to include the Merge Cube AR learning tool, while the students in the control group only used conventional learning materials. In both the pre- and post-test, the cognitive load and motivation of the learners were recorded using questionnaires. The results indicate that the use of Merge Cubes can reduce cognitive load: the extraneous cognitive load of the experimental group decreased over the course of the intervention, whereas that for the control group increased significantly in comparison. In addition, the germane cognitive load increased slightly in the experimental group, whereas that for the control group decreased. With regard to the intrinsic motivation of the learners, both groups recorded an increase, although the difference between the two groups was not significant. Based on these results, further factors influencing the effect on learning and implications for the practical use of the Merge Cube in the classroom are discussed, the concrete validation of which requires further research. Full article
(This article belongs to the Section Technology Enhanced Education)
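The pre-post-with-control logic behind these comparisons amounts to contrasting change scores between groups. A minimal sketch with invented extraneous-load ratings (not the study's data):

```python
def gain_scores(pre, post):
    """Per-participant change scores (post - pre)."""
    return [b - a for a, b in zip(pre, post)]

def mean(xs):
    return sum(xs) / len(xs)

# Illustrative 1-7 ratings: load falls in the experimental group,
# rises in the control group, mirroring the direction reported above.
exp_pre,  exp_post  = [5, 4, 5, 6], [4, 3, 4, 5]
ctrl_pre, ctrl_post = [5, 4, 4, 5], [6, 5, 5, 6]

exp_gain  = mean(gain_scores(exp_pre, exp_post))
ctrl_gain = mean(gain_scores(ctrl_pre, ctrl_post))
effect = exp_gain - ctrl_gain  # between-group difference in change
```

In the study itself this contrast would be tested inferentially (the abstract reports significance for extraneous load but not for motivation).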

19 pages, 1147 KiB  
Review
A Narrative Review of Haptic Technologies and Their Value for Training, Rehabilitation, and the Education of Persons with Special Needs
by Eloy Irigoyen, Mikel Larrea and Manuel Graña
Sensors 2024, 24(21), 6946; https://doi.org/10.3390/s24216946 - 29 Oct 2024
Cited by 4 | Viewed by 3677
Abstract
Haptic technologies are increasingly valuable for human–computer interaction in its many flavors, including, of course, virtual reality systems, which are becoming very useful tools for education, training, and rehabilitation in many areas of medicine, engineering, and daily life. There is a broad spectrum of technologies and approaches that provide haptic stimuli, ranging from the well-known force feedback to subtle pseudo-haptics and visual haptics. Correspondingly, there is a broad spectrum of applications and system designs that include haptic technologies as a relevant component and interaction feature. Paramount is their use in training of medical procedures, but they appear in a plethora of systems deploying virtual reality applications. This narrative review covers the panorama of haptic devices and approaches and the most salient areas of application. Special emphasis is given to education of persons with special needs, aiming to foster the development of innovative systems and methods addressing the enhancement of the quality of life of this segment of the population. Full article
(This article belongs to the Special Issue Computational Intelligence and Cyberphysical Systems in Sensing)

23 pages, 17790 KiB  
Technical Note
Development of a Modular Adjustable Wearable Haptic Device for XR Applications
by Ali Najm, Domna Banakou and Despina Michael-Grigoriou
Virtual Worlds 2024, 3(4), 436-458; https://doi.org/10.3390/virtualworlds3040024 - 16 Oct 2024
Cited by 3 | Viewed by 3618
Abstract
Current XR applications move beyond audiovisual information, with haptic feedback rapidly gaining ground. However, current haptic devices are still evolving and often struggle to combine key desired features in a balanced way. In this paper, we propose the development of a high-resolution haptic (HRH) system for perception enhancement, a wearable technology designed to augment extended reality (XR) experiences through precise and localized tactile feedback. The HRH system features a modular design with 58 individually addressable actuators, enabling intricate haptic interactions within a compact wearable form. Dual ESP32-S3 microcontrollers and a custom-designed system ensure robust processing and low-latency performance, crucial for real-time applications. Integration with the Unity game engine provides developers with a user-friendly and dynamic environment for accurate, simple control and customization. The modular design, utilizing a flexible PCB, supports a wide range of actuators, enhancing its versatility for various applications. A comparison of our proposed system with existing solutions indicates that the HRH system outperforms other devices by encapsulating several key features, including adjustability, affordability, modularity, and high-resolution feedback. The HRH system not only aims to advance the field of haptic feedback but also introduces an intuitive tool for exploring new methods of human–computer and XR interactions. Future work will focus on refining and exploring the haptic feedback communication methods used to convey information and expand the system’s applications. Full article
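Driving 58 individually addressable actuators typically means mapping each virtual contact point to its nearest actuator. A minimal sketch; the grid layout and coordinates below are hypothetical stand-ins for the device's actual arrangement:

```python
import math

def nearest_actuator(contact, actuators):
    """Return the index of the actuator closest to a 2D contact
    point on the wearable's surface (both in the same local frame)."""
    dists = [math.dist(contact, a) for a in actuators]
    return dists.index(min(dists))

# A hypothetical 4x4 layout (10 mm pitch) standing in for the
# device's 58-actuator arrangement.
grid = [(x * 10.0, y * 10.0) for y in range(4) for x in range(4)]
idx = nearest_actuator((12.0, 21.0), grid)
```

In the real system this lookup would run on the Unity side, with the chosen actuator index and intensity streamed to the ESP32-S3 controllers.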

17 pages, 6147 KiB  
Article
Tactile Simultaneous Localization and Mapping Using Low-Cost, Wearable LiDAR
by John LaRocco, Qudsia Tahmina, John Simonis, Taylor Liang and Yiyao Zhang
Hardware 2024, 2(4), 256-272; https://doi.org/10.3390/hardware2040012 - 29 Sep 2024
Viewed by 1735
Abstract
Tactile maps are widely recognized as useful tools for mobility training and the rehabilitation of visually impaired individuals. However, current tactile maps lack real-time versatility and are limited because of high manufacturing and design costs. In this study, we introduce a device (i.e., ClaySight) that automates the generation of tactile maps, as well as a model for wearable devices that use low-cost laser imaging, detection, and ranging (LiDAR) to improve the immediate spatial knowledge of visually impaired individuals. Our system uses LiDAR sensors to (1) produce affordable, low-latency tactile maps, (2) function as a day-to-day wayfinding aid, and (3) provide interactivity using a wearable device. The system comprises a dynamic mapping and scanning algorithm and an interactive handheld 3D-printed device that houses the hardware. Our algorithm accommodates user specifications to dynamically interact with objects in the surrounding area and create map models that can be represented with haptic feedback or alternative tactile systems. Using economical components and open-source software, the ClaySight system has significant potential to enhance independence and quality of life for the visually impaired. Full article
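One way to picture the scan-to-tactile-map step: bin LiDAR returns into a coarse occupancy grid whose raised cells become the tactile display. This is an assumed simplification for illustration, not ClaySight's published algorithm; cell size and grid dimensions are invented parameters.

```python
import math

def ranges_to_grid(ranges, cell_m=0.5, size=8):
    """Convert a 360-degree LiDAR scan into a coarse occupancy grid
    suitable for rendering as a tactile map (sensor at the center).
    ranges: list of (angle_deg, distance_m) returns."""
    grid = [[0] * size for _ in range(size)]
    half = size // 2
    for angle, dist in ranges:
        # Polar return -> Cartesian offset from the sensor.
        x = dist * math.cos(math.radians(angle))
        y = dist * math.sin(math.radians(angle))
        col, row = half + int(x // cell_m), half + int(y // cell_m)
        if 0 <= row < size and 0 <= col < size:
            grid[row][col] = 1  # raised cell on the tactile display
    return grid

scan = [(0.0, 1.0), (90.0, 0.6)]  # two hits: ahead and to the left
g = ranges_to_grid(scan)
```

Each raised cell could then drive a pin, a vibration zone, or a printed bump, matching the paper's point that the map model can feed alternative tactile systems.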

12 pages, 1080 KiB  
Article
Development and Validation of a Tool for VBOI (Virtual Body Ownership Illusion) Level Assessment
by Gayoung Yoo and Kyungdoh Kim
Appl. Sci. 2024, 14(18), 8432; https://doi.org/10.3390/app14188432 - 19 Sep 2024
Viewed by 1264
Abstract
Virtual Body Ownership Illusion (Virtual BOI) refers to the perceptual, cognitive, and behavioral changes that occur due to the illusion that a virtual body is one’s own actual body. Recent research has focused on inducing the Virtual BOI using various physical conditions of VR environments, such as haptic feedback and 360-degree immersion, among others. The level of Virtual BOI has been recognized as an important factor in VR-based clinical therapy programs where patient immersion is crucial. However, a common issue is the lack of standardized evaluation tools for Virtual BOI, with most experiments relying on ad hoc tools based on experimental conditions or lacking consideration for the physical design elements of VR. To address this, we developed a measurement tool that considers the characteristics of recent VR devices, such as haptics and hand tracking, in the design of experiments and questionnaires. The tool is composed of sub-attributes related to VR technology, including Embodiment, Presence, Visuo-tactile, Visuo-proprioceptive, and Visuo-Motor. Based on a review of the existing literature, we hypothesized that the Virtual BOI scores would vary depending on manipulation methods, viewpoints, and haptic conditions. An experiment was conducted with 39 participants, who performed the same task under four different conditions using a virtual hand. Virtual BOI scores were assessed using the evaluation tool developed for this study. The questionnaire underwent confirmatory factor analysis (CFA), and three items with factor loadings below 0.5 were removed, resulting in a total of 14 items. Each subscale demonstrated high reliability, with Cronbach’s alpha values greater than 0.60. When developing experiments, clinical programs, or VR content related to Virtual BOI, the evaluation tool presented in this study can be used to assess the level of Virtual BOI.
Additionally, by considering technological elements such as haptics and hand tracking, VR environments can be designed to enhance the level of Virtual BOI. Full article
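The reliability figures quoted above come from Cronbach's alpha; for reference, a self-contained computation (the scores below are illustrative, not the study's data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale.
    items: list of per-item score lists, respondents in the same order."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance, as is conventional here
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Each respondent's total across items.
    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var = sum(var(item) for item in items)
    return k / (k - 1) * (1 - item_var / var(totals))

# Three strongly correlated items -> alpha near 1.
scores = [[1, 2, 3, 4], [2, 3, 4, 5], [1, 2, 4, 5]]
alpha = cronbach_alpha(scores)
```

A subscale passes the study's criterion when this value exceeds 0.60.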

24 pages, 1447 KiB  
Review
Effects of Haptic Feedback Interventions in Post-Stroke Gait and Balance Disorders: A Systematic Review and Meta-Analysis
by Maria Gomez-Risquet, Rocío Cáceres-Matos, Eleonora Magni and Carlos Luque-Moreno
J. Pers. Med. 2024, 14(9), 974; https://doi.org/10.3390/jpm14090974 - 14 Sep 2024
Cited by 3 | Viewed by 2581
Abstract
Background: Haptic feedback is an established method to provide sensory information (tactile or kinesthetic) about the performance of an activity that an individual cannot consciously detect. After a stroke, hemiparesis usually leads to gait and balance disorders, where haptic feedback can be a promising approach to promote recovery. The aim of the present study is to understand its potential effects on gait and balance impairments, both after interventions and in terms of immediate effects. Methods: This research was carried out using the following scientific databases: Embase, Scopus, Web of Science, and Medline/PubMed from inception to May 2024. The Checklist for Measuring Quality, the PEDro scale, and the Cochrane collaboration tool were used to assess the methodological quality and risk of bias of the studies. Results: Thirteen articles were chosen for qualitative analysis, with four providing data for the meta-analysis. The findings did not yield definitive evidence on the effectiveness of haptic feedback for treating balance and gait disorders following a stroke. Conclusions: Further research is necessary in order to determine the effectiveness of haptic feedback mechanisms, with larger sample sizes and more robust methodologies. Longer interventions and pre–post designs in gait training with haptic feedback are necessary. Full article
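The quantitative synthesis step in such a meta-analysis is typically inverse-variance pooling. A fixed-effect sketch with invented study effects (the abstract does not report which model or what effect sizes the review used):

```python
def pooled_effect(effects, variances):
    """Fixed-effect inverse-variance meta-analysis: pooled effect
    size and its variance from per-study effects and variances."""
    weights = [1.0 / v for v in variances]
    w_sum = sum(weights)
    est = sum(w * e for w, e in zip(weights, effects)) / w_sum
    return est, 1.0 / w_sum

# Illustrative standardized mean differences from four studies,
# matching the number of studies that entered the meta-analysis.
est, var = pooled_effect([0.4, 0.1, 0.6, 0.3], [0.04, 0.09, 0.16, 0.04])
```

More precise studies (smaller variance) get larger weights, and the pooled variance is always smaller than any single study's, which is why pooling sharpens the estimate.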

22 pages, 9553 KiB  
Article
Application and Assessment of an Experiential Deformation Approach as a Didactive Tool of Truss Structures in Architectural Engineering
by Maristella E. Voutetaki
Educ. Sci. 2024, 14(4), 354; https://doi.org/10.3390/educsci14040354 - 28 Mar 2024
Cited by 2 | Viewed by 1273
Abstract
Experiential learning methods are advantageous for students as they motivate them to comprehend structural concepts without complex calculations, enhancing their inherent understanding of static principles. This research introduces a novel, cost-effective haptic didactic tool to enhance the approach to teaching trusses to students in a School of Architecture. The primary goal is to address challenges associated with the complexities of teaching structural systems within the context of architectural education. The proposed approach addresses the most critical issue: whether the individual elements are in compression or tension under an applied load. The approach explores the deformation of the truss elements and establishes a connection between their visible deformation and the stress they develop under various loads. As a didactic tool, this approach offers an alternative perspective to help students understand truss function under various loads. Also, an assessment procedure of learning outcomes and satisfaction indices has been structured to validate the impact of the proposed educational procedure on students. The findings underscore the significant educational efficiency of the proposed procedure as a sustainable way to connect the structural engineering challenges arising during design courses with creative skills in architectural engineering. Full article
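The compression-or-tension question the tool makes tangible can be cross-checked with elementary statics. A worked example for a symmetric two-bar (A-frame) truss, chosen here purely for illustration; the article's models are not limited to this case:

```python
import math

def two_bar_forces(load_n, angle_deg):
    """Axial force in each bar of a symmetric two-bar truss whose
    apex carries a downward load and whose bars slope down at
    `angle_deg` from the horizontal to the supports.

    Vertical equilibrium at the apex: 2 * N * sin(theta) = P,
    so each bar carries N = P / (2 sin theta), in compression
    (each bar shortens as it pushes back against the load)."""
    n = load_n / (2.0 * math.sin(math.radians(angle_deg)))
    return n, "compression"

# A 1 kN load on bars sloping at 30 degrees.
force, state = two_bar_forces(1000.0, 30.0)
```

Shallower angles make the member force grow rapidly (N → ∞ as the bars flatten), which is exactly the kind of intuition visible deformation conveys without calculation.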

14 pages, 2242 KiB  
Article
Fully Digital Audio Haptic Maps for Individuals with Blindness
by Howard Kaplan and Anna Pyayt
Disabilities 2024, 4(1), 64-78; https://doi.org/10.3390/disabilities4010005 - 9 Jan 2024
Cited by 2 | Viewed by 2605
Abstract
Tactile maps designed for individuals with blindness can greatly improve their mobility, safety and access to new locations. While 3D-printed maps have already been demonstrated to be a powerful tool for delivering spatial information, they might not always be available. Alternatively, a combination of audio and haptic information can be used to efficiently encode 2D maps. In this paper, we discuss the development and user-testing of a novel audio-haptic map creator application. Maps created using this application can provide people with blindness with a tool for understanding the navigational routes and layouts of spaces before physically visiting the site. Thirteen people with blindness tested various components of the virtual map application, such as audio, haptic feedback and navigation controls. Participants’ data and feedback were collected and analyzed to determine the effectiveness of the virtual maps in terms of readability and usability for this user group. The study showed that the application was easy to use and that it efficiently delivered information about travel routes and landmarks that the participants could successfully understand. Full article
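A proximity-triggered audio cue is one plausible building block of such an audio-haptic map; the function and thresholds below are hypothetical, not the application's actual design:

```python
import math

def audio_cue(position, landmarks, radius_m=5.0):
    """Return the spoken cue for the nearest landmark within
    `radius_m` of the user's 2D position, or None if none is close.
    landmarks: list of (name, (x, y)) pairs."""
    in_range = [(math.dist(position, p), name)
                for name, p in landmarks
                if math.dist(position, p) <= radius_m]
    if not in_range:
        return None
    dist, name = min(in_range)
    return f"{name}, {dist:.0f} meters"

# A two-landmark route along a corridor.
route = [("entrance", (0.0, 0.0)), ("elevator", (10.0, 0.0))]
cue = audio_cue((8.0, 0.0), route)
```

In the virtual map, the same landmark positions would also drive haptic feedback as the user's finger or cursor sweeps the layout.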

23 pages, 756 KiB  
Review
The Use of Tactile Sensors in Oral and Maxillofacial Surgery: An Overview
by Pietro Navalesi, Calogero Maria Oddo, Glauco Chisci, Andrea Frosolini, Paolo Gennaro, Vincenzo Abbate, Domenico Prattichizzo and Guido Gabriele
Bioengineering 2023, 10(7), 765; https://doi.org/10.3390/bioengineering10070765 - 26 Jun 2023
Cited by 15 | Viewed by 2922
Abstract
Background: This overview aimed to characterize the type, development, and use of haptic technologies for maxillofacial surgical purposes. The work aims to summarize and evaluate the current advantages, drawbacks, and design choices of the presented technologies for each field of application in order to address and promote future research, as well as to provide a global view of the issue. Methods: Relevant manuscripts were searched electronically through the Scopus, MEDLINE/PubMed, and Cochrane Library databases until 1 November 2022. Results: After analyzing the available literature, 31 articles regarding tactile sensors and interfaces, sensorized tools, haptic technologies, and integrated platforms in oral and maxillofacial surgery have been included. Moreover, a quality rating is provided for each article following appropriate evaluation metrics. Discussion: Many efforts have been made to overcome the technological limits of computer-assisted diagnosis, surgery, and teaching. Nonetheless, a research gap is evident between dental/maxillofacial surgery and other specialties such as endovascular, laparoscopic, and microsurgery, especially concerning electrical and optical sensors for instrumented tools and sensorized tools for contact-force detection. The application of existing technologies is mainly focused on digital simulation purposes, and integration into Computer Assisted Surgery (CAS) is far from being widely adopted. Virtual reality, increasingly adopted in various fields of surgery (e.g., sino-nasal, traumatology, implantology), has shown interesting results and has the potential to revolutionize teaching and learning. A major concern regarding the current state of the art is the absence of randomized controlled trials and the prevalence of case reports, retrospective cohorts, and experimental studies. Nonetheless, as the research is fast growing, we can expect many of these developments to be incorporated into maxillofacial surgery practice after adequate evaluation by the scientific community. Full article
(This article belongs to the Special Issue Recent Advances in Oral and Craniofacial Reconstruction)

17 pages, 9552 KiB  
Article
A Modular 3-Degrees-of-Freedom Force Sensor for Robot-Assisted Minimally Invasive Surgery Research
by Zonghe Chua and Allison M. Okamura
Sensors 2023, 23(11), 5230; https://doi.org/10.3390/s23115230 - 31 May 2023
Cited by 10 | Viewed by 4056
Abstract
Effective force modulation during tissue manipulation is important for ensuring safe, robot-assisted, minimally invasive surgery (RMIS). Strict requirements for in vivo applications have led to prior sensor designs that trade off ease of manufacture and integration against force measurement accuracy along the tool axis. Due to this trade-off, there are no commercial, off-the-shelf, 3-degrees-of-freedom (3DoF) force sensors for RMIS available to researchers. This makes it challenging to develop new approaches to indirect sensing and haptic feedback for bimanual telesurgical manipulation. We present a modular 3DoF force sensor that integrates easily with an existing RMIS tool. We achieve this by relaxing biocompatibility and sterilizability requirements and by using commercial load cells and common electromechanical fabrication techniques. The sensor has a range of ±5 N axially and ±3 N laterally, with errors below 0.15 N and maximum errors below 11% of the sensing range in all directions. During telemanipulation, a pair of jaw-mounted sensors achieved average errors below 0.15 N in all directions and an average grip force error of 0.156 N. The sensor is intended for bimanual haptic feedback and robotic force control in delicate tissue telemanipulation. As an open-source design, the sensors can be adapted to suit other non-RMIS robotic applications. Full article
(This article belongs to the Special Issue Medical Robotics 2022-2023)
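The abstract leaves out the calibration step; a common approach maps raw load-cell readings to a 3DoF force vector through a calibration matrix, F = C·v. A sketch with a hypothetical, purely diagonal calibration (a real sensor's matrix would carry nonzero cross-coupling terms identified during calibration):

```python
def voltages_to_force(volts, calib):
    """Map raw load-cell voltages to a 3DoF force vector via a
    calibration matrix: F = C @ v (plain-Python matrix product).
    calib: 3 x n matrix (rows: Fx, Fy, Fz); volts: length-n list."""
    return [sum(c * v for c, v in zip(row, volts)) for row in calib]

# Hypothetical calibration for three orthogonally mounted cells:
# identity-like gains of 2.5 N/V, no cross-coupling.
C = [[2.5, 0.0, 0.0],
     [0.0, 2.5, 0.0],
     [0.0, 0.0, 2.5]]
fx, fy, fz = voltages_to_force([0.4, -0.2, 1.2], C)
```

The stated error bounds (below 0.15 N, under 11% of range) would be measured after fitting such a matrix against a reference sensor.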
