Search Results (23)

Search Parameters:
Keywords = passive haptics

14 pages, 7196 KB  
Article
Touch to Speak: Real-Time Tactile Pronunciation Feedback for Individuals with Speech and Hearing Impairments
by Anat Sharon, Roi Yozevitch and Eldad Holdengreber
Technologies 2025, 13(8), 345; https://doi.org/10.3390/technologies13080345 - 7 Aug 2025
Viewed by 1124
Abstract
This study presents a wearable haptic feedback system designed to support speech training for individuals with speech and hearing impairments. The system provides real-time tactile cues based on detected phonemes, helping users correct their pronunciation independently. Unlike prior approaches focused on passive reception or therapist-led instruction, our method enables active, phoneme-level feedback using a multimodal interface combining audio input, visual reference, and spatially mapped vibrotactile output. We validated the system through three user studies measuring pronunciation accuracy, phoneme discrimination, and learning over time. The results show a significant improvement in word articulation accuracy and user engagement. These findings highlight the potential of real-time haptic pronunciation tools as accessible, scalable aids for speech rehabilitation and second-language learning. Full article
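
As a purely hypothetical illustration of how detected phonemes might be translated into spatially mapped vibrotactile cues of the kind this abstract describes, the short Python sketch below associates a few phonemes with actuator positions, intensities, and durations; the phoneme set, actuator layout, and feedback rule are invented for illustration and are not taken from the paper.

```python
# Hypothetical sketch: mapping detected phonemes to spatially distributed
# vibrotactile cues. The phoneme set, actuator layout, and parameters are
# invented for illustration and are not taken from the paper.
from dataclasses import dataclass
from typing import Optional

@dataclass
class TactileCue:
    actuator: int        # index of the vibrotactile motor on the wearable
    intensity: float     # normalized drive amplitude, 0.0 to 1.0
    duration_ms: int     # pulse length in milliseconds

PHONEME_TO_CUE = {
    "s":  TactileCue(actuator=0, intensity=0.8, duration_ms=120),
    "sh": TactileCue(actuator=1, intensity=0.8, duration_ms=120),
    "b":  TactileCue(actuator=2, intensity=0.5, duration_ms=60),
    "p":  TactileCue(actuator=3, intensity=0.5, duration_ms=60),
}

def feedback_for(detected: str, target: str) -> Optional[TactileCue]:
    """Return a corrective cue only when the detected phoneme misses the target."""
    if detected == target:
        return None                      # pronunciation matched, no cue needed
    return PHONEME_TO_CUE.get(target)    # cue points the user toward the target

print(feedback_for("s", "sh"))
```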

30 pages, 14074 KB  
Review
Recent Advances in Wearable Thermal Devices for Virtual and Augmented Reality
by Minsu Park
Micromachines 2025, 16(4), 383; https://doi.org/10.3390/mi16040383 - 27 Mar 2025
Cited by 1 | Viewed by 1798
Abstract
Thermal technologies that effectively deliver thermal stimulation through skin-integrated systems and enable temperature perception via the activation of cutaneous thermoreceptors are key to enhancing immersive experiences in virtual and augmented reality (VR/AR) through multisensory engagement. However, recent advancements and commercial adoption have predominantly focused on haptic rather than thermal technology. This review provides an overview of recent advancements in wearable thermal devices (WTDs) designed to reconstruct artificial thermal sensations for VR/AR applications. It examines key thermal stimulation parameters, including stimulation area, magnitude, and duration, with a focus on thermal perception mechanisms and thermoreceptor distribution in the skin. Input power requirements for surpassing thermal perception thresholds are discussed based on analytical modeling. Material choices for WTDs, including metal nanowires, carbon nanotubes, liquid metals, thermoelectric devices, and passive cooling elements, are introduced. The functionalities, device designs, operation modes, fabrication processes, and electrical and mechanical properties of various WTDs are analyzed. Representative applications illustrate how flexible, thin WTDs enable immersive VR/AR experiences through spatiotemporal, programmable stimulation. A concluding section summarizes key challenges and future opportunities in advancing skin–integrated VR/AR systems. Full article
(This article belongs to the Section E: Engineering and Technology)

13 pages, 1754 KB  
Article
Cross-Modal Interactions and Movement-Related Tactile Gating: The Role of Vision
by Maria Casado-Palacios, Alessia Tonelli, Claudio Campus and Monica Gori
Brain Sci. 2025, 15(3), 288; https://doi.org/10.3390/brainsci15030288 - 8 Mar 2025
Cited by 1 | Viewed by 1549
Abstract
Background: When engaging with the environment, multisensory cues interact and are integrated to create a coherent representation of the world around us, a process that has been suggested to be affected by the lack of visual feedback in blind individuals. In addition, the presence of voluntary movement can be responsible for suppressing somatosensory information processed by the cortex, which might lead to a worse encoding of tactile information. Objectives: In this work, we aim to explore how cross-modal interaction can be affected by active movements and the role of vision in this process. Methods: To this end, we measured the precision of 18 blind individuals and 18 age-matched sighted controls in a velocity discrimination task. The participants were instructed to detect the faster stimulus between a sequence of two in both passive and active touch conditions. The sensory stimulation could be either just tactile or audio–tactile, where a non-informative sound co-occurred with the tactile stimulation. The measure of precision was obtained by computing the just noticeable difference (JND) of each participant. Results: The results show worse precision with the audio–tactile sensory stimulation in the active condition for the sighted group (p = 0.046) but not for the blind one (p = 0.513). For blind participants, only the movement itself had an effect. Conclusions: For sighted individuals, the presence of noise from active touch made them vulnerable to auditory interference. However, the blind group exhibited less sensory interaction, experiencing only the detrimental effect of movement. Our work should be considered when developing next-generation haptic devices. Full article
(This article belongs to the Special Issue Multisensory Perception of the Body and Its Movement)
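
For readers unfamiliar with how a just noticeable difference (JND) is obtained from a two-interval velocity discrimination task like the one above, the following Python sketch fits a cumulative-Gaussian psychometric function to the proportion of "comparison faster" responses; the data, the 10 cm/s standard, and the 75%-correct criterion are illustrative assumptions rather than values from the study.

```python
# Illustrative sketch (not from the study): estimating a JND by fitting a
# cumulative-Gaussian psychometric function to velocity-discrimination data.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(v, pse, sigma):
    """Probability of judging the comparison faster, given its velocity v."""
    return norm.cdf(v, loc=pse, scale=sigma)

# Hypothetical data: comparison velocities (cm/s) and proportion of
# "comparison faster" responses against an assumed 10 cm/s standard.
velocities = np.array([6.0, 8.0, 9.0, 10.0, 11.0, 12.0, 14.0])
p_faster   = np.array([0.05, 0.20, 0.35, 0.50, 0.70, 0.85, 0.97])

(pse, sigma), _ = curve_fit(psychometric, velocities, p_faster, p0=[10.0, 2.0])

# One common convention: the JND is the velocity change that takes performance
# from 50% to 75% "faster" responses, i.e. 0.674 * sigma of the fitted curve.
jnd = norm.ppf(0.75) * sigma
print(f"PSE = {pse:.2f} cm/s, JND = {jnd:.2f} cm/s")
```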

22 pages, 1470 KB  
Review
Enhancing Presence, Immersion, and Interaction in Multisensory Experiences Through Touch and Haptic Feedback
by Yang Gao and Charles Spence
Virtual Worlds 2025, 4(1), 3; https://doi.org/10.3390/virtualworlds4010003 - 13 Jan 2025
Cited by 8 | Viewed by 8089
Abstract
In this narrative historical review, we take a closer look at the role of tactile/haptic stimulation in enhancing people’s immersion (and sense of presence) in a variety of entertainment experiences, including virtual reality (VR). An important distinction is highlighted between those situations in which digital tactile stimulation and/or haptic feedback are delivered to those (i.e., users/audience members) who passively experience the stimulation and those cases, including VR, where the user actively controls some aspects of the tactile stimulation/haptic feedback that they happen to be experiencing. A further distinction is drawn between visual and/or auditory VR, where some form of tactile/haptic stimulation is added, and what might be classed as genuinely haptic VR, where the active user/player experiences tactile/haptic stimulation that is effortlessly interpreted in terms of the objects and actions in the virtual world. We review the experimental evidence that has assessed the impact of adding a tactile/haptic element to entertainment experiences, including those in VR. Finally, we highlight some of the key challenges to the growth of haptic VR in the context of multisensory entertainment experiences: these include those of a technical, financial, psychological (namely, the fact that tactile/haptic stimulation often needs to be interpreted and can reduce the sense of immersion in many situations), psycho-physiological (such as sensory overload or fatigue), physiological (e.g., relating to the large surface area of the skin that can potentially be stimulated), and creative/artistic nature. Full article

16 pages, 22837 KB  
Article
Learning to Walk with Adaptive Feet
by Antonello Scaldaferri, Franco Angelini and Manolo Garabini
Robotics 2024, 13(8), 113; https://doi.org/10.3390/robotics13080113 - 24 Jul 2024
Cited by 3 | Viewed by 2464
Abstract
In recent years, tasks regarding autonomous mobility have favored the use of legged robots over wheeled ones thanks to their higher mobility on rough and uneven terrains. This comes at the cost of more complex motion planners and controllers to ensure robot stability and balance. However, in the case of quadrupedal robots, balancing is simpler than it is for bipeds thanks to their larger support polygons. Until a few years ago, most scientists and engineers addressed the quadrupedal locomotion problem with model-based approaches, which require a great deal of modeling expertise. A new trend is the use of data-driven methods, which seem to be quite promising and have shown great results. These methods do not require any modeling effort, but they suffer from computational limitations dictated by the hardware resources used. However, only the design phase of these algorithms (controller training) requires large computing resources; their execution in the operational phase (deployment) takes place in real time on common processors. Moreover, adaptive feet capable of sensing terrain profile information have been designed and have shown great performance. Still, no dynamic locomotion control method has been specifically designed to leverage the advantages and supplementary information provided by such adaptive feet. In this work, we investigate and evaluate the performance of different end-to-end control policies, trained via reinforcement learning, that are specifically designed for quadrupedal robots equipped with passive adaptive feet and for their dynamic locomotion control over a diverse set of terrains. We examine how adding haptic perception of the terrain affects locomotion performance. Full article
(This article belongs to the Special Issue Applications of Neural Networks in Robot Control)

16 pages, 10173 KB  
Article
Pipe Organ Design Including the Passive Haptic Feedback Technology and Measurement Analysis of Key Displacement, Pressure Force and Sound Organ Pipe
by Paweł Kowol, Pawel Nowak, Luca Di Nunzio, Gian Carlo Cardarilli, Giacomo Capizzi and Grazia Lo Sciuto
Appl. Syst. Innov. 2024, 7(3), 37; https://doi.org/10.3390/asi7030037 - 28 Apr 2024
Viewed by 3771
Abstract
In this work, an organ pipe instrument with a mechatronic control system including Passive Haptic Feedback technology is implemented. The test bed consists of a motorized positioning stage mounted to a brace that is attached to a bridge on a platform. A simple pneumatic mechanism is designed and realized to achieve the same dynamic pressure for each measurement attempt on the keyboard. This system contains pipes, an air compressor, valves, and a piston used to apply the pressing force on the keyboard of the organ. The pneumatic components, such as valves and pressure regulators, mounted on the profile plate are connected to the main air supply line via flexible tubing or hoses to the air compressor and mechanical trucker. The pneumatic system has several types of valves that regulate the air speed, air flow, and power. The combination of valves and air compressor controls the air flow, the piston mechanism, and the pressure on the keyboard. The mechanical actuator presses the key to be tested, and a load cell detects the applied key force. A laser triangulation measurement system based on a Laser Displacement Sensor measures the displacement of the key during depression. The velocity of the key motion is controlled by the pneumatic actuator. A miniature strain gauge load cell, mounted on a musical keyboard key, measures the contact force between the probe and the key. In addition, the quality of the audio signal generated by the organ instrument is estimated using the Hilbert transform. Full article
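
The abstract mentions estimating audio signal quality with the Hilbert transform; a minimal Python sketch of how an amplitude envelope and instantaneous frequency can be extracted from a recorded tone via the analytic signal is shown below. The sampling rate and the synthetic test tone are assumptions, not data from the paper.

```python
# Illustrative sketch: amplitude envelope and instantaneous frequency of a tone
# via the Hilbert transform (analytic signal). Not the authors' implementation.
import numpy as np
from scipy.signal import hilbert

fs = 44_100                                   # sampling rate in Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)
# Hypothetical stand-in for a recorded pipe tone: a decaying 440 Hz sinusoid.
tone = np.sin(2 * np.pi * 440 * t) * np.exp(-t)

analytic = hilbert(tone)                      # analytic signal x + j * H{x}
envelope = np.abs(analytic)                   # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2 * np.pi) # instantaneous frequency in Hz

print(f"peak envelope: {envelope.max():.3f}, "
      f"mean instantaneous frequency: {inst_freq.mean():.1f} Hz")
```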

19 pages, 2174 KB  
Article
Teleoperated Surgical Robot with Adaptive Interactive Control Architecture for Tissue Identification
by Yubo Sheng, Haoyuan Cheng, Yiwei Wang, Huan Zhao and Han Ding
Bioengineering 2023, 10(10), 1157; https://doi.org/10.3390/bioengineering10101157 - 2 Oct 2023
Cited by 4 | Viewed by 3342
Abstract
The remote perception of teleoperated surgical robots has been a critical issue for surgeons in fulfilling their remote manipulation tasks. In this article, an adaptive teleoperation control framework is proposed. It provides a physical human–robot interaction interface to enhance the operator's ability to intuitively perceive the material properties of remote objects. Recursive least squares (RLS) is adopted to estimate the required human hand stiffness that the operator can achieve to compensate for the contact force. Based on the estimated stiffness, a force feedback controller is designed to avoid induced motion and to convey the haptic information of the slave side. The passivity of the proposed teleoperation system is ensured by a virtual energy tank. A stable contact test validated that the proposed method achieved stable contact between the slave robot and a hard environment while ensuring the transparency of the force feedback. A series of human subject experiments was conducted to verify empirically that, compared to the baseline method, the proposed teleoperation framework provides a smoother, more dexterous, and more intuitive user experience with a more accurate perception of the mechanical properties of the interacted material on the slave side. Finally, the design rationale of the force feedback controller for bilateral teleoperation is discussed. Full article
(This article belongs to the Special Issue Robotics in Medical Engineering)
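
Recursive least squares (RLS), mentioned in the abstract for estimating hand stiffness, can be illustrated with the minimal scalar sketch below, which estimates a stiffness coefficient from streaming force and displacement samples under the simple linear model force = k * displacement; the forgetting factor, initial covariance, and sample values are assumptions for illustration, not the paper's controller.

```python
# Illustrative sketch: scalar recursive least squares (RLS) estimating a
# stiffness coefficient k in the simple model  force = k * displacement.
class ScalarRLS:
    def __init__(self, k0=0.0, p0=1e8, forgetting=0.99):
        self.k = k0            # current stiffness estimate (N/m)
        self.p = p0            # estimate covariance (large = uninformative prior)
        self.lam = forgetting  # forgetting factor, allows slowly varying stiffness

    def update(self, displacement, force):
        phi = displacement
        gain = self.p * phi / (self.lam + phi * self.p * phi)
        self.k += gain * (force - self.k * phi)        # correct with prediction error
        self.p = (self.p - gain * phi * self.p) / self.lam
        return self.k

rls = ScalarRLS()
# Hypothetical streaming samples (m, N) drawn around a true stiffness of ~500 N/m.
for x, f in [(0.001, 0.52), (0.002, 1.01), (0.003, 1.49), (0.004, 2.03)]:
    k_hat = rls.update(x, f)
print(f"estimated stiffness: {k_hat:.1f} N/m")
```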

19 pages, 4197 KB  
Article
Consumer Subjective Impressions in Virtual Reality Environments: The Role of the Visualization Technique in Product Evaluation
by Almudena Palacios-Ibáñez, Francisco Felip-Miralles, Julia Galán, Carlos García-García and Manuel Contero
Electronics 2023, 12(14), 3051; https://doi.org/10.3390/electronics12143051 - 12 Jul 2023
Cited by 7 | Viewed by 1774
Abstract
The availability and affordability of consumer virtual reality (VR) devices have fueled their adoption during the product design process. High-fidelity virtual prototypes can be created more quickly and more cost-effectively than with traditional methods, but certain product features are still difficult to evaluate, resulting in perceptual differences when a product is assessed using different visualization techniques. In this paper, we report two case studies in which a group of participants evaluated different designs of a product typology (i.e., a watering can) presented in VR, VR with passive haptics (VRPH), and a real setting (R) for the first case study, and in VR and R for the second case study. The semantic differential technique was used for product evaluation, and an inferential statistical method using aligned rank transform (ART) procedures was applied to determine perceptual differences between groups. Our results showed that product characteristics assessed by touch are the most susceptible to being affected by the environment, while the user's background can have an effect on some product features. Full article
(This article belongs to the Special Issue Perception and Interaction in Mixed, Augmented, and Virtual Reality)

23 pages, 6366 KB  
Article
Perceptual Relevance of Haptic Feedback during Virtual Plucking, Bowing and Rubbing of Physically-Based Musical Resonators
by Marius George Onofrei, Federico Fontana and Stefania Serafin
Arts 2023, 12(4), 144; https://doi.org/10.3390/arts12040144 - 7 Jul 2023
Cited by 2 | Viewed by 2158
Abstract
The physics-based design and realization of a digital musical interface calls for modeling and implementing the contact-point interaction with the performer. Musical instruments always include a resonator that converts the input energy into sound, meanwhile feeding part of it back to the performer through the same point. During plucking or bowing interactions in particular, musicians receive a wealth of information from the force feedback and vibrations coming from the contact points. This paper focuses on the design and realization of digital music interfaces realizing these two physical interactions along with a musically unconventional one, rubbing, which has rarely been encountered in comparable forms on instruments across the centuries. In doing so, it aims to highlight the significance of haptic rendering in improving the quality of a musical experience, as opposed to interfaces that provide only a passive contact point. Current challenges are posed by the specific requirements of the haptic device, as well as by the computational effort needed to realize such interactions without typical digital artifacts, such as latency and model instability, occurring during the performance. Both challenges, however, appear transitory thanks to the constant evolution of computer systems for virtual reality and the progressive popularization of haptic interfaces in the sonic interaction design community. In summary, our results speak in favor of adopting today's haptic technologies as an essential component of digital musical interfaces affording point-wise contact interactions in the personal performance space. Full article
(This article belongs to the Special Issue Feeling the Future—Haptic Audio)

18 pages, 12239 KB  
Article
A Novel Master–Slave Interventional Surgery Robot with Force Feedback and Collaborative Operation
by Yu Song, Liutao Li, Yu Tian, Zhiwei Li and Xuanchun Yin
Sensors 2023, 23(7), 3584; https://doi.org/10.3390/s23073584 - 29 Mar 2023
Cited by 8 | Viewed by 4048
Abstract
In recent years, master–slave vascular robots have been developed to address the problem of radiation exposure to surgeons during vascular interventions. However, visual feedback alone reduces surgeon immersion and the transparency of the system. In this work, we have developed a haptic interface based on magnetorheological fluid (MRF) on the master side. The haptic interface can provide a passive feedback force with high force fidelity and low inertia. Additionally, manipulating the master device does not change the operating posture of traditional surgery, which allows the surgeon to better adapt to the robotic system. For the slave robot, the catheter and guidewire can be navigated simultaneously, allowing two degrees of action on the catheter and axial action on the guidewire. The resistance force of the catheter navigation is measured and reflected to the user through the master haptic interface. To verify the proposed master–slave robotic system, evaluation experiments were carried out in vitro, and the effectiveness of the system was demonstrated experimentally. Full article
(This article belongs to the Special Issue Sensors and Imaging for Medical Robotics)

17 pages, 2567 KB  
Article
Touch Matters: The Impact of Physical Contact on Haptic Product Perception in Virtual Reality
by Francisco Felip, Julia Galán, Manuel Contero and Carlos García-García
Appl. Sci. 2023, 13(4), 2649; https://doi.org/10.3390/app13042649 - 18 Feb 2023
Cited by 10 | Viewed by 3825
Abstract
Nowadays, the presentation of products through virtual reality and other online media coexists with traditional means. However, while some products may be perceived correctly in digital media, others may need physical contact. In this scenario, this work analyses how presenting a product highlighted for its haptic properties and the presence or absence of physical contact during the presentation can influence the perception of its attributes and stimulate purchase intention. To this end, an experiment was designed in which each participant viewed and interacted with a chair presented in five different means that elicited a greater or lesser sense of presence. Participants evaluated the product’s attributes on a semantic scale with bipolar pairs. No relation was found between the presentation means and users’ purchase intention. However, results showed significant differences in the evaluation of some physical characteristics depending on the presentation means, and the product was generally more liked when presented in means in which it could be touched. We conclude that choosing means that allow a product to be touched and elicit a greater sense of presence may impact more positively on evaluations of haptic features when presenting a product with high haptic importance. Full article
(This article belongs to the Special Issue User Experience in Extended Reality)

13 pages, 3378 KB  
Article
Exploring the Effect of Virtual Environments on Passive Haptic Perception
by Daehwan Kim, Yongwan Kim and Dongsik Jo
Appl. Sci. 2023, 13(1), 299; https://doi.org/10.3390/app13010299 - 26 Dec 2022
Cited by 9 | Viewed by 3788
Abstract
Recent advances in virtual reality (VR) technologies, such as immersive head-mounted displays (HMDs), sensing devices, and 3D-printed props, have made it much more feasible to provide improved experiences for users in virtual environments. In particular, research on haptic feedback is being actively conducted to enhance the effect of controlling virtual objects. Studies have begun to use real objects that resemble virtual objects, i.e., passive haptics, instead of using haptic equipment with motor control, as an effective method that allows natural interaction. However, technical difficulties must be resolved to match transformations (e.g., position, orientation, and scale) between virtual and real objects to maximize the user's immersion. In this paper, we compare and explore the effect of passive haptic parameters on the user's perception by using different transformation conditions in immersive virtual environments. Our experimental study shows that the participants felt the same within a certain range, which seems to support the "minimum cue" theory in giving sufficient sensory stimulation. Thus, considering the benefits of the model using our approach, haptic interaction in VR content can be developed in a more economical way. Full article
(This article belongs to the Special Issue Future Information & Communication Engineering 2022)

14 pages, 6437 KB  
Article
A Cost-Effective, Integrated Haptic Device for an Exoskeletal System
by Maciej Rećko, Kazimierz Dzierżek, Rafał Grądzki and Jozef Živčák
Sensors 2022, 22(23), 9508; https://doi.org/10.3390/s22239508 - 5 Dec 2022
Cited by 1 | Viewed by 2176
Abstract
The paper presents an innovative integrated sensor-effector designed for use in exoskeletal haptic devices. The research efforts aimed to achieve high cost-effectiveness for a design assuring proper monitoring of joint rotations and providing passive force feedback. A review of market products revealed that there is space for new designs of haptic devices with such features. To determine the feasibility of the proposed solution, a series of simulations and experiments were conducted to verify the adopted design concept. The focus was set on an investigation of the force of attraction between one and two magnets interacting with a steel plate. Further, a physical model of an integrated joint was fabricated, and its performance was evaluated and compared to a similar commercially available device. The proposed solution is cost-effective due to the use of standard parts and inexpensive components. However, it is light and assures a 19 Nm braking torque adequate for the intended use as a haptic device for upper limbs. Full article
(This article belongs to the Special Issue Challenges and Future Trends of Wearable Robotics)

16 pages, 4939 KB  
Article
Improving the Force Display of Haptic Device Based on Gravity Compensation for Surgical Robotics
by Lixing Jin, Xingguang Duan, Rui He, Fansheng Meng and Changsheng Li
Machines 2022, 10(10), 903; https://doi.org/10.3390/machines10100903 - 7 Oct 2022
Cited by 5 | Viewed by 2788
Abstract
Haptic devices are applied as masters to provide force displays for telemedical robots. Gravity compensation has been proven to be crucial for the accuracy and capability of the force display, which is critical for haptic devices assisting operators. However, existing methods suffer from unsatisfactory effects, complex implementation, and low efficiency. In this paper, an approach combining active and passive gravity compensation is proposed to improve the performance of the force display. The passive compensation is carried out by counterweights fixed to the moving platform and pantographs to offset most of the gravity and reduce the loads on the motors, while the peak capability of the force display is enhanced. The required weight is optimized by a multi-objective genetic algorithm in terms of the maximum torque of the motors over the global workspace. As a supplement, the residual gravity is eliminated by active compensation to extend the accuracy of the force display. The balancing forces in the discretized workspace are fully calibrated, and the required force for an arbitrary configuration is calculated by interpolation. The choice of algorithm parameters is also discussed to achieve a compromise between effectiveness and elapsed time. Finally, a prototype with the compensation mechanism is implemented and experiments are carried out to verify the performance of the proposed method. The results show that the peak capability of the force display is enhanced by 45.43% and the maximum deviation is lowered to 0.6 N. Full article
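
To illustrate the interpolation step described in the abstract, in which balancing forces calibrated on a discretized workspace are interpolated for arbitrary configurations, the sketch below uses a regular-grid interpolator over a synthetic two-joint calibration table; the grid resolution and force values are invented.

```python
# Illustrative sketch: interpolating calibrated gravity-balancing forces over a
# discretized two-joint workspace. The calibration table here is synthetic.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Joint grid (radians) at which balancing forces (N) were hypothetically calibrated.
q1 = np.linspace(-1.0, 1.0, 11)
q2 = np.linspace(-1.0, 1.0, 11)
Q1, Q2 = np.meshgrid(q1, q2, indexing="ij")
balancing_force = 2.0 * np.cos(Q1) + 1.5 * np.cos(Q1 + Q2)   # stand-in calibration

interp = RegularGridInterpolator((q1, q2), balancing_force)

# Residual-gravity compensation force for an arbitrary configuration.
q = np.array([0.37, -0.82])
print(f"balancing force at q = {q}: {interp(q)[0]:.2f} N")
```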

16 pages, 8066 KB  
Article
The Identification of Non-Driving Activities with Associated Implication on the Take-Over Process
by Lichao Yang, Mahdi Babayi Semiromi, Yang Xing, Chen Lv, James Brighton and Yifan Zhao
Sensors 2022, 22(1), 42; https://doi.org/10.3390/s22010042 - 22 Dec 2021
Cited by 7 | Viewed by 3539
Abstract
In conditionally automated driving, the engagement in non-driving activities (NDAs) can be regarded as the main factor affecting the driver's take-over performance, and its investigation is of great importance for the design of an intelligent human–machine interface for a safe and smooth control transition. This paper introduces a 3D convolutional neural network-based system to recognize six types of driver behaviour (four types of NDAs and two types of driving activities) through two video feeds based on head and hand movement. Based on the interaction between driver and object, the selected NDAs are divided into an active mode and a passive mode. The proposed recognition system achieves 85.87% accuracy for the classification of the six activities. The impact of NDAs on the driver's situation awareness and take-over quality is further investigated in terms of both activity type and interaction mode. The results show that, at a similar level of maximum lateral error, engagement in NDAs demands more time for drivers to accomplish the control transition, especially for active-mode NDAs, which are more mentally demanding and reduce drivers' sensitivity to changes in the driving situation. Moreover, haptic feedback torque from the steering wheel could help to reduce the duration of the transition process and can be regarded as a productive assistance system for the take-over process. Full article
(This article belongs to the Special Issue Artificial Intelligence Based Autonomous Vehicles)
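
As a rough illustration of the kind of model the abstract refers to, the sketch below defines a small 3D-convolutional classifier over short video clips with six output classes; the layer sizes, clip length, and input resolution are assumptions, not the authors' architecture.

```python
# Illustrative sketch (PyTorch): a small 3D CNN that classifies short video
# clips into six driver-activity classes. Not the authors' architecture.
import torch
import torch.nn as nn

class Small3DCNN(nn.Module):
    def __init__(self, num_classes=6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=3, padding=1),   # spatio-temporal conv
            nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),                       # global pooling
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, clip):                               # clip: (N, 3, T, H, W)
        return self.classifier(self.features(clip).flatten(1))

model = Small3DCNN()
clip = torch.randn(2, 3, 16, 112, 112)   # batch of two 16-frame RGB clips
logits = model(clip)                      # shape: (2, 6)
print(logits.shape)
```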
