Search Results (4)

Search Parameters:
Keywords = self-triggered touch

21 pages, 2871 KB  
Article
From Signal to Semantics: The Multimodal Haptic Informatics Index for Triangulating Haptic Intent at the Edge
by Song Xu, Chen Li, Jia-Rong Li and Teng-Wen Chang
Electronics 2026, 15(4), 832; https://doi.org/10.3390/electronics15040832 - 15 Feb 2026
Viewed by 355
Abstract
Modern interaction with smart devices is hindered by the “Midas Touch” problem, where sensors frequently misinterpret incidental physical movements as intentional commands due to a lack of human context. This research addresses this conflict by introducing the Multimodal Haptic Informatics (MHI) index within a novel Scene–Action–Trigger (SAT) framework. The goal is to contextualize mechanical movements as human intent by integrating physical, spatial, and cognitive data locally at the edge. The methodology employs an “Action-as-primary indexing” mechanism where the Action channel (IMU) serves as a temporal anchor t, triggering high-resolution Scene (computer vision) and Trigger (audio) processing only during critical haptic events. Validated through a complex origami crane task generating 29,408 data frames, the framework utilizes a three-stage informatics derivation process: single-modal scoring, score weighting, and hand state mapping. Results demonstrate that applying an adaptive “Speedometer” logic successfully reclassifies the “Transitional State”. While this state constitutes over half of the behavioral dataset (54.76% on average), it is effectively disambiguated into meaningful intent using a self-trained local Large Language Model (LLM) for semantic verification. Furthermore, the event-driven sampling of 93 keyframes reduces the processing overhead by 99.68% compared to linear annotation. This study contributes a low-latency, privacy-preserving “Protocol of Assent” that maintains user agency by providing intelligent system suggestions based on confirmed haptic intensity.
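As a sanity check on the reported figure, the 99.68% overhead reduction follows directly from the frame counts given in the abstract (93 event-driven keyframes out of 29,408 data frames). The helper below is a minimal sketch of that arithmetic, not code from the paper.

```python
def overhead_reduction(total_frames: int, keyframes: int) -> float:
    """Percentage of frames skipped by event-driven sampling,
    relative to annotating every frame linearly."""
    return 100.0 * (1.0 - keyframes / total_frames)

# Frame counts reported in the abstract: 29,408 data frames,
# of which 93 are event-driven keyframes.
reduction = overhead_reduction(29_408, 93)
print(f"{reduction:.2f}%")  # → 99.68%
```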
(This article belongs to the Special Issue New Trends in Human-Computer Interactions for Smart Devices)

13 pages, 1260 KB  
Article
Perceptual Sensitivity to Tactile Stimuli Is Enhanced When One Is Aware That Tactile Stimulus Intensity Is Self-Controlled
by Hitoshi Oda, Shiho Fukuda, Hiroshi Kunimura, Taku Kawasaki, Han Gao, Moritaka Futamura and Koichi Hiraoka
Brain Sci. 2025, 15(3), 231; https://doi.org/10.3390/brainsci15030231 - 22 Feb 2025
Cited by 1 | Viewed by 2827
Abstract
Background and objectives: This study investigated whether perceptual sensitivity to a tactile stimulus is affected by awareness of who controls the stimulus intensity. Methods: Thirteen healthy participants took part. The participant held one dial and an experimenter held another; one dial controlled the intensity of the tactile stimulus while the other (a dummy dial) was inactive. The intensity of the tactile stimulus, delivered to the participant’s index finger every 1 s, was increased either by the participant or by someone else, with or without the participant viewing the dial controlling the stimulus intensity. Results and conclusions: The stimulus intensity at the perceptual threshold was significantly lower when controlled by the participant than when controlled by someone else, regardless of visual availability, indicating that awareness of self-control over the tactile stimulus intensity enhances tactile sensitivity. With the eyes closed, the electrodermal level immediately preceding the stimulus at the perceptual threshold was significantly lower when the participant controlled the stimulus intensity, and significantly higher when the stimulus was triggered by another person. These electrodermal findings suggest that cognitive stress is greater when the timing of the initial tactile perception is difficult to predict.
(This article belongs to the Section Sensory and Motor Neuroscience)

14 pages, 5143 KB  
Article
A Self-Powered, Skin Adhesive, and Flexible Human–Machine Interface Based on Triboelectric Nanogenerator
by Xujie Wu, Ziyi Yang, Yu Dong, Lijing Teng, Dan Li, Hang Han, Simian Zhu, Xiaomin Sun, Zhu Zeng, Xiangyu Zeng and Qiang Zheng
Nanomaterials 2024, 14(16), 1365; https://doi.org/10.3390/nano14161365 - 20 Aug 2024
Cited by 10 | Viewed by 3126
Abstract
Human–machine interactions (HMIs) have penetrated various academic and industrial fields, such as robotics, virtual reality, and wearable electronics. However, the practical application of most human–machine interfaces faces notable obstacles due to their complex structure and materials, high power consumption, limited effective skin adhesion, and high cost. Herein, we report a self-powered, skin-adhesive, and flexible human–machine interface based on a triboelectric nanogenerator (SSFHMI). Characterized by its simple structure and low cost, the SSFHMI can easily convert touch stimuli into a stable electrical signal at the trigger pressure of a finger touch, without requiring an external power supply. A skeleton spacer has been specially designed to increase the stability and homogeneity of the output signals of each TENG unit and to prevent crosstalk between them. Moreover, we constructed a hydrogel adhesive interface with skin-adhesive properties to adapt to easy wear on complex human body surfaces. By integrating the SSFHMI with a microcontroller, a programmable touch operation platform has been constructed that is capable of multiple interactions, including medical calling, music media playback, security unlocking, and electronic piano playing. This self-powered, cost-effective SSFHMI holds potential relevance for the next generation of highly integrated and sustainable portable smart electronic products and applications.
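The programmable touch platform described here follows a common pattern: each TENG unit emits a voltage spike when touched, and the microcontroller maps the triggered unit to an action. The sketch below illustrates that dispatch pattern only; the unit numbering, the threshold value, and the action names are illustrative assumptions, not details from the paper.

```python
# Assumed trigger level in volts; a real device would calibrate this
# against the TENG units' actual open-circuit output.
TOUCH_THRESHOLD_V = 1.5

# Hypothetical mapping from TENG unit index to platform action,
# loosely following the interactions listed in the abstract.
ACTIONS = {
    0: "medical_call",
    1: "media_play_pause",
    2: "security_unlock",
    3: "piano_note",
}

def dispatch(readings: dict[int, float]) -> list[str]:
    """Return the actions for every known TENG unit whose
    instantaneous output voltage exceeds the touch threshold."""
    return [ACTIONS[unit]
            for unit, volts in sorted(readings.items())
            if unit in ACTIONS and volts >= TOUCH_THRESHOLD_V]

# Example: unit 2 is pressed; the other units read near-zero noise.
print(dispatch({0: 0.02, 1: 0.0, 2: 2.3, 3: 0.05}))  # → ['security_unlock']
```

Because the TENG units are self-powered, a firmware loop would only need to wake on such above-threshold spikes rather than continuously polling powered sensors.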
(This article belongs to the Special Issue Self-Powered Flexible Sensors Based on Triboelectric Nanogenerators)

19 pages, 4267 KB  
Review
Recent Progress in Self-Powered Skin Sensors
by Jihong Rao, Zetong Chen, Danna Zhao, Yajiang Yin, Xiaofeng Wang and Fang Yi
Sensors 2019, 19(12), 2763; https://doi.org/10.3390/s19122763 - 19 Jun 2019
Cited by 40 | Viewed by 7700
Abstract
Self-powered skin sensors have attracted significant attention in recent years due to their great potential in medical care, robotics, prosthetics, and sports. More importantly, self-powered skin sensors do not need any energy-supply components such as batteries, which allows them to work sustainably and spares them the trouble of battery replacement. Self-powered skin sensors are mainly based on energy harvesters, with the device itself generating electrical signals when triggered by the detected stimulus or analyte, such as body motion, touch/pressure, acoustic sound, or chemicals in sweat. Herein, the recent research achievements of self-powered skin sensors are comprehensively and systematically reviewed. The sensors are categorized by the signals they monitor and discussed with a focus on the working mechanism, device structure, and sensing principle. Based on this recent progress, the key challenges that remain and the opportunities that lie ahead are also discussed.
(This article belongs to the Section State-of-the-Art Sensors Technologies)
