Search Results (12)

Search Parameters:
Keywords = thumb-to-finger gestures

21 pages, 7327 KB  
Article
FingerType: One-Handed Thumb-to-Finger Text Input Using 3D Hand Tracking
by Nuo Jia, Minghui Sun, Yan Li, Yang Tian and Tao Sun
Sensors 2026, 26(3), 897; https://doi.org/10.3390/s26030897 - 29 Jan 2026
Abstract
We present FingerType, a one-handed text input method based on thumb-to-finger gestures. FingerType detects tap events from 3D hand data using a Temporal Convolutional Network (TCN) and decodes the tap sequence into words with an n-gram language model. To inform the design, we examined thumb-to-finger interactions and collected comfort ratings of finger regions. We used these results to design an improved T9-style key layout. Our system runs at 72 frames per second and reaches 94.97% accuracy for tap detection. We conducted a six-block user study with 24 participants and compared FingerType with controller input and touch input. Entry speed increased from 5.88 WPM in the first practice block to 10.63 WPM in the final block. FingerType also supported more eyes-free typing: attention on the display panel within ±15° of head-gaze was 84.41%, higher than touch input (69.47%). Finally, we report error patterns and WPM learning curves, and a model-based analysis suggests that improving gesture recognition accuracy could further increase speed and narrow the gap to traditional VR input methods. Full article
(This article belongs to the Special Issue Sensing Technology to Measure Human-Computer Interactions)
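The decoding step described in the abstract, an ambiguous T9-style key layout resolved by a language model, can be illustrated with a minimal sketch. The 9-key letter grouping and word frequencies below are placeholders, not FingerType's actual layout or n-gram model.

```python
# Minimal sketch of T9-style tap-sequence decoding with a unigram language
# model. The 9-key letter grouping and word frequencies are placeholders,
# not FingerType's actual layout or n-gram model.
from collections import defaultdict

KEYS = ["abc", "def", "ghi", "jkl", "mno", "pqrs", "tuv", "wxyz", " '"]
LETTER_TO_KEY = {ch: i for i, letters in enumerate(KEYS) for ch in letters}

# Toy unigram counts; a real decoder would use an n-gram model over a corpus.
UNIGRAMS = {"good": 250, "home": 180, "gone": 90, "help": 300, "hello": 120}

def word_to_code(word):
    """Map a word to the sequence of keys tapped to enter it."""
    return tuple(LETTER_TO_KEY[ch] for ch in word)

CODE_TO_WORDS = defaultdict(list)
for w, count in UNIGRAMS.items():
    CODE_TO_WORDS[word_to_code(w)].append((count, w))

def decode(tap_sequence):
    """Return candidate words for a tap sequence, most frequent first."""
    return [w for _, w in sorted(CODE_TO_WORDS[tuple(tap_sequence)], reverse=True)]

print(decode(word_to_code("good")))  # ['good', 'home', 'gone']: same taps, ranked by frequency
```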
19 pages, 3770 KB  
Article
Evaluating Stroke-Related Motor Impairment and Recovery Using Macroscopic and Microscopic Features of HD-sEMG
by Wenting Qin, Xin Tan, Yi Yu, Yujie Zhang, Zhanhui Lin, Chenyun Dai, Yuxiang Yang, Lingyu Liu and Lingjing Jin
Bioengineering 2025, 12(12), 1357; https://doi.org/10.3390/bioengineering12121357 - 12 Dec 2025
Viewed by 557
Abstract
Stroke-induced motor impairment necessitates objective and quantitative assessment tools for rehabilitation planning. In this study, a gesture-specific framework based on high-density surface electromyography (HD-sEMG) was developed to characterize neuromuscular dysfunction using eight macroscopic features and two microscopic motor unit decomposition features. HD-sEMG recordings were collected from stroke patients (n = 11; affected and unaffected sides) and healthy controls (n = 8; dominant side) during seven standardized hand gestures. Feature-level comparisons revealed hierarchical abnormalities, with the affected side showing significantly reduced activation/coordination relative to healthy controls, while the unaffected side exhibited intermediate deviations. For each gesture, dedicated K-nearest neighbors (KNN) models were constructed for clinical validation. For Brunnstrom stage classification, wrist extension yielded the best performance, achieving 92.08% accuracy and effectively discriminating severe (Stage 4), moderate (Stage 5), and mild (Stage 6) impairment as well as healthy controls. For fine motor recovery prediction, the thumb–index–middle finger pinch provided the optimal regression performance, predicting Upper Extremity Fugl–Meyer Assessment (UE-FMA) scores with R = 0.86 and RMSE = 3.24. These results indicate that gesture selection should be aligned with the clinical endpoint: wrist extension is most informative for gross recovery staging, whereas pinch gestures better capture fine motor control. Overall, the proposed HD-sEMG framework provides an objective approach for monitoring post-stroke recovery and supporting personalized rehabilitation assessment. Full article
(This article belongs to the Section Biosignal Processing)
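The gesture-specific KNN workflow described above, classification of Brunnstrom stage and regression of UE-FMA scores from HD-sEMG feature vectors, can be sketched as follows; the feature matrix, labels, and scores are random placeholders, not the study's data.

```python
# Sketch of the gesture-specific KNN workflow: Brunnstrom-stage classification
# and UE-FMA regression from HD-sEMG feature vectors. All data are random
# placeholders standing in for the macroscopic/microscopic features.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier, KNeighborsRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 10))          # 60 trials x 10 HD-sEMG features (placeholder)
stage = np.tile([4, 5, 6], 20)         # Brunnstrom stages 4-6 (placeholder labels)
ue_fma = rng.uniform(20, 66, size=60)  # UE-FMA scores (placeholder targets)

clf = KNeighborsClassifier(n_neighbors=5)
reg = KNeighborsRegressor(n_neighbors=5)
print("staging accuracy:", cross_val_score(clf, X, stage, cv=5).mean())
print("UE-FMA R^2:", cross_val_score(reg, X, ue_fma, cv=5, scoring="r2").mean())
```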
12 pages, 7963 KB  
Data Descriptor
Surface EMG Datasets for Hand Gesture Recognition Under Constant and Three-Level Force Conditions
by Cinthya Alejandra Zúñiga-Castillo, Víctor Alejandro Anaya-Mosqueda, Natalia Margarita Rendón-Caballero, Marcos Aviles, José M. Álvarez-Alvarado, Roberto Augusto Gómez-Loenzo and Juvenal Rodríguez-Reséndiz
Data 2025, 10(12), 194; https://doi.org/10.3390/data10120194 - 22 Nov 2025
Cited by 1 | Viewed by 1086
Abstract
This work introduces two complementary surface electromyography (sEMG) datasets for hand gesture recognition. Signals were collected from 40 healthy subjects aged 18 to 40 years, divided into two independent groups of 20 participants each. In both datasets, subjects performed five hand gestures. Most of the gestures are the same, although the exact set and the order differ slightly between datasets. For example, Dataset 2 (DS2) includes the simultaneous flexion of the thumb and index finger, which is not present in Dataset 1 (DS1). Data were recorded with three bipolar sEMG sensors placed on the dominant forearm (flexor digitorum superficialis, extensor digitorum, and flexor pollicis longus). A battery-powered acquisition system was used, with sampling rates of 1000 Hz for DS1 and 1500 Hz for DS2. DS1 contains recordings performed at a constant moderate force, while DS2 includes three force levels (low, medium, and high). Both datasets provide raw signals and pre-processed versions segmented into overlapping windows, with clear file structures and annotations, enabling feature extraction for machine learning applications. Together, they constitute a large-scale standardized sEMG resource that supports the development and benchmarking of gesture and force recognition algorithms for rehabilitation, assistive technologies, and prosthetic control. Full article
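The overlapping-window segmentation provided in the pre-processed versions of the datasets can be sketched as follows; the 250 ms window and 50% overlap are assumed values, not the descriptor's actual settings.

```python
# Minimal sketch of segmenting a raw sEMG channel into overlapping windows,
# as in the pre-processed versions of the datasets. The 250 ms window and
# 50% overlap are assumed values.
import numpy as np

def sliding_windows(signal, fs, win_ms=250, overlap=0.5):
    """Return an array of shape (n_windows, win_samples)."""
    win = int(fs * win_ms / 1000)
    step = max(1, int(win * (1 - overlap)))
    starts = range(0, len(signal) - win + 1, step)
    return np.stack([signal[s:s + win] for s in starts])

emg = np.random.randn(5000)          # 5 s of fake data at 1000 Hz (DS1 rate)
windows = sliding_windows(emg, fs=1000)
print(windows.shape)                 # (39, 250) with these settings
```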
12 pages, 1769 KB  
Article
Optimizing Sensor Placement and Machine Learning Techniques for Accurate Hand Gesture Classification
by Lakshya Chaplot, Sara Houshmand, Karla Beltran Martinez, John Andersen and Hossein Rouhani
Electronics 2024, 13(15), 3072; https://doi.org/10.3390/electronics13153072 - 3 Aug 2024
Cited by 5 | Viewed by 3138
Abstract
Millions of individuals are living with upper extremity amputations, making them potential beneficiaries of hand and arm prostheses. While myoelectric prostheses have evolved to meet amputees’ needs, challenges remain related to their control. This research leverages surface electromyography sensors and machine learning techniques to classify five fundamental hand gestures. By utilizing features extracted from electromyography data, we employed a nonlinear, multiple-kernel learning-based support vector machine classifier for gesture recognition. Our dataset encompassed eight young nondisabled participants. Additionally, our study conducted a comparative analysis of five distinct sensor placement configurations. These configurations capture electromyography data associated with index finger and thumb movements, as well as index finger and ring finger movements. We also compared four different classifiers to determine the most capable one to classify hand gestures. The dual-sensor setup strategically placed to capture thumb and index finger movements was the most effective—this dual-sensor setup achieved 90% accuracy for classifying all five gestures using the support vector machine classifier. Furthermore, the application of multiple-kernel learning within the support vector machine classifier showcases its efficacy, achieving the highest classification accuracy amongst all classifiers. This study showcased the potential of surface electromyography sensors and machine learning in enhancing the control and functionality of myoelectric prostheses for individuals with upper extremity amputations. Full article
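A minimal sketch of the multiple-kernel SVM idea follows; true multiple-kernel learning optimizes the kernel weights, whereas here an RBF and a linear kernel are combined with fixed weights and passed to scikit-learn as a precomputed kernel, on placeholder EMG features.

```python
# Hedged sketch of combining kernels for an SVM on EMG features. True
# multiple-kernel learning learns the kernel weights; here they are fixed
# (0.5 / 0.5) for illustration, using scikit-learn's precomputed-kernel SVC.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel, linear_kernel
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X_train = rng.normal(size=(80, 8))          # placeholder EMG feature vectors
y_train = rng.integers(0, 5, size=80)       # five gesture labels
X_test = rng.normal(size=(20, 8))

def combined_kernel(A, B, w_rbf=0.5, w_lin=0.5, gamma=0.1):
    return w_rbf * rbf_kernel(A, B, gamma=gamma) + w_lin * linear_kernel(A, B)

clf = SVC(kernel="precomputed").fit(combined_kernel(X_train, X_train), y_train)
pred = clf.predict(combined_kernel(X_test, X_train))
print(pred[:5])
```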
25 pages, 52553 KB  
Article
Supervised Myoelectrical Hand Gesture Recognition in Post-Acute Stroke Patients with Upper Limb Paresis on Affected and Non-Affected Sides
by Alexey Anastasiev, Hideki Kadone, Aiki Marushima, Hiroki Watanabe, Alexander Zaboronok, Shinya Watanabe, Akira Matsumura, Kenji Suzuki, Yuji Matsumaru and Eiichi Ishikawa
Sensors 2022, 22(22), 8733; https://doi.org/10.3390/s22228733 - 11 Nov 2022
Cited by 20 | Viewed by 5792
Abstract
In clinical practice, acute post-stroke paresis of the extremities fundamentally complicates timely rehabilitation of motor functions; however, recently, residual and distorted musculoskeletal signals have been used to initiate feedback-driven solutions for establishing motor rehabilitation. Here, we investigate the possibilities of basic hand gesture recognition in acute stroke patients with hand paresis using a novel, acute stroke, four-component multidomain feature set (ASF-4) with feature vector weight additions (ASF-14NP, ASF-24P) and supervised learning algorithms trained only by surface electromyography (sEMG). A total of 19 (65.9 ± 12.4 years old; 12 men, seven women) acute stroke survivors (12.4 ± 6.3 days since onset) with hand paresis (Brunnstrom stage 4 ± 1/4 ± 1, SIAS 3 ± 1/3 ± 2, FMA-UE 40 ± 20) performed 10 repetitive hand movements reflecting basic activities of daily living (ADLs): rest, fist, pinch, wrist flexion, wrist extension, finger spread, and thumb up. Signals were recorded using an eight-channel, portable sEMG device with electrode placement on the forearms and thenar areas of both limbs (four sensors on each extremity). Using data preprocessing, semi-automatic segmentation, and a set of extracted feature vectors, support vector machine (SVM), linear discriminant analysis (LDA), and k-nearest neighbors (k-NN) classifiers for statistical comparison and validity (paired t-tests, p-value < 0.05), we were able to discriminate myoelectrical patterns for each gesture on both paretic and non-paretic sides. Despite any post-stroke conditions, the evaluated total accuracy rate by the 10-fold cross-validation using SVM among four-, five-, six-, and seven-gesture models were 96.62%, 94.20%, 94.45%, and 95.57% for non-paretic and 90.37%, 88.48%, 88.60%, and 89.75% for paretic limbs, respectively. LDA had competitive results using PCA whereas k-NN was a less efficient classifier in gesture prediction. Thus, we demonstrate partial efficacy of the combination of sEMG and supervised learning for upper-limb rehabilitation procedures for early acute stroke motor recovery and various treatment applications. Full article
(This article belongs to the Special Issue Electromyography (EMG) Signal Acquisition and Processing)
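The classifier comparison reported above, SVM, LDA, and k-NN under 10-fold cross-validation, can be sketched as follows; the feature vectors and gesture labels are placeholders standing in for the ASF feature sets.

```python
# Sketch of the reported classifier comparison: SVM, LDA, and k-NN under
# 10-fold cross-validation. Feature vectors and gesture labels are random
# placeholders for the ASF feature sets and seven-gesture classes.
import numpy as np
from sklearn.svm import SVC
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(210, 16))        # placeholder feature vectors
y = np.tile(np.arange(7), 30)         # balanced seven-gesture labels

for name, clf in [("SVM", SVC()),
                  ("LDA", LinearDiscriminantAnalysis()),
                  ("k-NN", KNeighborsClassifier(n_neighbors=5))]:
    print(name, cross_val_score(clf, X, y, cv=10).mean())
```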
15 pages, 4745 KB  
Article
Mechanical Design Optimization of Prosthetic Hand’s Fingers: Novel Solutions towards Weight Reduction
by Federica Buccino, Alessandro Bunt, Alex Lazell and Laura Maria Vergani
Materials 2022, 15(7), 2456; https://doi.org/10.3390/ma15072456 - 26 Mar 2022
Cited by 9 | Viewed by 4579
Abstract
From the mechanical function of grabbing objects to the emotional aspect of gesturing, the functionality of human hands is fundamental for both physical and social survival. Therefore, the loss of one or both hands represents a devastating issue, exacerbated by long rehabilitation times and psychological treatments. Prosthetic arms represent an effective solution to provide concrete functional and aesthetic support. However, commercial hand prostheses still lack an optimal combination of light weight, durability, adequate cosmetic appearance, and affordability. Among these aspects, the priority for upper-limb prosthesis users is weight, a key parameter that influences both the portability and the functionality of the system. The purpose of this work is to optimize the design of the MyHand prosthesis by redesigning both the proximal and distal finger and thumb with the aim of finding an optimal balance between weight reduction and adequate stiffness. Starting from elastic–plastic numerical models and experimental tests on obsolete components, analyzed under the worst loading condition, five different design solutions are suggested. An iterative topology optimization process locates the regions where material removal is permitted. From these results, 2 mm geometrical patterns on the top surface of the hand prosthesis appear as the most prominent, preventing object intrusion. Full article
(This article belongs to the Section Materials Simulation and Design)
17 pages, 12357 KB  
Article
Pneumatic Bionic Hand with Rigid-Flexible Coupling Structure
by Chang Chen, Jiteng Sun, Long Wang, Guojin Chen, Ming Xu, Jing Ni, Rizauddin Ramli, Shaohui Su and Changyong Chu
Materials 2022, 15(4), 1358; https://doi.org/10.3390/ma15041358 - 13 Feb 2022
Cited by 18 | Viewed by 4921
Abstract
This paper presents a rigid-flexible composite bionic hand design that addresses the low load capacity of soft gripping hands. The bionic hand was designed based on the Fast Pneumatic Network (FPN) approach, which produces a soft finger bending drive mechanism. A soft finger bending driver was developed and assembled into a human-like soft gripping hand that includes a thumb capable of omnidirectional movement and four modular soft fingers. An experimental comparison of silicone rubber materials with different properties was conducted to determine suitable materials. A combination of 3D printing and mold pouring was used to fabricate the bionic hand prototype. Based on the second-order Yeoh model, a mathematical model of the soft bionic finger was established, and ABAQUS simulation software was used for correction and to verify the feasibility of the soft finger bending. We adopted a pneumatic control scheme based on a motor micro-pump and developed a human–computer interface in LabVIEW. A comparative experiment was carried out on the bending performance of the finger, and the experimental data were analyzed to verify the accuracy of the mathematical model and simulation. Finally, the control system was designed, and human-like finger gesture and grasping experiments were carried out. Full article
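The second-order Yeoh model mentioned above has the strain-energy density W = C10(I1 − 3) + C20(I1 − 3)²; a minimal sketch for incompressible uniaxial stretch follows, with placeholder material constants rather than the silicone parameters fitted in the paper.

```python
# Minimal sketch of the second-order Yeoh strain-energy density used to model
# the soft finger: W = C10*(I1 - 3) + C20*(I1 - 3)**2. The material constants
# below are placeholders, not the fitted silicone parameters.
def yeoh2_energy(i1, c10=0.11, c20=0.02):
    """Strain-energy density (MPa) for first invariant i1 of the left
    Cauchy-Green deformation tensor."""
    return c10 * (i1 - 3.0) + c20 * (i1 - 3.0) ** 2

def i1_uniaxial(stretch):
    """First invariant for incompressible uniaxial stretch."""
    return stretch ** 2 + 2.0 / stretch

for lam in (1.0, 1.5, 2.0):
    print(lam, yeoh2_energy(i1_uniaxial(lam)))
```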
15 pages, 4361 KB  
Article
Design of an Effective Prosthetic Hand System for Adaptive Grasping with the Control of Myoelectric Pattern Recognition Approach
by Yanchao Wang, Ye Tian, Haotian She, Yinlai Jiang, Hiroshi Yokoi and Yunhui Liu
Micromachines 2022, 13(2), 219; https://doi.org/10.3390/mi13020219 - 29 Jan 2022
Cited by 30 | Viewed by 18434
Abstract
In this paper, we develop a prosthetic bionic hand system that realizes adaptive gripping with two closed control loops using a linear discriminant analysis (LDA) algorithm. The prosthetic hand contains five fingers, each driven by a linear servo motor. When grasping objects, the four fingers other than the thumb adjust automatically and bend into an appropriate posture, while the thumb is stretched and bent by its linear servo motor. Since changes in the surface electromyography (sEMG) signal occur before human movement, recognizing the sEMG signal with the LDA algorithm can capture the user's action intention in advance and send control instructions in time to assist grasping. For intention recognition, we extract three features: Variance (VAR), Root Mean Square (RMS), and Minimum (MIN). The results show an average accuracy of 96.59%. This allows the system to help people with disabilities grasp objects of different sizes and shapes adaptively. Finally, a test in which people with disabilities grasped 15 objects of different sizes and shapes was carried out, with good experimental results. Full article
(This article belongs to the Special Issue Wearable Robotics)
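The three features named above (VAR, RMS, MIN) followed by LDA classification can be sketched as follows; the sEMG windows and intention labels are random placeholders.

```python
# Sketch of the feature extraction named in the abstract (VAR, RMS, MIN per
# sEMG window) followed by LDA classification. Windows and labels below are
# random placeholders for illustration.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def window_features(window):
    """Return [variance, root-mean-square, minimum] of one sEMG window."""
    return [np.var(window), np.sqrt(np.mean(window ** 2)), np.min(window)]

rng = np.random.default_rng(3)
windows = rng.normal(size=(120, 200))          # 120 windows of 200 samples
labels = rng.integers(0, 4, size=120)          # four grasp intentions (placeholder)

X = np.array([window_features(w) for w in windows])
clf = LinearDiscriminantAnalysis().fit(X, labels)
print(clf.predict(X[:5]))
```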
13 pages, 4090 KB  
Article
Development of Multifunctional Myoelectric Hand Prosthesis System with Easy and Effective Mode Change Control Method Based on the Thumb Position and State
by Sung-Yoon Jung, Seung-Gi Kim, Joo-Hyung Kim and Se-Hoon Park
Appl. Sci. 2021, 11(16), 7295; https://doi.org/10.3390/app11167295 - 9 Aug 2021
Cited by 5 | Viewed by 7762
Abstract
Commercial multi-degrees-of-freedom (multi-DOF) myoelectric hand prostheses can perform various hand gestures and grip motions using multiple DOFs. However, as most upper limb amputees have less than two electromyogram (EMG) signals generated at the amputation site, it is difficult to control various hand gestures and grip motions using multi-DOF myoelectric hand prostheses. This paper proposes a multifunctional myoelectric hand prosthesis system that uses only two EMG sensors while improving the convenience of upper limb amputees in everyday life. The proposed system comprises a six-DOF myoelectric hand prosthesis and an easy and effective control algorithm that enables upper limb amputees to perform various hand gestures and grip motions. More specifically, the hand prosthesis has a multi-DOF five-finger mechanism and a small controller that can be mounted inside the hand, allowing it to perform various hand gestures and grip motions. The control algorithm facilitates four grip motions and four gesture motions using the adduction and abduction positions of the thumb, the flexion and extension state of the thumb, and three EMG signals (co-contraction, flexion, and extension) generated using the two EMG sensors. Experimental results indicate that the proposed system is a versatile, flexible, and effective hand prosthesis system for upper limb amputees. Full article
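The mode-change idea described above, where the thumb's position and state select a motion set and the three EMG commands drive it, can be sketched as a small lookup; the specific pairings and motions in this table are assumptions for illustration, not the paper's mapping.

```python
# Illustrative sketch of the mode-change idea: the thumb's adduction/abduction
# position and flexion/extension state select a motion set, and the three EMG
# commands (co-contraction, flexion, extension) drive it. The motions below
# are assumptions, not the paper's mapping.
MODES = {
    ("adduction", "flexed"): {          # grip mode (assumed pairing)
        "co-contraction": "power grip",
        "flexion": "precision grip",
        "extension": "lateral grip",
    },
    ("abduction", "extended"): {        # gesture mode (assumed pairing)
        "co-contraction": "open hand",
        "flexion": "point",
        "extension": "thumb up",
    },
}

def select_motion(thumb_position, thumb_state, emg_command):
    mode = MODES.get((thumb_position, thumb_state))
    return mode[emg_command] if mode else "hold current posture"

print(select_motion("adduction", "flexed", "flexion"))      # -> precision grip
print(select_motion("abduction", "extended", "extension"))  # -> thumb up
```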
20 pages, 8952 KB  
Article
Development of a Low-Cost Wearable Data Glove for Capturing Finger Joint Angles
by Changcheng Wu, Keer Wang, Qingqing Cao, Fei Fei, Dehua Yang, Xiong Lu, Baoguo Xu, Hong Zeng and Aiguo Song
Micromachines 2021, 12(7), 771; https://doi.org/10.3390/mi12070771 - 30 Jun 2021
Cited by 19 | Viewed by 4904
Abstract
Capturing finger joint angle information has important applications in human–computer interaction and hand function evaluation. In this paper, a novel wearable data glove is proposed for capturing finger joint angles. A sensing unit based on a grating strip and an optical detector is specially designed for finger joint angle measurement. To measure the angles of finger joints, 14 sensing units are arranged on the back of the glove. There is a sensing unit on the back of each of the middle phalange, proximal phalange, and metacarpal of each finger, except for the thumb. For the thumb, two sensing units are distributed on the back of the proximal phalange and metacarpal, respectively. Sensing unit response tests and calibration experiments are conducted to evaluate the feasibility of using the designed sensing unit for finger joint measurement. Experimental results of calibration show that the comprehensive precision of measuring the joint angle of a wooden finger model is 1.67%. Grasping tests and static digital gesture recognition experiments are conducted to evaluate the performance of the designed glove. We achieve a recognition accuracy of 99% by using the designed glove and a generalized regression neural network (GRNN). These preliminary experimental results indicate that the designed data glove is effective in capturing finger joint angles. Full article
(This article belongs to the Special Issue Wearable Robotics)
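The generalized regression neural network used above for static gesture recognition is essentially a Gaussian-kernel-weighted average over training samples; a minimal sketch on placeholder joint-angle data follows, with an assumed smoothing factor.

```python
# Minimal numpy sketch of a generalized regression neural network (GRNN):
# a Gaussian-kernel-weighted average of training targets, here used to score
# one-hot gesture labels from 14 joint-angle inputs. Data and the smoothing
# factor sigma are placeholders.
import numpy as np

def grnn_predict(X_train, Y_train, x, sigma=5.0):
    d2 = np.sum((X_train - x) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return w @ Y_train / np.sum(w)

rng = np.random.default_rng(4)
angles = rng.uniform(0, 90, size=(50, 14))   # 50 training samples, 14 joint angles
labels = rng.integers(0, 10, size=50)        # ten static gestures (placeholder)
one_hot = np.eye(10)[labels]

scores = grnn_predict(angles, one_hot, angles[0])
print("predicted gesture:", int(np.argmax(scores)))
```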
22 pages, 8784 KB  
Article
Grasp Posture Control of Wearable Extra Robotic Fingers with Flex Sensors Based on Neural Network
by Joga Dharma Setiawan, Mochammad Ariyanto, M. Munadi, Muhammad Mutoha, Adam Glowacz and Wahyu Caesarendra
Electronics 2020, 9(6), 905; https://doi.org/10.3390/electronics9060905 - 29 May 2020
Cited by 27 | Viewed by 5949
Abstract
This study proposes a data-driven control method for extra robotic fingers to assist a user in bimanual object manipulation that requires two hands. The robotic system comprises two main parts, i.e., a robotic thumb (RT) and robotic fingers (RF). The RT is attached next to the user’s thumb, while the RF is located next to the user’s little finger. The grasp postures of the RT and RF are driven by bending angle inputs of flex sensors, attached to the thumb and other fingers of the user. A modified glove sensor is developed by attaching three flex sensors to the thumb, index, and middle fingers of a wearer. Various hand gestures are then mapped using a neural network. The input data of the robotic system are the bending angles of the thumb and index finger, read by flex sensors, and the outputs are commanded servo angles for the RF and RT. The third flex sensor is attached to the middle finger to hold the extra robotic finger’s posture. Two force-sensitive resistors (FSRs) are attached to the RF and RT for haptic feedback when the robot is worn to pick up and grasp a fragile object, such as an egg. The trained neural network is embedded into the wearable extra robotic fingers to control the robotic motion and assist the human fingers in bimanual object manipulation tasks. The developed extra fingers are tested for their capacity to assist the human fingers in 10 different bimanual tasks, such as holding a large object, lifting and operating an eight-inch tablet, and lifting a bottle while opening its cap. Full article
(This article belongs to the Section Artificial Intelligence)
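The data-driven mapping described above, flex-sensor bending angles in and commanded servo angles out, can be sketched with a small regressor; the training pairs and network size are placeholders, not the recorded gesture data.

```python
# Sketch of the data-driven mapping: thumb and index flex-sensor bending angles
# in, commanded servo angles for the robotic thumb (RT) and robotic fingers (RF)
# out. Training pairs and network size are placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
flex = rng.uniform(0, 90, size=(300, 2))                            # [thumb, index] bending angles
servo = np.clip(1.5 * flex + rng.normal(0, 2, flex.shape), 0, 180)  # fake servo targets

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(flex, servo)
print(net.predict([[30.0, 45.0]]))     # commanded [RT, RF] servo angles (degrees)
```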
14 pages, 2488 KB  
Article
A Real-Time Pinch-to-Zoom Motion Detection by Means of a Surface EMG-Based Human-Computer Interface
by Jongin Kim, Dongrae Cho, Kwang Jin Lee and Boreom Lee
Sensors 2015, 15(1), 394-407; https://doi.org/10.3390/s150100394 - 29 Dec 2014
Cited by 30 | Viewed by 9260
Abstract
In this paper, we propose a system for inferring the pinch-to-zoom gesture using surface EMG (electromyography) signals in real time. Pinch-to-zoom, which is a common gesture in smart devices such as an iPhone or an Android phone, is used to control the size of images or web pages according to the distance between the thumb and index finger. To infer the finger motion, we recorded EMG signals obtained from the first dorsal interosseous muscle, which is highly related to the pinch-to-zoom gesture, and used a support vector machine for classification among four finger motion distances. The powers estimated by Welch’s method were used as feature vectors. In order to solve the multiclass classification problem, we applied a one-versus-one strategy, since a support vector machine is basically a binary classifier. As a result, our system yields 93.38% classification accuracy averaged over six subjects. The classification accuracy was estimated using 10-fold cross validation. Through our system, we expect not only to develop practical prosthetic devices but also to construct a novel user experience (UX) for smart devices. Full article
(This article belongs to the Special Issue HCI In Smart Environments)
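The pipeline described above, Welch power-spectral-density features classified with an SVM (scikit-learn's SVC uses a one-versus-one scheme for multiclass by default), can be sketched as follows; the EMG segments and labels are placeholders.

```python
# Sketch of the pipeline: Welch power-spectral-density features from one EMG
# channel, classified into four pinch-to-zoom distances with an SVM. The
# signals and labels here are placeholders.
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
fs = 1000
trials = rng.normal(size=(80, 2 * fs))   # 80 two-second EMG segments (fake)
y = np.tile(np.arange(4), 20)            # four finger-distance classes

def psd_features(segment):
    _, pxx = welch(segment, fs=fs, nperseg=256)
    return pxx

X = np.array([psd_features(t) for t in trials])
print("10-fold accuracy:", cross_val_score(SVC(), X, y, cv=10).mean())
```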