Article

FingerType: One-Handed Thumb-to-Finger Text Input Using 3D Hand Tracking

1
Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, College of Computer Science and Technology, Jilin University, Changchun 130012, China
2
State Key Laboratory of Advanced Vehicle Integration and Control, China FAW Group Co., Ltd., Changchun 130013, China
3
School of Computer, Electronics and Information, Guangxi University, Nanning 530004, China
4
Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, Changchun 130033, China
*
Author to whom correspondence should be addressed.
Sensors 2026, 26(3), 897; https://doi.org/10.3390/s26030897
Submission received: 26 December 2025 / Revised: 22 January 2026 / Accepted: 26 January 2026 / Published: 29 January 2026
(This article belongs to the Special Issue Sensing Technology to Measure Human-Computer Interactions)

Abstract

We present FingerType, a one-handed text input method based on thumb-to-finger gestures. FingerType detects tap events from 3D hand-tracking data using a Temporal Convolutional Network (TCN) and decodes the tap sequence into words with an n-gram language model. To inform the design, we examined thumb-to-finger interactions and collected comfort ratings for individual finger regions, and used these results to design an improved T9-style key layout. Our system runs at 72 frames per second and reaches 94.97% accuracy for tap detection. We conducted a six-block user study with 24 participants and compared FingerType with controller input and touch input. Entry speed increased from 5.88 WPM in the first practice block to 10.63 WPM in the final block. FingerType also better supported eyes-free typing: the proportion of attention kept on the display panel (within ±15° of head-gaze) was 84.41%, higher than touch input (69.47%). Finally, we report error patterns and WPM learning curves, and a model-based analysis suggests that improving gesture recognition accuracy could further increase entry speed and narrow the gap to traditional VR input methods.
Keywords: virtual reality; text entry; bare-hand input; one-handed interaction; thumb-to-finger gestures; mid-air input

Share and Cite

MDPI and ACS Style

Jia, N.; Sun, M.; Li, Y.; Tian, Y.; Sun, T. FingerType: One-Handed Thumb-to-Finger Text Input Using 3D Hand Tracking. Sensors 2026, 26, 897. https://doi.org/10.3390/s26030897

AMA Style

Jia N, Sun M, Li Y, Tian Y, Sun T. FingerType: One-Handed Thumb-to-Finger Text Input Using 3D Hand Tracking. Sensors. 2026; 26(3):897. https://doi.org/10.3390/s26030897

Chicago/Turabian Style

Jia, Nuo, Minghui Sun, Yan Li, Yang Tian, and Tao Sun. 2026. "FingerType: One-Handed Thumb-to-Finger Text Input Using 3D Hand Tracking." Sensors 26, no. 3: 897. https://doi.org/10.3390/s26030897

APA Style

Jia, N., Sun, M., Li, Y., Tian, Y., & Sun, T. (2026). FingerType: One-Handed Thumb-to-Finger Text Input Using 3D Hand Tracking. Sensors, 26(3), 897. https://doi.org/10.3390/s26030897

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
