Search Results (7)

Search Parameters:
Keywords = one-handed interaction

21 pages, 7327 KB  
Article
FingerType: One-Handed Thumb-to-Finger Text Input Using 3D Hand Tracking
by Nuo Jia, Minghui Sun, Yan Li, Yang Tian and Tao Sun
Sensors 2026, 26(3), 897; https://doi.org/10.3390/s26030897 - 29 Jan 2026
Abstract
We present FingerType, a one-handed text input method based on thumb-to-finger gestures. FingerType detects tap events from 3D hand data using a Temporal Convolutional Network (TCN) and decodes the tap sequence into words with an n-gram language model. To inform the design, we examined thumb-to-finger interactions and collected comfort ratings of finger regions. We used these results to design an improved T9-style key layout. Our system runs at 72 frames per second and reaches 94.97% accuracy for tap detection. We conducted a six-block user study with 24 participants and compared FingerType with controller input and touch input. Entry speed increased from 5.88 WPM in the first practice block to 10.63 WPM in the final block. FingerType also supported more eyes-free typing: attention on the display panel within ±15° of head-gaze was 84.41%, higher than touch input (69.47%). Finally, we report error patterns and WPM learning curves, and a model-based analysis suggests that improving gesture recognition accuracy could further increase speed and narrow the gap to traditional VR input methods.
(This article belongs to the Special Issue Sensing Technology to Measure Human-Computer Interactions)
18 pages, 4185 KB  
Article
An Empirical Study on Pointing Gestures Used in Communication in Household Settings
by Tymon Kukier, Alicja Wróbel, Barbara Sienkiewicz, Julia Klimecka, Antonio Galiza Cerdeira Gonzalez, Paweł Gajewski and Bipin Indurkhya
Electronics 2025, 14(12), 2346; https://doi.org/10.3390/electronics14122346 - 8 Jun 2025
Viewed by 1672
Abstract
Gestures play an integral role in human communication. Our research aims to develop a gesture understanding system that allows for better interpretation of human instructions in household robotics settings. We conducted an experiment with 34 participants who used pointing gestures to teach concepts to an assistant. Gesture data were analyzed using manual annotations (MAXQDA) and the computational methods of pose estimation and k-means clustering. The study revealed that participants tend to maintain consistent pointing styles, with one-handed pointing and index finger gestures being the most common. Gaze and pointing often co-occur, as do leaning forward and pointing. Using our gesture categorization algorithm, we analyzed gesture information values. As the experiment progressed, the information value of gestures remained stable, although the trends varied between participants and were associated with factors such as age and gender. These findings underscore the need for gesture recognition systems to balance generalization with personalization for more effective human–robot interaction.
(This article belongs to the Special Issue Applications of Computer Vision, 3rd Edition)

39 pages, 2355 KB  
Article
A Comparison of One- and Two-Handed Gesture User Interfaces in Virtual Reality—A Task-Based Approach
by Taneli Nyyssönen, Seppo Helle, Teijo Lehtonen and Jouni Smed
Multimodal Technol. Interact. 2024, 8(2), 10; https://doi.org/10.3390/mti8020010 - 2 Feb 2024
Cited by 6 | Viewed by 5478
Abstract
This paper presents two gesture-based user interfaces which were designed for a 3D design review in virtual reality (VR) with inspiration drawn from the shipbuilding industry’s need to streamline and make their processes more sustainable. The user interfaces, one focusing on single-hand (unimanual) gestures and the other focusing on dual-handed (bimanual) usage, are tested as a case study using 13 tasks. The unimanual approach attempts to provide a higher degree of flexibility, while the bimanual approach seeks to provide more control over the interaction. The interfaces were developed for the Meta Quest 2 VR headset using the Unity game engine. Hand-tracking (HT) is utilized due to potential usability benefits in comparison to standard controller-based user interfaces, which lack intuitiveness regarding the controls and can cause more strain. The user interfaces were tested with 25 test users, and the results indicate a preference toward the one-handed user interface with little variation across test user categories. Additionally, the testing order, which was counterbalanced, had a statistically significant impact on preference and performance, indicating that learning novel interaction mechanisms requires an adjustment period for reliable results. VR sickness was also strongly experienced by a few users, and there were no signs that gesture controls would significantly alleviate it.
(This article belongs to the Special Issue 3D User Interfaces and Virtual Reality)

11 pages, 1676 KB  
Article
Resistance Training Using Flywheel Device Improves the Shot Precision in Senior Elite Tennis Players: A Randomized Controlled Study
by Marco Centorbi, Giovanni Fiorilli, Giulia Di Martino, Andrea Buonsenso, Gabriele Medri, Carlo della Valle, Nicolina Vendemiati, Enzo Iuliano, Giuseppe Calcagno and Alessandra di Cagno
Appl. Sci. 2023, 13(24), 13290; https://doi.org/10.3390/app132413290 - 15 Dec 2023
Cited by 4 | Viewed by 2930
Abstract
The aim of the study was to assess the effects of 8 weeks of resistance training using a flywheel device applied to the upper limbs, compared to traditional isotonic training, on strength and shot precision in tennis. Twenty-seven elite senior tennis players (age: 55.78 ± 2.69) were randomly divided into an experimental group (EG) using flywheel devices (n = 13) and a control group (CG) performing isotonic training (n = 14). The EG program included forehand, backhand, and one-handed shoulder press movements, while the CG performed seven resistance exercises on isotonic machines. A similar workout intensity was ensured using the Borg CR-10 scale. The assessment included a 30 s arm curl test, a medicine ball throw test, and forehand/backhand/overhead shot precision tests. A significant time effect was found in the 30 s arm curl test for the EG (F(1,25) = 13.09; p = 0.001), along with a time × group interaction (F(1,25) = 5.21; p = 0.031). A significant group difference was observed in the forehand shot precision test, where the EG achieved better scores than the CG, along with a significant time × group interaction (F(1,25) = 8.35; p = 0.008). In the backhand shot precision test, a significant effect of time (F(1,25) = 5.01; p = 0.034) and a significant time × group interaction were found (F(1,25) = 4.50; p = 0.044), but there was no significant difference between groups. Resistance training with flywheel devices has shown potential in improving tennis performance. Applying overload to specific athletic movements during both concentric and eccentric phases in the EG enhanced strength and neuromuscular coordination in relation to shot precision, thereby enabling simultaneous improvements in both conditioning and the technical aspects of fundamental tennis shots.
(This article belongs to the Special Issue Effects of Physical Training on Exercise Performance)

14 pages, 2982 KB  
Article
Complex Hand Interaction Authoring Tool for User Selective Media
by Bok Deuk Song, HongKyw Choi and Sung-Hoon Kim
Electronics 2022, 11(18), 2854; https://doi.org/10.3390/electronics11182854 - 9 Sep 2022
Cited by 1 | Viewed by 1722
Abstract
Nowadays, with the advancement of the Internet and personal mobile devices, interactive media are becoming prevalent, where viewers make their own decisions on the story of the media based on their interactions. The interaction that the user can make is usually pre-programmed by a programmer. Therefore, interactions that users can make are limited to programmable areas. In comparison, in this paper, we propose an interactive media authoring tool which can compose diverse two-hand interactions from several one-hand interactive components. The aim is to provide content creators with a tool to produce multiple hand motions so that they can design a variety of user interactions to stimulate the interest of content viewers and increase their sense of immersion. Using the proposed system, the content creator can gain greater freedom to create more diverse and complex interactions than programmable ones. The system is composed of a complex motion editor that edits one-hand motions into complex two-hand motions, a touchless sensor that senses the hand motion, and a metadata manager that handles the metadata, which specify the settings for the interactive functions. To our knowledge, the proposed system is the first web-based authoring tool that can author complex two-hand motions from single-hand motions, and which can also control a touchless motion control device.
(This article belongs to the Special Issue Real-Time Visual Information Processing in Human-Computer Interface)

21 pages, 3466 KB  
Article
AnyGesture: Arbitrary One-Handed Gestures for Augmented, Virtual, and Mixed Reality Applications
by Alexander Schäfer, Gerd Reis and Didier Stricker
Appl. Sci. 2022, 12(4), 1888; https://doi.org/10.3390/app12041888 - 11 Feb 2022
Cited by 18 | Viewed by 6860
Abstract
Natural user interfaces based on hand gestures are becoming increasingly popular. The previous need for expensive hardware left a wide range of interaction possibilities enabled by hand tracking largely unexplored. Recently, hand tracking has been built into inexpensive and widely available hardware, allowing more and more people access to this technology. This work provides researchers and users with a simple yet effective way to implement various one-handed gestures to enable deeper exploration of gesture-based interactions and interfaces. To this end, this work provides a framework for the design, prototyping, testing, and implementation of one-handed gestures. The proposed framework was implemented with two main goals: first, it should be able to recognize any one-handed gesture; second, designing and implementing a gesture should be as simple as performing the gesture and pressing a button to record it. The contribution of this paper is a simple yet unique way to record and recognize static and dynamic one-handed gestures. A static gesture can be captured with a template matching approach, while dynamic gestures use previously captured spatial information. The presented approach was evaluated in a user study with 33 participants, and the implemented gestures achieved high accuracy and user acceptance.
(This article belongs to the Special Issue Applications of Virtual, Augmented, and Mixed Reality)
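The template-matching idea for static gestures mentioned in the abstract can be illustrated roughly as follows: a recorded hand pose (joint positions expressed relative to the wrist) is compared to a live pose by mean per-joint distance. The joint data, joint count, and threshold here are invented, and this is not the authors' implementation:

```python
# Illustrative static-gesture template matching: compare wrist-relative
# joint positions of a live pose against a recorded template.
import math

def normalize(joints):
    """Express joint positions relative to the first joint (the wrist)."""
    wx, wy, wz = joints[0]
    return [(x - wx, y - wy, z - wz) for x, y, z in joints]

def matches(template, pose, threshold=0.05):
    """True if the mean per-joint distance is below the threshold."""
    t, p = normalize(template), normalize(pose)
    dist = sum(math.dist(a, b) for a, b in zip(t, p)) / len(t)
    return dist < threshold

# Toy 3-joint "pinch" template; the live pose is the same shape, offset
# in space, so wrist-relative normalization makes it match.
template = [(0.0, 0.0, 0.0), (0.1, 0.2, 0.0), (0.1, 0.21, 0.01)]
pose     = [(0.5, 0.5, 0.5), (0.6, 0.71, 0.5), (0.6, 0.72, 0.51)]
print(matches(template, pose))  # True
```

Normalizing relative to the wrist makes the match translation-invariant; a fuller sketch would also handle rotation and hand-size scaling, which this one deliberately omits.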

21 pages, 3186 KB  
Article
Controlling Teleportation-Based Locomotion in Virtual Reality with Hand Gestures: A Comparative Evaluation of Two-Handed and One-Handed Techniques
by Alexander Schäfer, Gerd Reis and Didier Stricker
Electronics 2021, 10(6), 715; https://doi.org/10.3390/electronics10060715 - 18 Mar 2021
Cited by 42 | Viewed by 8178
Abstract
Virtual Reality (VR) technology offers users the possibility to immerse themselves in and freely navigate through virtual worlds. An important component for achieving a high degree of immersion in VR is locomotion. Although often discussed in the literature, a natural and effective way of controlling locomotion remains an open problem. Recently, VR headset manufacturers have been integrating more sensors, allowing hand or eye tracking without any additional equipment. This enables a wide range of application scenarios with natural freehand interaction techniques that require no additional hardware. This paper focuses on techniques to control teleportation-based locomotion with hand gestures, where users are able to move around in VR using their hands only. With the help of a comprehensive study involving 21 participants, four different techniques are evaluated. The effectiveness and efficiency as well as user preferences of the presented techniques are determined. Two two-handed and two one-handed techniques are evaluated, revealing that it is possible to move comfortably and effectively through virtual worlds with a single hand only.
(This article belongs to the Special Issue Recent Advances in Virtual Reality and Augmented Reality)
