Search Results (38)

Search Parameters:
Keywords = Leap Motion sensor

24 pages, 3935 KB  
Article
PSO Trajectory Optimization of Robot Arm for Ultrasonic Testing of Complex Curved Surface
by Rao Yao, Yahui Lv, Kai Wang, Yan Gao and Dazhong Wang
Coatings 2026, 16(3), 332; https://doi.org/10.3390/coatings16030332 - 8 Mar 2026
Viewed by 199
Abstract
In ultrasonic nondestructive testing, maintaining the ultrasonic sensor in normal contact with curved surfaces is pivotal for acquiring valid defect signals. Replacing manual operation with a robotic arm ensures stable signal collection, while stable and fast trajectory planning for complex curved-surface tracking remains a key challenge. This research investigates gesture-driven robotic trajectory planning and impact optimization via the particle swarm optimization (PSO) algorithm in the robot joint space for rapid and smooth movement. Gesture trajectories are acquired via a Leap Motion device, with unified mapping established through spatial transformations among gesture, simulation, and experimental robot spaces. PSO is utilized to optimize trajectories, enhancing accuracy and controllability. Median filtering is applied to trajectory coordinate data to suppress errors from hand tremor and sensor limitations, followed by introducing a surface normal offset to generate pose matrices at each trajectory point. Systematic comparison of interpolation methods (polynomial, cubic spline, circular, cubic B-spline) reveals that cubic B-spline interpolation achieves the shortest execution time under angular acceleration constraints. The results show that PSO optimizes point-to-point trajectories based on 5-5-5 polynomial interpolation, with impact force and execution time as objectives, yielding the optimal trajectory with minimal time under acceleration constraints. This research provides valuable methodological references for robotic manipulator trajectory planning and optimization in complex curved-surface ultrasonic testing. Full article
(This article belongs to the Section Surface Characterization, Deposition and Modification)
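The time-optimal planning step this abstract describes (PSO searching for the minimum execution time of a 5-5-5, i.e. quintic, polynomial joint segment under an angular-acceleration bound) can be sketched in a few lines. This is a minimal illustration with invented parameters and a single joint, not the paper's implementation:

```python
import random

def quintic_max_accel(dtheta, T, n=400):
    # Peak |angular acceleration| of a quintic (5-5-5) segment with zero
    # boundary velocity/acceleration: theta''(s) = dtheta/T^2 * (60s - 180s^2 + 120s^3)
    return max(abs(dtheta / T**2 * (60*s - 180*s**2 + 120*s**3))
               for s in (i / n for i in range(n + 1)))

def fitness(T, dtheta=1.0, a_max=2.0):
    # Objective: execution time, penalized when the acceleration bound is violated.
    viol = max(0.0, quintic_max_accel(dtheta, T) - a_max)
    return T + 100.0 * viol

def pso(lo=0.1, hi=5.0, n_particles=30, iters=80, seed=0):
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for _ in range(n_particles)]
    v = [0.0] * n_particles
    pbest = x[:]                        # personal best positions
    pcost = [fitness(t) for t in x]
    g = min(range(n_particles), key=lambda i: pcost[i])
    gbest, gcost = pbest[g], pcost[g]   # global best
    for _ in range(iters):
        for i in range(n_particles):
            v[i] = (0.7 * v[i]
                    + 1.5 * rng.random() * (pbest[i] - x[i])
                    + 1.5 * rng.random() * (gbest - x[i]))
            x[i] = min(hi, max(lo, x[i] + v[i]))
            c = fitness(x[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = x[i], c
                if c < gcost:
                    gbest, gcost = x[i], c
    return gbest

best_T = pso()
print(round(best_T, 3))  # close to sqrt(5.7735 * dtheta / a_max), about 1.70 s here
```

The swarm converges to the shortest segment duration whose peak acceleration just satisfies the constraint, which is the essence of the time/impact trade-off the abstract reports.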

23 pages, 65396 KB  
Article
Comparative Analysis of the Accuracy and Robustness of the Leap Motion Controller 2
by Daniel Matuszczyk, Mikel Jedrusiak, Denis Fisseler and Frank Weichert
Sensors 2025, 25(24), 7473; https://doi.org/10.3390/s25247473 - 8 Dec 2025
Viewed by 937
Abstract
Along with the ongoing success of virtual/augmented reality (VR/AR) and human–machine interaction (HMI) in the professional and consumer markets, new compatible and inexpensive hand tracking devices are required. One of the contenders in this market is the Leap Motion Controller 2 (LMC2), successor to the popular Leap Motion Controller (LMC1), which has been widely used for scientific hand-tracking applications since its introduction in 2013. To quantify ten years of advances, this study compares both controllers using quantitative tracking metrics and characterizes the interaction space above the sensor. A robot-actuated 3D-printed hand and a motion-capture system provide controlled movements and external reference data. In the central tracking volume, the LMC2 achieves improved performance, reducing palm-position error from 7.9–9.8 mm (LMC1) to 5.2–5.3 mm (LMC2) and lowering positional variability from 1.3–2.2 mm to 0.4–0.8 mm. Dynamic tests confirm stable tracking for both devices. For boundary experiments, the LMC2 maintains continuous detection at distances up to 666 mm, compared to 250–275 mm (LMC1), and detects hands entering the field of view from distances up to 646 mm. Both devices show reduced accuracy toward the edges of the tracking volume. Overall, the results provide a grounded characterization of LMC2 performance in its newly emphasized VR/AR-relevant interaction spaces, while the metrics support cross-comparison with earlier LMC1-based studies and transfer to related application scenarios. Full article
(This article belongs to the Section Sensors and Robotics)
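The headline metrics in this study (mean palm-position error and positional variability against an external motion-capture reference) reduce to simple statistics over paired samples. A sketch with invented coordinates, not data from the paper:

```python
import math
import statistics

def palm_position_error(tracked, reference):
    """Per-sample Euclidean error (mm) between tracked and reference palm positions."""
    return [math.dist(t, r) for t, r in zip(tracked, reference)]

# Illustrative numbers only: a static pose held above the sensor.
reference = [(0.0, 200.0, 0.0)] * 5
tracked   = [(4.0, 203.0, 0.0), (5.0, 201.0, 1.0), (3.0, 204.0, 0.0),
             (5.0, 202.0, 2.0), (4.0, 203.5, 1.0)]

errors = palm_position_error(tracked, reference)
mean_error  = statistics.mean(errors)   # analogue of the reported mean error
variability = statistics.stdev(errors)  # analogue of positional variability
print(round(mean_error, 2), round(variability, 2))  # → 5.27 0.31
```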

12 pages, 1423 KB  
Article
Measurement of Hand Function by an Automated Device—System Validation and Usability Analysis
by Margarida Vieira, Tobias Barth, Matthias Münch, Natascha Koch, Alexander Kögel, Marie Stroetmann and Arndt Peter Schulz
Sensors 2025, 25(22), 7068; https://doi.org/10.3390/s25227068 - 19 Nov 2025
Cited by 1 | Viewed by 640
Abstract
(1) Aim: This study aims to assess the repeatability and accuracy of a 9-axis IMU-based glove and an IR camera system, in order to determine their potential to replace traditional goniometry. (2) Background: Traditional methods for assessing hand function, such as goniometry, are time-consuming and limited by subjectivity, inter-rater variability and external factors that compromise accuracy and reliability. Recent advancements in motion capture technology and sensor-based devices offer potential improvements in efficiency and accuracy for hand rehabilitation assessment; (3) Methods: To evaluate the repeatability of an IMU-based glove and an IR camera, measurements were taken using a silicone hand model under controlled conditions, while accuracy assessments involved a volunteer without movement constraints. Bland–Altman plots were employed for visual comparison and accuracy evaluation; (4) Results: The Nuada glove exhibited high repeatability, with standard deviations below two degrees across all joints, surpassing the goniometer’s accuracy threshold of five degrees. The UltraLeap system demonstrated comparable repeatability, with deviations consistently under 3.5 degrees. Accuracy assessments revealed limitations: over 50% of the Nuada glove’s measurements and over 80% of UltraLeap’s measurements deviated by more than five degrees compared to the goniometer. However, the Nuada glove and UltraLeap system were more consistent with each other than with the goniometer, suggesting limitations in the goniometer’s reliability for modern mobility assessment; (5) Conclusions: Both devices exhibited excellent repeatability, highlighting their strong potential for clinical application. However, their accuracy compared to the goniometer requires further refinement. These findings suggest that these technologies could enhance traditional assessment methods, offering more efficient and accurate solutions for evaluating hand mobility in clinical settings. Full article
(This article belongs to the Special Issue (Bio)sensors for Physiological Monitoring)
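The Bland–Altman analysis used in this study computes the bias (mean difference) and 95% limits of agreement between two paired measurement methods. A minimal sketch with hypothetical joint-angle readings:

```python
import statistics

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two paired measurement methods."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical joint-angle readings (degrees): sensor glove vs. goniometer.
glove      = [30.0, 45.0, 61.0, 88.0, 92.0]
goniometer = [28.0, 44.0, 58.0, 84.0, 88.0]

bias, loa_low, loa_high = bland_altman(glove, goniometer)
print(round(bias, 2))  # mean difference between the two methods, in degrees
```

Points falling outside the limits of agreement, or a bias larger than the clinical threshold (five degrees in this study), signal that the two methods are not interchangeable.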

20 pages, 5461 KB  
Article
Design and Implementation of a 3D Korean Sign Language Learning System Using Pseudo-Hologram
by Naeun Kim, HaeYeong Choe, Sukwon Lee and Changgu Kang
Appl. Sci. 2025, 15(16), 8962; https://doi.org/10.3390/app15168962 - 14 Aug 2025
Viewed by 1515
Abstract
Sign language is a three-dimensional (3D) visual language that conveys meaning through hand positions, shapes, and movements. Traditional sign language education methods, such as textbooks and videos, often fail to capture the spatial characteristics of sign language, leading to limitations in learning accuracy and comprehension. To address this, we propose a 3D Korean Sign Language Learning System that leverages pseudo-hologram technology and hand gesture recognition using Leap Motion sensors. The proposed system provides learners with an immersive 3D learning experience by visualizing sign language gestures through pseudo-holographic displays. A Recurrent Neural Network (RNN) model, combined with Diffusion Convolutional Recurrent Neural Networks (DCRNNs) and ProbSparse Attention mechanisms, is used to recognize hand gestures from both hands in real-time. The system is implemented using a server–client architecture to ensure scalability and flexibility, allowing efficient updates to the gesture recognition model without modifying the client application. Experimental results show that the system enhances learners’ ability to accurately perform and comprehend sign language gestures. Additionally, a usability study demonstrated that 3D visualization significantly improves learning motivation and user engagement compared to traditional 2D learning methods. Full article

17 pages, 8323 KB  
Article
A Symmetrical Leech-Inspired Soft Crawling Robot Based on Gesture Control
by Jiabiao Li, Ruiheng Liu, Tianyu Zhang and Jianbin Liu
Biomimetics 2025, 10(1), 35; https://doi.org/10.3390/biomimetics10010035 - 8 Jan 2025
Cited by 2 | Viewed by 1725
Abstract
This paper presents a novel soft crawling robot controlled by gesture recognition, aimed at enhancing the operability and adaptability of soft robots through natural human–computer interactions. The Leap Motion sensor is employed to capture hand gesture data, and Unreal Engine is used for gesture recognition. Using the UE4Duino, gesture semantics are transmitted to an Arduino control system, enabling direct control over the robot’s movements. For accurate and real-time gesture recognition, we propose a threshold-based method for static gestures and a backpropagation (BP) neural network model for dynamic gestures. In terms of design, the robot utilizes cost-effective thermoplastic polyurethane (TPU) film as the primary pneumatic actuator material. Through a positive and negative pressure switching circuit, the robot’s actuators achieve controllable extension and contraction, allowing for basic movements such as linear motion and directional changes. Experimental results demonstrate that the robot can successfully perform diverse motions under gesture control, highlighting the potential of gesture-based interaction in soft robotics. Full article
(This article belongs to the Special Issue Design, Actuation, and Fabrication of Bio-Inspired Soft Robotics)
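The threshold-based static-gesture method mentioned in this abstract can be sketched as a lookup over per-finger extension flags. Curl values, thresholds, and the gesture table below are invented for illustration, not the paper's values:

```python
def classify_static_gesture(finger_curls, curl_threshold=0.5):
    """Map per-finger curl values (0 = extended, 1 = fully bent) to a gesture label.

    A finger counts as extended when its curl is below the threshold.
    The threshold and gesture table are illustrative, not the paper's values.
    """
    extended = tuple(c < curl_threshold for c in finger_curls)  # thumb..pinky
    table = {
        (True, True, True, True, True): "open_palm",        # e.g. move forward
        (False, False, False, False, False): "fist",        # e.g. stop
        (False, True, True, False, False): "index_middle",  # e.g. turn
    }
    return table.get(extended, "unknown")

print(classify_static_gesture([0.1, 0.2, 0.1, 0.3, 0.2]))  # → open_palm
print(classify_static_gesture([0.9, 0.8, 0.9, 0.7, 0.8]))  # → fist
```

Dynamic gestures, which vary over time, are what the paper hands off to the BP neural network instead.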

22 pages, 13474 KB  
Article
Multimodal Human–Robot Interaction Using Gestures and Speech: A Case Study for Printed Circuit Board Manufacturing
by Ángel-Gabriel Salinas-Martínez, Joaquín Cunillé-Rodríguez, Elías Aquino-López and Angel-Iván García-Moreno
J. Manuf. Mater. Process. 2024, 8(6), 274; https://doi.org/10.3390/jmmp8060274 - 30 Nov 2024
Cited by 8 | Viewed by 6948
Abstract
In recent years, technologies for human–robot interaction (HRI) have undergone substantial advancements, facilitating more intuitive, secure, and efficient collaborations between humans and machines. This paper presents a decentralized HRI platform, specifically designed for printed circuit board manufacturing. The proposal incorporates many input devices, including gesture recognition via Leap Motion and Tap Strap, and speech recognition. The gesture recognition system achieved an average accuracy of 95.42% and 97.58% for each device, respectively. The speech control system, called Cellya, exhibited a markedly reduced Word Error Rate of 22.22% and a Character Error Rate of 11.90%. Furthermore, a scalable user management framework, the decentralized multimodal control server, employs biometric security to facilitate the efficient handling of multiple users, regulating permissions and control privileges. The platform’s flexibility and real-time responsiveness are achieved through advanced sensor integration and signal processing techniques, which facilitate intelligent decision-making and enable accurate manipulation of manufacturing cells. The results demonstrate the system’s potential to improve operational efficiency and adaptability in smart manufacturing environments. Full article
(This article belongs to the Special Issue Smart Manufacturing in the Era of Industry 4.0)

24 pages, 74134 KB  
Article
Upper and Lower Limb Training Evaluation System Based on Virtual Reality Technology
by Jian Zhao, Hanlin Gao, Chen Yang, Zhejun Kuang, Mingliang Liu, Zhuozheng Dang and Lijuan Shi
Sensors 2024, 24(21), 6909; https://doi.org/10.3390/s24216909 - 28 Oct 2024
Cited by 5 | Viewed by 3237
Abstract
Upper and lower limb rehabilitation training is essential for restoring patients’ physical movement ability and enhancing muscle strength and coordination. However, traditional rehabilitation training methods have limitations, such as high costs, low patient participation, and lack of real-time feedback. To address these problems, this paper designs and implements an upper and lower limb rehabilitation training evaluation system based on virtual reality technology, which provides patients with an immersive, interactive training environment intended to improve participation and rehabilitation outcomes. This study used Kinect 2.0 and Leap Motion sensors to capture patients’ motion data and transmit them to virtual training scenes. The system includes multiple virtual scenes designed for different upper and lower limb exercises, with a focus on hand function training. Through these scenes, patients can perform various movement training tasks, and the system provides real-time feedback based on the accuracy of the patient’s movements. The experimental results show that patients using the system exhibit higher participation and better rehabilitation training effects: compared with patients receiving traditional rehabilitation training, they significantly improved in movement accuracy and training participation. The system also provides personalized treatment information to medical personnel through data collection and analysis, promoting the systematization and personalization of rehabilitation training. This system is innovative and has broad application potential in the field of rehabilitation medicine. Full article
(This article belongs to the Section Biomedical Sensors)

9 pages, 296 KB  
Article
Virtual Reality-Based Assessment for Rehabilitation of the Upper Limb in Patients with Parkinson’s Disease: A Pilot Cross-Sectional Study
by Luciano Bissolotti, Justo Artiles-Sánchez, José Luís Alonso-Pérez, Josué Fernández-Carnero, Vanesa Abuín-Porras, Pierluigi Sinatti and Jorge Hugo Villafañe
Medicina 2024, 60(4), 555; https://doi.org/10.3390/medicina60040555 - 29 Mar 2024
Cited by 4 | Viewed by 3252
Abstract
Background and Objectives: This study aimed to examine the responsiveness and concurrent validity of a serious game and the correlation between serious game scores and upper limb (UL) performance in Parkinson’s Disease (PD) patients. Materials and Methods: Twenty-four consecutive upper limbs (14 males, 8 females, age: 55–83 years) of PD patients were assessed. The clinical assessment included: the Box and Block test (BBT), Nine-Hole Peg test (9HPT), and sub-scores of the Unified Parkinson’s Disease Rating-Scale Motor section (UPDRS-M) to assess UL disability. Performance scores obtained in two different tests (Ex. A and Ex. B, respectively, the Trolley test and Mushrooms test) based on Leap Motion (LM) sensors were used to study the correlations with clinical scores. Results: The subjective fatigue experienced during LM tests was measured by the Borg Rating of Perceived Exertion (RPE, 0–10); the BBT and 9HPT showed the highest correlation coefficients with UPDRS-M scores (ICCs: −0.652 and 0.712, p < 0.05). Exercise A (Trolley test) correlated with UPDRS-M (ICC: 0.31, p < 0.05), but not with the 9HPT and BBT tests (ICCs: −0.447 and 0.390, p < 0.05), while Exercise B (Mushroom test) correlated with UPDRS-M (ICC: −0.40, p < 0.05), as did these last two tests (ICCs: −0.225 and 0.272, p < 0.05). The mean RPE during LM tests was 3.4 ± 3.2. The evaluation of upper limb performance is feasible and does not induce relevant fatigue. Conclusions: The analysis of the ICC supports the use of Test B to evaluate UL disability and performance in PD patients, while Test A is mostly correlated with disability. Specifically designed serious games on LM can serve as a method of assessing impairment in the PD population. Full article
(This article belongs to the Section Neurology)
17 pages, 6082 KB  
Article
A Model of Multi-Finger Coordination in Keystroke Movement
by Jialuo Lin, Baihui Ding, Zilong Song, Zheng Li and Shengchao Li
Sensors 2024, 24(4), 1221; https://doi.org/10.3390/s24041221 - 14 Feb 2024
Viewed by 2608
Abstract
In multi-finger coordinated keystroke actions by professional pianists, movements are precisely regulated by multiple motor neural centers, exhibiting a certain degree of coordination in finger motions. This coordination enhances the flexibility and efficiency of professional pianists’ keystrokes. Research on the coordination of keystrokes in professional pianists is of great significance for guiding the movements of piano beginners and the motion planning of exoskeleton robots, among other fields. Currently, research on the coordination of multi-finger piano keystroke actions is still in its infancy. Scholars primarily focus on phenomenological analysis and theoretical description, which lack accurate and practical modeling methods. Considering that the tendon of the ring finger is closely connected to adjacent fingers, resulting in limited flexibility in its movement, this study concentrates on coordinated keystrokes involving the middle and ring fingers. A motion measurement platform is constructed, and Leap Motion is used to collect data from 12 professional pianists. A universal model applicable to multiple individuals for multi-finger coordination in keystroke actions based on the backpropagation (BP) neural network is proposed, which is optimized using a genetic algorithm (GA) and a sparrow search algorithm (SSA). The angular rotation of the ring finger’s MCP joint is selected as the model output, while the individual difference information and the angular data of the middle finger’s MCP joint serve as inputs. The individual difference information used in this study includes ring finger length, middle finger length, and years of piano training. The results indicate that the proposed SSA-BP neural network-based model demonstrates superior predictive accuracy, with a root mean square error of 4.8328°. Based on this model, the keystroke motion of the ring finger’s MCP joint can be accurately predicted from the middle finger’s keystroke motion information, offering an evaluative method and scientific guidance for the training of multi-finger coordinated keystrokes in piano learners. Full article
(This article belongs to the Special Issue Optical Instruments and Sensors and Their Applications)
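The 4.8328° figure this abstract reports is a root mean square error between predicted and measured joint angles. The metric itself is generic; a sketch with hypothetical MCP angles:

```python
import math

def rmse(predicted, measured):
    """Root mean square error between predicted and measured joint angles (degrees)."""
    return math.sqrt(sum((p - m) ** 2 for p, m in zip(predicted, measured))
                     / len(predicted))

# Hypothetical ring-finger MCP angles (degrees) over five keystroke frames.
predicted = [10.0, 22.0, 35.0, 28.0, 12.0]
measured  = [12.0, 20.0, 38.0, 25.0, 15.0]
print(round(rmse(predicted, measured), 3))  # → 2.646
```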

19 pages, 5262 KB  
Article
Bimanual Intravenous Needle Insertion Simulation Using Nonhomogeneous Haptic Device Integrated into Mixed Reality
by Jin Woo Kim, Jeremy Jarzembak and Kwangtaek Kim
Sensors 2023, 23(15), 6697; https://doi.org/10.3390/s23156697 - 26 Jul 2023
Cited by 17 | Viewed by 3928
Abstract
In this study, we developed a new haptic–mixed reality intravenous (HMR-IV) needle insertion simulation system, providing a bimanual haptic interface integrated into a mixed reality system with programmable variabilities considering real clinical environments. The system was designed for nursing students or healthcare professionals to practice IV needle insertion into a virtual arm with unlimited attempts under various changing insertion conditions (e.g., skin: color, texture, stiffness, friction; vein: size, shape, location depth, stiffness, friction). To achieve accurate hand–eye coordination under dynamic mixed reality scenarios, two different haptic devices (Dexmo and Geomagic Touch) and a standalone mixed reality system (HoloLens 2) were integrated and synchronized through multistep calibration for different coordinate systems (real world, virtual world, mixed reality world, haptic interface world, HoloLens camera). In addition, force-profile-based haptic rendering proposed in this study was able to successfully mimic the real tactile feeling of IV needle insertion. Further, a global hand-tracking method, combining two depth sensors (HoloLens and Leap Motion), was developed to accurately track a haptic glove and simulate grasping a virtual hand with force feedback. We conducted an evaluation study with 20 participants (9 experts and 11 novices) to measure the usability of the HMR-IV simulation system with user performance under various insertion conditions. The quantitative results from our own metric and qualitative results from the NASA Task Load Index demonstrate the usability of our system. Full article

15 pages, 4812 KB  
Article
A Novel Sensor Fusion Approach for Precise Hand Tracking in Virtual Reality-Based Human—Computer Interaction
by Yu Lei, Yi Deng, Lin Dong, Xiaohui Li, Xiangnan Li and Zhi Su
Biomimetics 2023, 8(3), 326; https://doi.org/10.3390/biomimetics8030326 - 22 Jul 2023
Cited by 17 | Viewed by 7904
Abstract
The rapidly evolving field of Virtual Reality (VR)-based Human–Computer Interaction (HCI) presents a significant demand for robust and accurate hand tracking solutions. Current technologies, predominantly based on single-sensing modalities, fall short in providing comprehensive information capture due to susceptibility to occlusions and environmental factors. In this paper, we introduce a novel sensor fusion approach combined with a Long Short-Term Memory (LSTM)-based algorithm for enhanced hand tracking in VR-based HCI. Our system employs six Leap Motion controllers, two RealSense depth cameras, and two Myo armbands to yield a multi-modal data capture. This rich data set is then processed using LSTM, ensuring the accurate real-time tracking of complex hand movements. The proposed system provides a powerful tool for intuitive and immersive interactions in VR environments. Full article
(This article belongs to the Special Issue Computer-Aided Biomimetics)

19 pages, 8010 KB  
Article
Enhancing sEMG-Based Finger Motion Prediction with CNN-LSTM Regressors for Controlling a Hand Exoskeleton
by Mirco Vangi, Chiara Brogi, Alberto Topini, Nicola Secciani and Alessandro Ridolfi
Machines 2023, 11(7), 747; https://doi.org/10.3390/machines11070747 - 17 Jul 2023
Cited by 14 | Viewed by 3767
Abstract
In recent years, the number of people with disabilities has increased hugely, especially in low- and middle-income countries. At the same time, robotics has made significant advances in the medical field, and many research groups have begun to develop low-cost wearable solutions. The Mechatronics and Dynamic Modelling Lab of the Department of Industrial Engineering at the University of Florence has recently developed a new version of a wearable hand exoskeleton for assistive purposes. In this paper, we will present a new regression method to predict the finger angle position of the first joint from the value of the sEMG of the forearm and the previous position of the finger itself. To acquire the dataset necessary to train the regressor a specific graphical user interface was developed which was able to acquire sEMG data from a Myo armband and the finger position from a Leap Motion Controller. Two long short-term memory (LSTM) models were compared, one in its standard configuration and the other with a convolutional layer, yielding significantly better performance for the second one, with an increase in R2 coefficient from an average value of 0.746 to 0.825, leading to the conclusion that a convolutional layer could increase performance when few sensors are available. Full article
(This article belongs to the Special Issue Design and Control of Wearable Mechatronics Devices)
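The R² comparison in this abstract (0.746 for the plain LSTM vs. 0.825 with a convolutional layer) uses the standard coefficient of determination. A sketch with invented target and model outputs, only to show how the metric ranks two regressors:

```python
def r_squared(y_true, y_pred):
    """Coefficient of determination R^2 for a regression."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Hypothetical finger-angle targets vs. outputs of two models.
y_true    = [10.0, 20.0, 30.0, 40.0, 50.0]
lstm      = [13.0, 18.0, 33.0, 37.0, 52.0]
cnn_lstm  = [11.0, 19.5, 31.0, 39.0, 50.5]

print(round(r_squared(y_true, lstm), 3), round(r_squared(y_true, cnn_lstm), 3))
```

A higher R² means the model explains more of the variance in the measured finger position, which is how the paper quantifies the benefit of the convolutional layer.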

13 pages, 3775 KB
Brief Report
Analysis of the Leap Motion Controller Workspace for HRI Gesture Applications
by Michal Tölgyessy, Martin Dekan, Jozef Rodina and František Duchoň
Appl. Sci. 2023, 13(2), 742; https://doi.org/10.3390/app13020742 - 5 Jan 2023
Cited by 7 | Viewed by 4385
Abstract
The Leap Motion Controller is a sensor for precise hand tracking; it is a device used for human interaction with computer systems via gestures. The study presented in this paper evaluates its workspace in real-world conditions. An exact replica of a human operator’s hand was used to measure the sensor’s precision, and therefore determine its hand tracking abilities in varying positions above the sensor. The replica was moved randomly across the workspace defined by the manufacturer, and precision was measured in each position. The hand model was placed in the furthest distances from the sensor to find every position where the sensor was still capable of tracking. We found the dimensions of the workspace in some cases exceeded the datasheet values; in other cases, the real workspace was smaller than the proclaimed one. We also computed precision in all positions, which shows tracking reliability. This study serves researchers developing HMI and HRI algorithms as a reference for the real dimensions of the Leap Motion Controller workspace as it provides extra and more precise information compared to the datasheet. Full article

24 pages, 39938 KB  
Article
Robust Identification System for Spanish Sign Language Based on Three-Dimensional Frame Information
by Jesús Galván-Ruiz, Carlos M. Travieso-González, Alejandro Pinan-Roescher and Jesús B. Alonso-Hernández
Sensors 2023, 23(1), 481; https://doi.org/10.3390/s23010481 - 2 Jan 2023
Cited by 9 | Viewed by 3283
Abstract
Nowadays, according to the World Health Organization (WHO), a significant proportion of the world’s population suffers from a hearing disorder that makes oral communication with other people challenging. At the same time, in an era of technological evolution and digitization, designing tools that can help these people communicate daily is the basis of much scientific research, such as that discussed herein. This article describes one of the techniques designed to transcribe Spanish Sign Language (SSL). A Leap Motion volumetric sensor has been used in this research due to its capacity to recognize hand movements in three dimensions. In order to carry out this research project, a hearing-impaired subject collaborated in the recording of 176 dynamic words. Finally, for the development of the research, Dynamic Time Warping (DTW) was used to compare the samples and predict the input with an accuracy of 95.17%. Full article
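Dynamic Time Warping, the matching technique this paper uses, aligns two time series of unequal length and speed before comparing them. A minimal 1-D sketch with invented palm-height traces (the paper works on 3D frame data, and the signs here are hypothetical), classifying a query by its nearest template:

```python
def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D feature sequences."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three admissible warping steps.
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

# Hypothetical palm-height traces: the query is a time-warped version of sign A.
sign_a = [0.0, 1.0, 2.0, 3.0, 2.0, 1.0]
sign_b = [3.0, 2.0, 1.0, 0.0, 1.0, 2.0]
query  = [0.0, 1.0, 1.0, 2.0, 3.0, 3.0, 2.0, 1.0]

best = min(("A", dtw_distance(query, sign_a)),
           ("B", dtw_distance(query, sign_b)),
           key=lambda t: t[1])
print(best[0])  # → A
```

Because DTW tolerates differences in signing speed, the same word performed slower or faster still matches its template, which is what makes it suitable for dynamic sign vocabularies.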

18 pages, 4701 KB  
Article
Analysis of the Leap Motion Controller’s Performance in Measuring Wrist Rehabilitation Tasks Using an Industrial Robot Arm Reference
by Rogério S. Gonçalves, Marcus R. S. B. de Souza and Giuseppe Carbone
Sensors 2022, 22(13), 4880; https://doi.org/10.3390/s22134880 - 28 Jun 2022
Cited by 13 | Viewed by 5460
Abstract
The Leap Motion Controller (LMC) is a low-cost markerless optical sensor that performs measurements of various parameters of the hands that has been investigated for a wide range of different applications. Research attention still needs to focus on the evaluation of its precision and accuracy to fully understand its limitations and widen its range of applications. This paper presents the experimental validation of the LMC device to verify the feasibility of its use in assessing and tailoring wrist rehabilitation therapy for the treatment of physical disabilities through continuous exercises and integration with serious gaming environments. An experimental set up and analysis is proposed using an industrial robot as motion reference. The high repeatability of the selected robot is used for comparisons with the measurements obtained via a leap motion controller while performing the basic movements needed for rehabilitation exercises of the human wrist. Experimental tests are analyzed and discussed to demonstrate the feasibility of using the leap motion controller for wrist rehabilitation. Full article
(This article belongs to the Collection Survey on Research of Sensors and Robot Control)
