American Sign Language Recognition Using Leap Motion Controller with Machine Learning Approach
Abstract
1. Introduction
2. System Overview
3. Method
3.1. Data Collection
3.2. Feature Extraction
3.3. Features Preprocessing
- C1: D + A + R + L (S excluded)
- C2: S + A + R + L (D excluded)
- C3: S + D + R + L (A excluded)
- C4: S + D + A + L (R excluded)
- C5: S + D + A + R (L excluded)
- C6: S + D + A + R + L (all included)
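The six combinations above amount to a leave-one-group-out ablation over the five feature groups (S = 3, D = 5, A = 4, R = 1, L = 10 features, as listed in the feature-group table later in this excerpt). A minimal Python sketch of how such combinations could be assembled is shown below; the placeholder arrays and the group sizes wired into them are assumptions for illustration, not the authors' code.

```python
import numpy as np

# Hypothetical per-group feature arrays for one sample; sizes follow the
# feature-group table (S=3, D=5, A=4, R=1, L=10), values are placeholders.
groups = {
    "S": np.zeros(3), "D": np.zeros(5), "A": np.zeros(4),
    "R": np.zeros(1), "L": np.zeros(10),
}
all_groups = ["S", "D", "A", "R", "L"]

# C1..C5 each exclude one group (in the order S, D, A, R, L); C6 keeps all five.
combo_spec = {f"C{i + 1}": [g for g in all_groups if g != excluded]
              for i, excluded in enumerate(all_groups)}
combo_spec["C6"] = all_groups

# Concatenate the kept groups into one feature vector per combination.
feature_vectors = {name: np.concatenate([groups[g] for g in kept])
                   for name, kept in combo_spec.items()}

for name, vec in feature_vectors.items():
    print(name, vec.shape)   # e.g. C1 -> (20,), C6 -> (23,)
```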
3.4. Classification and Validation
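Since the body of this section is not reproduced in this excerpt, the sketch below illustrates leave-one-subject-out cross-validation with an SVM in scikit-learn (the library cited in the references), matching the validation scheme reported for the proposed work. The data arrays, the RBF kernel, and all hyperparameters are placeholders, not the authors' settings.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data: X holds one feature vector per sample, y the letter/digit
# label, and groups the subject ID used for leave-one-subject-out splitting.
rng = np.random.default_rng(0)
X = rng.normal(size=(360, 23))          # e.g. 12 subjects x 30 samples, 23 features
y = rng.integers(0, 36, size=360)       # 36 classes (A-Z and 0-9)
groups = np.repeat(np.arange(12), 30)   # subject ID per sample

# Each fold trains on 11 subjects and tests on the single held-out subject.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(model, X, y, groups=groups, cv=LeaveOneGroupOut())
print("Per-subject accuracy:", np.round(scores, 3))
print("Mean accuracy:", scores.mean())
```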
4. Results and Discussion
5. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
Ethical Statements
References
- Cheok, M.J.; Omar, Z.; Jaward, M.H. A review of hand gestures and sign language recognition techniques. Int. J. Mach. Learn. Cybern. 2017, 8, 1–23. [Google Scholar] [CrossRef]
- Lee, B.G.; Lee, S.M. Smart Wearable Hand Device for Sign Language Interpretation System with Sensors Fusion. IEEE Sens. J. 2018, 18, 1224–1232. [Google Scholar] [CrossRef]
- Preetham, C.; Ramakrishnan, G.; Kumar, S.; Tamse, A.; Krishnapura, H. Hand Talk-Implementation of a Gesture Recognition Glove. In Proceedings of the 2013 Texas Instruments India Educators’ Conference, Bangalore, India, 4–6 April 2013; pp. 328–331. [Google Scholar]
- Wang, J.; Zhang, T. An ARM-Based Embedded Gesture Recognition System Using a Data Glove. In Proceedings of the 26th Chinese Control and Decision Conference (CCDC), Changsha, China, 31 May–2 June 2014; pp. 1580–1584. [Google Scholar]
- Wu, J.; Sun, L.; Jafari, R. A Wearable System for Recognizing American Sign Language in Real-Time Using IMU and Surface EMG Sensors. IEEE J. Biomed. Health Inf. 2016, 20, 1281–1290. [Google Scholar] [CrossRef] [PubMed]
- Cheng, J.; Chen, X.; Liu, A.; Peng, H. A Novel Phonology- and Radical-Coded Chinese Sign Language Recognition Framework Using Accelerometer and Surface Electromyography Sensors. Sensors 2015, 15, 23303–23324. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Shukor, A.Z.; Miskon, M.F.; Jamaluddin, M.H.; Ali@Ibrahim, F.; Asyraf, M.F.; Bahar, M.B. A New Glove Approach for Malaysian Sign Language Detection. In Proceedings of the 2015 IEEE International Symposium on Robotics and Intelligent Sensors (IRIS 2015), Langkawi, Malaysia, 18–20 October 2015; pp. 60–67. [Google Scholar]
- Mummadi, C.K.; Leo, F.P.P.; Verma, K.D.; Kasireddy, S.; Scholl, P.M.; Kempfle, J.; Laerhoven, K.V. Real-Time and Embedded Detection of Hand Gesture with an IMU-Based Glove. Informatics 2018, 5, 28. [Google Scholar] [CrossRef]
- Roh, M.C.; Lee, S.W. Human gesture recognition using a simplified dynamic Bayesian network. Multimed. Syst. 2015, 21, 557–568. [Google Scholar] [CrossRef]
- Bheda, V.; Radpour, D. Using Deep Convolutional Networks for Gesture Recognition in American Sign Language. arXiv. 2017. Available online: https://arxiv.org/abs/1710.06836v3 (accessed on 15 August 2018).
- Pan, T.; Lo, L.; Yeh, C.; Li, J.; Liu, H.; Hu, M. Real-Time Sign Language Recognition in Complex Background Scene Based on a Hierarchical Clustering Classification Method. In Proceedings of the 2016 IEEE Second International Conference on Multimedia Big Data (BigMM), Taipei, Taiwan, 20–22 April 2016; pp. 64–67. [Google Scholar] [CrossRef]
- Kang, B.; Tripathi, S.; Nguyen, T.Q. Real-time Sign Language Fingerspelling Recognition using Convolutional Neural Network from Depth map. arXiv. 2015. Available online: https://arxiv.org/abs/1509.03001v3 (accessed on 9 August 2018).
- Leap Motion. Available online: https://www.leapmotion.com/ (accessed on 5 July 2018).
- Kinect for Windows. Available online: https://developer.microsoft.com/en-us/windows/kinect (accessed on 5 July 2018).
- Sykora, P.; Kamencay, P.; Hudec, R. Comparison of SIFT and SURF Methods for Use on Hand Gesture Recognition Based on Depth Map. AASRI Procedia 2014, 9, 19–24. [Google Scholar] [CrossRef]
- Huang, F.; Huang, S. Interpreting American Sign Language with Kinect. J. Deaf Stud. Educ. 2011, 20, 1281–1290. [Google Scholar]
- Chai, X.; Li, G.; Lin, Y.; Xu, Z.; Tang, Y.; Chen, X.; Zhou, M. Sign Language Recognition and Translation with Kinect. In Proceedings of the IEEE International Conference on Automatic Face and Gesture Recognition, Shanghai, China, 22–26 April 2013. [Google Scholar]
- Yang, H.D. Sign Language Recognition with the Kinect Sensor Based on Conditional Random Fields. Sensors 2015, 15, 135–147. [Google Scholar] [CrossRef] [PubMed]
- Khelil, B.; Amiri, H. Hand Gesture Recognition Using Leap Motion Controller for Recognition of Arabic Sign Language. In Proceedings of the 3rd International Conference on Automation, Control, Engineering and Computer Science (ACECS’ 16), Hammamet, Tunisia, 20–22 March 2016. [Google Scholar]
- Du, Y.; Liu, S.; Feng, L.; Chen, M.; Wu, J. Hand Gesture Recognition with Leap Motion. arXiv. 2017. Available online: https://arxiv.org/abs/1711.04293 (accessed on 5 July 2018).
- Kumar, P.; Gauba, H.; Roy, P.P.; Dogra, D.P. A multimodal framework for sensor based sign language recognition. Neurocomputing 2017, 259, 21–38. [Google Scholar] [CrossRef]
- Funasaka, M.; Ishikawa, Y.; Takata, M.; Joe, K. Sign Language Recognition using Leap Motion. In Proceedings of the International Conference on Parallel and Distributed Processing Techniques and Applications (PDPTA), Las Vegas, NV, USA, 25–28 July 2016; pp. 263–269. [Google Scholar]
- Mapari, R.B.; Kharat, G. American Static Sign Recognition Using Leap Motion Sensor. In Proceedings of the Second International Conference on Information and Communication Technology for Competitive Strategies 2016 (ICTCS ’16), Udaipur, India, 4–5 March 2016. [Google Scholar]
- Marin, G.; Dominio, F.; Zanuttigh, P. Hand gesture recognition with Leap Motion and Kinect devices. In Proceedings of the 2014 IEEE International Conference on Image Processing (ICIP), Paris, France, 27–30 October 2014; pp. 1565–1569. [Google Scholar]
- Lane, H.L.; Grosjean, F. Recent Perspectives on American Sign Language; Psychology Press Ltd.: New York, NY, USA, 2017. [Google Scholar]
- Scikit-learn: Machine Learning in Python. Available online: https://scikit-learn.org/stable/index.html (accessed on 31 August 2018).
- TensorFlow: An Open Source Machine Learning Framework for Everyone. Available online: https://www.tensorflow.org/ (accessed on 31 August 2018).
- Arlot, S.; Celisse, A. A survey of cross-validation procedures for model selection. Stat. Surv. 2010, 4, 40–79. [Google Scholar] [CrossRef] [Green Version]
- Marin, G.; Dominio, F.; Zanuttigh, P. Hand gesture recognition with jointly calibrated Leap Motion and depth sensor. Multimed. Tools Appl. 2016, 22, 14991–15015. [Google Scholar] [CrossRef]
- Simos, M.; Nikolaidis, N. Greek sign language alphabet recognition using the Leap Motion device. In Proceedings of the 9th Hellenic Conference on Artificial Intelligence, Thessaloniki, Greece, 18–20 May 2016; p. 34. [Google Scholar]
- Chuan, C.; Regina, E.; Guardino, C. American Sign Language Recognition Using Leap Motion Sensor. In Proceedings of the 13th International Conference on Machine Learning and Applications (ICMLA), Detroit, MI, USA, 3–6 December 2014; pp. 541–544. [Google Scholar]
Group | Feature | # of Features
---|---|---
S | Standard deviation of palm position | 3
R | Hand palm curvature radius | 1
D | Distance between palm center and each fingertip | 5
A | Angle between two adjacent fingertips | 4
L | Distance between each pair of fingertips | 10
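As a companion to the feature groups listed above, the following NumPy sketch shows one way the S, R, D, A, and L features could be computed from raw palm and fingertip coordinates. The array shapes, the choice of reference frame, and the angle convention (angle between palm-to-fingertip vectors of adjacent fingers) are assumptions, since this excerpt does not specify them.

```python
import numpy as np
from itertools import combinations

def extract_features(palm_positions, fingertips, palm_radius):
    """Compute the S, R, D, A, L feature groups for one gesture sample.

    palm_positions : (T, 3) palm-center coordinates over T frames
    fingertips     : (5, 3) fingertip coordinates (thumb..pinky) of one frame
    palm_radius    : float, palm curvature radius reported by the device
    (Shapes and conventions are assumptions for illustration.)
    """
    palm_positions = np.asarray(palm_positions, dtype=float)
    fingertips = np.asarray(fingertips, dtype=float)
    palm_center = palm_positions[-1]   # reference frame: most recent palm center

    # S: standard deviation of the palm position along x, y, z (3 features)
    S = palm_positions.std(axis=0)

    # R: palm curvature radius as reported by the sensor (1 feature)
    R = np.array([palm_radius])

    # D: Euclidean distance from the palm center to each fingertip (5 features)
    D = np.linalg.norm(fingertips - palm_center, axis=1)

    # A: angle between palm-to-fingertip vectors of adjacent fingers (4 features)
    unit = (fingertips - palm_center) / np.linalg.norm(
        fingertips - palm_center, axis=1, keepdims=True)
    A = np.array([np.arccos(np.clip(np.dot(unit[i], unit[i + 1]), -1.0, 1.0))
                  for i in range(4)])

    # L: distance between every pair of fingertips (C(5, 2) = 10 features)
    L = np.array([np.linalg.norm(fingertips[i] - fingertips[j])
                  for i, j in combinations(range(5), 2)])

    return np.concatenate([S, R, D, A, L])   # 3 + 1 + 5 + 4 + 10 = 23 features
```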
Combination | SVM, 26 Classes (%) | DNN, 26 Classes (%) | SVM, 36 Classes (%) | DNN, 36 Classes (%) | Average (%)
---|---|---|---|---|---
C1 | 74.26 | 87.84 | 67.85 | 83.29 | 78.31
C2 | 74.57 | 92.77 | 67.08 | 87.38 | 80.45 |
C3 | 75.35 | 88.61 | 67.94 | 83.89 | 78.95 |
C4 | 80.30 | 93.29 | 72.04 | 87.35 | 83.25 |
C5 | 68.81 | 87.15 | 57.53 | 83.18 | 74.17 |
C6 | 79.73 | 93.81 | 72.79 | 88.79 | 83.78 |
Average | 75.50 | 90.58 | 67.54 | 85.65 | - |
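In the table above the DNN outperforms the SVM for every feature combination; the sketch below shows a minimal fully connected classifier in TensorFlow/Keras (the framework cited in the references) over a 23-dimensional feature vector. The layer sizes, activations, optimizer, and training schedule are assumptions for illustration, not the architecture reported by the authors.

```python
import numpy as np
import tensorflow as tf

NUM_FEATURES = 23   # S + R + D + A + L (combination C6); adjust for C1-C5
NUM_CLASSES = 36    # A-Z and 0-9

# Hypothetical fully connected network; sizes and settings are placeholders.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(NUM_FEATURES,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Placeholder data standing in for the extracted Leap Motion features.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(330, NUM_FEATURES)).astype("float32")
y_train = rng.integers(0, NUM_CLASSES, size=330)
X_test = rng.normal(size=(30, NUM_FEATURES)).astype("float32")
y_test = rng.integers(0, NUM_CLASSES, size=30)

model.fit(X_train, y_train, epochs=20, batch_size=32, verbose=0)
loss, acc = model.evaluate(X_test, y_test, verbose=0)
print(f"Held-out accuracy: {acc:.3f}")
```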
Letters | SVM Se (%), 36 Classes | SVM Sp (%), 36 Classes | DNN Se (%), 36 Classes | DNN Sp (%), 36 Classes | SVM Se (%), 26 Classes | SVM Sp (%), 26 Classes | DNN Se (%), 26 Classes | DNN Sp (%), 26 Classes
---|---|---|---|---|---|---|---|---
A | 99.50 | 98.81 | 86.31 | 99.59 | 94.17 | 99.14 | 90.82 | 99.41 |
B | 83.33 | 99.76 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 |
C | 100.00 | 99.55 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 |
D | 16.17 | 99.48 | 62.33 | 99.00 | 74.33 | 99.32 | 89.67 | 99.81 |
E | 63.50 | 99.03 | 98.67 | 99.93 | 83.83 | 99.58 | 99.00 | 99.99 |
F | 75.00 | 99.49 | 79.33 | 99.80 | 100.00 | 99.99 | 100.00 | 100.00 |
G | 91.67 | 98.71 | 89.17 | 99.85 | 91.67 | 99.58 | 90.00 | 99.89 |
H | 52.67 | 99.00 | 69.67 | 99.22 | 41.67 | 99.59 | 72.17 | 99.77 |
I | 100.00 | 99.75 | 100.00 | 100.00 | 100.00 | 99.91 | 100.00 | 99.99 |
J | 94.83 | 99.99 | 99.67 | 99.90 | 94.33 | 99.99 | 99.67 | 100.00
K | 75.00 | 99.05 | 93.17 | 99.84 | 91.67 | 99.55 | 99.50 | 99.97 |
L | 100.00 | 99.75 | 100.00 | 100.00 | 100.00 | 99.90 | 99.83 | 99.99 |
M | 91.67 | 98.33 | 96.67 | 99.79 | 100.00 | 99.50 | 95.17 | 99.90 |
N | 58.33 | 99.64 | 96.33 | 99.71 | 66.67 | 99.80 | 93.67 | 99.91 |
O | 50.00 | 98.12 | 64.50 | 99.20 | 83.33 | 99.90 | 99.33 | 99.99 |
P | 70.50 | 98.66 | 81.17 | 99.62 | 58.33 | 99.58 | 86.67 | 99.90 |
Q | 79.17 | 99.50 | 96.33 | 99.65 | 70.83 | 99.79 | 92.17 | 99.91 |
R | 87.33 | 98.95 | 96.67 | 99.82 | 75.00 | 99.37 | 100.00 | 99.85 |
S | 33.33 | 99.28 | 64.33 | 99.55 | 33.33 | 99.67 | 65.83 | 99.83 |
T | 91.67 | 99.52 | 98.67 | 99.72 | 91.67 | 99.67 | 94.83 | 99.90 |
U | 8.33 | 99.62 | 71.00 | 99.04 | 8.50 | 99.72 | 70.00 | 99.72 |
V | 58.33 | 99.28 | 74.17 | 99.24 | 80.67 | 99.34 | 96.33 | 99.95 |
W | 58.33 | 98.57 | 89.50 | 99.48 | 100.00 | 99.99 | 100.00 | 100.00 |
X | 41.67 | 99.61 | 93.00 | 99.95 | 50.00 | 99.99 | 98.17 | 99.96 |
Y | 100.00 | 99.99 | 100.00 | 100.00 | 100.00 | 99.99 | 100.00 | 100.00 |
Z | 97.83 | 99.97 | 96.17 | 99.99 | 97.83 | 99.99 | 99.67 | 99.99 |
0 | 17.50 | 99.02 | 71.33 | 99.00 | - | - | - | - |
1 | 61.83 | 98.08 | 76.50 | 99.05 | - | - | - | - |
2 | 83.33 | 98.22 | 77.33 | 99.39 | - | - | - | - |
3 | 74.67 | 99.21 | 96.83 | 99.80 | - | - | - | - |
4 | 78.00 | 99.66 | 97.67 | 99.99 | - | - | - | - |
5 | 100.00 | 99.61 | 100.00 | 100.00 | - | - | - | - |
6 | 57.17 | 99.05 | 81.83 | 99.70 | - | - | - | - |
7 | 100.00 | 99.92 | 100.00 | 100.00 | - | - | - | - |
8 | 95.50 | 99.73 | 100.00 | 100.00 | - | - | - | - |
9 | 74.33 | 99.10 | 92.83 | 99.41 | - | - | - | - |
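Per-class sensitivity (Se) and specificity (Sp) such as those tabulated above follow from a one-vs-rest reading of the multi-class confusion matrix: Se = TP / (TP + FN) and Sp = TN / (TN + FP) for each class. A minimal scikit-learn sketch is given below; the labels and predictions are placeholders.

```python
import numpy as np
from sklearn.metrics import confusion_matrix

def per_class_sensitivity_specificity(y_true, y_pred, labels):
    """Return (sensitivity, specificity) arrays with one entry per class."""
    cm = confusion_matrix(y_true, y_pred, labels=labels).astype(float)
    tp = np.diag(cm)
    fn = cm.sum(axis=1) - tp        # samples of the class that were missed
    fp = cm.sum(axis=0) - tp        # samples wrongly assigned to the class
    tn = cm.sum() - (tp + fn + fp)  # everything else
    return tp / (tp + fn), tn / (tn + fp)

# Example with placeholder predictions over three letters.
labels = ["A", "B", "C"]
y_true = ["A", "B", "B", "C", "C", "C"]
y_pred = ["A", "B", "C", "C", "C", "B"]
se, sp = per_class_sensitivity_specificity(y_true, y_pred, labels)
for lab, s, p in zip(labels, se, sp):
    print(f"{lab}: Se = {s * 100:.2f}%, Sp = {p * 100:.2f}%")
```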
Author | Gesture | Dataset (M people × N repetitions per letter per person) | Cross-Validation | Features | Classifier | Accuracy (%) |
---|---|---|---|---|---|---|
Khelil et al. [19] | 10 ArSL digit gestures | 10 people × 10 sets | - | Angle between two fingertips, angle between fingertips and the hand’s normal, distance from the hand center to each fingertip | SVM | 91.3
Du et al. [20] | 10 digit gestures | 13 people × 20 sets | 80% training set and 20% testing set, experiment repeated 50 times | Fingertips angle (A), fingertips distance (D), fingertips elevation (E), fingertips tip distance (T) | SVM | 83.36
 | | | | A + D + E + T + HOG | SVM | 99.42
Funasaka et al. [22] | 24 ASL gestures (static) | - | - | Palm normal vector, fingertip positions, arm direction, and fingertip directions | Decision Tree | 82.7
Marin et al. [24] | 10 ASL gestures | 14 people × 10 sets | Leave-one-subject-out cross-validation | Fingertips angle, fingertips distance, and fingertips elevation | SVM | 80.86
Chuan et al. [31] | 26 ASL gestures | 2 people × 2 sets | 4-fold cross-validation | Pinch strength, grab strength, average distance, average spread, average tri-spread, extended distance, dip-tip projection, OrderX, and angle | k-Nearest Neighbors / SVM | 72.78 / 79.83
Mapari et al. [23] | 32 ASL gestures (J, Z, 2 and 6 excluded) | 146 people × 1 set | 90% training set and 10% testing set | Finger positions, palm position, distance between positions, angles between positions | MLP | 90
Proposed work | 36 ASL gestures | 12 people × 1 set | Leave-one-subject-out cross-validation | Standard deviation of palm positions, hand palm curvature radius, distance between palm center and each fingertip, angle between two adjacent fingertips, distance between each pair of fingertips | DNN | 93.81 (26 classes) / 88.79 (36 classes)
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).