Open Access Article

American Sign Language Recognition Using Leap Motion Controller with Machine Learning Approach

Department of Electronics Engineering, Keimyung University, Daegu 42601, Korea
*
Author to whom correspondence should be addressed.
Sensors 2018, 18(10), 3554; https://doi.org/10.3390/s18103554
Received: 11 September 2018 / Revised: 16 October 2018 / Accepted: 17 October 2018 / Published: 19 October 2018
(This article belongs to the Section Physical Sensors)
Sign language is designed to allow deaf and mute communities to convey messages and to connect with society. Unfortunately, sign language is not widely learned or practiced outside these communities; hence, this study developed a sign language recognition prototype using the Leap Motion Controller (LMC). Many existing studies have proposed methods for partial sign language recognition, whereas this study aimed for full American Sign Language (ASL) recognition, covering 26 letters and 10 digits. Most ASL letters are static (no movement), but certain ASL letters are dynamic (they require specific movements). Thus, this study also extracted features from finger and hand motions to differentiate between static and dynamic gestures. The experimental results revealed that the recognition rates for the 26 letters using a support vector machine (SVM) and a deep neural network (DNN) are 80.30% and 93.81%, respectively. Meanwhile, the recognition rates for the combined set of 26 letters and 10 digits are slightly lower, approximately 72.79% for the SVM and 88.79% for the DNN. As a result, the sign language recognition system has great potential for bridging the gap between deaf and mute communities and others. The proposed prototype could also serve as an interpreter for deaf and mute users in everyday service settings, such as banks or post offices.
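The abstract describes multi-class classification of hand-feature vectors extracted from the LMC. The sketch below is a minimal illustration of that idea using scikit-learn; the feature layout, synthetic data, and SVM hyperparameters are assumptions for demonstration, not the authors' actual pipeline or results.

```python
# Minimal sketch: multi-class SVM classification of hand-feature vectors,
# as in the paper's 36-class (26 letters + 10 digits) setup.
# The data here are synthetic stand-ins for LMC features (e.g., fingertip
# positions and joint angles), NOT the authors' dataset.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_classes, n_per_class, n_features = 36, 30, 60

# One Gaussian cluster per gesture class, standing in for feature vectors.
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(n_per_class, n_features))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# SVC handles multi-class problems via one-vs-one decomposition internally.
clf = SVC(kernel="rbf", C=10.0, gamma="scale")
clf.fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
print(f"test accuracy: {acc:.2f}")
```

A DNN classifier, as compared in the paper, would replace `SVC` with a multi-layer network trained with a softmax output over the 36 classes.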
Keywords: human-computer interaction; machine learning; sign language recognition; Leap Motion Controller; support vector machine; deep neural network; multi-class classification; American Sign Language
MDPI and ACS Style

Chong, T.-W.; Lee, B.-G. American Sign Language Recognition Using Leap Motion Controller with Machine Learning Approach. Sensors 2018, 18, 3554.

