Article

Development of Real-Time Hand Gesture Recognition for Tabletop Holographic Display Interaction Using Azure Kinect

1 Department of Electronic Engineering, Kwangwoon University, Seoul 01897, Korea
2 Electronics and Telecommunications Research Institute (ETRI), Daejeon 34129, Korea
3 Graduate School of Smart Convergence, Kwangwoon University, Seoul 01897, Korea
* Author to whom correspondence should be addressed.
Sensors 2020, 20(16), 4566; https://doi.org/10.3390/s20164566
Received: 8 July 2020 / Revised: 4 August 2020 / Accepted: 12 August 2020 / Published: 14 August 2020
(This article belongs to the Special Issue Sensor Systems for Gesture Recognition)
Using human gestures to interact with devices such as computers and smartphones presents several challenges. This form of interaction relies on gesture interaction technology such as Leap Motion (Leap Motion, Inc.), which enables users to control a computer with hand gestures; it offers excellent hand-detection performance and even allows simple games to be played gesturally. Another example is the contactless use of a smartphone to take a photograph, triggered simply by closing and opening the palm. Research on interacting with other devices via hand gestures is in progress, and studies on creating holographic displays of real objects are likewise underway. We propose a hand gesture recognition system that can control a tabletop holographic display based on an actual object. A depth image captured by the Azure Kinect, a recent Time-of-Flight depth camera, is processed by the deep-learning model CrossInfoNet to obtain information about the hand and hand joints. Using this information, we developed a real-time system that defines and recognizes gestures for basic left, right, up, and down rotation, zoom in, zoom out, and continuous rotation to the left and right.
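To make the pipeline concrete, the final step described in the abstract — mapping tracked hand-joint information to discrete gesture labels — might look like the following minimal sketch. The function name, the displacement thresholds, and the use of palm position and thumb-index pinch distance are all illustrative assumptions, not the paper's actual method or parameters.

```python
# Hypothetical thresholds; the paper's actual values are not given in the abstract.
SWIPE_MM = 80.0   # minimum palm displacement (mm) to count as a directional swipe
ZOOM_MM = 30.0    # minimum change in thumb-index distance (mm) to count as a zoom

def classify_gesture(palm_track, pinch_start, pinch_end):
    """Map a tracked palm path and pinch distances to one of the gesture
    labels described in the abstract (left/right/up/down, zoom in/out).

    palm_track  -- list of (x, y, z) palm positions in mm over the gesture window
    pinch_start -- thumb-index distance (mm) at the start of the window
    pinch_end   -- thumb-index distance (mm) at the end of the window
    """
    # Zoom takes priority: a widening pinch zooms in, a narrowing one zooms out.
    dz = pinch_end - pinch_start
    if abs(dz) >= ZOOM_MM:
        return "zoom in" if dz > 0 else "zoom out"

    # Otherwise classify the dominant axis of palm motion as a swipe.
    x0, y0, _ = palm_track[0]
    x1, y1, _ = palm_track[-1]
    dx, dy = x1 - x0, y1 - y0
    if max(abs(dx), abs(dy)) < SWIPE_MM:
        return "none"
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "up" if dy > 0 else "down"
```

In a real system these labels would then drive the holographic display controller (e.g., rotating or scaling the rendered object); per-frame joint positions would come from CrossInfoNet running on Azure Kinect depth frames.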
Keywords: azure kinect; deep-learning; gesture interaction; hand detection; hologram display
MDPI and ACS Style

Lee, C.; Kim, J.; Cho, S.; Kim, J.; Yoo, J.; Kwon, S. Development of Real-Time Hand Gesture Recognition for Tabletop Holographic Display Interaction Using Azure Kinect. Sensors 2020, 20, 4566. https://doi.org/10.3390/s20164566

