Article

No Interface, No Problem: Gesture Recognition on Physical Objects Using Radar Sensing

1 Faculty of Mathematics, Natural Sciences and Information Technologies (FAMNIT), University of Primorska, Glagoljaška 8, 6000 Koper, Slovenia
2 Department of Computer Science, University of Luxembourg, Maison du Nombre 6, Avenue de la Fonte, L-4364 Esch-sur-Alzette, Luxembourg
3 School of Creative Media, City University of Hong Kong, Hong Kong, China
4 Department of Information Science, University of Otago, P.O. Box 56, Dunedin 9054, New Zealand
5 Graduate School of Science and Technology, Nara Institute of Science and Technology, Takayama 8916-5, Ikoma, Nara, Japan
* Author to whom correspondence should be addressed.
Nuwan T. Attygalle, Luis A. Leiva and Klen Čopič Pucihar contributed equally to this work.
Academic Editors: Pavel Zemcik, Alan Chalmers and Vítězslav Beran
Sensors 2021, 21(17), 5771; https://doi.org/10.3390/s21175771
Received: 30 June 2021 / Revised: 12 August 2021 / Accepted: 20 August 2021 / Published: 27 August 2021
Physical objects are usually not designed with interaction capabilities to control digital content. Nevertheless, they provide an untapped source of interactions, since every object could be used to control our digital lives. We call this the missing interface problem: instead of embedding computational capacity into objects, we can simply detect users’ gestures on them. However, gesture detection on such unmodified objects has to date been limited in spatial resolution and detection fidelity. To address this gap, we conducted research on micro-gesture detection on physical objects based on Google Soli’s radar sensor. We introduced two novel deep learning architectures to process range-Doppler images, namely a three-dimensional convolutional neural network (Conv3D) and a spectrogram-based ConvNet. The results show that our architectures enable robust on-object gesture detection, achieving an accuracy of approximately 94% for a five-gesture set and surpassing previous state-of-the-art results by up to 39%. We also showed that the decibel (dB) Doppler range setting has a significant effect on system performance, as accuracy can vary by up to 20% across the dB range. As a result, we provide guidelines on how to best calibrate the radar sensor.
Keywords: radar sensing; gesture recognition; deep learning; human factors
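The abstract describes a Conv3D network that classifies sequences of range-Doppler images into a five-gesture set. As a rough illustration of that idea, the following minimal PyTorch sketch shows how such a classifier could be wired up; the input shape, layer sizes, and hyperparameters are illustrative assumptions, not the configuration published in the paper.

# Minimal sketch of a 3D-CNN gesture classifier over range-Doppler image
# sequences, in the spirit of the Conv3D architecture named in the abstract.
# Layer sizes and input dimensions are illustrative assumptions.
import torch
import torch.nn as nn

class RangeDopplerConv3D(nn.Module):
    def __init__(self, num_gestures: int = 5):
        super().__init__()
        # Input: (batch, channels=1, frames, doppler_bins, range_bins)
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),  # collapse time and spatial dimensions
        )
        self.classifier = nn.Linear(32, num_gestures)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Example: a batch of 8 clips, each with 32 radar frames of 32x32 range-Doppler images.
model = RangeDopplerConv3D()
logits = model(torch.randn(8, 1, 32, 32, 32))
print(logits.shape)  # torch.Size([8, 5])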
MDPI and ACS Style

Attygalle, N.T.; Leiva, L.A.; Kljun, M.; Sandor, C.; Plopski, A.; Kato, H.; Čopič Pucihar, K. No Interface, No Problem: Gesture Recognition on Physical Objects Using Radar Sensing. Sensors 2021, 21, 5771. https://doi.org/10.3390/s21175771

