Journal of Eye Movement Research is published by MDPI from Volume 18 Issue 1 (2025). Previous articles were published by another publisher in Open Access under a CC-BY (or CC-BY-NC-ND) licence, and they are hosted by MDPI on mdpi.com as a courtesy and upon agreement with Bern Open Publishing (BOP).
Article

Human-Robot Interaction Based on Gaze Gestures for the Drone Teleoperation

1 School of Automation, Beijing Institute of Technology, Beijing 100811, China
2 Intelligent Human-Machine Systems Lab, Northeastern University, Boston, MA 02115, USA
J. Eye Mov. Res. 2014, 7(4), 1-14; https://doi.org/10.16910/jemr.7.4.4
Published: 29 September 2014

Abstract

Teleoperation has been widely used to perform tasks in dangerous or unreachable environments by replacing humans with controlled agents. Human-robot interaction (HRI) is central to teleoperation. Conventional HRI input devices include the keyboard, mouse, and joystick; however, these are not suitable for users with physical disabilities, and operating several of them simultaneously by hand also increases the mental workload of able-bodied users. Hence, this study presents HRI based on gaze tracking with an eye tracker. Object selection is of great importance and occurs frequently during HRI control. This paper introduces gaze gestures as an object selection strategy for HRI in drone teleoperation. To test and validate the performance of the gaze gesture selection strategy, we evaluated both objective and subjective measurements. Drone control performance, including mean task completion time and mean error rate, served as the objective measurements; the subjective measurement was an analysis of participant perception. The results show that the gaze gesture selection strategy has great potential as an additional HRI method for agent teleoperation.
Keywords: human-robot interaction; teleoperation; gaze gestures; object selection; gaze-controlled interfaces
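
To make the idea of gaze-gesture-based object selection concrete, the following is a minimal sketch, not the authors' implementation: it assumes gaze samples arrive as (x, y) screen coordinates, treats a gesture as a roughly straight gaze stroke, and maps each stroke direction to a drone command. All function names, thresholds, and command labels are hypothetical.

```python
# Minimal sketch of gaze-gesture object selection for drone teleoperation.
# Assumptions (not from the paper): gaze samples are (x, y) screen coordinates,
# a gesture is a roughly straight stroke, and each stroke direction selects one
# command object. Names and thresholds below are hypothetical.

import math
from typing import List, Optional, Tuple

COMMANDS = {"left": "yaw_left", "right": "yaw_right",
            "up": "ascend", "down": "descend"}

def classify_stroke(samples: List[Tuple[float, float]],
                    min_length: float = 150.0) -> Optional[str]:
    """Classify a gaze trace as a directional stroke, or None if too short."""
    if len(samples) < 2:
        return None
    (x0, y0), (x1, y1) = samples[0], samples[-1]
    dx, dy = x1 - x0, y1 - y0
    if math.hypot(dx, dy) < min_length:   # ignore small drifts and fixations
        return None
    if abs(dx) > abs(dy):                 # dominant axis decides the direction
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"     # screen y grows downward

def select_command(samples: List[Tuple[float, float]]) -> Optional[str]:
    """Map a recognized stroke to a drone command, or None if unrecognized."""
    stroke = classify_stroke(samples)
    return COMMANDS.get(stroke) if stroke else None

if __name__ == "__main__":
    # A rightward gaze sweep across the screen selects the "yaw_right" command.
    trace = [(100 + 40 * i, 300 + (i % 2)) for i in range(10)]
    print(select_command(trace))          # -> "yaw_right"
```

In a full system, such a recognizer would run on fixations and saccades reported by the eye tracker, and the objective measures discussed in the abstract (mean task completion time and mean error rate) would be logged per selection attempt.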

Share and Cite

MDPI and ACS Style

Yu, M.; Lin, Y.; Wang, X.; Schmidt, D.; Wang, Y. Human-Robot Interaction Based on Gaze Gestures for the Drone Teleoperation. J. Eye Mov. Res. 2014, 7, 1-14. https://doi.org/10.16910/jemr.7.4.4

AMA Style

Yu M, Lin Y, Wang X, Schmidt D, Wang Y. Human-Robot Interaction Based on Gaze Gestures for the Drone Teleoperation. Journal of Eye Movement Research. 2014; 7(4):1-14. https://doi.org/10.16910/jemr.7.4.4

Chicago/Turabian Style

Yu, Mingxin, Yingzi Lin, Xiangzhou Wang, David Schmidt, and Yu Wang. 2014. "Human-Robot Interaction Based on Gaze Gestures for the Drone Teleoperation" Journal of Eye Movement Research 7, no. 4: 1-14. https://doi.org/10.16910/jemr.7.4.4

APA Style

Yu, M., Lin, Y., Wang, X., Schmidt, D., & Wang, Y. (2014). Human-Robot Interaction Based on Gaze Gestures for the Drone Teleoperation. Journal of Eye Movement Research, 7(4), 1-14. https://doi.org/10.16910/jemr.7.4.4
