A Novel Gaze Tracking Method Based on the Generation of Virtual Calibration Points
Abstract

Most conventional gaze-tracking systems require users to look at many points during the initial calibration stage, which is inconvenient for them. To avoid this requirement, we propose a new gaze-tracking method with four important characteristics. First, our gaze-tracking system uses a large screen located at a distance from the user, who wears a lightweight device. Second, our system requires users to look at only four calibration points during the initial calibration stage, during which four pupil centers are recorded. Third, five additional points (virtual pupil centers) are generated by a multilayer perceptron that takes the four detected pupil centers as inputs. Fourth, when a user gazes at a large screen, the shape defined by the positions of the four pupil centers is a distorted quadrangle because of the nonlinear movement of the human eyeball, so mapping the pupil movement area onto the screen area with a single transform function reduces gaze-detection accuracy. We overcome this problem by calculating the gaze position through multiple geometric transforms defined over the five virtual points and the four detected points. Experimental results show that the accuracy of the proposed method is better than that of other methods.
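To make the multi-transform idea concrete, the following is a minimal sketch of mapping a pupil position to a screen position via per-sub-quadrangle homographies. It makes two simplifying assumptions not taken from the paper: the five virtual points are approximated by plain averaging (the paper generates them with a trained multilayer perceptron), and the sub-quadrangle is selected by comparing the pupil position against the virtual center point (the paper works with the distorted quadrangles directly). All function and variable names are illustrative.

```python
import numpy as np

def homography(src, dst):
    """Estimate the 3x3 projective transform mapping four src points to four dst points."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography coefficients form the null vector of A (up to scale).
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    return vt[-1].reshape(3, 3)

def map_point(H, p):
    """Apply a homography to a 2D point (homogeneous coordinates)."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]

def gaze_position(pupil, pupil_corners, screen_corners):
    """Map a pupil position to a screen position with four sub-quadrangle transforms.

    pupil_corners / screen_corners: dicts with keys 'tl', 'tr', 'bl', 'br'
    (the four calibration correspondences). The five virtual points
    (side midpoints plus center) are approximated here by averaging;
    the paper instead generates the pupil-side points with an MLP.
    """
    def midpoints(c):
        tl, tr, bl, br = (np.asarray(c[k], float) for k in ('tl', 'tr', 'bl', 'br'))
        return {'top': (tl + tr) / 2, 'bottom': (bl + br) / 2,
                'left': (tl + bl) / 2, 'right': (tr + br) / 2,
                'center': (tl + tr + bl + br) / 4}

    pv, sv = midpoints(pupil_corners), midpoints(screen_corners)

    # Four sub-quadrangles, each mapped by its own homography.
    regions = {'tl': ['tl', 'top', 'center', 'left'],
               'tr': ['top', 'tr', 'right', 'center'],
               'bl': ['left', 'center', 'bottom', 'bl'],
               'br': ['center', 'right', 'br', 'bottom']}

    def pts(corners, virtual, keys):
        return [np.asarray(corners[k], float) if k in corners else virtual[k]
                for k in keys]

    # Pick the sub-quadrangle by which side of the virtual center the pupil lies on
    # (an approximation; the paper tests containment in the distorted sub-quadrangles).
    cx, cy = pv['center']
    key = ('t' if pupil[1] < cy else 'b') + ('l' if pupil[0] < cx else 'r')
    H = homography(pts(pupil_corners, pv, regions[key]),
                   pts(screen_corners, sv, regions[key]))
    return map_point(H, pupil)
```

With an undistorted pupil quadrangle this reduces to a single linear scaling, so it can be sanity-checked against known corner correspondences; the benefit of the four separate transforms appears only when the pupil quadrangle is distorted.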
Cite This Article
Lee, J.W.; Heo, H.; Park, K.R. A Novel Gaze Tracking Method Based on the Generation of Virtual Calibration Points. Sensors 2013, 13, 10802-10822.