Article

24-Gaze-Point Calibration Method for Improving the Precision of AC-EOG Gaze Estimation

by Muhammad Syaiful Amri bin Suhaimi, Kojiro Matsushita, Minoru Sasaki * and Waweru Njeri

Department of Mechanical Engineering, Gifu University, 1-1 Yanagido, Gifu 501-1193, Japan

* Author to whom correspondence should be addressed.
Sensors 2019, 19(17), 3650; https://doi.org/10.3390/s19173650
Submission received: 12 July 2019 / Revised: 5 August 2019 / Accepted: 17 August 2019 / Published: 22 August 2019
(This article belongs to the Section Intelligent Sensors)

Abstract

This paper sought to improve the precision of the Alternating Current Electro-Oculo-Graphy (AC-EOG) gaze estimation method. The method consists of two core techniques: To estimate eyeball movement from EOG signals, and to convert the estimated movement into a gaze position. In conventional research, the estimation is computed from two EOG signals corresponding to vertical and horizontal movements. The conversion is based on an affine transformation whose parameters are computed from 24-point gazing data at calibration. However, the transformation is not applied to all the 24-point gazing data at once, but to four spatially separated subsets (the quadrant method), and each result has different characteristics. Thus, we propose a conversion method that handles the 24-point gazing data simultaneously: An imaginary center (i.e., a 25th point) is assumed on the gaze coordinates together with the 24-point gazing data, and an affine transformation is applied to all of them. We then conducted a comparative investigation between the conventional method and the proposed method. From the results, the average eye angle error for the cross-shaped electrode attachment is x = 2.27° ± 0.46° and y = 1.83° ± 0.34°. In contrast, for the plus-shaped electrode attachment, the average eye angle error is x = 0.94° ± 0.19° and y = 1.48° ± 0.27°. We conclude that the proposed method offers simpler and more precise EOG gaze estimation than the conventional method.

1. Introduction

The direct human gaze is an indicator of interest. Thus, eye tracking has become one of the most promising technologies in recent years. There are two major types of eye-tracking system: One records pupil positions with infrared cameras [1,2], and the other records electro-oculo-graphic (EOG) signals with a biological signal measurement device [3]. The camera-based system requires the user to wear glasses with built-in cameras [4,5] or to fix their head position in order to capture the pupils stably. This imposes physical burdens on the user, such as weight, motion restriction, and visual restriction [6,7]. In addition, this method is difficult to use for people who wear glasses or contact lenses, because reflections from the lenses prevent the system from detecting the pupils stably. Whereas the EOG-based system has lower gaze point precision than the camera-based system, the EOG is more robust in determining gaze direction [8,9,10,11,12,13,14]. In EOG, data acquisition is achieved by attaching a few disposable electrodes around the eyes of the subject, which has the advantage of being less burdensome to the eyes during measurement of eye movement. Considering the aforementioned low accuracy associated with EOG, it is important to develop techniques to improve the accuracy of EOG-based eye tracker systems.
EOG signals are electrical potentials generated when the eyeball moves. This results from the fact that the eyeball acts like a battery, where the cornea provides the positive potential and the retina provides the negative potential [15]. For that reason, it is possible to estimate eye movements by analyzing the resulting EOG signals. There are two types of EOG analysis method: Direct-current EOG (DC-EOG) and alternating-current EOG (AC-EOG). DC-EOG signals are nearly raw EOG signals, and their amplitudes are directly related to eye movement [16]. DC-EOG has the disadvantage that the resulting signals are easily influenced by human movement, so motion restriction is required to ensure precise acquisition [17,18]. AC-EOG, on the other hand, is obtained by band-pass filtering the DC-EOG and is robust against human movement. The signals automatically return to zero, so the amount of eye movement is calculated from the signal integral [19].
However, AC-EOG-based gaze estimation requires calibration of the signal before use. The user is required to gaze at a number of target points, and each signal is used to determine the conversion parameters from eye movement to gaze estimation during the calibration phase. Calibration is the most important factor, since it adjusts for individual differences caused by the electrode attachment, such as the plus-shaped five-electrode arrangement and the cross-shaped four-electrode arrangement [20,21]. K. Sakurai et al. [18] proposed improving gaze estimation calibration by combining EOG and Kinect. M. Yan et al. [20] proposed a fuzzy mathematical model to improve gaze estimation.
In another notable example, Ilhamdi et al. [22] studied the application of AC-EOG gaze estimation to robot control. Four electrodes were attached around the user's eyes in the cross-shaped arrangement, and calibration was performed using an affine transformation based on a 24-point gaze target. The method converted the two EOG signals into a gaze point on the target well. However, the transformation was performed for each quadrant rather than simultaneously. The quadrant method assumes that all the data in one spatial subset have the same polarity; a significant rotation, however, can disrupt the data polarity. Thus, each quadrant had different characteristics, and polarity disruption by rotation led to low accuracy.
In this paper, we focus on improving the accuracy of AC-EOG gaze estimation by proposing an alternative 24-gaze-point affine transformation calibration method. A virtual origin is computed which, together with the 24 gaze points, forms 25 gaze points that are affine transformed together. In this work, we analyze and compare the accuracy of the earlier calibration technique proposed by Ilhamdi et al. and the technique proposed in this paper. We apply the proposed method to two traditional AC-EOG signal measurements (i.e., the cross-shaped and plus-shaped electrode arrangements) and demonstrate its superiority in terms of accuracy. The two electrode arrangements were investigated because their gaze data differ significantly from each other under the rotation operation.
The accuracy improvement is beneficial for EOG-based control systems. For example, in robotic control, an EOG-interfaced robotic arm can pick and place an object, while in interface control, EOG can be used for graphical user interface (GUI) navigation.

2. Methodology

An experiment to obtain the EOG signal during eye gazing was conducted. The objective was to investigate the accuracy of EOG gazing data against gaze target points. Figure 1 shows the experimental setup for the study. A 24-point target based on a GUI program was built as the target for eye gaze. Target points were represented in pixel values, with the reference point serving as the target points' center coordinate (0, 0). The relationship between EOG and eye gazing was determined by analyzing the captured EOG data and comparing it with the known target pixel values.
The experiments were performed with 10 test subjects with good visual capabilities, aged between 22 and 40 years. Each subject in turn was seated with their chin resting on a fixed base, which was set such that the subject's eyes were level with the origin of the grid on the screen. The distance between the subject's eyes and the computer screen was maintained at 35 cm.

2.1. Hardware

2.1.1. Electrode Arrangement

The arrangement in which the electrodes are attached to the subject's face largely affects the shape of the resulting EOG signal. There are two major attachment shapes adopted in research, as shown in Figure 2: One is the plus-shaped five-electrode arrangement, and the other is the cross-shaped four-electrode arrangement. The plus-shaped type is suitable for recording EOG signals generated by vertical and horizontal eye movements, since the electrodes are attached along those directions. Meanwhile, the cross-shaped type has the advantage of a smaller number of electrodes. However, it requires appropriate post-processing to convert the EOG signals to a gaze position.

2.1.2. EOG Measurement System

We developed a low-cost AC-EOG measurement system, as shown in Figure 3. The device consists of four main components: Disposable electrodes; an EOG measurement circuit comprising a differential amplifier, a bandpass filter, and an inverting amplifier connected in cascade; a data acquisition device (National Instruments (NI) Corporation USB-6008); and a PC running Windows 10 and VC++ 2017. The disposable electrodes are attached around the user's eyes and capture the DC-EOG generated by eye movements. The system is configured as two channels (Ch1 and Ch2). The bandpass filter, with a lower cut-off frequency of 1.06 Hz and an upper cut-off frequency of 4.97 Hz, converts the DC-EOG to AC-EOG, which is then amplified before being fed to the data acquisition device. The bandpass filter parameters were chosen to measure the EOG signal effectively. The sampling frequency of the data acquisition device is 1 kHz. The device converts the electrical potential to digital data and transfers it to a PC program written in Microsoft Visual Studio C/C++ 2017. Finally, the PC program analyzes the data for gaze estimation.
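The filter stage in the paper is an analog circuit; purely as an illustration of the same operation in software, the following minimal Python sketch band-pass filters a sampled DC-EOG channel with the stated cut-off frequencies (the filter order and the use of SciPy are our assumptions):

```python
import numpy as np
from scipy.signal import butter, lfilter

FS = 1000.0      # DAQ sampling frequency (1 kHz)
LOW_CUT = 1.06   # lower cut-off frequency (Hz)
HIGH_CUT = 4.97  # upper cut-off frequency (Hz)

def dc_to_ac_eog(dc_eog, fs=FS, order=2):
    """Band-pass filter one DC-EOG channel to obtain the AC-EOG signal."""
    nyq = 0.5 * fs
    b, a = butter(order, [LOW_CUT / nyq, HIGH_CUT / nyq], btype="band")
    return lfilter(b, a, dc_eog)
```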

2.2. Software

2.2.1. AC-EOG Discrimination

In this paper, we determine the gaze estimate by analyzing the two AC-EOG channel signals (Ch1 and Ch2). Gaze estimation refers to estimating the point in space on which the subject's eyes are focused; in other words, determining the x and y coordinates on a two-dimensional plane.
From these signals, we propose an integral method to analyze the EOG signal. The integral computation includes a signal threshold method for the Ch1 and Ch2 signals. The signal thresholds serve two purposes. The first is to determine the polarity of the integral value: The integral is positive if the signal is above the positive threshold (th+) and negative if the signal is below the negative threshold (th−). Second, the thresholds remove unwanted residual noise from the integral value, as noise can affect the accuracy of gaze estimation. The signal integral is obtained as
$$\mathrm{EOG\ integral}_{\mathrm{Ch}i} = \left| \int_{th\_p} EOG_{\mathrm{Ch}i}(t)\, dt \right| + \left| \int_{th\_n} EOG_{\mathrm{Ch}i}(t)\, dt \right| \tag{1}$$
$$th\_p = \{\, t : EOG_{\mathrm{Ch}i}(t) > th_{+} \,\} \tag{2}$$
$$th\_n = \{\, t : EOG_{\mathrm{Ch}i}(t) < th_{-} \,\} \tag{3}$$
$$i = 1, 2 \tag{4}$$
Figure 4 shows an example of AC-EOG analysis for the integral value. Note that eye blinks, whether voluntary or involuntary, are also counted in the integral value.
Then, to determine the gaze estimate for the x and y coordinates, the x value is represented by the Ch1 integral value and the y value by the Ch2 integral value. These values are compared with the gaze target coordinates.
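A minimal sketch of this thresholded integral, assuming NumPy arrays sampled at 1 kHz, is shown below; the rule used here for assigning the overall sign (the dominant excursion wins) is our reading of the text, not an explicit formula from the paper:

```python
import numpy as np

def eog_integral(eog, th_pos, th_neg, fs=1000.0):
    """Thresholded integral of one AC-EOG channel, after Equations (1)-(4).

    Samples between th_neg and th_pos are treated as residual noise
    and excluded from the integral.
    """
    dt = 1.0 / fs
    pos = abs(np.sum(eog[eog > th_pos]) * dt)  # |integral over th_p|
    neg = abs(np.sum(eog[eog < th_neg]) * dt)  # |integral over th_n|
    magnitude = pos + neg                      # Equation (1)
    sign = 1.0 if pos >= neg else -1.0         # polarity from the dominant excursion
    return sign * magnitude

# x comes from the Ch1 integral and y from the Ch2 integral:
# x, y = eog_integral(ch1, thp1, thn1), eog_integral(ch2, thp2, thn2)
```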

2.2.2. EOG-Gaze Target Algorithm

Ideally, the EOG gazing integral values should match the gaze target coordinates, as shown in Figure 5a. Previous studies of the cross-shaped electrode attachment show a significant accuracy problem between gazing data and targets [21]. Figure 5b shows the estimated gazing data obtained using the cross-shaped electrode arrangement; it can be observed that the estimated coordinates are rotated relative to the axes. Gazing data obtained using the plus-shaped electrodes are more accurate than those from the cross-shaped electrodes, as seen in Figure 5c.

2.2.3. Coordinate Transformation Method

The difference between gazing data and gaze target coordinates can be corrected using coordinate transformation techniques. A 24-point gazing data calibration method was previously developed by Ilhamdi et al. [22], based on the cross-shaped electrode attachment, to improve the gazing data. In this conventional method, the 24-point gazing data are spatially separated into quadrants, and the data conversion is implemented quadrant by quadrant.
In this paper, we propose a simpler computation using the affine transformation. Instead of separating the data, we propose a calibration method based on a virtual origin coordinate, which enables all gazing data to be calibrated simultaneously. Equations (5)–(8) show the proposed homogeneous matrix for the affine transformation. The gazing data conversion (x′, y′) is determined using Equation (9).
$$\mathrm{Homogeneous\ Matrix} = [\mathrm{Dilatation}][\mathrm{Rotation}][\mathrm{Shear}][\mathrm{Translation}] \tag{5}$$
$$= \begin{bmatrix} s_x(\cos\theta - m_2\sin\theta) & s_x(m_1\cos\theta - \sin\theta) & f_1 \\ s_y(\sin\theta + m_2\cos\theta) & s_y(m_1\sin\theta + \cos\theta) & f_2 \\ 0 & 0 & 1 \end{bmatrix} \tag{6}$$
$$f_1 = s_x(-T_x\cos\theta + T_x m_2\sin\theta - T_y m_1\cos\theta + T_y\sin\theta) \tag{7}$$
$$f_2 = s_y(-T_x\sin\theta - T_x m_2\cos\theta - T_y m_1\sin\theta - T_y\cos\theta) \tag{8}$$
$$\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix} = \mathrm{Homogeneous\ Matrix}\ \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} \tag{9}$$
There are four geometrical steps involved in the homogeneous matrix computation. Figure 6 illustrates each geometric process. The first step is translation, where the imaginary center coordinate (Tx, Ty) is moved to the origin (0, 0). The second step is shearing of the axes, which ensures that the axes of the captured gazing data are perpendicular to each other. The third step rotates the now-perpendicular axes by an angle θ to match the normal x–y plane. The final step is dilatation, which scales the gazing data to match the gaze target pixel values. In this step, we separate the dilatation into four variables based on the line axis. The dilatations are determined as follows:
  • Dilatation s1 is for data located on the positive y-axis line;
  • Dilatation s2 is for data located on the negative y-axis line;
  • Dilatation s3 is for data located on the positive x-axis line;
  • Dilatation s4 is for data located on the negative x-axis line.
Compared with the conventional method, the number and order of the proposed geometrical steps are significantly different. The proposed method uses one less geometrical step than the conventional method. Moreover, the total number of operation parameters required by the proposed method is nine: Two translations, two shear operations, one rotation, and four dilatations. The conventional method requires seven operation parameters for each quadrant, giving a total of twenty-eight. The proposed method is therefore simpler than the conventional method in terms of computational resources and time.
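As a sketch of how such a matrix could be composed and applied, the Python fragment below follows Equations (5)–(9); for brevity it uses a single scale pair (sx, sy) where the paper fits four axis-dependent dilatation factors s1–s4, and the fitting of the nine parameters from the calibration data is not shown:

```python
import numpy as np

def homogeneous_matrix(tx, ty, m1, m2, theta, sx, sy):
    """Compose [Dilatation][Rotation][Shear][Translation] (Equations (5)-(8))."""
    T = np.array([[1.0, 0.0, -tx],
                  [0.0, 1.0, -ty],
                  [0.0, 0.0, 1.0]])   # move the virtual origin (Tx, Ty) to (0, 0)
    S = np.array([[1.0,  m1, 0.0],
                  [ m2, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])   # make the gaze-data axes perpendicular
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[  c,  -s, 0.0],
                  [  s,   c, 0.0],
                  [0.0, 0.0, 1.0]])   # rotate onto the normal x-y plane
    D = np.diag([sx, sy, 1.0])        # scale to the gaze-target pixel range
    return D @ R @ S @ T

def calibrate(points, H):
    """Apply Equation (9) to an (N, 2) array of gazing data."""
    homog = np.column_stack([points, np.ones(len(points))])
    return (H @ homog.T).T[:, :2]
```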

3. Experiments and Discussion

3.1. Experiment 1: Cross-Shaped Electrode Attachment Transformation Method Simulation

A comparative experiment was conducted to compare the accuracy of the conventional cross-shaped electrode attachment conversion method with the proposed method. Four simulation patterns of 24-point gazing data were prepared for the experiment. The patterns were used to test each computation in the affine transformation homogeneous matrix: Dilatation, rotation, shear, and translation. To investigate the accuracy, the pixel distance error was used as the indicator; the lower the distance error, the higher the accuracy (a minimal sketch of this metric follows the list below). Figure 7 shows the simulation patterns used in the experiment. The objective of each pattern is as follows:
(1) Pattern (a) tests dilatation and rotation;
(2) Pattern (b) tests dilatation, rotation, and translation;
(3) Pattern (c) tests dilatation, rotation, and shear;
(4) Pattern (d) tests dilatation, rotation, shear, and translation.
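Here is the minimal sketch of the error metric, under the assumption that the reported x and y pixel errors are mean absolute per-axis differences:

```python
import numpy as np

def pixel_distance_error(calibrated, targets):
    """Per-axis pixel error between calibrated gazing data and the
    24 gaze-target coordinates; both arguments are (24, 2) arrays."""
    err = np.abs(calibrated - targets)
    return err.mean(axis=0)  # (mean x error, mean y error) in pixels
```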
Figure 8 shows the computation results. Based on the results, the proposed method calibrated the simulated gazing data better than the conventional technique. Across the four patterns, the average pixel distance error in both x and y was negligibly small for the proposed method; effectively, no errors were observed. In contrast, for the conventional method, the average pixel distance error was 11 ± 3 pixels in x and 5 ± 2 pixels in y. These results show that the proposed virtual origin coordinate conversion method is significantly better than the conventional method.
The errors observed for the conventional method are attributed to the quadrant-separated computation. The quadrant method restricts the calibration to a small set of data, and data near the x and y axes adversely affect the computation. The quadrant method assumes that all the data in one spatial subset have the same polarity; if the rotation is significant, the data polarity is disrupted. The polarity of borderline data, when used for computation, becomes inconsistent and degrades the accuracy of the calculations. In the proposed method, however, the 24-point gazing data are calibrated simultaneously, making the method not only simpler, but also more accurate.

3.2. Experiment 2: Accuracy Performance Between Electrode Attachment Methods

In this section, we performed a comparative investigation of the accuracy of the proposed method using EOG gazing data. Ten samples of 24-point gazing data were taken for each electrode attachment method. We determined the accuracy based on the gaze angle error; the smaller the angle error, the more accurate the gazing data. Figure 9 illustrates the original EOG gazing data.
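The paper does not spell out the conversion from pixel offsets to visual angle. A plausible sketch, given the 35 cm viewing distance from Section 2, is shown below; the pixel pitch is an assumed value, not one given in the paper:

```python
import numpy as np

VIEW_DIST_CM = 35.0      # eye-to-screen distance used in the experiment
PIXEL_PITCH_CM = 0.027   # physical size of one pixel; assumed, not given in the paper

def gaze_angle_error_deg(calibrated, targets):
    """Per-axis gaze angle error in degrees for (N, 2) pixel coordinates."""
    offset_cm = np.abs(calibrated - targets) * PIXEL_PITCH_CM
    return np.degrees(np.arctan2(offset_cm, VIEW_DIST_CM)).mean(axis=0)
```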
Figure 10 shows the results for the cross-shaped electrode attachment. Compared to the original EOG gazing data, the accuracy is greatly improved; the calibrated 24-point gazing data closely match the gaze targets. Based on the bar chart, most of the x and y angle errors for each point are less than 2°. On the other hand, Figure 11 shows the results for the plus-shaped electrode attachment. Likewise, compared to the original EOG gazing data, the accuracy is greatly improved; the bar chart shows that the x and y angle errors are less than 2°.
The next step in the experiment was to obtain EOG gazing data from the 10 test subjects, to strengthen the investigation of the accuracy of the proposed method. Here, instead of observing the gaze angle error point by point, the average over the 24 points was calculated. The resulting gazing data are shown in Figure 12.
The test subjects' average gaze angle errors are shown in Figure 13, Table 1, and Table 2. The ten-subject average for the cross-shaped electrode attachment is x = 2.27° ± 0.46° and y = 1.83° ± 0.34°. In comparison, the average for the plus-shaped electrode attachment is x = 0.94° ± 0.19° and y = 1.48° ± 0.27°. Both electrode methods show comparable gaze angle errors, although the plus-shaped electrode attachment shows a smaller error than the cross-shaped attachment.
From the experiment, the proposed method proved to increase the accuracy of EOG gazing data for the two different electrode attachment methods. In terms of the simplicity of the conversion, the plus-shaped electrode attachment offers simpler computation than the cross-shaped attachment: The rotation for the plus-shaped five-electrode attachment is negligible, so its affine transformation requires eight operation parameters instead of the nine required for the cross-shaped attachment.

4. Conclusions

The objective of this research was to enhance the accuracy of the EOG gaze estimation method. The AC-EOG signal method was used to estimate eyeball movement, and the gaze position was determined by converting the captured signals. In conventional research, the affine transformation was introduced as a calibration method for 24-point gazing data from the two AC-EOG signals, corresponding to vertical and horizontal eyeball movements, using the cross-shaped electrode attachment method. However, the transformation was not applied to all the 24-point gazing data at once; it was instead applied to four spatially separated subsets (the quadrant method). The quadrant method was also easily influenced by data rotation: It assumes that all the data in one spatial subset have the same polarity, but a significant rotation can disrupt the data polarity. The influence of data rotation was not fully investigated, as only one electrode attachment method (the cross-shaped four-electrode attachment) was used. In terms of computational complexity, the quadrant method computes seven variables in each quadrant, with each quadrant producing different values. In this paper, we proposed a conversion method that handles the 24-point gazing data simultaneously: A virtual origin (i.e., a 25th point) is assumed on the gaze coordinates together with the 24-point gazing data, and an affine transformation is applied to all of them. Two experiments were conducted as a comparative investigation of the conventional and proposed methods. The first experiment compared the accuracy of the proposed method with the conventional computation. Four simulation patterns based on 24-point gazing data were used, and the accuracy was determined by the pixel distance error. The results show that the proposed method achieved negligible error in the gazing data conversion. The second experiment determined the accuracy of the proposed method using EOG gazing data. Ten test subjects performed the 24-point gaze targets with two different electrode attachment methods. The average angle error for the cross-shaped electrode attachment was x = 2.27° ± 0.46° and y = 1.83° ± 0.34°. On the other hand, the plus-shaped electrode attachment had an error of x = 0.94° ± 0.19° and y = 1.48° ± 0.27°. The results show that the proposed method produced minimal error, and the two electrode attachment methods resulted in almost identical performance. From the experiments, we conclude that the proposed method is simpler and more accurate in EOG gaze estimation than the conventional method.

Author Contributions

M.S. and M.S.A.b.S. conceived and designed the study. M.S.A.b.S. and K.M. conducted the experiments and analyzed the data. M.S.A.b.S., W.N., and K.M. wrote the paper.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Gibaldi, A.; Vanegas, M.; Bex, P.J.; Maiello, G. Evaluation of the Tobii EyeX Eye tracking controller and Matlab toolkit for research. Behav. Res. Methods 2017, 49, 923–946.
  2. Lim, Y.; Gardi, A.; Pongsakornsathien, N.; Sabatini, R.; Ezer, A.; Kistan, T. Experimental characterisation of eye-tracking sensors for adaptive human-machine systems. Measurement 2019, 140, 151–160.
  3. Arden, G.B.; Constable, P.A. The electro-oculogram. Prog. Retin. Eye Res. 2006, 25, 207–248.
  4. Yasui, Y.; Tanaka, J.; Kakudo, M.; Tanaka, M. Relationship between preference and gaze in modified food using eye tracker. J. Prosthodont. Res. 2019, 63, 210–215.
  5. Takahashi, R.; Suzuki, H.; Chew, J.Y.; Ohtake, Y.; Nagai, Y.; Ohtomi, K. A system for three-dimensional gaze fixation analysis using eye tracking glasses. J. Comput. Des. Eng. 2018, 5, 449–457.
  6. Al-Rahayfeh, A.; Faezipour, M. Eye Tracking and Head Movement Detection: A State-of-Art Survey. IEEE J. Transl. Eng. Health Med. 2013, 1, 2100212.
  7. Blignaut, P.; van Rensburg, E.J.; Oberholzer, M. Visualization and quantification of eye tracking data for the evaluation of oculomotor function. Heliyon 2019, 5, e01127.
  8. Deng, L.Y.; Hsu, C.-L.; Lin, T.-C.; Tuan, J.-S.; Chang, S.-M. EOG-based human–computer interface system development. Expert Syst. Appl. 2010, 37, 3337–3343.
  9. Postelnicu, C.C.; Girbacia, F.; Talaba, D. EOG-based visual navigation interface development. Expert Syst. Appl. 2012, 39, 10857–10866.
  10. Lledó, L.D.; Úbeda, A.; Iáñez, E.; Azorín, J.M. Internet browsing application based on electrooculography for disabled people. Expert Syst. Appl. 2013, 40, 2640–2648.
  11. Chen, Y.; Newman, W.S. A human–robot interface based on electrooculography. In Proceedings of the IEEE International Conference on Robotics and Automation, New Orleans, LA, USA, 26 April–1 May 2004; pp. 243–248.
  12. Barea, R.; Boquete, L.; Mazo, M.; López, E.L. Wheelchair guidance strategies using EOG. J. Intell. Robot. Syst. 2002, 34, 279–299.
  13. Lv, Z.; Wu, X.-P.; Li, M.; Zhang, D. A novel eye movement detection algorithm for EOG driven human computer interface. Pattern Recognit. Lett. 2010, 31, 1041–1047.
  14. Sasaki, M.; Suhaimi, M.S.A.B.; Matsushita, K.; Ito, S.; Rusydi, M.I. Robot Control System Based on Electrooculography and Electromyogram. J. Comput. Commun. 2015, 3.
  15. Venkataramanan, S.; Nemade, H.B.; Sahambi, J.S. Design and development of a novel EOG bio-potential amplifier. Int. J. Bioelectromagn. 2005, 7, 271–274.
  16. Manabe, H.; Fukumoto, M.; Yagi, T. Direct gaze estimation based on nonlinearity of EOG. IEEE Trans. Biomed. Eng. 2015, 62, 1553–1562.
  17. Yagi, T.; Kuno, Y.; Koga, K.; Mukai, T. Drifting and blinking compensation in electro-oculography (EOG) eye-gaze interface. In Proceedings of the 2006 IEEE International Conference on Systems, Man and Cybernetics, Taipei, Taiwan, 8–11 October 2006; pp. 3222–3226.
  18. Sakurai, K.; Yan, M.; Tanno, K.; Tamura, H. Gaze Estimation Method Using Analysis of Electrooculogram Signals and Kinect Sensor. Comput. Intell. Neurosci. 2017, 2017.
  19. Rusydi, M.I.; Sasaki, M.; Ito, S. Calculate Target Position of Object in 3-Dimensional Area Based on the Perceived Locations Using EOG Signals. J. Comput. Commun. 2014, 2.
  20. Yan, M.; Tamura, H.; Tanno, K. A study on gaze estimation system using cross-channels electrooculogram signals. In Proceedings of the International MultiConference of Engineers and Computer Scientists, Hong Kong, China, 12–14 March 2014; pp. 112–116.
  21. Rusydi, M.I.; Okamoto, T.; Ito, S.; Sasaki, M. Rotation Matrix to Operate a Robot Manipulator for 2D Analog Tracking Objects Using Electrooculography. Robotics 2014, 3, 289–309.
  22. Rusydi, M.I.; Sasaki, M.; Ito, S. Affine Transform to Reform Pixel Coordinates of EOG Signals for Controlling Robot Manipulators Using Gaze Motions. Sensors 2014, 14, 10107–10123.
Figure 1. Experiment setup (left) and 24-point gaze targets (right).
Figure 2. (a) Plus-shaped electrode attachment and (b) cross-shaped electrode attachment.
Figure 3. Electro-oculo-graphic (EOG) measurement device and the processes.
Figure 4. Signal integral method for left and right eye movement.
Figure 5. Gaze estimation: (a) Ideal; (b) Cross-shaped electrode attachment; (c) Plus-shaped electrode attachment.
Figure 6. Proposed affine transformation geometry process: (a) Translation; (b) Shear; (c) Rotation; (d) Dilatation.
Figure 7. Simulation patterns for 24-point computation: (a) Dilatation and rotation; (b) Dilatation, rotation and translation; (c) Dilatation, rotation and shear; (d) Dilatation, rotation, shear and translation.
Figure 8. Simulation results for 24-point computation: (a) Dilatation and rotation; (b) Dilatation, rotation and translation; (c) Dilatation, rotation and shear; (d) Dilatation, rotation, shear and translation.
Figure 9. Original EOG gazing data: (a) Cross-shaped electrode attachment; (b) Plus-shaped electrode attachment.
Figure 10. Cross-shaped electrode attachment conversion results: (a) Calibrated 24-point gazing data; (b) Gaze angle error for each point.
Figure 11. Plus-shaped electrode attachment conversion results: (a) Calibrated 24-point gazing data; (b) Gaze angle error for each point.
Figure 12. Twenty-four-point gazing data for 10 test subjects: (a) Cross-shaped electrode attachment; (b) Plus-shaped electrode attachment.
Figure 13. Average 24-point gaze angle error for 10 test subjects: (a) Cross-shaped electrode attachment; (b) Plus-shaped electrode attachment.
Table 1. Cross-shaped electrode attachment average 24-point gaze angle error for 10 test subjects.

Subject | Ave. Angle x (θ°) | Ave. Angle y (θ°) | x SD (σx°) | y SD (σy°)
1       | 2.0205 | 1.2098 | 0.3555 | 0.2517
2       | 2.2390 | 0.8978 | 0.4006 | 0.1682
3       | 3.7267 | 2.4537 | 0.7384 | 0.5082
4       | 2.6903 | 1.9262 | 0.5695 | 0.3820
5       | 2.9605 | 2.9202 | 0.6591 | 0.4417
6       | 1.6390 | 2.0228 | 0.4146 | 0.4356
7       | 1.8961 | 1.4659 | 0.2978 | 0.2190
8       | 1.7187 | 2.0854 | 0.4073 | 0.3257
9       | 0.9506 | 0.8094 | 0.1957 | 0.1324
10      | 2.8613 | 2.4653 | 0.5350 | 0.5427
Average | 2.2703 | 1.8256 | 0.4574 | 0.3407
Table 2. Plus-shaped electrode attachment average 24-point gaze angle error for 10 test subjects.

Subject | Ave. Angle x (θ°) | Ave. Angle y (θ°) | x SD (σx°) | y SD (σy°)
1       | 0.6627 | 1.8633 | 0.1280 | 0.4032
2       | 0.9768 | 1.2166 | 0.2554 | 0.2388
3       | 1.5283 | 1.5873 | 0.2783 | 0.3169
4       | 0.9221 | 2.3825 | 0.2106 | 0.3763
5       | 0.9128 | 0.9252 | 0.1598 | 0.2105
6       | 0.6242 | 1.3943 | 0.1284 | 0.2691
7       | 0.6691 | 0.6944 | 0.1211 | 0.1133
8       | 0.7453 | 1.2921 | 0.1303 | 0.2095
9       | 1.3774 | 2.2247 | 0.2976 | 0.3514
10      | 1.0307 | 1.1971 | 0.2007 | 0.2041
Average | 0.9449 | 1.4777 | 0.1910 | 0.2693
