Article

Test–Retest Reliability of a Computerized Hand–Eye Coordination Task

by Antonio Ríder-Vázquez 1, Estanislao Gutiérrez-Sánchez 2, Clara Martinez-Perez 3,* and María Carmen Sánchez-González 1

1 Department of Physics of Condensed Matter, Optics Area, University of Seville, Reina Mercedes S/N, 41012 Seville, Spain
2 Department of Surgery, Ophthalmology Area, University of Seville, Doctor Fedriani S/N, 41009 Seville, Spain
3 Instituto Superior de Educação e Ciências de Lisboa (ISEC Lisboa), Alameda das Linhas de Torres, 179, 1750-142 Lisboa, Portugal
* Author to whom correspondence should be addressed.
J. Eye Mov. Res. 2025, 18(5), 54; https://doi.org/10.3390/jemr18050054
Submission received: 21 August 2025 / Revised: 9 October 2025 / Accepted: 11 October 2025 / Published: 14 October 2025

Abstract

Background: Hand–eye coordination is essential for daily functioning and sports performance, but standardized digital protocols for its reliable assessment are limited. This study aimed to evaluate the intra-examiner repeatability and inter-examiner reproducibility of a computerized protocol (COI-SV®) for assessing hand–eye coordination in healthy adults, as well as the influence of age and sex. Methods: Seventy-eight adults completed four sessions of a computerized visual–motor task requiring rapid and accurate responses to randomly presented targets. Accuracy and response times were analyzed using repeated-measures and reliability analyses. Results: Accuracy showed a small session effect and minor examiner differences on the first day, whereas response times were consistent across sessions. Men generally responded faster than women, and response times increased slightly with age. Overall, reliability indices indicated moderate-to-good repeatability and reproducibility for both accuracy and response time measures. Conclusions: The COI-SV® protocol provides a robust, objective, and reproducible measurement of hand–eye coordination, supporting its use in clinical, sports, and research settings.

1. Introduction

Hand–eye coordination is a fundamental capacity of the central nervous system that enables the use of visual information to guide and control manual movements such as reaching, writing, or catching objects [1,2,3]. This skill requires efficient integration of the visual system, motor pathways, and upper limbs, functioning as a sequential process from the visual detection of the target to muscle activation [1,3,4].
Its development begins in childhood and improves throughout life with practice and learning [5,6,7]. Activities like building games or object manipulation stimulate perception–action coupling, promoting maturation of cortical and subcortical circuits involved in movement planning and execution [5,8,9]. Neurologically, it relies on regions such as the cerebellum, posterior parietal cortex, and basal ganglia, as well as connections between visual and motor areas [8,9].
Functionally, gaze–hand coordination follows an ordered sequence: visual localization, attentional focus, perceptual identification, motor planning, and finally muscle activation [1,2,4]. Although eye movements are completed faster than manual ones, both actions are tightly coupled [4,10,11], adapting to spatial and temporal demands, as shown in pointing tasks using the index of difficulty paradigm [11,12,13]. When precision requirements increase, each reaching movement is preceded by an eye saccade, whereas lower demands allow intermittent control [12,13,14,15].
Interest in this capacity extends beyond clinical and neurological contexts. In sports such as volleyball, tennis, or badminton, it is a determinant of performance, with training improving visual reaction speed and inducing structural brain adaptations [16,17,18,19,20]. Beyond sports, hand–eye coordination is essential in daily activities—from writing or driving to digital interaction—underpinning functionality, independence, and learning [2,3,21]. Performance is also shaped by age and sex: coordination patterns mature around age 10, and some studies report task-dependent sex differences [6,7,8,9,22].
In clinical and research practice, a key challenge is the availability of standardized tools that reliably measure hand–eye coordination. While tests exist for isolated components such as precision or reaction time, protocols combining both are less common and require validation. Assessing inter- and intra-examiner reliability and session-to-session stability is essential for their diagnostic and training use [23]. The integration of eye tracking and digital recordings offers new opportunities, but robust validation remains needed [24,25,26,27].
The objective of this study was therefore to analyze the intra-examiner repeatability and inter-examiner reproducibility of a computerized method for hand–eye coordination, and to examine the influence of age and sex.

2. Materials and Methods

2.1. Study Design

A prospective study was conducted at the “Centro de Optometría Internacional” (COI) facilities in Madrid, Spain, between May and June 2023. A total of 78 individuals (42 males and 36 females) participated in the research. All participants gave their informed consent in accordance with ethical guidelines. The study adhered to the principles of the Declaration of Helsinki (1964) and received approval from the Research Ethics Committee of the University of Seville.

2.2. Participant Recruitment

Sample size estimation was carried out using the Granmo calculator (version 7.12; Institut Municipal d’Investigació Médica, Barcelona, Spain) [28]. According to the study parameters, a minimum of 46 participants was needed to achieve 80% statistical power, with a significance level (alpha) of 0.05 and a beta risk of 0.2 for a two-sided test.
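The underlying power calculation follows the standard normal-approximation formula for a two-sided comparison of means. The sketch below is illustrative only: the σ and Δ values are hypothetical placeholders, not the actual Granmo input parameters, which are not reported here.

```python
from math import ceil
from statistics import NormalDist

def sample_size_two_means(sigma: float, delta: float,
                          alpha: float = 0.05, power: float = 0.80) -> int:
    """Per-group n for a two-sided comparison of two means
    (normal approximation): n = 2 * (z_{1-a/2} + z_{1-b})^2 * sigma^2 / delta^2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # 1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)           # 0.84 for power = 0.80
    return ceil(2 * (z_alpha + z_beta) ** 2 * sigma ** 2 / delta ** 2)

# Illustrative values only (sigma and delta are hypothetical):
print(sample_size_two_means(sigma=1.0, delta=0.5))  # textbook case: 63 per group
```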
Eligible participants were adults aged 18 to 65 years, with binocular and monocular distance visual acuity (VA) of 20/25 or better while wearing their distance correction. Exclusion criteria included any pathological condition causing visual field limitations, impairments in reaction time or eye movement, chronic medication use that could affect reaction time, intake of acute medication within 24 h prior to testing, and consumption of alcohol, drugs, or other substances potentially impacting test performance within 24 h before the assessment.
Participants were recruited through public advertisements and word of mouth at the COI facilities. Interested individuals contacted the research team and were screened against inclusion and exclusion criteria using a short questionnaire and an initial visual examination. A total of 78 respondents were enrolled (42 men, 36 women; mean age 32.5 ± 11.7 years).

2.3. Hand–Eye Coordination

This measurement was based on the Acuvision 1000® [29], a tool created to improve hand–eye coordination in athletes. During the test, 120 red dots appeared randomly and evenly across the four sections of the visual field. Participants were required to touch these dots as quickly and accurately as possible. The recorded data included the number of correct and missed touches, along with the average reaction time overall and for each visual quadrant.
Hand–eye coordination was measured using the COI Sport Vision® (COI-SV®) digital software. Developed by “Centro de Optometría Internacional” (Madrid, Spain), this software includes diagnostic and treatment tests specifically designed for sports vision [30]. The software has shown excellent reliability (α = 0.93) and has been used in similar studies [31,32]. To reduce examiner-related biases, the procedure was standardized as follows:
  • The participant was seated approximately 60–70 cm from the screen.
  • The examiner read aloud the standardized instructions: “Red dots will appear randomly anywhere on the screen. You must try to touch the red dot as quickly and accurately as possible. You may use whichever hand you prefer.”
  • The participant confirmed understanding of the instructions.
  • The test was initiated with the command: “Let’s begin.”
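The variables the task records (correct and missed touches, plus mean reaction time overall and per visual quadrant) can be aggregated from a simple per-trial log. The sketch below is a hypothetical illustration of that bookkeeping, not the COI-SV® implementation; the names `Trial` and `summarize` are our own.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Trial:
    quadrant: int   # 1-4, section of the visual field where the dot appeared
    hit: bool       # touched in time and on target
    rt: float       # reaction time in seconds (meaningful for hits)

def summarize(trials):
    """Accuracy and mean reaction time, overall and per quadrant."""
    hits = [t for t in trials if t.hit]
    out = {"accuracy": len(hits) / len(trials),
           "mean_rt": mean(t.rt for t in hits)}
    for q in sorted({t.quadrant for t in trials}):
        q_all = [t for t in trials if t.quadrant == q]
        q_hits = [t for t in q_all if t.hit]
        out[f"q{q}"] = {"accuracy": len(q_hits) / len(q_all),
                        "mean_rt": mean(t.rt for t in q_hits) if q_hits else None}
    return out

log = [Trial(1, True, 0.80), Trial(2, False, 0.0),
       Trial(1, True, 0.90), Trial(2, True, 0.70)]
print(summarize(log))
```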

2.4. Test and Re-Test Methodology

To evaluate the repeatability of the measurement method, a protocol was established in which each participant performed the tests twice across two separate days, resulting in a total of four trials under the same conditions [30,33,34,35]. In this study, Session 1 corresponds to Examiner 1 on Day 1, Session 2 to Examiner 2 on Day 1, Session 3 to Examiner 1 on Day 2, and Session 4 to Examiner 2 on Day 2. This designation is used consistently throughout the manuscript and in the figures. On the initial day, participants underwent preliminary assessments and completed a short questionnaire to verify compliance with inclusion and exclusion criteria. Each participant was evaluated by two different examiners; the order in which examiners conducted the tests was randomized on the first day and reversed for the second day. Examiners did not have access to each other’s results or to their own previous assessments. Both examiners were optometrists with comparable clinical and research experience in visuomotor testing. A third examiner was responsible for securely managing and storing the data. Moreover, participants were kept unaware of their earlier test results. Tests were spaced with rest periods ranging from 5 to 15 min, and approximately two weeks separated the first and second testing sessions. The overall structure of the experimental protocol is summarized in Figure 1.

2.5. Statistical Analysis

Statistical analyses were performed using R (version 4.4.2). An initial description of the sample was conducted, including measures of central tendency and dispersion for age, as well as distribution by sex.
To assess intra- and inter-examiner reproducibility of the hand–eye coordination task, data on correct responses and mean reaction times from four sessions (two per examiner on different days) were analyzed. The variables were converted to long format to facilitate the analyses.
Differences between sessions and examiners were evaluated using repeated-measures analysis of variance (ANOVA), including fixed effects for examiner, day, and their interaction, and a random effect for participant. Accuracy scores were treated as continuous variables given their approximately normal distribution. A sensitivity analysis using a Poisson generalized linear mixed-effects model yielded consistent results, confirming the robustness of the findings.
Test–retest reliability was estimated using intraclass correlation coefficients (ICCs) with a two-way random-effects, absolute-agreement, single-measures model (ICC(2,1)). To explore systematic bias and variability between measurements, Bland–Altman plots were computed with 95% limits of agreement (LoA), defined as the mean of the differences ± 1.96 times the standard deviation of the differences.
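Both reliability statistics follow directly from their definitions. The analyses in this study were run in R; the pure-Python sketch below is an illustrative re-implementation, checked against the classic Shrout and Fleiss (1979) worked example.

```python
from statistics import mean, stdev

def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single measures.
    `data` is a list of rows (subjects), each a list of ratings (one per rater)."""
    n, k = len(data), len(data[0])
    grand = mean(x for row in data for x in row)
    row_means = [mean(row) for row in data]
    col_means = [mean(row[j] for row in data) for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ms_r = ss_rows / (n - 1)                                     # between subjects
    ms_c = ss_cols / (k - 1)                                     # between raters
    ms_e = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))  # residual
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

def bland_altman(a, b):
    """Mean bias and 95% limits of agreement for two paired measurements."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    half_width = 1.96 * stdev(diffs)   # 1.96 * sample SD of the differences
    return bias, bias - half_width, bias + half_width

# Shrout & Fleiss (1979) example: 6 subjects, 4 judges -> ICC(2,1) ~= 0.29
ratings = [[9, 2, 5, 8], [6, 1, 3, 2], [8, 4, 6, 8],
           [7, 1, 2, 6], [10, 5, 6, 9], [6, 2, 4, 7]]
print(round(icc_2_1(ratings), 2))  # 0.29
```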
Paired t-tests were conducted to compare performance between examiners in the same session (inter-examiner) and between sessions of the same examiner (intra-examiner). Correlations between age and performance variables (accuracy and time) were calculated using Spearman’s coefficient. In addition, sex differences were evaluated using independent t-tests. The simultaneous influence of age and sex on reaction time was examined using an ANCOVA model. Finally, the relationship between accuracy and speed (the classic accuracy–latency trade-off) was evaluated through bivariate correlations by session. A p-value < 0.05 was considered statistically significant.
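The paired comparisons and rank correlations likewise reduce to textbook formulas. Again, the actual analyses were run in R; this stdlib-only sketch is illustrative.

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(a, b):
    """Paired t statistic: t = mean(d) / (sd(d) / sqrt(n)), with d = a - b."""
    d = [x - y for x, y in zip(a, b)]
    return mean(d) / (stdev(d) / sqrt(len(d)))

def spearman(a, b):
    """Spearman's rho: Pearson correlation of the ranks (average rank for ties)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1                      # extend over a run of tied values
            avg = (i + j) / 2 + 1           # average rank of the tied run
            for idx in order[i:j + 1]:
                r[idx] = avg
            i = j + 1
        return r
    ra, rb = ranks(a), ranks(b)
    ma, mb = mean(ra), mean(rb)
    num = sum((x - ma) * (y - mb) for x, y in zip(ra, rb))
    den = sqrt(sum((x - ma) ** 2 for x in ra) * sum((y - mb) ** 2 for y in rb))
    return num / den

print(spearman([1, 2, 3, 4, 5], [2, 3, 4, 5, 6]))  # 1.0 (perfectly monotone)
```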

3. Results

3.1. Participants

After excluding two participants (one ineligible and one who did not complete the test), a total of 78 participants were included in the final analysis, comprising 42 men and 36 women. The mean age was 32.50 years (SD = 11.70), the median age was 27 years, and the interquartile range (IQR) was 13.80 years. Participant ages ranged from 19 to 64 years.
Figure 2A shows the mean scores obtained in each session. The repeated-measures ANOVA revealed a statistically significant difference in mean scores across sessions (F(3,183) = 11.03, p = 0.002), suggesting a small but significant session effect on accuracy. A sensitivity analysis using a Poisson generalized linear mixed-effects model confirmed this session effect (p < 0.05), indicating that the result was not dependent on the statistical approach. Additionally, a paired t-test comparing Session 1 (Examiner 1, Day 1) with Session 2 (Examiner 2, Day 1) showed a significant difference (t = −2.51, p = 0.014), with lower scores in Session 1.
Regarding response time, Figure 2B presents the mean times per session. The repeated-measures ANOVA did not show significant differences in mean response time across sessions (F(3,181) = 2.05, p = 0.158), indicating similar temporal performance regardless of session or examiner. The visualization confirms a consistent temporal pattern across all sessions, with low variability among participants.

3.2. Influence of Age and Sex

No significant differences in accuracy scores were found between men and women across sessions, nor were there significant associations between age and accuracy (all p > 0.05). In contrast, response time showed a significant sex difference in Session 1 (p = 0.011), with men being faster (0.81 ± 0.08 s) than women (0.87 ± 0.10 s), as illustrated in Figure 3A. In other sessions, no significant sex differences were detected (all p > 0.05), although a trend toward faster times in men was observed. Age was positively correlated with response time in all sessions (Session 1: r = 0.35, p = 0.002; Session 2: r = 0.27, p = 0.019; Session 3: r = 0.31, p = 0.024; Session 4: r = 0.32, p = 0.018), indicating that older participants tended to respond more slowly (Figure 3B). An analysis of covariance (ANCOVA) performed for Session 1 confirmed that both sex (p = 0.006) and age (p < 0.001) had significant effects on response time, suggesting that sex differences persisted even after adjusting for age.
Furthermore, correlation analyses revealed significant positive associations between accuracy scores and response times in all sessions (Session 1: r = 0.47, p < 0.001; Session 2: r = 0.56, p < 0.001; Session 3: r = 0.44, p < 0.001; Session 4: r = 0.54, p < 0.001), confirming the classic speed–accuracy trade-off, where higher precision was associated with longer response times. This pattern indicates that participants prioritized accuracy at the cost of speed, consistent with previous visuomotor research findings.

3.3. Reproducibility and Session Effects

A linear mixed-effects model was applied to assess the effects of examiner, day (Day 1 vs. Day 2), and their interaction on both accuracy and response time, using participant as a random effect. For accuracy scores, the model revealed significant main effects of examiner (F(1,183) = 7.55, p = 0.007) and day (F(1,183) = 7.26, p = 0.008). The interaction between examiner and day was not significant (F(1,183) = 1.17, p = 0.281), indicating independent effects.
For response time, the effect of examiner was not significant (F(1,181) = 0.26, p = 0.613), nor was the interaction term (F(1,181) = 0.32, p = 0.572). However, a significant day effect was observed (F(1,181) = 4.71, p = 0.031), indicating that response times differed slightly between testing days.

3.4. Reproducibility and Reliability Analysis

Intra-session repeatability was assessed by comparing accuracy scores and response times between Day 1 and Day 2 (mean across examiners). Inter-examiner reproducibility was evaluated separately for each day, and intra-examiner reproducibility was examined for each examiner. A summary of mean biases, 95% limits of agreement, and ICC values is presented in Table 1. Overall, both accuracy and response times showed small biases, narrow limits of agreement, and moderate-to-good ICCs, indicating robust reproducibility across sessions and examiners. Corresponding Bland–Altman plots (Figure 4A–D) illustrate these findings.

4. Discussion

This study provides evidence for the repeatability and reproducibility of a computerized protocol (COI-SV®) designed to assess hand–eye coordination in healthy adults. Both accuracy and response time demonstrated acceptable stability across sessions and examiners, supporting the utility of this protocol in clinical, sports, and research contexts where objective monitoring of visuomotor performance is required.
A small session effect on accuracy was detected, with slightly higher scores in subsequent sessions. Similar findings have been reported in studies of reaching or precision tasks, where repeated exposure produces limited improvements in motor accuracy [16,36,37]. In our cohort of untrained adults, however, these differences were modest and are more likely attributable to natural variability of the test rather than true learning effects. In fact, recent reliability studies of computerized visuomotor tests have reported ICC values in the range of 0.80–0.92 for both upper- and lower-extremity visuomotor reaction time tasks [38] and 0.86–0.94 for VR-based coordination and reaction time tests [39]. Our intra-session ICC of 0.70 and inter-examiner ICCs of 0.51–0.71 therefore fall within the same order of magnitude, supporting that the stability observed in this study is consistent with the expected performance of similar instruments. Importantly, the significant examiner difference observed in the first session aligns with previous reports attributing discrepancies to subtle variations in test instructions or initial participant familiarization [23], underlining the importance of strict standardization.
Response time measures showed greater robustness than accuracy. No systematic differences were observed across sessions or between examiners, in agreement with previous findings that temporal variables in computerized visuomotor tasks are less affected by contextual variability or fatigue [23,40]. This temporal stability is particularly valuable in longitudinal monitoring, where reliable detection of change depends on minimizing measurement error. At the same time, individual factors shaped temporal performance: men were faster than women in the first session, a difference that diminished with task repetition, and older participants consistently required longer times. These findings reflect well-documented patterns in the literature [36,40,41], where sex differences are often more pronounced during initial exposure and age-related slowing is a robust population trend.
A consistent positive correlation was observed between accuracy and response time across all sessions, confirming the classic speed–accuracy trade-off described by Fitts [42]. This relationship indicates that participants who aimed for greater precision did so at the cost of longer latencies, a phenomenon widely reported in both experimental and applied contexts [23,40]. For practical applications, this reinforces the need to interpret both accuracy and speed jointly, rather than in isolation, when evaluating visuomotor ability.
In terms of reproducibility, the Bland–Altman analysis confirmed small biases and acceptable limits of agreement for both intra- and inter-examiner comparisons, indicating that observed variations are unlikely to reflect true performance changes. These findings align with previous computerized reliability studies in sports and clinical settings [16,23,43,44], and support the use of the COI-SV® protocol as a stable monitoring tool. Compared to other available tools, such as Acuvision, Leap Motion, or EyeTribe, the COI-SV® protocol offers a number of practical advantages. It can be implemented on standard computer hardware without the need for specialized VR or haptic devices, making it more accessible and cost-efficient. At the same time, it provides standardized, reproducible measurements that are less dependent on examiner influence or contextual variability. Unlike fully immersive VR approaches, COI-SV® maintains ecological validity by replicating digital interaction contexts common in everyday life, while still allowing for precise quantification of visuomotor performance. Thus, rather than replacing immersive or haptic systems, COI-SV® complements them by offering a fast, scalable, and objective tool suitable for baseline screening, telehealth applications, and longitudinal monitoring.
Several limitations should be acknowledged. The study population consisted exclusively of healthy adults, limiting the generalizability of the findings to children, older adults, or individuals with neurological or visual disorders. The protocol did not include more complex task demands such as bimanual coordination, multitasking, or fatigue conditions, which may influence performance in applied settings. Another limitation concerns ecological validity: while the COI-SV® task provides standardized and reproducible measurements, tapping digital red dots may not fully reflect the visuomotor complexity of real-world contexts such as sports performance or clinical rehabilitation, where additional perceptual, motor, and decision-making demands are present. These considerations highlight the need for future studies to extend validation to different populations and more complex testing scenarios.
Future research should also examine the sensitivity of the COI-SV® protocol in detecting changes after visuomotor training, clinical rehabilitation, or neurological interventions. Validation in clinical cohorts where hand–eye coordination is critical, such as stroke survivors, individuals with Parkinson’s disease, or children with ADHD, would provide further evidence of its clinical applicability. Integrating the platform with emerging technologies such as real-time eye tracking, haptic feedback, or immersive VR could enhance ecological validity and provide richer insights into visuomotor control. Longitudinal applications are also promising, since the COI-SV® could be used to track recovery trajectories in rehabilitation or to monitor progressive improvements following sports or therapeutic training. Such studies will be important to distinguish true performance changes from intra-individual variability and measurement error.
From a practical perspective, the COI-SV® protocol represents a reliable and accessible tool for the objective assessment of hand–eye coordination. Its potential applications extend from baseline evaluations and follow-up in rehabilitation programs to functional assessments in sports performance, optometry, and neurology. The ability to obtain precise and comparable measurements across sessions and examiners facilitates the detection of clinically meaningful changes, supports individualized interventions, and allows integration into telemedicine and digital health platforms.

5. Conclusions

In conclusion, this study confirms that the computerized assessment of hand–eye coordination using the COI-SV® protocol is a reliable, objective, and reproducible tool for use in healthy adults. Both accuracy and response time demonstrated acceptable stability across sessions and examiners, supporting the methodological robustness of the test and its potential utility for monitoring purposes in clinical, sports, and research settings.
The absence of significant sex- or age-related differences in accuracy, along with the identification of the classic speed–accuracy trade-off, reinforces the value of this protocol for comprehensive visuomotor function assessment. While learning or familiarization effects were minimal, the study highlights the importance of standardized administration to maximize measurement reliability.
Although the results should be interpreted in light of the sample characteristics and evaluation context, the computerized test offers a solid foundation for future clinical applications and research involving diverse populations, functional settings, and emerging assessment technologies.

Author Contributions

Conceptualization, A.R.-V., M.C.S.-G. and E.G.-S.; methodology, A.R.-V., M.C.S.-G. and E.G.-S.; software, A.R.-V.; validation, A.R.-V., E.G.-S. and M.C.S.-G.; formal analysis, A.R.-V. and C.M.-P.; investigation, A.R.-V. and C.M.-P.; resources, A.R.-V., M.C.S.-G. and E.G.-S.; data curation, A.R.-V.; writing—original draft preparation, A.R.-V., C.M.-P. and M.C.S.-G.; writing—review and editing, A.R.-V., M.C.S.-G., C.M.-P. and E.G.-S.; visualization, A.R.-V., M.C.S.-G., C.M.-P. and E.G.-S.; supervision, M.C.S.-G. and E.G.-S.; project administration, M.C.S.-G. and E.G.-S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

All procedures involving human participants in this study were conducted in accordance with the ethical standards of the institutional research committee and in line with the principles of the 1964 Helsinki Declaration. This study was approved by Research Ethics Committee of the University of Seville (approval number: 0733-N-22).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors on request.

Acknowledgments

We would like to thank all the participants involved in the project. We also extend our gratitude to the staff of “Centro de Optometría Internacional” (Madrid), where this study was conducted, for their valuable support.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Crawford, J.D.; Medendorp, W.P.; Marotta, J.J. Spatial transformations for eye-hand coordination. J. Neurophysiol. 2004, 92, 10–19. [Google Scholar] [CrossRef] [PubMed]
  2. Kauffman, J.M.; Hallahan, D.P.; Pullen, P.C. Handbook of Special Education, 2nd ed.; Taylor & Francis: Abingdon, UK, 2017. [Google Scholar]
  3. Fliers, E.; Rommelse, N.; Vermeulen, S.H.; Altink, M.; Buschgens, C.J.; Faraone, S.V.; Sergeant, J.A.; Franke, B.; Buitelaar, J.K. Motor coordination problems in children and adolescents with ADHD rated by parents and teachers: Effects of age and gender. J. Neural Transm. 2008, 115, 211–220. [Google Scholar] [CrossRef] [PubMed]
  4. Gao, K.L.; Ng, S.S.; Kwok, J.W.; Chow, R.T.; Tsang, W.W. Eye-hand coordination and its relationship with sensori-motor impairments in stroke survivors. J. Rehabil. Med. 2010, 42, 368–373. [Google Scholar] [CrossRef]
  5. Gowen, E.; Miall, R.C. Eye-hand interactions in tracing and drawing tasks. Hum. Mov. Sci. 2006, 25, 568–585. [Google Scholar] [CrossRef] [PubMed]
  6. Niechwiej-Szwedo, E.; Wu, S.; Nouredanesh, M.; Tung, J.; Christian, L.W. Development of eye-hand coordination in typically developing children and adolescents assessed using a reach-to-grasp sequencing task. Hum. Mov. Sci. 2021, 80, 102868. [Google Scholar] [CrossRef]
  7. Kim, H.J.; Lee, C.H.; Kim, E.Y. Temporal differences in eye–hand coordination between children and adults during manual action on objects. Hong Kong J. Occup. Ther. 2018, 31, 106–114. [Google Scholar] [CrossRef]
  8. Battaglia-Mayer, A.; Caminiti, R. Parieto-frontal networks for eye–hand coordination and movements. Handb. Clin. Neurol. 2018, 151, 499–524. [Google Scholar] [CrossRef]
  9. Leigh, R.J.; Zee, D.S. The Neurology of Eye Movements, 5th ed.; Oxford University Press: Oxford, UK, 2015. [Google Scholar]
  10. Rand, M.; Stelmach, G. Effects of hand termination and accuracy constraint on eye–hand coordination during sequential two-segment movements. Exp. Brain Res. 2010, 207, 197–211. [Google Scholar] [CrossRef]
  11. Terrier, R.; Forestier, N.; Berrigan, F.; Germain-Robitaille, M.; Lavallière, M.; Teasdale, N. Effect of terminal accuracy requirements on temporal gaze-hand coordination during fast discrete and reciprocal pointings. J. Neuroeng. Rehabil. 2011, 8, 10. [Google Scholar] [CrossRef]
  12. Lazzari, S.; Mottet, D.; Vercher, J.L. Eye-hand coordination in rhythmical pointing. J. Mot. Behav. 2009, 41, 294–304. [Google Scholar] [CrossRef]
  13. de Vries, S.; Huys, R.; Zanone, P.G. Keeping your eye on the target: Eye—hand coordination in a repetitive Fitts’ task. Exp. Brain Res. 2018, 236, 3181–3190. [Google Scholar] [CrossRef]
  14. Guiard, Y. On Fitts’s and Hooke’s laws: Simple harmonic movement in upper-limb cyclical aiming. Acta Psychol. 1993, 82, 139–159. [Google Scholar] [CrossRef] [PubMed]
  15. Huys, R.; Fernandez, L.; Bootsma, R.J.; Jirsa, V.K. Fitts’ law is not continuous in reciprocal aiming. Proc. Biol. Sci. 2010, 277, 1179–1184. [Google Scholar] [CrossRef]
  16. Ngadiyana, H. The effect of eye-hand coordination training on accuracy of service in volleyball players. In Proceedings of the 1st South Borneo International Conference on Sport Science and Education (SBICSSE 2019), Banjarmasin, Indonesia, 28–29 November 2019; Atlantis Press: Dordrecht, The Netherlands, 2020. [Google Scholar]
  17. Carey, D.P. Eye–hand coordination: Eye to hand or hand to eye? Curr. Biol. 2000, 10, R416–R419. [Google Scholar] [CrossRef] [PubMed]
  18. Dane, S.; Hazar, F.; Tan, Ü. Correlations between eye-hand reaction time and power of various muscles in badminton players. Int. J. Neurosci. 2008, 118, 349–354. [Google Scholar] [CrossRef]
  19. Dube, S.P.; Mungal, S.U.; Kulkarni, M.B. Simple visual reaction time in badminton players: A comparative study. Natl. J. Physiol. Pharm. Pharmacol. 2015, 5, 18–20. [Google Scholar] [CrossRef]
  20. Di, X.; Zhu, S.; Jin, H.; Wang, P.; Ye, Z.; Zhou, K.; Zhuo, Y.; Rao, H. Altered resting brain function and structure in professional badminton players. Brain Connect. 2012, 2, 225–233. [Google Scholar] [CrossRef]
  21. Leo, A.; Handjaras, G.; Bianchi, M.; Marino, H.; Gabiccini, M.; Guidi, A.; Scilingo, E.P.; Pietrini, P.; Bicchi, A.; Santello, M.; et al. A synergy-based hand control is encoded in human motor cortical areas. Elife 2016, 5, e13420. [Google Scholar] [CrossRef]
  22. Komogortsev, O.V.; Gobert, D.V.; Jayarathna, S.; Koh, D.H.; Gowda, S.M. Standardization of automated analyses of oculomotor fixation and saccadic behaviors. IEEE Trans. Biomed. Eng. 2010, 57, 2635–2645. [Google Scholar] [CrossRef]
  23. Ujbányi, T.; Kővári, A.; Sziládi, G.; Katona, J. Examination of the eye-hand coordination related to computer mouse movement. Infocommunications J. 2020, 12, 26–31. [Google Scholar] [CrossRef]
  24. Popelka, S.; Stachoň, Z.; Šašinka, Č.; Doležalová, J. EyeTribe tracker data accuracy evaluation and its interconnection with Hypothesis software for cartographic purposes. Comput. Intell. Neurosci. 2016, 2016, 9172506. [Google Scholar] [CrossRef]
  25. Ooms, K.; Dupont, L.; Lapon, L.; Popelka, S. Accuracy and precision of fixation locations recorded with the low-cost Eye Tribe tracker in different experimental set-ups. J. Eye Mov. Res. 2015, 8, 1–24. [Google Scholar] [CrossRef]
  26. Dalmaijer, E. Is the low-cost EyeTribe eye tracker any good for research? PeerJ Prepr. 2014, 2, e585v1. [Google Scholar] [CrossRef]
  27. Weichert, F.; Bachmann, D.; Rudak, B.; Fisseler, D. Analysis of the accuracy and robustness of the Leap Motion Controller. Sensors 2013, 13, 6380–6393. [Google Scholar] [CrossRef]
  28. GRANMO. Sample Size and Power Calculator V 7.12. Institut Municipal d’Investigació Médica. Available online: http://www.imim.es/ofertadeserveis/software-public/granmo/ (accessed on 15 July 2025).
  29. Fogt, N.; Uhlig, R.; Thach, D.P.; Liu, A. The influence of head movement on the accuracy of a rapid pointing task. Optometry 2002, 73, 665–673. [Google Scholar]
  30. Ríder-Vázquez, A.; Vega-Holm, M.; Sánchez-González, M.C.; Gutiérrez-Sánchez, E. Minimum perceptual time (MPT). Repeatability and reproducibility of variables applied to “sports vision”. Graefe’s Arch. Clin. Exp. Ophthalmol. 2025, 263, 1175–1182. [Google Scholar] [CrossRef]
  31. Jorge, J.; Fernandes, P. Static and dynamic visual acuity and refractive errors in elite football players. Clin. Exp. Optom. 2019, 102, 51–56. [Google Scholar] [CrossRef]
  32. Nascimento, H.; Alvarez-Peregrina, C.; Martinez-Perez, C.; Sánchez-Tena, M.Á. Vision in futsal players: Coordination and reaction time. Int. J. Environ. Res. Public Health 2021, 18, 9069. [Google Scholar] [CrossRef]
  33. Antona, B.; Barrio, A.; Barra, F.; Gonzalez, E.; Sanchez, I. Repeatability and agreement in the measurement of horizontal fusional vergences. Ophthalmic Physiol. Opt. 2008, 28, 475–491. [Google Scholar] [CrossRef] [PubMed]
  34. Anstice, N.S.; Davidson, B.; Field, B.; Mathan, J.; Collins, A.V.; Black, J.M. The repeatability and reproducibility of four techniques for measuring horizontal heterophoria: Implications for clinical practice. J. Optom. 2021, 14, 275–281. [Google Scholar] [CrossRef] [PubMed]
  35. McCullough, S.J.; Doyle, L.; Saunders, K.J. Intra- and inter-examiner repeatability of cycloplegic retinoscopy among young children. Ophthalmic Physiol. Opt. 2017, 37, 16–23. [Google Scholar] [CrossRef]
  36. Coudiere, A.; Danion, F.R. Eye-hand coordination during sequential reaching to uncertain targets: The effect of task difficulty, target width, movement amplitude, and task scaling. Exp. Brain Res. 2025, 243, 143. [Google Scholar] [CrossRef]
  37. Igoresky, A.; Tangkudung, J. Relationship between eye-hand coordination and precision service. In Proceedings of the 1st International Conference on Sport Sciences, Health and Tourism (ICSSHT 2019), Padang, Indonesia, 13–14 November 2019; Atlantis Press: Dordrecht, The Netherlands, 2021. [Google Scholar]
  38. Brinkman, C.; Baez, S.E.; Quintana, C.; Andrews, M.L.; Heebner, N.R.; Hoch, M.C.; Hoch, J.M. The reliability of an upper- and lower-extremity visuomotor reaction time task. J. Sport Rehabil. 2020, 30, 828–831. [Google Scholar] [CrossRef]
  39. Pastel, S.; Klenk, F.; Bürger, D.; Heilmann, F.; Witte, K. Reliability and validity of a self-developed virtual reality-based test battery for assessing motor skills in sports performance. Sci. Rep. 2025, 15, 6256. [Google Scholar] [CrossRef]
  40. Abid, M.; Poitras, I.; Gagnon, M.; Mercier, C. Eye-hand coordination during upper limb motor tasks in individuals with or without a neurodevelopmental disorder: A systematic review. Front. Neurol. 2025, 16, 1569438. [Google Scholar] [CrossRef]
  41. Szabo, D.A.; Neagu, N.; Teodorescu, S.; Sopa, I.S. Eye-hand relationship of proprioceptive motor control and coordination in children 10–11 years old. Health Sports Rehabil. Med. 2020, 21, 185–191. [Google Scholar] [CrossRef]
  42. Fitts, P.M. The information capacity of the human motor system in controlling the amplitude of movement. J. Exp. Psychol. 1954, 47, 381–391. [Google Scholar] [CrossRef]
  43. Lavoie, E.; Hebert, J.S.; Chapman, C.S. Comparing eye–hand coordination between controller-mediated virtual reality and a real-world object interaction task. J. Vis. 2024, 24, 9. [Google Scholar] [CrossRef]
  44. Lavoie, E.; Hebert, J.S.; Chapman, C.S. How a lack of haptic feedback affects eye-hand coordination and embodiment in virtual reality. Sci. Rep. 2025, 15, 25219. [Google Scholar] [CrossRef]
Figure 1. Experimental protocol. Schematic representation of participant flow across the four test sessions: Day 1 (Examiner 1 and Examiner 2) and Day 2 (Examiner 1 and Examiner 2), with a two-week interval between days and 5–15 min rest periods between sessions.
Figure 2. Mean accuracy scores (A) and response times (B) per session.
Figure 3. (A) Response time by sex in Session 1, showing lower mean times in men. (B) Scatter plot illustrating the positive association between age and response time in Session 1.
Figure 4. Bland–Altman plots illustrating agreement for accuracy and response time measures. (A) Inter-examiner agreement for accuracy scores (Day 1). (B) Inter-session agreement for response time (Day 1 vs. Day 2). (C) Inter-session agreement for accuracy (Day 1 vs. Day 2). (D) Inter-examiner agreement for response time (Day 1). The solid horizontal line represents the mean bias, and the dotted lines indicate the 95% limits of agreement.
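The bias and limits of agreement shown in these plots follow the standard Bland–Altman construction: the mean of the paired differences, plus or minus 1.96 times their sample standard deviation. A minimal sketch in Python (the data and function name are illustrative, not taken from the COI-SV® software or this study):

```python
import numpy as np

def bland_altman(a, b):
    """Return the mean bias and 95% limits of agreement for paired measures."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)  # sample SD of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical accuracy scores from the same participants on two days
day1 = [78, 82, 90, 71, 85]
day2 = [76, 83, 88, 70, 82]
bias, (lower, upper) = bland_altman(day1, day2)
```

A bias close to zero with narrow limits, as in Table 1 for response time, indicates that neither session nor examiner introduces a systematic shift.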
Table 1. Summary of reproducibility analyses for accuracy and response time.
| Comparison | Measure | Bias (Mean Difference) | 95% LoA | ICC (95% CI) |
|---|---|---|---|---|
| Inter-session (Day 1 vs. Day 2) | Accuracy | +1.31 points | −5.39 to 8.00 | 0.70 (0.52–0.82) |
| | Response time | −0.01 s | −0.13 to 0.11 | |
| Inter-examiner (Day 1) | Accuracy | +1.33 points | −8.54 to 11.20 | 0.51 (0.28–0.68) |
| | Response time | −0.01 s | −0.12 to 0.11 | |
| Inter-examiner (Day 2) | Accuracy | +0.61 points | −6.98 to 8.20 | 0.71 (0.55–0.82) |
| | Response time | −0.001 s | −0.09 to 0.08 | |
| Intra-examiner (Examiner 1) | Accuracy | +1.67 points | −8.49 to 11.82 | |
| Intra-examiner (Examiner 2) | Accuracy | +0.94 points | −8.20 to 10.09 | |
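The ICC values above are interpretable under a two-way absolute-agreement model. As a minimal sketch of how a single-measures ICC(2,1) is computed from the ANOVA mean squares (illustrative only; this is not the authors' analysis script, and the function name is ours):

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single measures.
    ratings: (n_subjects, k_raters) array of scores."""
    Y = np.asarray(ratings, float)
    n, k = Y.shape
    grand = Y.mean()
    rows = Y.mean(axis=1)  # subject means
    cols = Y.mean(axis=0)  # rater/session means
    msr = k * ((rows - grand) ** 2).sum() / (n - 1)  # between-subjects MS
    msc = n * ((cols - grand) ** 2).sum() / (k - 1)  # between-raters MS
    resid = Y - rows[:, None] - cols[None, :] + grand
    mse = (resid ** 2).sum() / ((n - 1) * (k - 1))   # residual MS
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Perfect agreement between two raters yields ICC = 1
icc = icc_2_1([[1, 1], [2, 2], [3, 3]])
```

Values of 0.5–0.75, such as those in Table 1, are conventionally read as moderate-to-good reliability.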
