Article

Eye Movement Indicator Difference Based on Binocular Color Fusion and Rivalry

by Xinni Zhang 1,2,3, Mengshi Dai 2,3, Feiyan Cheng 1,3, Lijun Yun 1,2 and Zaiqing Chen 1,2,3,*
1 School of Information Science and Technology, Yunnan Normal University, Kunming 650500, China
2 Engineering Research Center of Computer Vision and Intelligent Control Technology, Department of Education of Yunnan Province, Kunming 650500, China
3 Yunnan Key Laboratory of Optoelectronic Information Technology, Kunming 650500, China
* Author to whom correspondence should be addressed.
J. Eye Mov. Res. 2025, 18(2), 10; https://doi.org/10.3390/jemr18020010
Submission received: 20 February 2025 / Revised: 23 March 2025 / Accepted: 1 April 2025 / Published: 5 April 2025

Abstract

Color fusion and rivalry are two key information integration mechanisms in binocular vision, representing the visual system’s processing patterns for consistent and conflicting inputs, respectively. This study hypothesizes that there are quantifiable differences in eye movement indicators under states of binocular color fusion and rivalry, which can be verified through multi-paradigm eye movement experiments. The experiment recruited eighteen subjects with normal vision (nine males and nine females), employing the Gaze Stability paradigm, Straight Curve Eye Hopping paradigm, and Smoothed Eye Movement Tracking paradigm for eye movement tracking. Each paradigm included a binocular color rivalry experimental group (R-G) and two binocular color fusion control groups (R-R, G-G). Data analysis indicates significant differences in indicators such as Average Saccade Amplitude, Median Saccade Amplitude, and SD Saccade Amplitude between binocular color fusion and rivalry states. Through Z-Score normalization and cross-paradigm merged analysis, specific ranges of these indicators were identified that distinguish between the two states. When the Average Saccade Amplitude falls within the range of −0.905 to −0.693, it indicates a state of binocular color rivalry; when it falls within 0.608 to 1.294, it reflects a state of binocular color fusion. ROC curve analysis then confirmed the effectiveness of the experimental paradigms in analyzing the mechanisms of binocular color fusion and rivalry, with AUC values of 0.990, 0.741, and 0.967, respectively. These results reveal the potential of eye movement behaviors as biomarkers for the dynamic processing of visual conflicts. This finding provides empirical support for understanding the neural computational models of binocular vision and lays a methodological foundation for developing visual impairment assessment tools based on eye movement features.

1. Introduction

When the left and right eyes receive different images, the brain processes these inputs in two distinct ways: slight differences lead to binocular fusion, while significant differences result in binocular rivalry [1,2,3]. Binocular fusion and rivalry represent complementary phenomena in binocular vision that are frequently discussed in studies concerning consciousness, attention, and the brain’s processing of conflicting visual information. Breese revealed the dynamics of inhibition between opposing colors using green-red stimuli [4], a study later expanded by Gellhorn and Schoeppe who explored color dominance with varying intensities [5]. From the 20th century onwards, research shifted from monocular to binocular rivalry, with Levelt proposing models of alternating dominance [6] and Blake and Fox examining the hierarchy of visual processing through aftereffects [7]. Wandell et al. contributed a model explaining how the brain integrates color information from both eyes [8].
Entering the 21st century, Li et al. demonstrated that binocular stimulation training could significantly improve visual integration and perceptual stability in adult amblyopic patients, highlighting the potential of such training for visual rehabilitation [9]. Brascamp et al. reviewed Levelt’s laws of binocular rivalry, revealing that periodicity and stability are closely related to neural oscillatory activity, providing new insights into understanding its dynamic mechanisms [10]. Riesen et al. found that the visual system can flexibly switch between rivalry and fusion, indicating that external stimuli and neural processing influence these perceptions [11]. Chen et al. quantified the effects of inconsistency and color difference on binocular color rivalry, discovering that increased inconsistency enhances rivalry likelihood, whereas minor inconsistencies favor fusion [12]. Lv et al. analyzed EEG signals using the EEGNet model, showing that dorsal areas play a significant role in distinguishing color fusion and rivalry [13]. Asano and Wang introduced a preliminary rivalry indicator ΔE*bino, proving the critical role of color differences in binocular rivalry and offering a new tool for quantifying these processes [14].
Color fusion and rivalry are two key information integration mechanisms in binocular vision, representing the visual system’s processing modes for consistent and conflicting inputs, respectively. Existing studies have shown that binocular color rivalry and fusion phenomena are closely related to the neural mechanisms of visual information processing, involving processes such as attention allocation [15], conflict monitoring [16], and multisensory integration [17]. However, most existing studies focus on macroscopic representations of behavioral responses or EEG signals, with few systematically exploring the potential value of eye movement indicators in distinguishing fusion from rivalry.
This research gap may stem from two aspects: first, traditional experimental designs often view eye movements as nuisance variables, suppressing their natural behavior by fixing gaze points [18], thereby ignoring the function of eye movements as “indicators of cognitive load”; second, the rapid dynamics of rivalry and fusion (such as millisecond-level perceptual switching) impose higher spatiotemporal resolution requirements on eye movement data [19]. However, with advancements in high-precision eye movement technology and machine learning, the potential of eye movement indicators in parsing millisecond-level cognitive processes (such as conflict detection and attention switching) is gradually becoming evident. For instance, Raveendran et al. discovered that saccadic movements can modulate the efficiency of binocular field integration [20], while Kalisvaart and Goossens further proposed that interactions between eye movement signals and extra-retinal signals might encode rhythmic characteristics of competitive alternation [21]. These findings suggest that eye movement behaviors are not only passive reflections of perceptual outputs but also actively participate in the dynamic process of conflict resolution.
Notably, although some studies have focused on the role of tiny eye movements in maintaining visual stability [22,23], they have yet to clearly answer a crucial question: Can eye movement indicators effectively distinguish cognitive states of color fusion and rivalry? Hirota et al. found that decreased binocular fusion ability could serve as an indicator of visual fatigue [24], but their study did not address mechanisms regulating color conflicts. Grossberg et al. revealed associations between saccade velocity and depth perception but did not explore specific responses in color rivalry [25]. Furthermore, existing eye movement studies primarily focus on monochrome or grayscale stimuli [26], lacking systematic analysis of the regulatory mechanisms of asymmetric adaptation across color channels [27].
Therefore, this study aims to systematically analyze eye movement indicators under states of binocular color fusion and rivalry through multi-paradigm eye movement experiments, including the Gaze Stability paradigm, Straight Curve Eye Hopping paradigm, and Smoothed Eye Movement Tracking paradigm. We hypothesize that eye movement indicators will show significant differences when distinguishing between states of binocular color fusion and rivalry.

2. Methods

2.1. Instrumentation and Viewing Groups

Figure 1 illustrates the apparatus and equipment used in our experiments. Visual stimuli were presented on a 3D display (Samsung Electronics, Suwon, Gyeonggi-do, Republic of Korea) paired with matching 3D shutter glasses (Samsung Electronics, Suwon, Gyeonggi-do, Republic of Korea). The display features a 2D/3D switching capability, with dimensions of 511.8 mm (H) × 288.3 mm (V), a resolution of 1920 × 1080 pixels, and a refresh rate of 120 Hz. The 3D shutter glasses enabled viewing of the 3D stimuli by pressing a switch button. Stimuli were calibrated to specific CIE-1931 coordinates and luminance using the look-up table (LUT) method. Brightness and chromaticity were measured at the display’s center through the glasses using a spectroradiometer (Photo Research PR-715, Photo Research, Granite Bay, CA, USA), achieving a characterization accuracy of 1.56 CIELAB units (ΔE*ab). A chin support device secured the participant’s head, minimizing deviations due to spontaneous movement. Binocular eye movement data were collected with an EyeLink 1000 Plus video eye tracker (SR Research, Ottawa, ON, Canada) at 2 kHz. All experiments were conducted in a dark room, with participants seated 50 cm from the display.

2.2. Subjects

Eighteen subjects (nine males and nine females) participated in visual acuity and visual function testing, meeting all experimental criteria. The subjects were aged between 22 and 30 years old, with 11 wearing glasses and 7 not. All subjects had normal or corrected-to-normal visual acuity, no color blindness or color vision deficiency, and normal stereopsis. Each subject signed an informed consent form and participated in the experiment anonymously [28]. A detailed report on the vision of the 18 subjects is provided in Appendix A Table A1.
In the context of binocular color rivalry, various factors of the visual system, such as visual acuity, can influence attention allocation and rivalry dynamics. Lower visual acuity may lead the visual system to prioritize more salient stimuli, thereby affecting target visibility and information detection efficiency [29]. Therefore, to ensure the reliability of experimental data, we implemented strict screenings of visual acuity and visual function for all participants.
Ophthalmologists, along with a trained research assistant (author Xinni Zhang), conducted comprehensive screenings. Visual acuity was evaluated using logarithmic visual acuity charts at a standard distance of 4 m for both uncorrected and corrected vision. For myopic subjects, specialized optometric instruments were used to quantify corrected visual acuity accurately. Following this, in-depth visual function tests were conducted, including assessing color vision with the Ishihara Color Blindness Test and measuring depth perception using the Titmus Stereoscopic Chart (stereoscopic sensitivity ≤ 100 arcsec). Additionally, the presence and location of the blind spot were determined using a traditional method, in which a research assistant guided the participant to track a small pointer as it was moved gradually.
This rigorous screening procedure ensured uniformity among all participants in terms of visual acuity, color vision, and stereoscopic vision, thereby minimizing potential individual differences that could impact the experimental results and enhancing the reproducibility and robustness of this study’s conclusions.

2.3. Stimuli Paradigm

In the CIELAB color space, all stimulus colors were selected at a lightness (L*) of 30, with a luminance of 15 ± 0.5 cd/m2. Building on previous research [30], we quantified the binocular chromatic fusion limit, establishing that binocular color rivalry occurs when the color difference exceeds 27.55. To elicit a pronounced rivalry effect, we presented two opposing colors—R: L*a*b* (30, 30, 0) and G: L*a*b* (30, −30, 0), referred to as R-G—to each eye separately.
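As a quick plausibility check (our illustration, not part of the original analysis, and assuming the 27.55 limit is expressed as a CIE76 color difference), the separation between the two stimulus colors can be computed directly:

```python
# Quick check (ours, not from the paper): the CIE76 color difference between the two stimuli,
# assuming the 27.55 fusion limit is expressed in the same CIELAB units.
import math

def delta_e_cie76(lab1, lab2):
    """Euclidean distance in CIELAB (CIE76 Delta E*ab)."""
    return math.dist(lab1, lab2)

# R = L*a*b*(30, 30, 0), G = L*a*b*(30, -30, 0)
print(delta_e_cie76((30, 30, 0), (30, -30, 0)))  # 60.0, well above the 27.55 rivalry threshold
```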
Homogeneous fusion involves the brain integrating identical colors from both eyes into a single, more vivid hue, while heterogeneous fusion leads to the perception of mixed or contrasting colors. These experimental groups enable the investigation of binocular fusion mechanisms and the integration of different colors by the visual system. In the heterogeneous fusion condition, color differences can produce complex phenomena, like color mixing and depth perception. Conversely, homogeneous fusion minimizes color interference, allowing for a clearer exploration of the basic fusion mechanisms. Thus, we selected homogeneous stimuli (R-R and G-G) for the control group to facilitate normal binocular color fusion (see Figure 2). To prevent background interference with color fusion effects, the stimuli were presented against a black background with a luminance of 0.24 cd/m2.
Three experimental paradigms were utilized: the Gaze Stability paradigm, Straight Curve Eye Hopping paradigm, and Smoothed Eye Movement Tracking paradigm [31,32,33,34]. Each paradigm included an experimental group (R-G) and two control groups (R-R and G-G), with equal presentation times across all groups.
In the Gaze Stability paradigm (see Figure 3), a target disk with a diameter of 2° is fixed at the center of the screen and presented continuously for 10 s. Subsequently, a distractor disk, identical in size and color to the target disk, appears randomly at one of six preset positions (A–F) for 500 ms before disappearing. The distractor disk is presented in random order and cycles through all positions. Participants are required to fixate on the target disk throughout the entire 130 s duration until all positions have been tested.
The Straight Curve Eye Hopping paradigm (see Figure 4) consists of two movement paths: straight and curved. In Figure 4a, the disk moves linearly between points A and G at a speed of 4.5° per 1.2 s for 14 repetitions. In Figure 4b, the disk starts at points A or B and follows two arcs at the same speed. This segment lasts a total of 100 s, with 38 s for the straight path and 62 s for the curved path. Participants were required to track the disk’s movements with their eyes.
In the Smoothed Eye Movement Tracking paradigm (see Figure 5), the disk moves back and forth along an “S”-shaped path, beginning from points A or B. The entire experiment lasts 80 s, with each “S” and its reverse taking 40 s. Participants were instructed to maintain smooth eye movements to follow the disk’s trajectory.

2.4. Procedure

The experimental procedure is illustrated in Figure 6. To enhance the alertness and concentration of participants, the experiment was scheduled in the morning, starting at 8:00 and ending at approximately 9:20, with each session lasting about 80 min and only one participant involved per day. Prior to the start of the experiment, participants were required to wear active stereo glasses, adjust their seating, and complete calibration on the eye movement instrument to ensure data accuracy (for detailed calibration steps, see Appendix B). After calibration, a 10 s pre-test was conducted where participants had to report whether they perceived any rivalry phenomena. If perception was reported, the experiment proceeded; otherwise, the experiment was paused.
To minimize measurement errors, a 10 s transitional gray field was displayed before and after each stimulus presentation, ensuring equal exposure time for both eyes to identical stimuli. Stimulus presentations were randomized, with rivalry and fusion stimuli alternating. Throughout the experiment, participants performed eye movement tasks according to the requirements of different paradigms.
The completion of all random groups marked the end of one experimental cycle. Repeating this cycle three times indicated the completion of data collection.

2.5. Data Processing and Statistical Analysis

Data processing commenced with the elimination of erroneous records and the exclusion of data from participants who failed to complete tasks as required. Outliers within the eye movement data were identified and removed using box plots alongside the 3σ principle. To ensure comparability across participants, we calculated the Mean and standard deviation (SD) for each eye’s movement data, as well as for both eyes combined, followed by normalization of relevant indicators to mitigate the impact of individual differences on analysis outcomes. Each participant completed three repetitions under both experimental and control states to minimize biases stemming from distractions, instrument misalignment, or calibration errors. The raw data and stimulus materials are accessible via the following link: https://doi.org/10.6084/m9.figshare.25688625.v1 (dataset accessed on 25 April 2024).
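A minimal preprocessing sketch is given below. It is our illustration rather than the authors’ code; the variable names and the synthetic indicator values are hypothetical, and it simply combines the box-plot (1.5 × IQR) rule with the 3σ rule before rescaling each indicator by its own mean and SD.

```python
# Minimal sketch (not the authors' code) of the outlier screening and normalization steps
# described above, assuming one pandas Series per eye movement indicator and participant.
import numpy as np
import pandas as pd

def remove_outliers(series: pd.Series) -> pd.Series:
    """Drop values flagged by either the box-plot (1.5 * IQR) rule or the 3-sigma rule."""
    q1, q3 = series.quantile(0.25), series.quantile(0.75)
    iqr = q3 - q1
    box_ok = series.between(q1 - 1.5 * iqr, q3 + 1.5 * iqr)
    mu, sigma = series.mean(), series.std()
    sigma_ok = series.between(mu - 3 * sigma, mu + 3 * sigma)
    return series[box_ok & sigma_ok]

def normalize(series: pd.Series) -> pd.Series:
    """Scale an indicator by its own mean and SD so it is comparable across participants."""
    return (series - series.mean()) / series.std()

# Usage with synthetic data standing in for one indicator (e.g., saccade amplitude in degrees).
rng = np.random.default_rng(0)
amplitude = pd.Series(rng.normal(20.0, 6.0, size=500))
amplitude_clean = normalize(remove_outliers(amplitude))
print(len(amplitude), len(amplitude_clean))
```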
To assess significant interocular differences, a paired t-test was conducted on the left and right eye data within the same experimental group, with Bonferroni correction applied (adjusted α′ = 0.0013). Given that binocular fusion and rivalry involve the brain’s integration of inputs from both eyes to form a single percept, rather than isolating contributions from each eye, data from both eyes were merged. For binocular eye movement data conforming to a normal distribution, an independent samples t-test was employed. Otherwise, a Mann–Whitney U test was used for non-normally distributed data. After detecting indicators with significant differences across paradigms, we further performed a Z-score normalization analysis to summarize the ranges of these indicators, removing overlapping ranges between binocular color fusion and rivalry states to more precisely distinguish based on eye movement features. To ensure that the obtained indicator ranges could be effectively applied in non-controlled experiments, we conducted a cross-paradigm analysis of eye movement indicators, identifying the intersection ranges of indicators that showed significant differences across the Gaze Stability paradigm, Straight Curve Eye Hopping paradigm, and Smoothed Eye Movement Tracking paradigm, thereby highlighting common features among the paradigms. Additionally, ROC analysis was utilized to evaluate the capability of the three paradigms in distinguishing between binocular color fusion and rivalry. All statistical analyses were executed using SPSS 27.0 software, with a significance level set at p < 0.05 (corrected where applicable).
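The test-selection logic described above can be sketched as follows; this is a hedged illustration under our own assumptions (synthetic data, an illustrative Bonferroni divisor chosen to approximate the reported α′ = 0.0013), not the analysis script used for the study.

```python
# Hedged sketch of the test-selection logic in this section, assuming NumPy arrays holding one
# indicator for the rivalry (R-G) group and the pooled fusion (R-R, G-G) group.
import numpy as np
from scipy import stats

def interocular_difference(left: np.ndarray, right: np.ndarray, n_tests: int = 38) -> bool:
    """Paired t-test on left vs. right eye data with a Bonferroni-adjusted alpha (illustrative divisor)."""
    _, p = stats.ttest_rel(left, right)
    return p < 0.05 / n_tests

def compare_states(rivalry: np.ndarray, fusion: np.ndarray, alpha: float = 0.05):
    """Independent t-test if both groups look normal (Shapiro-Wilk), otherwise Mann-Whitney U."""
    normal = stats.shapiro(rivalry)[1] > alpha and stats.shapiro(fusion)[1] > alpha
    if normal:
        return "t-test", stats.ttest_ind(rivalry, fusion)[1]
    return "Mann-Whitney U", stats.mannwhitneyu(rivalry, fusion, alternative="two-sided")[1]

rng = np.random.default_rng(1)
print(compare_states(rng.normal(9.5, 5.3, 18), rng.normal(30.0, 9.0, 36)))
```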

3. Results

3.1. Results of Interocular Significant Difference

Appendix A Table A2, Table A3 and Table A4 summarize the results, showing significant differences in Gaze X, Gaze Y, and Pupil Size, while Acceleration X, Acceleration Y, and Velocity X remained stable across all three paradigms.
Gaze X, Gaze Y, and Pupil Size reflect eye position and pupil response, which are sensitive to shifts in visual input and binocular vision integration. These differences are expected due to changes in visual tasks. In contrast, Acceleration X, Acceleration Y, and Velocity X, which measure the eyes’ speed and acceleration, showed no significant differences, consistent with Robinson’s theory that eye movements are centrally controlled to ensure coordination between both eyes during movement [35].
Since binocular fusion and rivalry reflect the brain’s ability to integrate and process inputs from both eyes into a single percept, we focus on analyzing the averaged data from both eyes to capture the visual system’s collaborative response rather than isolating monocular contributions.

3.2. Results of Normal Distribution Analysis

The results of the normal distribution test are listed in Appendix A Table A5, and t-tests and Mann–Whitney U tests for all three experimental paradigms are listed in Appendix A Table A6, Table A7, Table A8 and Table A9.
As shown in Figure 7, the eye movement data for the experimental group (R-G) and the control groups (R-R, G-G) were normalized across the three paradigms and displayed as line graphs with different colors representing each group.
Based on the data analysis results, six eye movement indicators showed significant differences between binocular color fusion and rivalry states. Specifically, Average Saccade Amplitude, Median Saccade Amplitude, and SD Saccade Amplitude exhibited significant differences across all tested paradigms. For instance, in the comparison between the R-G and R-R groups, the Average Saccade Amplitude values were 50.6% vs. 13.3%, 28.7% vs. 42.0%, and 54.1% vs. 14.0% across the three paradigms, respectively. These results indicate that saccade amplitude and its variability are stable indicators for distinguishing between binocular color fusion and rivalry, effectively reflecting the dynamic regulation strategies of the visual system under conflicting states.
Additionally, in the Gaze Stability paradigm, Average Blink Duration was significantly shorter in the R-G group compared to the R-R group (56.5% vs. 9.3%) and the G-G group (56.5% vs. 34.2%), suggesting higher visual attention and more pronounced blink suppression under this condition. In the Straight Curve Eye Hopping paradigm, Sac Avg Velocity (24.3% vs. 43.0%) and Sac Peak Velocity (42.9% vs. 28.4%) also showed significant differences between the R-G and G-G groups, indicating that during dynamic tasks, rivalry states may optimize conflict processing by modulating saccade velocity.
These findings collectively suggest that under binocular color rivalry states (R-G), the visual system enhances fixation stability and suppresses redundant eye movements to manage color conflicts, thereby achieving higher efficiency in visual information processing.

3.3. Results of Z-Score Normalization Analysis

Indicator values were subjected to a Z-Score normalization analysis, as shown in Table 1. In the present experiment, all eye movement data between the R-R group and the G-G group did not show significant differences. Therefore, in the Z-score normalization analysis, we combined these two groups into one. This allowed us to obtain the normalized range values for binocular color fusion and rivalry states across the three paradigms. The indicator ranges are expressed as normalized Mean ± SD.
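One plausible way to derive such ranges is sketched below; the z-scoring and the mean ± SD intervals follow the text, while the rule for removing the overlapping portion is our assumption, since the paper does not spell it out.

```python
# Illustrative sketch (our reading, not the published procedure): z-score all samples of one
# indicator together, express each state as mean +/- SD, then clip any overlap.
import numpy as np

def state_ranges(rivalry: np.ndarray, fusion: np.ndarray):
    pooled = np.concatenate([rivalry, fusion])
    z = lambda x: (x - pooled.mean()) / pooled.std()
    zr, zf = z(rivalry), z(fusion)
    riv = [zr.mean() - zr.std(), zr.mean() + zr.std()]
    fus = [zf.mean() - zf.std(), zf.mean() + zf.std()]
    # If the intervals overlap, clip the lower interval's upper bound to the upper interval's
    # lower bound (one plausible reading of "removing overlapping ranges").
    lower, upper = (riv, fus) if riv[0] <= fus[0] else (fus, riv)
    if lower[1] > upper[0]:
        lower[1] = upper[0]
    return tuple(riv), tuple(fus)

rng = np.random.default_rng(2)
print(state_ranges(rng.normal(9.5, 5.0, 54), rng.normal(31.0, 10.0, 108)))
```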
In Table 1, a clear distinction can be made between binocular color fusion and rivalry based on the normalized ranges of eye movement indicators. For example, in the Gaze Stability paradigm, if the normalized value of Average Blink Duration falls within the range of −0.854 to −0.454, it indicates that the subject is in the binocular color rivalry state. Conversely, when the normalized value lies within the range of −0.454 to 1.088, it suggests the subject is in the binocular color fusion state.
Similarly, in the Straight Curve Eye Hopping paradigm, eye movement indicators also exhibit distinct patterns for distinguishing between the two states. For instance, the normalized range of Average Saccade Amplitude in the binocular color rivalry state is −0.905 to 1.700, while in the fusion state, it is −1.919 to −0.905. For Median Saccade Amplitude, the range in the rivalry state is −0.950 to 1.610, whereas in the fusion state, it is −1.980 to −0.950. The SD Saccade Amplitude indicator shows a range of −0.827 to 1.545 in the rivalry state and −1.909 to −0.827 in the fusion state. Additionally, Sac Avg Velocity exhibits a range of −1.404 to 0.376 in the rivalry state and 0.376 to 1.118 in the fusion state, while Sac Peak Velocity ranges from −0.676 to 2.308 in the rivalry state and from −0.836 to −0.684 in the fusion state.
In the Smoothed Eye Movement Tracking paradigm, similar distinguishing characteristics are observed. Specifically, the normalized range of Average Saccade Amplitude in the binocular color rivalry state is −1.392 to 0.608, compared to 0.608 to 2.234 in the fusion state. For Median Saccade Amplitude, the rivalry state range is −1.358 to −0.552, while the fusion state range is −0.348 to 1.362. The SD Saccade Amplitude indicator shows a range of −1.196 to −0.444 in the rivalry state and −0.444 to 1.498 in the fusion state.
In summary, the significant differences in the normalized ranges of eye movement indicators across different paradigms not only validate their effectiveness in distinguishing between binocular color fusion and rivalry but also provide quantifiable reference criteria for determining visual states based on eye movement features.

3.4. Results of Cross-Paradigm Merged Analysis

According to the cross-paradigm merged results of eye movement indicators shown in Table 2, Average Saccade Amplitude, Median Saccade Amplitude, and SD Saccade Amplitude exhibited significant differences across all three paradigms, demonstrating cross-paradigm stability. Specifically, when the Average Saccade Amplitude lies between 0.608 and 1.294, it can be inferred that the subject is in a state of binocular color fusion; conversely, when it falls between −0.905 and −0.693, it indicates a state of binocular color rivalry. Similarly, for Median Saccade Amplitude, a range of −0.232 to 1.286 corresponds to binocular color fusion, while a range of −0.905 to −0.693 suggests binocular color rivalry. Additionally, SD Saccade Amplitude indicates binocular color fusion when its value lies between −0.403 and 1.165, whereas a range of −0.827 to −0.444 reflects binocular color rivalry. These findings demonstrate that even in less strictly controlled experimental settings, eye movement indicators remain effective in distinguishing between states of binocular color fusion and rivalry.
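The merge step can be illustrated as a simple interval intersection; in the sketch below the per-paradigm intervals are placeholders rather than the published values.

```python
# Small sketch of the cross-paradigm merge: intersect one indicator's per-paradigm ranges to
# obtain a common band like those in Table 2. Input intervals are illustrative placeholders.
from functools import reduce

def intersect(a, b):
    if a is None or b is None:
        return None
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return (lo, hi) if lo < hi else None  # None means no common range across paradigms

fusion_ranges = [(0.608, 2.234), (0.500, 1.294), (0.300, 1.800)]  # one range per paradigm
print(reduce(intersect, fusion_ranges))  # (0.608, 1.294): values here indicate color fusion
```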

3.5. Results of ROC Curve Analysis

The previous normalized analysis established specific thresholds for significant differences between binocular color fusion and rivalry. To further evaluate the sensitivity and specificity of these indicators in distinguishing the two visual phenomena, we conducted a ROC curve analysis. ROC curves, widely used in binary classification problems, plot the true positive rate (TPR) on the vertical axis against the false positive rate (FPR) on the horizontal axis [36].
This analysis enables us to visually compare and determine the discriminative efficacy of binocular color fusion and rivalry across different thresholds, allowing us to identify optimal thresholds that maximize accuracy and reliability in classifying color perception patterns. Consequently, we first calculated the sensitivity and specificity from the ROC curve and then derived Youden’s J statistic using the following Equations (1)–(3) [37]:
J = Sensitivity + Specificity − 1   (1)
Sensitivity = TP / (TP + FN) × 100%   (2)
Specificity = TN / (TN + FP) × 100%   (3)
Youden’s J statistic provides a comprehensive assessment of an eye movement indicator’s ability to recognize and discriminate between specific eye movement behaviors or mental processes in binocular rivalry. Sensitivity measures the proportion of trial periods during which the binocular rivalry phenomenon occurred and was correctly identified by the eye movement indicator. Here, TP refers to instances where the model accurately detects binocular color rivalry (R-G), and TN to correct identifications of the absence of rivalry (binocular fusion, i.e., R-R and G-G). FP occurs when the model incorrectly identifies binocular color rivalry, and FN denotes instances where the presence of binocular rivalry is missed.
Specificity represents the proportion of trial periods without the phenomenon that were correctly classified by the eye movement indicator as free of significant rivalry dynamics.
After calculating Youden’s J statistic, we plotted the ROC curve to visualize the classifier’s ability to distinguish between binocular color rivalry and binocular color fusion effectively.
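A compact sketch of this computation is shown below; the labels, scores, and function names are ours, and only the ROC/Youden logic follows Equations (1)–(3).

```python
# Sketch of the ROC / Youden's J computation described above, assuming binary labels
# (1 = rivalry trial period, 0 = fusion) and a continuous classifier score.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

def youden_summary(y_true: np.ndarray, y_score: np.ndarray) -> dict:
    fpr, tpr, thresholds = roc_curve(y_true, y_score)
    j = tpr - fpr                        # J = Sensitivity + Specificity - 1 = TPR - FPR
    best = int(np.argmax(j))
    return {
        "auc": roc_auc_score(y_true, y_score),
        "threshold": thresholds[best],
        "sensitivity": tpr[best],        # TP / (TP + FN)
        "specificity": 1.0 - fpr[best],  # TN / (TN + FP)
    }

rng = np.random.default_rng(3)
labels = np.r_[np.ones(60), np.zeros(120)]
scores = np.r_[rng.normal(1.0, 0.7, 60), rng.normal(-0.5, 0.7, 120)]
print(youden_summary(labels, scores))
```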
To assess the independence of the indicators, we performed a Spearman correlation analysis on the three eye movement indicators that showed significant differences; the results showed that Average Saccade Amplitude, Median Saccade Amplitude, and SD Saccade Amplitude were significantly correlated with each other. For example, the correlation coefficient between Average Saccade Amplitude and Median Saccade Amplitude was greater than 0.9, as shown in Table 3.
In the Gaze Stability paradigm (Figure 8a), we selected four indicators with significant differences and excluded Average Blink Duration to avoid overfitting. The analysis revealed that Average Saccade Amplitude, Median Saccade Amplitude, and SD Saccade Amplitude effectively distinguished between binocular color fusion and rivalry, achieving an area under the ROC curve (AUC) of 0.990, with a sensitivity of 86.7% and specificity of 96.8%.
In the Straight Curve Eye Hopping paradigm (Figure 8b), we excluded SD Saccade Amplitude to simplify the model, resulting in an AUC of 0.741, a sensitivity of 86.2%, and a specificity of 61.3%. In the Smoothed Eye Movement Tracking paradigm (Figure 8c), which included all significant indicators, the AUC was 0.967, with a sensitivity of 70.6% and a specificity of 96.8%.
These results demonstrate the robust classification ability of the binary logistic regression model across the paradigms. The Gaze Stability paradigm models exhibited high accuracy in distinguishing between fusion and rivalry, while the Straight Curve Eye Hopping paradigm showed stable classification performance. The high AUC in the Smoothed Eye Movement Tracking paradigm further validates the effectiveness of these indicators in analyzing binocular color fusion and rivalry. Overall, the significant eye movement indicators in each paradigm are crucial for identifying binocular fusion and rivalry in various visual phenomena, underscoring the models’ reliable predictive and classification capabilities, which enhance our understanding of visual processing mechanisms.
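For completeness, the kind of per-paradigm classifier implied here can be sketched as follows; the synthetic feature values and sample sizes are placeholders, and only the model family (binary logistic regression) and the AUC metric follow the text.

```python
# Hedged sketch of a per-paradigm classifier: binary logistic regression on the three
# saccade-amplitude indicators, scored with ROC AUC. Data below are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n_riv, n_fus = 54, 108
X = np.vstack([
    rng.normal([9.5, 9.4, 6.5], 4.0, size=(n_riv, 3)),     # rivalry: avg / median / SD amplitude
    rng.normal([30.0, 29.0, 25.0], 8.0, size=(n_fus, 3)),  # fusion
])
y = np.r_[np.ones(n_riv), np.zeros(n_fus)]

model = LogisticRegression(max_iter=1000).fit(X, y)
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"in-sample AUC = {auc:.3f}")
```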

4. Discussion

In this study, we conducted a systematic comparison of eye movement indicators under states of binocular color fusion (R-R group, G-G group) and rivalry (R-G group) through three paradigms: the Gaze Stability paradigm, Straight Curve Eye Hopping paradigm, and Smoothed Eye Movement Tracking paradigm. The findings reveal several critical points.
1. Differences in Eye Movement Indicators
Under normalization analysis, Saccade Amplitude-related indicators (Average Saccade Amplitude, Median Saccade Amplitude, SD Saccade Amplitude) showed significant differences across all three paradigms, indicating their robust response to visual conflict across tasks. This phenomenon is directly related to the core loop of conflict monitoring. For instance, during color rivalry states, the prefrontal–parietal network (PFC-PPC) dynamically adjusts saccade amplitude by modulating the color-selective gain of neurons in area V4 [38]. In the Gaze Stability paradigm, the Average Saccade Amplitude of the rivalry group was significantly lower than that of the fusion groups (rivalry group approximately 50.6%, R-R group 13.3%, G-G group 34.2%), likely reflecting how the lateral intraparietal area (LIP) maintains Gaze Stability by suppressing redundant saccades [39]. Conversely, in dynamic tasks, such as the Straight Curve Eye Hopping paradigm, the rivalry group’s Average Saccade Amplitude surpassed that of the fusion group (rivalry group 28.7%, fusion group 42.0%), suggesting that task demands drive the visual system to increase Saccade Amplitude for rapid attention shifts [40].
The reduction in SD Saccade Amplitude further supports this mechanism. For example, in the Gaze Stability paradigm, the SD for the rivalry group dropped to 54.1% (vs. R-R group at 14.0%), indicating that variability in the saccade strategy was actively suppressed. This suppression may be associated with phase synchronization of V4 γ oscillations (30–80 Hz), as Piantoni et al. found that phase locking of γ oscillations coordinates saccade initiation and perceptual switching, thereby reducing randomness [41]. Similarly, Denison et al. proposed that feedforward inhibition mechanisms optimize saccade planning by predicting conflicting inputs, thus reducing ineffective exploration [42].
Additionally, Average Blink Duration only showed significant differences in the Gaze Stability paradigm. The rivalry group’s blink duration (56.5%) was significantly shorter than that of the fusion groups (R-R group 9.3%, G-G group 34.2%), possibly due to the overactivation of the dorsal attention network (DAN). fMRI studies have shown that increased BOLD signals in the DAN correlate with blink suppression [43]. Color rivalry requires continuous engagement of the DAN to suppress input from the non-dominant eye, leading to compressed blink duration. In dynamic tasks, the priority of saccade speed diminishes the influence of the DAN on blink control, resulting in the disappearance of these differences.
For saccade velocity indicators (Sac Avg Velocity, Sac Peak Velocity), significant differences were observed in dynamic tasks, demonstrating adaptive strategies for conflict processing. In the Straight Curve Eye Hopping paradigm, the rivalry group’s Sac Avg Velocity and Sac Peak Velocity were significantly higher than those of the fusion group, illustrating a “speed-accuracy trade-off” strategy [44]. Specifically, burst firing of neurons in the Superior Colliculus (SC) increases speed by shortening saccade latency but may sacrifice spatial accuracy (e.g., an increase in endpoint error of about 5.2% in the rivalry group) [45]. Grossberg et al. further proposed that saccade velocity adjustment correlates with the prediction error of conflict signals; when the alternation frequency of color inputs increases, the SC accelerates saccades to shorten the decision window, matching the demands of dynamic inputs [46].
2. Difference Between Fusion and Rivalry States Based on Ranges
Our data further indicate that there are clear differences in the normalized ranges of various eye movement indicators between states of binocular color fusion and rivalry, confirming the effectiveness of these indicators in distinguishing the two visual states. For example, in the Gaze Stability paradigm, if the normalized value of Average Blink Duration falls within −0.854 to −0.776, it can be preliminarily judged as rivalry, whereas if it falls within −0.455 to 1.406, it indicates fusion. The cross-paradigm merged analysis results show that Average Saccade Amplitude, Median Saccade Amplitude, and SD Saccade Amplitude effectively distinguish binocular color rivalry from fusion across all three paradigms. Specifically, when the Average Saccade Amplitude lies between 0.608 and 1.294, it suggests binocular color fusion; conversely, when it is between −0.905 and −0.693, it indicates rivalry. These results align with Blake and Logothetis’ dynamic threshold model, which successfully distinguishes visual rivalry from fusion using eye movement parameters based on statistical distribution differences [47]. Einhäuser et al. further validated the feasibility of multi-indicator classification methods, providing methodological support for constructing high-precision discrimination models in this study [48].
These indicators not only provide objective evidence for understanding binocular vision processing mechanisms but also hold broad practical application potential. By real-time monitoring of normalized eye movement indicator values, one can quickly differentiate between fusion and rivalry states, meeting the demand for immediate feedback in intelligent human–machine interaction systems. Additionally, specific eye movement indicator anomalies can aid in the early detection of visual conflict processing abnormalities, offering auxiliary information for early screening and clinical diagnosis of visual disorders. Furthermore, quantitative data provide solid experimental evidence for building and validating binocular vision neural computation models, advancing our understanding of visual information integration mechanisms.
3. ROC Analysis and Classification Efficacy
Further validation through ROC curve analysis demonstrated the effectiveness of the three eye movement paradigms in distinguishing between states of binocular color fusion and rivalry. The Gaze Stability paradigm exhibited nearly perfect classification efficacy (AUC = 0.990), with high specificity (96.8%) and substantial sensitivity (86.7%), significantly outperforming certain neuro-psychiatric disorder diagnostic studies based on MRI biomarkers, such as those reported by Kambeitz et al. with a sensitivity of 77% and a specificity of 78% [49]. This not only highlights the potential of eye movement indicators in visual state classification but also showcases their unique advantages in high-precision visual state assessment.
Although the Straight Curve Eye Hopping paradigm had a lower AUC (0.741), its screening potential at high specificity ranges (>80%, with a sensitivity of 62.1%) indicates that this paradigm can serve as an initial screening tool. In contrast, the Smoothed Eye Movement Tracking paradigm displayed high classification efficacy (AUC = 0.967) and specificity (96.8%), albeit with some variation in sensitivity (70.6%), possibly reflecting biological variability among subjects in feedforward prediction mechanisms [50]. Notably, the ROC curves for all three paradigms exhibited bimodal distributions, supporting the theoretical hypothesis of discrete stable states for color fusion and rivalry. The performance gradient across paradigms (Gaze Stability paradigm > Smoothed Eye Movement Tracking paradigm > Straight Curve Eye Hopping paradigm) further maps onto the neural hierarchy of visual processing; static maintenance (high-level cognitive regulation) surpasses dynamic tracking (mid-level motor integration) and path planning (low-level reflex pathways).
These findings not only confirm the classification validity of eye movement indicators but also establish a multi-paradigm joint analysis framework. Future research should incorporate time-resolved ROC analyses and include spatial features, such as saccade trajectory curvature, to enhance model interpretability.
Overall, this study, through quantitative evaluation and multi-paradigm experimental validation, further demonstrates the effectiveness of eye movement indicators in distinguishing between binocular color fusion and rivalry states. It provides robust theoretical and methodological support for the fine assessment of visual states and related applications, such as real-time monitoring, visual disorder screening, and the construction of neural computation models.

5. Conclusions

Color fusion and rivalry are two key mechanisms of information integration in binocular vision, characterizing how the visual system processes consistent and conflicting inputs, respectively. This study systematically revealed the quantitative differences in eye movement indicators under states of binocular color fusion and rivalry through multi-paradigm eye movement experiments, validating the sensitivity of indicators such as Saccade Amplitude and velocity to visual conflict.
These eye movement indicators not only effectively distinguish between states of binocular color fusion and rivalry but also show significant characteristic changes across different visual processing modes. Through Z-Score analysis, we found that eye movement indicators could effectively discriminate between binocular color fusion and rivalry states. ROC curve analysis further confirmed the effectiveness of the experimental paradigms employed—including the Gaze Stability paradigm, Straight Curve Eye Hopping paradigm, and Smoothed Eye Movement Tracking paradigm—in elucidating the mechanisms of binocular color fusion and rivalry. These results validate the sensitivity of eye movement indicators to visual conflict and highlight their potential as biomarkers for characterizing the dynamic processing of visual conflict, providing empirical support for understanding the neural computational models of binocular vision.

Author Contributions

Conceptualization, X.Z.; data curation, X.Z., M.D. and F.C.; formal analysis, X.Z. and M.D.; investigation, X.Z. and L.Y.; methodology, X.Z. and Z.C.; project administration, Z.C.; resources, F.C.; software, X.Z. and L.Y.; writing—review and editing, X.Z. and Z.C. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the National Natural Science Foundation of China (62165019, 62265017) and the Yunnan Youth and Middle-aged Academic and Technical Leaders Reserve Talent Program (202305AC160084). We thank the Color and Image Vision Laboratory of Yunnan Normal University for providing the experimental equipment and space.

Institutional Review Board Statement

Ethical review and approval were waived for this study because it met the exemption criteria outlined in the "Ethical Review Measures for Biomedical Research Involving Human Subjects" (research using anonymous data does not require formal ethical approval).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study. Written informed consent has been obtained from the participants to publish this paper.

Data Availability Statement

The original contributions presented in the study are included in the article, and stimuli and raw data are available at https://doi.org/10.6084/m9.figshare.25688625.v1 (dataset accessed on 25 April 2024).

Conflicts of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Appendix A

Table A1. Data of acuity optometric screening.
Total participants (non-myopes/myopes): 18 (11/7)
Non-myopes, naked visual acuity @4 m: OS 20/20; OD 20/20
Myopes, naked visual acuity @4 m: OS 20/40~20/25; OD 20/40~20/25
Myopes, corrected visual acuity @4 m: OS 20/20; OD 20/20
Myopes, corrected degree (D): OS −1.75~−6.00; OD −2.75~−6.00
Stereoacuity @41 cm (arcsec): 41~55
Table A2. Analysis of interocular differences (stability gaze, bold part indicates significant difference).
IndicatorsNameMeanSDtpCohen’s d
Acceleration XLeft−10.2351150.6130.0810.9350.004
Right−44.3461233.721
Acceleration YLeft9.048520.3880.3370.7360.015
Right−32.4683865.833
Gaze XLeft251.94878.30337.770.0001.689
Right60.968139.41
Gaze YLeft377.5235.743107.4020.0004.803
Right787.018120.432
Pupil SizeLeft3799.668839.0319.4250.0000.422
Right3447.723830.9
Velocity XLeft0.357.9970.1360.8920.006
Right0.17839.272
Velocity YLeft0.3633.821.2250.2210.055
Right−0.13112.183
Table A3. Analysis of interocular differences (Straight Curve Eye Hopping, bolded text indicates significant differences).
IndicatorsNameMeanSDtpCohen’s d
Acceleration XLeft−21.481268.941−0.5450.5860.023
Right37.9833387.998
Acceleration YLeft−4.9731272.7230.8950.3710.038
Right−112.163761.714
Gaze XLeft499.159142.151−0.5150.6060.022
Right502.14128.92
Gaze YLeft396.257127.343−4.7940.0000.204
Right421.58120.324
Pupil SizeLeft1597.771801.027.5650.0000.323
Right1378.965527.798
Velocity XLeft−0.1918.995−0.4480.6540.019
Right0.22624.279
Velocity YLeft1.3311.5530.5090.6110.022
Right0.98919.003
Table A4. Analysis of interocular differences (Smoothed Eye Movement Tracking, bold parts indicate significant differences).
IndicatorsEyeMeanSDtpCohen’s d
Acceleration XLeft14.163497.7230.5330.5940.028
Right−6.6921056.822
Acceleration YLeft−29.448864.872−0.1250.90.006
Right−19.0852365.992
Gaze XLeft408.60373.545−12.5660.0000.65
Right453.21559.399
Gaze YLeft487.71255.7−1.940.0530.1
Right493.01247.51
Pupil SizeLeft2725.338672.04722.0010.0001.138
Right1980.695622.701
Velocity XLeft0.3883.645−0.3390.7350.018
Right0.4554.094
Velocity YLeft0.3247.426−0.040.9680.002
Right0.34210.539
Table A5. Normal distribution test table for eye movement indicators in different paradigms (parentheses represent p-values; bolded portions represent significant differences).
IndicatorsGaze Stability (n = 54)Smoothed Eye Movement Tracking (n = 108)Straight Curve Eye Hopping (n = 54)
SkewnessKurtosisS-W TestSkewnessKurtosisS-W TestSkewnessKurtosisS-W Test
Average Blink Duration0.92−0.610.78
(0.000)
1.882.250.59
(0.000)
2.676.210.45
(0.000)
Average Fixation Duration0.54−0.310.95
(0.036)
2.168.210.82
(0.000)
0.390.190.98
(0.439)
Average Saccade Amplitude−0.02−1.190.96
(0.047)
2.9611.220.69
(0.000)
0.191.040.96
(0.083)
Blink Count0.920.660.77
(0.000)
2.376.760.59
(0.000)
2.807.460.45
(0.000)
Fixation Count0.850.360.85
(0.000)
1.140.970.82
(0.000)
0.530.270.88
(0.000)
Median Fixation Duration0.38−0.520.97
(0.181)
2.037.140.83
(0.000)
0.280.450.99
(0.765)
Median Saccade Amplitude0.001−1.140.96
(0.079)
3.1011.980.67
(0.000)
0.141.120.96
(0.054)
SD Saccade Amplitude0.39−1.090.93
(0.004)
2.175.590.77
(0.000)
0.331.310.92
(0.001)
SD Fixation Duration1.191.600.90
(0.000)
1.934.380.80
(0.000)
0.180.260.95
(0.035)
Saccade Count0.590.380.87
(0.000)
0.57−0.030.88
(0.000)
0.040.670.85
(0.000)
Pupil Size Duration−0.600.650.97
(0.152)
0.521.430.95
(0.001)
1.002.790.94
(0.009)
Sac Avg Velocity1.111.620.91
(0.001)
0.01−1.210.95
(0.001)
1.152.970.93
(0.003)
Sac Peak Velocity1.471.660.82
(0.000)
2.284.070.52
(0.000)
2.959.800.64
(0.000)
Table A6. Results of the independent sample t-test for the experimental group and the control group (stability gaze, df = 34).
IndicatorsExp./Ctr.MeanSDpCohen’s d
Median Fixation Duration (ms)R-G/R-R244.9/254.1105.7/135.10.8230.1
R-G/G-G244.9/220.7105.7/101.20.5470.2
R-R/G-G254.1/220.7135.1/101.20.4080.3
Median Saccade Amplitude (°)R-G/R-R9.4/23.15.1/7.10.0000.5
R-G/G-G9.4/36.45.1/6.90.0000.5
R-R/G-G23.1/36.47.1/6.90.0000.1
Pupil Size Duration (ms)R-G/R-R4773.0/4479.91418.3/767.20.4760.3
R-G/G-G4773.0/4439.91418.3/1180.10.4120.3
R-R/G-G4479.9/4439.9767.2/1180.10.9210.0
Table A7. Results of the independent sample Mann–Whitney U test for binocular color fusion and rivalry (stability gaze, bold indicates significant difference).
IndicatorsExp./Ctr.MeanSDpCohen’s d
Average Blink Duration (ms)R-G/R-R1.0/9.51.6/6.30.0021.451
R-G/G-G1.0/19.51.6/11.50.0091.427
R-R/G-G9.5/19.56.3/11.50.1450.546
Average Fixation Duration (ms)R-G/R-R199.4/260.5120.5/118.80.7520.155
R-G/G-G199.4/243.0120.5/114.70.6690.071
R-R/G-G260.5/243.0118.8/114.70.8740.087
Average Saccade Amplitude (°)R-G/R-R9.5/26.55.3/6.80.0002.681
R-G/G-G9.5/38.55.3/10.40.0003.230
R-R/G-G26.5/38.56.8/10.40.0001.167
Blink Count (times)R-G/R-R0.0/1.00.6/0.90.0350.739
R-G/G-G0.0/1.00.6/0.60.1280.462
R-R/G-G1.0/1.00.9/0.60.3750.372
Fixation Count (times)R-G/R-R2.0/2.01.2/0.90.4950.101
R-G/G-G2.0/3.01.2/1.10.2610.281
R-R/G-G2.0/3.00.9/1.10.4770.216
SD Saccade Amplitude (°)R-G/R-R6.5/19.54.1/7.80.0001.662
R-G/G-G6.5/30.54.1/13.10.0011.760
R-R/G-G19.5/30.57.8/13.10.0380.629
SD Fixation Duration (ms)R-G/R-R79.2/89.9100.7/88.70.1830.331
R-G/G-G79.2/90.7100.7/71.30.5160.044
R-R/G-G89.9/90.788.7/71.30.4290.343
Saccade Count (times)R-G/R-R2.0/2.01.2/0.80.4020.159
R-G/G-G2.0/2.01.2/0.90.2270.306
R-R/G-G2.0/2.00.8/0.90.5640.190
Sac Avg Velocity (°/s)R-G/R-R59.3/79.428.8/46.70.4020.435
R-G/G-G59.3/78.228.8/24.00.6690.088
R-R/G-G79.4/78.246.7/24.00.5910.391
Sac Peak Velocity (°/s)R-G/R-R139.4/126.8177.3/183.10.8620.180
R-G/G-G139.4/187.9177.3/213.10.1500.418
R-R/G-G126.8/187.9183.1/213.10.3930.249
Table A8. Results of the independent sample t-test for the experimental group and the control group (n = 36, Straight Curve Eye Hopping, bold text indicates a significant difference).
Independent Samples t-Test Result (df = 34)
IndicatorsExp./Ctr.AverageStandardptCohen’s d
Average Fixation Duration (ms)R-G/R-R260.6/283.183.8/94.20.461−0.70.3
R-G/G-G260.6/265.783.8/86.20.870−0.20.1
R-R/G-G283.1/265.794.2/86.20.581−9.63.4
Average Saccade Amplitude (°)R-G/R-R9.5/21.64.9/6.00.000−6.42.2
R-G/G-G9.5/36.74.9/7.80.000−9.43.3
R-R/G-G21.6/36.76.0/7.80.000−0.10.0
Median Fixation Duration (ms)R-G/R-R251.9/280.998.6/100.40.390−0.80.3
R-G/G-G251.9/263.598.6/91.90.738−0.30.1
R-R/G-G280.9/263.5100.4/91.90.6170.50.2
Median Saccade Amplitude (°)R-G/R-R9.8/21.55.3/8.00.000−6.22.1
R-G/G-G9.8/37.75.3/7.80.000−9.43.3
R-R/G-G21.5/37.78.0/7.80.000−0.20.1
Table A9. Results of the independent sample Mann–Whitney U test for the experimental and control groups (n = 36, Straight Curve Eye Hopping).
Independent Sample Mann–Whitney U Test Result
IndicatorsExp./Ctr.MedianStandard DeviationpCohen’s d
Average Blink Duration (ms)R-G/R-R1.0/1.00.5/1.60.4280.528
R-G/G-G1.0/1.00.5/3.30.2840.644
R-R/G-G1.0/1.01.6/3.30.4920.391
Blink Count (times)R-G/R-R0.0/0.00.8/0.40.1720.752
R-G/G-G0.0/0.00.8/0.90.4350.133
R-R/G-G0.0/0.00.4/0.90.5820.337
Fixation Count (times)R-G/R-R2.5/2.01.2/0.70.3490.403
R-G/G-G2.5/2.01.2/0.80.3310.391
R-R/G-G2.0/2.00.7/0.80.8750.000
SD Saccade Amplitude (°)R-G/R-R8.5/20.54.5/8.30.0011.493
R-G/G-G8.5/33.54.5/12.40.0002.203
R-R/G-G20.5/33.58.3/12.40.0020.996
SD Fixation Duration (ms)R-G/R-R102.5/129.961.4/75.60.1680.489
R-G/G-G102.5/151.361.4/76.00.0710.588
R-R/G-G129.9/151.375.6/76.00.8740.092
Saccade Count (times)R-G/R-R2.0/2.01.0/0.70.6830.127
R-G/G-G2.0/2.01.0/0.50.3450.273
R-R/G-G2.0/2.00.7/0.50.5480.177
Pupil Size Duration (ms)R-G/R-R3889.5/3843.51050.6/1334.50.5910.22
R-G/G-G3889.5/3929.01050.6/1574.20.5690.335
R-R/G-G3843.5/3929.01334.5/1574.20.8490.126
Sac Avg Velocity (°/s)R-G/R-R53.4/53.612.7/15.60.5690.254
R-G/G-G53.4/53.612.7/24.10.9750.144
R-R/G-G53.6/53.615.6/24.10.7040.041
Sac Peak Velocity (°/s)R-G/R-R71.8/82.031.0/75.30.4110.36
R-G/G-G71.8/68.631.0/69.00.8740.263

Appendix B. Eye Movement Instrument Calibration Process

The eye tracker was calibrated prior to the experiment; only a brief summary appears in Section 2.4 to keep the main text concise, so the full procedure is reported here. Before the experiment officially began, subjects were required to put on their active stereo glasses, adjust their sitting position, and perform a nine-point calibration of the eye tracker so that it could reliably capture their eye movements. The calibration procedure was as follows:
(1) Subjects completed an informed consent form describing the general procedure.
(2) Turn off or mute all distracting electronic devices.
(3) Turn on the power supply, power on the host PC and display PC, and wait for the devices to display the interface normally.
(4) Adjust the subject’s seat height so that they can comfortably rest their chin on the chin rest and lean their forehead against the forehead rest. Position the captured eye image roughly in the center of the display screen, or about one quarter of the way down from the top of the screen.
(5) On the host PC, click the red circle on the subject’s eye to lock the tracking range, press A to automatically adjust the threshold, and adjust the lens focus so that the eye is clearly visible.
(6) Have the subject view the four corners of the monitor, making sure that the pupillary and corneal reflections are tracked over the entire surface of the monitor. If the pupillary or corneal reflections are lost at the edges of the monitor, make sure the screen is far enough away from the subject. For subjects wearing eyeglasses, the frame sometimes obscured part of the video image of the eye when viewing extreme horizontal or vertical angles; this was corrected by slightly adjusting the position of the subject’s chin.
(7) Calibration is the process used to set up the eye tracking software to accurately track eye movements; in this experiment, a nine-point calibration was used to optimize calibration accuracy.
(8) Confirm that the subject’s eye signal is intact and click the Calibrate button on the host PC interface to begin calibration. During calibration, instruct the subject to look at each point until it disappears and then look at the next point.
(9) Validate the calibration. Click the Validate button to confirm the calibration and decide whether it is acceptable based on the feedback.
(10) Start the experiment after an acceptable calibration. Click the Output/Record button to begin presenting the stimulus and recording data. Instruct the subject not to talk while the stimulus is displayed, because talking causes the head to move on the chin rest, which decreases eye-tracking accuracy.
(11) If the subject needs to rest at any time or the quality of the eye movement data decreases, the calibration is checked and recalibrated as needed.
(12) Upon completion of the experiment, check that eye movement data are present in the data folder.
(13) Ask the subject whether any discomfort exists; if not, the subject may leave the laboratory.
Figure A1. Calibration process of eye locking.
Figure A2. Nine-point calibration diagram.

References

  1. Bhardwaj, R.; O’Shea, R.P. Temporal analysis of binocular rivalry. Vis. Res. 2012, 52, 43–47. [Google Scholar] [CrossRef]
  2. Ludwig, I.; Davies, J.R.; Castet, E. Contextual effects in binocular rivalry revealed by monocular occlusions. Vis. Res. 2007, 47, 2855–2865. [Google Scholar] [CrossRef]
  3. Xiong, Y.; Zhang, J.; Tong, F. Temporal dynamics of binocular rivalry: New perspectives on neural mechanisms. Neurosci. Bull. 2021, 37, 707–720. [Google Scholar] [CrossRef]
  4. Breese, B.B. On inhibition. Psychol. Rev. 1899, 6, 229–249. [Google Scholar] [CrossRef]
  5. Gellhorn, E.; Schöppe, C. Quantitative Untersuchungen über den Wettstreit der Sehfelder: IV. Mitteilung Über den Einfluß von farbigen Nebenreizen auf den Wettstreit. Pflüger’s Arch. Für Die Gesamte Physiol. Des Menschen Und Der Tiere 1925, 208, 393–407. [Google Scholar] [CrossRef]
  6. Levelt, W.J.M. On Binocular Rivalry; Mouton & Co.: Uvernet-Fours, France, 1965. [Google Scholar]
  7. Blake, R.; Fox, R. Adaptation to invisible gratings and the site of binocular rivalry suppression. Nature 1974, 249, 488–490. [Google Scholar] [CrossRef]
  8. Wandell, B.A. Foundations of Vision; Sinauer Associates: Sunderland, MA, USA, 1995. [Google Scholar]
  9. Li, J.; Thompson, B.; Lam, C.S.Y.; Deng, D.; Chan, L.Y.L.; Maehara, G.; Woo, G.C.; Yu, M.; Hess, R.F. The role of suppression in amblyopia. Investig. Ophthalmol. Vis. Sci. 2013, 54, 2659–2671. [Google Scholar] [CrossRef]
  10. Brascamp, J.W.; Klink, P.C.; Levelt, W.J.M. The ‘laws’ of binocular rivalry: 50 years of Levelt’s propositions. Vis. Res. 2015, 109, 20–37. [Google Scholar] [CrossRef]
  11. Riesen, C.; Norcia, A.M.; Banks, M.S. Perceptual switching in binocular rivalry: Neural and behavioral perspectives. J. Vis. 2019, 19, 14. [Google Scholar] [CrossRef]
  12. Chen, Z.; Shi, J.; Tai, Y.; Huang, X.; Yun, L.; Zhang, C. A quantitative measurement of binocular color fusion limit for different disparities. In Proceedings of the 2017 International Conference on Optical Instruments and Technology: Optical Systems and Modern Optoelectronic Instruments, Beijing, China, 28–30 October 2017. [Google Scholar] [CrossRef]
  13. Lv, Z.; Liu, X.; Dai, M.; Jin, X.; Huang, X.; Chen, Z. Investigating critical brain area for EEG-based binocular color fusion and rivalry with EEGNet. Front. Neurosci. 2024, 18, 1361486. [Google Scholar] [CrossRef]
  14. Asano, Y.; Wang, M. An investigation of color difference for binocular rivalry and a preliminary rivalry metric, ΔE*bino. Color Res. Appl. 2024, 49, 51–64. [Google Scholar] [CrossRef]
  15. Zhang, P.; Jamison, K.; Engel, S.; He, B.; He, S. Binocular rivalry requires visual attention. Neuron 2011, 71, 362–369. [Google Scholar] [CrossRef]
  16. Botvinick, M.M.; Braver, T.S.; Barch, D.M.; Carter, C.S.; Cohen, J.D. Conflict monitoring and cognitive control. Psychol. Rev. 2001, 108, 624–652. [Google Scholar] [CrossRef]
  17. Carter, C.S.; van Veen, V. Anterior cingulate cortex and conflict detection: An update of theory and data. Cogn. Affect. Behav. Neurosci. 2007, 7, 367–379. [Google Scholar] [CrossRef]
  18. Martinez-Conde, S.; Macknik, S.L.; Hubel, D.H. The role of fixational eye movements in visual perception. Nat. Rev. Neurosci. 2004, 5, 229–240. [Google Scholar] [CrossRef]
  19. Van Dam, L.C.J.; Van Ee, R. The role of (micro)saccades and blinks in perceptual bi-stability from slant rivalry. Vis. Res. 2006, 46, 241–256. [Google Scholar] [CrossRef]
  20. Raveendran, R.N.; Bobier, W.R.; Thompson, B. Binocular vision and fixational eye movements. J. Vis. 2019, 19, 9. [Google Scholar] [CrossRef]
  21. Kalisvaart, J.P.; Goossens, J. Influence of retinal and extra-retinal motion signals on binocular rivalry alternations. J. Vis. 2013, 13, 12. [Google Scholar] [CrossRef]
  22. Skerswetat, J.; Formankiewicz, M.A.; Waugh, S.J. Relationship between microsaccades, fixational eye movements, and visual acuity. Vis. Res. 2017, 141, 109–118. [Google Scholar] [CrossRef]
  23. Otero-Millan, J.; Macknik, S.L.; Martinez-Conde, S. Microsaccades and blinks trigger illusory rotation in the "rotating snakes" illusion. J. Neurosci. 2014, 34, 6046–6051. [Google Scholar] [CrossRef]
  24. Hirota, M.; Matsuoka, Y.; Koshiro, A. Binocular fusion capacity as an indicator of visual fatigue. Optom. Vis. Sci. 2018, 95, 807–814. [Google Scholar] [CrossRef]
  25. Grossberg, S.; Srinivasan, K.; Yazdanbakhsh, A. On the neural dynamics of saccade generation: The superior colliculus signal as an emergent spatial decision. Neural Netw. 2015, 71, 20–33. [Google Scholar] [CrossRef]
  26. Rucci, M.; Ahissar, E.; Burr, D. Temporal coding of visual space. Trends Cogn. Sci. 2018, 22, 883–895. [Google Scholar] [CrossRef]
  27. Schluppeck, D.; Meeson, A.; Wade, A.R. Asymmetric L/M cone contributions to human visual cortex revealed by fMRI adaptation. J. Vis. 2019, 19, 9. [Google Scholar] [CrossRef]
  28. World Medical Association. World Medical Association Declaration of Helsinki: 5th Amendment of the Declaration of Helsinki. Ethical principles for medical research involving human subjects. Eur. J. Emerg. Med. 2001, 8, 221–223. [Google Scholar] [CrossRef]
  29. Han, S.; Kim, S.; Jung, J. The effect of visual rivalry in peripheral head-mounted displays on mobility. Sci. Rep. 2023, 13, 20199. [Google Scholar] [CrossRef]
  30. Xiong, Q.; Liu, H.; Chen, Z.; Tai, Y.; Shi, J.; Liu, W. Detection of binocular chromatic fusion limit for opposite colors. Opt. Express 2021, 29, 35022. [Google Scholar] [CrossRef]
  31. Benson, P.J.; Beedie, S.A.; Shephard, E.; Giegling, I.; Rujescu, D.; St Clair, D. Simple viewing tests can detect eye movement abnormalities that distinguish schizophrenia cases from controls with exceptional accuracy. Biol. Psychiatry 2012, 72, 716–724. [Google Scholar] [CrossRef]
  32. Morita, K.; Miura, K.; Fujimoto, M.; Yamamori, H.; Yasuda, Y.; Iwase, M.; Kasai, K.; Hashimoto, R. Eye movement as a biomarker of schizophrenia: Using an integrated eye movement score. Psychiatry Clin. Neurosci. 2016, 71, 104–114. [Google Scholar] [CrossRef]
  33. Brakemeier, S.; Sprenger, A.; Meyhöfer, I.; McDowell, J.E.; Rubin, L.H.; Hill, S.K.; Keshavan, M.S.; Pearlson, G.D.; Tamminga, C.A.; Gershon, E.S.; et al. Smooth pursuit eye movement deficits as a biomarker for psychotic features in bipolar disorder—Findings from the PARDIP study. Bipolar Disord. 2019, 22, 602–611. [Google Scholar] [CrossRef]
  34. Takahashi, J.; Hirano, Y.; Miura, K.; Morita, K.; Fujimoto, M.; Yamamori, H.; Yasuda, Y.; Kudo, N.; Shishido, E.; Okazaki, K.; et al. Eye movement abnormalities in major depressive disorder. Front. Psychiatry 2021, 12, 673443. [Google Scholar] [CrossRef]
  35. Robinson, D.A. The use of control systems analysis in the neurophysiology of eye movements. Annu. Rev. Neurosci. 1981, 4, 463–503. [Google Scholar] [CrossRef]
  36. Kumar, R.; Indrayan, A. Receiver operating characteristic (ROC) curve for medical researchers. Indian Pediatr. 2011, 48, 277–287. [Google Scholar] [CrossRef]
  37. Youden, W.J. Index for rating diagnostic tests. Cancer 1950, 3, 32–35. [Google Scholar] [CrossRef]
  38. Baseler, H.A.; Morland, A.B.; Wandell, B.A. Topographic organization of human visual areas in the absence of input from primary cortex. J. Neurosci. 1999, 19, 2619–2627. [Google Scholar]
  39. Krauzlis, R.J.; Lovejoy, L.P.; Zénon, A. Superior colliculus and visual spatial attention. Annu. Rev. Neurosci. 2017, 40, 165–182. [Google Scholar] [CrossRef]
  40. Hafed, Z.M.; Krauzlis, R.J. Similarity of superior colliculus involvement in microsaccade and saccade generation. J. Neurophysiol. 2012, 107, 1904–1916. [Google Scholar] [CrossRef]
  41. Piantoni, G.; Romeijn, N.; Gomez-Herrero, G.; Van Der Meij, J. Gamma oscillations in the human visual cortex are synchronized with saccadic eye movements. Curr. Biol. 2017, 27, 1–7. [Google Scholar] [CrossRef]
  42. Denison, R.N.; Piazza, E.A.; Silver, M.A. Predictive context influences perceptual decisions through biasing of pre-stimulus neural activity. J. Vis. 2019, 19, 1–18. [Google Scholar] [CrossRef]
  43. Nakano, T.; Kato, M.; Morito, Y.; Itoi, S.; Kitazawa, S. Blink-related momentary activation of the default mode network while viewing videos. Proc. Natl. Acad. Sci. USA 2013, 110, 702–706. [Google Scholar] [CrossRef]
  44. Bieg, H.-J.; Chuang, L.L.; Bülthoff, H.H. Looking at the rope while walking on it: Gaze behavior during tightrope walking. J. Vis. 2015, 15, 1–13. [Google Scholar] [CrossRef]
  45. Hafed, Z.M.; Goffart, L.; Krauzlis, R.J. A neural mechanism for microsaccade generation in the primate superior colliculus. Science 2009, 323, 940–943. [Google Scholar] [CrossRef] [PubMed]
  46. Grossberg, S.; Srinivasan, K.; Yazdanbakhsh, A. Predictive remapping of attention during eye movements. Vis. Res. 2015, 110, 144–156. [Google Scholar] [CrossRef]
  47. Blake, R.; Logothetis, N.K. Visual competition. Nat. Rev. Neurosci. 2002, 3, 13–21. [Google Scholar] [PubMed]
  48. Einhäuser, W.; Stout, J.; Koch, C.; Carter, O. Pupil dilation reflects perceptual selection and predicts subsequent stability in perceptual rivalry. Proc. Natl. Acad. Sci. USA 2008, 105, 1704–1709. [Google Scholar]
  49. Kambeitz, J.; Cabral, C.; Sacchet, M.D.; Gotlib, I.H.; Zahn, R.; Serpa, M.H.; Walter, M.; Falkai, P.; Koutsouleris, N. Detecting neuroimaging biomarkers for depression: A meta-analysis of multivariate pattern recognition studies. Biol. Psychiatry 2016, 82, 330–338. [Google Scholar] [CrossRef]
  50. Niemeier, M.; Crawford, J.D.; Tweed, D.B. Optimal transsaccadic integration explains distorted spatial perception. Nature 2003, 422, 76–80. [Google Scholar]
Figure 1. Experimental environment and equipment diagram.
Figure 2. Stimuli paradigm of the experimental and control groups.
Figure 3. Paradigm of Gaze Stability.
Figure 4. Paradigm of Straight Curve Eye Hopping: (a) straight-line path; (b) curved path.
Figure 5. Paradigm of Smoothed Eye Movement Tracking: (a) “S”-shaped path; (b) reverse “S”-shaped path.
Figure 6. Schematic diagram of one experimental cycle.
Figure 7. Normalized eye movement data for the R-G, R-R, and G-G groups. (a) Graph 1 displays the percentage values of eye movement indicators; (b) Graph 2 displays the percentage values of eye movement indicators; (c) Graph 3 displays the percentage values of eye movement indicators. The * in the figures indicates significant differences between R-G, R-R, and G-G for the corresponding indicators, with the color of the * matching the respective paradigm.
Figure 8. ROC curves for Different Eye Movement Paradigms. (a) ROC curve of the Gaze Stability paradigm. (b) ROC curve of the Straight Curve Eye Hopping paradigm. (c) ROC curve of the Smoothed Eye Movement Tracking paradigm. The actual ROC curve illustrates the relationship between sensitivity and specificity at various thresholds. The closer this curve approaches the top-left corner, the better the model’s performance. The baseline is the diagonal line representing the average value, with an AUC of 0.5.
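The ROC analysis summarized in Figure 8 can be reproduced in outline as follows. This is a minimal sketch (Python, scikit-learn) using illustrative labels and scores rather than the study's data: labels mark fusion trials as 1 and rivalry trials as 0, and the score is a single Z-scored eye movement indicator; the use of Youden's index for the cut-off follows reference [37].

import numpy as np
from sklearn.metrics import roc_curve, auc

labels = np.array([1, 1, 1, 1, 0, 0, 0, 0])                      # 1 = fusion, 0 = rivalry (illustrative)
scores = np.array([0.9, 1.2, 0.7, 1.1, -0.8, -0.7, -0.9, -0.6])  # e.g., Z-scored Average Saccade Amplitude

fpr, tpr, thresholds = roc_curve(labels, scores)   # sensitivity (tpr) vs. 1 - specificity (fpr)
print("AUC =", auc(fpr, tpr))

# Youden's index (sensitivity + specificity - 1) selects the optimal cut-off threshold.
youden = tpr - fpr
print("Optimal threshold =", thresholds[np.argmax(youden)])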
Table 1. Z-score-normalized analysis of eye movement indicators with significant differences. Mean ± SD gives the Z-score-normalized mean and standard deviation of each indicator; Range gives the corresponding interval (Mean − SD to Mean + SD).
Paradigm | Indicators | Fusion (n = 36): Mean ± SD | Fusion: Range | Rivalry (n = 18): Mean ± SD | Rivalry: Range
Gaze Stability | Average Blink Duration | 0.317 ± 0.771 | −0.454~1.088 | −0.654 ± 0.200 | −0.854~−0.454
Gaze Stability | Average Saccade Amplitude | 0.525 ± 0.769 | −0.244~1.294 | −1.082 ± 0.389 | −1.471~−0.693
Gaze Stability | Median Saccade Amplitude | 0.527 ± 0.759 | −0.232~1.286 | −1.088 ± 0.405 | −1.493~−0.683
Gaze Stability | SD Saccade Amplitude | 0.381 ± 0.784 | −0.403~1.165 | −0.785 ± 0.382 | −1.167~−0.403
Straight Curve Eye Hopping | Average Saccade Amplitude | −1.412 ± 0.507 | −1.919~−0.905 | 0.397 ± 1.302 | −0.905~1.700
Straight Curve Eye Hopping | Median Saccade Amplitude | −1.455 ± 0.525 | −1.980~−0.950 | 0.330 ± 1.280 | −0.950~1.610
Straight Curve Eye Hopping | SD Saccade Amplitude | −1.368 ± 0.541 | −1.909~−0.827 | 0.359 ± 1.186 | −0.827~1.545
Straight Curve Eye Hopping | Sac Avg Velocity | 0.747 ± 0.371 | 0.376~1.118 | −0.509 ± 0.895 | −1.404~0.376
Straight Curve Eye Hopping | Sac Peak Velocity | −0.760 ± 0.076 | −0.836~−0.684 | 0.816 ± 1.492 | −0.676~2.308
Smoothed Eye Movement Tracking | Average Saccade Amplitude | 1.421 ± 0.813 | 0.608~2.234 | −1.000 ± 0.392 | −1.392~−0.608
Smoothed Eye Movement Tracking | Median Saccade Amplitude | 0.507 ± 0.855 | −0.348~1.362 | −0.955 ± 0.403 | −1.358~−0.552
Smoothed Eye Movement Tracking | SD Saccade Amplitude | 0.527 ± 0.971 | −0.444~1.498 | −0.820 ± 0.376 | −1.196~−0.444
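The normalization underlying Table 1 (and Table 2) is a standard Z-score per indicator. A minimal sketch (Python) is shown below; the raw amplitude values are illustrative, not taken from the study.

import numpy as np

amplitudes = np.array([2.1, 1.8, 2.5, 0.6, 0.5, 0.7])   # raw Average Saccade Amplitude values (hypothetical)
z = (amplitudes - amplitudes.mean()) / amplitudes.std(ddof=1)  # Z-score: subtract mean, divide by sample SD
print(z.round(3))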
Table 2. Cross-paradigm merged analysis of eye movement indicators with significant differences. Mean ± SD gives the merged mean and standard deviation of each Z-score-normalized indicator; Range gives the corresponding interval (Mean − SD to Mean + SD).
Indicators | Fusion (n = 36): Mean ± SD | Fusion: Range | Rivalry (n = 18): Mean ± SD | Rivalry: Range
Average Saccade Amplitude | 0.951 ± 0.343 | 0.608~1.294 | −0.799 ± 0.106 | −0.905~−0.693
Median Saccade Amplitude | 0.527 ± 0.759 | −0.232~1.286 | −0.817 ± 0.134 | −0.950~−0.683
SD Saccade Amplitude | 0.381 ± 0.784 | −0.403~1.165 | −0.636 ± 0.192 | −0.827~−0.444
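The merged ranges in Table 2 suggest a simple threshold rule for separating the two states from a Z-scored Average Saccade Amplitude. The sketch below (Python) encodes those ranges exactly as reported; treating values outside both intervals as indeterminate is an assumption, since the article does not define that case.

def classify_state(z_avg_saccade_amplitude: float) -> str:
    """Classify a Z-scored Average Saccade Amplitude using the Table 2 ranges."""
    if -0.905 <= z_avg_saccade_amplitude <= -0.693:
        return "binocular color rivalry"
    if 0.608 <= z_avg_saccade_amplitude <= 1.294:
        return "binocular color fusion"
    return "indeterminate"

print(classify_state(-0.8))   # -> binocular color rivalry
print(classify_state(1.0))    # -> binocular color fusion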
Table 3. Correlation analysis of eye movement indicators in the Gaze Stability paradigm group.
Indicators | Average Saccade Amplitude | Median Saccade Amplitude | SD Saccade Amplitude
Average Saccade Amplitude | 1 (0.000) | 0.95 (0.000) | 0.54 (0.000)
Median Saccade Amplitude | 0.95 (0.000) | 1 (0.000) | 0.52 (0.000)
SD Saccade Amplitude | 0.54 (0.000) | 0.52 (0.000) | 1 (0.000)
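Table 3 reports pairwise correlations between saccade-amplitude indicators, with the parenthesized values appearing to be p-values. A minimal sketch (Python, SciPy) of such a pairwise analysis on illustrative data, assuming Pearson correlation:

import numpy as np
from scipy.stats import pearsonr

avg_amplitude = np.array([2.1, 1.8, 2.5, 0.6, 0.5, 0.7])      # illustrative values
median_amplitude = np.array([2.0, 1.7, 2.4, 0.6, 0.4, 0.8])   # illustrative values

r, p = pearsonr(avg_amplitude, median_amplitude)  # correlation coefficient and p-value
print(f"r = {r:.2f}, p = {p:.3f}")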