Journal of Eye Movement Research is published by MDPI from Volume 18 Issue 1 (2025). Previous articles were published by another publisher in Open Access under a CC-BY (or CC-BY-NC-ND) licence, and they are hosted by MDPI on mdpi.com as a courtesy and upon agreement with Bern Open Publishing (BOP).

J. Eye Mov. Res., Volume 11, Issue 2 (May 2018) – 13 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • Papers are published in both HTML and PDF formats; the PDF is the official version. To view a paper in PDF, click the "PDF Full-text" link and open it with the free Adobe Reader.
24 pages, 3644 KiB  
Article
Eye-Hand Synchronization in Xylophone Performance: Two Case-Studies with African and Western Percussionists
by Fabrice Marandola
J. Eye Mov. Res. 2018, 11(2), 1-24; https://doi.org/10.16910/jemr.11.2.7 - 31 Mar 2019
Cited by 5 | Viewed by 90
Abstract
This article is the result of a first foray into xylophone performance with percussionists from Canada and Cameroon. It proposes to use the combination of Eye-Stroke Span (ESS), Fixation-Duration and Note-Pattern indexes to analyze free-score and performance-oriented musical tasks, instead of eye-hand span or awareness span for sight-reading and score-based eye-tracking research in music. Based on measurements made with a head-mounted eye-tracker system, the research examines gaze movements related to eye-hand synchronization in xylophone performance with musicians coming from three different ethnic groups from Cameroon (Bedzan Pygmies, Tikar and Eton) and classically trained Western percussionists (Canada). Increases in tempo are found to involve a diminution of the number of fixations, but not proportionally, as well as changes in lateral gaze shifts. Fixation-Duration and Note-Pattern are closely related but not identical, while ESS is relatively more independent. These gaze patterns are consistent within individuals, but not across individuals. Cameroonian musicians tend to look away from their instrument, interacting with their peers or with the audience. When they look at their keyboard, preliminary measures of ESS were found to be similar to the ESS of Western performers. Full article

4 pages, 117 KiB  
Article
The Application of Eye-Tracking in Music Research
by Lauren K. Fink, Elke B. Lange and Rudolf Groner
J. Eye Mov. Res. 2018, 11(2), 1-4; https://doi.org/10.16910/jemr.11.2.1 - 15 Feb 2019
Cited by 19 | Viewed by 223
Abstract
Though eye-tracking is typically a methodology applied in the visual research domain, recent studies suggest its relevance in the context of music research. There exists a community of researchers interested in this kind of research from varied disciplinary backgrounds scattered across the globe. Therefore, in August 2017, an international conference was held at the Max Planck Institute for Empirical Aesthetics in Frankfurt, Germany, to bring this research community together. The conference was dedicated to the topic of music and eye-tracking, asking the question: what do eye movements, pupil dilation, and blinking activity tell us about musical processing? This special issue is constituted of top-scoring research from the conference and spans a range of music-related topics. From tracking the gaze of performers in musical trios to basic research on how eye movements are affected by background music, the contents of this special issue highlight a variety of experimental approaches and possible applications of eye-tracking in music research. Full article
12 pages, 2777 KiB  
Article
Eye Movements, Attention, and Expert Knowledge in the Observation of Bharatanatyam Dance
by Raganya Ponmanadiyil and Matthew H. Woolhouse
J. Eye Mov. Res. 2018, 11(2), 1-12; https://doi.org/10.16910/jemr.11.2.11 - 28 Dec 2018
Cited by 9 | Viewed by 123
Abstract
Previous research indicates that dance expertise affects eye-movement behaviour—dance experts tend to have faster saccades and more tightly clustered fixations than novices when observing dance, suggesting that experts are able to predict movements and process choreographic information more quickly. Relating to this, the present study aimed to explore (1) the effects of expertise on eye movements (as a proxy for attentional focus and the existence of movement-dance schemas) in Indian Bharatanatyam dance, and (2) the role of narrative dance, an important component of Bharatanatyam. Fixation durations, dwell times, and fixation-position dispersions were recorded for novices and experts in Bharatanatyam (N = 28) while they observed videos of narrative and non-narrative Bharatanatyam dance. Consistent with previous research, experts had shorter fixation durations and more tightly clustered fixations than novices. Tighter clustering of fixations was also found for narrative versus non-narrative dance. Our results are discussed in relation to previous dance and eye-tracking research. Full article

13 pages, 1048 KiB  
Article
Pupillary Dilation Response Reflects Surprising Moments in Music
by Hsin-I Liao, Makoto Yoneya, Makio Kashino and Shigeto Furukawa
J. Eye Mov. Res. 2018, 11(2), 1-13; https://doi.org/10.16910/jemr.11.2.13 - 14 Dec 2018
Cited by 14 | Viewed by 123
Abstract
There are indications that the pupillary dilation response (PDR) reflects surprising moments in an auditory sequence, such as the appearance of a deviant noise against repetitively presented pure tones (Liao, Yoneya, Kidani, Kashino, & Furukawa, 2016), and salient and loud sounds that are evaluated subjectively by human participants (Liao, Kidani, Yoneya, Kashino, & Furukawa, 2016). In the current study, we further examined whether the PDR's reflection of auditory surprise can accumulate and be revealed in complex yet structured auditory stimuli, i.e., music, and when surprise is defined subjectively. Participants listened to 15 excerpts of music while their pupillary responses were recorded. In the surprise-rating session, participants rated how surprising an instance in the excerpt was, i.e., rich in variation versus monotonous, while they listened to it. In the passive-listening session, they listened to the same 15 excerpts again but were not involved in any task. The pupil diameter data obtained from both sessions were time-aligned to the rating data obtained from the surprise-rating session. Results showed that in both sessions, mean pupil diameter was larger at moments rated more surprising than at moments rated unsurprising. The result suggests that the PDR reflects surprise in music automatically. Full article
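The time-alignment described in the abstract can be illustrated with a minimal sketch. The array names, sampling rates, and surprise threshold below are assumptions for illustration only, not the study's analysis code.

```python
import numpy as np

def pupil_by_surprise(pupil, pupil_fs, ratings, rating_fs, threshold):
    """Interpolate a pupil-diameter trace onto the time base of continuous
    surprise ratings, then compare mean pupil diameter at moments rated
    above vs. below a surprise threshold."""
    t_pupil = np.arange(len(pupil)) / pupil_fs           # pupil sample times (s)
    t_rating = np.arange(len(ratings)) / rating_fs       # rating sample times (s)
    pupil_aligned = np.interp(t_rating, t_pupil, pupil)  # time-align the two signals
    surprising = np.asarray(ratings) >= threshold
    return pupil_aligned[surprising].mean(), pupil_aligned[~surprising].mean()

# e.g. mean_surprising, mean_unsurprising = pupil_by_surprise(pupil, 60.0, ratings, 10.0, 0.5)
```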

21 pages, 2112 KiB  
Article
Cross-Modal Music Integration in Expert Memory: Evidence from Eye Movements
by Véronique Drai-Zerbib and Thierry Baccino
J. Eye Mov. Res. 2018, 11(2), 1-21; https://doi.org/10.16910/jemr.11.2.4 - 12 Dec 2018
Cited by 14 | Viewed by 128
Abstract
The study investigated the cross-modal integration hypothesis for expert musicians using eye tracking. Twenty randomized excerpts of classical music were presented in two modes (auditory and visual), at the same time (simultaneously) or successively (sequentially). Musicians (N = 53, 26 experts and 27 non-experts) were asked to detect a note modified between the auditory and visual versions, either in the same major/minor key or violating the key. Experts carried out the task faster and with greater accuracy than non-experts. Sequential presentation was more difficult than simultaneous (longer fixations and higher error rates) and the modified notes were more easily detected when violating the key (fewer errors), but with longer fixations (speed/accuracy trade-off strategy). Experts detected the modified note faster, especially in the simultaneous condition in which cross-modal integration may be applied. These results support the hypothesis that the main difference between experts and non-experts derives from the difference in knowledge structures in memory built over time with practice. They also suggest that these high-level knowledge structures in memory contain harmony and tonal rules, arguing in favour of cross-modal integration capacities for experts, which are related to and can be explained by the long-term working memory (LTWM) model of expert memory (e.g., Drai-Zerbib & Baccino, 2014; Ericsson & Kintsch, 1995). Full article

24 pages, 5404 KiB  
Article
A Linear Oscillator Model Predicts Dynamic Temporal Attention and Pupillary Entrainment to Rhythmic Patterns
by Lauren K. Fink, Brian K. Hurley, Joy J. Geng and Petr Janata
J. Eye Mov. Res. 2018, 11(2), 1-24; https://doi.org/10.16910/jemr.11.2.12 - 20 Nov 2018
Cited by 23 | Viewed by 186
Abstract
Rhythm is a ubiquitous feature of music that induces specific neural modes of processing. In this paper, we assess the potential of a stimulus-driven linear oscillator model (57) to predict dynamic attention to complex musical rhythms on an instant-by-instant basis. We use perceptual thresholds and pupillometry as attentional indices against which to test our model predictions. During a deviance detection task, participants listened to continuously looping, multi-instrument, rhythmic patterns, while being eye-tracked. Their task was to respond anytime they heard an increase in intensity (dB SPL). An adaptive thresholding algorithm adjusted deviant intensity at multiple probed temporal locations throughout each rhythmic stimulus. The oscillator model predicted participants’ perceptual thresholds for detecting deviants at probed locations, with a low temporal salience prediction corresponding to a high perceptual threshold and vice versa. A pupil dilation response was observed for all deviants. Notably, the pupil dilated even when participants did not report hearing a deviant. Maximum pupil size and resonator model output were significant predictors of whether a deviant was detected or missed on any given trial. Besides the evoked pupillary response to deviants, we also assessed the continuous pupillary signal to the rhythmic patterns. The pupil exhibited entrainment at prominent periodicities present in the stimuli and followed each of the different rhythmic patterns in a unique way. Overall, these results replicate previous studies using the linear oscillator model to predict dynamic attention to complex auditory scenes and extend the utility of the model to the prediction of neurophysiological signals, in this case the pupillary time course; however, we note that the amplitude envelope of the acoustic patterns may serve as a similarly useful predictor. To our knowledge, this is the first paper to show entrainment of pupil dynamics by demonstrating a phase relationship between musical stimuli and the pupillary signal. Full article
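The abstract names a stimulus-driven linear oscillator model; the sketch below is only a generic damped, driven resonator in that spirit, not the model cited as (57). The driving signal, natural frequency, damping value, and the use of the rectified output as a temporal-salience index are all assumptions for illustration.

```python
import numpy as np

def resonator_output(envelope, fs, freq_hz=2.0, damping=0.05):
    """Drive a single damped linear oscillator with a stimulus amplitude
    envelope and return its rectified output as a rough salience index."""
    w0 = 2.0 * np.pi * freq_hz      # natural (resonant) frequency, rad/s
    dt = 1.0 / fs
    x, v = 0.0, 0.0                 # oscillator position and velocity
    out = np.empty(len(envelope))
    for i, drive in enumerate(envelope):
        acc = drive - 2.0 * damping * w0 * v - w0 ** 2 * x  # x'' = s - 2*zeta*w0*x' - w0^2*x
        v += acc * dt               # semi-implicit Euler integration
        x += v * dt
        out[i] = abs(x)
    return out

# e.g. salience = resonator_output(amplitude_envelope, fs=100.0, freq_hz=2.0)
```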

17 pages, 435 KiB  
Article
The Rhythm of Cognition—Effects of an Auditory Beat on Oculomotor Control in Reading and Sequential Scanning
by Elke B. Lange, Aleks Pieczykolan, Hans A. Trukenbrod and Lynn Huestegge
J. Eye Mov. Res. 2018, 11(2), 1-17; https://doi.org/10.16910/jemr.11.2.9 - 20 Aug 2018
Cited by 8 | Viewed by 112
Abstract
Eye-movement behavior is inherently rhythmic. Even without cognitive input, the eyes never rest, as saccades are generated 3 to 4 times per second. Based on an embodied view of cognition, we asked whether mental processing in visual cognitive tasks is also rhythmic in nature by studying the effects of an external auditory beat (rhythmic background music) on saccade generation in exemplary cognitive tasks (reading and sequential scanning). While in applied settings background music has been demonstrated to impair reading comprehension, the effect of musical tempo on eye-movement control during reading or scanning has not been investigated so far. We implemented a tempo manipulation in four steps as well as a silent baseline condition, while participants completed a text reading or a sequential scanning task that differed from each other in terms of underlying cognitive processing requirements. The results revealed that increased tempo of the musical beat sped up fixations in text reading, while the presence (vs. absence) of the auditory stimulus generally reduced overall reading time. In contrast, sequential scanning was unaffected by the auditory pacemaker. These results were supported by additionally applying Bayesian inference statistics. Our study provides evidence against a cognitive load account (i.e., that spare resources during low-demand sequential scanning allow for enhanced processing of the external beat). Instead, the data suggest an interpretation in favor of a modulation of the oculomotor saccade timer by irrelevant background music in cases involving highly automatized oculomotor control routines (here: in text reading). Full article

13 pages, 722 KiB  
Article
Eye Movements in Scene Perception While Listening to Slow and Fast Music
by Marek Franěk, Denis Šefara, Jan Petružálek, Roman Mlejnek and Leon van Noorden
J. Eye Mov. Res. 2018, 11(2), 1-13; https://doi.org/10.16910/jemr.11.2.8 - 11 Aug 2018
Cited by 10 | Viewed by 57
Abstract
To date, there is insufficient knowledge of how visual exploration of outdoor scenes may be influenced by the simultaneous processing of music. Eye movements during viewing of various outdoor scenes while listening to music at either a slow or fast tempo or in silence were measured. Significantly shorter fixations were found for viewing urban scenes compared with natural scenes, but there was no interaction between the type of scene and the acoustic conditions. The results revealed shorter fixation durations, in the range of 30 ms, in the silent control condition compared to both music conditions but, in contrast to previous studies, these differences were non-significant. Moreover, we did not find differences in eye movements between music conditions with a slow or fast tempo. It is supposed that the type of musical stimuli, the specific tempo, the specific experimental procedure, and the engagement of participants in listening to background music while processing visual information may be important factors that influence attentional processes, which are manifested in eye-movement behavior. Full article
13 pages, 400 KiB  
Article
Gazing at the Partner in Musical Trios: A Mobile Eye-Tracking Study
by Sarah Vandemoortele, Kurt Feyaerts, Mark Reybrouck, Geert De Bièvre, Geert Brône and Thomas De Baets
J. Eye Mov. Res. 2018, 11(2), 1-13; https://doi.org/10.16910/jemr.11.2.6 - 16 Jul 2018
Cited by 16 | Viewed by 102
Abstract
Few investigations into nonverbal communication in ensemble playing have so far focused on gaze behaviour. In this study, the gaze behaviour of musicians playing in trios was recorded using the recently developed technique of mobile eye-tracking. Four trios (clarinet, violin, piano) were recorded while rehearsing and while playing several runs through the same musical fragment. The current article reports on an initial exploration of the data in which we describe how often gazing at the partner occurred. On the one hand, we aim to identify possible contrasting cases. On the other, we look for tendencies across the run-throughs. We discuss the quantified gaze behaviour in relation to the existing literature and the current research design. Full article

17 pages, 736 KiB  
Article
The Impact of Music and Stretched Time on Pupillary Responses and Eye Movements in Slow-Motion Film Scenes
by David Hammerschmidt and Clemens Wöllner
J. Eye Mov. Res. 2018, 11(2), 1-17; https://doi.org/10.16910/jemr.11.2.10 - 20 May 2018
Cited by 11 | Viewed by 124
Abstract
This study investigated the effects of music and playback speed on arousal and visual perception in slow-motion scenes taken from commercial films. Slow-motion scenes are a ubiquitous film technique and highly popular. Yet the psychological effects of mediated time-stretching compared to real-time motion have not been empirically investigated. We hypothesised that music affects arousal and attentional processes. Furthermore, we assumed that playback speed influences viewers’ visual perception, resulting in a higher number of eye movements and larger gaze dispersion. Thirty-nine participants watched three film excerpts in a repeated-measures design in conditions with or without music and in slow motion vs. adapted real-time motion (both visual-only). Results show that music in slow-motion film scenes leads to higher arousal compared to no music, as indicated by larger pupil diameters in the former. There was no systematic effect of music on visual perception in terms of eye movements. Playback speed influenced visual perception in eye-movement parameters such that slow motion resulted in more and shorter fixations as well as more saccades compared to adapted real-time motion. Furthermore, in slow motion there was a higher gaze dispersion and a smaller centre bias, indicating that individuals attended to more detail in slow-motion scenes. Full article

16 pages, 7896 KiB  
Article
Synchronizing Eye Tracking and Optical Motion Capture: How to Bring Them Together
by Birgitta Burger, Anna Puupponen and Tommi Jantunen
J. Eye Mov. Res. 2018, 11(2), 1-16; https://doi.org/10.16910/jemr.11.2.5 - 7 May 2018
Cited by 12 | Viewed by 185
Abstract
Both eye tracking and motion capture technologies are nowadays frequently used in human sciences, although they are usually used separately. However, measuring both eye and body movements simultaneously would offer great potential for investigating crossmodal interaction in human (e.g., music- and language-related) behavior. Here we combined an Ergoneers Dikablis head-mounted eye tracker with a Qualisys Oqus optical motion capture system. In order to synchronize the recordings of both devices, we developed a generalizable solution that does not rely on any (cost-intensive) ready-made/company-provided synchronization solution. At the beginning of each recording, the participant nods quickly while fixating a target with the eyes open—a motion yielding a sharp vertical displacement in both mocap and eye data. This displacement can be reliably detected with a peak-picking algorithm and used for accurately aligning the mocap and eye data. This method produces accurate synchronization results in the case of clean data and therefore provides an attractive alternative to costly plug-ins, as well as a solution in case ready-made synchronization solutions are unavailable. Full article
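As a rough sketch of the peak-picking alignment described in the abstract: detect the first nod-like peak in each vertical-displacement signal and take the difference of the peak times as the offset between recordings. The signal names, sampling rates, and prominence threshold below are assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.signal import find_peaks

def first_nod_time(y, fs):
    """Time (in seconds) of the first sharp downward displacement in a
    vertical-position signal, detected as a prominent peak."""
    y = np.asarray(y, dtype=float)
    y = -(y - np.median(y))                        # remove baseline; downward nod -> positive peak
    peaks, _ = find_peaks(y, prominence=3.0 * np.std(y))
    if len(peaks) == 0:
        raise ValueError("no nod-like peak found")
    return peaks[0] / fs

def sync_offset(eye_y, eye_fs, mocap_y, mocap_fs):
    """Offset (mocap minus eye, in seconds) to subtract from the mocap time
    axis so that both recordings share the nod as a common zero point."""
    return first_nod_time(mocap_y, mocap_fs) - first_nod_time(eye_y, eye_fs)

# e.g. offset = sync_offset(eye_vertical, 60.0, head_marker_height, 120.0)
```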

16 pages, 414 KiB  
Article
Eye on Music Reading: A Methodological Review of Studies from 1994 to 2017
by Marjaana Puurtinen
J. Eye Mov. Res. 2018, 11(2), 1-16; https://doi.org/10.16910/jemr.11.2.2 - 1 May 2018
Cited by 32 | Viewed by 156
Abstract
In this review, we focus on the methodological aspects of eye-tracking research in the domain of music, published and/or available between 1994 and 2017, and we identify potentially fruitful next steps to increase coherence and systematicity within this emerging field. We review and discuss choices of musical stimuli, the conditions under which these were performed (i.e., control of performance tempo and music-reading protocols), performer’s level of musical expertise, and handling of performance errors and eye-movement data. We propose that despite a lack of methodological coherence in research to date, careful reflection on earlier methodological choices can help in formulating future research questions and in positioning new work. These steps would represent progress towards a cumulative research tradition, where joint understanding is built by systematic and consistent use of stimuli, research settings and methods of analysis. Full article

30 pages, 2934 KiB  
Article
Early Attraction in Temporally Controlled Sight Reading of Music
by Erkki Huovinen, Anna-Kaisa Ylitalo and Marjaana Puurtinen
J. Eye Mov. Res. 2018, 11(2), 1-30; https://doi.org/10.16910/jemr.11.2.3 - 10 Apr 2018
Cited by 24 | Viewed by 157
Abstract
A music reader has to “look ahead” from the notes currently being played—this has usually been called the Eye-Hand Span. Given the restrictions on processing time due to tempo and meter, the Early Attraction Hypothesis suggests that sight readers are likely to locally increase the span of looking ahead in the face of complex upcoming symbols (or symbol relationships). We argue that such stimulus-driven effects on looking ahead are best studied using a measure of Eye-Time Span (ETS) which redefines looking ahead as the metrical distance between the position of a fixation in the score and another position that corresponds to the point of metrical time at fixation onset. In two experiments of temporally controlled sight reading, musicians read simple stepwise melodies that were interspersed with larger intervallic skips, supposed to create points of higher melodic complexity (and visual salience) at the notes following the skips. The results support both Early Attraction (lengthening of looking ahead) and Distant Attraction (lengthening of incoming saccades) in the face of relative melodic complexity. Notably, such effects also occurred on the notes preceding the nominally complex ones. The results suggest that saccadic control in music reading depends on temporal restrictions as well as on local variations in stimulus complexity. Full article
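The Eye-Time Span defined in the abstract can be written down directly: the fixated score position (in beats) minus the score position implied by the time elapsed at fixation onset. The data layout and the assumption of a constant tempo below are illustrative only.

```python
def eye_time_span(fixation_onsets_s, fixated_beats, tempo_bpm):
    """ETS per fixation, in beats: fixated score position minus the score
    position that corresponds to the metrical time at fixation onset.

    fixation_onsets_s -- fixation onset times in seconds from the start of playing
    fixated_beats     -- score position of each fixated note, in beats
    tempo_bpm         -- performance tempo in beats per minute (assumed constant)
    """
    beats_per_second = tempo_bpm / 60.0
    return [fixated - onset * beats_per_second
            for onset, fixated in zip(fixation_onsets_s, fixated_beats)]

# e.g. eye_time_span([0.5, 1.1], [2.0, 3.5], tempo_bpm=120) ~ [1.0, 1.3] beats of look-ahead
```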
