
EyeTrackUAV2: A Large-Scale Binocular Eye-Tracking Dataset for UAV Videos

1 IRISA, CNRS, Univ Rennes, 263 Avenue Général Leclerc, 35000 Rennes, France
2 Laboratoire des Sciences du Numérique de Nantes (LS2N), Polytech Nantes, Université de Nantes, 44306 Nantes CEDEX 3, France
3 Department of Surveying and Geoinformatics Engineering, University of West Attica, 28 Agiou Spyridonos Str., 12243 Aigaleo, Greece
4 IETR, CNRS, INSA Rennes, IETR-UMR 6164, 35000 Rennes, France
* Author to whom correspondence should be addressed.
Received: 3 December 2019 / Revised: 18 December 2019 / Accepted: 20 December 2019 / Published: 8 January 2020
The rapid evolution of unmanned aerial vehicle (UAV) imagery has led to a multiplication of applications in fields such as military and civilian surveillance, delivery services, and wildlife monitoring. Combining UAV imagery with the study of dynamic salience further extends the range of future applications. Indeed, considerations of visual attention open the door to new avenues in a number of scientific fields, such as compression, retargeting, and decision-making tools. To conduct saliency studies, we identified the need for new large-scale eye-tracking datasets for visual salience in UAV content. We address this need by introducing the EyeTrackUAV2 dataset. It consists of precise binocular gaze information (1000 Hz) collected over 43 videos (RGB, 30 fps, 1280 × 720 or 720 × 480). Thirty participants observed the stimuli under both free-viewing and task conditions. Fixations and saccades were then computed with the dispersion-threshold identification (I-DT) algorithm, while gaze density maps were calculated by filtering eye positions with a Gaussian kernel. An analysis of the collected gaze positions provides recommendations for generating visual-salience ground truth. It also sheds light on how saliency biases in UAV videos differ from those in conventional content, especially regarding the center bias.
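The abstract names the dispersion-threshold identification (I-DT) algorithm for segmenting raw gaze samples into fixations and saccades. Below is a minimal Python sketch of I-DT; the function name, threshold values, and coordinate units are illustrative assumptions, not the settings used by the authors.

```python
import numpy as np

def idt_fixations(x, y, t, dispersion_threshold=1.0, duration_threshold=0.1):
    """Dispersion-threshold identification (I-DT) fixation detection.

    x, y: gaze coordinates (e.g., degrees of visual angle)
    t: timestamps in seconds
    Thresholds here are illustrative, not the paper's actual settings.
    Returns a list of (start_time, end_time, mean_x, mean_y) tuples.
    """
    # Dispersion = (max x - min x) + (max y - min y) over a sample window.
    def dispersion(a, b):
        return (x[a:b].max() - x[a:b].min()) + (y[a:b].max() - y[a:b].min())

    fixations = []
    n = len(t)
    i = 0
    while i < n:
        # Grow a window covering the minimum fixation duration.
        j = i
        while j < n and t[j] - t[i] < duration_threshold:
            j += 1
        if j >= n:
            break
        if dispersion(i, j + 1) <= dispersion_threshold:
            # Expand the window while dispersion stays under threshold.
            while j + 1 < n and dispersion(i, j + 2) <= dispersion_threshold:
                j += 1
            fixations.append((t[i], t[j], x[i:j + 1].mean(), y[i:j + 1].mean()))
            i = j + 1  # continue after the fixation window
        else:
            i += 1  # slide the window forward; this sample starts no fixation
    return fixations
```

Samples falling between detected fixations can then be treated as saccades. The gaze density maps mentioned in the abstract would be obtained separately, by accumulating eye positions on the frame grid and convolving with a Gaussian kernel.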
Keywords: dataset; salience; unmanned aerial vehicles (UAV); videos; visual attention; eye tracking; surveillance
Graphical abstract
MDPI and ACS Style

Perrin, A.-F.; Krassanakis, V.; Zhang, L.; Ricordel, V.; Perreira Da Silva, M.; Le Meur, O. EyeTrackUAV2: A Large-Scale Binocular Eye-Tracking Dataset for UAV Videos. Drones 2020, 4, 2.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
