Electroencephalography Dataset of Young Drivers and Non-Drivers Under Visual and Auditory Distraction Using a Go/No-Go Paradigm
Abstract
1. Summary
2. Data Description
2.1. Dataset Content
- The participants file contains demographic and behavioral data from the 40 university students who volunteered for the study: sex (male or female), driving status (ability to drive and possession of a driver’s license), and driving experience measured in years. It also records weekly driving frequency (ranging from once to three or more times per week) and daily cell phone use, categorized into four ranges: ≤1 h, 1–2 h, 2–3 h, and more than 3 h per day. In addition, the file documents the frequency of cell phone use while driving (“sometimes” or “frequently”) and the emotional state reported before the experiment, classified according to Mehrabian’s affective model [21] into categories such as calm, tense, sleepy, delighted, excited, or happy.
- The EEG raw data folder contains the recordings of the 40 participants, each identified by a unique code (e.g., ID 001, ID 002). Each participant’s directory holds eight files with the .db extension, one per experimental task. File naming follows a standardized BIDS-like convention, e.g., subj-002_ses-001_task-02_run-002_20240615_094500_eeg.db, where the fields correspond to subject ID, session, task number, run number, and timestamp (YYYYMMDD_HHMMSS). This naming scheme is consistent across all .db, .set, and .events files in the Zenodo repository.
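Because the naming convention is fixed, file metadata can be recovered programmatically. The sketch below parses a filename into its fields; the function and field names are illustrative, not part of the dataset tooling:

```python
import re

# Pattern for the BIDS-like convention described above:
# subj-<ID>_ses-<session>_task-<task>_run-<run>_<YYYYMMDD>_<HHMMSS>_eeg.<ext>
PATTERN = re.compile(
    r"subj-(?P<subject>\d+)_ses-(?P<session>\d+)_task-(?P<task>\d+)"
    r"_run-(?P<run>\d+)_(?P<date>\d{8})_(?P<time>\d{6})_eeg\.(?P<ext>db|set|events)"
)

def parse_eeg_filename(name: str) -> dict:
    """Return the naming fields of a dataset file, or raise ValueError."""
    m = PATTERN.fullmatch(name)
    if m is None:
        raise ValueError(f"not a dataset filename: {name}")
    return m.groupdict()

fields = parse_eeg_filename("subj-002_ses-001_task-02_run-002_20240615_094500_eeg.db")
# fields["subject"] == "002", fields["task"] == "02", fields["date"] == "20240615"
```

The same pattern matches the converted .set and .events files, since the repository uses one scheme throughout.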
- The Events folder contains eight text files that document the experimental events, for example, Event_Test_1.txt. Each file stores three key variables:
- Latency indicates the precise timing of each event relative to the EEG recording, expressed as the time interval from the start of the EEG signal.
- Type specifies the stimulus presented during the experiment. Stimuli consist of both visual and auditory signals: visual signals include yield signs, stop signs, and pedestrian crossings, while auditory signals include hazard light sounds, brake sounds, phone call sounds, and conversation sounds. In addition, the “Cross” entries represent transition markers between events rather than stimuli. All stimuli were systematically presented across the eight experimental tasks for each participant.
- Position provides the sequential order or spatial reference of the event within the experimental timeline, allowing precise alignment with the EEG data.
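The three documented variables make each events file straightforward to load. The sketch below assumes a tab-delimited layout with a lowercase header row — verify the delimiter and column names against the actual Event_Test_*.txt files before use:

```python
import csv
import io

def load_events(fileobj):
    """Parse an events file into dicts with latency, type, and position.

    Assumes tab-delimited columns with a header row; adjust if the
    actual Event_Test_*.txt files differ.
    """
    reader = csv.DictReader(fileobj, delimiter="\t")
    events = []
    for row in reader:
        events.append({
            "latency": float(row["latency"]),   # seconds from EEG recording start
            "type": row["type"],                # stimulus label, or "Cross"
            "position": int(row["position"]),   # sequential order in the timeline
        })
    return events

# Synthetic example mirroring the three documented variables
sample = io.StringIO(
    "latency\ttype\tposition\n"
    "0.000\tCross\t1\n"
    "1.000\tStop sign\t2\n"
    "2.000\tCross\t3\n"
)
events = load_events(sample)
# "Cross" rows mark transitions between events, not stimuli, so they are
# typically filtered out before ERP analysis
stimuli = [e for e in events if e["type"] != "Cross"]
```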
2.2. File Structure Example
3. Methods
3.1. Participants
3.2. Experimental Design
- Experiment 1: https://gitlab.pavlovia.org/ydgarcia1/gonogo_youngdrivers_test1 (accessed on 16 September 2025);
- Experiment 2: https://gitlab.pavlovia.org/ydgarcia1/gonogo_youngdrivers_test2 (accessed on 16 September 2025);
- Experiment 3: https://gitlab.pavlovia.org/ydgarcia1/gonogo_youngdrivers_test3 (accessed on 16 September 2025);
- Experiment 4: https://gitlab.pavlovia.org/ydgarcia1/gonogo_youngdrivers_test4 (accessed on 16 September 2025);
- Experiment 5: https://gitlab.pavlovia.org/ydgarcia1/gonogo_youngdrivers_test5 (accessed on 16 September 2025);
- Experiment 6: https://gitlab.pavlovia.org/ydgarcia1/gonogo_youngdrivers_test6 (accessed on 16 September 2025);
- Experiment 7: https://gitlab.pavlovia.org/ydgarcia1/gonogo_youngdrivers_test7 (accessed on 16 September 2025);
- Experiment 8: https://gitlab.pavlovia.org/ydgarcia1/gonogo_youngdrivers_test8 (accessed on 16 September 2025).
3.3. EEG Acquisition
3.4. Data Preprocessing and Curation
- Filtered between 0.5 and 40 Hz.
- Re-referenced to the average.
- Cleaned using ICA for ocular and muscular artifact removal.
- Epoched from −200 ms to 800 ms around each stimulus.
- Annotated with Go/No-Go and distraction events.
4. Data Availability
- File format: .db (BrainAccess MIDI).
- Sampling rate: 250 Hz.
- Bit resolution: 24-bit.
- Number of channels: 16 (channel configuration as shown in Figure 2).
- Event markers: Included as separate .txt files with precise latency information, stimulus type, and position data.
- File Conversion: Raw .db files should be converted to EEGLAB-compatible .set format using the BrainAccess Board software File Converter (version 2.0 or later).
- Recommended Preprocessing Pipeline (using EEGLAB version 2021.0 or later):
- Filtering: Butterworth band-pass filter (0.5–40 Hz, 4th order).
- Referencing: Average reference (re-reference to the mean of all channels).
- ICA Algorithm: Infomax ICA (runica) or AMICA, with default EEGLAB settings; minimum 30 min of data per participant recommended for optimal ICA decomposition.
- Artifact Rejection: Manual inspection of independent components and removal of components with EOG/EMG characteristics.
- Epoching: Time-locked to stimulus onset (−200 to 800 ms relative to Go/No-Go cues).
- Baseline Correction: Mean baseline subtraction (−200 to 0 ms pre-stimulus window).
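At the 250 Hz sampling rate, the epoching and baseline windows above translate directly into sample indices. The pure-Python sketch below is illustrative only — EEGLAB performs these steps internally — but makes the index arithmetic explicit:

```python
FS = 250  # Hz, the dataset's sampling rate

def epoch_window(latency_s, tmin_s=-0.2, tmax_s=0.8, fs=FS):
    """Convert a stimulus latency in seconds to [start, stop) sample indices."""
    onset = round(latency_s * fs)
    return onset + round(tmin_s * fs), onset + round(tmax_s * fs)

def baseline_correct(epoch, fs=FS, tmin_s=-0.2):
    """Subtract the mean of the pre-stimulus (-200 to 0 ms) samples."""
    n_baseline = round(-tmin_s * fs)  # 50 samples at 250 Hz
    mean = sum(epoch[:n_baseline]) / n_baseline
    return [x - mean for x in epoch]

start, stop = epoch_window(10.0)  # stimulus at t = 10 s
# start == 2450, stop == 2700: 250 samples, i.e., the 1 s epoch (-200 to 800 ms)
```

Each epoch therefore spans exactly 250 samples, with the first 50 forming the baseline window.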
5. Technical Validation
6. User Notes
- Event-Related Potentials (ERP) analysis: Researchers can replicate N2 and P3 responses associated with inhibition and distraction [24].
- Driver vs. non-driver comparisons: Enables testing whether neural differences are attributable to driving experience or general youth traits [1].
- Machine learning: The structured dataset is suitable for training classifiers (e.g., SVM, LSTM) for distraction detection and BCI applications [25].
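Classifiers such as SVMs are typically trained on spectral features rather than raw samples. The stdlib-only sketch below computes band power with a naive DFT to show the kind of feature involved; it is illustrative (in practice one would use numpy.fft or scipy.signal, and a library such as scikit-learn for the classifier itself):

```python
import cmath
import math

FS = 250  # Hz, the dataset's sampling rate

def band_power(signal, f_lo, f_hi, fs=FS):
    """Power of `signal` within [f_lo, f_hi] Hz via a naive DFT.

    O(n^2) and for illustration only; use numpy.fft in real pipelines.
    """
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))
            power += abs(coeff) ** 2 / n ** 2
    return power

# A pure 10 Hz sine concentrates its power in the alpha band (8-13 Hz)
t = [i / FS for i in range(FS)]  # 1 s of samples
alpha_wave = [math.sin(2 * math.pi * 10 * s) for s in t]
# band_power(alpha_wave, 8, 13) is about 0.25; the 14-30 Hz band is near zero
```

Per-channel band powers computed this way over each epoch form a feature vector that can feed an SVM or recurrent model for distraction detection.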
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Goldsworthy, J.; Watling, C.; Rose, C.; Larue, G. The Effects of Distraction on Younger Drivers: A Neurophysiological Perspective. Appl. Ergon. 2024, 114, 104147. [Google Scholar] [CrossRef] [PubMed]
- Regan, M.; Lee, J.; Young, K. Driver Distraction: Theory, Effects, and Mitigation, 1st ed.; CRC Press: Boca Raton, FL, USA, 2008. [Google Scholar]
- Yeo, M.V.M.; Li, X.; Shen, K.; Wilder-Smith, E.P.V. Can SVM Be Used for Automatic EEG Detection of Drowsiness during Car Driving? Saf. Sci. 2009, 47, 115–124. [Google Scholar] [CrossRef]
- Shen, K.Q.; Li, X.P.; Ong, C.J.; Shao, S.Y.; Wilder-Smith, E.P.V. EEG-Based Mental Fatigue Measurement Using Multi-Class Support Vector Machines with Confidence Estimate. Clin. Neurophysiol. 2008, 119, 1524–1533. [Google Scholar] [CrossRef] [PubMed]
- Lal, S.K.L.; Craig, A. A Critical Review of the Psychophysiology of Driver Fatigue. Biol. Psychol. 2001, 55, 173–194. [Google Scholar] [CrossRef] [PubMed]
- Jap, B.T.; Lal, S.; Fischer, P.; Bekiaris, E. Using EEG Spectral Components to Assess Algorithms for Detecting Fatigue. Expert Syst. Appl. 2009, 36, 2352–2359. [Google Scholar] [CrossRef]
- Sonnleitner, A.; Treder, M.S.; Simon, M.; Willmann, S.; Ewald, A.; Buchner, A.; Schrauf, M. EEG Alpha Spindles and Prolonged Brake Reaction Times during Auditory Distraction in an On-Road Driving Study. Accid. Anal. Prev. 2014, 62, 110–118. [Google Scholar] [CrossRef] [PubMed]
- Wali, M.K.; Murugappan, M. Subtractive Fuzzy Classifier Based Driver Distraction Levels Classification Using EEG. J. Phys. Ther. Sci. 2013, 25, 1055–1058. [Google Scholar] [CrossRef] [PubMed]
- Rothkrantz, L.J.M.; Horlings, R.; Dharmawan, Z. Recognition of Emotional States of Car Drivers by EEG Analysis. Int. J. Neural Mass 2009, 19, 119–128. [Google Scholar]
- Frasson, C.; Brosseau, P.O.; Tran, T.H.D. Virtual Environment for Monitoring Emotional Behaviour in Driving. In Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2014; Volume 8474, pp. 75–83. ISBN 9783319072203. [Google Scholar]
- Fan, X.A.; Bi, L.Z.; Chen, Z.L. Using EEG to Detect Drivers’ Emotion with Bayesian Networks. In Proceedings of the International Conference on Machine Learning and Cybernetics, ICMLC 2010, Qingdao, China, 11–14 July 2010; Volume 3, pp. 1177–1181. [Google Scholar]
- Lin, C.T.; Lin, H.Z.; Chiu, T.W.; Chao, C.F.; Chen, Y.C.; Liang, S.F.; Ko, L.W. Distraction-Related EEG Dynamics in Virtual Reality Driving Simulation. In Proceedings of the IEEE International Symposium on Circuits and Systems, Seattle, WA, USA, 18–21 May 2008; pp. 1088–1091. [Google Scholar]
- Kumar, S.P.; Selvaraj, J.; Krishnakumar, R.; Sahayadhas, A. Detecting Distraction in Drivers Using Electroencephalogram (EEG) Signals. In Proceedings of the 4th International Conference on Computing Methodologies and Communication, ICCMC 2020, Erode, India, 11–13 March 2020; pp. 635–639. [Google Scholar]
- Shi, C.; Yan, F.; Zhang, J.; Yu, H.; Peng, F.; Yan, L. Right Superior Frontal Involved in Distracted Driving. Transp. Res. Part. F Traffic Psychol. Behav. 2023, 93, 191–203. [Google Scholar] [CrossRef]
- Osborn, A.F.; Owens, D.A. Change Blindness: A Comparison of Selective Attention of Novice and Experienced Drivers. J. Vis. 2010, 10, 201. [Google Scholar] [CrossRef]
- Tao, S.; Deng, Y.; Jiang, Y. Differential Impact of Working Memory and Inhibitory Control on Distracted Driving Performance among Experienced and Inexperienced Drivers. In Proceedings of the 12th International Conference on Traffic and Logistic Engineering, ICTLE, Macau, China, 23–25 August 2024; pp. 17–22. [Google Scholar]
- Ba, Y.; Zhang, W.; Salvendy, G.; Cheng, A.S.K.; Ventsislavova, P. Assessments of Risky Driving: A Go/No-Go Simulator Driving Task to Evaluate Risky Decision-Making and Associated Behavioral Patterns. Appl. Ergon. 2016, 52, 265–274. [Google Scholar] [CrossRef] [PubMed]
- Sodnik, J.; Dicke, C.; Tomažič, S.; Billinghurst, M. A User Study of Auditory versus Visual Interfaces for Use While Driving. Int. J. Hum. Comput. Stud. 2008, 66, 318–332. [Google Scholar] [CrossRef]
- Karthaus, M.; Wascher, E.; Getzmann, S. Effects of Visual and Acoustic Distraction on Driving Behavior and EEG in Young and Older Car Drivers: A Driving Simulation Study. Front. Aging Neurosci. 2018, 10, 420. [Google Scholar] [CrossRef] [PubMed]
- Delorme, A.; Makeig, S. EEGLAB: An Open Source Toolbox for Analysis of Single-Trial EEG Dynamics Including Independent Component Analysis. J. Neurosci. Methods 2004, 134, 9–21. [Google Scholar] [CrossRef] [PubMed]
- Mehrabian, A. Pleasure-Arousal-Dominance: A General Framework for Describing and Measuring Individual Differences in Temperament. Curr. Psychol. 1996, 14, 261–292. [Google Scholar] [CrossRef]
- Neurotechnology. BrainAccess MIDI Electroencephalograph. Available online: https://www.brainaccess.ai/wp-content/uploads/downloads/UserManualMIDI.pdf (accessed on 12 December 2023).
- Li, G.; Wu, X.; Eichberger, A.; Green, P.; Olaverri-Monreal, C.; Yan, W.; Qin, Y.; Li, Y. Drivers’ EEG Responses to Different Distraction Tasks. Automot. Innov. 2023, 6, 20–31. [Google Scholar] [CrossRef]
- Hsieh, S.; Wu, M.; Tang, C.H. Adaptive Strategies for the Elderly in Inhibiting Irrelevant and Conflict No-Go Trials While Performing the Go/No-Go Task. Front. Aging Neurosci. 2016, 7, 243. [Google Scholar] [CrossRef] [PubMed]
- Wali, M.K.; Murugappan, M.; Ahmmad, B. Wavelet Packet Transform Based Driver Distraction Level Classification Using EEG. Math. Probl. Eng. 2013, 2013, 297587. [Google Scholar] [CrossRef]


| Group | Licensed Drivers (Male) | Licensed Drivers (Female) | Non-Drivers (Male) | Non-Drivers (Female) |
|---|---|---|---|---|
| Age (years) | 18–24 (Ave: 20.9, SD: 2.03) | 18–21 (Ave: 20.1, SD: 0.97) | 19–24 (Ave: 21.3, SD: 1.78) | 18–23 (Ave: 20.5, SD: 1.44) |
| Driving experience (years) | 0–6 (Ave: 2.1, SD: 1.76) | 1–3 (Ave: 1.7, SD: 0.67) | — | — |
| Mobile use per day (percentage) | ≤1 h: 0%; 1–2 h: 0%; 2–3 h: 30%; >3 h: 70% | ≤1 h: 0%; 1–2 h: 9%; 2–3 h: 36%; >3 h: 55% | ≤1 h: 8%; 1–2 h: 0%; 2–3 h: 42%; >3 h: 50% | ≤1 h: 0%; 1–2 h: 9%; 2–3 h: —; >3 h: 91% |
| Frequency of mobile use while driving (percentage) | Sometimes: 90%; Frequently: 10% | Sometimes: 91%; Frequently: 9% | — | — |
| Emotions and mood states before the experiments (percentage) | Calm: 50%; Tense: 0%; Excited: 0%; Happy: 20%; Depressed: 0%; Sleepy: 10%; Delighted: 10%; Distressed: 0%; Angry: 10%; Pleased: 0% | Calm: 55%; Tense: 9%; Excited: 9%; Happy: 0%; Depressed: 9%; Sleepy: 9%; Delighted: 0%; Distressed: 0%; Angry: 0%; Pleased: 9% | Calm: 42%; Tense: 33%; Excited: 0%; Happy: 17%; Depressed: 0%; Sleepy: 0%; Delighted: 0%; Distressed: 8%; Angry: 0%; Pleased: 0% | Calm: 36%; Tense: 36%; Excited: 18%; Happy: 0%; Depressed: 0%; Sleepy: 10%; Delighted: 0%; Distressed: 0%; Angry: 0%; Pleased: 0% |
| Exp. | Stimuli Type | Stimuli Presented | Task Requirement | Distraction Condition | Description |
|---|---|---|---|---|---|
| 1 | Visual | - Stop sign (1 s) - Yield sign (1 s) - Cross (1 s) | No response required | None | Participants view alternating “Stop” and “Yield” signals with interspersed “cross” symbols to focus attention at the screen center. No interaction is needed. |
| 2 | Visual + Interactive | - Stop sign (1 s) - Yield sign (1 s) - Cross (1 s) | Press space bar when “Stop” appears | None | Participants press the space bar when they see the “Stop” signal. The “cross” symbol centers attention between signals. |
| 3 | Visual + Distraction | - Stop sign (1 s) - Yield sign (1 s) - Cross (1 s) | No response required | Text message notification (sound + visual) | Participants view “Stop” and “Yield” signals while receiving 3 visual and auditory message notifications. No interaction required. |
| 4 | Visual + Interactive + Distraction | - Stop sign (1 s) - Yield sign (1 s) - Cross (1 s) | Press space bar when “Stop” appears | Incoming call sound + 6 s conversation | Participants press the space bar when they see “Stop” (Go stimulus) and experience an incoming call notification followed by a short conversation sound. |
| 5 | Auditory | - Braking sound (1.5 s) - Hazard lights sound (1.5 s) - Pause (1 s) | No response required | None | Blindfolded participants listen to alternating “braking” and “hazard lights“ sounds, separated by 1 s pauses. No interaction required. |
| 6 | Auditory + Interactive | - Braking sound (1.5 s) - Hazard lights sound (1.5 s) - Pause (1 s) | Press space bar when hearing “braking” sound | None | Blindfolded participants press the space bar when they hear the “braking” sound. |
| 7 | Auditory + Distraction | - Braking sound (1.5 s) - Hazard lights sound (1.5 s) - Pause (1 s) | No response required | Message notification sound | Blindfolded participants listen to the “braking” and “hazard lights“ sounds with intermittent WhatsApp message notifications. No interaction required. |
| 8 | Auditory + Interactive + Distraction | - Braking sound (1.5 s) - Hazard lights sound (1.5 s) - Pause (1 s) | Press space bar when hearing “braking” sound | Incoming call notification + 6 s conversation | Blindfolded participants press the space bar when they hear the “braking” sound, while experiencing an incoming call notification and brief conversation sound. |
| Stimulus | Go/No-Go Status | Expected Response | Trigger Label |
|---|---|---|---|
| Stop sign | Go | Button press | Go_stop |
| Braking sound | Go | Button press | Go_brake |
| Yield sign | No-Go | No response | Nogo_yield |
| Hazard lights | No-Go | No response | Nogo_hazard |
| Transition cross | Transition | — | Cross |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
García-Ramírez, Y.; Gordillo, L.; Pereira, B. Electroencephalography Dataset of Young Drivers and Non-Drivers Under Visual and Auditory Distraction Using a Go/No-Go Paradigm. Data 2025, 10, 175. https://doi.org/10.3390/data10110175

