Data Descriptor

Electroencephalography Dataset of Young Drivers and Non-Drivers Under Visual and Auditory Distraction Using a Go/No-Go Paradigm

Department of Civil Engineering, Universidad Técnica Particular de Loja, Loja 110101, Ecuador
* Author to whom correspondence should be addressed.
Data 2025, 10(11), 175; https://doi.org/10.3390/data10110175
Submission received: 16 September 2025 / Revised: 13 October 2025 / Accepted: 31 October 2025 / Published: 1 November 2025

Abstract

Electroencephalography (EEG) provides insights into the neural mechanisms underlying attention, response inhibition, and distraction in cognitive tasks. This dataset was collected to examine neural activity in young drivers and non-drivers performing Go/No-Go tasks under visual and auditory distraction conditions. A total of 40 university students (20 drivers, 20 non-drivers; balanced by sex) completed eight experimental blocks combining visual or auditory stimuli with realistic distractions, such as text message notifications and phone call simulations. EEG was recorded using a 16-channel BrainAccess MIDI system at 250 Hz. Experiments 1, 3, 5, and 7 served as transitional blocks without participant responses and were excluded from behavioral and event-related potential analyses; however, their EEG recordings and event markers are included for baseline or exploratory analyses. The dataset comprises raw EEG files, event markers for Go/No-Go stimuli and distractions, and metadata on participant demographics and mobile phone usage. This resource enables studies of attentional control, inhibitory processes, and distraction-related neural dynamics, supporting research in cognitive neuroscience, brain–computer interfaces, and transportation safety.
Dataset License: CC-BY 4.0

1. Summary

This dataset provides electroencephalography (EEG) recordings from young drivers and non-drivers exposed to visual and auditory distractions in a Go/No-Go paradigm. Distraction is one of the leading factors contributing to road crashes, particularly among young adults, who are statistically more prone to risk-taking behaviors and mobile phone use while driving [1,2]. Given the critical role of neurophysiological measures in understanding these risks, previous studies have extensively explored EEG applications in driving contexts, including fatigue detection [3,4,5,6], distraction [7,8], and emotional states [9,10,11]. For instance, changes in frontal theta and delta bands are strongly correlated with the onset of fatigue [5], whereas increases in frontal theta (5–7.8 Hz) and beta (12–17 Hz) power are observed during distraction episodes [12,13]. Specific brain regions—such as the right superior frontal gyrus and dorsolateral frontal cortex—have been linked to visual and auditory distraction, respectively [14]. Moreover, event-related potentials (ERPs) such as the N2 and P3 components are consistently associated with response inhibition and attentional control in Go/No-Go paradigms [15,16,17]. Despite these advances, there remains a lack of open-access EEG datasets comparing young drivers and non-drivers under realistic distraction conditions. By addressing this gap, the present dataset provides a valuable resource for studying how driving experience modulates neural activity associated with attentional control and inhibition.
To address this gap, we recorded EEG signals from 40 university students (aged 18–24), balanced by sex and driving experience, while they performed Go/No-Go tasks under different distraction conditions. The experimental design included eight blocks, combining visual and auditory stimuli with realistic distractors such as simulated text messages and phone calls, providing ecologically valid conditions [18,19]. EEG was collected using the BrainAccess MIDI system (16 channels, 250 Hz sampling), followed by standardized preprocessing and artifact removal using EEGLAB [20].
While numerous EEG datasets have explored driver fatigue, drowsiness, or emotion recognition, very few openly available resources have focused on young adults and directly compared drivers and non-drivers under both visual and auditory distraction conditions. Most existing datasets rely on driving simulators or lack standardized task protocols designed for response inhibition. This dataset uniquely combines realistic stimuli—such as phone calls and text notifications—with the Go/No-Go paradigm, enabling cross-modal analysis of attention, distraction, and inhibitory control in both experienced and inexperienced drivers. It thus fills a clear gap in the open-access EEG domain by offering ecologically valid, cross-group, and multi-modal data for transportation safety and cognitive neuroscience research.

2. Data Description

2.1. Dataset Content

The dataset is organized into three main components:
  • The participants file contains demographic and behavioral data from the 40 university students who voluntarily took part in the study: sex (male or female), driving status (ability to drive and possession of a driver’s license), and driving experience measured in years. It also records weekly driving frequency (ranging from one to three or more times per week) and daily cell phone use, categorized into four ranges: ≤1 h, >1–2 h, >2–3 h, and >3 h per day. In addition, the file documents the frequency of cell phone use while driving (“sometimes” or “frequently”) and the emotional state prior to the experiment, classified according to Mehrabian’s affective model [21], with categories such as calm, tense, sleepy, delighted, excited, or happy.
  • The EEG raw data folder contains the recordings of the 40 participants, each identified with a unique code (e.g., ID 001, ID 002, etc.). Within every participant’s directory, there are eight files with the .db extension, corresponding to the eight experimental tasks. File naming follows the standardized BIDS-like convention:
    subj-002_ses-001_task-02_run-002_20240615_094500_eeg.db
    where fields correspond to subject ID, session, task number, run number, and timestamp (YYYYMMDD_HHMMSS). This naming scheme is consistent across all .db, .set, and .events files in the Zenodo repository.
  • The Events folder contains eight text files that document the experimental events, for example, Event_Test_1.txt. Each file stores three key variables:
    • Latency indicates the precise timing of each event relative to the EEG recording, expressed as the time interval from the start of the EEG signal.
    • Type specifies the stimulus presented during the experiment. Stimuli consist of both visual and auditory signals: visual signals include yield signs, stop signs, and pedestrian crossings, while auditory signals include hazard light sounds, brake sounds, phone call sounds, and conversation sounds. In addition, the “Cross” entries represent transition markers between events rather than stimuli. All stimuli were systematically presented across the eight experimental tasks for each participant.
    • Position provides the sequential order or spatial reference of the event within the experimental timeline, allowing precise alignment with the EEG data.
It is important to note that Experiments 1, 3, 5, and 7 were designed as transitional blocks to prepare participants for the main Go/No-Go tasks. These blocks did not require active responses from participants and were therefore excluded from the behavioral and ERP analyses. However, the EEG recordings and event markers from these blocks are still included in the dataset to allow researchers to explore baseline neural activity, transition effects, or other analytical purposes.
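Because the descriptor does not fix the delimiter, header row, or latency unit of the event files, the following Python sketch makes those choices explicit as assumptions: it parses the metadata fields out of a recording filename and loads one events file with pandas, converting latencies (assumed to be in seconds) to sample indices at the 250 Hz sampling rate. The file paths are illustrative.

```python
import re
import pandas as pd

FS = 250  # BrainAccess MIDI sampling rate in Hz (Section 3.3)

# Parse the BIDS-like filename convention described above into its fields.
FNAME_RE = re.compile(
    r"subj-(?P<subj>\d+)_ses-(?P<ses>\d+)_task-(?P<task>\d+)"
    r"_run-(?P<run>\d+)_(?P<date>\d{8})_(?P<time>\d{6})_eeg\.db"
)
meta = FNAME_RE.match("subj-002_ses-001_task-02_run-002_20240615_094500_eeg.db")
print(meta.groupdict())  # {'subj': '002', 'ses': '001', 'task': '02', ...}

# Load one events file. The whitespace delimiter and header row are
# assumptions; columns follow the Latency / Type / Position description.
events = pd.read_csv("3. Events/Event_Test_2.txt", sep=r"\s+")
events.columns = [c.lower() for c in events.columns]

# If latency is stored in seconds from recording onset (an assumption),
# convert it to sample indices for alignment with the 250 Hz signal.
events["sample"] = (events["latency"] * FS).round().astype(int)

# 'Cross' entries are transition markers, not Go/No-Go stimuli.
stimuli = events[events["type"] != "Cross"]
```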

2.2. File Structure Example

/EEG_Dataset_Drivers_NonDrivers/
  ├── 1. Participants/
  │   ├── Participants.xlsx
  ├── 2. EEG raw data/
  │   ├── ID 001/
  │       ├── subj-ID#_ses-#_task-#_run-001_yyyymmdd_hhmmss_eeg.db
  │       ├── subj-ID#_ses-#_task-#_run-002_yyyymmdd_hhmmss_eeg.db
  │       ├── …
  │   ├── ID 002/
  │   ├── ID 003/
  │   └── ...
  ├── 3. Events/
     ├── Event_Test_1.txt
     ├── Event_Test_2.txt
     └── ...

3. Methods

3.1. Participants

Forty young adults (20 drivers, 20 non-drivers) from the Universidad Técnica Particular de Loja (UTPL), Ecuador, participated in the study (see Table 1). Participants were classified as drivers if they held a valid driver’s license and had at least one year of active driving experience (average 2.1 ± 1.7 years). Non-drivers were individuals without a driver’s license or formal driving experience. Exclusion criteria included neurological disorders, psychiatric conditions, or medication affecting the central nervous system. Ethical approval was granted by the UTPL Ethics Committee (protocol 2024-06-INT-EO-RM-002), and informed consent was obtained from all participants. No identifiable audio or visual data were recorded. The dataset contains only anonymized EEG signals and event markers, ensuring compliance with institutional privacy standards.
Table 1 presents the distribution of participants across study groups. Participants, aged 18 to 24 years, were equally divided by sex. Licensed drivers, with 0–6 years of driving experience, reported high mobile usage: 55–70% used their phones for more than 3 h per day, which could indicate higher exposure to distractions. Non-drivers also showed substantial mobile use, with 50–91% exceeding 3 h daily. Regarding mobile use while driving, 90% or more of drivers reported using their phones “sometimes,” highlighting a common yet risky behavior. Emotional states prior to the experiment were mostly positive: “calm” was the most frequent response (36–55%), although some participants reported feeling “tense,” “excited,” or “happy.”
Both groups were comparable in age and sex distribution, ensuring that differences in EEG or behavioral outcomes are not attributable to demographic factors. Each group included 10 males and 10 females, all within the 18–24 age range. The only intended difference between groups was driving experience, which allows the dataset to isolate neural and behavioral effects related specifically to driving exposure.

3.2. Experimental Design

The protocol consisted of eight Go/No-Go blocks, each lasting 55 s, designed in PsychoPy (version 2023.2.3; https://www.psychopy.org/, accessed on 16 September 2025) and delivered via Pavlovia. Visual tasks used traffic symbols (“Stop” = Go; “Yield” = No-Go), while auditory tasks involved car-related sounds (“braking” = Go; “hazard lights sound” = No-Go). The “Cross” serves only as a transition marker between events and is not a Go/No-Go stimulus; no response should be made to it. Distractors included text messages and phone call simulations, selected to reflect realistic communication among university students and ensure ecological validity [18,19]. Detailed parameters for each experiment are provided in Table 2.
The full details of the experiments are publicly available on Pavlovia.org and can be accessed by searching for ‘Gonogo_youngdrivers’.
Experiments 1, 3, 5, and 7 functioned as transitional tasks in which participants passively observed visual stimuli or listened to auditory cues without responding. These experiments were included to familiarize participants with the experimental paradigm and maintain consistent timing between tasks. Although no behavioral responses were collected from these experiments, EEG data were recorded and are available in the dataset for completeness and potential supplementary analyses.
In all interactive experiments (Experiments 2, 4, 6, and 8), participants were instructed to respond to the Go cues (“Stop” and “Braking”) and to withhold responses to the No-Go cues (“Yield” and “Hazard lights”). This mapping was held constant throughout the experiment to ensure reproducibility and comparability across conditions. The event codes from the transitional experiments (cross) are included but flagged as non-task events to prevent unintended processing by future users. See Figure 1 for additional details.
Each EEG event marker corresponds to a unique stimulus–response pair; the convention shown in Table 3 applies to all interactive experiments.
These events are included in the .events metadata within the Zenodo release and documented in a separate events_codebook.csv file for machine readability.
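For analysis scripts, the Table 3 convention can be captured in a small lookup table. The Python sketch below encodes the trigger labels exactly as listed in Table 3; the helper function is illustrative and not part of the released codebook.

```python
# Stimulus-response convention from Table 3, expressed as a lookup table.
# Keys are the trigger labels stored with the events; values describe the
# task role of each stimulus in the interactive experiments (2, 4, 6, 8).
EVENT_CODES = {
    "Go_yield":   {"stimulus": "Stop sign",        "condition": "Go",         "response": "button press"},
    "Go_hazard":  {"stimulus": "Braking sound",    "condition": "Go",         "response": "button press"},
    "Nogo_stop":  {"stimulus": "Yield sign",       "condition": "No-Go",      "response": "none"},
    "Nogo_brake": {"stimulus": "Hazard lights",    "condition": "No-Go",      "response": "none"},
    "Cross":      {"stimulus": "Transition cross", "condition": "transition", "response": "none"},
}

def is_task_event(label: str) -> bool:
    """True for Go/No-Go stimuli; False for transition markers."""
    return EVENT_CODES.get(label, {}).get("condition") in {"Go", "No-Go"}
```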

3.3. EEG Acquisition

EEG signals were recorded using the BrainAccess MIDI device (16 channels, 250 Hz, 24-bit resolution) [22]. Participants wore a standard EEG cap, and channels were placed according to manufacturer recommendations. Auditory stimuli were presented via headphones, and visual stimuli on a 16-inch monitor at a 60 cm viewing distance. Figure 2 shows the BrainAccess MIDI setup and channel distribution.
The EEG workflow began with acquisition using the BrainAccess MIDI device. Channels were positioned according to the standard 10–20 configuration (Fp1, Fp2, F3, F4, C3, C4, P3, P4, O1, O2, F7, F8, T3, T4, T5, T6). Continuous monitoring via BrainAccess Board software version 1.0.3 ensured that impedance levels were maintained below 10 kΩ, and the session was repeated if noise or disconnection occurred. Once acquired, the data were converted to the .set format, and band-pass filtering (0.5–40 Hz) was applied. For artifact correction, Independent Component Analysis (ICA) was utilized. Finally, the data were segmented into epochs and event labeling was performed in EEGLAB.
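For users working outside EEGLAB, the 16-channel layout above can be attached as a standard montage in MNE-Python. The sketch below assumes the .db file has already been converted to .set with BrainAccess Board; the file path is illustrative, and the older temporal labels (T3/T4/T5/T6) are mapped to their modern 10–20 equivalents (T7/T8/P7/P8) so every channel resolves in the standard montage.

```python
import mne

# Older 10-20 temporal labels used above, mapped to modern equivalents.
OLD_TO_NEW = {"T3": "T7", "T4": "T8", "T5": "P7", "T6": "P8"}

# Assumes prior .db-to-.set conversion; the path is illustrative.
raw = mne.io.read_raw_eeglab("ID 001/subj-001_ses-001_task-02_eeg.set",
                             preload=True)

# Rename the older labels, then attach the standard 10-20 montage.
raw.rename_channels({o: n for o, n in OLD_TO_NEW.items() if o in raw.ch_names})
raw.set_montage(mne.channels.make_standard_montage("standard_1020"))
print(raw.info)
```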

3.4. Data Preprocessing and Curation

During data collection, recording was repeated whenever channel connectivity issues arose (e.g., due to hair interference). Incomplete or corrupted data were then excluded from the final analysis.
The raw files, which have a .db extension, should be converted to the .set format and imported into EEGLAB. The following preprocessing steps are recommended (a minimal MNE-Python sketch of this pipeline appears at the end of Section 4):
  • Filtered between 0.5 and 40 Hz.
  • Re-referenced to the average.
  • Cleaned using ICA for ocular and muscular artifact removal.
  • Epoched from −200 ms to 800 ms around each stimulus.
  • Annotated with Go/No-Go and distraction events.

4. Data Availability

The complete dataset is publicly available as Version 1.0, corresponding to this publication, on Zenodo (https://doi.org/10.5281/zenodo.17135621, accessed on 16 September 2025). It is distributed under the Creative Commons Attribution 4.0 International (CC-BY 4.0) license. All future updates will be versioned and documented in a changelog.md file. The Zenodo repository contains exclusively raw EEG data in the proprietary .db format (BrainAccess MIDI native format). No preprocessed data (.set files) are included in the current release. This approach maximizes flexibility for researchers who may prefer different preprocessing pipelines tailored to their specific research questions. The data have the following specifications:
  • File format: .db (BrainAccess MIDI).
  • Sampling rate: 250 Hz.
  • Bit resolution: 24-bit.
  • Number of channels: 16 (channel configuration as shown in Figure 2).
  • Event markers: Included as separate .txt files with precise latency information, stimulus type, and position data.
To facilitate reproducibility and reduce barriers to entry, we provide the following guidance:
  • File Conversion: Raw .db files should be converted to EEGLAB-compatible .set format using the BrainAccess Board software File Converter (version 2.0 or later).
  • Recommended Preprocessing Pipeline (using EEGLAB version 2021.0 or later):
    • Filtering: 4th-order Butterworth band-pass filter (0.5–40 Hz).
    • Referencing: Average reference (re-reference to the mean of all channels).
    • ICA Algorithm: Infomax ICA (runica) or AMICA, with default EEGLAB settings; minimum 30 min of data per participant recommended for optimal ICA decomposition.
    • Artifact Rejection: Manual inspection of independent components and removal of components with EOG/EMG characteristics.
    • Epoching: Time-locked to stimulus onset (−200 to 800 ms relative to Go/No-Go cues).
    • Baseline Correction: Mean baseline subtraction (−200 to 0 ms pre-stimulus window).
Raw data may contain physiological artifacts (eye blinks, muscle activity) and environmental noise. ICA-based artifact rejection is strongly recommended. Researchers should inspect data quality on a per-subject and per-channel basis, as channel contact issues (particularly with participants with longer hair) may occasionally necessitate retesting. Such quality-control notes are documented in the participants’ metadata file.
While our recommended pipeline uses EEGLAB, researchers are welcome to employ alternative preprocessing software (e.g., FieldTrip, MNE-Python, Brainstorm) adapted to the .set format following initial .db-to-.set conversion. The raw data and event markers contain sufficient information to reconstruct any standard preprocessing workflow.
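As a concrete illustration in one of the alternative tools named above, the following MNE-Python sketch reproduces the recommended pipeline: band-pass filtering, average referencing, Infomax ICA, and epoching with pre-stimulus baseline correction. It assumes the .db-to-.set conversion has been completed and that event annotations carry the Table 3 labels; file names are illustrative, and component exclusion is left to manual inspection, as recommended.

```python
import mne
from mne.preprocessing import ICA

# Illustrative path; assumes prior .db-to-.set conversion.
raw = mne.io.read_raw_eeglab("subj-001_ses-001_task-02_eeg.set", preload=True)

# 1) Band-pass 0.5-40 Hz. MNE defaults to an FIR design; pass
#    method="iir" to approximate the recommended Butterworth filter.
raw.filter(l_freq=0.5, h_freq=40.0)

# 2) Average reference.
raw.set_eeg_reference("average")

# 3) Infomax ICA; 15 components for 16 average-referenced channels
#    (the average reference reduces the data rank by one).
ica = ICA(n_components=15, method="infomax", random_state=97)
ica.fit(raw)
ica.exclude = []  # fill in after manual inspection of EOG/EMG components
ica.apply(raw)

# 4) Epoch -200 to 800 ms around Go/No-Go cues, with mean-baseline
#    correction over the pre-stimulus window; labels assumed from Table 3.
events, event_id = mne.events_from_annotations(raw)
task_ids = {k: v for k, v in event_id.items()
            if k in {"Go_yield", "Go_hazard", "Nogo_stop", "Nogo_brake"}}
epochs = mne.Epochs(raw, events, event_id=task_ids,
                    tmin=-0.2, tmax=0.8, baseline=(-0.2, 0.0), preload=True)
```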

5. Technical Validation

All EEG recordings were subjected to multiple validation procedures to ensure data integrity, signal quality, and reproducibility. During acquisition, each participant’s channel impedance was monitored, and trials were repeated when signal instability, excessive motion artifacts, or poor channel contact (particularly with long hair) were detected. The BrainAccess MIDI device, operating at 250 Hz with 24-bit resolution, demonstrated high reliability and no hardware-related data losses. Each dataset was visually inspected and verified for consistency across all eight experimental blocks before inclusion.
Data preprocessing followed standard EEG quality assurance protocols. Raw files were converted into .set format using BrainAccess Board software and imported into EEGLAB (v2022.1) for filtering (0.5–40 Hz), re-referencing to the average, and artifact rejection using independent component analysis (ICA). Epochs of −200 to +800 ms around each event were extracted to capture pre- and post-stimulus neural responses. All steps were documented to facilitate full reproducibility of the pipeline.
To confirm the physiological plausibility of the EEG signals, event-related potentials (ERPs) were computed for Go and No-Go conditions across the key interactive experiments (2, 4, 6, and 8). Consistent with established Go/No-Go paradigms, clear N2 and P3 components were observed in both drivers and non-drivers. Drivers exhibited stronger frontal-central N2 amplitudes and earlier P3 peaks during distraction conditions, suggesting greater neural efficiency in inhibitory control. In contrast, non-drivers showed delayed and more diffuse activation, particularly during auditory distraction, indicating higher cognitive load and less focused processing. These findings align with previous studies linking frontal theta and beta increases to attentional control and inhibitory mechanisms in distracted driving contexts [4,5,6,7,12,13,14,23].
Behavioral indicators further support the integrity of the dataset. Reaction time analyses revealed consistent trends across participants, with no anomalous or missing data. For drivers, distraction effects were transient—reaction times increased only during distraction—whereas for non-drivers, the impairment persisted beyond the distraction interval, demonstrating behavioral patterns consistent with the recorded ERP differences.
The combination of controlled task design, preprocessing, and consistent ERP outcomes validates the technical soundness of the dataset and ensures its suitability for future analyses in attention research, driver behavior modeling, and brain–computer interface development.
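The ERP check described here can be reproduced from the epochs produced by the Section 4 sketch. The MNE-Python snippet below averages Go and No-Go trials and overlays them at one channel; the event labels are assumed to follow Table 3, and F3 is chosen only as an example of a frontal site for inspecting N2/P3.

```python
import mne

# Average the preprocessed epochs (from the Section 4 sketch) per condition.
go = epochs[["Go_yield", "Go_hazard"]].average()
nogo = epochs[["Nogo_stop", "Nogo_brake"]].average()

# Overlay Go vs. No-Go ERPs at one frontal channel to inspect N2/P3.
mne.viz.plot_compare_evokeds({"Go": go, "No-Go": nogo}, picks="F3")
```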

6. User Notes

This dataset can be applied in multiple research domains:
  • Event-Related Potentials (ERP) analysis: Researchers can replicate N2 and P3 responses associated with inhibition and distraction [24].
  • Spectral dynamics: Studies can examine theta and beta increases in the frontal cortex during distraction, and alpha suppression in temporal regions [12,13,14,23].
  • Driver vs. non-driver comparisons: Enables testing whether neural differences are attributable to driving experience or general youth traits [1].
  • Machine learning: The structured dataset is suitable for training classifiers (e.g., SVM, LSTM) for distraction detection and BCI applications [25] (a minimal example follows below).
Because the stimuli replicate common real-world distractions (WhatsApp messages, phone calls), this dataset has ecological validity, supporting applied research in road safety and intelligent driver-assistance systems.
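As a starting point for the machine-learning use case above, the sketch below trains a support vector machine on theta- and beta-band power features extracted from the epochs of the Section 4 sketch. The binary labels are purely illustrative (Go vs. No-Go trials); a distraction-detection study would instead label epochs by experimental block.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def band_power(epochs, fmin, fmax):
    """Mean spectral power per epoch and channel in a frequency band."""
    psd = epochs.compute_psd(fmin=fmin, fmax=fmax)
    return psd.get_data().mean(axis=-1)  # shape: (n_epochs, n_channels)

# Theta and beta band power, motivated by the spectral findings above.
X = np.hstack([band_power(epochs, 4.0, 8.0),     # theta
               band_power(epochs, 12.0, 30.0)])  # beta

# Illustrative labels: Go (1) vs. No-Go (0) trials, from Table 3 labels.
go_ids = [epochs.event_id[k] for k in ("Go_yield", "Go_hazard")
          if k in epochs.event_id]
y = np.isin(epochs.events[:, 2], go_ids).astype(int)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print("5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```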

Supplementary Materials

The following supporting information can be downloaded at https://doi.org/10.5281/zenodo.17135621 (accessed on 16 September 2025).

Author Contributions

Conceptualization, Y.G.-R.; methodology, Y.G.-R.; software, L.G. and B.P.; validation, L.G. and B.P.; formal analysis, Y.G.-R.; investigation, L.G. and B.P.; resources, Y.G.-R.; data curation, L.G. and B.P.; writing—original draft preparation, Y.G.-R.; writing—review and editing, Y.G.-R.; visualization, Y.G.-R.; supervision, Y.G.-R.; project administration, Y.G.-R. All authors have read and agreed to the published version of the manuscript.

Funding

The APC was funded by Universidad Técnica Particular de Loja (1600).

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Ethics Committee of Universidad Técnica Particular de Loja (UTPL), Ecuador (protocol code 2024-06-INT-EO-RM-002; approval date: June 2024).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are openly available on Zenodo at https://doi.org/10.5281/zenodo.17135621 (Version 1.0) under a CC-BY 4.0 license.

Acknowledgments

During the preparation of this manuscript, the authors used ChatGPT (GPT-4 mini, OpenAI) to assist in improving the clarity, coherence, and overall readability of the text. The authors have carefully reviewed and edited all AI-assisted content and take full responsibility for the accuracy and integrity of this publication.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Goldsworthy, J.; Watling, C.; Rose, C.; Larue, G. The Effects of Distraction on Younger Drivers: A Neurophysiological Perspective. Appl. Ergon. 2024, 114, 104147. [Google Scholar] [CrossRef] [PubMed]
  2. Regan, M.; Lee, J.; Young, K. Driver Distraction: Theory, Effects, and Mitigation, 1st ed.; CRC Press: Boca Raton, FL, USA, 2008. [Google Scholar]
  3. Yeo, M.V.M.; Li, X.; Shen, K.; Wilder-Smith, E.P.V. Can SVM Be Used for Automatic EEG Detection of Drowsiness during Car Driving? Saf. Sci. 2009, 47, 115–124. [Google Scholar] [CrossRef]
  4. Shen, K.Q.; Li, X.P.; Ong, C.J.; Shao, S.Y.; Wilder-Smith, E.P.V. EEG-Based Mental Fatigue Measurement Using Multi-Class Support Vector Machines with Confidence Estimate. Clin. Neurophysiol. 2008, 119, 1524–1533. [Google Scholar] [CrossRef] [PubMed]
  5. Lal, S.K.L.; Craig, A. A Critical Review of the Psychophysiology of Driver Fatigue. Biol. Psychol. 2001, 55, 173–194. [Google Scholar] [CrossRef] [PubMed]
  6. Jap, B.T.; Lal, S.; Fischer, P.; Bekiaris, E. Using EEG Spectral Components to Assess Algorithms for Detecting Fatigue. Expert Syst. Appl. 2009, 36, 2352–2359. [Google Scholar] [CrossRef]
  7. Sonnleitner, A.; Treder, M.S.; Simon, M.; Willmann, S.; Ewald, A.; Buchner, A.; Schrauf, M. EEG Alpha Spindles and Prolonged Brake Reaction Times during Auditory Distraction in an On-Road Driving Study. Accid. Anal. Prev. 2014, 62, 110–118. [Google Scholar] [CrossRef] [PubMed]
  8. Wali, M.K.; Murugappan, A. Subtractive Fuzzy Classifier Based Driver Distraction Levels Classification Using EEG. J. Phys. Ther. Sci. 2013, 25, 1055–1058. [Google Scholar] [CrossRef] [PubMed]
  9. Rothkrantz, L.J.M.; Horlings, R.; Dharmawan, Z. Recognition of Emotional States of Car Drivers by EEG Analysis. Int. J. Neural Syst. 2009, 19, 119–128. [Google Scholar]
  10. Frasson, C.; Brosseau, P.O.; Tran, T.H.D. Virtual Environment for Monitoring Emotional Behaviour in Driving. In Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2014; Volume 8474, pp. 75–83. ISBN 9783319072203. [Google Scholar]
  11. Fan, X.A.; Bi, L.Z.; Chen, Z.L. Using EEG to Detect Drivers’ Emotion with Bayesian Networks. In Proceedings of the International Conference on Machine Learning and Cybernetics, ICMLC 2010, Qingdao, China, 11–14 July 2010; Volume 3, pp. 1177–1181. [Google Scholar]
  12. Lin, C.T.; Lin, H.Z.; Chiu, T.W.; Chao, C.F.; Chen, Y.C.; Liang, S.F.; Ko, L.W. Distraction-Related EEG Dynamics in Virtual Reality Driving Simulation. In Proceedings of the IEEE International Symposium on Circuits and Systems, Seattle, WA, USA, 18–21 May 2008; pp. 1088–1091. [Google Scholar]
  13. Kumar, S.P.; Selvaraj, J.; Krishnakumar, R.; Sahayadhas, A. Detecting Distraction in Drivers Using Electroencephalogram (EEG) Signals. In Proceedings of the 4th International Conference on Computing Methodologies and Communication, ICCMC 2020, Erode, India, 11–13 March 2020; pp. 635–639. [Google Scholar]
  14. Shi, C.; Yan, F.; Zhang, J.; Yu, H.; Peng, F.; Yan, L. Right Superior Frontal Involved in Distracted Driving. Transp. Res. Part. F Traffic Psychol. Behav. 2023, 93, 191–203. [Google Scholar] [CrossRef]
  15. Osborn, A.F.; Owens, D.A. Change Blindness: A Comparison of Selective Attention of Novice and Experienced Drivers. J. Vis. 2010, 10, 201. [Google Scholar] [CrossRef]
  16. Tao, S.; Deng, Y.; Jiang, Y. Differential Impact of Working Memory and Inhibitory Control on Distracted Driving Performance among Experienced and Inexperienced Drivers. In Proceedings of the 12th International Conference on Traffic and Logistic Engineering, ICTLE, Macau, China, 23–25 August 2024; pp. 17–22. [Google Scholar]
  17. Ba, Y.; Zhang, W.; Salvendy, G.; Cheng, A.S.K.; Ventsislavova, P. Assessments of Risky Driving: A Go/No-Go Simulator Driving Task to Evaluate Risky Decision-Making and Associated Behavioral Patterns. Appl. Ergon. 2016, 52, 265–274. [Google Scholar] [CrossRef] [PubMed]
  18. Sodnik, J.; Dicke, C.; Tomažič, S.; Billinghurst, M. A User Study of Auditory versus Visual Interfaces for Use While Driving. Int. J. Hum. Comput. Stud. 2008, 66, 318–332. [Google Scholar] [CrossRef]
  19. Karthaus, M.; Wascher, E.; Getzmann, S. Effects of Visual and Acoustic Distraction on Driving Behavior and EEG in Young and Older Car Drivers: A Driving Simulation Study. Front. Aging Neurosci. 2018, 10, 420. [Google Scholar] [CrossRef] [PubMed]
  20. Delorme, A.; Makeig, S. EEGLAB: An Open Source Toolbox for Analysis of Single-Trial EEG Dynamics Including Independent Component Analysis. J. Neurosci. Methods 2004, 134, 9–21. [Google Scholar] [CrossRef] [PubMed]
  21. Mehrabian, A. Pleasure-Arousal-Dominance: A General Framework for Describing and Measuring Individual Differences in Temperament. Curr. Psychol. 1996, 14, 261–292. [Google Scholar] [CrossRef]
  22. NEUROTecnology. BrainAccess MIDI Electroencephalograph. Available online: https://www.brainaccess.ai/wp-content/uploads/downloads/UserManualMIDI.pdf (accessed on 12 December 2023).
  23. Li, G.; Wu, X.; Eichberger, A.; Green, P.; Olaverri-Monreal, C.; Yan, W.; Qin, Y.; Li, Y. Drivers’ EEG Responses to Different Distraction Tasks. Automot. Innov. 2023, 6, 20–31. [Google Scholar] [CrossRef]
  24. Hsieh, S.; Wu, M.; Tang, C.H. Adaptive Strategies for the Elderly in Inhibiting Irrelevant and Conflict No-Go Trials While Performing the Go/No-Go Task. Front. Aging Neurosci. 2016, 7, 243. [Google Scholar] [CrossRef] [PubMed]
  25. Wali, M.K.; Murugappan, M.; Ahmmad, B. Wavelet Packet Transform Based Driver Distraction Level Classification Using EEG. Math. Probl. Eng. 2013, 2013, 297587. [Google Scholar] [CrossRef]
Figure 1. Experimental design and sequence of visual and auditory Go/No-Go tasks used for EEG data collection. Each experiment integrates specific sensory modalities and distraction conditions (message or phone call) to simulate real-world driving attention demands.
Figure 2. BrainAccess MIDI electroencephalograph used for EEG acquisition. The 16-channel layout follows the manufacturer’s 10–20 configuration. The image shows the channel placement, amplifier unit, and signal acquisition interface.
Table 1. Demographic and behavioral characteristics of participants by sex and driving experience. Both groups (drivers and non-drivers) were matched for age and sex, ensuring comparability across conditions.

| Characteristic | Licensed drivers (male) | Licensed drivers (female) | Non-drivers (male) | Non-drivers (female) |
|---|---|---|---|---|
| Age (years) | 18–24 (Ave: 20.9, SD: 2.03) | 18–21 (Ave: 20.1, SD: 0.97) | 19–24 (Ave: 21.3, SD: 1.78) | 18–23 (Ave: 20.5, SD: 1.44) |
| Driving experience (years) | 0–6 (Ave: 2.1, SD: 1.76) | 1–3 (Ave: 1.7, SD: 0.67) | n/a | n/a |
| Mobile use per day (%) | ≤1 h: 0%; 1–2 h: 0%; 2–3 h: 30%; >3 h: 70% | ≤1 h: 0%; 1–2 h: 9%; 2–3 h: 36%; >3 h: 55% | ≤1 h: 8%; 1–2 h: 0%; 2–3 h: 42%; >3 h: 50% | ≤1 h: 0%; 1–2 h: 9%; 2–3 h: –; >3 h: 91% |
| Frequency of mobile use while driving (%) | Sometimes: 90%; Frequently: 10% | Sometimes: 91%; Frequently: 9% | n/a | n/a |
| Emotions and mood states before the experiments (%) | Calm: 50%; Happy: 20%; Sleepy: 10%; Delighted: 10%; Angry: 10%; all others: 0% | Calm: 55%; Tense: 9%; Excited: 9%; Depressed: 9%; Sleepy: 9%; Pleased: 9%; all others: 0% | Calm: 42%; Tense: 33%; Happy: 17%; Distressed: 8%; all others: 0% | Calm: 36%; Tense: 36%; Excited: 18%; Sleepy: 10%; all others: 0% |
Table 2. Summary of the eight experimental tasks in the Go/No-Go paradigm, detailing stimulus type, task requirements, and distraction conditions. Abbreviations: “No-Go” = inhibition cue; “Go” = response cue. Latency values are reported in milliseconds (ms).

| Exp. | Stimuli type | Stimuli presented | Task requirement | Distraction condition | Description |
|---|---|---|---|---|---|
| 1 | Visual | Stop sign (1 s), Yield sign (1 s), Cross (1 s) | No response required | None | Participants view alternating “Stop” and “Yield” signals with interspersed “cross” symbols to focus attention at the screen center. No interaction is needed. |
| 2 | Visual + Interactive | Stop sign (1 s), Yield sign (1 s), Cross (1 s) | Press space bar when “Stop” appears | None | Participants press the space bar when they see the “Stop” signal. The “cross” symbol centers attention between signals. |
| 3 | Visual + Distraction | Stop sign (1 s), Yield sign (1 s), Cross (1 s) | No response required | Text message notification (sound + visual) | Participants view “Stop” and “Yield” signals while receiving 3 visual and auditory message notifications. No interaction required. |
| 4 | Visual + Interactive + Distraction | Stop sign (1 s), Yield sign (1 s), Cross (1 s) | Press space bar when “Stop” appears | Incoming call sound + 6 s conversation | Participants press the space bar when they see “Stop” (Go stimulus) and experience an incoming call notification followed by a short conversation sound. |
| 5 | Auditory | Braking sound (1.5 s), Hazard lights sound (1.5 s), Pause (1 s) | No response required | None | Blindfolded participants listen to alternating “braking” and “hazard lights” sounds, separated by 1 s pauses. No interaction required. |
| 6 | Auditory + Interactive | Braking sound (1.5 s), Hazard lights sound (1.5 s), Pause (1 s) | Press space bar when hearing “braking” sound | None | Blindfolded participants press the space bar when they hear the “braking” sound. |
| 7 | Auditory + Distraction | Braking sound (1.5 s), Hazard lights sound (1.5 s), Pause (1 s) | No response required | Message notification sound | Blindfolded participants listen to the “braking” and “hazard lights” sounds with intermittent WhatsApp message notifications. No interaction required. |
| 8 | Auditory + Interactive + Distraction | Braking sound (1.5 s), Hazard lights sound (1.5 s), Pause (1 s) | Press space bar when hearing “braking” sound | Incoming call notification + 6 s conversation | Blindfolded participants press the space bar when they hear the “braking” sound, while experiencing an incoming call notification and brief conversation sound. |
Table 3. Event code and stimulus-response convention for interactive experiments.

| Stimulus | Go/No-Go status | Expected response | Trigger label |
|---|---|---|---|
| Stop sign | Go | Button press | Go_yield |
| Braking sound | Go | Button press | Go_hazard |
| Yield sign | No-Go | No response | Nogo_stop |
| Hazard lights | No-Go | No response | Nogo_brake |
| Transition cross | Transition | – | Cross |