ERP Signatures of Stimulus Choice in Gaze-Independent BCI Communication
Abstract
1. Introduction
2. Materials and Methods
2.1. Participants
2.2. Stimuli
2.3. Procedure
2.4. EEG Recordings and Analysis
2.5. Source Reconstruction
3. Results
3.1. Behavioural Data
3.2. Electrophysiological Results
3.3. Source Localization Results (swLORETA)
4. Discussion
5. Study Limitations
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| Abbreviation | Meaning |
|---|---|
| ALS | Amyotrophic Lateral Sclerosis |
| AUC | Area Under the Curve |
| ANOVA | Analysis of Variance |
| ASA | Advanced Source Analysis |
| BCI | Brain–Computer Interface |
| BEM | Boundary Element Model |
| CAR | Common Average Reference |
| CNV | Contingent Negative Variation |
| EBL | Emotional Body Language |
| EEG | Electroencephalogram |
| EOG | Electro-oculogram |
| ERP | Event-Related Potential |
| ISI | Inter-stimulus Interval |
| ITI | Inter-trial Interval |
| ITR | Information Transfer Rate |
| LIS | Locked-in Syndrome |
| LORETA | Low-Resolution Electromagnetic Tomography |
| MRI | Magnetic Resonance Imaging |
| RSVP | Rapid Serial Visual Presentation |
| SE | Standard Error |
| SSVEP | Steady-State Visual Evoked Potential |
References
1. Herbert, C. Analyzing and computing humans by means of the brain using Brain-Computer Interfaces. Front. Hum. Neurosci. 2024, 17, 1286895.
2. Arns, M.; Sokhadze, E.; Birbaumer, N. Neurofeedback and Brain-Machine Interfaces: Where Are We Now? Appl. Psychophysiol. Biofeedback 2025.
3. Chaudhary, U.; Mrachacz-Kersting, N.; Birbaumer, N. Neuropsychological and neurophysiological aspects of brain-computer-interface (BCI) control in paralysis. J. Physiol. 2021, 599, 2351–2359.
4. Wolpaw, J.R.; Birbaumer, N.; McFarland, D.J.; Pfurtscheller, G.; Vaughan, T.M. Brain-computer interfaces for communication and control. Clin. Neurophysiol. 2002, 113, 767–791.
5. Farwell, L.A.; Donchin, E. Talking off the top of your head: Toward a mental prosthesis utilizing event-related brain potentials. Electroencephalogr. Clin. Neurophysiol. 1988, 70, 510–523.
6. Townsend, G.; LaPallo, B.K.; Boulay, C.B.; Krusienski, D.J.; Frye, G.E.; Hauser, C.K.; Schwartz, N.E.; Vaughan, T.M.; Wolpaw, J.R.; Sellers, E.W. A novel P300-based brain-computer interface stimulus presentation paradigm: Moving beyond rows and columns. Clin. Neurophysiol. 2010, 121, 1109–1120.
7. Treder, M.S.; Blankertz, B. (C)overt attention and visual speller design in an ERP-based brain-computer interface. Behav. Brain Funct. 2010, 6, 28.
8. Acqualagna, L.; Blankertz, B. Gaze-independent BCI-spelling using rapid serial visual presentation (RSVP). Clin. Neurophysiol. 2013, 124, 901–908.
9. Bakardjian, H.; Tanaka, T.; Cichocki, A. Emotional faces boost up steady-state visual responses for brain-computer interface. Neuroreport 2011, 22, 121–125.
10. Kuś, R.; Duszyk, A.; Milanowski, P.; Łabęcki, M.; Bierzyńska, M.; Radzikowska, Z.; Michalska, M.; Zygierewicz, J.; Suffczyński, P.; Durka, P.J. On the quantification of SSVEP frequency responses in human EEG in realistic BCI conditions. PLoS ONE 2013, 8, e77536.
11. Pang, Z.; Zhang, R.; Li, M.; Li, Z.; Cui, H.; Chen, X. SSVEP-based BCI using ultra-low-frequency and high-frequency peripheral flickers. J. Neural Eng. 2025, 22, 036032.
12. Siribunyaphat, N.; Tohkhwan, N.; Punsawad, Y. Investigation of Personalized Visual Stimuli via Checkerboard Patterns Using Flickering Circles for SSVEP-Based BCI System. Sensors 2025, 25, 4623.
13. Pronina, A.; Grigoryan, R.; Makarova, A.; Kaplan, A. Spatial Attention Effects on P300 BCI Performance: ERP and Eye-Tracking Study. Mosc. Univ. Biol. Sci. Bull. 2023, 78, 255–262.
14. Brunner, P.; Joshi, S.; Briskin, S.; Wolpaw, J.R.; Bischof, H.; Schalk, G. Does the 'P300' speller depend on eye gaze? J. Neural Eng. 2010, 7, 056013.
15. Leoni, J.; Tanelli, M.; Strada, S.; Brusa, A.; Proverbio, A.M. Single-Trial Stimuli Classification from Detected P300 for Augmented Brain-Computer Interface: A Deep Learning Approach. Mach. Learn. Appl. 2022, 275, 10039.
16. Colafiglio, T.; Lombardi, A.; Di Noia, T.; De Bonis, M.L.N.; Narducci, F.; Proverbio, A.M. Machine learning classification of motivational states: Insights from EEG analysis of perception and imagery. Expert Syst. Appl. 2025, 275, 127076.
17. Della Vedova, G.; Proverbio, A.M. Neural signatures of imaginary motivational states: Desire for music, movement and social play. Brain Topogr. 2024, 37, 806–825.
18. Proverbio, A.M.; Pischedda, F. Measuring brain potentials of imagination linked to physiological needs and motivational states. Front. Hum. Neurosci. 2023, 17, 1146789.
19. Leoni, J.; Strada, S.C.; Tanelli, M.; Proverbio, A.M. MIRACLE: MInd ReAding CLassification Engine. IEEE Trans. Neural Syst. Rehabil. Eng. 2023, 31, 3212–3222.
20. Costa, F.R.L.; Iáñez, E.; Azorín, J.M.; Patow, G. Classify four imagined objects with EEG signals. Evol. Intell. 2022, 15, 1657–1666.
21. Nemrodov, D.; Niemeier, M.; Patel, A.; Nestor, A. The Neural Dynamics of Facial Identity Processing: Insights from EEG-Based Pattern Analysis and Image Reconstruction. eNeuro 2018, 5, ENEURO.0358-17.2018.
22. Cudlenco, N.; Popescu, N.; Leordeanu, M. Reading into the mind's eye: Boosting automatic visual recognition with EEG signals. Neurocomputing 2020, 386, 281–292.
23. Krusienski, D.J.; Sellers, E.W.; McFarland, D.J.; Vaughan, T.M.; Wolpaw, J.R. Toward enhanced P300 speller performance. J. Neurosci. Methods 2008, 167, 15–21.
24. Riccio, A.; Leotta, F.; Bianchi, L.; Aloise, F.; Zickler, C.; Hoogerwerf, E.J.; Kübler, A.; Mattia, D.; Cincotti, F. Workload measurement in a communication application operated through a P300-based brain–computer interface. J. Neural Eng. 2011, 8, 025028.
25. Spüler, M. A Brain-Computer Interface (BCI) system to use arbitrary Windows applications by directly controlling mouse and keyboard. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 1087–1090.
26. Neuper, C.; Pfurtscheller, G. Evidence for distinct beta resonance frequencies in human EEG related to specific sensorimotor cortical areas. Clin. Neurophysiol. 2001, 112, 2084–2097.
27. Pacheco, T.B.F.; Oliveira Rego, I.A.; Campos, T.F.; Cavalcanti, F.A.D.C. Brain activity during a lower limb functional task in a real and virtual environment: A comparative study. NeuroRehabilitation 2017, 40, 391–400.
28. Liu, Y.; Zhou, Z.; Hu, D. Gaze independent brain-computer speller with covert visual search tasks. Clin. Neurophysiol. 2011, 122, 1127–1136.
29. Proverbio, A.M.; Pischedda, F. Validation of a Pictionary-based communication tool for assessing individual needs and motivational states in locked-in patients: P.A.I.N. set. Front. Cogn. 2023.
30. Oostenveld, R.; Praamstra, P. The five percent electrode system for high-resolution EEG and ERP measurements. Clin. Neurophysiol. 2001, 112, 713–719.
31. Pascual-Marqui, R.D. Standardized low-resolution brain electromagnetic tomography (sLORETA): Technical details. Methods Find. Exp. Clin. Pharmacol. 2002, 24 (Suppl. D), 5–12.
32. Palmero-Soler, E.; Dolan, K.; Hadamschek, V.; Tass, P.A. swLORETA: A novel approach to robust source localization and synchronization tomography. Phys. Med. Biol. 2007, 52, 1783–1800.
33. Zanow, F.; Knösche, T.R. ASA–Advanced Source Analysis of continuous and event-related EEG/MEG signals. Brain Topogr. 2004, 16, 287–290.
34. Lamm, C.; Windischberger, C.; Leodolter, U.; Moser, E.; Bauer, H. Evidence for premotor cortex activity during dynamic visuospatial imagery from single-trial functional magnetic resonance imaging and event-related slow cortical potentials. Neuroimage 2001, 14, 268–283.
35. Hinterberger, T.; Schmidt, S.; Neumann, N.; Mellinger, J.; Blankertz, B.; Curio, G.; Birbaumer, N. Brain-computer communication and slow cortical potentials. IEEE Trans. Biomed. Eng. 2004, 51, 1011–1018.
36. Birbaumer, N.; Hinterberger, T.; Kübler, A.; Neumann, N. The thought-translation device (TTD): Neurobehavioral mechanisms and clinical outcome. IEEE Trans. Neural Syst. Rehabil. Eng. 2003, 11, 120–123.
37. Becerra-Casillas, O.A.; Diaz-Lozano, K.A.; Galvan-Guerrero, H.M.; Huidobro, N.; Romo-Vazquez, R.; Treviño, M.; Osuna-Carrasco, P.; Toro-Castillo, M.D.C.; de la Torre-Valdovinos, B. Temporal downscaling of movement reveals duration-dependent modulation of motor preparatory potentials in humans. Neuroscience 2025, 583, 157–170.
38. Hirose, S.; Nambu, I.; Naito, E. Cortical activation associated with motor preparation can be used to predict the freely chosen effector of an upcoming movement and reflects response time: An fMRI decoding study. Neuroimage 2018, 183, 584–596.
39. Bares, M.; Nestrasil, I.; Rektor, I. The effect of response type (motor output versus mental counting) on the intracerebral distribution of the slow cortical potentials in an externally cued (CNV) paradigm. Brain Res. Bull. 2007, 71, 428–435.
40. Gómez, C.M.; Delinte, A.; Vaquero, E.; Cardoso, M.J.; Vázquez, M.; Crommelinck, M.; Roucoux, A. Current source density analysis of CNV during temporal gap paradigm. Brain Topogr. 2001, 13, 149–159.
41. Gómez, C.M.; Marco, J.; Grau, C. Preparatory visuo-motor cortical network of the contingent negative variation estimated by current density. Neuroimage 2003, 20, 216–224.
42. Beck, S.; Houdayer, E.; Richardson, S.P.; Hallett, M. The role of inhibition from the left dorsal premotor cortex in right-sided focal hand dystonia. Brain Stimul. 2009, 2, 208–214.
43. Rushworth, M.F.; Johansen-Berg, H.; Gobel, S.M.; Devlin, J.T. The left parietal and premotor cortices: Motor attention and selection. Neuroimage 2003, 20 (Suppl. 1), S89–S100.
44. Forstmann, B.U.; Wolfensteller, U.; Derrfuss, J.; Neumann, J.; Brass, M.; Ridderinkhof, K.R.; von Cramon, D.Y. When the choice is ours: Context and agency modulate the neural bases of decision-making. PLoS ONE 2008, 3, e1899.
45. Deppe, M.; Schwindt, W.; Kugel, H.; Plassmann, H.; Kenning, P. Nonlinear responses within the medial prefrontal cortex reveal when specific implicit information influences economic decision making. J. Neuroimaging 2005, 15, 171–182.
46. van den Berg, F.E.; Swinnen, S.P.; Wenderoth, N. Involvement of the primary motor cortex in controlling movements executed with the ipsilateral hand differs between left- and right-handers. J. Cogn. Neurosci. 2011, 23, 3456–3469.
47. Viaro, R.; Bonazzi, L.; Maggiolini, E.; Franchi, G. Cerebellar Modulation of Cortically Evoked Complex Movements in Rats. Cereb. Cortex 2017, 27, 3525–3541.
48. Molenberghs, P.; Cunnington, R.; Mattingley, J.B. Brain regions with mirror properties: A meta-analysis of 125 human fMRI studies. Neurosci. Biobehav. Rev. 2012, 36, 341–349.
49. Friman, O.; Volosyak, I.; Gräser, A. Multiple channel detection of steady-state visual evoked potentials for brain-computer interfaces. IEEE Trans. Biomed. Eng. 2007, 54, 742–750.
50. Volosyak, I.; Cecotti, H.; Valbuena, D.; Gräser, A. Evaluation of the Bremen SSVEP based BCI in real world conditions. In Proceedings of the 2009 IEEE International Conference on Rehabilitation Robotics, Kyoto, Japan, 23–26 June 2009; pp. 322–331.
51. Allison, B.Z.; Kübler, A.; Jin, J. 30+ years of P300 brain-computer interfaces. Psychophysiology 2020, 57, e13569.






| Study Type/Paradigm | Key References | Stimulus Type & Semantics | Response Mode | Gaze Constraint | EEG/ERP Data | Population | Decoding AI Algorithm | Key Features |
|---|---|---|---|---|---|---|---|---|
| Classic P300 Speller | Farwell & Donchin [5]; Krusienski et al. [23]; Townsend et al. [6]; Treder & Blankertz [7]; Riccio et al. [24] | Alphanumeric grid; arbitrary symbols | Overt or covert selection via attention | Free gaze | P3b (300–600 ms) | Healthy, ALS/LIS | Yes—classification accuracy, ITR | Communication via character-based target detection |
| RSVP-based BCI | Acqualagna & Blankertz [8]; Spüler [25] | Rapid serial visual presentation of words/pictures | Covert attention; button press | Partially constrained | P3a/P3b | Healthy, ALS | Yes—AUC, ITR | Sequential single-target detection |
| Motor Imagery BCIs | Wolpaw et al. [4]; Neuper & Pfurtscheller [26]; Chaudhary et al. [3]; Pacheco et al. [27] | Abstract cues or limb icons | Motor imagery (hand/foot) | Free gaze | CNV, μ/β ERD/ERS | Healthy, ALS | Yes—classifier accuracy | Continuous motor imagery control paradigms |
| Gaze-independent/Covert attention paradigms | Brunner et al. [14]; Riccio et al. [24]; Liu et al. [28] | Semantic or affective symbols; sequential character groups | Covert attention, sometimes button press | Fixed gaze/gaze-independent | P300 variants | Healthy, LIS | Partial | Attention-based BCIs with minimal eye movement |
| Present study | — | Motivational pictograms (“PAIN Pictionary”) representing need states | Motor imagery only (no overt response) | Strict central fixation (no gaze shifts) | Joint P300–CNV–P600 modulation | Healthy (feasibility step) | No decoding in current study (planned pipeline) | First demonstration of gaze-independent pictogram choice indexed by joint ERP markers |
| ERPs to Pictograms—Mean Area Amplitude Values (µV) | | | | | | |
|---|---|---|---|---|---|---|
| Category | Hem. | Mean Area | SE | −95% | +95% | N |
| P300 (450–650 ms) | | | | | | |
| Target | | −0.913 | 1.476 | −3.960 | 2.134 | 25 |
| Non-Target | | −3.543 | 1.057 | −5.724 | −1.362 | 25 |
| Early CNV (450–750 ms) | | | | | | |
| Target | | −5.266 | 0.700 | −6.711 | −3.821 | 25 |
| Non-Target | | −4.279 | 0.787 | −5.904 | −2.653 | 25 |
| Late CNV (2250–2750 ms) | | | | | | |
| Target | Left | −1.497 | 1.275 | −4.128 | 1.134 | 25 |
| Target | Right | 0.336 | 1.225 | −2.192 | 2.8642 | 25 |
| Non-Target | Left | −0.566 | 1.176 | −2.99 | 1.8616 | 25 |
| Non-Target | Right | −0.613 | 1.362 | −3.423 | 2.197 | 25 |

| ERPs to Response Prompts—Mean Area Amplitude Values (µV) | | | | | | |
|---|---|---|---|---|---|---|
| Category | Hem. | Mean Area | SE | −95% | +95% | N |
| P600 (600–800 ms) | | | | | | |
| Target | | 2.585 | 1.017 | 0.486 | 4.684 | 25 |
| Non-Target | | 1.193 | 1.195 | −1.273 | 3.658 | 25 |
| EARLY CNV TO TARGET PICTOGRAMS (450–750 ms) | | | | | |
|---|---|---|---|---|---|
| Magn. | Hem. | Lobe | Gyrus | BA | Functional Correlates |
| 2.435 | L | F | Superior Frontal | 10 | Decision making |
| 2.431 | L | T | Superior Temporal | 22/38/20/37/18 | Visual attention (Face & Body) |
| 2.158 | R | F | Superior Frontal | 10 | Decision making |
| 1.628 | R | F | Middle Frontal | 46 | Selective attention |
| 1.437 | L | F | Superior Frontal | 6 | Premotor (Right hand) |
| 1.411 | R | F | Precentral | 6 | Premotor (Left hand) |
| 1.383 | R | T | Inferior/Middle Temporal | 20/21/37 | Visual attention |
| 1.02 | L | P | Postcentral | 2 | Somatosensory |
| 0.95 | R | Cereb | Post. Lobe, Declive | / | Motor preparation |
| LATE CNV TO TARGET PICTOGRAMS (2250–2750 ms) | | | | | |
|---|---|---|---|---|---|
| Magn. | Hem. | Lobe | Gyrus | BA | Functional Correlates |
| 4.787 | R | F | Superior/Middle Frontal | 10 | Decision making |
| 2.937 | L | T | Middle Temporal | 21 | Visual attention |
| 2.705 | R | T | Middle/Superior Temporal | 20/22/42/37 | Attention, EBL, Motivation |
| 2.218 | R | P | Supramarginal | 40 | Mirror neuron/embodiment |
| 1.992 | L | F | Middle Frontal | 9 | Decision making |
| 1.837 | L | O | Cuneus | 18/38 | Visual processing (Body) |
| 1.264 | L | F | Superior Frontal | 6 | Premotor (right hand) |
| 1.21 | R | Cereb | Post. Lobe, Declive | / | Motor preparation |
| 1.191 | L | F | Superior Frontal | 6 | Premotor (right hand) |
| 1.173 | R | F | Precentral | 4 | M1, Motor command (BCI) |
| 1.057 | R | F | Superior Frontal | 8 | Attention (FEF) |
| 0.878 | R | Limbic | Cingulate | 24 | Empathy, Motivation |
| P600 TO RESPONSE PROMPTS (600–800 ms) | | | | | |
|---|---|---|---|---|---|
| Magn. | Hem. | Lobe | Gyrus | BA | Functional Correlates |
| 1.484 | R | F | Medial Frontal | 6 | MNS—Motor imagery and preparation, embodiment |
| 1.381 | R | F | Medial Frontal | 10 | Decision making |
| 1.133 | R | O | Fusiform | 20/37 | Occipital body area (Hands) |
| 1.105 | L | F | Middle Frontal | 46 | Attention |
| 1.081 | L | F | Superior Frontal | 10 | Decision making |
| 0.922 | R | F | Middle Frontal | 47 | Motivation/Crave |
| 0.875 | R, L | Basal Ganglia | Globus Pallidus | / | Craving; Motivation; Reward |
| 0.809 | L, R | P | Inferior Parietal | 40 | MNS—Motor imagery, embodiment |
| 0.793 | L | Limbic | Uncus | 36 | Craving |
| 0.752 | L | T | Superior Temporal | 41 | Inner speech |
| 0.727 | L | F | Inferior Frontal | 47 | Motivation & Reward |
| 0.657 | L | O | Middle Temporal | 22 | EBL/Motivation |
| 0.543 | R | Sublobar | Insula | 13 | Craving |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Proverbio, A.M.; Dishi, Y. ERP Signatures of Stimulus Choice in Gaze-Independent BCI Communication. Appl. Sci. 2025, 15, 11888. https://doi.org/10.3390/app152211888

