Review

From Neurons to Networks: A Holistic Review of Electroencephalography (EEG) from Neurophysiological Foundations to AI Techniques

by Christos Kalogeropoulos 1, Konstantinos Theofilatos 2 and Seferina Mavroudi 3,*
1 Department of Mathematics, University of Patras, 265 04 Patras, Greece
2 School of Cardiovascular and Metabolic Medicine & Sciences, King’s College London, London WC2R 2LS, UK
3 Department of Nursing, School of Health Rehabilitation Sciences, University of Patras, 263 34 Patras, Greece
* Author to whom correspondence should be addressed.
Signals 2026, 7(1), 17; https://doi.org/10.3390/signals7010017
Submission received: 26 December 2025 / Revised: 20 January 2026 / Accepted: 25 January 2026 / Published: 16 February 2026

Abstract

Electroencephalography (EEG) has transitioned from a subjective observational method into a data-intensive analytical field that utilises sophisticated algorithms and mathematical models. This review provides a holistic foundation by detailing the neurophysiological basis, recording techniques, and applications of EEG before providing a rigorous examination of traditional and modern analytical pillars. Statistical and Time-Series Analysis, Spectral and Time-Frequency Analysis, Spatial Analysis and Source Modelling, Connectivity and Network Analysis, and Nonlinear and Chaotic Analysis are explored. Afterwards, while acknowledging the historical role of Machine Learning (ML) and Deep Learning (DL) architectures, such as Support Vector Machines (SVMs) and Convolutional Neural Networks (CNNs), this review shifts the primary focus toward current state-of-the-art Artificial Intelligence (AI) trends. We place emphasis on the emergence of Foundation Models, including Large Language Models (LLMs) and Large Vision Models (LVMs), adapted for high-dimensional neural sequences. Finally, we explore the integration of Generative AI for data augmentation and review Explainable AI (XAI) frameworks designed to bridge the gap between “black-box” decoding and clinical interpretability. We conclude that the next generation of EEG analysis will likely converge into Neuro-Symbolic architectures, synergising the massive generative power of foundation models with the rigorous, rule-based interpretability of classical signal theory.

1. Introduction

Electroencephalography (EEG) has evolved substantially over the years. The incorporation of mathematical tools and computational models has enabled researchers to gain profound insight into brain function, and its applications now encompass a wider array of fields. The study of biomedical signals draws upon multiple disciplines such as biology, biochemistry, neuroscience, medicine, engineering, mathematics, and computer science, rendering it one of the most fascinating areas of research. From a biological and neuroscience point of view, EEG signals elucidate neuronal dynamics that can prove pivotal in the diagnosis and treatment of many neuropsychiatric conditions. From a mathematical and signal processing point of view, they represent a highly complex and information-rich class of signals that can be challenging to interpret. This is because they are intricate by nature: they result from the activity of millions of neurons, which creates a non-stationary, nonlinear, and temporally dynamic signal [1]. In addition, the useful signal is often submerged in a highly noisy background originating from various sources. For this reason, sophisticated pre-processing techniques are considered imperative to isolate brain activity [2]. Finally, the relationship between the observed electrical potentials on the head and the underlying neurophysiological processes is neither direct nor simple, as the signal has a complex spatial topography and is a multi-faceted outcome derived from the interactions of the brain sources [3].
These properties make clinical analysis vulnerable to subjectivity and require advanced computing tools for the reliable and quantitative extraction of diagnostically important information. For these reasons, in recent years, many researchers have been working to advance the methods for processing biomedical signals [4,5,6,7,8]. Basic descriptive statistics provide essential baselines for artefact detection and signal quality assessment, while spectral and time-frequency analysis methods allow for the precise characterisation of oscillatory power dynamics across different brain states [9]. Furthermore, time series analysis techniques [10] and advanced statistical methods [11] are utilised to model the temporal evolution of the signal for forecasting and feature extraction. To map these temporal dynamics to anatomical substrates, spatial analysis and source modelling techniques [12] are used. Connectivity and network analysis employs graph theoretical metrics to quantify the functional integration and information flow between these distributed regions [13]. Additionally, recognising the brain’s complex dynamical nature, nonlinear and chaotic analysis methods are increasingly applied to detect pathological changes in signal complexity [14]. However, the current state of the field is witnessing a paradigmatic shift from these traditional methods toward “Large EEG Models” and Generative AI [15]. Foundation models such as LaBraM [16] and Gram [17] leverage massive pre-training on thousands of hours of data to achieve universal cross-subject generalisation, while diffusion models [18] enable the direct reconstruction of visual stimuli from neural signals. Beyond these frameworks, contemporary research is increasingly adopting large-scale vision and language architectures to identify high-level semantic patterns within neural time-series data.
This evolution encompasses the integration of explainable AI and self-supervised learning to ensure that these sophisticated models remain both transparent and capable of learning from vast, unlabelled clinical datasets [19,20].
Consequently, the objective of this review is to provide a formalised, integrative framework for the study of EEG. Rather than a general survey, this work offers a methodological systematisation that uses the mathematical tools of signal processing as its core. By situating these analytical methods within the context of biophysical origins, recording techniques and applications, the manuscript provides researchers with a foundational roadmap. This approach is intended to equip the reader with a broader perspective of the field’s interconnected layers, serving as a robust starting point for advancing their own research in EEG analysis and application. The discussion commences with the biophysical principles and recording techniques that govern neural signal generation. It then explores clinical applications and essential signal characteristics before transitioning into traditional mathematical frameworks. This analysis starts with statistical and time-series methods, progresses through spectral and time-frequency analysis, and moves into spatial source modelling. Then, the review examines the brain as a complex system through connectivity and network analysis alongside nonlinear and chaotic measures. Subsequently, the focus shifts to contemporary state-of-the-art developments involving foundation models and generative AI. Finally, the work culminates in a proposal for neuro-symbolic architectures as the next evolution in interpretable neural decoding.

2. Biophysical Principles and Neurophysiological Basis

EEG records the temporal changes in the brain’s electric field, which arise from the sum of extracellular currents associated with the postsynaptic potentials of neurons. The recorded signal is a result not only of endogenous neuronal activity, but also of the electrical properties of the tissues (brain, bone, skin), which act as a volume conductor, as well as the orientation of the sources with respect to the surface of the head. The human brain consists of tens of billions of neurons and glial cells. The basis of electrical activity is the membrane potential (V_m ≈ −70 mV) [21], which is due to the difference in the concentrations of ions (K⁺, Na⁺, Cl⁻, Ca²⁺) inside and outside the membrane and is regulated by active pumps (e.g., the Na⁺/K⁺-ATPase) and passive ion channels. The action potential is a rapid change in this membrane voltage that transmits information along the axon to the neuronal terminals.
The Hodgkin–Huxley model [22,23,24] is a Nobel Prize-winning mathematical model that faithfully describes the mechanism of generation and propagation of action potentials in nerve cells. This model was not just a description, but a predictive theory that laid the foundation for modern computational neuroscience. Hodgkin and Huxley’s basic idea was this: the action potential arises from changes in the permeability of the neuron’s membrane to specific ions (Na⁺ and K⁺), which are controlled by voltage-gated ion channels. Imagine the neuron’s membrane as a battery with many small switches (the channels). At rest (−70 mV), the K⁺ channels are closed and the Na⁺ channels are tightly closed (but ready to open). When the potential exceeds a threshold of about −55 mV, the Na⁺ channels open rapidly (depolarisation). Positive Na⁺ ions flow into the cell, driving the potential further upward to about +40 mV. This is the rising phase of the spike. Then the Na⁺ channels begin to close after a short period of time, and the slowly activating K⁺ channels open fully, driving the potential below the normal resting potential before it returns to its original value (refractory period) [25]. This choreography of channel opening and closing is what produces the action potential. For an intuitive approach to the Hodgkin–Huxley model, refer to Appendix A.
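The gating choreography described above can be sketched numerically. The following is a minimal forward-Euler integration of the Hodgkin–Huxley equations using the classic squid-axon parameters; the step size, duration, and injected current are illustrative choices, not values taken from this review.

```python
import numpy as np

# Classic Hodgkin-Huxley squid-axon parameters (illustrative sketch)
C_m = 1.0                            # membrane capacitance (uF/cm^2)
g_Na, g_K, g_L = 120.0, 36.0, 0.3    # maximal conductances (mS/cm^2)
E_Na, E_K, E_L = 50.0, -77.0, -54.4  # reversal potentials (mV)

# Voltage-dependent gating rate functions (1/ms)
def a_m(V): return 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
def b_m(V): return 4.0 * np.exp(-(V + 65) / 18)
def a_h(V): return 0.07 * np.exp(-(V + 65) / 20)
def b_h(V): return 1.0 / (1 + np.exp(-(V + 35) / 10))
def a_n(V): return 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
def b_n(V): return 0.125 * np.exp(-(V + 65) / 80)

dt, T, I_ext = 0.01, 50.0, 10.0      # step (ms), duration (ms), drive (uA/cm^2)
V = -65.0                            # resting potential (mV)
m = a_m(V) / (a_m(V) + b_m(V))       # gates start at their steady-state values
h = a_h(V) / (a_h(V) + b_h(V))
n = a_n(V) / (a_n(V) + b_n(V))

trace = []
for _ in range(int(T / dt)):
    # Total ionic current: fast Na+, delayed-rectifier K+, and leak
    I_ion = (g_Na * m**3 * h * (V - E_Na)
             + g_K * n**4 * (V - E_K)
             + g_L * (V - E_L))
    V += dt * (I_ext - I_ion) / C_m  # forward-Euler membrane update
    m += dt * (a_m(V) * (1 - m) - b_m(V) * m)
    h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
    n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
    trace.append(V)

print(f"peak membrane potential: {max(trace):.1f} mV")
```

With this level of sustained drive the model fires repetitively, and the membrane potential overshoots 0 mV during each spike, reproducing the rising phase, repolarisation, and afterhyperpolarisation described above.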
The transmission of electrical signals in the brain can be explained by the principle of volume conduction [26,27]. According to this, electrical charges interact following the well-known laws of electrostatics: opposite charges attract and like charges repel. Volume conduction is the process by which a group of ions repels neighbouring ions with the same charge. These, in turn, repel subsequent ions, creating a “wave” of charge that moves through the extracellular space. Since the brain is not homogeneous, variations in tissue density can either facilitate or hinder the flow of ions. Furthermore, a signal from a large source (large dipole) can travel a much greater distance than a signal from a small source. To record such a signal with an electrode on the scalp, it must first traverse several layers (brain tissue, dura mater, skull bone, skin, and hair) before finally reaching the electrode. While simplified models occasionally describe the head layers as a series of capacitors [28], formal electromagnetic theory for EEG demonstrates that this is not the dominant macroscopic mechanism.
Volume-conduction (resistive) and capacitive (displacement) currents combine as J = (σ + jωε)E, where σ is tissue conductivity, ε its permittivity, and ω the angular frequency. The relative contribution of capacitive currents is approximately quantified by the dimensionless ratio ωε/σ = capacitive current/resistive current. When ωε/σ ≪ 1, capacitive currents are much smaller than resistive (volume-conduction) currents, and volume conduction dominates. For typical brain and CSF properties, this condition is strongly satisfied for physiological EEG/LFP frequencies (<1 kHz), so extracellular potentials in this range are almost entirely determined by volume conduction. The relative importance of capacitive effects increases in low-conductivity, thin layers such as the skull or dry skin. In these tissues, σ is smaller, so ωε/σ grows with frequency. However, capacitive currents generally become comparable to resistive currents only at much higher frequencies (~10–100 kHz), or when modelling high-frequency stimulation or electrode–skin interfaces [29,30]. For standard EEG/LFP applications, these capacitive contributions remain negligible, and volume conduction provides a quantitatively dominant mechanism. The transition from cellular ionic exchange to scalp-level EEG recordings involves multiple scales of biological and physical phenomena, as illustrated in Figure 1.
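The ωε/σ criterion can be checked with a quick back-of-the-envelope calculation. The conductivity and relative permittivity values below are illustrative order-of-magnitude assumptions (not measurements from the cited studies), chosen only to show the qualitative contrast between brain tissue at EEG frequencies and low-conductivity tissue at much higher frequencies.

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity (F/m)

def cap_to_res_ratio(f_hz, sigma, eps_r):
    """Dimensionless ratio of capacitive to resistive current, w*eps/sigma."""
    omega = 2 * math.pi * f_hz
    return omega * eps_r * EPS0 / sigma

# Assumed representative values (order of magnitude only):
# brain tissue at 100 Hz: sigma ~ 0.33 S/m, eps_r ~ 1e5
brain_eeg = cap_to_res_ratio(100, 0.33, 1e5)
# skull-like low-conductivity tissue at 50 kHz: sigma ~ 0.01 S/m, eps_r ~ 1e4
skull_hf = cap_to_res_ratio(50e3, 0.01, 1e4)

print(f"brain @ 100 Hz:  w*eps/sigma ~ {brain_eeg:.1e}")   # far below 1
print(f"skull @ 50 kHz:  w*eps/sigma ~ {skull_hf:.1e}")    # near or above 1
```

The first case satisfies ωε/σ ≪ 1 by several orders of magnitude, consistent with the claim that volume conduction dominates at physiological frequencies, while the second illustrates how the ratio can approach unity in low-conductivity layers at tens of kilohertz.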
A seminal review [31] attempts a unified and deep understanding of the biophysical mechanisms that generate extracellular electric fields and currents. According to this work, extracellular fields do not originate exclusively from synaptic currents but are the result of the superposition of all active transmembrane currents, including action potentials (Na⁺ and Ca²⁺ spikes), intrinsic membrane currents (ionic currents that flow through voltage-gated channels of the neuronal membrane and are not directly induced by synaptic activity), and afterhyperpolarisations (AHPs) that immediately follow the termination of an action potential. In addition to action potentials, calcium spikes, intrinsic membrane oscillations, and neuron-glia interactions play a role in shaping the EEG signal. The overall contribution is critically determined by two factors: (a) the cellular-synaptic architectural design of the tissue (e.g., parallel colonies of pyramidal neurons generate strong “open” fields), and (b) the simultaneous synchronisation of these currents.
The EEG is not directly associated with the action potential, but with postsynaptic potentials (EPSPs, IPSPs) that are evoked by neurotransmitters (e.g., glutamate, GABA) in the postsynaptic neurons. EPSPs cause depolarisation from positive ion influx, making the outside relatively more positive, while IPSPs cause hyperpolarisation from negative ion influx or positive ion efflux, making the outside relatively more negative [32]. To detect the signal in the EEG, the spatially and temporally synchronised activity of a large neuronal population, over 10,000–50,000 pyramidal cells [33], is required, with a minimum threshold of six square centimetres of synchronised cortex [34]. These values should be understood as order-of-magnitude estimates obtained under favourable conditions, namely strong temporal synchronisation of aligned pyramidal neurons in superficial gyral cortex, adequate signal-to-noise ratio, and conventional scalp EEG montages, and may vary substantially across subjects, frequency bands, source depth and orientation, and recording configurations. The main source of the signal comes from the pyramidal neurons of the neocortex, which, due to the parallel arrangement and the large size of their diverging dendrites, create equivalent electric dipoles. The geometry of these dipoles (tangential or radial to the skull surface) determines the amplitude and polarity of the measured signal. A dipole is characterised by a current “sink” (negative charge, usually at the base of the dendrites) and a current “source” (positive charge, close to the soma). Recording critically depends on the orientation of the dipole with respect to the electrode: radial dipoles are easier to detect than tangential ones [35,36].
The extracellular currents resulting from these synaptic events obey the physics of the volume conductor. In the quasi-static approximation, which is valid for EEG frequencies (<100 Hz), we ignore magnetic induction effects and consider the field to change so slowly that at any given time it behaves as if it were static. Under this approximation, no significant charge accumulates in the tissues; this is expressed mathematically by Equation (1), where the divergence measures how much a vector field spreads or concentrates at a point and J represents the current density. We also know from Ohm’s law that the current density J is proportional to the electric field E (2), where σ is the tissue conductivity. The electric field E can be written as the negative gradient of the electric potential Φ (3), where the gradient points in the direction of the maximum increase in potential; substituting all of this into (1), we obtain Equation (4). This implies that the electric field E is curl-free and the current is divergence-free except at active sources.
∇ · J = 0, (1)
J = σE, (2)
E = −∇Φ, (3)
∇ · (σ∇Φ) = 0, (4)
Equation (4) would hold if there were no sources. However, synapses are sources of current. The density of these “primary” or “active” currents is denoted J_p(r). The divergence of this current, ∇ · J_p(r), tells us exactly where the sources (positive divergence) and sinks (negative divergence) are. In this way, the final equation that describes how the sources create the potential in a conductor with conductivity σ is Poisson’s Equation (5) [37].
∇ · (σ(r)∇Φ(r)) = ∇ · J_p(r), (5)
Instead of modelling each neuron individually, an equivalent current dipole can represent the synchronised activity of a cortical region. This is a mathematical abstraction: a vector p that has magnitude (proportional to the sum of the currents) and direction (from source to sink). In a simple, infinite, homogeneous conductor of conductivity σ, the potential Φ at a distance r from such a dipole is given by the classical formula (6). Of course, the head is neither infinite nor homogeneous; for more realistic models (e.g., the four-sphere head model), the formula becomes more complex [38]. Importantly, only the net (summed) dipolar moment of the active cortical patch matters at the scalp: isolated single-neuron fields cancel, but synchronised populations form a macroscopic dipole that is measurable.
Φ(r) = (1/(4πσ)) (p · r)/|r|³, (6)
The Forward Problem [39]: “If the location and strength of the sources (dipoles) inside the brain are known, what will be the electric potential on the scalp?” is a well-defined physical problem. Its solution involves solving Poisson’s equation for a given J_p. In computational terms, this is summarised by a linear mapping, the Lead Field Matrix L, which connects the sources to the electrodes via Equation (7), where V is the vector of potentials at the electrodes and I_p is the vector of source strengths. The Inverse Problem [40]: “Given the measured potentials V at the electrodes, what are the sources I_p inside the brain that caused them?”. This problem is ill-posed because infinitely many different source distributions can produce the same pattern on the scalp. To find a unique solution, constraints must be introduced (e.g., the solution must have the minimum possible energy) through a process called regularisation.
V = L I_p, (7)
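The forward and inverse problems can be illustrated with a toy two-dimensional model: a lead field assembled from the infinite-homogeneous dipole formula (6), a forward computation via Equation (7), and a Tikhonov-regularised minimum-norm inverse. The geometry, conductivity, source strength, and regularisation constant below are arbitrary illustrative choices, not a realistic head model.

```python
import numpy as np

sigma = 0.33  # assumed homogeneous tissue conductivity (S/m)

# Toy 2-D "scalp": 8 electrodes on a half-circle of radius 10 cm
n_elec, n_src = 8, 20
ang_e = np.linspace(0, np.pi, n_elec)
elec = 0.10 * np.stack([np.cos(ang_e), np.sin(ang_e)], axis=1)

# 20 candidate sources at radius 7 cm with radially oriented unit dipoles
ang_s = np.linspace(0, np.pi, n_src)
src = 0.07 * np.stack([np.cos(ang_s), np.sin(ang_s)], axis=1)
p_dir = src / np.linalg.norm(src, axis=1, keepdims=True)

# Lead field, Eq. (6): L[i, j] = potential at electrode i per unit dipole j
L = np.zeros((n_elec, n_src))
for i in range(n_elec):
    for j in range(n_src):
        r = elec[i] - src[j]
        L[i, j] = (p_dir[j] @ r) / (4 * np.pi * sigma * np.linalg.norm(r) ** 3)

# Forward problem, Eq. (7): known source strengths -> scalp potentials
I_true = np.zeros(n_src)
I_true[5] = 1e-8          # one active cortical patch (A*m)
V = L @ I_true

# Inverse problem: minimum-norm estimate with Tikhonov regularisation.
# More unknowns (20) than measurements (8), so a constraint is essential.
lam = 1e-10 * np.trace(L @ L.T)
I_hat = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_elec), V)

rel_res = np.linalg.norm(L @ I_hat - V) / np.linalg.norm(V)
print(f"relative forward residual of the estimate: {rel_res:.2e}")
```

The estimate reproduces the measured potentials almost exactly, yet I_hat is a smeared version of the single true source: this is the ill-posedness in action, since many source patterns fit the same eight scalp values and regularisation simply selects the lowest-energy one.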

3. Recording Technique

The EEG technique involves recording the electrical activity of the brain through electrodes placed on the scalp, based on the principle of differential amplification, where the voltages between pairs of electrodes are amplified and displayed as waves on a digital screen. The basic components of an EEG system include the electrodes, the montage selection circuitry that determines which electrodes are paired and routed to the differential amplifiers, the analogue-to-digital converters (ADCs), and the processing and feedback systems. The process begins with scalp preparation, including cleansing and sometimes light skin abrasion to reduce the scalp impedance below 5–10 kΩ for traditional electrode systems, improving conductivity and minimising interference [41]. The placement of electrodes follows the international 10–20 system, which ensures standardised and reproducible placement of the electrodes over brain regions (e.g., frontal, temporal, parietal). The traditional 10–20 electrode system (Figure 2) involves 19 electrode sites and two earlobe-mounted electrodes (A1/A2), each associated with a specific anatomical region; electrodes are placed at intervals of 10% or 20% of the distance between anatomical landmarks, hence the system’s name. The locations are named with two characters: the first indicates the brain region (Fp = frontopolar, F = frontal, C = central, P = parietal, O = occipital, T = temporal and A = auricular), and the second is a number (even = right, odd = left) or “z” for midline electrodes (e.g., Cz) [41,42,43,44,45,46].
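The 10–20 naming convention is regular enough to encode as a small lookup. The helper below is a hypothetical illustration of the region/laterality rules just described, not part of any standard library.

```python
# Region prefixes of the 10-20 system, as described in the text
REGIONS = {"Fp": "frontopolar", "F": "frontal", "C": "central",
           "P": "parietal", "O": "occipital", "T": "temporal",
           "A": "auricular"}

def parse_1020_label(label):
    """Split a 10-20 electrode label into (region, laterality)."""
    # Longest-prefix match so that "Fp1" resolves to frontopolar, not frontal
    prefix = next(p for p in sorted(REGIONS, key=len, reverse=True)
                  if label.startswith(p))
    suffix = label[len(prefix):]
    if suffix.lower() == "z":
        side = "midline"           # z-labelled electrodes sit on the midline
    else:
        # odd numbers = left hemisphere, even numbers = right hemisphere
        side = "left" if int(suffix) % 2 else "right"
    return REGIONS[prefix], side

print(parse_1020_label("Fp1"))  # ('frontopolar', 'left')
print(parse_1020_label("Cz"))   # ('central', 'midline')
print(parse_1020_label("P4"))   # ('parietal', 'right')
```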
EEG technology is characterised by a wide variety of design parameters and applications, which can be classified along multiple dimensions. EEG systems can be categorised along four main axes (Figure 3): acquisition approach and interface, architecture and electronics, data and computation, and context and integration [47]. In terms of signal acquisition, systems range from completely non-invasive scalp surface EEG [48], around-ear EEG for improved portability [49], and subgaleal/epicranial placement [50], to invasive techniques such as electrocorticography (ECoG) with subdural arrays [51,52], stereotactic EEG (sEEG) with implanted depth electrodes [53], and intracortical microelectrodes for research and neuroprosthetic applications [54]. The device architecture can be wired for high-reliability laboratory applications, or wireless (with Bluetooth, Wi-Fi) to facilitate recordings in naturalistic, real-world environments. This latter configuration allows for the continuous collection of neurophysiological data while subjects are unrestricted in their movement and engaging in daily activities, which is essential for research in neuroergonomics and home-based clinical assessments [55]. It can also be hybrid [56], wearable (consumer headsets) [57], implantable with telemetry [58], or even modular with structured electrode arrays [59]. Electrode technology covers a wide range, from the contact method: liquid/gel [60], semi-dry/hydrogel [61], dry (e.g., needle, finger) [62,63,64], and microneedle electrodes [65], to construction materials such as metal films [66], polyimide [67], knitted/woven electrodes [68], and conducting polymers/graphene [69], with a distinction between active (with built-in preamplifier) and passive electrodes, as well as disposable versus reusable designs.
The number of channels and montage varies from low density (1–8) for basic brain–computer interfaces (BCI), medium density (16–64) for research, high density (64–256) for source detection, to extremely high density (>256) for special applications, with layouts such as the standard 10–20, 10–10, or custom grids [70,71,72].
It is worth mentioning that common EEG recording standards have bifurcated into distinct clinical and research tiers, driven by advances in amplifier technology. While regulatory bodies like the American Clinical Neurophysiology Society (ACNS) and the International Federation of Clinical Neurophysiology (IFCN) establish a safety floor, mandating a minimum of 16–21 channels sampled at 256 Hz to ensure basic diagnostic competence, modern practice frequently exceeds these baselines. Clinical systems now commonly utilise 25–32 channels sampled at 500–1024 Hz, whereas high-performance research setups employ 64–256 high-density arrays sampled at 2 kHz up to 20 kHz to capture high-frequency oscillations. Crucially, the field has transitioned to 24-bit DC-coupled amplifiers, which offer a massive dynamic range (often ±400 mV) capable of recording stable infraslow potentials and accommodating large offsets without saturation, a significant leap from the 16-bit AC-coupled legacy systems [46,73,74,75].
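The practical consequence of the move to 24-bit DC-coupled amplifiers can be seen in a short resolution calculation. The ±400 mV range is taken from the text above; the ±5 mV legacy front-end range is an assumed illustrative value for a 16-bit AC-coupled system, not a figure from the cited standards.

```python
def lsb_volts(full_scale_volts, bits):
    """Least-significant-bit (quantisation step) of an ideal ADC front end."""
    return full_scale_volts / 2**bits

# Assumed illustrative legacy front end: +/-5 mV, 16-bit, AC-coupled
legacy = lsb_volts(2 * 0.005, 16)
# Modern front end as described in the text: +/-400 mV, 24-bit, DC-coupled
modern = lsb_volts(2 * 0.400, 24)

print(f"16-bit legacy LSB: {legacy * 1e6:.3f} uV")
print(f"24-bit modern LSB: {modern * 1e6:.3f} uV")
```

Despite spanning an 80-fold larger voltage range, the 24-bit converter still resolves steps of well under 0.1 µV, which is why it can record microvolt-scale EEG while tolerating electrode offsets of hundreds of millivolts without saturation.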
Amplifier and signal preprocessing characteristics are critical and include input noise (e.g., <0.5 μV RMS), input impedance (e.g., >1 GΩ), common mode rejection ratio (CMRR), dynamic range, bandwidth and filters, and active isolation (driven right leg) [76,77]. Temporal and spectral capabilities are determined by the sampling rate (low: <250 Hz, medium: 250–1000 Hz and high: >1000 Hz serving as approximate guidelines). Real-time latency and synchronisation methods (TTL, PTP, GPS) [78,79] are critical for closed-loop applications where the system must process data and provide feedback within milliseconds. Data management and connectivity involve local storage, data streaming, communication protocols (BLE, Wi-Fi), security, and synchronisation. The software and analysis stack includes real-time preprocessing, AI inference on the device or in the cloud, closed-loop control, and standards compliance (e.g., BIDS) [80]. Multi-modal integration is common, such as EEG + fNIRS, EEG + motion sensors (IMU), EEG + eye tracking/ECG/EGD, and synchronisation with TMS/fMRI/MEG [81,82]. Regulatory compliance and robustness determine whether a system is for research only, a medical device (Class II/III, CE, FDA), MRI compatible, or deployed for industrial use. Ergonomics are tailored to the user (infant, paediatric, adult, animal) and the duration of use. Finally, the power system can be mains-connected, battery-powered, or even an energy harvesting system, with low-power modes for extended studies, while the ecosystem can be closed (vendor-dependent) or open source/hardware [83,84,85]. Common clinical hardware systems are summarised in Table 1, while major analysis software ecosystems are addressed in Table 2.
EEG recordings are perpetually challenged by a diverse array of artefacts, ranging from physiological sources, such as high-amplitude ocular dipoles (EOG), broadband muscle contractions (EMG), and cardiac rhythmicity (ECG) to environmental interference like 50/60 Hz power line noise and non-stationary movement artefacts caused by cable sway or electrode shifts [86,87]. To mitigate these interference sources, modern acquisition setups increasingly employ hardware solutions like active electrodes with integrated pre-amplifiers and active shielding, as well as shielded rooms (Faraday cages), which significantly boost the CMRR and stabilise signal impedance against capacitive coupling and triboelectric noise [88,89]. On the software front, while traditional linear filtering (high-pass/notch) and regression methods remain useful for simple drifts or ocular correction, Independent Component Analysis (ICA) has established itself as the gold standard for separating stationary artefacts like blinks and heartbeats into distinct, removable components. For more complex, non-stationary noise typical of mobile EEG, Artefact Subspace Reconstruction (ASR) provides a robust, automated method for removing high-variance transient bursts by reconstructing clean data from statistical principal component subspaces [90,91,92,93], while emerging Deep Learning (DL) architectures [94,95,96], including Convolutional Neural Networks (CNNs) [97], U-Nets [98], and Generative Adversarial Networks (GANs) [99] are demonstrating superior efficacy in learning non-linear mappings to denoise complex muscle and motion artefacts without the reference limitations of traditional pipelines.
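As a concrete instance of the traditional linear filtering mentioned above, a narrow notch filter can suppress 50 Hz power-line interference while leaving nearby physiological rhythms intact. The trace below is synthetic, and the amplitudes, sampling rate, and quality factor are illustrative choices.

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt, welch

fs = 500.0                            # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
# Synthetic trace: 10 Hz "alpha" rhythm plus strong 50 Hz line interference
eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 50e-6 * np.sin(2 * np.pi * 50 * t)

b, a = iirnotch(w0=50.0, Q=30.0, fs=fs)  # narrow IIR notch centred at 50 Hz
clean = filtfilt(b, a, eeg)              # zero-phase (forward-backward) filtering

# Compare power spectra before and after at the line frequency
f, p_raw = welch(eeg, fs=fs, nperseg=1024)
_, p_clean = welch(clean, fs=fs, nperseg=1024)
i50 = np.argmin(np.abs(f - 50.0))
i10 = np.argmin(np.abs(f - 10.0))
print(f"50 Hz power retained: {p_clean[i50] / p_raw[i50]:.2e}")
print(f"10 Hz power retained: {p_clean[i10] / p_raw[i10]:.2f}")
```

The line-noise bin is attenuated by orders of magnitude while the alpha peak is essentially untouched, which is exactly the behaviour (and the limitation: only narrowband, stationary interference is removed) that motivates the ICA, ASR, and deep learning approaches discussed above for broadband artefacts.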

4. Applications

Contemporary research is witnessing an unprecedented expansion in the application range of EEG (Figure 4), which now provides valuable information about a wide spectrum of brain functions and is employed in a plethora of fields, depending on the purpose [100]. One of the most advanced areas is BCI, where EEG signals are used to control machines, robots, drones and prosthetic limbs, aiding people with motor disabilities [101]. Moreover, EEG technology is leveraged to create video games with both entertaining and therapeutic effects [102]. Beyond rehabilitation, EEG serves as a critical tool for the broad clinical assessment of neurological states, encompassing the longitudinal management of chronic disorders (e.g., epilepsy, ADHD) [103,104], the detection of acute physiological biomarkers (e.g., seizures) [105], and the monitoring of traumatic or surgical conditions (e.g., TBI, neurosurgery) [106,107]. In neuroscience, EEG has been used to measure cognitive load, attention, stress and emotional state [108]. Recent research extends applications to neuroadaptive environments [109] that adjust lighting, sound, or content based on the user’s brain activity and digital psychiatry [110], which provides a sophisticated conceptual framework for the automation of mental health practices. Rather than merely utilising digital interfaces for remote care, digital psychiatry is defined as the synthesis of traditional psychiatric domain knowledge with modern intelligent systems to automate the clinical pipeline. Within this framework, EEG serves as a critical objective data layer that feeds into large-scale pre-trained models and DL algorithms. The core objective is to move beyond qualitative symptom reporting toward a fully automated digital psychiatric system capable of automated diagnostic processes, symptom tracking, and disease progression prediction.
In addition, EEG has been used for detecting anxiety and depression through AI-EEG combinations [111], and in neuroergonomics [112], where EEG is used in real-world or simulated environments to assess cognitive states like mental fatigue, workload, and attention. This can be utilised in an emerging research field called neuromarketing [113], where the goal is to create more effective marketing campaigns and desirable products. In the field of education, EEG is used in the evaluation of learning effort and concentration, leading to the development of intelligent and adaptive teaching systems [114]. A primary metric utilised in these scenarios is the engagement index, calculated as the ratio of Beta power to the sum of Alpha and Theta power (P_beta/(P_alpha + P_theta)), which correlates with task immersion and mental workload [115]. While large-scale implementations are currently rare due to the logistical challenges of high-density setups, pilot “smart classroom” initiatives have utilised wearable, low-density EEG headbands to provide teachers with real-time heatmaps of student engagement levels during lectures [116]. Similarly, in sports and wellness, applications focus on neurofeedback to optimise the “peak performance” state. A key scenario is the “quiet eye” period in precision sports (e.g., archery or golf), where metrics such as frontal midline theta (fmθ) power are monitored to ensure the athlete has reached a state of relaxed concentration before execution [117]. In high-performance sports academies, neurofeedback systems utilise the Alpha/Theta ratio to train athletes in stress regulation and recovery, helping to mitigate the effects of “choking” under pressure [118]. Despite these successes, real-world deployment is often limited by the high inter-subject variability of these EEG biomarkers, necessitating individualised “neuro-profiles” for each athlete.
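The engagement index P_beta/(P_alpha + P_theta) can be estimated from Welch power spectra. The function below is a hypothetical sketch using conventional band boundaries; the two test signals are synthetic stand-ins for an alpha-dominant (relaxed) and a beta-dominant (engaged) recording.

```python
import numpy as np
from scipy.signal import welch

def band_power(freqs, psd, lo, hi):
    """Sum the PSD over [lo, hi) Hz; the bin width cancels in power ratios."""
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum()

def engagement_index(x, fs):
    """Beta / (Alpha + Theta) power ratio from a Welch PSD estimate."""
    freqs, psd = welch(x, fs=fs, nperseg=int(2 * fs))
    theta = band_power(freqs, psd, 4, 8)
    alpha = band_power(freqs, psd, 8, 13)
    beta = band_power(freqs, psd, 13, 30)
    return beta / (alpha + theta)

fs = 250.0
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(1)
# Synthetic alpha-dominant (10 Hz) vs beta-dominant (20 Hz) traces plus noise
relaxed = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)
engaged = np.sin(2 * np.pi * 20 * t) + 0.2 * rng.standard_normal(t.size)

print(f"relaxed: {engagement_index(relaxed, fs):.2f}")
print(f"engaged: {engagement_index(engaged, fs):.2f}")
```

As expected, the beta-dominant trace yields a far higher index than the alpha-dominant one; on real data the absolute values depend heavily on electrode site, referencing, and the individual, which is one reason such metrics require per-subject calibration.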
Importantly, EEG-biometric systems can offer identification based on brain patterns [119].
Regarding neuromarketing, it is essential to distinguish between its commercial potential and current scientific limitations. While EEG provides high temporal resolution for tracking consumer reactions, the field is frequently challenged by the reverse inference problem, where specific neural patterns are incorrectly assumed to uniquely identify a single emotional state like “desire” [120]. Methodologically, research in this area often suffers from small sample sizes and a lack of standardised protocols across commercial providers, which significantly limits the generalisability and reproducibility of findings. Furthermore, the use of neurophysiological data to influence behaviour raises profound ethical concerns regarding consumer autonomy and the “privacy of thoughts,” as it seeks to bypass conscious filters to influence subconscious decision-making [121]. Similar rigour must be applied to EEG-based biometric systems. While they offer unique advantages such as inherent liveness detection and “cancelability”, allowing a “brainprint” to be reset if a specific stimulus is compromised, they are not yet fully secure. Practical security is hindered by long-term temporal instability and inter-session variability, where signals are significantly altered by a user’s fatigue, stress, or mental workload. Moreover, recent evidence highlights their vulnerability to adversarial attacks, where deep-learning-generated synthetic EEG signals can potentially bypass authentication layers, necessitating the development of robust encryption and rigorous cross-session validation before these systems can transition from experimental prototypes to secure identification standards [122,123,124].
It is important to distinguish between applications that are routinely utilised in clinical practice and those currently at a research or prototypical stage. Among the applications listed, only a subset of EEG uses is clinically established. Routine clinical practice relies primarily on EEG for the diagnosis and monitoring of neurological disorders, such as epilepsy, sleep disorders, and, to a lesser extent, dementia and encephalopathies. These are supported by high-level evidence and standardised clinical guidelines [125]. In contrast, many of the other applications mentioned, including brain–computer interfaces, neuroadaptive environments, digital psychiatry, neuroergonomics, neuromarketing, education, sports, wellness, and EEG-based biometrics, remain largely experimental or research-focused. While these emerging applications show promise and are actively explored in research and prototype settings, they currently lack widespread clinical adoption or strong evidence from large-scale trials.

5. Basic Characteristics

EEG signals are classified by their frequency into distinct bands, each of which is associated with specific brain functions and states (Figure 5). Delta waves (0.5–4 Hz) are the predominant activity during deep sleep (stage N3) and are normal in infants, while their presence in awake adults may indicate pathology [126,127]. Theta waves (4–8 Hz) are observed during states of deep relaxation and meditation and are normal in children [128,129]. Alpha waves (8–13 Hz) predominate in the occipital region when a person is awake and calm with eyes closed and are associated with a state of calm and inner alertness [130,131]. Beta waves (13–30 Hz) are associated with active, alert thinking, concentration, and problem solving [132,133]. Gamma waves (30–100 Hz) are thought to be involved in higher cognitive functions, such as committing information to memory, perception, and integrating sensory information [134,135]. Finally, EEG frequencies above 100 Hz are considered part of high-frequency activity, often associated with the upper end of the gamma band and beyond. This range can be difficult to measure reliably due to low amplitude, and it can be influenced by factors like anxiety, neurostimulation, and certain neurological conditions [136,137].
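These canonical band definitions translate directly into code. The minimal sketch below computes per-band power for a synthetic signal with a dominant 10 Hz alpha component, using a crude FFT-based periodogram; the synthetic signal and the exact band edges are illustrative choices, not a standard clinical pipeline.

```python
import numpy as np

def band_powers(x, fs, bands):
    """Sum the periodogram within each named frequency band."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * len(x))  # crude one-sided periodogram
    return {name: float(psd[(freqs >= lo) & (freqs < hi)].sum())
            for name, (lo, hi) in bands.items()}

fs = 250  # Hz, a common EEG sampling rate
t = np.arange(0, 10, 1.0 / fs)
# synthetic "eyes-closed" trace: strong 10 Hz alpha plus weaker 20 Hz beta
x = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.3 * np.sin(2 * np.pi * 20 * t)

bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 100)}
p = band_powers(x, fs, bands)
print(max(p, key=p.get))  # prints "alpha"
```

In a real recording the same computation would typically be preceded by artefact removal and use Welch-style averaging rather than a single periodogram.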
In addition, EEGs show a variety of morphological patterns (Figure 5) that reflect different physiological functions or pathological conditions. The Mu (μ) rhythm occurs in the alpha frequency range (8–13 Hz) but has a characteristic arcuate shape and is recorded over the central (Rolandic) areas. In contrast to the alpha rhythm, which is suppressed by opening the eyes, the μ rhythm reacts and is suppressed during actual or even thought of movement, as it is associated with the motor cortex [138]. During sleep, several distinct patterns appear. Sleep spindles are rhythmic activity patterns in the 11–15 Hz range and are one of the defining characteristics of stage N2 (light) sleep, believed to reflect the synchronised activity of thalamo-cortical neuronal networks [139]. The K-complex is a large, high-amplitude biphasic wave that also characterises stage N2 sleep. It is a sudden delta wave, often triggered by external stimuli, and is thought to play a role in maintaining sleep by suppressing arousal [140]. Vertex waves (or V-waves) are short, sharp waves that occur in the central-frontal regions during stage N1 sleep and signal the onset of sleep [141]. In the awake state, lambda waves are transient, triangular waves that occur in the occipital region and are associated with visual exploration or saccadic eye movements, often activated during reading or watching a scene [142]. Furthermore, spike waves are an important EEG finding that is often associated with epileptic activity [143]. Finally, High-Frequency Oscillations (HFOs) are crucial epileptic biomarkers found in the 80–500 Hz range, representing localised seizure networks. These distinct oscillations, detected in scalp and intracranial EEG, are vital for localising the seizure onset zone [144,145].
EEG is a complex, spatiotemporal signal that is often treated as a stochastic process. Although the underlying biophysical mechanisms may not be truly random, their high level of intricacy makes statistical description the usual and practical choice for quantitative analysis. EEG signals can be considered quasi-stationary over short temporal windows (on the order of a few to tens of seconds) [146]. This property is fundamental for EEG analysis, as it allows the application of classical signal processing techniques to short segments of the signal, over which it can be considered stationary and permits the estimation of instantaneous measures such as mean, variance, skewness, and kurtosis. These summary statistics and their corresponding distribution tables describe the probability of amplitude values and provide snapshots of the energy distribution; however, their estimates are strongly affected by the possible non-independence of successive samples and by the non-stationarity of the signal, so tests for normality (e.g., Kolmogorov–Smirnov) and corrections for correlation of neighbouring samples must be applied with caution [147,148]. Digitisation of EEG requires sampling and smoothing/quantisation. The choice of sampling rate follows the Nyquist theorem, which states that to accurately reconstruct an analogue signal from its discrete samples, the sampling frequency (sampling rate) must be at least twice the highest frequency contained in the signal. EEG analysis must be accompanied by appropriate pre-filtering to prevent aliasing, while the quantisation width (e.g., 9–11 bits) affects the signal-to-noise ratio and the fidelity of the representation [149]. Temporal correlation between samples is described by autocorrelation/coherence functions, and spectral properties are captured via the power spectral density, which is the Fourier transform of the autocorrelation.
Techniques such as periodogram/FFT and smoothing or ensemble-averaging modes are used for reliable spectrum estimates, while spectral moments and parameters such as Hjorth indices summarise morphological features of the spectrum [150]. When the distribution is non-Gaussian, higher-order analysis (e.g., bispectrum/bicoherence) reveals nonlinear correlations and spectral phase coupling between harmonic components [151,152]. Alternatively, interval or period analysis based on zero-crossing counts, measurement of intervals between peaks or half-waves, and interval-amplitude diagrams offers a simple but powerful temporal description of wave structure and is mathematically related to spectral moments and zero-crossing rates (N0, N1, N2), although it is sensitive to high-frequency noise (hysteresis/dead-band is commonly introduced to avoid spurious crossings) and tends to overestimate fast components while underestimating rare long periods [153]. In summary, the comprehensive description of EEG for research combines statistical snapshots (distribution shapes and moments), temporal relationships (auto-/intercorrelation), and spectral representations, complemented where necessary by parametric approaches (e.g., AR models) or pattern detection methods for feature extraction [154].
Recent neurophysiological research has fundamentally re-evaluated the “background” noise in EEG power spectra, identifying the aperiodic spectral component, characterised by a $1/f^{\chi}$ power-law decay (where $\chi$ represents the spectral exponent), as a critical physiological marker rather than mere instrumental noise [155]. This aperiodic slope is theoretically linked to the balance between neural excitation and inhibition, where flatter slopes reflect higher asynchronous firing and excitation (typical of wakefulness and active cognitive processing) and steeper slopes reflect strong inhibition or synchronisation (typical of sleep, anaesthesia, and coma) [156,157]. To isolate this feature from traditional oscillatory peaks (like alpha waves), researchers utilise advanced decomposition algorithms such as FOOOF (Fitting Oscillations and One-Over-F) or IRASA (Irregular Resampling Auto-Spectral Analysis) [158]. A seminal application of this framework demonstrated that the spectral exponent could distinguish between Unresponsive Wakefulness Syndrome (UWS) and Minimally Conscious State (MCS) in non-postanoxic patients, a diagnostic grey zone where traditional alpha power metrics often fail. By combining the spectral slope (indicating the state of neural tissue) with spatial alpha gradients, the authors successfully stratified patients and found that the resting-state slope correlated strongly with the Perturbational Complexity Index (PCI), although they noted that in severe postanoxic cases where the cortex is largely destroyed, the simple presence of oscillatory power remains the superior predictor [159].
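As a rough illustration of the aperiodic component (not a substitute for FOOOF or IRASA, which also model oscillatory peaks), the spectral exponent $\chi$ of a pure $1/f^{\chi}$ process can be recovered by a straight-line fit to the power spectrum in log-log coordinates. The synthesis routine and the fitting range below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def one_over_f_noise(n, fs, chi):
    """Synthesise noise with a 1/f^chi power spectrum by spectral shaping."""
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    spec = rng.standard_normal(len(freqs)) + 1j * rng.standard_normal(len(freqs))
    spec[1:] /= freqs[1:] ** (chi / 2.0)  # amplitude ~ f^(-chi/2) => power ~ f^(-chi)
    spec[0] = 0.0
    return np.fft.irfft(spec, n)

def spectral_exponent(x, fs, fmin=1.0, fmax=40.0):
    """Estimate chi as minus the log-log slope of the periodogram in [fmin, fmax] Hz."""
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2
    sel = (freqs >= fmin) & (freqs <= fmax)
    slope, _ = np.polyfit(np.log10(freqs[sel]), np.log10(psd[sel]), 1)
    return -slope

fs, n = 250, 250 * 60  # one minute of synthetic data
x = one_over_f_noise(n, fs, chi=2.0)
print(spectral_exponent(x, fs))  # close to the true exponent of 2
```

Real EEG spectra also contain oscillatory peaks that bias such a naive fit, which is exactly why FOOOF fits the aperiodic and periodic components jointly.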

6. Methods for EEG Processing

6.1. Statistical and Time-Series Analysis

Basic statistical and descriptive methods are the cornerstone of quantitative analysis of EEG signals, as they allow an initial understanding of their shape, variability and statistical behaviour. EEG, as a stochastic and nonlinear signal, exhibits fluctuations that reflect the dynamic interaction of large populations of neurons. As mentioned previously, descriptive parameters, such as mean, variance, standard deviation, skewness and kurtosis, are handy tools to obtain key insights into the statistical properties of the signal [160,161]. At the same time, the investigation of the probability distributions of the EEG provides information on the nature and nonlinearity of the underlying processes and is a deciding factor in the selection of appropriate statistical tests or models.
The correct choice of sampling frequency, according to the Nyquist theorem, and the use of appropriate anti-aliasing filters are vital for the prevention of distortions as well as the loss of information, while the bit depth (quantisation resolution) governs the accuracy of the recording. The autocorrelation and cross-correlation functions can be used to estimate temporal dependencies and delays between channels, providing a first indication of synchronisation or functional correlation [162,163]. Finally, methods such as interval or period analysis pave the way for the study of wave periodicity, utilising techniques such as calculating zero-crossings or half-wave intervals to extract simple but insightful temporal features [164,165].
The goal of statistical and probabilistic methods is not only to describe the signal, but also to estimate the possible generating processes behind it. The Bayesian approach is based on Bayes’ Theorem (8) [166].
$P(\theta \mid D) = \dfrac{P(D \mid \theta)\, P(\theta)}{P(D)},$
The logic is that prior knowledge (priors) is combined with observed data to obtain updated estimates (posteriors). The Bayesian approach is instrumental to Bayesian source localisation, which estimates the most likely distribution of brain sources producing the observed signal; hierarchical Bayesian models, which estimate connectivity by integrating information across subjects; and uncertainty quantification, which calculates probabilities for the reliability of each parameter [167]. Another useful tool is the Expectation-Maximisation (EM) algorithm, an iterative optimisation approach used when the data contain latent (unobservable) variables. The objective is to maximise the likelihood $P(X \mid \theta)$ based on the available data $X$, and the algorithm alternates between two stages: the E-step calculates the expected value of the latent variables (Expectation), and the M-step updates the parameters by maximising the expected likelihood (Maximisation) [168,169]. EM is exploited in Gaussian mixtures for clustering EEG states as well as in Hidden Markov Models (HMMs) and Dynamic Causal Modelling (DCM) for parameter estimation. HMMs assume that there is a latent sequence of states $S_t$ that produces the observed data $X_t$, and the probability of the entire sequence is calculated by Equation (9) [170].
$P(X, S) = P(S_1) \prod_{t=2}^{T} P(S_t \mid S_{t-1}) \prod_{t=1}^{T} P(X_t \mid S_t),$
This way, the EEG is considered as an alternation of latent “brain states”, which are not directly observed but leave an imprint on the EEG waves. The estimation of the parameters is usually performed with the EM algorithm, and these models have been used in the detection of brain states (e.g., sleep, attention, judgement), brain-state decoding, dynamic functional connectivity, and event segmentation in cognitive experiments [171]. DCM is a Bayesian state-space model that describes how brain regions interact causally. The system is described by differential Equation (10), where $x$ denotes the internal brain states, $u$ the external stimuli, and $\theta$ the connection strengths. The parameters are estimated via variational Bayes, which approximates the posterior distribution of connections [172]. DCM does not simply describe correlations, but how one region causes the response of another; thus, it is a causal generative model. It is useful in the analysis of directional (effective) connectivity [173].
$\dot{x} = f(x, u, \theta),$
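The HMM factorisation in Equation (9) can be made concrete with a toy model. The two-state “brain state” parameters below are invented purely for illustration; the sketch evaluates the log joint probability of an observation sequence under two candidate state paths.

```python
import numpy as np
from scipy.stats import norm

# Toy two-state HMM: latent "brain states" (e.g., rest vs. active), Gaussian emissions.
pi = np.array([0.6, 0.4])                 # initial probabilities P(S_1)
A = np.array([[0.9, 0.1],                 # transition matrix P(S_t | S_{t-1})
              [0.2, 0.8]])
means, stds = np.array([0.0, 3.0]), np.array([1.0, 1.0])

def log_joint(X, S):
    """log P(X, S) = log P(S_1) + sum_t log P(S_t|S_{t-1}) + sum_t log P(X_t|S_t)."""
    lp = np.log(pi[S[0]])
    lp += sum(np.log(A[S[t - 1], S[t]]) for t in range(1, len(S)))
    lp += sum(norm.logpdf(X[t], means[S[t]], stds[S[t]]) for t in range(len(S)))
    return float(lp)

X = np.array([0.1, -0.2, 2.9, 3.1])       # observed amplitudes
good = log_joint(X, [0, 0, 1, 1])         # path consistent with the observations
bad = log_joint(X, [1, 1, 0, 0])          # reversed, inconsistent path
print(good > bad)  # prints True
```

In practice the state sequence is unknown, so inference sums over all paths (forward algorithm) and the parameters themselves are fitted with EM, as described above.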
The EEG signal is a time series, as it captures measurements of the brain’s electrical activity over a consistent interval of time; thus, time-series methods can be applied. The Prony method is an ancestor of parametric autoregressive models, using exponential fitting to the signal (11), where $A_i$ is the amplitude and $s_i = \alpha_i + j\omega_i$ is the complex frequency [174,175].
$x(t) = \sum_{i=1}^{N} A_i\, e^{s_i t},$
The method fits a sum of damped exponential functions and is useful for the estimation of dominant frequency components, analysis of damped oscillations and modelling of transient episodes. In modern applications, it has been extended to Matrix Prony and Subspace Prony [176], which utilise algebraic and eigenspace (subspace) techniques to estimate exponential components in multichannel EEG signals, allowing more stable and accurate identification of dominant oscillatory modes and temporally consistent spectral analysis even in highly noisy data.
A signal $x_t$ can be described by an Autoregressive (AR) model as a linear combination of its $p$ previous values (12), where $a_i$ is the autoregressive coefficient and $\varepsilon_t$ is white noise. Hence, the AR model predicts the current value based on the memory of the signal. In a Moving Average (MA) model, the signal depends on the current and past errors (13), where $b_i$ corresponds to a noise filter. The MA smooths out the stochastic variability, capturing transient or random fluctuations. The Autoregressive Moving Average (ARMA) model combines the previous two (14) and offers a balance between memory and the random component, which is ideal for EEG that exhibits both internal dynamics and noisy effects. Parameter estimation is executed with methods such as the Yule-Walker equations, the Levinson-Durbin recursion, or Burg’s algorithm [177,178,179]. Extending ARMA by introducing an integration term for non-stationarity, we obtain the Autoregressive Integrated Moving Average (ARIMA) model (15), where $B$ is the lag operator and $d$ is the degree of integration. Thus, ARIMA(p, d, q) can describe EEG series with long-term variations or trends, while preserving local dynamics [180]. In Autoregressive Moving Average with Exogenous Inputs (ARMAX) models, external variables $u_t$ are introduced that do not belong to the system itself but affect the behaviour of the time series, such as sensory stimuli or experimental conditions that affect the EEG during recording (16). In practice, this allows the modelling of EEG with stimuli (e.g., sensory input, task events), combining temporal memory, stochasticity and external influence [181]. Fractional ARIMA (FARIMA) generalises ARIMA by allowing fractional integration $d \in \mathbb{R}$ (17). This facilitates the capture of long-range dependence or memory effects in EEG, meaning that current samples are influenced not only by recent events, but also by events in the distant past [182].
Vector Autoregressive (VAR) models extend AR to multivariate time series (18), where $x_t$ is a vector of EEG channels, $A_i$ is the coefficient matrix, and $\varepsilon_t$ is an error vector. The basic idea is that each channel can influence and be influenced by the others, thus allowing the analysis of interdependencies and causality between brain regions [183]. Multivariate VAR (MVAR) extensions allow the calculation of Granger causality, which estimates whether the activity of one channel can predict the future behaviour of another, as well as the Directed Transfer Function (DTF) and the Partial Directed Coherence (PDC), which are based on the spectral representation of the MVAR model and support the estimation of frequency-dependent directionality, while their time-varying extensions capture how these causal connections change dynamically during cognitive processes or states [184,185,186]. Volatility, i.e., how the noise or energy of the signal changes over time, can be modelled with the Generalised Autoregressive Conditional Heteroskedasticity (GARCH) model. The central idea of GARCH(p, q) is that the variance of the noise depends on previous values of the variance and the errors. For a time series $x_t$, $\sigma_t^2$ is the conditional variance, which evolves according to Equation (19) [187]. Choosing the optimal (p, q) order of a parametric model is critical to the accuracy and stability of the model. The most common criteria are the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC) [188]. All these methods have been used, for instance, in dealing with artefacts [189], the prediction of epileptic seizures [190], the diagnosis of autism [191], and the real-time monitoring of the effects of drugs (such as anaesthetics) [192].
$x_t = \sum_{i=1}^{p} a_i x_{t-i} + \varepsilon_t,$
$x_t = \sum_{i=0}^{q} b_i \varepsilon_{t-i},$
$x_t = \sum_{i=1}^{p} a_i x_{t-i} + \sum_{j=0}^{q} b_j \varepsilon_{t-j},$
$\nabla^d x_t = (1 - B)^d x_t,$
$x_t = \sum_{i=1}^{p} a_i x_{t-i} + \sum_{j=0}^{q} b_j \varepsilon_{t-j} + \sum_{k=1}^{r} c_k u_{t-k},$
$(1 - B)^d x_t = \varepsilon_t, \quad \text{where} \quad (1 - B)^d = \sum_{k=0}^{\infty} \frac{\Gamma(k - d)}{\Gamma(-d)\, \Gamma(k + 1)}\, B^k,$
$x_t = \sum_{i=1}^{p} A_i x_{t-i} + \varepsilon_t,$
$\sigma_t^2 = \alpha_0 + \sum_{i=1}^{q} \alpha_i \varepsilon_{t-i}^2 + \sum_{j=1}^{p} \beta_j \sigma_{t-j}^2,$
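As a worked example of AR estimation, the sketch below solves the Yule-Walker equations for a simulated AR(2) process; the coefficients and sample length are arbitrary illustrative choices, and production code would typically rely on a library routine (e.g., in statsmodels) rather than this hand-rolled solver.

```python
import numpy as np

rng = np.random.default_rng(1)

def yule_walker(x, p):
    """Estimate AR(p) coefficients by solving the Yule-Walker equations."""
    x = x - x.mean()
    n = len(x)
    r = np.array([x[: n - k] @ x[k:] for k in range(p + 1)]) / n  # autocovariances
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])  # Toeplitz
    return np.linalg.solve(R, r[1:])

# Simulate a stationary AR(2) process: x_t = 1.5 x_{t-1} - 0.7 x_{t-2} + eps_t
x = np.zeros(20000)
eps = rng.standard_normal(len(x))
for t in range(2, len(x)):
    x[t] = 1.5 * x[t - 1] - 0.7 * x[t - 2] + eps[t]

a_hat = yule_walker(x, p=2)
print(np.round(a_hat, 2))  # close to the true coefficients [1.5, -0.7]
```

The same fitted coefficients also yield a parametric spectral estimate, which is the link between AR modelling and the spectral methods of the next subsection.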

6.2. Spectral and Time-Frequency Analysis

Numerous transformations reveal the energy structure of the EEG in terms of frequency, and most of them are well described [193,194]. The Continuous Time Fourier Transform (CTFT) transforms a time signal into a frequency signal, representing it as a sum of sine and cosine functions [195]. For a continuous signal $x(t)$, it is defined by Equation (20), where $f$ is the frequency and $j$ the imaginary unit; for a sampled real signal, the useful frequency range extends from 0 Hz up to the Nyquist frequency (half the sampling rate). In digital processing, this is implemented as the Discrete Fourier Transform (DFT) (21) for a signal of $N$ samples. The index of the calculated frequency component, $k$, corresponds to a specific frequency $f_k = k \cdot F_s / N$, where $F_s$ is the sampling frequency. The efficient computation of the DFT is achieved via the Fast Fourier Transform (FFT) algorithm, which reduces computational complexity from $O(N^2)$ to $O(N \log_2 N)$. To estimate the Power Spectral Density (PSD), the Periodogram is often used, calculated as the normalised squared magnitude of the DFT (22). This facilitates the comparison of the power spectrum of different signals, or of the same signal under different conditions, the identification of periodicities, as well as the frequency with the maximum contribution to the total power of the signal [196]. However, the Periodogram is inconsistent and sensitive to noise. The Welch method improves stability by averaging periodograms over overlapping windowed segments (23) [197,198]. Alternatively, the Multitaper method offers a statistically superior estimate by using a set of orthogonal tapers (Slepian sequences) to maximise spectral concentration and reduce variance without the trade-offs inherent in single-window methods [199].
$F(f) = \int_{-\infty}^{\infty} x(t)\, e^{-j 2\pi f t}\, dt,$
$F(k) = \sum_{n=0}^{N-1} x(n)\, e^{-j 2\pi k n / N},$
$P(k) = \frac{1}{N} \left| \sum_{n=0}^{N-1} x(n)\, e^{-j 2\pi k n / N} \right|^2,$
$P_{xx}^{\mathrm{Welch}}(k) = \frac{1}{K} \sum_{i=0}^{K-1} \frac{1}{L U} \left| \sum_{m=0}^{L-1} x_i(m)\, w(m)\, e^{-j 2\pi k m / L} \right|^2,$
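The variance reduction that Welch averaging offers over the raw periodogram (Equations (22) and (23)) can be demonstrated with SciPy on a noisy synthetic alpha rhythm; the sampling rate, segment length, and noise level are illustrative assumptions.

```python
import numpy as np
from scipy.signal import periodogram, welch

rng = np.random.default_rng(2)
fs = 250
t = np.arange(0, 30, 1.0 / fs)
# 10 Hz alpha rhythm buried in strong white noise
x = np.sin(2 * np.pi * 10 * t) + 2.0 * rng.standard_normal(len(t))

# Raw periodogram over the whole record: a high-variance estimate
f_p, pxx_p = periodogram(x, fs=fs)
# Welch: average periodograms of 2 s Hann-windowed, 50%-overlapping segments
f_w, pxx_w = welch(x, fs=fs, nperseg=fs * 2, noverlap=fs)

peak = f_w[np.argmax(pxx_w)]
rough = pxx_p[f_p > 30].std()    # noise-floor fluctuation, raw periodogram
smooth = pxx_w[f_w > 30].std()   # noise-floor fluctuation after averaging
print(peak, rough > smooth)  # peak at the 10 Hz component; Welch floor is smoother
```

The price of averaging is frequency resolution: here the 2 s segments limit resolution to 0.5 Hz, the usual bias-variance trade-off of Welch estimation.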
While spectral density provides a global overview, it fails to capture the dynamic, non-stationary nature of brain activity. To resolve this, Time-Frequency Distributions (TFDs) are employed. The Short-Time Fourier Transform (STFT) introduces a sliding window $w(n)$ to compute local spectra (24). Yet, the STFT is strictly bound by the Heisenberg-Gabor uncertainty principle, where a fixed window width forces a permanent compromise between time and frequency resolution. It has been used as a basic tool for extracting spectral features by detecting frequencies associated with physiological states such as sleep [200], mental load [201] or pathological states such as epilepsy [202]. Nevertheless, the FFT assumes that the signal is stationary within the analysis window, which results in weaknesses in detecting transient phenomena. It also requires a relatively long signal length to provide reliable spectral estimation, as very short segments have low frequency resolution and cause spectral leakage due to the window boundaries. The FFT is also sensitive to noise. The Continuous Wavelet Transform (CWT) addresses the fixed-window compromise by decomposing the signal using a scalable “mother wavelet” $\psi$ (25). This allows for multiresolution analysis, though it often suffers from “spectral dilution” at high frequencies. The Discrete Wavelet Transform (DWT) discretises scales for efficiency [203,204,205]. Simply put, the CWT acts like a “magnifying lens” that scans each moment in time at all frequencies, while the DWT “breaks” the signal into structural levels, revealing the content of the EEG in an economical but complete way.
Related techniques include Wavelet Packet Decomposition (WPD), which is a generalisation of the DWT offering a more uniform and flexible analysis of the entire frequency spectrum, making it extremely useful for EEG signals that contain information at multiple, non-hierarchically organised frequencies [206], and the Stockwell Transform (S-transform) (26), which combines the phase retention of the STFT with the variable resolution of wavelets [207]. Wavelet methods have been used in various applications, including feature extraction [208], EEG signal denoising [209], seizure detection [210], and Event-Related Potential (ERP) detection [211].
$STFT_x(m, \omega) = \sum_{n=-\infty}^{\infty} x(n)\, w(n - m)\, e^{-j \omega n},$
$W(a, b) = \frac{1}{\sqrt{a}} \int_{-\infty}^{+\infty} x(t)\, \psi^{*}\!\left( \frac{t - b}{a} \right) dt,$
$S(\tau, f) = \int_{-\infty}^{+\infty} x(t)\, \frac{|f|}{\sqrt{2\pi}}\, e^{-\frac{(\tau - t)^2 f^2}{2}}\, e^{-i 2\pi f t}\, dt,$
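The STFT's ability to localise non-stationary content, within its fixed-window resolution limits, can be sketched with SciPy on a signal whose frequency jumps from 10 Hz to 20 Hz; the window length and signal parameters are illustrative choices.

```python
import numpy as np
from scipy.signal import stft

fs = 250
t = np.arange(0, 4, 1.0 / fs)
# Non-stationary signal: 10 Hz for the first 2 s, then 20 Hz
x = np.where(t < 2, np.sin(2 * np.pi * 10 * t), np.sin(2 * np.pi * 20 * t))

# 1 s Hann windows, 50% overlap: 1 Hz frequency resolution, 0.5 s hop
f, tau, Z = stft(x, fs=fs, nperseg=fs, noverlap=fs // 2)
dominant = f[np.abs(Z).argmax(axis=0)]  # strongest frequency in each window

early = dominant[np.argmin(np.abs(tau - 1.0))]  # window centred in the 10 Hz half
late = dominant[np.argmin(np.abs(tau - 3.0))]   # window centred in the 20 Hz half
print(early, late)  # 10.0 20.0
```

Windows straddling the 2 s transition mix both components, which is precisely the temporal blurring that wavelet and reassignment methods aim to reduce.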
The Synchrosqueezing Transform (SST) reassigns the diffused energy of the CWT back to the signal’s true “instantaneous frequency” (IF). It first estimates the candidate frequency $\omega(a, b)$ using the phase derivative of the wavelet coefficients (27). The SST then generates a concentrated Time-Frequency Representation (TFR) $T(\omega, b)$ by integrating coefficients over scales that map to the same frequency bin (28). Ideally, this “squeezes” energy into sharp ridges [212]. Building on this, the Synchroextracting Transform (SET) employs a stricter “keep-or-discard” strategy. Instead of moving energy, SET utilises a synchroextracting operator to retain only the TFR values that strictly align with the estimated IF trajectory [213]. For scenarios with low Signal-to-Noise Ratio (SNR), such as scalp EEG, the Concentration of Frequency and Time (ConceFT) method offers enhanced robustness by combining the Multitaper framework with Synchrosqueezing. ConceFT averages the reassigned transformations obtained from multiple random linear combinations of orthonormal tapers [214,215]. This averaging cancels out the unstable “wiggling” of noise ridges while reinforcing the stable coherent structures of the neural signal.
$\omega(a, b) = -i\, \frac{\partial W(a, b) / \partial b}{W(a, b)},$
$T(\omega, b) = \int W(a, b)\, \delta\left( \omega - \omega(a, b) \right) da,$
Shifting emphasis to a recent innovation, the Superlet Transform (SLT) introduces a fundamentally different approach to resolution that does not rely on phase reassignment, but rather on the geometric combination of multiple wavelet estimates. The core philosophy of the Superlet is to bypass the single-wavelet trade-off by utilising a set of wavelets with varying bandwidths (cycles) for the same central frequency. A “Superlet” consists of a spectrum of wavelets ranging from those with narrow temporal support (low cycles, capturing high temporal precision) to those with wide temporal support (high cycles, capturing high frequency precision). The SLT computes the geometric mean of the magnitudes of these wavelet responses, acting as a “soft intersection” operator [216]. Mathematically, for a set of wavelets with cycles $c_1, c_2, \dots, c_o$, the transform is defined as (29). This geometric combination ensures that the resulting representation retains high values only where all wavelets in the set have significant responses, thereby effectively sharpening the localisation in both time and frequency domains simultaneously. The Superlet is particularly advantageous for EEG analysis because it solves the “spectral dilution” problem inherent to the standard CWT. In a traditional CWT, high-frequency wavelets become extremely short in time, which causes their energy to spread excessively across the frequency axis, making it difficult to distinguish adjacent high-frequency components (e.g., distinguishing 60 Hz from 80 Hz). The Adaptive Superlet Transform (ASLT) addresses this by increasing the order $o$ (the number of combined wavelets) linearly as the frequency increases.
This allows the method to maintain the high temporal resolution necessary to detect transient events, such as short-lived Gamma bursts in visual processing or beta spindles in motor control, while simultaneously enforcing the high frequency resolution required to separate them from neighbouring oscillations [217,218]. Furthermore, Superlets have been employed to enhance motor imagery classification in BCI, achieving superior accuracy by resolving the fine spectro-temporal features of sensorimotor rhythms that are often blurred by conventional wavelets [219,220]. By adaptively balancing these constraints, Superlets reveal fine-grained temporal microstructure in high-frequency bands that are typically blurred or invisible in standard CWT or STFT scalograms.
$SLT_{f, o}(t) = \left( \prod_{i=1}^{o} \left| W(t, f, c_i) \right| \right)^{1/o},$
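A minimal, from-scratch sketch of Equation (29) is given below: Morlet responses with 1 to $o$ cycles are combined by a geometric mean. Published implementations use carefully chosen base cycles and (multiplicative or additive) orders; the parameter values here are illustrative only.

```python
import numpy as np

def morlet_magnitude(x, fs, f0, cycles):
    """Magnitude of a complex Morlet convolution at centre frequency f0."""
    sigma_t = cycles / (2 * np.pi * f0)             # envelope width in seconds
    t = np.arange(-4 * sigma_t, 4 * sigma_t, 1.0 / fs)
    w = np.exp(2j * np.pi * f0 * t) * np.exp(-t**2 / (2 * sigma_t**2))
    w /= np.abs(w).sum()                            # unit-gain normalisation
    return np.abs(np.convolve(x, w, mode="same"))

def superlet(x, fs, f0, order):
    """Geometric mean of Morlet magnitudes with 1..order cycles (cf. Eq. (29))."""
    mags = np.stack([morlet_magnitude(x, fs, f0, c) for c in range(1, order + 1)])
    return np.exp(np.log(mags + 1e-12).mean(axis=0))

fs = 500
t = np.arange(0, 2, 1.0 / fs)
# a short 60 Hz gamma burst between 0.9 s and 1.1 s
x = np.sin(2 * np.pi * 60 * t) * ((t > 0.9) & (t < 1.1))

resp = superlet(x, fs, f0=60.0, order=5)
print(t[resp.argmax()])  # the superlet response peaks inside the burst
```

The low-cycle wavelets keep the burst temporally sharp while the high-cycle wavelets suppress responses at neighbouring frequencies, which is exactly the "soft intersection" behaviour described above.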

6.3. Spatial Analysis and Source Modelling

Spatial analysis and source modelling include methods for estimating the brain sources of the EEG (the inverse problem). ICA is based on the model (30), where the observed EEG signal $x$ is a linear mixture of independent sources $s$ through a mixing matrix $A$.
$x = A s,$
The method aims to estimate $A^{-1}$, maximising the statistical independence of the components (e.g., through maximising non-Gaussianity or minimising mutual information). This technique is used in EEG denoising (EOG/EMG subtraction), detection of independent rhythms, and functional source analysis (EEG-fMRI integration) [221]. Joint ICA (EEG-fMRI) is applied to unified datasets $X = [X_{EEG}, X_{fMRI}]$, where the aim is to find common independent factors that explain both modalities simultaneously. Instead of separating only EEG components, it finds common spatiotemporal factors that explain both the dynamics of the EEG and the spatial distribution of the fMRI. It has been used in combined spatiotemporal interpretation of functional brain activity (neurovascular coupling) [222]. Independent Vector Analysis (IVA) extends ICA to multiple datasets (31), where each $s^{(k)}$ is independent of the others.
$x^{(k)} = A^{(k)} s^{(k)}, \quad k = 1, \dots, K,$
The goal is to estimate the mixing matrices $A^{(k)}$ in a way that preserves internal interdependence and external independence. It has been utilised in group-level EEG analysis, multimodal analysis (EEG-MEG) and comparative EEG studies in populations [223,224]. Principal Component Analysis (PCA) finds orthogonal components that maximise the variance of the data (32).
$X = U \Sigma V^{T},$
The principal components are the columns of $V$, and the eigenvalues of $X^{T} X$ give the contribution of each component. This method reduces the dimension of the EEG to a few axes that describe the largest variation (energy) of the signal. It has been used in pre-processing for ICA, data compression, EEG power factor analysis and pattern classification [225].
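The SVD factorisation of Equation (32) can be applied directly to a toy multichannel recording; the channel count, latent sources, and noise level below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy "EEG": 8 channels, 1000 samples, driven by only 2 latent sources
sources = rng.standard_normal((2, 1000))
mixing = rng.standard_normal((8, 2))
X = mixing @ sources + 0.05 * rng.standard_normal((8, 1000))

Xc = X - X.mean(axis=1, keepdims=True)              # centre each channel
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)   # X = U S V^T

explained = s**2 / (s**2).sum()                     # variance fraction per component
print(np.round(explained, 2))  # the first two components carry almost all variance
```

Truncating to the components with large singular values is exactly the dimensionality reduction used as pre-processing for ICA and for data compression.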
Non-Negative Matrix Factorisation (NMF) assumes that the data $X$ can be approximated as a sum of non-negative bases $W$ that are activated over time by $H$ (33). Practically, this means we look for repeating spatial/temporal patterns that add up to form the signal, which is useful for isolating alpha/beta patterns and artefacts [226,227]. In Sparse Coding or Dictionary Learning, we assume that each instance of the signal is written as a linear combination of some columns of a dictionary $D$, with sparse coefficients $A$ (34) [228,229]. In other words, we have a “dictionary of waveforms” and at any given time only a few of them are activated. Empirical Mode Decomposition (EMD) decomposes the signal into Intrinsic Mode Functions (IMFs) plus a residual (35). It peels the signal to isolate the natural, local time-varying frequencies, which is suitable for transient/nonstationary features. In the inverse source localisation problem, the linear model $x = Lj + n$ is used, where $L$ is the lead-field matrix, $j$ the source currents, and $n$ the noise [230]. Beamformers construct spatial filters $w$ that maximise the gain at the location of interest and minimise the power elsewhere; the formulation and closed-form solution are given in (36) [231,232]. The Minimum-Norm Estimate (MNE) favours the solution with the lowest total energy and is written as regularised least-squares (37), so when several scenarios explain the data, we choose the most parsimonious explanation [233]. Variants such as LORETA/sLORETA/eLORETA introduce spatial smoothness terms (matrix $R$) in the normalised problem for smoother and more reliable source allocations (38) [234]. Dipole fitting is expressed as a nonlinear optimisation for fitting a few dipoles $(\theta, q)$ that explain the signal, formulated in (39) [235,236].
These methodologies are chosen or combined depending on the objective (e.g., beamformer for targeted spatial isolation, MNE/LORETA for distributed evoked maps, dipole fitting for focal sources), while decomposition techniques (NMF, dictionary learning, EMD) are often used as pre-processing or to extract features.
$\min_{W, H \ge 0} \left\| X - W H \right\|_F^2,$
$\min_{D, A} \left\| X - D A \right\|_F^2 + \lambda \left\| A \right\|_1, \quad \text{s.t.} \ \left\| d_k \right\|_2 \le 1 \ \forall k,$
$x(t) = \sum_{k=1}^{K} IMF_k(t) + r(t),$
$\min_{w} \ w^{\top} R\, w, \ \text{s.t.} \ w^{\top} l = 1 \;\Rightarrow\; w = \frac{R^{-1} l}{l^{\top} R^{-1} l},$
$\hat{j} = \arg\min_{j} \left\| x - L j \right\|_2^2 + \lambda \left\| j \right\|_2^2 \;\Rightarrow\; \hat{j} = \left( L^{\top} L + \lambda I \right)^{-1} L^{\top} x,$
$\hat{j} = \left( L^{\top} L + \lambda R \right)^{-1} L^{\top} x,$
$\min_{\theta, q} \left\| x - L(\theta)\, q \right\|_2^2,$
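The closed-form minimum-norm solution of Equation (37) is a one-line linear solve. The toy lead field below is a random matrix rather than a physical head model, so it only illustrates the algebra, not realistic source localisation.

```python
import numpy as np

rng = np.random.default_rng(4)

n_sensors, n_sources = 64, 100
L = rng.standard_normal((n_sensors, n_sources))  # toy lead field (not a head model)

j_true = np.zeros(n_sources)
j_true[20], j_true[80] = 1.0, -1.0               # two focal simulated sources
x = L @ j_true + 0.01 * rng.standard_normal(n_sensors)

lam = 0.1
# Equation (37): j_hat = (L^T L + lambda I)^{-1} L^T x
j_hat = np.linalg.solve(L.T @ L + lam * np.eye(n_sources), L.T @ x)

top2 = sorted(np.argsort(np.abs(j_hat))[-2:].tolist())
print(top2)  # the two largest estimates land on the simulated sources
```

Because the problem is underdetermined (more sources than sensors), the estimate is spatially smeared around the true sources; LORETA-style variants replace the identity regulariser with the smoothness matrix $R$ of Equation (38) to shape this smearing.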

6.4. Connectivity and Network Analysis

Another domain of EEG analysis is the study of functional and causal relationships between brain regions based on temporal, spectral and probabilistic correlations. Functional connectivity measures the statistical correlation or coherence between pairs of signals: the Pearson coefficient captures the linear correlation between two time series (40), while Coherence expresses the frequency-dependent linear relationship through spectral densities (41) [237]. In practice, high $r$ or coherence values in a specific band indicate possible involvement in common cognitive processes and are used in studies of hemispheric cooperation, sleep, and pathologies such as epilepsy [238]. Causal connectivity is based on MVAR models that predict the present from the past; in Granger’s formulation, if the predictive power of one signal is improved by adding the past terms of another, we consider that Granger causality exists (42). Spectral extensions of this framework give the frequency transfer function $H(f)$ (43) and directivity measures such as the DTF (44) and PDC (45) mentioned before, which represent the flow of information per frequency; these are useful for identifying information flow in sensory networks and pathological conditions [239].
$$r_{xy} = \frac{E\left[(x - \mu_x)(y - \mu_y)\right]}{\sigma_x \sigma_y}, \qquad (40)$$
$$C_{xy}(f) = \frac{\left| S_{xy}(f) \right|^2}{S_{xx}(f)\, S_{yy}(f)}, \qquad (41)$$
$$x(t) = \sum_{k=1}^{p} A_{xx}(k)\, x(t-k) + \sum_{k=1}^{p} A_{xy}(k)\, y(t-k) + e(t), \qquad (42)$$
$$H(f) = \left( I - \sum_{k=1}^{p} A_k\, e^{-i 2 \pi f k} \right)^{-1}, \qquad (43)$$
$$\mathrm{DTF}_{ij}(f) = \frac{\left| H_{ij}(f) \right|^2}{\sum_k \left| H_{ik}(f) \right|^2}, \qquad (44)$$
$$\mathrm{PDC}_{ij}(f) = \frac{\left| A_{ij}(f) \right|}{\sqrt{\sum_k \left| A_{kj}(f) \right|^2}}, \qquad (45)$$
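The two functional-connectivity estimators in (40) and (41) can be sketched in a few lines of NumPy. Here a shared 10 Hz rhythm is planted in two noisy channels; the sampling rate, segment length, and noise level are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
fs, n_seg, seg_len = 256, 32, 256
t = np.arange(n_seg * seg_len) / fs
shared = np.sin(2 * np.pi * 10 * t)                 # common 10 Hz rhythm
x = shared + 0.5 * rng.standard_normal(t.size)
y = shared + 0.5 * rng.standard_normal(t.size)

# Pearson correlation, Eq. (40)
r_xy = np.corrcoef(x, y)[0, 1]

# Magnitude-squared coherence, Eq. (41), from segment-averaged spectra
X = np.fft.rfft(x.reshape(n_seg, seg_len), axis=1)
Y = np.fft.rfft(y.reshape(n_seg, seg_len), axis=1)
Sxy = (X * np.conj(Y)).mean(axis=0)
Sxx = (np.abs(X) ** 2).mean(axis=0)
Syy = (np.abs(Y) ** 2).mean(axis=0)
coh = np.abs(Sxy) ** 2 / (Sxx * Syy)
freqs = np.fft.rfftfreq(seg_len, 1 / fs)
```

With 1 s segments the frequency resolution is 1 Hz, so the coherence peak appears at `freqs[10]` (10 Hz), where the shared rhythm lives, and stays near the chance level elsewhere.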
Effective connectivity transcends the statistical correlations of functional connectivity by quantifying the directed, causal influence one neural system exerts over another, thereby revealing the brain’s intrinsic hierarchical architecture [13]. The theoretical framework for this is primarily divided into predictive approaches, such as Wiener-Granger Causality, which relies on temporal precedence and variance reduction, and generative approaches like DCM, which infers causality by inverting biophysical models of neural masses [240]. A pervasive challenge in EEG-based EC analysis is the volume conduction mentioned previously, where the instantaneous propagation of electrical fields can generate spurious zero-lag interactions [241,242]. Consequently, recent studies emphasise that robust EC estimation should ideally be performed on source-reconstructed signals rather than sensor-level data to mitigate signal leakage and improve spatial validity. While linear estimators like DTF and PDC are standard for identifying frequency-specific information flow, they may fail to capture the brain’s inherent non-linear dynamics. To address this, information-theoretic methods such as Transfer Entropy (TE) [243], which measures the directed nonlinear information flow from Y to X (46), provide a model-free framework for detecting non-linear directed coupling [244]. More recent advancements have introduced robust variants like Phase Transfer Entropy (PTE) and Symbolic Transfer Entropy (STE), which are specifically designed to be resilient against amplitude artefacts and volume conduction effects [245]. These metrics have emerged as critical biomarkers in clinical neurophysiology, successfully characterising pathological circuit disruptions in stroke recovery, Alzheimer’s disease, and psychiatric disorders where standard spectral analyses often fail to detect subtle network failures [246,247,248,249].
Finally, Mutual Information [250] measures the general mutual dependence between two variables (47); the intuition here is that these metrics detect relationships that are not constrained by linearity or Gaussian assumptions and are therefore applicable to complex cognitive processes, judgments, and consciousness studies.
$$TE_{Y \to X} = \sum p\left(x_{t+1}, x_t, y_t\right) \log \frac{p\left(x_{t+1} \mid x_t, y_t\right)}{p\left(x_{t+1} \mid x_t\right)}, \qquad (46)$$
$$MI(X, Y) = \sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)}, \qquad (47)$$
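A minimal plug-in estimate of Equation (47) can be obtained from a joint histogram. The helper below and its bin count are purely illustrative; practical work often prefers k-nearest-neighbour estimators, which are less biased:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20000
x = rng.standard_normal(n)
y = x + rng.standard_normal(n)      # statistically dependent on x
z = rng.standard_normal(n)          # independent of x

def mutual_info(a, b, bins=16):
    """Plug-in (histogram) estimate of Eq. (47), in nats."""
    p_ab, _, _ = np.histogram2d(a, b, bins=bins)
    p_ab = p_ab / p_ab.sum()
    p_a = p_ab.sum(axis=1, keepdims=True)
    p_b = p_ab.sum(axis=0, keepdims=True)
    nz = p_ab > 0
    return float(np.sum(p_ab[nz] * np.log(p_ab[nz] / (p_a @ p_b)[nz])))

mi_dep = mutual_info(x, y)   # clearly positive: y shares information with x
mi_ind = mutual_info(x, z)   # near zero, up to finite-sample bias
```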
Phase methods focus on phase synchronisation: the Phase-Locking Value (PLV) measures the stability of the phase difference (48), the Phase-Lag Index (PLI) excludes zero phase lags to reduce the effect of volume conduction (49), while the weighted PLI (50) and Amplitude Envelope Correlation (AEC) (51) provide variants that emphasise reliable non-zero phase contributions or correlations between amplitude envelopes; these measures are widely used to study phase locking in attention, perception, neurofeedback, and epilepsy. Cross-Frequency Coupling (CFC), and especially Phase-Amplitude Coupling (PAC), examines whether the phase of a low frequency modulates the amplitude of a higher one; this can be measured simply as a phase-amplitude correlation or with Tort’s Modulation Index based on Kullback–Leibler divergence (52), and serves to highlight hierarchical neural interactions (e.g., memory, attention) [251,252,253].
$$PLV = \left| \frac{1}{N} \sum_{n=1}^{N} e^{i\left(\phi_x(n) - \phi_y(n)\right)} \right|, \qquad (48)$$
$$PLI = \left| \frac{1}{N} \sum_{t=1}^{N} \operatorname{sgn}\left( \operatorname{Im}\, e^{i\left(\varphi_j(t) - \varphi_k(t)\right)} \right) \right|, \qquad (49)$$
$$wPLI = \frac{\left| E\left[ \operatorname{Im}\, S_{xy} \right] \right|}{E\left[ \left| \operatorname{Im}\, S_{xy} \right| \right]}, \qquad (50)$$
$$AEC = \operatorname{corr}\left( \mathrm{env}_x(t), \mathrm{env}_y(t) \right), \qquad (51)$$
$$PAC \approx \operatorname{corr}\left( \mathrm{phase}_{\mathrm{low}}(t), \mathrm{amp}_{\mathrm{high}}(t) \right), \qquad MI = \frac{D_{KL}(P \,\|\, U)}{\log N}, \qquad (52)$$
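A bare-bones PLV computation following Equation (48) can be sketched as follows; the Hilbert construction is implemented directly with the FFT, and all signal parameters are illustrative:

```python
import numpy as np

fs, dur = 256, 4
t = np.arange(fs * dur) / fs

def analytic(sig):
    """Analytic signal via the frequency-domain Hilbert construction (even length)."""
    n = sig.size
    S = np.fft.fft(sig)
    h = np.zeros(n)
    h[0] = h[n // 2] = 1.0
    h[1:n // 2] = 2.0
    return np.fft.ifft(S * h)

def phase(sig):
    return np.angle(analytic(sig))

def plv(a, b):
    """Phase-Locking Value, Eq. (48)."""
    return float(np.abs(np.mean(np.exp(1j * (phase(a) - phase(b))))))

x = np.sin(2 * np.pi * 10 * t)
y_locked = np.sin(2 * np.pi * 10 * t - np.pi / 4)   # constant 45-degree lag
y_drift = np.sin(2 * np.pi * 13 * t)                # phase difference drifts

plv_locked = plv(x, y_locked)   # near 1: stable phase relation
plv_drift = plv(x, y_drift)     # near 0: no stable phase relation
```

Note that a constant zero-lag relation would also yield PLV ≈ 1, which is exactly why the PLI and wPLI variants above discard (or down-weight) zero-lag contributions.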
Representing connectivity as a graph G(V, E) allows the calculation of graph indices such as modularity Q (53), global efficiency (54), the clustering coefficient (55), and centrality metrics, which quantify the organisation, efficiency, and roles of nodes in the network. Such indices are applied in resting-state studies and in diseases such as Alzheimer’s [254]. Since connectivity varies over time, time-varying approaches are applied: sliding windows for time-local estimates, or Kalman-based/recursive MVAR models [255] in which the parameters A(t) change over time and are estimated using Bayesian or recursive filters (56); these are crucial for studying dynamic FC, microstates, and task-related changes. The DCM model mentioned previously approaches causality with biophysically parameterised state-space models (57) and Bayesian inversion to estimate the effect of experimental factors on connections; DCM allows comparison of causal structure hypotheses and is widely used in Statistical Parametric Mapping (SPM) for EEG/MEG [173]. Finally, the Phase Synchronisation Index (PSI) measures the stability of the phase difference in general terms (58) and is used to evaluate synchronisation in sensory, motor, and cognitive processes [256]. In all of the above approaches, the choice of metric depends on the background hypothesis (linear vs. nonlinear, statistical vs. causal), sensitivity to volume conduction, the desired spatial/temporal resolution, and the semantics of the study; practical factors such as spectral density estimation, the noise model, normalisation, and correct estimation of hyperparameters are critical for reliable conclusions.
$$Q = \frac{1}{2m} \sum_{i,j} \left[ A_{ij} - \frac{k_i k_j}{2m} \right] \delta\left(c_i, c_j\right), \qquad (53)$$
$$E_{\mathrm{glob}} = \frac{1}{N(N-1)} \sum_{i \ne j} \frac{1}{d_{ij}}, \qquad (54)$$
$$C_i = \frac{2 T_i}{k_i (k_i - 1)}, \qquad (55)$$
$$x(t) = A(t)\, x(t-1) + w(t), \qquad (56)$$
$$\dot{x} = A x + B u + C, \qquad (57)$$
$$PSI = \left| \frac{1}{N} \sum_{n=1}^{N} e^{i\left(\phi_x(n) - \phi_y(n)\right)} \right|, \qquad (58)$$
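The graph indices in (54) and (55) can be verified on a toy network; the six-node graph below (two triangles joined by a single bridge edge) is purely illustrative:

```python
import numpy as np

# Toy binary adjacency matrix: two triangles joined by one bridge edge (2-3).
A = np.zeros((6, 6), dtype=int)
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1

def shortest_paths(A):
    """All-pairs shortest path lengths via Floyd-Warshall on a binary graph."""
    n = A.shape[0]
    D = np.where(A > 0, 1.0, np.inf)
    np.fill_diagonal(D, 0.0)
    for k in range(n):
        D = np.minimum(D, D[:, [k]] + D[[k], :])
    return D

n = A.shape[0]
D = shortest_paths(A)
E_glob = (1.0 / D[~np.eye(n, dtype=bool)]).sum() / (n * (n - 1))   # Eq. (54)

deg = A.sum(axis=1)
T = np.diag(A @ A @ A) / 2                     # triangles through each node
C = 2 * T / (deg * (deg - 1))                  # Eq. (55); all degrees here > 1
```

Nodes inside a triangle but away from the bridge have clustering 1, while the bridge nodes are penalised because their neighbourhoods are only partially interconnected.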

6.5. Nonlinear and Chaotic Analysis

EEG activity is an extremely complex and nonlinear signal that reflects the collective action of large neural populations. Brain dynamics are often characterised by nonlinear interactions, transient states, and chaotic behaviour, which makes purely linear models inadequate for their description [14]. For this reason, a wide range of nonlinear and chaotic analysis methods have been developed, which allow the investigation of the stability, complexity, and predictability of EEG time-series, offering a deeper understanding of the underlying neural dynamics. One of the fundamental measures in chaos theory is the Lyapunov exponent, which describes the average rate of divergence of two neighbouring trajectories in phase space. If two initial states differ by δx(0), the Lyapunov exponent is defined as the logarithm of the ratio of the final to the initial distance, divided by time (59). Positive values of λ indicate sensitivity to initial conditions and therefore chaotic behaviour. In EEG analysis, the presence of positive Lyapunov exponents has been linked to unstable and chaotic neural states, such as in epileptic seizures, while negative or zero values correspond to stable or periodic states, such as sleep or anaesthesia [257,258].
$$\lambda = \lim_{t \to \infty} \frac{1}{t} \ln \frac{\left| \delta x(t) \right|}{\left| \delta x(0) \right|}, \qquad (59)$$
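Equation (59) is easiest to verify on a system whose exponent is known analytically. The sketch below uses the logistic map at r = 4 as a textbook stand-in (not EEG data); its exact Lyapunov exponent is ln 2, so a positive estimate signals chaos:

```python
import numpy as np

# Average log-divergence rate along a trajectory of x -> r x (1 - x).
# For r = 4 the exact Lyapunov exponent is ln 2 (chaotic regime).
r, x = 4.0, 0.3
logs = []
for _ in range(100000):
    logs.append(np.log(abs(r * (1 - 2 * x))))   # ln |f'(x_n)|
    x = r * x * (1 - x)
lam = float(np.mean(logs))
```

For EEG, the exponent is instead estimated from a delay-embedded attractor (e.g., the Rosenstein or Wolf algorithm), since the governing map is unknown.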
While Lyapunov exponents measure the predictability or divergence of trajectories (chaos), Fractal Dimension (FD) measures the complexity or “roughness” of the signal structure and is another fundamental measure of complexity, as it expresses how the detail of a signal changes with the scale of observation. Using the box-counting approach, this dimension is defined as the limit of the ratio of the logarithms of the number of boxes N(ε) required to cover the signal trajectory and their inverse size (60) [259,260]. To investigate the dynamic structure of the EEG, reconstruction of the attractor in phase space is often used, according to Takens’ theorem. Each point of the signal can be represented as a state vector x(t) through the delay τ and the embedding dimension m, as shown in Equation (61). The geometry of the reconstructed attractor can be described by the correlation dimension D2, defined by the correlation function C(r) in Equation (62). Recent studies have solidified the utility of FD as a sensitive biomarker. For instance, resting-state EEG complexity has been shown to trace the trajectory of brain maturation, increasing significantly from childhood to adolescence as neural networks prune redundant connections and optimise integration [261]. Conversely, neurological disorders such as Alzheimer’s Disease, Parkinson’s Disease, and epilepsy manifest as distinct alterations in this complexity, often detectable before standard spectral changes [262,263]. Fractal-based approaches have become essential for capturing the “self-organised criticality” and structural complexity of neural networks that spectral analysis often misses. Two of the most prominent time-domain algorithms are Higuchi’s Fractal Dimension (HFD) [264] and Katz’s Fractal Dimension (KFD) [265].
HFD is widely regarded as the most accurate method for estimating the theoretical complexity of physiological signals, making it highly effective for tracing neurodevelopmental trajectories and analysing short data epochs. However, HFD is highly sensitive to high-frequency noise, so it must be paired with rigorous preprocessing, such as ICA, to prevent artefact-driven inflation of complexity values [266]. In contrast, Katz’s algorithm calculates complexity based on the geometric spread (diameter) of the waveform rather than its path length variance. This approach makes KFD significantly more robust to noise and artefacts, rendering it the preferred choice for ambulatory epilepsy monitoring and real-time applications where signal quality varies. However, this robustness comes at the cost of lower sensitivity to fine-grained physiological changes compared to HFD. Another approach, Sevcik’s Fractal Dimension [267] approximates complexity by normalising the signal into a unit square and calculating the curve length. For applications requiring minimal computational power, such as wearable devices, Petrosian’s Fractal Dimension (PFD) [268] is often preferred. PFD estimates complexity by converting the analogue EEG signal into a binary sequence based on derivative sign changes. For analysing long-range temporal correlations, Multifractal Detrended Fluctuation Analysis (MFDFA) offers a superior framework. MFDFA characterises the spectrum of scaling exponents, allowing researchers to distinguish between intrinsic multifractality and complexity arising from heavy-tailed distributions. This capability makes MFDFA particularly valuable for characterising stable states such as sleep stages and depth of anaesthesia, though it requires longer data epochs (typically > 6 s) to achieve statistical stability [269,270].
The applicability of these methods has expanded significantly in recent years, particularly in BCI and emotion recognition. A benchmarking study demonstrated that no single fractal feature is sufficient; instead, a feature fusion approach combining Katz, Box-counting, and Correlation Dimension achieved a classification accuracy of 79.2% in motor imagery tasks, outperforming state-of-the-art methods [271]. Similarly, “Fractal Spike Neural Networks”, which mimic biological fractal connectivity, have been developed to extract temporal–spectral–spatial features, setting new performance standards for EEG-based emotion recognition on benchmark datasets like SEED-IV [272]. Finally, these nonlinear metrics serve as critical biomarkers for pathology. In neurodegenerative research, a 2024 study introduced Fractal Dimension Distributions (FDD), revealing that a specific reduction in EEG complexity can effectively distinguish Alzheimer’s dementia from subjective cognitive impairment, often providing greater diagnostic sensitivity than mean fractal values alone [273]. In epilepsy, the field has moved toward hybrid prediction models that feed fractal features into DL architectures. These models leverage the temporal evolution of signal complexity to detect the pre-ictal state with higher accuracy than traditional spectral power or isolated nonlinear metrics [274].
$$D = \lim_{\epsilon \to 0} \frac{\log N(\epsilon)}{\log (1/\epsilon)}, \qquad (60)$$
$$\mathbf{x}(t) = \left[ x(t),\, x(t+\tau),\, \ldots,\, x\left(t + (m-1)\tau\right) \right], \qquad (61)$$
$$C(r) = \frac{2}{N(N-1)} \sum_{i<j} H\left( r - \left\| \mathbf{x}_i - \mathbf{x}_j \right\| \right), \qquad D_2 = \frac{d \ln C(r)}{d \ln r}, \qquad (62)$$
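Higuchi’s algorithm, discussed above, is compact enough to sketch directly; the implementation below follows the standard normalisation, and the test signals (white noise versus a slow sine) are illustrative:

```python
import numpy as np

def higuchi_fd(x, k_max=10):
    """Higuchi's estimate of the fractal dimension of a 1-D series."""
    n = len(x)
    lnk, lnL = [], []
    for k in range(1, k_max + 1):
        Lk = []
        for m in range(k):
            idx = np.arange(m, n, k)            # decimated sub-series
            if idx.size < 2:
                continue
            length = np.abs(np.diff(x[idx])).sum()
            norm = (n - 1) / ((idx.size - 1) * k)   # Higuchi normalisation
            Lk.append(length * norm / k)
        lnk.append(np.log(1.0 / k))
        lnL.append(np.log(np.mean(Lk)))
    # Slope of ln L(k) against ln(1/k) is the fractal dimension.
    return float(np.polyfit(lnk, lnL, 1)[0])

rng = np.random.default_rng(3)
t = np.arange(2000)
fd_noise = higuchi_fd(rng.standard_normal(2000))    # near 2 for white noise
fd_sine = higuchi_fd(np.sin(2 * np.pi * t / 200))   # near 1 for a smooth curve
```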
Approximate entropy (ApEn) introduces a statistical measure of signal regularity. It is defined as the difference between the probability that similar patterns of length  m  remain similar when extended by one dimension, as given in Equation (63). Low ApEn values indicate predictable signals, while high values indicate greater complexity. Sample Entropy (SampEn) improves ApEn by excluding self-correlations and is defined as the negative logarithm of the ratio of probabilities A and B of two similar patterns of length  m + 1  and  m  (64). SampEn is less biased and more reliable in short EEG sequences, which is why it is used in studies of sleep, consciousness, and cognitive load. Multiscale Sample Entropy (MSE) extends SampEn by calculating entropy at different time levels through “coarse-graining” of the signal. Each new sequence is obtained as the average of the values within windows of length  τ  according to Equation (65). In this way, MSE provides a profile of complexity at multiple time scales, while the Refined Composite MSE (RCMSE) version offers improved stability for short EEG time-series [275]. Kolmogorov entropy (K) links chaos theory with information theory, expressing the rate at which new information is generated in a dynamic system. It is defined as the sum of all positive Lyapunov exponents, as shown in Equation (66). High  K  values correspond to increased chaos and rapid loss of predictability, characteristics that have been observed in EEG states of increased neural excitation. Permutation entropy (PE) offers a quick and efficient assessment of EEG complexity. It converts the time series into permutation sequences and calculates the Shannon entropy of their probabilities, as given in Equation (67). This measure is particularly robust to noise and has been used extensively in applications such as seizure detection and anaesthesia monitoring. 
TE mentioned before measures the nonlinear, directed flow of information between two time series and is defined as the conditional information difference between probability distributions. Unlike Granger’s Linear Causality, TE does not assume linearity or normality and has been widely used to study the effective connectivity of the brain [276,277].
$$ApEn(m, r, N) = \phi^m(r) - \phi^{m+1}(r), \qquad (63)$$
$$SampEn(m, r, N) = -\ln \frac{A}{B}, \qquad (64)$$
$$y_j^{(\tau)} = \frac{1}{\tau} \sum_{i=(j-1)\tau + 1}^{j\tau} x_i, \qquad (65)$$
$$K = \sum_{\lambda_i > 0} \lambda_i, \qquad (66)$$
$$PE = -\sum_{i=1}^{n!} p_i \ln p_i, \qquad (67)$$
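Permutation entropy per Equation (67) can be sketched in a few lines; the implementation below normalises by log m! so the result lies in [0, 1], and the embedding parameters are common defaults rather than values prescribed in the text:

```python
import numpy as np
from math import factorial

def permutation_entropy(x, m=3, tau=1):
    """Normalised permutation entropy (Eq. 67) of a 1-D series."""
    n = len(x) - (m - 1) * tau
    # Ordinal (Bandt-Pompe) pattern of each length-m window.
    patterns = np.array([np.argsort(x[i:i + (m - 1) * tau + 1:tau])
                         for i in range(n)])
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log(p)).sum() / np.log(factorial(m)))

rng = np.random.default_rng(4)
pe_noise = permutation_entropy(rng.standard_normal(5000))   # irregular: near 1
pe_ramp = permutation_entropy(np.arange(5000.0))            # monotone: 0
```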
Another fundamental measure is the Hurst exponent (H), which quantifies the long-term dependence and self-similarity of the signal. The relationship between the R/S ratio and the sample size N follows Equation (68). H values greater than 0.5 indicate persistent behaviour, while values less than 0.5 correspond to mean-reverting processes. The measurement of H has been applied in the analysis of fatigue, ageing, and cognitive levels [278,279]. Finally, Recurrence Quantification Analysis (RQA) is based on the creation of a recurrence plot, i.e., a matrix that records when the system’s trajectory returns to previous states. This matrix is defined in (69), where Θ is the Heaviside function and ε is the proximity radius. From it, indices such as the recurrence rate, determinism, and laminarity are derived, which reveal transitions between different brain states [280,281,282]. Overall, the above nonlinear and chaotic methods provide a multidimensional approach to the study of brain activity.
$$\left( \frac{R}{S} \right)_N \propto N^H, \qquad (68)$$
$$R_{ij} = \Theta\left( \epsilon - \left\| \mathbf{x}_i - \mathbf{x}_j \right\| \right), \qquad (69)$$
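The rescaled-range law (68) yields the Hurst exponent as a log-log slope. A minimal sketch follows; the window sizes are illustrative, and the small-sample bias of the R/S statistic (which classical corrections address) is ignored here:

```python
import numpy as np

def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
    """Hurst exponent from the rescaled-range law (R/S)_N ~ N^H."""
    log_n, log_rs = [], []
    for w in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - w + 1, w):
            seg = x[start:start + w]
            dev = np.cumsum(seg - seg.mean())   # cumulative deviation profile
            R = dev.max() - dev.min()           # range of the profile
            S = seg.std()
            if S > 0:
                rs_vals.append(R / S)
        log_n.append(np.log(w))
        log_rs.append(np.log(np.mean(rs_vals)))
    return float(np.polyfit(log_n, log_rs, 1)[0])   # slope = H

rng = np.random.default_rng(5)
h_noise = hurst_rs(rng.standard_normal(4096))   # white noise: H near 0.5
```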

6.6. Machine Learning and Deep Learning

The analysis of EEG signals has been reshaped in the last half-decade. The manual feature engineering that defined the late 20th century paved the way for sophisticated data-driven approaches better suited to dynamic human brain signals [283]. A plethora of Machine Learning (ML) algorithms have been used, and the choice of classifier depends heavily on the dataset size and the requirement for interpretability. Support Vector Machines (SVMs) remain among the most robust classifiers for the “small data” regimes that characterise many BCI pilot studies (where N < 20 subjects) [284]. For offline analysis where computational speed is less critical, ensemble methods like Random Forest and Gradient Boosting (XGBoost, LightGBM) have seen increased use [285,286]. While conventional ML operates in Euclidean space, a significant theoretical advancement in the 2020–2025 period has been the widespread adoption of Riemannian Geometry [287]. This approach represents a “middle ground” between classical ML and DL: it is mathematically rigorous and highly accurate, yet does not require the massive parameter counts of a neural network. For manifold-based classification, Riemannian methods represent each trial by its spatial covariance matrix, which lies on the manifold of symmetric positive-definite matrices, and utilise the Affine-Invariant Riemannian Metric (AIRM) to calculate distances on this manifold. The geodesic distance (the shortest path along the manifold) between two covariance matrices provides a much more robust measure of similarity than the Euclidean distance [288,289]. The most successful implementation of this is the Tangent Space Mapping (TSM) approach. This Riemannian-geometry-based pipeline currently holds state-of-the-art accuracy for motor imagery BCI tasks on small datasets, outperforming complex Convolutional Neural Networks when training data is limited. Techniques like Riemannian Procrustes Analysis (RPA) allow researchers to geometrically align the data manifolds of different subjects [290,291,292].
By translating, rotating, and scaling the covariance matrices of a new user to match the “average user” manifold, models can be trained on a database of existing subjects and applied to a new subject with minimal calibration. This capability is critical for the development of “plug-and-play” BCI systems.
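The tangent-space mapping described above can be sketched with plain eigendecompositions. This is a minimal sketch only: it anchors the projection at the Euclidean mean of the covariance set for brevity, whereas production pipelines (e.g., pyRiemann) typically use the Riemannian (geometric) mean, and the 4-channel trial covariances below are synthetic:

```python
import numpy as np

def spd_logm(P):
    """Matrix logarithm of a symmetric positive-definite matrix via eigh."""
    w, V = np.linalg.eigh(P)
    return (V * np.log(w)) @ V.T

def spd_invsqrtm(P):
    """Inverse matrix square root of an SPD matrix."""
    w, V = np.linalg.eigh(P)
    return (V * w ** -0.5) @ V.T

def tangent_space(covs):
    """Whiten each covariance by a reference point, then log-map to a vector."""
    C_ref = covs.mean(axis=0)          # Euclidean mean as reference (see lead-in)
    iS = spd_invsqrtm(C_ref)
    iu = np.triu_indices(covs.shape[1])
    return np.array([spd_logm(iS @ C @ iS)[iu] for C in covs])

rng = np.random.default_rng(6)
# Hypothetical trial covariances from 4-channel, 100-sample EEG epochs.
covs = []
for _ in range(10):
    X = rng.standard_normal((4, 100))
    covs.append(X @ X.T / 100)
feats = tangent_space(np.stack(covs))
```

Each row of `feats` (the upper triangle of the log-mapped matrix) can then be fed to any Euclidean classifier, such as LDA or logistic regression.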
The fundamental advantage of DL is end-to-end learning: the model learns the feature extraction steps itself, without the need for manual selection of frequency bands or entropy measures. CNNs are the workhorse of modern EEG analysis, used in over 90% of DL studies up to 2023 [283]. EEGNet [293] is a lightweight architecture that uses depthwise separable convolutions. It first applies a temporal convolution (acting as a frequency filter) and then a spatial convolution (acting as a spatial filter). DeepConvNet [294] and ShallowConvNet [295] use varying depths to capture different levels of abstraction. ShallowConvNet is designed to extract logarithmic band-power features, while DeepConvNet can learn more complex, hierarchical representations. Multi-Branch CNNs [296] process the raw signal in parallel streams with different kernel sizes (e.g., small kernels for gamma waves, large kernels for delta waves). The branches are then fused, allowing the model to capture multi-scale temporal dynamics simultaneously. While CNNs excel at extracting local features, they struggle with long-term dependencies (e.g., a sleep cycle lasting 90 min). To address this, Long Short-Term Memory (LSTM) networks and Gated Recurrent Units (GRUs) are employed. The standard architecture is the C-RNN (Convolutional-Recurrent Neural Network) [297,298,299]. In this hybrid model, a CNN front-end extracts spatial–spectral features from short EEG chunks (e.g., 1 s). These feature vectors are fed as a sequence into an LSTM back-end, which captures the temporal evolution of the brain over longer periods (e.g., 30 s). The introduction of the attention mechanism has refined EEG models. For instance, channel attention can be utilised so the model learns which electrodes are most relevant for the current task and suppresses noise from irrelevant channels.
Temporal attention makes the model focus on the specific time segments containing the event of interest while ignoring the background EEG activity. Now, attention blocks are routinely inserted into CNN architectures (e.g., SE-Net blocks) to improve performance and interpretability [300]. Large Vision Models (LVMs) are being applied to EEG by treating time-frequency or spatial EEG representations as images. For example, spectrograms from intracranial EEG (iEEG) were fed to a Vision Transformer (ViT) for seizure detection [301]. This ViT achieved 96.8% accuracy (5-fold CV) on a large iEEG dataset and outperformed a ResNet-50 and standard CNN baseline. Similarly, hybrid models fuse CNNs and ViTs: the CMFViT model converts scalp EEG into wavelet-derived images and processes them with parallel CNN and ViT streams. On the CHB-MIT seizure corpus and a large Kaggle epilepsy set, CMFViT achieved ~98.9% accuracy in single-subject tests, outperforming many prior methods [302]. These vision-based approaches capture both local features (via CNN) and global temporal context (via self-attention) from EEG.
One of the most theoretically grounded advancements in recent years is the application of Graph Neural Networks (GNNs) [303]. GNNs treat the electrodes as nodes in a graph, connected by edges representing their relationship, taking advantage of the actual 3D geometry of the head. In static graphs, edges are defined by physical Euclidean distance (e.g., Fp1 is connected to Fp2 but not O1). In functional graphs, edges are defined by functional connectivity measures (e.g., Phase Locking Value or Coherence). If two regions oscillate in sync, they are connected, regardless of physical distance. The most advanced GNNs employ Dynamic Graph Learning. Since functional connectivity in the brain is not static but evolves on a millisecond timescale, these models learn the adjacency matrix from the data itself at each time step. The network dynamically strengthens or weakens connections between nodes, effectively learning the evolving functional network of the brain during a cognitive task. GNNs have demonstrated superior performance in Emotion Recognition, a task heavily dependent on the functional interplay between frontal and temporal lobes. By explicitly modelling these interactions, GNNs capture the global network patterns of emotion that local CNN filters miss [304,305,306].
Transformer-based language models (LLMs) are being leveraged to interpret EEG as if it were a form of “language” [307,308]. A 2025 survey [19] categorises recent work on LLMs and EEG into (1) foundation models (e.g., EEG-pretrained transformers), (2) EEG-to-language decoding, (3) cross-modal generation (images, 3D objects), and (4) clinical applications and dataset management tools. For instance, LaBraM [16] addresses the heterogeneity of EEG data (different datasets use different electrode caps). It segments signals into “EEG channel patches” and uses a Neural Tokenizer trained via Vector-Quantised Neural Spectrum Prediction. It is pre-trained on approximately 2500 h of data using a Masked Modelling objective (like BERT). Random patches of the EEG are masked, and the model must reconstruct the missing neural codes. This forces the model to learn the grammar of neural dynamics. Similarly, BrainBERT [309] and NeuroBERT [310] utilise masked auto-encoding on intracranial and scalp EEG, respectively. These foundation models aim to create a “universal embedding” space for brain signals. A key capability of these models is Cross-Subject Generalisation. Because they are trained on thousands of individuals, they learn features that are invariant to the anatomical differences between subjects, addressing the “calibration problem” that has plagued BCI for decades. In practice, researchers have fine-tuned models like GPT or BERT to decode EEG signals into text or semantic labels. For instance, prefix-tuning methods (BELT-2) align EEG embeddings with a GPT-style decoder to translate imagined speech EEG into words [311]. Language models have also been used to assign rich interpretations to EEG: one approach uses LLaMA-2 as an intermediary to guide EEG-to-image generation via a diffusion pipeline, effectively translating EEG features into visual reconstructions of perceived stimuli [312]. 
In application domains, LLM-based EEG systems have tackled emotion recognition, mental-health screening, and motor imagery. Lightweight “EEG Emotion Copilot” systems use a 0.5B-parameter transformer to classify affective states and even generate personalised feedback [313]. In mental-health, GPT-4 was adapted to classify EEG records for schizophrenia diagnosis, producing results aligned with clinical patterns. Beyond decoding, LLMs assist data management; for example, EEGUnity uses an LLM to parse, clean, and standardise datasets and reports across heterogeneous EEG sources [314]. Overall, transformer models bring powerful sequence modelling and semantic understanding to EEG tasks, enabling natural-language and multimodal interpretations of brain signals.
The scarcity of labelled EEG data has motivated the use of generative models (variational autoencoders (VAEs), GANs, diffusion models) to create synthetic EEG and improve training [15]. A recent survey notes that EEG generative methods based on VAEs, GANs, and diffusion models have been widely explored to augment BCI datasets [315]. For example, conditional VAE-GAN hybrids have been trained on motor-imagery EEG: one study reported a CVAE-GAN achieving the best inception and SWD scores on synthetic EEG samples, and improving downstream MI classification accuracy over CNN or standard GAN baselines [316]. Other works transform EEG features (e.g., differential entropy) into image-like inputs for generative models (e.g., VAE-D2GAN) to augment emotion-recognition data [317]. This scarcity of high-quality labelled data is especially acute for rare pathologies and remains a major bottleneck. Denoising Diffusion Probabilistic Models (DDPMs) are now used to generate synthetic EEG signals. The model learns the data distribution by reversing a gradual noise-addition process: starting from pure Gaussian noise, it iteratively “denoises” the signal to produce a realistic EEG waveform. Recent studies confirm that this synthetic data preserves the spectral and temporal characteristics of real EEG (e.g., Visual Evoked Potentials) and can be used to train classifiers, improving accuracy [318,319,320]. DreamDiffusion [321] is a framework that uses a Stable Diffusion backbone but replaces the text encoder with a pre-trained EEG encoder. It generates images directly from EEG signals recorded while a user looks at a picture. NeuroDM [322] uses an EEG-Visual-Transformer (EV-Transformer) to extract visual-semantic features from the EEG. These features then guide a diffusion model (EEG-Guided Diffusion) to reconstruct the image. NeuroDM has achieved state-of-the-art results, generating images that not only match the category (e.g., “dog”) but also the structural composition of the stimulus seen by the user.
Beyond generating images, diffusion is used for feature learning. EEGDM [323] employs a Structured State-Space Model (SSM) within a diffusion framework. By training the model to denoise EEG signals, the SSM captures the complex temporal dynamics of the brain. The latent representations learned during this process are then fused and used for highly accurate classification of seizures, outperforming standard Transformer baselines.
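The forward (noising) half of a DDPM has a closed form that is easy to sketch. The linear β schedule and all sizes below are conventional illustrative choices, not parameters from the cited EEG studies; a real DDPM would additionally train a network to reverse this process:

```python
import numpy as np

rng = np.random.default_rng(7)

# Linear noise schedule for a toy DDPM forward process (assumed hyperparameters).
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alpha_bar = np.cumprod(1.0 - betas)

def q_sample(x0, t):
    """Closed-form forward diffusion: x_t = sqrt(a_bar_t) x0 + sqrt(1 - a_bar_t) eps."""
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps

x0 = np.sin(2 * np.pi * 10 * np.arange(256) / 256)   # clean 1 s "EEG" epoch at 256 Hz
x_early = q_sample(x0, 10)    # early step: mostly signal
x_late = q_sample(x0, 999)    # final step: nearly pure Gaussian noise
```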
Self-Supervised and Contrastive Learning are emerging paradigms used to further reduce reliance on labelled data [324,325]. Contrastive learning trains a model to recognise that two augmented versions of the same EEG segment are “similar” (positive pair) while distinguishing them from other segments (negative pairs). The challenge in EEG is defining “augmentation”, because, unlike an image, an EEG signal cannot be rotated or cropped without destroying information. Signal-transformation contrastive learning applies physiologically plausible augmentations such as permutation of symmetric channels, time-warping, or adding band-limited noise. Frameworks like MoCo [326] allow for massive queues of negative samples, enabling the model to learn highly discriminative features without a single label. Contrastive Predictive Coding (CPC) trains the model to predict the future latent representation of the signal based on the past. This forces the model to encode information that is slow-varying and predictive (like the underlying brain state) while discarding fast-varying noise [327].
A new trend is integrating EEG with other modalities using multimodal AI frameworks. These include combining EEG and eye tracking, video, or text in unified models. For video understanding, the “CognitionCapturer” framework maps EEG into CLIP’s image embedding space: separate “modality expert” encoders extract EEG features, then a diffusion prior aligns them with CLIP image features [328]. A pretrained image generator then reconstructs the viewed image with high fidelity, all without end-to-end fine-tuning. Similarly, the Hypergraph Multimodal LLM (HMLLM) was designed to analyse viewers’ EEG and eye-tracking data while watching videos. By treating EEG, gaze, and video context as nodes in a hypergraph, HMLLM learns associations across modalities and demographics to evaluate video understanding [329]. Another example is EEG-CLIP, which uses contrastive learning to align EEG snippets with clinical text reports in a shared embedding space, enabling zero-shot EEG classification and retrieval of related medical notes [330]. Adapter-based systems even transform EEG into either pseudo-text or images on-the-fly, so that existing LLMs or ViTs can process them [331]. In affective and clinical settings, multimodal models fuse EEG with facial expression, audio, or physiological signals for richer emotion/mental-state estimation [332].
As EEG models become more complex, explainability is a critical concern. Consequently, many XAI techniques have been adapted to EEG. Backpropagation-based saliency methods (Grad-CAM, GuidedGradCAM, etc.) are commonly used to highlight important EEG channels or time segments [333]. However, not all methods are reliable: a recent empirical study found that standard gradient saliency maps are not model-specific for EEG, whereas the DeepLift method consistently identified the true contributing features (temporal, spatial, spectral) even after model randomisation [334]. Methods like SHAP and LRP are now routinely applied to EEG models [335,336]. They generate heatmaps showing which time points and electrodes contribute to the prediction. SHERPA [337] uses XAI not just for validation but for discovery. By calculating the “importance score” of every point in an ERP, SHERPA has identified subtle cognitive processes (like negative selection mechanisms) that standard statistical analysis missed. The “averagedLIME” method aggregates LIME explanations over multiple EEG trials to extract common saliency patterns. In two case studies it revealed coherent, neurophysiologically plausible regions that individual Grad-CAM or SHAP maps had missed. Compared to single-sample XAI, averagedLIME produced more consistent, generalisable explanation maps across conditions [338]. For clinical EEG, attention maps and saliency highlights have been used to validate that models base decisions on meaningful waveform features. For motor-imagery BCIs, relevance maps can show which electrodes or time-windows are most discriminative. In affective computing, explaining why an EEG sample was classified as “anger” vs. “calm” can help ensure the system is using known spectral markers of emotion [339,340,341]. Overall, explainability techniques are now being prioritised in EEG research to make AI models’ outputs interpretable and clinically meaningful. 
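Occlusion analysis, one of the simplest model-agnostic saliency schemes in the family discussed above, can be sketched directly. The "model" below is a deliberately transparent linear readout (purely illustrative, not a trained network), so the recovered channel importances can be checked against ground truth:

```python
import numpy as np

rng = np.random.default_rng(8)

# Toy "model": a fixed linear readout that only reads channels 2 and 5,
# so the ground-truth saliency is known in advance.
w = np.zeros(8)
w[2], w[5] = 1.0, -1.0

def model(x):
    """Score an (n_channels, n_samples) epoch with a linear readout."""
    return float(w @ x.mean(axis=1))

x = rng.standard_normal((8, 128))
base = model(x)

# Occlusion saliency: zero out one channel at a time, record the score change.
importance = np.empty(8)
for ch in range(8):
    x_occ = x.copy()
    x_occ[ch] = 0.0
    importance[ch] = abs(base - model(x_occ))
```

The same loop generalises to occluding time windows or time-frequency patches, which is how the channel/segment heatmaps mentioned above are typically produced for deep models.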
Figure 6 summarises the diverse methods applied to EEG data.

7. Future Trajectory and Evolution

EEG analysis is rapidly advancing toward integrated approaches that combine multimodal data, real-time edge computing, improved algorithm interpretability, and solutions to ethical issues. These efforts aim to bridge the gap between EEG’s excellent temporal resolution and its limited spatial information and to move neuroscientific discoveries from the lab into practical, everyday use. The goal is to guarantee responsible, equitable, and safe use of neurotechnologies. Integrating other imaging modalities into EEG analysis combines the millisecond neuronal fluctuations recorded by EEG with the high spatial precision of fMRI and the complementary sensitivity of MEG. Successful fusion requires careful safety and artefact management. The integration of other biomarkers (fNIRS, EMG, ECG, GSR, eye-tracking, accelerometers) in multimodal frameworks provides complementary information on neurophysiological and psychophysiological states [342,343,344,345]. The transition to real-time applications at the edge faces latency, privacy, and energy-consumption challenges. Classic algorithms with a small footprint (LDA, linear SVM) remain useful, while architectures such as EEGNet show that deep models can be sufficiently compressed (pruning, quantisation) to run on ARM processors via TensorFlow Lite Micro or ONNX Runtime Mobile [346]. Design practices for the edge include stream-oriented buffer management with sliding windows, incremental feature updates to avoid re-computation, event-driven subroutine activation to save energy, and dynamic online model adaptation with trust mechanisms to prevent catastrophic forgetting. Local processing limits the exposure of raw neural data, while encrypted channels and limited transmission of fused features ensure privacy.
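The incremental-feature idea above can be illustrated with a constant-time sliding-window power update; the class and its interface are hypothetical, for illustration only:

```python
import numpy as np

# Incremental sliding-window mean power: O(1) work per new sample instead of
# recomputing over the whole window, as suits on-device streaming EEG.
class SlidingPower:
    def __init__(self, window):
        self.buf = np.zeros(window)   # circular buffer of recent samples
        self.i = 0
        self.acc = 0.0                # running sum of squares in the window

    def update(self, sample):
        old = self.buf[self.i]
        self.acc += sample * sample - old * old   # swap oldest contribution
        self.buf[self.i] = sample
        self.i = (self.i + 1) % len(self.buf)
        return self.acc / len(self.buf)

rng = np.random.default_rng(9)
stream = rng.standard_normal(1000)
sp = SlidingPower(64)
for s in stream:
    p = sp.update(s)   # p always equals the mean power of the last 64 samples
```

On embedded targets the same pattern applies to band-power features by first running the stream through a fixed IIR band-pass filter.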
Recent years have seen rapid advances in applying cutting-edge AI models to EEG data. Generative models have been used to synthesise realistic EEG signals for data augmentation, while large pretrained language and vision models are being adapted to neural decoding. Multimodal frameworks combine EEG with other inputs (e.g., images, video, text), and XAI techniques are increasingly emphasised to interpret model decisions. Interpretability is a key prerequisite for clinical trust and responsible implementation; strategies include integrating attention mechanisms into networks to visualise temporal/spatial "zones" of importance and applying Grad-CAM to spectrograms, while methods such as SHAP and fidelity tests (retraining on top-ranked features) assess the reliability of explanations. Finally, the ethical and social consequences require careful policy: strong privacy measures (local preprocessing, encryption, retention limitation), dynamic and informed consent, and algorithm auditing to mitigate bias arising from disparities in signal quality. In applications such as workplace monitoring or educational assessment, rules are needed that prevent coercive use or stigmatisation. It is evident that data ownership and sharing agreements need to be carefully controlled to strike a balance between the benefits of open science and the rights of participants [347]. The field of EEG analysis is poised to transition from task-specific DL architectures to Universal Brain Foundation Models. Future methodologies will likely centre on calibration-free systems, where massive pre-training on thousands of hours of diverse neural data, exemplified by Large EEG Models (LEMs), effectively eliminates the historic bottleneck of subject-specific training sessions. Additionally, the integration of Generative Diffusion Models is expected to evolve from simple data augmentation to sophisticated Neural Decoding, enabling the direct, high-fidelity reconstruction of visual and semantic content from brain activity.
This paradigm shift will be anchored by advanced XAI frameworks, which are critical for decoding the “black box” of these foundation models to discover novel neurophysiological biomarkers, thereby bridging the gap between computational power and clinical trust.
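A fidelity test of the kind mentioned above, retraining on top-ranked features and checking that accuracy is retained, can be sketched as follows. The synthetic data and nearest-centroid classifier are assumed purely for illustration and do not correspond to any model from the cited studies.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "features": 2 informative columns out of 10 (assumed setup)
n = 200
y = rng.integers(0, 2, n)
X = rng.standard_normal((n, 10))
X[:, 3] += 2.0 * y                      # informative feature
X[:, 7] -= 2.0 * y                      # informative feature
Xtr, Xte, ytr, yte = X[:150], X[150:], y[:150], y[150:]

def nearest_centroid_acc(Xtr, ytr, Xte, yte):
    # Minimal stand-in classifier: assign each sample to the nearer centroid
    c0, c1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
    pred = (np.linalg.norm(Xte - c1, axis=1)
            < np.linalg.norm(Xte - c0, axis=1)).astype(int)
    return float((pred == yte).mean())

# Importance score from the training set: absolute class-mean difference
imp = np.abs(Xtr[ytr == 1].mean(0) - Xtr[ytr == 0].mean(0))
top_k = np.argsort(imp)[-2:]            # keep the 2 highest-ranked features

acc_full = nearest_centroid_acc(Xtr, ytr, Xte, yte)
acc_top = nearest_centroid_acc(Xtr[:, top_k], ytr, Xte[:, top_k], yte)
print(acc_full, acc_top)  # a faithful ranking retains accuracy on top features
```

If the importance ranking were unfaithful, dropping the supposedly unimportant features would degrade accuracy noticeably; comparing `acc_full` and `acc_top` makes that check quantitative.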
Overall, the future trajectory of EEG analysis is expected to shift toward more advanced, all-in-one solutions that are faster, more user-friendly, and ethically sound. Combined with better sensors and processing devices, this will enable a responsible transition from the laboratory to a wide range of applications.

Author Contributions

Conceptualisation, C.K., K.T. and S.M.; writing—original draft preparation, C.K.; writing—review and editing, C.K., K.T. and S.M.; visualisation, C.K.; supervision, K.T. and S.M. All authors have read and agreed to the published version of the manuscript.

Funding

The publication fees of this manuscript have been financed by the Research Council of the University of Patras.

Data Availability Statement

No new data were created or analysed in this study.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ACNS: American Clinical Neurophysiology Society
ADC: Analogue-to-digital converter
ADHD: Attention-deficit/hyperactivity disorder
AEC: Amplitude envelope correlation
AHP: Afterhyperpolarisation
AIC: Akaike information criterion
AIRM: Affine invariant Riemannian metric
ApEn: Approximate entropy
ARIMA: Autoregressive integrated moving average
ARMA: Autoregressive moving average
ARMAX: Autoregressive moving average with exogenous inputs
ASLT: Adaptive superlet transform
ASR: Artefact subspace reconstruction
BCI: Brain–computer interface
BIC: Bayesian information criterion
BIDS: Brain imaging data structure
BLE: Bluetooth low energy
CFC: Cross-frequency coupling
CMRR: Common mode rejection ratio
CNN: Convolutional neural network
ConceFT: Concentration of frequency and time
CPC: Contrastive predictive coding
CTFT: Continuous time Fourier transform
CWT: Continuous wavelet transform
DCM: Dynamic causal modelling
DDPM: Denoising diffusion probabilistic model
DFT: Discrete Fourier transform
DL: Deep learning
DTF: Directed transfer function
DWT: Discrete wavelet transform
ECG: Electrocardiogram
ECoG: Electrocorticography
EEG: Electroencephalography
EGD: Esophagogastroduodenoscopy
EM: Expectation-maximisation
EMD: Empirical mode decomposition
EOG: Electrooculography
EPSP: Excitatory postsynaptic potential
ERP: Event-related potential
EV-Transformer: EEG-visual-transformer
FARIMA: Fractional autoregressive integrated moving average
FD: Fractal dimension
FDA: Food and Drug Administration
FDD: Fractal dimension distribution
FFT: Fast Fourier transform
fMRI: Functional magnetic resonance imaging
fNIRS: Functional near-infrared spectroscopy
FOOOF: Fitting oscillations and one-over-f
GABA: Gamma-aminobutyric acid
GAN: Generative adversarial network
GARCH: Generalised autoregressive conditional heteroskedasticity
GNN: Graph neural network
GPS: Global positioning system
GSR: Galvanic skin response
H: Hurst exponent
HFD: Higuchi's fractal dimension
HFO: High-frequency oscillation
HMLLM: Hypergraph multimodal large language model
HMM: Hidden Markov model
ICA: Independent component analysis
iEEG: Intracranial electroencephalography
IF: Instantaneous frequency
IFCN: International Federation of Clinical Neurophysiology
IMF: Intrinsic mode function
IMU: Inertial measurement unit
IPSP: Inhibitory postsynaptic potential
IRASA: Irregular resampling auto-spectral analysis
IVA: Independent vector analysis
K: Kolmogorov entropy
KFD: Katz's fractal dimension
LEM: Large EEG model
LFP: Local field potential
LLM: Large language model
LORETA: Low resolution electromagnetic tomography
LSTM: Long short-term memory
LVM: Large vision model
MA: Moving average
MCS: Minimally conscious state
MEG: Magnetoencephalography
MFDFA: Multifractal detrended fluctuation analysis
ML: Machine learning
MNE: Minimum-norm estimate
MSE: Multiscale sample entropy
MVAR: Multivariate vector autoregressive
NMF: Non-negative matrix factorisation
PAC: Phase-amplitude coupling
PCA: Principal component analysis
PCI: Perturbational complexity index
PDC: Partial directed coherence
PE: Permutation entropy
PFD: Petrosian's fractal dimension
PLI: Phase-lag index
PLV: Phase-locking value
PSD: Power spectral density
PSI: Phase synchronisation index
PTE: Phase transfer entropy
PTP: Precision time protocol
RCMSE: Refined composite multiscale sample entropy
RPA: Riemannian Procrustes analysis
RQA: Recurrence quantification analysis
SampEn: Sample entropy
sEEG: Stereotactic electroencephalography
SET: Synchroextracting transform
SLT: Superlet transform
SNR: Signal-to-noise ratio
SPM: Statistical parametric mapping
SSM: Structured state-space model
SST: Synchrosqueezing transform
STE: Symbolic transfer entropy
STFT: Short-time Fourier transform
ST: Stockwell transform (S-transform)
SVM: Support vector machine
TBI: Traumatic brain injury
TE: Transfer entropy
TFD: Time-frequency distribution
TFR: Time-frequency representation
TMS: Transcranial magnetic stimulation
TSM: Tangent space mapping
TTL: Transistor-transistor logic
UWS: Unresponsive wakefulness syndrome
VAE: Variational autoencoder
VAR: Vector autoregressive
WPD: Wavelet packet decomposition
XAI: Explainable artificial intelligence

Appendix A

Just as water can flow to lower potential energy levels and produce work, so too do the individual electrical charges on the two sides of the neuronal membrane have the potential to move and create an electric current, provided they are given a path for discharge. If the membrane were completely permeable to ions, this stored energy would be converted into kinetic energy of the ion flow, and the charge difference would be neutralised. However, when the membrane is impermeable, the excess charge has no outlet and the energy remains stored. The size of this storage depends both on the size of the charge difference and on properties of the membrane such as its thickness and surface area. This charge-storing capability is known as the capacitance C; the behaviour of an ideal capacitor is described by the basic Equation (A1), relating the stored charge Q to the voltage V_m. Its time derivative yields Equation (A2), which describes how the voltage changes as charge redistributes.
Neurons, however, are not simple ideal capacitors. The membrane is not completely impermeable; it contains specialised protein channels that open and close, allowing the flow of specific ions. These ion channels can be thought of as resistors operating in parallel with the capacitor. When ions cross the membrane through these channels, electrical currents are created that change the charge distribution. Since current is defined as the rate of change in electrical charge, we can rewrite the capacitor equation so that the right-hand side corresponds to the sum of all the ionic currents that pass through the membrane. This Equation (A3) is the foundation for describing neuronal dynamics and relates the rate of change in membrane voltage to the various currents.
An action potential is a short, intense electrical impulse that results mainly from a coherent sequence of sodium (Na+) and potassium (K+) ion flows through channels in the membrane. It begins when the membrane voltage slightly exceeds its resting level and crosses a certain threshold. This depolarisation activates sodium channels, allowing positive charge to enter the cell, which causes further depolarisation (positive feedback). After a while, the sodium channels inactivate, stopping the influx, while potassium channels open, allowing K+ to flow out and returning the membrane voltage to its resting level. The entire process takes place within a few milliseconds. The rate of ion flow through the channels depends predominantly on two factors: the number of open channels and the driving force acting on the ions. The driving force comes from two physical processes: (1) diffusion, which is the tendency of particles to move from areas of high concentration to areas of low concentration, and (2) electric force, which is the attraction or repulsion between charged particles. These two forces often counteract each other. For any given concentration ratio, there is a specific value of the membrane voltage, the equilibrium potential E_eq, at which there is no net flow of ions. The ionic current for each ion species is proportional to the difference between the actual membrane voltage and the equilibrium potential; the constant of proportionality is the conductance g_ion, which reflects how many channels are open and how easily ions flow. Equation (A4) expresses Ohm's law for biological channels.
Many ion channels are voltage-gated, meaning that their conductance varies with the membrane voltage. Consequently, we often express it as the product of two terms (A5): the maximum conductance ḡ_ion, which corresponds to the case when all channels are open, and the fraction of open channels p ∈ [0, 1]. Ion channels consist of gates, structural units that carry charged residues capable of detecting changes in membrane potential and responding to them. Each gate is in either an open or a closed state; if n is the probability of a gate being open and α, β are the voltage-dependent rates of transition to the open and closed states, respectively (which can be determined empirically), then the time evolution of this probability is described by the differential Equation (A6). If a channel has four identical gates, as the K+ channel does, the probability that all of them are open is n⁴, and therefore its conductance is described by Equation (A7) and its current by Equation (A8). The Na+ channel, in contrast, consists of two different kinds of gates: three activation gates (the "m" gates), which open in response to depolarisation, and an inactivation gate (the "h" gate), which closes shortly afterwards to terminate the flow of ions. Hence, we can write Equations (A9) and (A10) for the sodium channel. Each of the three gating variables follows a differential equation of the form (A11). Equation (A3) can be written as (A12) if we account for a "leak current" through permanently open channels, such as Cl− channels. However, neurons are intricately structured and have non-homogeneous potential profiles. On that account, a neuron is modelled as interconnected cylindrical compartments, with Hodgkin–Huxley equations for each compartment and cable equations coupling them.
This leads to a system of coupled differential equations (A13), in which the (D/4R_a)·∂²V_m/∂x² term (from cable theory, where D is the diameter of the axon and R_a the specific resistance of the cytoplasm) describes how the electrical signal propagates along the neuron's axon, like water flowing through a pipe; the C·∂V_m/∂t term represents the current used to charge or discharge the cell membrane; and the remaining terms describe the ionic currents. This mathematical description is extremely precise but also complex, making it difficult to intuitively understand and analyse the system. For this reason, simplified models such as FitzHugh–Nagumo are used that preserve the main properties while reducing complexity, facilitating the study of neuronal function.
Q = C V_m  (A1)
\frac{dQ}{dt} = C \frac{dV_m}{dt}  (A2)
C \frac{dV_m}{dt} = \sum I_{ion}  (A3)
I_{ion} = g_{ion} (V_m - E_{eq})  (A4)
I_{ion} = \bar{g}_{ion} \, p \, (E_{eq} - V_m)  (A5)
\frac{dn}{dt} = \alpha(V_m)(1 - n) - \beta(V_m)\, n  (A6)
g_{K^+} = \bar{g}_{K^+} n^4  (A7)
I_{K^+} = \bar{g}_{K^+} n^4 (E_{eq,K^+} - V_m)  (A8)
g_{Na^+} = \bar{g}_{Na^+} m^3 h  (A9)
I_{Na^+} = \bar{g}_{Na^+} m^3 h (E_{eq,Na^+} - V_m)  (A10)
\frac{dx}{dt} = \alpha(1 - x) - \beta x, \quad x \in \{n, m, h\}  (A11)
C \frac{dV_m}{dt} = I_{K^+} + I_{Na^+} + I_{leak}  (A12)
\frac{D}{4 R_a} \frac{\partial^2 V_m}{\partial x^2} = C \frac{\partial V_m}{\partial t} + \bar{g}_{K^+} n^4 (E_{eq,K^+} - V_m) + \bar{g}_{Na^+} m^3 h (E_{eq,Na^+} - V_m) + \bar{g}_{leak} (E_{eq,leak} - V_m)  (A13)
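To make Equations (A6)–(A12) concrete, the single-compartment model can be integrated numerically with a simple Euler scheme. The rate functions and parameter values below are the standard Hodgkin–Huxley textbook values (units: mV, ms, mS/cm², µA/cm²), assumed here purely for illustration; the ionic currents use the document's driving-force convention I = ḡ·p·(E_eq − V_m).

```python
import numpy as np

# Standard Hodgkin-Huxley parameters (assumed textbook values)
C = 1.0                                  # membrane capacitance, uF/cm^2
g_Na, g_K, g_L = 120.0, 36.0, 0.3        # maximal conductances, mS/cm^2
E_Na, E_K, E_L = 50.0, -77.0, -54.4      # equilibrium potentials, mV

# Voltage-dependent opening (alpha) and closing (beta) rates, Eq. (A6)
def a_n(V): return 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
def b_n(V): return 0.125 * np.exp(-(V + 65) / 80)
def a_m(V): return 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
def b_m(V): return 4.0 * np.exp(-(V + 65) / 18)
def a_h(V): return 0.07 * np.exp(-(V + 65) / 20)
def b_h(V): return 1.0 / (1 + np.exp(-(V + 35) / 10))

def simulate(I_ext=10.0, T=50.0, dt=0.01):
    V = -65.0
    # Gates start at steady state: x_inf = alpha / (alpha + beta)
    n = a_n(V) / (a_n(V) + b_n(V))
    m = a_m(V) / (a_m(V) + b_m(V))
    h = a_h(V) / (a_h(V) + b_h(V))
    trace = []
    for _ in range(int(T / dt)):
        I_K = g_K * n**4 * (E_K - V)             # Eq. (A8)
        I_Na = g_Na * m**3 * h * (E_Na - V)      # Eq. (A10)
        I_leak = g_L * (E_L - V)
        V += dt / C * (I_K + I_Na + I_leak + I_ext)   # Eq. (A12) + input
        n += dt * (a_n(V) * (1 - n) - b_n(V) * n)     # Eq. (A6)
        m += dt * (a_m(V) * (1 - m) - b_m(V) * m)
        h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
        trace.append(V)
    return np.array(trace)

V = simulate()
# A sustained depolarising input of this strength should elicit spikes
print(V.max())
```

With no external current the membrane settles near its resting level of about −65 mV; with a sufficient input current the positive-feedback loop on the m gates produces the characteristic spike upstroke, terminated by h-gate inactivation and the n-gated potassium current, exactly as described in the text.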

References

  1. Klonowski, W. Everything You Wanted to Ask about EEG but Were Afraid to Get the Right Answer. Nonlinear Biomed. Phys. 2009, 3, 2. [Google Scholar] [CrossRef] [PubMed]
  2. Lai, C.Q.; Ibrahim, H.; Abdullah, M.Z.; Abdullah, J.M.; Suandi, S.A.; Azman, A. Artifacts and Noise Removal for Electroencephalogram (EEG): A Literature Review. In Proceedings of the 2018 IEEE Symposium on Computer Applications & Industrial Electronics (ISCAIE), Penang, Malaysia, 28–29 April 2018; IEEE: New York, NY, USA, 2018; pp. 326–332. [Google Scholar]
  3. Nolte, G.; Bai, O.; Wheaton, L.; Mari, Z.; Vorbach, S.; Hallett, M. Identifying True Brain Interaction from EEG Data Using the Imaginary Part of Coherency. Clin. Neurophysiol. 2004, 115, 2292–2307. [Google Scholar] [CrossRef] [PubMed]
  4. Pardey, J.; Roberts, S.; Tarassenko, L. A Review of Parametric Modelling Techniques for EEG Analysis. Med. Eng. Phys. 1996, 18, 2–11. [Google Scholar] [CrossRef] [PubMed]
  5. Subha, D.P.; Joseph, P.K.; Acharya, U.R.; Lim, C.M. EEG Signal Analysis: A Survey. J. Med. Syst. 2010, 34, 195–212. [Google Scholar] [CrossRef]
  6. Siuly, S.; Li, Y.; Zhang, Y. EEG Signal Analysis and Classification; Springer International Publishing: Cham, Switzerland, 2016; ISBN 978-3-319-47652-0. [Google Scholar]
  7. Sharma, R.; Meena, H.K. Emerging Trends in EEG Signal Processing: A Systematic Review. SN Comput. Sci. 2024, 5, 415. [Google Scholar] [CrossRef]
  8. Vafaei, E.; Hosseini, M. Transformers in EEG Analysis: A Review of Architectures and Applications in Motor Imagery, Seizure, and Emotion Classification. Sensors 2025, 25, 1293. [Google Scholar] [CrossRef]
  9. Huang, G. Statistical Analysis. In EEG Signal Processing and Feature Extraction; Springer: Singapore, 2019; pp. 335–375. [Google Scholar]
  10. Chiang, S.; Zito, J.; Rao, V.R.; Vannucci, M. Time-Series Analysis. In Statistical Methods in Epilepsy; Chapman and Hall/CRC: Boca Raton, FL, USA, 2024; pp. 166–200. [Google Scholar]
  11. Obermaier, B.; Guger, C.; Neuper, C.; Pfurtscheller, G. Hidden Markov Models for Online Classification of Single Trial EEG Data. Pattern Recognit. Lett. 2001, 22, 1299–1309. [Google Scholar] [CrossRef]
  12. Koles, Z.J. Trends in EEG Source Localization. Electroencephalogr. Clin. Neurophysiol. 1998, 106, 127–137. [Google Scholar] [CrossRef]
  13. Chiarion, G.; Sparacino, L.; Antonacci, Y.; Faes, L.; Mesin, L. Connectivity Analysis in EEG Data: A Tutorial Review of the State of the Art and Emerging Trends. Bioengineering 2023, 10, 372. [Google Scholar] [CrossRef]
  14. Pritchard, W.S.; Duke, D.W. Measuring Chaos in the Brain: A Tutorial Review of Nonlinear Dynamical Eeg Analysis. Int. J. Neurosci. 1992, 67, 31–80. [Google Scholar] [CrossRef]
  15. Shukla, S.; Torres, J.; Murhekar, A.; Liu, C.; Mishra, A.; Gwizdka, J.; Roychowdhury, S. A Survey on Bridging EEG Signals and Generative AI: From Image and Text to Beyond. arXiv 2025, arXiv:2502.12048. [Google Scholar] [CrossRef]
  16. Jiang, W.-B.; Zhao, L.-M.; Lu, B.-L. Large Brain Model for Learning Generic Representations with Tremendous EEG Data in BCI. In Proceedings of the Twelfth International Conference on Learning Representations (ICLR 2024), Vienna, Austria, 7–11 May 2024. [Google Scholar]
  17. Li, Z.; Zheng, W.-L.; Lu, B.-L. Gram: A Large-Scale General EEG Model for Raw Data Classification and Restoration Tasks. In Proceedings of the ICASSP 2025—2025 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Hyderabad, India, 6–11 April 2025; IEEE: New York, NY, USA, 2025; pp. 1–5. [Google Scholar]
  18. Luo, J.; Yang, L.; Liu, Y.; Hu, C.; Wang, G.; Yang, Y.; Yang, T.-L.; Zhou, X. Review of Diffusion Models and Its Applications in Biomedical Informatics. BMC Med. Inform. Decis. Mak. 2025, 25, 390. [Google Scholar] [CrossRef] [PubMed]
  19. Babu, N.; Mathew, J.; Vinod, A.P. Large Language Models for EEG: A Comprehensive Survey and Taxonomy. arXiv 2025, arXiv:2506.06353. [Google Scholar]
  20. Rafiei, M.H.; Gauthier, L.V.; Adeli, H.; Takabi, D. Self-Supervised Learning for Electroencephalography. IEEE Trans. Neural Netw. Learn. Syst. 2024, 35, 1457–1471. [Google Scholar] [CrossRef]
  21. Hodgkin, A.L.; Huxley, A.F. Resting and Action Potentials in Single Nerve Fibres. J. Physiol. 1945, 104, 176–195. [Google Scholar] [CrossRef]
  22. Catterall, W.A.; Raman, I.M.; Robinson, H.P.C.; Sejnowski, T.J.; Paulsen, O. The Hodgkin-Huxley Heritage: From Channels to Circuits. J. Neurosci. 2012, 32, 14064. [Google Scholar] [CrossRef]
  23. Hodgkin, A.L.; Huxley, A.F.; Katz, B. Measurement of Current-Voltage Relations in the Membrane of the Giant Axon of Loligo. J. Physiol. 1952, 116, 424–448. [Google Scholar] [CrossRef]
  24. Hodgkin, A.L.; Huxley, A.F. A Quantitative Description of Membrane Current and Its Application to Conduction and Excitation in Nerve. J. Physiol. 1952, 117, 500–544. [Google Scholar] [CrossRef]
  25. Barnett, M.W.; Larkman, P.M. The Action Potential. Pract. Neurol. 2007, 7, 192. [Google Scholar]
  26. Ahlfors, S.P.; Han, J.; Belliveau, J.W.; Hämäläinen, M.S. Sensitivity of MEG and EEG to Source Orientation. Brain Topogr. 2010, 23, 227–232. [Google Scholar] [CrossRef]
  27. van den Broek, S.P.; Reinders, F.; Donderwinkel, M.; Peters, M.J. Volume Conduction Effects in EEG and MEG. Electroencephalogr. Clin. Neurophysiol. 1998, 106, 522–534. [Google Scholar] [CrossRef]
  28. Jackson, A.F.; Bolger, D.J. The Neurophysiological Bases of EEG and EEG Measurement: A Review for the Rest of Us. Psychophysiology 2014, 51, 1061–1071. [Google Scholar] [CrossRef] [PubMed]
  29. Miranda, P.C. Physics of Effects of Transcranial Brain Stimulation. Handb. Clin. Neurol. 2013, 116, 353–366. [Google Scholar] [CrossRef] [PubMed]
  30. Williams, T.; Bouazza-Marouf, K.; Zecca, M.; Green, A.L. Analysis of the Validity of the Mathematical Assumptions of Electrical Impedance Tomography for Human Head Tissues. Biomed. Phys. Eng. Express 2021, 7, 025011. [Google Scholar] [CrossRef] [PubMed]
  31. Buzsáki, G.; Anastassiou, C.A.; Koch, C. The Origin of Extracellular Fields and Currents—EEG, ECoG, LFP and Spikes. Nat. Rev. Neurosci. 2012, 13, 407–420. [Google Scholar] [CrossRef]
  32. Goyal, R.K.; Chaudhury, A. Structure Activity Relationship of Synaptic and Junctional Neurotransmission. Auton. Neurosci. 2013, 176, 11–31. [Google Scholar] [CrossRef]
  33. Murakami, S.; Okada, Y. Contributions of Principal Neocortical Neurons to Magnetoencephalography and Electroencephalography Signals. J. Physiol. 2006, 575, 925–936. [Google Scholar] [CrossRef]
  34. Dunne, J. Workshops (WS) Workshop 1: Electroencephalogram (EEG) WS1.1. The Origin of EEG, Recording Techniques and Quality. Clin. Neurophysiol. 2021, 132, e51. [Google Scholar] [CrossRef]
  35. Usakli, A.B. Improvement of EEG Signal Acquisition: An Electrical Aspect for State of the Art of Front End. Comput. Intell. Neurosci. 2010, 2010, 630649. [Google Scholar] [CrossRef]
  36. Whittingstall, K.; Stroink, G.; Gates, L.; Connolly, J.F.; Finley, A. Effects of Dipole Position, Orientation and Noise on the Accuracy of EEG Source Localization. Biomed. Eng. Online 2003, 2, 14. [Google Scholar] [CrossRef]
  37. Doschoris, M.; Kariotou, F. Mathematical Foundation of Electroencephalography. In Electroencephalography; InTech: Tokyo, Japan, 2017. [Google Scholar]
  38. Næss, S.; Chintaluri, C.; Ness, T.V.; Dale, A.M.; Einevoll, G.T.; Wójcik, D.K. Corrected Four-Sphere Head Model for EEG Signals. Front. Hum. Neurosci. 2017, 11, 490. [Google Scholar] [CrossRef] [PubMed]
  39. Hallez, H.; Vanrumste, B.; Grech, R.; Muscat, J.; De Clercq, W.; Vergult, A.; D’Asseler, Y.; Camilleri, K.P.; Fabri, S.G.; Van Huffel, S.; et al. Review on Solving the Forward Problem in EEG Source Analysis. J. Neuroeng. Rehabil. 2007, 4, 46. [Google Scholar] [CrossRef] [PubMed]
  40. Grech, R.; Cassar, T.; Muscat, J.; Camilleri, K.P.; Fabri, S.G.; Zervakis, M.; Xanthopoulos, P.; Sakkalis, V.; Vanrumste, B. Review on Solving the Inverse Problem in EEG Source Analysis. J. Neuroeng. Rehabil. 2008, 5, 25. [Google Scholar] [CrossRef] [PubMed]
  41. Acharya, J.N.; Hani, A.; Cheek, J.; Thirumala, P.; Tsuchida, T.N. American Clinical Neurophysiology Society Guideline 2: Guidelines for Standard Electrode Position Nomenclature. J. Clin. Neurophysiol. 2016, 33, 245–252. [Google Scholar] [CrossRef]
  42. Homan, R.W.; Herman, J.; Purdy, P. Cerebral Location of International 10–20 System Electrode Placement. Electroencephalogr. Clin. Neurophysiol. 1987, 66, 376–382. [Google Scholar] [CrossRef]
  43. Cascino, G.D. Current Practice of Clinical Electroencephalography. In Neurology, 2nd ed.; 1991; Volume 41, p. 467. [Google Scholar] [CrossRef]
  44. Klem, G.H.; Lüders, H.; Jasper, H.H.; Elger, C.E. The Ten-Twenty Electrode System of the International Federation. The International Federation of Clinical Neurophysiology. Electroencephalogr. Clin. Neurophysiol. 1999, 52, 3–6. [Google Scholar]
  45. Jurcak, V.; Tsuzuki, D.; Dan, I. 10/20, 10/10, and 10/5 Systems Revisited: Their Validity as Relative Head-Surface-Based Positioning Systems. Neuroimage 2007, 34, 1600–1611. [Google Scholar] [CrossRef]
  46. Sinha, S.R.; Sullivan, L.; Sabau, D.; San-Juan, D.; Dombrowski, K.E.; Halford, J.J.; Hani, A.J.; Drislane, F.W.; Stecker, M.M. American Clinical Neurophysiology Society Guideline 1: Minimum Technical Requirements for Performing Clinical Electroencephalography. J. Clin. Neurophysiol. 2016, 33, 235–244. [Google Scholar] [CrossRef]
  47. Qin, Y.; Zhang, Y.; Zhang, Y.; Liu, S.; Guo, X. Application and Development of EEG Acquisition and Feedback Technology: A Review. Biosensors 2023, 13, 930. [Google Scholar] [CrossRef]
  48. Dabbabi, T.; Bouafif, L.; Cherif, A. A Review of Non Invasive Methods of Brain Activity Measurements via EEG Signals Analysis. In Proceedings of the 2023 IEEE International Conference on Advanced Systems and Emergent Technologies (IC_ASET), Hammamet, Tunisia, 29 April–1 May 2023; pp. 1–6. [Google Scholar]
  49. Knierim, M.T.; Bleichner, M.G.; Reali, P. A Systematic Comparison of High-End and Low-Cost EEG Amplifiers for Concealed, Around-the-Ear EEG Recordings. Sensors 2023, 23, 4559. [Google Scholar] [CrossRef]
  50. Haneef, Z.; Yang, K.; Sheth, S.A.; Aloor, F.Z.; Aazhang, B.; Krishnan, V.; Karakas, C. Sub-Scalp Electroencephalography: A next-Generation Technique to Study Human Neurophysiology. Clin. Neurophysiol. 2022, 141, 77–87. [Google Scholar] [CrossRef] [PubMed]
  51. Shah, A.K.; Mittal, S. Invasive Electroencephalography Monitoring: Indications and Presurgical Planning. Ann. Indian Acad. Neurol. 2014, 17, 89–94. [Google Scholar] [CrossRef] [PubMed]
  52. Coles, L.; Ventrella, D.; Carnicer-Lombarte, A.; Elmi, A.; Troughton, J.G.; Mariello, M.; El Hadwe, S.; Woodington, B.J.; Bacci, M.L.; Malliaras, G.G.; et al. Origami-Inspired Soft Fluidic Actuation for Minimally Invasive Large-Area Electrocorticography. Nat. Commun. 2024, 15, 6290. [Google Scholar] [CrossRef] [PubMed]
  53. Tay, A.S.-M.S.; Menaker, S.A.; Chan, J.L.; Mamelak, A.N. Placement of Stereotactic Electroencephalography Depth Electrodes Using the Stealth Autoguide Robotic System: Technical Methods and Initial Results. Oper. Neurosurg. 2022, 22, e150–e157. [Google Scholar] [CrossRef]
  54. Wang, Y.; Yang, X.; Zhang, X.; Wang, Y.; Pei, W. Implantable Intracortical Microelectrodes: Reviewing the Present with a Focus on the Future. Microsyst. Nanoeng. 2023, 9, 7. [Google Scholar] [CrossRef]
  55. Niso, G.; Romero, E.; Moreau, J.T.; Araujo, A.; Krol, L.R. Wireless EEG: A Survey of Systems and Studies. Neuroimage 2023, 269, 119774. [Google Scholar] [CrossRef]
  56. Chen, Y.; Qian, W.; Razansky, D.; Yu, X.; Qian, C. WISDEM: A Hybrid Wireless Integrated Sensing Detector for Simultaneous EEG and MRI. Nat. Methods 2025, 22, 1944–1953. [Google Scholar] [CrossRef]
  57. Casson, A.J. Wearable EEG and Beyond. Biomed. Eng. Lett. 2019, 9, 53–71. [Google Scholar] [CrossRef]
  58. Abhinav, V.; Basu, P.; Verma, S.S.; Verma, J.; Das, A.; Kumari, S.; Yadav, P.R.; Kumar, V. Advancements in Wearable and Implantable BioMEMS Devices: Transforming Healthcare Through Technology. Micromachines 2025, 16, 522. [Google Scholar] [CrossRef]
  59. Samimisabet, P.; Krieger, L.; De Palol, M.V.; Gün, D.; Pipa, G. Enhancing Mobile EEG: Software Development and Performance Insights of the DreamMachine. HardwareX 2025, 23, e00689. [Google Scholar] [CrossRef]
  60. Yuan, H.; Li, Y.; Yang, J.; Li, H.; Yang, Q.; Guo, C.; Zhu, S.; Shu, X. State of the Art of Non-Invasive Electrode Materials for Brain–Computer Interface. Micromachines 2021, 12, 1521. [Google Scholar] [CrossRef] [PubMed]
  61. Wang, D.; Xue, H.; Xia, L.; Li, Z.; Zhao, Y.; Fan, X.; Sun, K.; Wang, H.; Hamalainen, T.; Zhang, C.; et al. A Tough Semi-Dry Hydrogel Electrode with Anti-Bacterial Properties for Long-Term Repeatable Non-Invasive EEG Acquisition. Microsyst. Nanoeng. 2025, 11, 105. [Google Scholar] [CrossRef] [PubMed]
  62. Xiong, F.; Fan, M.; Feng, Y.; Li, Y.; Yang, C.; Zheng, J.; Wang, C.; Zhou, J. Advancements in Dry and Semi-Dry EEG Electrodes: Design, Interface Characteristics, and Performance Evaluation. AIP Adv. 2025, 15, 040703. [Google Scholar] [CrossRef]
  63. Lopez-Gordo, M.A.; Sanchez-Morillo, D.; Valle, F.P. Dry EEG Electrodes. Sensors 2014, 14, 12847–12870. [Google Scholar] [CrossRef]
  64. Searle, A.; Kirkup, L. A Direct Comparison of Wet, Dry and Insulating Bioelectric Recording. Physiol. Meas. 2000, 21, 271. [Google Scholar] [CrossRef]
  65. Liu, Z.; Xu, X.; Huang, S.; Huang, X.; Liu, Z.; Yao, C.; He, M.; Chen, J.; Chen, H.; Liu, J.; et al. Multichannel Microneedle Dry Electrode Patches for Minimally Invasive Transdermal Recording of Electrophysiological Signals. Microsyst. Nanoeng. 2024, 10, 72. [Google Scholar] [CrossRef]
  66. Jeong, H.; Ntolkeras, G.; Warbrick, T.; Jaschke, M.; Gupta, R.; Lev, M.H.; Peters, J.M.; Grant, P.E.; Bonmassar, G. Aluminum Thin Film Nanostructure Traces in Pediatric EEG Net for MRI and CT Artifact Reduction. Sensors 2023, 23, 3633. [Google Scholar] [CrossRef]
  67. Ong, S.; Kullmann, A.; Mertens, S.; Rosa, D.; Diaz-Botia, C.A. Electrochemical Testing of a New Polyimide Thin Film Electrode for Stimulation, Recording, and Monitoring of Brain Activity. Micromachines 2022, 13, 1798. [Google Scholar] [CrossRef]
  68. Euler, L.; Guo, L.; Persson, N.-K. Textile Electrodes: Influence of Knitting Construction and Pressure on the Contact Impedance. Sensors 2021, 21, 1578. [Google Scholar] [CrossRef]
  69. Moyseowicz, A.; Minta, D.; Gryglewicz, G. Conductive Polymer/Graphene-Based Composites for Next Generation Energy Storage and Sensing Applications. ChemElectroChem 2023, 10, e202201145. [Google Scholar] [CrossRef]
  70. Oostenveld, R.; Praamstra, P. The Five Percent Electrode System for High-Resolution EEG and ERP Measurements. Clin. Neurophysiol. 2001, 112, 713–719. [Google Scholar] [CrossRef] [PubMed]
  71. Soler, A.; Moctezuma, L.A.; Giraldo, E.; Molinas, M. Automated Methodology for Optimal Selection of Minimum Electrode Subsets for Accurate EEG Source Estimation Based on Genetic Algorithm Optimization. Sci. Rep. 2022, 12, 11221. [Google Scholar] [CrossRef] [PubMed]
  72. Ming, G.; Pei, W.; Tian, S.; Chen, X.; Gao, X.; Wang, Y. High-Density EEG Enables the Fastest Visual Brain-Computer Interfaces. arXiv 2025, arXiv:2507.17242. [Google Scholar]
  73. Halford, J.J.; Sabau, D.; Drislane, F.W.; Tsuchida, T.N.; Sinha, S.R. American Clinical Neurophysiology Society Guideline 4: Recording Clinical EEG on Digital Media. Neurodiagn. J. 2016, 56, 261–265. [Google Scholar] [CrossRef]
  74. Babiloni, C.; Barry, R.J.; Başar, E.; Blinowska, K.J.; Cichocki, A.; Drinkenburg, W.H.I.M.; Klimesch, W.; Knight, R.T.; Lopes da Silva, F.; Nunez, P.; et al. International Federation of Clinical Neurophysiology (IFCN)—EEG Research Workgroup: Recommendations on Frequency and Topographic Analysis of Resting State EEG Rhythms. Part 1: Applications in Clinical Research Studies. Clin. Neurophysiol. 2020, 131, 285–307. [Google Scholar] [CrossRef]
  75. Peltola, M.E.; Leitinger, M.; Halford, J.J.; Vinayan, K.P.; Kobayashi, K.; Pressler, R.M.; Mindruta, I.; Mayor, L.C.; Lauronen, L.; Beniczky, S. Routine and Sleep EEG: Minimum Recording Standards of the International Federation of Clinical Neurophysiology and the International League Against Epilepsy. Epilepsia 2023, 64, 602–618. [Google Scholar] [CrossRef]
  76. Teplan, M. Fundamentals of EEG Measurement. Meas. Sci. Rev. 2002, 2, 1–11. [Google Scholar]
  77. Chi, Y.M.; Cauwenberghs, G. Wireless Non-Contact EEG/ECG Electrodes for Body Sensor Networks. In Proceedings of the 2010 International Conference on Body Sensor Networks, Singapore, 7–9 June 2010; pp. 297–301. [Google Scholar]
  78. Vanhatalo, S.; Palva, J.M.; Andersson, S.; Rivera, C.; Voipio, J.; Kaila, K. Slow Endogenous Activity Transients and Developmental Expression of K+–Cl Cotransporter 2 in the Immature Human Cortex. Eur. J. Neurosci. 2005, 22, 2799–2804. [Google Scholar] [CrossRef]
  79. Mullen, T.; Kothe, C.; Chi, Y.M.; Ojeda, A.; Kerth, T.; Makeig, S.; Cauwenberghs, G.; Jung, T.-P. Real-Time Modeling and 3D Visualization of Source Dynamics and Connectivity Using Wearable EEG. In Proceedings of the 2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Osaka, Japan, 3–7 July 2013; IEEE: New York, NY, USA, 2013; pp. 2184–2187. [Google Scholar]
  80. Pernet, C.R.; Appelhoff, S.; Gorgolewski, K.J.; Flandin, G.; Phillips, C.; Delorme, A.; Oostenveld, R. EEG-BIDS, an Extension to the Brain Imaging Data Structure for Electroencephalography. Sci. Data 2019, 6, 103. [Google Scholar] [CrossRef]
  81. Fleury, M.; Figueiredo, P.; Vourvopoulos, A.; Lécuyer, A. Two Is Better? Combining EEG and FMRI for BCI and Neurofeedback: A Systematic Review. J. Neural Eng. 2023, 20, 051003. [Google Scholar] [CrossRef]
  82. Chen, J.; Yu, K.; Bi, Y.; Ji, X.; Zhang, D. Strategic Integration: A Cross-Disciplinary Review of the FNIRS-EEG Dual-Modality Imaging System for Delivering Multimodal Neuroimaging to Applications. Brain Sci. 2024, 14, 1022. [Google Scholar] [CrossRef]
  83. Adnan Khan, M.D.S.; Hoq, M.T.; Zadidul Karim, A.H.M.; Alam, M.K.; Howlader, M.; Rajkumar, R.K. Energy Harvesting—Technical Analysis of Evolution, Control Strategies, and Future Aspectsa. J. Electron. Sci. Technol. 2019, 17, 116–125. [Google Scholar] [CrossRef]
  84. Van Bavel, M.; Leonov, V.; Yazicioglu, R.; Torfs, T.; Van Hoof, C.; Posthuma, N.; Vullers, R. Wearable Battery-Free Wireless 2-Channel EEG Systems Powered by Energy Scavengers. Sens. Transducers 2008, 94, 103–115. [Google Scholar]
  85. Choudhary, S.K.; Bera, T.K. Designing of Battery-Based Low Noise Electroencephalography (EEG) Amplifier for Brain Signal Monitoring: A Simulation Study. In Proceedings of the 2022 IEEE 6th International Conference on Condition Assessment Techniques in Electrical Systems (CATCON), Virtual, 17–19 December 2022; IEEE: New York, NY, USA, 2022; pp. 422–426. [Google Scholar]
  86. Kappel, S.L.; Looney, D.; Mandic, D.P.; Kidmose, P. Physiological Artifacts in Scalp EEG and Ear-EEG. Biomed. Eng. Online 2017, 16, 103. [Google Scholar] [CrossRef] [PubMed]
  87. Amin, U.; Nascimento, F.A.; Karakis, I.; Schomer, D.; Benbadis, S.R. Normal Variants and Artifacts: Importance in EEG Interpretation. Epileptic Disord. 2023, 25, 591–648. [Google Scholar] [CrossRef] [PubMed]
  88. Chen, M.; Wang, J.; Anzai, D.; Fischer, G.; Kirchner, J. Common-Mode Noise Reduction in Noncontact Biopotential Acquisition Circuit Based on Imbalance Cancellation of Electrode-Body Impedance. Sensors 2020, 20, 7140. [Google Scholar] [CrossRef] [PubMed]
  89. Suwandi, G.R.F.; Khotimah, S.N.; Haryanto, F.; Suprijadi, S. Study of the Effect of Magnetic Fields on Electroencephalography Measurement in Faraday’s Cage. Spektra: J. Fis. Dan Apl. 2021, 6, 101–106. [Google Scholar] [CrossRef]
  90. Jiang, X.; Bian, G.-B.; Tian, Z. Removal of Artifacts from EEG Signals: A Review. Sensors 2019, 19, 987. [Google Scholar] [CrossRef]
  91. Urigüen, J.A.; Garcia-Zapirain, B. EEG Artifact Removal—State-of-the-Art and Guidelines. J. Neural Eng. 2015, 12, 031001. [Google Scholar] [CrossRef]
  92. Bigdely-Shamlo, N.; Mullen, T.; Kothe, C.; Su, K.-M.; Robbins, K.A. The PREP Pipeline: Standardized Preprocessing for Large-Scale EEG Analysis. Front. Neuroinform. 2015, 9, 120. [Google Scholar] [CrossRef]
  93. Dhole, P.V.; Chaudhary, D.G.; Dhangar, V.D.; Shejul, S.D.; Datwase, S.S.; Gawali, B.W. A Review of EEG Artifact Removal Techniques for Brain-Computer Interface. SN Comput. Sci. 2025, 6, 1026. [Google Scholar] [CrossRef]
  94. Sherbieny, Z.M.; Shalaby, W.A.; Rihan, M.; Messiha, N.; El-Fisawy, A.S.; Abd El-Samie, F.E. A Comprehensive Survey and Deep Learning Based Framework for Denoising EEG Signals. In Proceedings of the 2025 4th International Conference on Electronic Engineering (ICEEM), Menouf, Egypt, 4–5 October 2025; IEEE: New York, NY, USA, 2025; pp. 1–8. [Google Scholar]
  95. O’Reilly, C.; Huberty, S. Removing EOG Artifacts from EEG Recordings Using Deep Learning. In Proceedings of the 7th International Conference on Advances in Signal Processing and Artificial Intelligence (ASPAI’ 2025), Innsbruck, Austria, 26 May 2024. [Google Scholar]
  96. Raj, V.A.; Parupudi, T.; Thalengala, A.; Nayak, S.G. A Comprehensive Review of Deep Learning Models for Denoising EEG Signals: Challenges, Advances, and Future Directions. Discov. Appl. Sci. 2025, 7, 1268. [Google Scholar] [CrossRef]
  97. Azhar, M.; Shafique, T.; Amjad, A. A Convolutional Neural Network for the Removal of Simultaneous Ocular and Myogenic Artifacts from EEG Signals. Electronics 2024, 13, 4576. [Google Scholar] [CrossRef]
  98. Hossain, M.S.; Mahmud, S.; Khandakar, A.; Al-Emadi, N.; Chowdhury, F.A.; Bin Mahbub, Z.; Reaz, M.B.I.; Chowdhury, M.E.H. MultiResUNet3+: A Full-Scale Connected Multi-Residual UNet Model to Denoise Electrooculogram and Electromyogram Artifacts from Corrupted Electroencephalogram Signals. Bioengineering 2023, 10, 579. [Google Scholar] [CrossRef]
  99. Tibermacine, I.E.; Russo, S.; Citeroni, F.; Mancini, G.; Rabehi, A.; Alharbi, A.H.; El-kenawy, E.-S.M.; Napoli, C. Adversarial Denoising of EEG Signals: A Comparative Analysis of Standard GAN and WGAN-GP Approaches. Front. Hum. Neurosci. 2025, 19, 1583342. [Google Scholar] [CrossRef]
  100. Soufineyestani, M.; Dowling, D.; Khan, A. Electroencephalography (EEG) Technology Applications and Available Devices. Appl. Sci. 2020, 10, 7453. [Google Scholar] [CrossRef]
  101. Värbu, K.; Muhammad, N.; Muhammad, Y. Past, Present, and Future of EEG-Based BCI Applications. Sensors 2022, 22, 3331. [Google Scholar] [CrossRef]
  102. Keutayeva, A.; Jesse Nwachukwu, C.; Alaran, M.; Otarbay, Z.; Abibullaev, B. Neurotechnology in Gaming: A Systematic Review of Visual Evoked Potential-Based Brain-Computer Interfaces. IEEE Access 2025, 13, 74944–74966. [Google Scholar] [CrossRef]
  103. Zhao, Y.; Wang, Q.; Ren, Z.; Wen, B.; Li, Y.; Wang, N.; Wang, B.; Zhao, T.; Chen, Y.; Zhao, P.; et al. A Diagnosis and Prediction Algorithm for Juvenile Myoclonic Epilepsy Based on Clinical and Quantitative EEG Features. Seizure Eur. J. Epilepsy 2025, 129, 59–69. [Google Scholar] [CrossRef]
  104. López, C.Q.; Vera, V.D.G.; Quintero, M.J.R. Diagnosis of ADHD in Children with EEG and Machine Learning: Systematic Review and Meta-Analysis. Clin. Health 2025, 36, 109–121. [Google Scholar] [CrossRef]
  105. Chawla, D.; Sharma, E.; Rajab, N.; Łajczak, P.; Silva, Y.P.; Baptista, J.M.; Pomianoski, B.W.; Ahmed, A.R.; Majeed, M.W.; de Sousa, Y.G.; et al. Utilizing Machine Learning Techniques for EEG Assessment in the Diagnosis of Epileptic Seizures in the Brain: A Systematic Review and Meta-Analysis. Seizure Eur. J. Epilepsy 2025, 126, 16–23. [Google Scholar] [CrossRef] [PubMed]
  106. Manzari, F.; Ghaderyan, P. Insights into Computer-Aided EEG Signal Processing for Traumatic Brain Injury Assessment: A Review. Measurement 2025, 251, 117279. [Google Scholar] [CrossRef]
  107. Redžepi, S.; Burazerović, E.; Redžepi, S.; Husović, E.; Pojskić, M. Systematic Review of Advanced Algorithms for Brain Mapping in Stereotactic Neurosurgery: Integration of fMRI and EEG Data. Brain Sci. 2025, 15, 1188. [Google Scholar] [CrossRef] [PubMed]
  108. Amer, N.S.; Belhaouari, S.B. EEG Signal Processing for Medical Diagnosis, Healthcare, and Monitoring: A Comprehensive Review. IEEE Access 2023, 11, 143116–143142. [Google Scholar] [CrossRef]
  109. García-Salinas, J.S.; Wróblewska, A.; Kucewicz, M.T. Detection of EEG Activity in Response to the Surrounding Environment: A Neuro-Architecture Study. Brain Sci. 2025, 15, 1103. [Google Scholar] [CrossRef]
  110. Wu, H.; Li, M.D. Digital Psychiatry: Concepts, Framework, and Implications. Front. Psychiatry 2025, 16, 1572444. [Google Scholar] [CrossRef]
  111. Rutkowski, J.; Saab, M. AI-Based EEG Analysis: New Technology and the Path to Clinical Adoption. Clin. Neurophysiol. 2025, 179, 2110994. [Google Scholar] [CrossRef]
  112. Rahman, M.; Karwowski, W.; Fafrowicz, M.; Hancock, P.A. Neuroergonomics Applications of Electroencephalography in Physical Activities: A Systematic Review. Front. Hum. Neurosci. 2019, 13, 182. [Google Scholar] [CrossRef]
  113. Khondakar, M.F.K.; Sarowar, M.H.; Chowdhury, M.H.; Majumder, S.; Hossain, M.A.; Dewan, M.A.A.; Hossain, Q.D. A Systematic Review on EEG-Based Neuromarketing: Recent Trends and Analyzing Techniques. Brain Inform. 2024, 11, 17. [Google Scholar] [CrossRef]
  114. Hu, P.-C.; Kuo, P.-C. Adaptive Learning System for E-Learning Based on EEG Brain Signals. In Proceedings of the 2017 IEEE 6th Global Conference on Consumer Electronics (GCCE), Nagoya, Japan, 24–27 October 2017; IEEE: New York, NY, USA, 2017; pp. 1–2. [Google Scholar]
  115. Apicella, A.; Arpaia, P.; Frosolone, M.; Improta, G.; Moccaldi, N.; Pollastro, A. EEG-Based Measurement System for Monitoring Student Engagement in Learning 4.0. Sci. Rep. 2022, 12, 5857. [Google Scholar] [CrossRef]
  116. Wang, J.-W.; Zhang, D.-W.; Johnstone, S.J. Portable EEG for Assessing Attention in Educational Settings: A Scoping Review. Acta Psychol. 2025, 255, 104933. [Google Scholar] [CrossRef] [PubMed]
  117. Kao, S.-C.; Huang, C.-J.; Hung, T.-M. Frontal Midline Theta Is a Specific Indicator of Optimal Attentional Engagement During Skilled Putting Performance. J. Sport Exerc. Psychol. 2013, 35, 470–478. [Google Scholar] [CrossRef] [PubMed]
  118. van Boxtel, G.J.M.; Denissen, A.J.J.M.; de Groot, J.A.; Neleman, M.S.; Vellema, J.; Hart de Ruijter, E.M. Alpha Neurofeedback Training in Elite Soccer Players Trained in Groups. Appl. Psychophysiol. Biofeedback 2024, 49, 589–602. [Google Scholar] [CrossRef] [PubMed]
  119. Tatar, A.B. Biometric Identification System Using EEG Signals. Neural Comput. Appl. 2023, 35, 1009–1023. [Google Scholar] [CrossRef]
  120. Poldrack, R.A. Inferring Mental States from Neuroimaging Data: From Reverse Inference to Large-Scale Decoding. Neuron 2011, 72, 692–697. [Google Scholar] [CrossRef]
  121. Ferrell, M.L.; Beatty, A.; Dubljevic, V. The Ethics of Neuromarketing: A Rapid Review. Neuroethics 2025, 18, 19. [Google Scholar] [CrossRef]
  122. Fontanillo Lopez, C.A.; Li, G.; Zhang, D. Beyond Technologies of Electroencephalography-Based Brain-Computer Interfaces: A Systematic Review from Commercial and Ethical Aspects. Front. Neurosci. 2020, 14, 611130. [Google Scholar] [CrossRef]
  123. Mouazen, B.; El Bouyed, Z. Toward Ethical and Transparent Brain-Computer Interfaces: Challenges and Perspectives in EEG-Based AI. In Artificial Neural Networks and Their Applications; IntechOpen: London, UK, 2025. [Google Scholar]
  124. Becerra, M.A.; Duque-Mejia, C.; Castro-Ospina, A.; Serna-Guarín, L.; Mejía, C.; Duque-Grisales, E. EEG-Based Biometric Identification and Emotion Recognition: An Overview. Computers 2025, 14, 299. [Google Scholar] [CrossRef]
  125. Tsuchida, T.N.; Acharya, J.N.; Halford, J.J.; Kuratani, J.D.; Sinha, S.R.; Stecker, M.M.; Tatum, W.O.; Drislane, F.W. American Clinical Neurophysiology Society: EEG Guidelines Introduction. J. Clin. Neurophysiol. 2016, 33, 301–302. [Google Scholar] [CrossRef]
  126. Cordeau, J.P. Monorhythmic Frontal Delta Activity in the Human Electroencephalogram: A Study of 100 Cases. Electroencephalogr. Clin. Neurophysiol. 1959, 11, 733–746. [Google Scholar] [CrossRef]
  127. Harmony, T.; Fernández, T.; Silva, J.; Bernal, J.; Díaz-Comas, L.; Reyes, A.; Marosi, E.; Rodríguez, M.; Rodríguez, M. EEG Delta Activity: An Indicator of Attention to Internal Processing during Performance of Mental Tasks. Int. J. Psychophysiol. 1996, 24, 161–171. [Google Scholar] [CrossRef] [PubMed]
  128. Green, J.D.; Arduini, A.A. Hippocampal Electrical Activity in Arousal. J. Neurophysiol. 1954, 17, 533–557. [Google Scholar] [CrossRef] [PubMed]
  129. Snipes, S.; Krugliakova, E.; Meier, E.; Huber, R. The Theta Paradox: 4–8 Hz EEG Oscillations Reflect Both Sleep Pressure and Cognitive Control. J. Neurosci. 2022, 42, 8569–8586. [Google Scholar] [CrossRef] [PubMed]
  130. Aird, R.B.; Gastaut, Y. Occipital and Posterior Electroencephalographic Rhythms. Electroencephalogr. Clin. Neurophysiol. 1959, 11, 637–656. [Google Scholar] [CrossRef]
  131. Klimesch, W. Alpha-Band Oscillations, Attention, and Controlled Access to Stored Information. Trends Cogn. Sci. 2012, 16, 606–617. [Google Scholar] [CrossRef]
  132. Frost, J.D.; Carrie, J.R.G.; Borda, R.P.; Kellaway, P. The Effects of Dalmane (Flurazepam Hydrochloride) on Human EEG Characteristics. Electroencephalogr. Clin. Neurophysiol. 1973, 34, 171–175. [Google Scholar] [CrossRef]
  133. Hussain, S.J.; Cohen, L.G.; Bönstrup, M. Beta Rhythm Events Predict Corticospinal Motor Output. Sci. Rep. 2019, 9, 18305. [Google Scholar] [CrossRef]
  134. Jia, X.; Kohn, A. Gamma Rhythms in the Brain. PLoS Biol. 2011, 9, e1001045. [Google Scholar] [CrossRef]
  135. Satapathy, S.K.; Dehuri, S.; Jagadev, A.K.; Mishra, S. Introduction. In EEG Brain Signal Classification for Epileptic Seizure Disorder Detection; Elsevier: Amsterdam, The Netherlands, 2019; pp. 1–25. [Google Scholar]
  136. Urrestarazu, E.; Jirsch, J.D.; LeVan, P.; Hall, J. High-Frequency Intracerebral EEG Activity (100–500 Hz) Following Interictal Spikes. Epilepsia 2006, 47, 1465–1476. [Google Scholar] [CrossRef]
  137. Ray, S.; Maunsell, J.H.R. Different Origins of Gamma Rhythm and High-Gamma Activity in Macaque Visual Cortex. PLoS Biol. 2011, 9, e1000610. [Google Scholar] [CrossRef]
  138. Hutchins, T.; Vivanti, G.; Mateljevic, N.; Jou, R.J.; Shic, F.; Cornew, L.; Roberts, T.P.L.; Oakes, L.; Gray, S.A.O.; Ray-Subramanian, C.; et al. Mu Rhythm. In Encyclopedia of Autism Spectrum Disorders; Springer: New York, NY, USA, 2013; pp. 1940–1941. [Google Scholar]
  139. Fernandez, L.M.J.; Lüthi, A. Sleep Spindles: Mechanisms and Functions. Physiol. Rev. 2020, 100, 805–868. [Google Scholar] [CrossRef] [PubMed]
  140. Cash, S.S.; Halgren, E.; Dehghani, N.; Rossetti, A.O.; Thesen, T.; Wang, C.; Devinsky, O.; Kuzniecky, R.; Doyle, W.; Madsen, J.R.; et al. The Human K-Complex Represents an Isolated Cortical Down-State. Science 2009, 324, 1084–1087. [Google Scholar] [CrossRef] [PubMed]
  141. Da Rosa, A.C.; Kemp, B.; Paiva, T.; Lopes da Silva, F.H.; Kamphuisen, H.A.C. A Model-Based Detector of Vertex Waves and K Complexes in Sleep Electroencephalogram. Electroencephalogr. Clin. Neurophysiol. 1991, 78, 71–79. [Google Scholar] [CrossRef] [PubMed]
  142. Gélisse, P.; Crespel, A. Powerful Activation of Lambda Waves with Inversion of Polarity by Reading on Tablet. Epileptic Disord. 2024, 26, 254–256. [Google Scholar] [CrossRef]
  143. Fröhlich, F. Epilepsy. In Network Neuroscience; Elsevier: Amsterdam, The Netherlands, 2016; pp. 297–308. [Google Scholar]
  144. Chaibi, S.; Kachouri, A. Toward Reliable Models for Distinguishing Epileptic High-Frequency Oscillations (HFOs) from Non-HFO Events Using LSTM and Pre-Trained OWL-ViT Vision–Language Framework. AI 2025, 6, 230. [Google Scholar] [CrossRef]
  145. Maeda, K.; Tsuboi, H.; Hosoda, N.; Fukumoto, J.; Fujita, S.; Yamaguchi, S.; Ichino, N.; Osakabe, K.; Sugimoto, K.; Furukawa, G.; et al. Phenotypic Classification of Scalp High-Frequency Oscillations in Absence Epilepsy Based on Multiple Characteristics Using K-Means Clustering. Bioengineering 2026, 13, 65. [Google Scholar] [CrossRef]
  146. Siebert, W.M.C. Processing Neuroelectric Data; Technical Report 351; Technology Press of the Massachusetts Institute of Technology: Cambridge, MA, USA, 1959. [Google Scholar]
  147. Lilliefors, H.W. On the Kolmogorov-Smirnov Test for Normality with Mean and Variance Unknown. J. Am. Stat. Assoc. 1967, 62, 399–402. [Google Scholar] [CrossRef]
  148. Persson, J. Comments on Estimations and Tests of EEG Amplitude Distributions. Electroencephalogr. Clin. Neurophysiol. 1974, 37, 309–313. [Google Scholar] [CrossRef]
  149. Goldensohn, E.S. Handbook of Electroencephalography and Clinical Neurophysiology. Neurology 1975, 25, 299. [Google Scholar] [CrossRef]
  150. Hjorth, B. EEG Analysis Based on Time Domain Properties. Electroencephalogr. Clin. Neurophysiol. 1970, 29, 306–310. [Google Scholar] [CrossRef]
  151. Huber, P.; Kleiner, B.; Gasser, T.; Dumermuth, G. Statistical Methods for Investigating Phase Relations in Stationary Stochastic Processes. IEEE Trans. Audio Electroacoust. 1971, 19, 78–86. [Google Scholar] [CrossRef]
  152. Dumermuth, G.; Huber, P.J.; Kleiner, B.; Gasser, T. Analysis of the Interrelations between Frequency Bands of the EEG by Means of the Bispectrum: A Preliminary Study. Electroencephalogr. Clin. Neurophysiol. 1971, 31, 137–148. [Google Scholar] [CrossRef]
  153. Saltzberg, B.; Burch, N.R. Period Analytic Estimates of Moments of the Power Spectrum: A Simplified EEG Time Domain Procedure. Electroencephalogr. Clin. Neurophysiol. 1971, 30, 568–570. [Google Scholar] [CrossRef] [PubMed]
  154. Wendling, F.; Congedo, M.; Lopes da Silva, F.H. EEG Analysis: Theory and Practice; Schomer, D.L., Lopes da Silva, F.H., Eds.; Oxford University Press: Oxford, UK, 2017; Volume 1. [Google Scholar]
  155. Donoghue, T.; Haller, M.; Peterson, E.J.; Varma, P.; Sebastian, P.; Gao, R.; Noto, T.; Lara, A.H.; Wallis, J.D.; Knight, R.T.; et al. Parameterizing Neural Power Spectra into Periodic and Aperiodic Components. Nat. Neurosci. 2020, 23, 1655–1665. [Google Scholar] [CrossRef] [PubMed]
  156. Donoghue, T. A Systematic Review of Aperiodic Neural Activity in Clinical Investigations. Eur. J. Neurosci. 2025, 62, e70255. [Google Scholar] [CrossRef] [PubMed]
  157. Kałamała, P.; Gyurkovics, M.; Bowie, D.C.; Clements, G.M.; Low, K.A.; Dolcos, F.; Fabiani, M.; Gratton, G. Event-Induced Modulation of Aperiodic Background EEG: Attention-Dependent and Age-Related Shifts in E:I Balance, and Their Consequences for Behavior. Imaging Neurosci. 2024, 2, 2-00054. [Google Scholar] [CrossRef]
  158. Gerster, M.; Waterstraat, G.; Litvak, V.; Lehnertz, K.; Schnitzler, A.; Florin, E.; Curio, G.; Nikulin, V. Separating Neural Oscillations from Aperiodic 1/f Activity: Challenges and Recommendations. Neuroinformatics 2022, 20, 991–1012. [Google Scholar] [CrossRef]
  159. Colombo, M.A.; Comanducci, A.; Casarotto, S.; Derchi, C.-C.; Annen, J.; Viganò, A.; Mazza, A.; Trimarchi, P.D.; Boly, M.; Fecchio, M.; et al. Beyond Alpha Power: EEG Spatial and Spectral Gradients Robustly Stratify Disorders of Consciousness. Cereb. Cortex 2023, 33, 7193–7210. [Google Scholar] [CrossRef]
  160. Rasoulzadeh, V.; Erkus, E.C.; Yogurt, T.A.; Ulusoy, I.; Zergeroğlu, S.A. A Comparative Stationarity Analysis of EEG Signals. Ann. Oper. Res. 2017, 258, 133–157. [Google Scholar] [CrossRef]
  161. Schlattmann, P. An Introduction to Statistical Concepts for the Analysis of EEG Data and the Planning of Pharmaco-EEG Trials. Methods Find. Exp. Clin. Pharmacol. 2002, 24, 1–6. [Google Scholar]
  162. Matoušek, M.; Volavka, J.; Roubíček, J.; Chamrád, V. The Autocorrelation and Frequency Analysis of the EEG Compared with GSR at Different Levels of Activation. Brain Res. 1969, 15, 507–514. [Google Scholar] [CrossRef]
  163. van Drongelen, W.; Nordli, D.R.; Taha, M. Approaches for Interchannel EEG Analysis. medRxiv 2025. [Google Scholar] [CrossRef]
  164. Zhang, H.; Zhou, Q.-Q.; Chen, H.; Hu, X.-Q.; Li, W.-G.; Bai, Y.; Han, J.-X.; Wang, Y.; Liang, Z.-H.; Chen, D.; et al. The Applied Principles of EEG Analysis Methods in Neuroscience and Clinical Neurology. Mil. Med. Res. 2023, 10, 67. [Google Scholar] [CrossRef] [PubMed]
  165. Burch, N.R. Period Analysis of the Clinical Electroencephalogram. In The Nervous System and Electric Currents; Springer: Boston, MA, USA, 1971; pp. 55–56. [Google Scholar]
  166. Lynch, S.M. Bayesian Statistics. In Encyclopedia of Social Measurement; Elsevier: Amsterdam, The Netherlands, 2005; pp. 135–144. [Google Scholar]
  167. Dimmock, S.; O’Donnell, C.; Houghton, C. Bayesian Analysis of Phase Data in EEG and MEG. eLife 2023, 12, e84602. [Google Scholar] [CrossRef] [PubMed]
  168. Khan, M.E.; Dutt, D.N. An Expectation-Maximization Algorithm Based Kalman Smoother Approach for Event-Related Desynchronization (ERD) Estimation from EEG. IEEE Trans. Biomed. Eng. 2007, 54, 1191–1198. [Google Scholar] [CrossRef]
  169. Ezugwu, A.E.; Ikotun, A.M.; Oyelade, O.O.; Abualigah, L.; Agushaka, J.O.; Eke, C.I.; Akinyelu, A.A. A Comprehensive Survey of Clustering Algorithms: State-of-the-Art Machine Learning Applications, Taxonomy, Challenges, and Future Research Prospects. Eng. Appl. Artif. Intell. 2022, 110, 104743. [Google Scholar] [CrossRef]
  170. Rocha, M.; Ferreira, P.G. Hidden Markov Models. In Bioinformatics Algorithms; Elsevier: Amsterdam, The Netherlands, 2018; pp. 255–273. [Google Scholar]
  171. Palma, G.R.; Thornberry, C.; Commins, S.; Moral, R.d.A. Understanding Learning from EEG Data: Combining Machine Learning and Feature Engineering Based on Hidden Markov Models and Mixed Models. Neuroinformatics 2023, 22, 487–497. [Google Scholar] [CrossRef]
  172. Friston, K.J.; Harrison, L.; Penny, W. Dynamic Causal Modelling. Neuroimage 2003, 19, 1273–1302. [Google Scholar] [CrossRef]
  173. Kiebel, S.J.; Garrido, M.I.; Moran, R.J.; Friston, K.J. Dynamic Causal Modelling for EEG and MEG. Cogn. Neurodyn. 2008, 2, 121–136. [Google Scholar] [CrossRef]
  174. Slivinskas, V.; Šimonyte, V. On the Foundation of Prony’s Method. IFAC Proc. Vol. 1986, 19, 121–126. [Google Scholar] [CrossRef]
  175. Hauer, J.F.; Demeure, C.J.; Scharf, L.L. Initial Results in Prony Analysis of Power System Response Signals. IEEE Trans. Power Syst. 1990, 5, 80–89. [Google Scholar] [CrossRef]
  176. Maris, E. A Resampling Method for Estimating the Signal Subspace of Spatio-Temporal EEG/MEG Data. IEEE Trans. Biomed. Eng. 2003, 50, 935–949. [Google Scholar] [CrossRef] [PubMed]
  177. Moran, P.A.; Whittle, P. Hypothesis Testing in Time Series Analysis. J. R. Stat. Soc. Ser. A 1951, 114, 579. [Google Scholar] [CrossRef]
  178. Lippmann, R.P. Pattern Classification Using Neural Networks. IEEE Commun. Mag. 1989, 27, 47–50. [Google Scholar] [CrossRef]
  179. Box, G.E.P.; Pierce, D.A. Distribution of Residual Autocorrelations in Autoregressive-Integrated Moving Average Time Series Models. J. Am. Stat. Assoc. 1970, 65, 1509–1526. [Google Scholar] [CrossRef]
  180. Bartholomew, D.J.; Box, G.E.P.; Jenkins, G.M. Time Series Analysis Forecasting and Control. Oper. Res. Q. 1971, 22, 199. [Google Scholar] [CrossRef]
  181. Haas, S.M.; Frei, M.G.; Osorio, I.; Pasik-Duncan, B.; Radel, J. EEG Ocular Artifact Removal through ARMAX Model System Identification Using Extended Least Squares. Commun. Inf. Syst. 2003, 3, 19–40. [Google Scholar] [CrossRef]
  182. Wairagkar, M.; Hayashi, Y.; Nasuto, S.J. Modeling the Ongoing Dynamics of Short and Long-Range Temporal Correlations in Broadband EEG During Movement. Front. Syst. Neurosci. 2019, 13, 66. [Google Scholar] [CrossRef]
  183. Herrera, R.E.; Sun, M.; Dahl, R.E.; Ryan, N.D.; Sclabassi, R.J. Vector Autoregressive Model Selection in Multichannel EEG. In Proceedings of the 19th Annual International Conference of the IEEE Engineering in Medicine and Biology Society. “Magnificent Milestones and Emerging Opportunities in Medical Engineering” (Cat. No.97CH36136), Chicago, IL, USA, 30 October–2 November 1997; IEEE: New York, NY, USA, 1997; pp. 1211–1214. [Google Scholar]
  184. Endemann, C.M.; Krause, B.M.; Nourski, K.V.; Banks, M.I.; Van Veen, B. Multivariate Autoregressive Model Estimation for High-Dimensional Intracranial Electrophysiological Data. Neuroimage 2022, 254, 119057. [Google Scholar] [CrossRef]
  185. Hettiarachchi, I.T.; Mohamed, S.; Nyhof, L.; Nahavandi, S. An Extended Multivariate Autoregressive Framework for EEG-Based Information Flow Analysis of a Brain Network. In Proceedings of the 2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Osaka, Japan, 3–7 July 2013; IEEE: New York, NY, USA, 2013; pp. 3945–3948. [Google Scholar]
  186. Bressler, S.L.; Kumar, A.; Singer, I. Brain Synchronization and Multivariate Autoregressive (MVAR) Modeling in Cognitive Neurodynamics. Front. Syst. Neurosci. 2022, 15, 638269. [Google Scholar] [CrossRef]
  187. Follis, J.L.; Lai, D. Modeling Volatility Characteristics of Epileptic EEGs Using GARCH Models. Signals 2020, 1, 26–46. [Google Scholar] [CrossRef]
  188. Li, W.; Nyholt, D.R. Marker Selection by Akaike Information Criterion and Bayesian Information Criterion. Genet. Epidemiol. 2001, 21, S272–S277. [Google Scholar] [CrossRef] [PubMed]
  189. Li, P.; Wang, X.; Li, F.; Zhang, R.; Ma, T.; Peng, Y.; Lei, X.; Tian, Y.; Guo, D.; Liu, T.; et al. Autoregressive Model in the Lp Norm Space for EEG Analysis. J. Neurosci. Methods 2015, 240, 170–178. [Google Scholar] [CrossRef] [PubMed]
  190. Abbasi, M.U.; Rashad, A.; Basalamah, A.; Tariq, M. Detection of Epilepsy Seizures in Neo-Natal EEG Using LSTM Architecture. IEEE Access 2019, 7, 179074–179085. [Google Scholar] [CrossRef]
  191. Rajabioun, M. Autistic Recognition from EEG Signals by Extracted Features from Several Time Series Models. Res. Sq. 2024, 1. [Google Scholar] [CrossRef]
  192. Maghsoudi, R.; White, C.D. Real-Time Identification of Parameters of the ARMA Model of the Human EEG Waveforms. Biomed. Sci. Instrum. 1993, 29, 191–198. [Google Scholar]
  193. Morales, S.; Bowers, M.E. Time-Frequency Analysis Methods and Their Application in Developmental EEG Data. Dev. Cogn. Neurosci. 2022, 54, 101067. [Google Scholar] [CrossRef]
  194. Pachori, R.B. Time-Frequency Analysis Techniques and Their Applications; CRC Press: Boca Raton, FL, USA, 2023; ISBN 9781003367987. [Google Scholar]
  195. Wang, Y.; Li, J.; Stoica, P. Spectral Analysis of Signals; Springer International Publishing: Cham, Switzerland, 2005; ISBN 978-3-031-01397-3. [Google Scholar]
  196. Ng, M.C.; Jing, J.; Westover, M.B. A Primer on EEG Spectrograms. J. Clin. Neurophysiol. 2022, 39, 177–183. [Google Scholar] [CrossRef]
  197. Göker, H. Welch Spectral Analysis and Deep Learning Approach for Diagnosing Alzheimer’s Disease from Resting-State EEG Recordings. Trait. Du Signal 2023, 40, 257–264. [Google Scholar] [CrossRef]
  198. Göker, H. Comparison of Different Spectral Analysis Methods with an Experimental EEG Dataset. Aintelia Sci. Notes 2022, 1, 15–17. [Google Scholar] [CrossRef]
  199. Göker, H. Detection of Alzheimer’s Disease from Electroencephalography (EEG) Signals Using Multitaper and Ensemble Learning Methods. Uludağ Univ. J. Fac. Eng. 2023, 28, 141–152. [Google Scholar] [CrossRef]
  200. Ali, M.H.; Uddin, M.B. Detection of Sleep Arousal from STFT-Based Instantaneous Features of Single Channel EEG Signal. Physiol. Meas. 2024, 45, 105005. [Google Scholar] [CrossRef] [PubMed]
  201. Lew, R.; Dyre, B.P.; Werner, S.; Wotring, B.; Tran, T. Exploring the Potential of Short-Time Fourier Transforms for Analyzing Skin Conductance and Pupillometry in Real-Time Applications. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2008, 52, 1536–1540. [Google Scholar] [CrossRef]
  202. Sharma, N.; Gaurav, G.; Anand, R.S. Epileptic Seizure Detection Using STFT Based Peak Mean Feature and Support Vector Machine. In Proceedings of the 2021 8th International Conference on Signal Processing and Integrated Networks (SPIN), Noida, India, 26 August 2021; IEEE: New York, NY, USA, 2021; pp. 1131–1136. [Google Scholar]
  203. Grobbelaar, M.; Phadikar, S.; Ghaderpour, E.; Struck, A.F.; Sinha, N.; Ghosh, R.; Ahmed, M.Z.I. A Survey on Denoising Techniques of Electroencephalogram Signals Using Wavelet Transform. Signals 2022, 3, 577–586. [Google Scholar] [CrossRef]
  204. Elshekhidris, I.H.; MohamedAmien, M.B.; Fragoon, A. Wavelet Transforms for EEG Signal Denoising and Decomposition. Int. J. Adv. Signal Image Sci. 2023, 9, 11–28. [Google Scholar] [CrossRef]
  205. Bajaj, N. Wavelets for EEG Analysis. In Wavelet Theory; IntechOpen: London, UK, 2021. [Google Scholar]
  206. Shirodkar, V.; Edla, D.R.; Kumari, A. Generative Adversarial Networks for Motor Imagery Classification Using Wavelet Packet Decomposition and Complex Morlet Transform. Multimed. Tools Appl. 2025, 84, 34377–34400. [Google Scholar] [CrossRef]
  207. Liu, Y.; Liu, G.; Wu, S.; Tin, C. Phase Spectrogram of EEG from S-Transform Enhances Epileptic Seizure Detection. Expert Syst. Appl. 2025, 262, 125621. [Google Scholar] [CrossRef]
  208. Gosala, B.; Dindayal Kapgate, P.; Jain, P.; Nath Chaurasia, R.; Gupta, M. Wavelet Transforms for Feature Engineering in EEG Data Processing: An Application on Schizophrenia. Biomed. Signal Process. Control 2023, 85, 104811. [Google Scholar] [CrossRef]
  209. Alyasseri, Z.A.A.; Khader, A.T.; Al-Betar, M.A.; Abasi, A.K.; Makhadmeh, S.N. EEG Signals Denoising Using Optimal Wavelet Transform Hybridized with Efficient Metaheuristic Methods. IEEE Access 2020, 8, 10584–10605. [Google Scholar] [CrossRef]
  210. Urbina Fredes, S.; Dehghan Firoozabadi, A.; Adasme, P.; Zabala-Blanco, D.; Palacios Játiva, P.; Azurdia-Meza, C. Enhanced Epileptic Seizure Detection through Wavelet-Based Analysis of EEG Signal Processing. Appl. Sci. 2024, 14, 5783. [Google Scholar] [CrossRef]
  211. Amin, H.U.; Ullah, R.; Reza, M.F.; Malik, A.S. Single-Trial Extraction of Event-Related Potentials (ERPs) and Classification of Visual Stimuli by Ensemble Use of Discrete Wavelet Transform with Huffman Coding and Machine Learning Techniques. J. Neuroeng. Rehabil. 2023, 20, 70. [Google Scholar] [CrossRef] [PubMed]
  212. Cura, O.K.; Akan, A. Classification of Epileptic EEG Signals Using Synchrosqueezing Transform and Machine Learning. Int. J. Neural Syst. 2021, 31, 2150005. [Google Scholar] [CrossRef] [PubMed]
  213. Ra, J.S.; Li, T.; Li, Y. Epileptic Seizure Prediction Based on Synchroextracting Transform and Sparse Representation. IEEE Access 2024, 12, 187684–187695. [Google Scholar] [CrossRef]
  214. Yousif, M.A.A.; Ozturk, M. Deep Learning-Based Classification of Epileptic Electroencephalography Signals Using a Concentrated Time-Frequency Approach. Int. J. Neural Syst. 2023, 33, 2350064. [Google Scholar] [CrossRef]
  215. Shimizu, R.; Wu, H.-T. Unveil Sleep Spindles with Concentration of Frequency and Time (ConceFT). Physiol. Meas. 2024, 45, 085003. [Google Scholar] [CrossRef]
  216. Moca, V.V.; Bârzan, H.; Nagy-Dăbâcan, A.; Mureșan, R.C. Time-Frequency Super-Resolution with Superlets. Nat. Commun. 2021, 12, 337. [Google Scholar] [CrossRef]
  217. Kumar Dwivedi, A.; Prakash Verma, O.; Taran, S. Adaptive Flexible Analytic Wavelet Transform for EEG-Based Emotion Recognition. IEEE Sens. J. 2024, 24, 28941–28951. [Google Scholar] [CrossRef]
  218. Kumar Dwivedi, A.; Prakash Verma, O.; Taran, S. Group Sparse and Super Resolution Time-Frequency-Based Method for Emotion Recognition. Digit. Signal Process. 2026, 170, 105761. [Google Scholar] [CrossRef]
  219. Guo, Y.; Wan, L.; Sheng, X.; Wang, G.; Kang, S.; Zhou, H.; Zhang, X. The Application of Superlet Transform in EEG-Based Motor Imagery Classification of Unilateral Knee Movement. In Proceedings of the International Conference on Autonomous Unmanned Systems, Singapore, 19–21 September 2024; pp. 511–521. [Google Scholar]
  220. Sharma, N.; Sharma, M.; Singhal, A.; Fatema, N.; Jadoun, V.K.; Malik, H.; Afthanorhan, A. A Spatiotemporal Feature Extraction Technique Using Superlet-CNN Fusion for Improved Motor Imagery Classification. IEEE Access 2025, 13, 2141–2151. [Google Scholar] [CrossRef]
  221. Makeig, S.; Jung, T.-P.; Bell, A.J.; Sejnowski, T.J. Independent Component Analysis of Electroencephalographic Data. In Proceedings of the Advances in Neural Information Processing Systems 8 (NIPS 1995), Denver, CO, USA, 27–30 November 1995. [Google Scholar]
  222. Moosmann, M.; Eichele, T.; Nordby, H.; Hugdahl, K.; Calhoun, V.D. Joint Independent Component Analysis for Simultaneous EEG–fMRI: Principle and Simulation. Int. J. Psychophysiol. 2008, 67, 212–221. [Google Scholar] [CrossRef]
  223. Luo, Z. Independent Vector Analysis: Model, Applications, Challenges. Pattern Recognit. 2023, 138, 109376. [Google Scholar] [CrossRef]
  224. Moraes, C.P.A.; Aristimunha, B.; Dos Santos, L.H.; Pinaya, W.H.L.; de Camargo, R.Y.; Fantinato, D.G.; Neves, A. Applying Independent Vector Analysis on EEG-Based Motor Imagery Classification. In Proceedings of the ICASSP 2023—2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Rhodes Island, Greece, 4–10 June 2023; IEEE: New York, NY, USA, 2023; pp. 1–5. [Google Scholar]
  225. Subasi, A.; Ismail Gursoy, M. EEG Signal Classification Using PCA, ICA, LDA and Support Vector Machines. Expert Syst. Appl. 2010, 37, 8659–8666. [Google Scholar] [CrossRef]
  226. Gillis, N. The Why and How of Nonnegative Matrix Factorization. arXiv 2014, arXiv:1401.5226. [Google Scholar]
  227. Hu, G.; Zhou, T.; Luo, S.; Mahini, R.; Xu, J.; Chang, Y.; Cong, F. Assessment of Nonnegative Matrix Factorization Algorithms for Electroencephalography Spectral Analysis. Biomed. Eng. Online 2020, 19, 61. [Google Scholar] [CrossRef]
  228. Liu, F.; Wang, S.; Rosenberger, J.; Su, J.; Liu, H. A Sparse Dictionary Learning Framework to Discover Discriminative Source Activations in EEG Brain Mapping. In Proceedings of the AAAI Conference on Artificial Intelligence, San Francisco, CA, USA, 4–9 February 2017; Volume 31. [Google Scholar] [CrossRef]
  229. Barthélemy, Q.; Gouy-Pailler, C.; Isaac, Y.; Souloumiac, A.; Larue, A.; Mars, J.I. Multivariate Temporal Dictionary Learning for EEG. J. Neurosci. Methods 2013, 215, 19–28. [Google Scholar] [CrossRef]
  230. OK, F.; Rajesh, R. Empirical Mode Decomposition of EEG Signals for the Effectual Classification of Seizures. In Advances in Neural Signal Processing; IntechOpen: London, UK, 2020. [Google Scholar]
  231. ShahbazPanahi, S.; Jing, Y. Recent Advances in Network Beamforming. In Academic Press Library in Signal Processing, Volume 7; Elsevier: Amsterdam, The Netherlands, 2018; pp. 403–477. [Google Scholar]
  232. Westner, B.U.; Dalal, S.S.; Gramfort, A.; Litvak, V.; Mosher, J.C.; Oostenveld, R.; Schoffelen, J.-M. A Unified View on Beamformers for M/EEG Source Reconstruction. Neuroimage 2022, 246, 118789. [Google Scholar] [CrossRef]
  233. Hauk, O. Keep It Simple: A Case for Using Classical Minimum Norm Estimation in the Analysis of EEG and MEG Data. Neuroimage 2004, 21, 1612–1621. [Google Scholar] [CrossRef]
  234. Dattola, S.; Morabito, F.C.; Mammone, N.; La Foresta, F. Findings about LORETA Applied to High-Density EEG—A Review. Electronics 2020, 9, 660. [Google Scholar] [CrossRef]
  235. Bastola, S.; Jahromi, S.; Chikara, R.; Stufflebeam, S.M.; Ottensmeyer, M.P.; De Novi, G.; Papadelis, C.; Alexandrakis, G. Improved Dipole Source Localization from Simultaneous MEG-EEG Data by Combining a Global Optimization Algorithm with a Local Parameter Search: A Brain Phantom Study. Bioengineering 2024, 11, 897. [Google Scholar] [CrossRef]
  236. Veeramalla, S.K.; Talari, V.K.H.R. Multiple Dipole Source Localization of EEG Measurements Using Particle Filter with Partial Stratified Resampling. Biomed. Eng. Lett. 2020, 10, 205–215. [Google Scholar] [CrossRef]
  237. Guevara, M.A.; Corsi-Cabrera, M. EEG Coherence or EEG Correlation? Int. J. Psychophysiol. 1996, 23, 145–153. [Google Scholar] [CrossRef]
  238. Hwang, S.; Shin, Y.; Sunwoo, J.-S.; Son, H.; Lee, S.-B.; Chu, K.; Jung, K.-Y.; Lee, S.K.; Kim, Y.-G.; Park, K.-I. Increased Coherence Predicts Medical Refractoriness in Patients with Temporal Lobe Epilepsy on Monotherapy. Sci. Rep. 2024, 14, 20530. [Google Scholar] [CrossRef]
  239. Awais, M.A.; Yusoff, M.Z.; Khan, D.M.; Yahya, N.; Kamel, N.; Ebrahim, M. Effective Connectivity for Decoding Electroencephalographic Motor Imagery Using a Probabilistic Neural Network. Sensors 2021, 21, 6570. [Google Scholar] [CrossRef]
  240. Escalante Puente de la Vega, W.; Pisarchik, A.N. Effective Brain Connectivity Analysis During Endogenous Selective Attention Based on Granger Causality. Appl. Sci. 2025, 16, 101. [Google Scholar] [CrossRef]
  241. Brunner, C.; Billinger, M.; Seeber, M.; Mullen, T.R.; Makeig, S. Volume Conduction Influences Scalp-Based Connectivity Estimates. Front. Comput. Neurosci. 2016, 10, 121. [Google Scholar] [CrossRef] [PubMed]
  242. Anzolin, A.; Presti, P.; Van De Steen, F.; Astolfi, L.; Haufe, S.; Marinazzo, D. Quantifying the Effect of Demixing Approaches on Directed Connectivity Estimated Between Reconstructed EEG Sources. Brain Topogr. 2019, 32, 655–674. [Google Scholar] [CrossRef] [PubMed]
  243. Ursino, M.; Ricci, G.; Magosso, E. Transfer Entropy as a Measure of Brain Connectivity: A Critical Analysis with the Help of Neural Mass Models. Front. Comput. Neurosci. 2020, 14, 45. [Google Scholar] [CrossRef]
  244. Huang, C.-S.; Pal, N.R.; Chuang, C.-H.; Lin, C.-T. Identifying Changes in EEG Information Transfer during Drowsy Driving by Transfer Entropy. Front. Hum. Neurosci. 2015, 9, 570. [Google Scholar] [CrossRef]
  245. Zhu, J.-Y.; Li, M.-M.; Zhang, Z.-H.; Liu, G.; Wan, H. Performance Baseline of Phase Transfer Entropy Methods for Detecting Animal Brain Area Interactions. Entropy 2023, 25, 994. [Google Scholar] [CrossRef]
  246. Lee, J.-Y.; Shim, M.; Chang, W.K.; Cho, H.-M.; Choi, J.-S.; Kim, H.; Suh, B.; Paik, N.-J.; Hwang, H.-J.; Kim, W.-S. Functional Connectivity Associated with Severe Upper Limb Impairment in Resting-State Electroencephalography among Chronic Stroke Survivors: A Machine Learning Approach. J. Neuroeng. Rehabil. 2025, 22, 267. [Google Scholar] [CrossRef]
  247. Paitel, E.R.; Otteman, C.B.D.; Polking, M.C.; Licht, H.J.; Nielson, K.A. Functional and Effective EEG Connectivity Patterns in Alzheimer’s Disease and Mild Cognitive Impairment: A Systematic Review. Front. Aging Neurosci. 2025, 17, 1496235. [Google Scholar] [CrossRef] [PubMed]
  248. Bagherzadeh, S.; Shalbaf, A. EEG-Based Schizophrenia Detection Using Fusion of Effective Connectivity Maps and Convolutional Neural Networks with Transfer Learning. Cogn. Neurodyn. 2024, 18, 2767–2778. [Google Scholar] [CrossRef] [PubMed]
  249. Wang, H.; Zhang, Q.; Luo, Y.; Wang, Q.; Zhu, S.; Yi, W.; Wang, J. Analysis of Depressive EEG Signals via Symbolic Phase Transfer Entropy with an Adaptive Template Method. AIP Adv. 2024, 14, 065115. [Google Scholar] [CrossRef]
  250. Na, S.H.; Jin, S.-H.; Kim, S.Y.; Ham, B.-J. EEG in Schizophrenic Patients: Mutual Information Analysis. Clin. Neurophysiol. 2002, 113, 1954–1960. [Google Scholar] [CrossRef]
  251. Baselice, F.; Sorriso, A.; Rucco, R.; Sorrentino, P. Phase Linearity Measurement: A Novel Index for Brain Functional Connectivity. IEEE Trans. Med. Imaging 2019, 38, 873–882. [Google Scholar] [CrossRef]
  252. Helfrich, R.F.; Herrmann, C.S.; Engel, A.K.; Schneider, T.R. Different Coupling Modes Mediate Cortical Cross-Frequency Interactions. Neuroimage 2016, 140, 76–82. [Google Scholar] [CrossRef]
  253. Chao, J.; Zheng, S.; Lei, C.; Peng, H.; Hu, B. Exploratory Cross-Frequency Coupling and Scaling Analysis of Neuronal Oscillations Stimulated by Emotional Images: An Evidence From EEG. IEEE Trans. Cogn. Dev. Syst. 2023, 15, 1732–1743. [Google Scholar] [CrossRef]
  254. de Haan, W.; Pijnenburg, Y.A.L.; Strijers, R.L.; van der Made, Y.; van der Flier, W.M.; Scheltens, P.; Stam, C.J. Functional Neural Network Analysis in Frontotemporal Dementia and Alzheimer’s Disease Using EEG and Graph Theory. BMC Neurosci. 2009, 10, 101. [Google Scholar] [CrossRef]
  255. Pagnotta, M.F.; Plomp, G. Time-Varying MVAR Algorithms for Directed Connectivity Analysis: Critical Comparison in Simulations and Benchmark EEG Data. PLoS ONE 2018, 13, e0198846. [Google Scholar] [CrossRef]
  256. Kawano, T.; Hattori, N.; Uno, Y.; Kitajo, K.; Hatakenaka, M.; Yagura, H.; Fujimoto, H.; Yoshioka, T.; Nagasako, M.; Otomune, H.; et al. Large-Scale Phase Synchrony Reflects Clinical Status After Stroke: An EEG Study. Neurorehabil. Neural Repair 2017, 31, 561–570. [Google Scholar] [CrossRef]
  257. Pradhan, N.; Narayana Dutt, D. A Nonlinear Perspective in Understanding the Neurodynamics of EEG. Comput. Biol. Med. 1993, 23, 425–442. [Google Scholar] [CrossRef] [PubMed]
  258. Winter, L.; Taylor, P.; Bellenger, C.; Grimshaw, P.; Crowther, R.G. The Application of the Lyapunov Exponent to Analyse Human Performance: A Systematic Review. J. Sports Sci. 2023, 41, 1994–2013. [Google Scholar] [CrossRef] [PubMed]
  259. Moaveninejad, S.; Cauzzo, S.; Porcaro, C. Fractal Dimension and Clinical Neurophysiology Fusion to Gain a Deeper Brain Signal Understanding: A Systematic Review. Inf. Fusion 2025, 118, 102936. [Google Scholar] [CrossRef]
  260. Valjarevic, S.; Paunovic Pantic, J.; Cumic, J.; Corridon, P.R.; Pantic, I. Fractal Analysis of Auditory Evoked Potentials: Research Gaps and Potential AI Applications. Fractal Fract. 2025, 10, 20. [Google Scholar] [CrossRef]
  261. Tou, S.L.J.; Chau, T. The Fractal Dimension of Resting State EEG Increases over Age in Children. Cereb. Cortex 2025, 35, bhaf138. [Google Scholar] [CrossRef]
  262. Lal, U.; Chikkankod, A.V.; Longo, L. Fractal Dimensions and Machine Learning for Detection of Parkinson’s Disease in Resting-State Electroencephalography. Neural Comput. Appl. 2024, 36, 8257–8280. [Google Scholar] [CrossRef]
  263. Esteban, F.J.; Vargas, E. Foundations and Clinical Applications of Fractal Dimension in Neuroscience: Concepts and Perspectives. AppliedMath 2026, 6, 7. [Google Scholar] [CrossRef]
  264. Higuchi, T. Approach to an Irregular Time Series on the Basis of the Fractal Theory. Phys. D 1988, 31, 277–283. [Google Scholar] [CrossRef]
  265. Katz, M.J. Fractals and the Analysis of Waveforms. Comput. Biol. Med. 1988, 18, 145–156. [Google Scholar] [CrossRef]
  266. Khoa, T.Q.D.; Ha, V.Q.; Toi, V.V. Higuchi Fractal Properties of Onset Epilepsy Electroencephalogram. Comput. Math. Methods Med. 2012, 2012, 461426. [Google Scholar] [CrossRef]
  267. Sevcik, C. A Procedure to Estimate the Fractal Dimension of Waveforms. arXiv 2010, arXiv:1003.5266. [Google Scholar] [CrossRef]
  268. Petrosian, A. Kolmogorov Complexity of Finite Sequences and Recognition of Different Preictal EEG Patterns. In Proceedings of the Eighth IEEE Symposium on Computer-Based Medical Systems, Lubbock, TX, USA, 9–10 June 1995; IEEE: New York, NY, USA, 1995; pp. 212–217. [Google Scholar]
  269. Wang, F.; Wang, H.; Zhou, X.; Fu, R. A Driving Fatigue Feature Detection Method Based on Multifractal Theory. IEEE Sens. J. 2022, 22, 19046–19059. [Google Scholar] [CrossRef]
  270. Wang, X. A Multifractal Detrended Fluctuation Analysis of EEG Signals Based on Meditation and Attention Tasks. In Proceedings of the 2025 International Conference on Big Data, Communication Technology and Computer Applications, Kuala Lumpur, Malaysia, 14–16 February 2025; ACM: New York, NY, USA, 2025; pp. 75–80. [Google Scholar]
  271. Mohamed, A.F.; Jusas, V. Advancing Fractal Dimension Techniques to Enhance Motor Imagery Tasks Using EEG for Brain–Computer Interface Applications. Appl. Sci. 2025, 15, 6021. [Google Scholar] [CrossRef]
  272. Li, W.; Fang, C.; Zhu, Z.; Chen, C.; Song, A. Fractal Spiking Neural Network Scheme for EEG-Based Emotion Recognition. IEEE J. Transl. Eng. Health Med. 2024, 12, 106–118. [Google Scholar] [CrossRef] [PubMed]
  273. Yoder, K.J.; Brookshire, G.; Glatt, R.M.; Merrill, D.A.; Gerrol, S.; Quirk, C.; Lucero, C. Fractal Dimension Distributions of Resting-State Electroencephalography (EEG) Improve Detection of Dementia and Alzheimer’s Disease Compared to Traditional Fractal Analysis. Clin. Transl. Neurosci. 2024, 8, 27. [Google Scholar] [CrossRef]
  274. Tan, S.; Tang, Z.; He, Q.; Li, Y.; Cai, Y.; Zhang, J.; Fan, D.; Guo, Z. Automatic Detection and Prediction of Epileptic EEG Signals Based on Nonlinear Dynamics and Deep Learning: A Review. Front. Neurosci. 2025, 19, 1630664. [Google Scholar] [CrossRef]
  275. Kannathal, N.; Choo, M.L.; Acharya, U.R.; Sadasivan, P.K. Entropies for Detection of Epilepsy in EEG. Comput. Methods Programs Biomed. 2005, 80, 187–194. [Google Scholar] [CrossRef]
  276. Gao, Y.; Wang, X.; Potter, T.; Zhang, J.; Zhang, Y. Single-Trial EEG Emotion Recognition Using Granger Causality/Transfer Entropy Analysis. J. Neurosci. Methods 2020, 346, 108904. [Google Scholar] [CrossRef]
  277. Aftanas, L.I.; Lotova, N.V.; Koshkarov, V.I.; Pokrovskaja, V.L.; Popov, S.A.; Makhnev, V.P. Non-Linear Analysis of Emotion EEG: Calculation of Kolmogorov Entropy and the Principal Lyapunov Exponent. Neurosci. Lett. 1997, 226, 13–16. [Google Scholar] [CrossRef]
  278. Geng, S.; Zhou, W.; Yuan, Q.; Cai, D.; Zeng, Y. EEG Non-Linear Feature Extraction Using Correlation Dimension and Hurst Exponent. Neurol. Res. 2011, 33, 908–912. [Google Scholar] [CrossRef]
  279. Lahmiri, S. Generalized Hurst Exponent Estimates Differentiate EEG Signals of Healthy and Epileptic Patients. Phys. A Stat. Mech. Its Appl. 2018, 490, 378–385. [Google Scholar] [CrossRef]
  280. Niknazar, M.; Mousavi, S.R.; Vosoughi Vahdat, B.; Sayyah, M. A New Framework Based on Recurrence Quantification Analysis for Epileptic Seizure Detection. IEEE J. Biomed. Health Inform. 2013, 17, 572–578. [Google Scholar] [CrossRef] [PubMed]
  281. Shabani, H.; Mikaili, M.; Noori, S.M.R. Assessment of Recurrence Quantification Analysis (RQA) of EEG for Development of a Novel Drowsiness Detection System. Biomed. Eng. Lett. 2016, 6, 196–204. [Google Scholar] [CrossRef]
  282. Talaat, M.; Awadalla, M.; Abdel-Hamid, L. Recurrence Quantification Analysis (RQA) Features vs. Traditional EEG Features for Alzheimer’s Disease Diagnosis. Intel. Artif. 2025, 28, 170–185. [Google Scholar] [CrossRef]
  283. Sun, C.; Mou, C. Survey on the Research Direction of EEG-Based Signal Processing. Front. Neurosci. 2023, 17, 1203059. [Google Scholar] [CrossRef]
  284. Hosseini, M.-P.; Hosseini, A.; Ahi, K. A Review on Machine Learning for EEG Signal Processing in Bioengineering. IEEE Rev. Biomed. Eng. 2021, 14, 204–218. [Google Scholar] [CrossRef]
  285. Saeidi, M.; Karwowski, W.; Farahani, F.V.; Fiok, K.; Taiar, R.; Hancock, P.A.; Al-Juaid, A. Neural Decoding of EEG Signals with Machine Learning: A Systematic Review. Brain Sci. 2021, 11, 1525. [Google Scholar] [CrossRef]
  286. Jain, A.; Raja, R.; Srivastava, S.; Sharma, P.C.; Gangrade, J.; Manoj, R. Analysis of EEG Signals and Data Acquisition Methods: A Review. Comput. Methods Biomech. Biomed. Eng. Imaging Vis. 2024, 12, 2304574. [Google Scholar] [CrossRef]
  287. Näher, T.; Bastian, L.; Vorreuther, A.; Fries, P.; Goebel, R.; Sorger, B. Riemannian Geometry Boosts Functional Near-Infrared Spectroscopy-Based Brain-State Classification Accuracy. Neurophotonics 2025, 12, 045002. [Google Scholar] [CrossRef]
  288. Wosiak, A.; Tereszczuk, A.; Żykwińska, K. Determining Levels of Affective States with Riemannian Geometry Applied to EEG Signals. Appl. Sci. 2025, 15, 10370. [Google Scholar] [CrossRef]
  289. Al-Mashhadani, Z.; Bayat, N.; Kadhim, I.F.; Choudhury, R.; Park, J.-H. The Efficacy and Utility of Lower-Dimensional Riemannian Geometry for EEG-Based Emotion Classification. Appl. Sci. 2023, 13, 8274. [Google Scholar] [CrossRef]
  290. Tibermacine, I.E.; Russo, S.; Tibermacine, A.; Rabehi, A.; Nail, B.; Kadri, K.; Napoli, C. Riemannian Geometry-Based EEG Approaches: A Literature Review. arXiv 2024, arXiv:2407.20250. [Google Scholar]
  291. Zhuo, F.; Zhang, X.; Tang, F.; Yu, Y.; Liu, L. Riemannian Transfer Learning Based on Log-Euclidean Metric for EEG Classification. Front. Neurosci. 2024, 18, 1381572. [Google Scholar] [CrossRef] [PubMed]
  292. Bleuzé, A.; Mattout, J.; Congedo, M. Tangent Space Alignment: Transfer Learning for Brain-Computer Interface. Front. Hum. Neurosci. 2022, 16, 1049985. [Google Scholar] [CrossRef]
  293. Lawhern, V.J.; Solon, A.J.; Waytowich, N.R.; Gordon, S.M.; Hung, C.P.; Lance, B.J. EEGNet: A Compact Convolutional Network for EEG-Based Brain-Computer Interfaces. J. Neural Eng. 2018, 15, 056013. [Google Scholar] [CrossRef]
  294. Schirrmeister, R.T.; Springenberg, J.T.; Fiederer, L.D.J.; Glasstetter, M.; Eggensperger, K.; Tangermann, M.; Hutter, F.; Burgard, W.; Ball, T. Deep Learning with Convolutional Neural Networks for EEG Decoding and Visualization. Hum. Brain Mapp. 2017, 38, 5391–5420. [Google Scholar] [CrossRef]
  295. Islam, M.R.; Massicotte, D.; Nougarou, F.; Massicotte, P.; Zhu, W.-P. S-Convnet: A Shallow Convolutional Neural Network Architecture for Neuromuscular Activity Recognition Using Instantaneous High-Density Surface EMG Images. In Proceedings of the 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Virtual, 20–24 July 2020; IEEE: New York, NY, USA, 2020; pp. 744–749. [Google Scholar]
  296. Zhao, X.; Zhang, H.; Zhu, G.; You, F.; Kuang, S.; Sun, L. A Multi-Branch 3D Convolutional Neural Network for EEG-Based Motor Imagery Classification. IEEE Trans. Neural Syst. Rehabil. Eng. 2019, 27, 2164–2177. [Google Scholar] [CrossRef]
  297. Craik, A.; He, Y.; Contreras-Vidal, J.L. Deep Learning for Electroencephalogram (EEG) Classification Tasks: A Review. J. Neural Eng. 2019, 16, 031001. [Google Scholar] [CrossRef]
  298. Altaheri, H.; Muhammad, G.; Alsulaiman, M.; Amin, S.U.; Altuwaijri, G.A.; Abdul, W.; Bencherif, M.A.; Faisal, M. Deep Learning Techniques for Classification of Electroencephalogram (EEG) Motor Imagery (MI) Signals: A Review. Neural Comput. Appl. 2023, 35, 14681–14722. [Google Scholar] [CrossRef]
  299. Roy, Y.; Banville, H.; Albuquerque, I.; Gramfort, A.; Falk, T.H.; Faubert, J. Deep Learning-Based Electroencephalography Analysis: A Systematic Review. J. Neural Eng. 2019, 16, 051001. [Google Scholar] [CrossRef]
  300. Chowdhury, M.R.; Ding, Y.; Sen, S. SSL-SE-EEG: A Framework for Robust Learning from Unlabeled EEG Data with Self-Supervised Learning and Squeeze-Excitation Networks. In Proceedings of the 47th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC 2025), Copenhagen, Denmark, 14–17 July 2025. [Google Scholar]
  301. Afzal, M.F.; Desai, S.A.; Barry, W.; Tcheng, T.K.; Kuo, J.; Benard, S.W.; Traner, C.B.; Greene, D.; Seale, C.G.; Morrell, M.J. Using Vision Transformers for Electrographic Seizure Classification to Aid Physician Review of Intracranial Electroencephalography Recordings. Front. Hum. Neurosci. 2025, 19, 1680395. [Google Scholar] [CrossRef] [PubMed]
  302. Li, Q.; Cao, W.; Zhang, A. Multi-Stream Feature Fusion of Vision Transformer and CNN for Precise Epileptic Seizure Detection from EEG Signals. J. Transl. Med. 2025, 23, 871. [Google Scholar] [CrossRef] [PubMed]
  303. Klepl, D.; Wu, M.; He, F. Graph Neural Network-Based EEG Classification: A Survey. IEEE Trans. Neural Syst. Rehabil. Eng. 2024, 32, 493–503. [Google Scholar] [CrossRef] [PubMed]
  304. Zhang, Y.; Liao, Y.; Chen, W.; Zhang, X.; Huang, L. Emotion Recognition of EEG Signals Based on Contrastive Learning Graph Convolutional Model. J. Neural Eng. 2024, 21, 046060. [Google Scholar] [CrossRef]
  305. Amrani, G.; Adadi, A.; Berrada, M.; Souirti, Z.; Boujraf, S. EEG Signal Analysis Using Deep Learning: A Systematic Literature Review. In Proceedings of the 2021 Fifth International Conference on Intelligent Computing in Data Sciences (ICDS), Fez, Morocco, 20 October 2021; IEEE: New York, NY, USA, 2021; pp. 1–8. [Google Scholar]
  306. Ye, W.; Zhang, Z.; Teng, F.; Zhang, M.; Wang, J.; Ni, D.; Li, F.; Xu, P.; Liang, Z. Semi-Supervised Dual-Stream Self-Attentive Adversarial Graph Contrastive Learning for Cross-Subject EEG-Based Emotion Recognition. IEEE Trans. Affect. Comput. 2025, 16, 290–305. [Google Scholar] [CrossRef]
  307. Zhang, H.; Li, H. Transformer-Based EEG Decoding: A Survey. arXiv 2025, arXiv:2507.02320. [Google Scholar] [CrossRef]
  308. Kuruppu, G.; Wagh, N.; Varatharajah, Y. EEG Foundation Models: A Critical Review of Current Progress and Future Directions. arXiv 2025, arXiv:2507.11783. [Google Scholar] [CrossRef]
  309. Wang, C.; Subramaniam, V.; Yaari, A.U.; Kreiman, G.; Katz, B.; Cases, I.; Barbu, A. BrainBERT: Self-Supervised Representation Learning for Intracranial Recordings. In Proceedings of the Eleventh International Conference on Learning Representations (ICLR 2023), Virtual, 1–5 May 2023. [Google Scholar]
  310. Wu, D.; Li, S.; Yang, J.; Sawan, M. Neuro-BERT: Rethinking Masked Autoencoding for Self-Supervised Neurological Pretraining. IEEE J. Biomed. Health Inform. 2024, 1–11. [Google Scholar] [CrossRef]
  311. Zhou, J.; Duan, Y.; Chang, F.; Do, T.; Wang, Y.-K.; Lin, C.-T. BELT-2: Bootstrapping EEG-to-Language Representation Alignment for Multi-Task Brain Decoding. arXiv 2024, arXiv:2409.00121. [Google Scholar]
  312. Liu, A.; Jing, H.; Liu, Y.; Ma, Y.; Zheng, N. Hidden States in LLMs Improve EEG Representation Learning and Visual Decoding. ECAI 2024, 392, 2130–2137. [Google Scholar]
  313. Chen, H.; Zeng, W.; Chen, C.; Cai, L.; Wang, F.; Shi, Y.; Wang, L.; Zhang, W.; Li, Y.; Yan, H.; et al. EEG Emotion Copilot: Optimizing Lightweight LLMs for Emotional EEG Interpretation with Assisted Medical Record Generation. Neural Netw. 2025, 192, 107848. [Google Scholar] [CrossRef]
  314. Qin, C.; Yang, R.; You, W.; Chen, Z.; Zhu, L.; Huang, M.; Wang, Z. EEGUnity: Open-Source Tool in Facilitating Unified EEG Datasets Towards Large-Scale EEG Model. arXiv 2024, arXiv:2410.07196. [Google Scholar] [CrossRef] [PubMed]
  315. You, Z.; Guo, Y.; Zhang, X.; Zhao, Y. Virtual Electroencephalogram Acquisition: A Review on Electroencephalogram Generative Methods. Sensors 2025, 25, 3178. [Google Scholar] [CrossRef] [PubMed]
  316. Yang, J.; Yu, H.; Shen, T.; Song, Y.; Chen, Z. 4-Class MI-EEG Signal Generation and Recognition with CVAE-GAN. Appl. Sci. 2021, 11, 1798. [Google Scholar] [CrossRef]
  317. Bao, G.; Yan, B.; Tong, L.; Shu, J.; Wang, L.; Yang, K.; Zeng, Y. Data Augmentation for EEG-Based Emotion Recognition Using Generative Adversarial Networks. Front. Comput. Neurosci. 2021, 15, 723843. [Google Scholar] [CrossRef]
  318. Torma, S.; Szegletes, L. Generative Modeling and Augmentation of EEG Signals Using Improved Diffusion Probabilistic Models. J. Neural Eng. 2025, 22, 016001. [Google Scholar] [CrossRef]
  319. Zhou, T.; Chen, X.; Shen, Y.; Nieuwoudt, M.; Pun, C.-M.; Wang, S. Generative AI Enables EEG Data Augmentation for Alzheimer’s Disease Detection Via Diffusion Model. In Proceedings of the 2023 IEEE International Symposium on Product Compliance Engineering-Asia (ISPCE-ASIA), Shanghai, China, 4 November 2023; IEEE: New York, NY, USA, 2023; pp. 1–6. [Google Scholar]
  320. Alexandre, H.d.L.; Lima, C.A.d.M. Synthetic EEG Generation Using Diffusion Models for Motor Imagery Tasks. In Proceedings of the 13th Brazilian Conference on Intelligent Systems (BRACIS 2024), Belém, Brazil, 17–21 November 2024. [Google Scholar]
  321. Bai, Y.; Wang, X.; Cao, Y.; Ge, Y.; Yuan, C.; Shan, Y. DreamDiffusion: Generating High-Quality Images from Brain EEG Signals. In Proceedings of the 18th European Conference on Computer Vision (ECCV 2024), Milan, Italy, 29 September–4 October 2024. [Google Scholar]
  322. Qian, D.; Zeng, H.; Cheng, W.; Liu, Y.; Bikki, T.; Pan, J. NeuroDM: Decoding and Visualizing Human Brain Activity with EEG-Guided Diffusion Model. Comput. Methods Programs Biomed. 2024, 251, 108213. [Google Scholar] [CrossRef]
  323. Puah, J.H.; Goh, S.K.; Zhang, Z.; Ye, Z.; Chan, C.K.; Lim, K.S.; Fong, S.L.; Woon, K.S.; Guan, C. EEGDM: EEG Representation Learning via Generative Diffusion Model. arXiv 2025, arXiv:2508.14086. [Google Scholar]
  324. Li, W.; Li, H.; Sun, X.; Kang, H.; An, S.; Wang, G.; Gao, Z. Self-Supervised Contrastive Learning for EEG-Based Cross-Subject Motor Imagery Recognition. J. Neural Eng. 2024, 21, 026038. [Google Scholar] [CrossRef]
  325. Weng, W.; Gu, Y.; Guo, S.; Ma, Y.; Yang, Z.; Liu, Y.; Chen, Y. Self-Supervised Learning for Electroencephalogram: A Systematic Survey. ACM Comput. Surv. 2024, 57, 1–38. [Google Scholar] [CrossRef]
  326. Hallgarten, P.; Bethge, D.; Özdenizci, O.; Grosse-Puppendahl, T.; Kasneci, E. TS-MoCo: Time-Series Momentum Contrast for Self-Supervised Physiological Representation Learning. In Proceedings of the 2023 31st European Signal Processing Conference (EUSIPCO), Helsinki, Finland, 4–8 September 2023; IEEE: New York, NY, USA, 2023; pp. 1030–1034. [Google Scholar]
  327. Kostas, D.; Aroca-Ouellette, S.; Rudzicz, F. BENDR: Using Transformers and a Contrastive Self-Supervised Learning Task to Learn from Massive Amounts of EEG Data. Front. Hum. Neurosci. 2021, 15, 653659. [Google Scholar] [CrossRef]
  328. Zhang, K.; He, L.; Jiang, X.; Lu, W.; Wang, D.; Gao, X. CognitionCapturer: Decoding Visual Stimuli From Human EEG Signal with Multimodal Information. arXiv 2024, arXiv:2412.10489. [Google Scholar] [CrossRef]
  329. Wu, M.; Zhao, C.; Su, A.; Di, D.; Fu, T.; An, D.; He, M.; Gao, Y.; Ma, M.; Yan, K.; et al. Hypergraph Multi-Modal Large Language Model: Exploiting EEG and Eye-Tracking Modalities to Evaluate Heterogeneous Responses for Video Understanding. arXiv 2024, arXiv:2407.08150. [Google Scholar]
  330. Camaret Ndir, T.; Schirrmeister, R.T.; Ball, T. EEG-CLIP: Learning EEG Representations from Natural Language Descriptions. Front. Robot. AI 2025, 12, 1625731. [Google Scholar] [CrossRef] [PubMed]
  331. Wang, B.; Fu, X.; Lan, Y.; Zhang, L.; Zheng, W.; Xiang, Y. Large Transformers Are Better EEG Learners. arXiv 2024, arXiv:2308.11654. [Google Scholar] [CrossRef]
  332. Hu, Y.; Zhang, S.; Dang, T.; Jia, H.; Salim, F.D.; Hu, W.; Quigley, A.J. Exploring Large-Scale Language Models to Evaluate EEG-Based Multimodal Data for Mental Health. arXiv 2024, arXiv:2408.07313. [Google Scholar] [CrossRef]
  333. Zhou, X.; Liu, C.; Zhou, J.; Wang, Z.; Zhai, L.; Jia, Z.; Guan, C.; Liu, Y. Interpretable and Robust AI in EEG Systems: A Survey. arXiv 2025, arXiv:2304.10755. [Google Scholar]
  334. Sujatha Ravindran, A.; Contreras-Vidal, J. An Empirical Comparison of Deep Learning Explainability Approaches for EEG Using Simulated Ground Truth. Sci. Rep. 2023, 13, 17709. [Google Scholar] [CrossRef]
  335. Saarela, M.; Podgorelec, V. Recent Applications of Explainable AI (XAI): A Systematic Literature Review. Appl. Sci. 2024, 14, 8884. [Google Scholar] [CrossRef]
  336. Khan, W.; Khan, M.S.; Qasem, S.N.; Ghaban, W.; Saeed, F.; Hanif, M.; Ahmad, J. An Explainable and Efficient Deep Learning Framework for EEG-Based Diagnosis of Alzheimer’s Disease and Frontotemporal Dementia. Front. Med. 2025, 12, 1590201. [Google Scholar] [CrossRef]
  337. Sylvester, S.; Sagehorn, M.; Gruber, T.; Atzmueller, M.; Schöne, B. SHAP Value-Based ERP Analysis (SHERPA): Increasing the Sensitivity of EEG Signals with Explainable AI Methods. Behav. Res. Methods 2024, 56, 6067–6081. [Google Scholar] [CrossRef]
  338. Rejer, I.; Gago, I. AveragedLIME for General Explanations in EEG Domain. Neuroimage 2025, 323, 121588. [Google Scholar] [CrossRef] [PubMed]
  339. Raab, D.; Theissler, A.; Spiliopoulou, M. XAI4EEG: Spectral and Spatio-Temporal Explanation of Deep Learning-Based Seizure Detection in EEG Time Series. Neural Comput. Appl. 2023, 35, 10051–10068. [Google Scholar] [CrossRef]
  340. Park, D.; Park, H.; Kim, S.; Choo, S.; Lee, S.; Nam, C.S.; Jung, J.-Y. Spatio-Temporal Explanation of 3D-EEGNet for Motor Imagery EEG Classification Using Permutation and Saliency. IEEE Trans. Neural Syst. Rehabil. Eng. 2023, 31, 4504–4513. [Google Scholar] [CrossRef] [PubMed]
  341. Zhao, Y.; Cao, L.; Ji, Y.; Wang, B.; Wu, W. Interpretable EEG Emotion Classification via CNN Model and Gradient-Weighted Class Activation Mapping. Brain Sci. 2025, 15, 886. [Google Scholar] [CrossRef]
  342. Yang, L.; Wang, Z. Applications and Advances of Combined FMRI-FNIRs Techniques in Brain Functional Research. Front. Neurol. 2025, 16, 1542075. [Google Scholar] [CrossRef]
  343. Lian, X.; Liu, C.; Gao, C.; Deng, Z.; Guan, W.; Gong, Y. A Multi-Branch Network for Integrating Spatial, Spectral, and Temporal Features in Motor Imagery EEG Classification. Brain Sci. 2025, 15, 877. [Google Scholar] [CrossRef]
  344. Codina, T.; Blankertz, B.; von Lühmann, A. Multimodal FNIRS-EEG Sensor Fusion: Review of Data-Driven Methods and Perspective for Naturalistic Brain Imaging. Imaging Neurosci. 2025, 3, IMAG.a.974. [Google Scholar] [CrossRef]
  345. Cichy, R.M.; Oliva, A. A M/EEG-FMRI Fusion Primer: Resolving Human Brain Responses in Space and Time. Neuron 2020, 107, 772–781. [Google Scholar] [CrossRef]
  346. Bian, S.; Kang, P.; Moosmann, J.; Liu, M.; Bonazzi, P.; Rosipal, R.; Magno, M. On-Device Learning of EEGNet-Based Network for Wearable Motor Imagery Brain-Computer Interface. In Proceedings of the 2024 ACM International Symposium on Wearable Computers (ISWC ’24), Melbourne, Australia, 5–9 October 2024. [Google Scholar] [CrossRef]
  347. Muhl, E. The Challenge of Wearable Neurodevices for Workplace Monitoring: An EU Legal Perspective. Front. Hum. Dyn. 2024, 6, 1473893. [Google Scholar] [CrossRef]
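Several of the nonlinear-analysis entries above (Higuchi [264]; Khoa et al. [266]; Lal et al. [262]) centre on the Higuchi fractal dimension. The following is a minimal illustrative sketch of the estimator, not code from any of the cited works; parameter choices such as `k_max = 8` and the test signals are assumptions made here for demonstration:

```python
import numpy as np

def higuchi_fd(x, k_max=8):
    """Higuchi fractal dimension of a 1-D signal (illustrative sketch)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    ks = np.arange(1, k_max + 1)
    lengths = []
    for k in ks:
        lk = []
        for m in range(k):
            idx = np.arange(m, n, k)               # subsampled series x[m], x[m+k], ...
            diff = np.abs(np.diff(x[idx])).sum()   # curve length at scale k
            norm = (n - 1) / ((len(idx) - 1) * k)  # Higuchi's length normalisation
            lk.append(diff * norm / k)
        lengths.append(np.mean(lk))
    # Slope of log L(k) against log(1/k) estimates the fractal dimension
    slope, _ = np.polyfit(np.log(1.0 / ks), np.log(lengths), 1)
    return slope

rng = np.random.default_rng(1)
fd_noise = higuchi_fd(rng.standard_normal(2000))                 # irregular: FD near 2
fd_sine = higuchi_fd(np.sin(2 * np.pi * np.arange(2000) / 100))  # smooth: FD near 1
print(fd_sine < fd_noise)  # prints True
```

The sanity check at the end reflects the intuition used throughout the fractal-EEG literature: more irregular signals fill the time-amplitude plane more densely and so yield a higher dimension.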
Figure 1. (A) Schematic of EEG recording through electrodes placed on the surface of the scalp. (B) Expanded view: capacitive effects at the neuronal membrane determine the timing and shape of transmembrane currents, which act as the primary sources of extracellular potentials; these potentials then spread through the head tissues, predominantly by resistive volume conduction, to reach the scalp electrodes. (C) A representative neuron with its dendritic and axonal network, the basic unit of generation of Local Field Potentials (LFPs). (D) Close-up of the neuronal membrane, where ion channels (mainly Na+ and K+) and the flow of ions between the extracellular and intracellular spaces are depicted, the mechanism that generates membrane potential differences. (E) The biophysical model of the neural membrane as an equivalent electrical circuit of resistors, capacitors, and voltage sources, representing the capacitive and conductive mechanisms of the membrane.
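The equivalent-circuit view of panel (E) can be made concrete with a short numerical sketch: a passive RC membrane driven by a constant injected current and integrated with forward Euler. All parameter values below are illustrative assumptions, not figures taken from the review:

```python
import numpy as np

# Passive membrane as an RC circuit: C_m dV/dt = -(V - E_L)/R_m + I_ext
# Parameter values are illustrative assumptions.
C_m = 1e-9       # membrane capacitance: ~1 nF
R_m = 1e8        # membrane (leak) resistance: ~100 MOhm
E_L = -70e-3     # leak/resting potential: -70 mV
I_ext = 1.5e-10  # injected current: 150 pA

dt = 1e-5                    # integration step: 10 us
t = np.arange(0.0, 0.2, dt)  # simulate 200 ms
V = np.empty_like(t)
V[0] = E_L
for k in range(1, t.size):   # forward-Euler integration
    dV = (-(V[k - 1] - E_L) / R_m + I_ext) * dt / C_m
    V[k] = V[k - 1] + dV

# Charges toward E_L + I_ext * R_m = -55 mV with tau = R_m * C_m = 100 ms
print(round(V[-1] * 1000, 1))  # prints -57.0 (still ~2 mV from steady state after 2*tau)
```

The exponential approach toward the steady-state voltage, with time constant tau = R_m * C_m, is exactly the low-pass behaviour that shapes the transmembrane currents described in panel (B).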
Figure 2. The international 10–20 system.
Figure 3. Taxonomy of EEG systems.
Figure 4. A spectrum of EEG applications. A conceptual overview of the diverse utility of EEG across clinical and non-clinical sectors. The neurology branch represents areas where EEG provides critical diagnostic or monitoring data, including chronic disorders (e.g., epilepsy, ADHD) and acute clinical phenomena or biomarkers (e.g., seizures, coma). Other branches illustrate EEG’s role in neuroscience (cognitive state monitoring), BCI (direct device control), neurotherapy (improving cognitive function), biometrics (individual identification), and neuromarketing.
Figure 5. Major frequency bands and wave patterns in a typical EEG signal.
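The band decomposition shown in Figure 5 is commonly quantified as relative band power computed from a power spectral density estimate. The sketch below uses a minimal Welch-style estimator on a synthetic trace; the sampling rate, band edges, and signal are assumptions for illustration, not values prescribed by the review:

```python
import numpy as np

def welch_psd(x, fs, nperseg):
    """Averaged periodograms of Hann-windowed, 50%-overlapping segments
    (a minimal Welch-style estimator; relative band powers are unaffected
    by the omitted one-sided doubling)."""
    win = np.hanning(nperseg)
    step = nperseg // 2
    scale = fs * (win ** 2).sum()
    periodograms = [
        np.abs(np.fft.rfft(x[s:s + nperseg] * win)) ** 2 / scale
        for s in range(0, x.size - nperseg + 1, step)
    ]
    return np.fft.rfftfreq(nperseg, 1 / fs), np.mean(periodograms, axis=0)

fs = 250                      # assumed sampling rate (Hz)
rng = np.random.default_rng(0)
t = np.arange(0, 30, 1 / fs)  # 30 s synthetic "EEG"
x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)  # 10 Hz rhythm + noise

f, psd = welch_psd(x, fs, nperseg=4 * fs)  # 4 s segments -> 0.25 Hz resolution
bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}
df = f[1] - f[0]
total = psd[(f >= 0.5) & (f < 45)].sum() * df
rel = {name: psd[(f >= lo) & (f < hi)].sum() * df / total
       for name, (lo, hi) in bands.items()}
print(max(rel, key=rel.get))  # prints "alpha": the 10 Hz component dominates
```

Because the embedded 10 Hz rhythm falls inside the alpha band (8–13 Hz), the alpha share dominates the relative-power dictionary, mirroring how eyes-closed resting EEG typically presents.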
Figure 6. Methods used in EEG analysis.
Table 1. Major EEG hardware and acquisition systems.
| Manufacturer/Developer | Hardware System(s) | Acquisition Software |
| --- | --- | --- |
| Natus Medical Incorporated (Middleton, WI, USA) | NeuroWorks, Brain Quick, Natus Elite EMG | NeuroWorks v.10 / Brain Quick v.4 / Natus Elite v.2 |
| Nihon Kohden Corporation (Tokyo, Japan) | EEG-1200/2100 series, JE-921 | Neurofax v.5.03 |
| Compumedics Limited (Abbotsford, Australia) | SynAmps RT, NuAmps, Grael | CURRY 9 v.9.0.3 |
| Brain Products GmbH (Gilching, Germany) | actiCHamp, BrainAmp, LiveAmp | BrainVision Recorder 2 |
| BioSemi B.V. (Amsterdam, The Netherlands) | ActiveTwo | ActiView (v.10.4) |
| Electrical Geodesics, Inc. (Magstim EGI) (Eugene, OR, USA) | Net Amps 400, Geodesic Sensor Nets | Net Station (v.5.5) |
| g.tec medical engineering GmbH (Schiedlberg, Austria) | g.USBamp, g.HIamp, g.Nautilus | g.tec Suite 2024 |
| ANT Neuro (Hengelo, The Netherlands) | eego™ amplifiers, waveguard™ caps | ASA v.7.5 |
| Neuroelectrics (Barcelona, Spain) | Enobio, StarStim | NIC2 v.2.1.25 |
| TMSi (Oldenzaal, The Netherlands) | SAGA, APEX | TMSi Python interface (v.5.3.0.0) |
| OpenBCI (Brooklyn, NY, USA) | Cyton, Daisy, Ultracortex, Galeo | OpenBCI GUI v.6.0.0-beta.1 / BrainFlow v.5.20.1 |
| Cognionics (San Diego, CA, USA) | Quick-20r, Quick-32, Insight-8 | CGX Acquisition Suite |
| mBrainTrain (Belgrade, Serbia) | SMARTING / SMARTING PRO | mbtStreamer / SmartingProApp / mbtCameraLSL |
Table 2. Major EEG analysis software ecosystems.

| Software | Developer | Language/Base |
|---|---|---|
| EEGLAB v.2025.1.0 | SCCN (San Diego, CA, USA) | MATLAB |
| MNE-Python v.1.11.0 | MNE Community | Python |
| FieldTrip (rolling release) | Donders Institute for Brain, Cognition and Behaviour (Nijmegen, The Netherlands) | MATLAB |
| Brainstorm (rolling release) | CNRS (Paris, France) / USC (Los Angeles, CA, USA) | MATLAB/Java |
| Persyst v.15 | Persyst Development Corp. (Solana Beach, CA, USA) | C++/Windows |
| CURRY (Neuroimaging Suite) v.9.0.3 | Compumedics Neuroscan (Melbourne, Australia) | Windows |
| BrainVision Analyzer v.2.3.1 | Brain Products (Gilching, Germany) | Windows |
| BESA Research v.7.1 | BESA GmbH (Graefelfing, Germany) | Windows |
| Polysmith v.12 | Nihon Kohden (Tokyo, Japan) | Windows |
| ASA v.7.5 | ANT Neuro (Hengelo, The Netherlands) | Windows |
| LORETA-KEY 3rd generation | KEY Institute (Zurich, Switzerland) | Windows |
| BrainFlow v.5.20.1 | OpenBCI/Community (Brooklyn, NY, USA) | Python/C++/Java |
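A core operation shared by the analysis ecosystems in Table 2 is estimating power in the canonical frequency bands of Figure 5. The following is a minimal, library-agnostic sketch of band-power estimation using Welch's method from SciPy; the synthetic single-channel signal, sampling rate, and band edges are illustrative assumptions, not values from any specific toolkit.

```python
import numpy as np
from scipy.signal import welch

fs = 256  # sampling rate in Hz (illustrative)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)

# Synthetic single-channel "EEG": a 10 Hz alpha rhythm buried in broadband noise
x = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * rng.standard_normal(t.size)

# Welch power spectral density with 2-second segments (0.5 Hz resolution)
f, psd = welch(x, fs=fs, nperseg=fs * 2)
df = f[1] - f[0]

# Integrate the PSD over each canonical band
bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
power = {name: psd[(f >= lo) & (f < hi)].sum() * df for name, (lo, hi) in bands.items()}

dominant = max(power, key=power.get)
print(dominant)  # the injected 10 Hz rhythm makes "alpha" dominate
```

The same integrate-the-PSD pattern underlies the band-power exports of packages such as EEGLAB and MNE-Python, differing mainly in windowing defaults and artifact handling upstream.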
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Kalogeropoulos, C.; Theofilatos, K.; Mavroudi, S. From Neurons to Networks: A Holistic Review of Electroencephalography (EEG) from Neurophysiological Foundations to AI Techniques. Signals 2026, 7, 17. https://doi.org/10.3390/signals7010017

