DHCT-GAN: Improving EEG Signal Quality with a Dual-Branch Hybrid CNN–Transformer Network
Abstract
1. Introduction
- Dual-branch architecture for artifact-specific learning: The network features two branches, one learning clean EEG features and the other learning artifact features. This design allows the model to distinguish EEG signals from artifacts, leveraging cross-domain knowledge for more effective denoising. A gated network module controls the information exchange between the branches, adapting to diverse data characteristics.
- Multi-scale feature extraction and dependency capture: By combining CNNs and Transformers, DHCT-GAN captures both spatial and temporal characteristics, enhancing adaptability to complex signal dynamics. The integration of global and local Transformers enables the model to capture long-term and short-term dependencies, boosting feature robustness.
- Multi-discriminator GAN framework for stability: DHCT-GAN employs a multi-discriminator approach to mitigate issues such as mode collapse, ensuring stable and reliable denoising across diverse artifacts.
- We introduce a dual-branch hybrid CNN–Transformer network specifically designed for EEG denoising. This network explicitly models the noise and incorporates it as conditional information, effectively extracting both local and global features from EEG signals, enabling a more accurate and flexible denoising process. Additionally, the use of a multi-discriminator approach enhances the stability of model training.
- The design of DHCT-GAN carefully addresses the diversity and variability of artifact challenges, making it highly effective in practical scenarios, especially in areas such as BCI and clinical EEG analysis.
- The effectiveness and robustness of DHCT-GAN were validated on three public datasets. Experimental results demonstrate its ability to effectively remove four types of artifacts and reconstruct more accurate waveforms. Quantitative evaluation further highlights its superior performance compared to recent state-of-the-art models.
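The gated information exchange described in the first contribution can be sketched in NumPy. The sigmoid gate, the weight matrix `W_g`, and the convex-combination form below are illustrative assumptions (a common gating pattern), not the paper's exact formulation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(eeg_feat, artifact_feat, W_g, b_g):
    """Blend clean-EEG and artifact branch features through a learned gate.

    The gate is computed from both branches, so each position decides how
    much artifact information to let through. This is a generic gating
    sketch; the paper's gated network module may differ in detail.
    """
    joint = np.concatenate([eeg_feat, artifact_feat], axis=-1)  # (T, 2D)
    gate = sigmoid(joint @ W_g + b_g)                           # (T, D), values in (0, 1)
    return gate * eeg_feat + (1.0 - gate) * artifact_feat

# Toy usage: 128 time steps, 32-dimensional features per branch
rng = np.random.default_rng(0)
e = rng.standard_normal((128, 32))
a = rng.standard_normal((128, 32))
W = rng.standard_normal((64, 32)) * 0.1
b = np.zeros(32)
fused = gated_fusion(e, a, W, b)
```

Because the gate lies in (0, 1), each fused value is a convex combination of the two branch features at that position.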
2. Materials and Methods
2.1. Datasets
2.2. Methods
2.2.1. Pseudo Code
Algorithm 1: EEG Denoising
Input: Training datasets (x, y); number of adversarial training iterations T; number of training iterations for the denoising network K; mini-batch size M; early stopping criterion C.
1: Randomly initialize the generator parameters θ and the parameters of the three discriminators Φ1, Φ2, Φ3
2: for t ← 1 to T do
3:   for k ← 1 to K do
4:     Sample M instances {x(m), y(m)} from the training datasets  // G is the overall generator network; G1 generates clean EEG signals, G2 generates noise signals
5:     Compute the loss and gradients for discriminator network D1
6:     Update the parameters of discriminator network D1
7:     Compute the loss and gradients for discriminator network D2
8:     Update the parameters of discriminator network D2
9:     Compute the loss and gradients for discriminator network D3
10:    Update the parameters of discriminator network D3
11:    Compute the loss and gradients for generator G
12:    Update the parameters of generator network G
     end for
13:  Compute the loss of generator network G on the validation set
14:  If the validation loss increases more than C consecutive times, stop training; otherwise, continue training
   end for
Output: Generator network G(x; θ)
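The control flow of Algorithm 1 can be sketched as a plain training skeleton. The three callables stand in for the real loss/gradient steps, and the counting of "consecutive increases" is one reasonable reading of the early-stopping criterion:

```python
def train_gan(num_epochs_T, inner_steps_K, patience_C,
              update_discriminators, update_generator, validation_loss):
    """Skeleton of Algorithm 1: alternate discriminator and generator
    updates K times per epoch, then early-stop once the validation loss
    fails to improve more than C consecutive epochs. The callables are
    placeholders for the actual optimization steps.
    Returns the number of epochs actually run."""
    best_loss = float("inf")
    bad_epochs = 0
    for t in range(num_epochs_T):
        for k in range(inner_steps_K):
            update_discriminators()   # steps 5-10: D1, D2, D3 in turn
            update_generator()        # steps 11-12: generator G
        val = validation_loss()       # step 13: validation loss of G
        if val >= best_loss:          # step 14: early-stopping check
            bad_epochs += 1
            if bad_epochs > patience_C:
                return t + 1
        else:
            best_loss, bad_epochs = val, 0
    return num_epochs_T

# Dummy run: validation loss keeps increasing, so training stops early
losses = iter(range(100))
epochs_run = train_gan(50, 3, 2,
                       lambda: None, lambda: None, lambda: next(losses))
```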
2.2.2. DHCT-GAN Structure
- Preprocessing module: This module converts the one-dimensional signal into 32-dimensional features using two 1-D convolution layers and an average pooling layer, enhancing the network's information extraction capability.
- Encoding module: This module consists of five CNN-LGTB encoding blocks, each combining a parallel CNN and a Local–Global Transformer Block (LGTB), with feature dimensions starting at 64 and doubling per block (64, 128, 256, 512, and 1024). The outputs of the last encoding block are combined through a feature fusion layer and then passed through a convolutional layer and a batch normalization (BN) layer to produce the extracted features. The CNN block consists of two convolutional layers with a kernel size of 3, a batch normalization layer, and a Leaky Rectified Linear Unit (LReLU) layer. The LGTB comprises a local self-attention (LSA) module, a global self-attention (GSA) module, and a feedforward module. The LSA module divides the input into multiple parts (8 blocks in this study), computes attention within each part, and concatenates the results to form local attention features; this limited context lets the model capture local dependencies within the input sequence. The GSA module, in contrast, attends over the entire sequence, capturing long-range dependencies and overall contextual relationships. Integrating the two mechanisms yields a more comprehensive representation of both short- and long-range patterns in the data. The LSA output passes through a feedforward network into the GSA module, where global attention features are computed; the result is then further processed by a feedforward network, a convolutional layer, a batch normalization layer, and an LReLU layer.
- Decoding module: This module maps the intermediate features back to the original signal dimension using two convolutional layers and one fully connected layer, producing the denoised signal.
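The split-attend-concatenate behavior of the LSA module, versus full-sequence attention in the GSA module, can be illustrated in plain NumPy. For brevity the queries, keys, and values are taken as the input itself (projection matrices omitted), which is a simplification of a real attention layer:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    # Scaled dot-product attention with Q = K = V = x; learned projection
    # matrices are omitted to keep the mechanism itself visible.
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)
    return softmax(scores) @ x

def local_self_attention(x, n_blocks=8):
    # LSA: split the sequence into n_blocks parts, attend within each part
    # only, then concatenate the results (as described above).
    parts = np.split(x, n_blocks, axis=0)
    return np.concatenate([self_attention(p) for p in parts], axis=0)

rng = np.random.default_rng(1)
x = rng.standard_normal((512, 32))   # 512 time samples, 32-dim features
local_out = local_self_attention(x)  # short-range dependencies (LSA)
global_out = self_attention(x)       # long-range dependencies (GSA)
```

Each output position of `local_out` only attends within its own 64-sample block, while every position of `global_out` attends over all 512 samples.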
2.2.3. Loss Function
2.2.4. Model Training and Implementation
2.2.5. Evaluation Metrics
3. Results
3.1. Performance Evaluation
3.2. Ablation Study
4. Discussion
5. Conclusions
Author Contributions
Funding
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Glossary
References
| Artifact Type | Metric | DuoCL | GCTNet | EEGDiR | LSTM | SimpleCNN | DHCT-GAN |
|---|---|---|---|---|---|---|---|
| EMG | RRMSEt | 0.4407 | 0.4003 | 0.4218 | 0.5213 | 0.5457 | 0.3918 |
| EMG | RRMSEf | 0.3168 | 0.2942 | 0.3131 | 0.4270 | 0.3934 | 0.2837 |
| EMG | CC * | 0.8977 | 0.9166 | 0.9069 | 0.8628 | 0.8390 | 0.9201 |
| EMG | η | 77.40% | 81.58% | 79.43% | 69.71% | 64.44% | 82.35% |
| EMG | SSIM | 0.6600 | 0.7137 | 0.6528 | 0.5627 | 0.5091 | 0.7167 |
| EMG | MI | 0.8590 | 0.9898 | 0.8700 | 0.6792 | 0.5880 | 1.0031 |
| EOG | RRMSEt | 0.3985 | 0.3285 | 0.3533 | 0.4832 | 0.3743 | 0.2734 |
| EOG | RRMSEf | 0.2755 | 0.2341 | 0.2473 | 0.3804 | 0.2612 | 0.2024 |
| EOG | CC * | 0.9173 | 0.9448 | 0.9359 | 0.8872 | 0.9275 | 0.9624 |
| EOG | η | 81.95% | 87.96% | 86.02% | 75.40% | 84.18% | 91.80% |
| EOG | SSIM | 0.7524 | 0.8109 | 0.8027 | 0.6627 | 0.7444 | 0.8402 |
| EOG | MI | 1.0673 | 1.2804 | 1.2787 | 0.8375 | 1.0688 | 1.4290 |
| ECG | RRMSEt | 0.4158 | 0.3322 | 0.4228 | 0.5281 | 0.4246 | 0.3111 |
| ECG | RRMSEf | 0.2936 | 0.2446 | 0.3185 | 0.4198 | 0.3050 | 0.2285 |
| ECG | CC * | 0.9095 | 0.9437 | 0.9076 | 0.8606 | 0.9054 | 0.9504 |
| ECG | η | 79.87% | 87.49% | 79.46% | 70.00% | 78.97% | 88.97% |
| ECG | SSIM | 0.6960 | 0.7840 | 0.6771 | 0.5739 | 0.6767 | 0.7796 |
| ECG | MI | 0.9506 | 1.2157 | 0.9239 | 0.7038 | 0.9043 | 1.2309 |
| EOG + EMG (mixed) | RRMSEt | 0.4724 | 0.4025 | 0.3978 | 0.5280 | 0.5301 | 0.3975 |
| EOG + EMG (mixed) | RRMSEf | 0.3334 | 0.2946 | 0.2951 | 0.4251 | 0.3891 | 0.2904 |
| EOG + EMG (mixed) | CC * | 0.8814 | 0.9160 | 0.9184 | 0.8588 | 0.8480 | 0.9184 |
| EOG + EMG (mixed) | η | 73.94% | 81.53% | 82.06% | 68.96% | 66.59% | 82.07% |
| EOG + EMG (mixed) | SSIM | 0.6046 | 0.6899 | 0.6794 | 0.5334 | 0.4899 | 0.6996 |
| EOG + EMG (mixed) | MI | 0.7960 | 0.9903 | 0.9891 | 0.6810 | 0.6142 | 1.0159 |
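RRMSEt, RRMSEf, and CC in the table above follow standard definitions in the EEG-denoising literature; the sketch below uses those common formulas, which are assumptions where the paper does not spell them out (η, SSIM, and MI are omitted here):

```python
import numpy as np

def rrmse_t(denoised, clean):
    # Relative RMSE in the time domain: RMSE of the residual normalized by
    # the RMS of the ground-truth clean signal (lower is better).
    return np.sqrt(np.mean((denoised - clean) ** 2)) / np.sqrt(np.mean(clean ** 2))

def rrmse_f(denoised, clean):
    # Relative RMSE between magnitude spectra (frequency domain).
    sd = np.abs(np.fft.rfft(denoised))
    sc = np.abs(np.fft.rfft(clean))
    return np.sqrt(np.mean((sd - sc) ** 2)) / np.sqrt(np.mean(sc ** 2))

def cc(denoised, clean):
    # Pearson correlation coefficient between the two signals (higher is better).
    return np.corrcoef(denoised, clean)[0, 1]

# Toy signals: a 10 Hz "EEG" tone contaminated by 50 Hz interference
t = np.linspace(0.0, 1.0, 256, endpoint=False)
clean = np.sin(2 * np.pi * 10 * t)
noisy = clean + 0.3 * np.sin(2 * np.pi * 50 * t)
```

A perfect reconstruction gives RRMSEt = RRMSEf = 0 and CC = 1; any residual contamination pushes the RRMSE values above zero.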
| Artifact Type | Metric | Clean | Noise | No-Transformer | Self-Attention | DHCT-GAN |
|---|---|---|---|---|---|---|
| EMG | RRMSEt | 0.3958 | 0.8405 | 0.4022 | 0.4134 | 0.3918 |
| EMG | RRMSEf | 0.2904 | 0.6667 | 0.2986 | 0.3046 | 0.2837 |
| EMG | CC * | 0.9186 | 0.7225 | 0.9159 | 0.9109 | 0.9201 |
| EMG | η | 82.02% | 38.71% | 81.42% | 80.32% | 82.35% |
| EOG | RRMSEt | 0.3215 | 0.2785 | 0.2885 | 0.3033 | 0.2734 |
| EOG | RRMSEf | 0.2318 | 0.2009 | 0.2151 | 0.2232 | 0.2024 |
| EOG | CC * | 0.9474 | 0.9605 | 0.9587 | 0.9536 | 0.9624 |
| EOG | η | 88.51% | 91.37% | 90.99% | 89.88% | 91.80% |
| ECG | RRMSEt | 0.3356 | 0.3907 | 0.3169 | 0.3199 | 0.3111 |
| ECG | RRMSEf | 0.2452 | 0.2883 | 0.2307 | 0.2370 | 0.2285 |
| ECG | CC * | 0.9423 | 0.9243 | 0.9486 | 0.9477 | 0.9504 |
| ECG | η | 87.18% | 83.17% | 88.57% | 88.37% | 88.97% |
| EOG + EMG (mixed) | RRMSEt | 0.4000 | 0.8771 | 0.4003 | 0.4206 | 0.3975 |
| EOG + EMG (mixed) | RRMSEf | 0.2908 | 0.7647 | 0.2943 | 0.3115 | 0.2904 |
| EOG + EMG (mixed) | CC * | 0.9167 | 0.7159 | 0.9171 | 0.9086 | 0.9184 |
| EOG + EMG (mixed) | η | 81.70% | 37.61% | 81.79% | 79.93% | 82.07% |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Cai, Y.; Meng, Z.; Huang, D. DHCT-GAN: Improving EEG Signal Quality with a Dual-Branch Hybrid CNN–Transformer Network. Sensors 2025, 25, 231. https://doi.org/10.3390/s25010231