Article

MultiScaleSleepNet: A Hybrid CNN–BiLSTM–Transformer Architecture with Multi-Scale Feature Representation for Single-Channel EEG Sleep Stage Classification

1 China-UK Low Carbon College, Shanghai Jiaotong University, Shanghai 200240, China
2 Shanghai Changhai Hospital, Shanghai 200433, China
* Authors to whom correspondence should be addressed.
Sensors 2025, 25(20), 6328; https://doi.org/10.3390/s25206328
Submission received: 1 September 2025 / Revised: 20 September 2025 / Accepted: 28 September 2025 / Published: 13 October 2025
(This article belongs to the Special Issue AI on Biomedical Signal Sensing and Processing for Health Monitoring)

Abstract

Accurate automatic sleep stage classification from single-channel EEG remains challenging because it requires effective extraction of multiscale neurophysiological features and modeling of long-range temporal dependencies. This study addresses these limitations with an efficient, compact deep learning architecture tailored to wearable and edge-device applications. We propose MultiScaleSleepNet, a hybrid convolutional neural network–bidirectional long short-term memory–transformer (CNN–BiLSTM–transformer) architecture that extracts multiscale temporal and spectral features through parallel convolutional branches and then performs sequential modeling with a BiLSTM and transformer-based attention. The model achieved an accuracy, macro-averaged F1 score, and kappa coefficient of 88.6%, 0.833, and 0.84 on the Sleep-EDF dataset; 85.6%, 0.811, and 0.80 on the Sleep-EDF Expanded dataset; and 84.6%, 0.745, and 0.79 on the SHHS dataset. Ablation studies indicate that the attention mechanisms and spectral fusion consistently improve performance, with the largest gains for stages N1, N3, and rapid eye movement (REM). MultiScaleSleepNet delivers competitive performance across multiple benchmark datasets while remaining compact at 1.9 million parameters, suggesting robustness to variations in dataset size and class distribution. These results support the feasibility of real-time, accurate sleep staging from single-channel EEG using parameter-efficient deep models suitable for portable systems.
Keywords: automatic sleep stage classification; EEG; multi-head attention; temporal-spectral fusion; hybrid architecture
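
As a rough illustration of the pipeline described in the abstract, the minimal PyTorch sketch below assembles parallel multi-scale convolutional branches, a BiLSTM, and a transformer encoder for one 30 s single-channel EEG epoch. This is not the authors' implementation: the kernel widths, channel counts, 100 Hz sampling rate (3000-sample input), and five-class output are illustrative assumptions only.

    # Minimal sketch (not the authors' implementation) of a CNN-BiLSTM-transformer
    # stack for single-channel EEG sleep staging. Kernel sizes, channel counts,
    # and the 3000-sample (30 s @ 100 Hz) input length are assumptions.
    import torch
    import torch.nn as nn

    class MultiScaleSleepNetSketch(nn.Module):
        def __init__(self, n_classes=5):
            super().__init__()
            # Parallel convolutional branches with different kernel widths to
            # capture features at multiple temporal scales.
            def branch(kernel):
                return nn.Sequential(
                    nn.Conv1d(1, 32, kernel, stride=4, padding=kernel // 2),
                    nn.BatchNorm1d(32),
                    nn.ReLU(),
                    nn.MaxPool1d(4),
                )
            self.branches = nn.ModuleList([branch(k) for k in (11, 51, 201)])
            # BiLSTM models sequential dependencies over the fused feature map.
            self.bilstm = nn.LSTM(96, 64, batch_first=True, bidirectional=True)
            # A transformer encoder layer adds attention over the whole epoch.
            enc = nn.TransformerEncoderLayer(d_model=128, nhead=8, batch_first=True)
            self.transformer = nn.TransformerEncoder(enc, num_layers=1)
            self.head = nn.Linear(128, n_classes)

        def forward(self, x):            # x: (batch, 1, 3000), one 30 s EEG epoch
            feats = [b(x) for b in self.branches]           # each (B, 32, T)
            # All branches share the same output length here because stride and
            # pooling match; concatenate along the channel axis.
            h = torch.cat(feats, dim=1).transpose(1, 2)     # (B, T, 96)
            h, _ = self.bilstm(h)                           # (B, T, 128)
            h = self.transformer(h)                         # (B, T, 128)
            return self.head(h.mean(dim=1))                 # (B, n_classes)

    model = MultiScaleSleepNetSketch()
    logits = model(torch.randn(2, 1, 3000))  # two dummy epochs
    print(logits.shape)                      # torch.Size([2, 5])

The differing kernel widths in the parallel branches are one way to realize the multiscale idea: short kernels respond to fast transients while long kernels track slower rhythms, before the recurrent and attention stages model dependencies across the epoch.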

Share and Cite

MDPI and ACS Style

Liu, C.; Guan, Q.; Zhang, W.; Sun, L.; Wang, M.; Dong, X.; Xu, S. MultiScaleSleepNet: A Hybrid CNN–BiLSTM–Transformer Architecture with Multi-Scale Feature Representation for Single-Channel EEG Sleep Stage Classification. Sensors 2025, 25, 6328. https://doi.org/10.3390/s25206328

AMA Style

Liu C, Guan Q, Zhang W, Sun L, Wang M, Dong X, Xu S. MultiScaleSleepNet: A Hybrid CNN–BiLSTM–Transformer Architecture with Multi-Scale Feature Representation for Single-Channel EEG Sleep Stage Classification. Sensors. 2025; 25(20):6328. https://doi.org/10.3390/s25206328

Chicago/Turabian Style

Liu, Cenyu, Qinglin Guan, Wei Zhang, Liyang Sun, Mengyi Wang, Xue Dong, and Shuogui Xu. 2025. "MultiScaleSleepNet: A Hybrid CNN–BiLSTM–Transformer Architecture with Multi-Scale Feature Representation for Single-Channel EEG Sleep Stage Classification" Sensors 25, no. 20: 6328. https://doi.org/10.3390/s25206328

APA Style

Liu, C., Guan, Q., Zhang, W., Sun, L., Wang, M., Dong, X., & Xu, S. (2025). MultiScaleSleepNet: A Hybrid CNN–BiLSTM–Transformer Architecture with Multi-Scale Feature Representation for Single-Channel EEG Sleep Stage Classification. Sensors, 25(20), 6328. https://doi.org/10.3390/s25206328

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
