Article

Dynamic-Attentive Selective Mamba with Group-Aware Convolution for Wearable Sensor-Based Sports and Daily Activity Recognition

Zhuojian Li and Wenhao Kang
1 School of Physical Education, Xiangnan University, Chenzhou 423000, China
2 Department of Industrial and Systems Engineering, The Hong Kong Polytechnic University, Hong Kong 999077, China
* Author to whom correspondence should be addressed.
Sensors 2026, 26(10), 3165; https://doi.org/10.3390/s26103165
Submission received: 16 April 2026 / Revised: 10 May 2026 / Accepted: 13 May 2026 / Published: 16 May 2026
(This article belongs to the Section Wearables)

Abstract

Wearable inertial sensors produce multi-axis motion signals with rich spatial and temporal structure. Existing deep-learning pipelines for human activity recognition (HAR) rarely tackle three issues jointly: explicit modeling of the body-part grouping of multi-location inertial channels, bidirectional temporal modeling at linear-time cost, and dynamic, time-varying attention for non-stationary motion. We aim to close these three gaps within a single architecture. To this end, we propose Dynamic-Attentive Selective Mamba (DASM), which combines three components: Group-Aware Convolutions (GroupConv) for body-part-aware local features, a Bidirectional Mamba (BiMamba) module for linear-time forward and backward temporal context, and a Dynamic CBAM (DCBAM) that produces per-timestep channel and spatial attention for non-stationary windows. On the UCI Daily and Sports Activities dataset (19 classes, 8 subjects), under stratified segment-level 5-fold cross-validation (3 seeds, 15 runs/model), DASM reaches 99.89% accuracy and F1, a 0.11% gain over CNN-BiGRU-CBAM and 0.50% over Multi-STMT; under leave-one-subject-out (LOSO) evaluation, it reaches 89.34%, 1.69% above the strongest baseline. The 10.55% drop under LOSO shows that segment-level results overestimate cross-subject generalization. Ablations show small but statistically detectable gains (Cohen’s d ∈ [0.4, 0.7] per module, d > 1.5 full-vs-baseline). We therefore position the contribution as a structured architecture evaluated on a near-saturated benchmark; broader deployment claims require multi-dataset subject-independent validation.
Keywords: human activity recognition; state space model; Mamba; attention mechanism; wearable sensors; deep learning; inertial measurement unit
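The abstract names the three DASM components but the full text is not yet available, so the following PyTorch sketch shows only one plausible composition of GroupConv, BiMamba, and DCBAM. Everything in it is an assumption for illustration: the 45-channel / 5-unit layout of the UCI Daily and Sports Activities recordings (9 channels per body-part IMU), all layer sizes, the simplified input-gated linear recurrence standing in for the selective-SSM (Mamba) block, and the per-timestep gates standing in for DCBAM. It is not the authors' implementation.

```python
import torch
import torch.nn as nn


class BiGatedSSM(nn.Module):
    """Bidirectional linear-time recurrence with input-dependent gates --
    a hand-rolled, simplified stand-in for a selective SSM (Mamba) block."""

    def __init__(self, dim):
        super().__init__()
        self.gate = nn.Linear(dim, dim)  # input-dependent decay a_t
        self.inp = nn.Linear(dim, dim)   # input projection u_t

    def scan(self, x):                   # x: (B, T, C)
        a = torch.sigmoid(self.gate(x))  # per-timestep decay in (0, 1)
        u = self.inp(x)
        h, out = torch.zeros_like(x[:, 0]), []
        for t in range(x.size(1)):       # h_t = a_t * h_{t-1} + (1 - a_t) * u_t
            h = a[:, t] * h + (1 - a[:, t]) * u[:, t]
            out.append(h)
        return torch.stack(out, dim=1)

    def forward(self, x):                # forward pass + time-reversed pass
        return self.scan(x) + self.scan(x.flip(1)).flip(1)


class DynamicCBAM(nn.Module):
    """Per-timestep channel gate plus a temporal gate -- one plausible reading
    of a 'dynamic' CBAM, with attention recomputed at every timestep."""

    def __init__(self, dim):
        super().__init__()
        self.channel = nn.Sequential(
            nn.Linear(dim, dim // 4), nn.ReLU(),
            nn.Linear(dim // 4, dim), nn.Sigmoid())
        self.temporal = nn.Conv1d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):                # x: (B, T, C)
        x = x * self.channel(x)          # channel attention per timestep
        stats = torch.stack([x.mean(-1), x.amax(-1)], dim=1)  # (B, 2, T)
        return x * torch.sigmoid(self.temporal(stats)).transpose(1, 2)


class DASMSketch(nn.Module):
    """GroupConv -> BiMamba-style recurrence -> dynamic attention -> classifier."""

    def __init__(self, in_ch=45, groups=5, dim=60, num_classes=19):
        super().__init__()
        # groups=5 keeps each body-part unit's 9 channels in its own conv group
        self.groupconv = nn.Conv1d(in_ch, dim, kernel_size=5,
                                   padding=2, groups=groups)
        self.bimamba = BiGatedSSM(dim)
        self.dcbam = DynamicCBAM(dim)
        self.head = nn.Linear(dim, num_classes)

    def forward(self, x):                # x: (B, 45, T) raw multi-IMU window
        z = self.groupconv(x).transpose(1, 2)  # (B, T, dim)
        z = self.dcbam(self.bimamba(z))
        return self.head(z.mean(dim=1))  # temporal average pool + linear head


# e.g., logits = DASMSketch()(torch.randn(4, 45, 125))  # 4 windows -> (4, 19)
```

The `groups` argument of `nn.Conv1d` is the natural way to express body-part-aware convolution: with 5 groups over 45 input channels, each body-mounted unit's 9 channels are convolved in isolation, so early features never mix across body locations.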


