A Multi-Sensor Dataset for Human Activity Recognition Using Inertial and Orientation Data
Abstract
1. Summary
2. Data Description
2.1. Data Structure and Format
- Chest (sternum)
- Left hand
- Right hand
- Left knee
- Right knee
- Orientation (Quaternions)
- Linear acceleration (Accelerometer)
- Angular velocity (Gyroscope)
- Activity label
2.2. Sensor Data Columns
- Orientation: represented by quaternions (q_w, q_x, q_y, q_z)
- Linear acceleration: 3-axis accelerometer data (a_x, a_y, a_z)
- Angular velocity: 3-axis gyroscope data (g_x, g_y, g_z)
- chest: Sensor placed on the sternum
- left_hand: Sensor placed on the left hand
- right_hand: Sensor placed on the right hand
- left_knee: Sensor placed above the left knee
- right_knee: Sensor placed above the right knee
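Because orientation is delivered as unit quaternions, a simple sanity check when loading the data is to verify that each (q_w, q_x, q_y, q_z) sample has a norm close to 1. A minimal sketch (function names and the tolerance are illustrative, not part of the dataset tooling):

```python
import math

def quaternion_norm(q_w, q_x, q_y, q_z):
    """Return the Euclidean norm of a quaternion sample."""
    return math.sqrt(q_w**2 + q_x**2 + q_y**2 + q_z**2)

def is_unit_quaternion(q, tol=1e-3):
    """Check that a (q_w, q_x, q_y, q_z) tuple is normalized within tol."""
    return abs(quaternion_norm(*q) - 1.0) < tol

# The identity orientation should pass the check.
assert is_unit_quaternion((1.0, 0.0, 0.0, 0.0))
```

Samples that fail this check may indicate transmission errors and are candidates for interpolation or removal during cleaning.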
2.3. Labels and Metadata
- subject_id: Unique numeric identifier assigned to each participant (e.g., 1 to 67).
- gender: Self-reported gender of the participant (male, female, prefer not to say, etc.).
- age: Participant’s age in years at the time of the recording session.
- height (m): Self-reported or measured height of the participant, expressed in meters.
- weight (kg): Self-reported or measured weight of the participant, expressed in kilograms.
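The metadata fields above can be parsed into typed records with the standard library. A minimal sketch, assuming the CSV headers match the field names listed here (the exact header strings in participants_info.csv may differ; an inline sample stands in for the real file):

```python
import csv
import io

def load_participants(csv_file):
    """Parse participant metadata rows into typed dictionaries."""
    reader = csv.DictReader(csv_file)
    participants = []
    for row in reader:
        participants.append({
            "subject_id": int(row["subject_id"]),
            "gender": row["gender"],
            "age": int(row["age"]),
            "height_m": float(row["height (m)"]),
            "weight_kg": float(row["weight (kg)"]),
        })
    return participants

# Inline sample standing in for participants_info.csv.
sample = io.StringIO(
    "subject_id,gender,age,height (m),weight (kg)\n"
    "1,female,25,1.65,58.0\n"
)
people = load_participants(sample)
```

Casting age, height, and weight to numeric types up front avoids string-comparison bugs when stratifying subjects by demographics.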
2.4. Sampling Rate
3. Methods
3.1. Data Collection Protocol
3.2. Recording Environment
- A treadmill for the walking activity, allowing continuous and safe movement at a constant pace.
- A stationary exercise bike designated for the riding-a-bike (cycling) activity.

- A chair and table setup for the sitting and folding clothes tasks, where participants sat and performed repetitive folding motions with garments.
- A set of medium-sized cardboard boxes placed beside the table, used for the box-moving activity.
- A sweeping area, defined as a square region in the center of the room, where participants were instructed to perform sweeping motions using a standard broom in a continuous pattern.
- A host station located in front of the participant, where a laptop running the data collection and synchronization software was monitored by the experiment supervisor.
3.3. Sensors
- 3-axis accelerometer (±16 g), used to measure linear acceleration.
- 3-axis gyroscope (±2000°/s), used to measure angular velocity.
- Bosch BMM150 magnetometer (used internally for sensor fusion).
- Sensor fusion module based on the Bosch BNO055 or BMI160, providing computed quaternion-based orientation data.
- BLE interface for real-time streaming at up to 100 Hz.
3.4. Synchronization and Validation
3.5. Data Quality and Cleaning
3.6. Ethical Approval
4. Usage Notes
4.1. Data Access
- A metadata file (participants_info.csv) containing demographic and biometric information for all 67 participants.
- A README file describing the dataset structure, contents, and file naming conventions.
- A Jupyter notebook providing example preprocessing routines for loading, cleaning, and segmenting the data into windows for model training.
4.2. Data Preprocessing and Segmentation Guidelines
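Since the data are uniformly sampled at 50 Hz, a common preprocessing step is to segment each recording into fixed-length sliding windows before model training. A minimal sketch (the 2 s window and 50% overlap are illustrative choices, not dataset requirements):

```python
def sliding_windows(signal, window_size=100, step=50):
    """Segment a uniformly sampled signal into fixed-length windows.

    At 50 Hz, window_size=100 samples is 2 s; step=50 gives 50% overlap.
    Only full windows are returned; the trailing remainder is dropped.
    """
    return [
        signal[start:start + window_size]
        for start in range(0, len(signal) - window_size + 1, step)
    ]

# 300 samples (6 s at 50 Hz) -> windows starting at 0, 50, 100, 150, 200.
windows = sliding_windows(list(range(300)))
```

When windowing, split by subject before segmenting so that overlapping windows from the same recording never leak across train and test sets.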
4.3. Limitations
- Fixed Sensor Placement: Sensors were placed on five specific body locations (chest, hands, knees) using elastic bands or straps. Variations in placement tightness, orientation, or slight shifts during activity execution may introduce variability. However, these conditions were not varied systematically, and sensor misplacement or loose fitting was not explicitly modeled.
- Single-Session Recordings per Participant: Each subject performed the activity protocol once, under supervision. Therefore, the dataset does not capture intra-subject variability across multiple days, levels of fatigue, or environmental contexts.
- No Timestamp Column: Although all sensor data are temporally synchronized and uniformly sampled at 50 Hz, the dataset does not include absolute timestamps. This design choice simplifies data structure and is consistent with many HAR datasets. However, it introduces limitations for certain applications that require alignment with external modalities or for performing evaluations that depend on real-time system behavior and latency estimation. Without a global reference clock, precise fusion with other sensor types or time-dependent systems must rely on additional synchronization mechanisms (e.g., simultaneous start triggers or external time markers). We acknowledge this constraint, and in future versions of the dataset, we plan to include optional absolute timestamp fields to facilitate precise alignment with external modalities and improve interoperability in multimodal and real-time experimental setups.
- Lack of Transitional or Dynamic Activities: The dataset focuses on a well-structured set of common daily activities, but it does not include dynamic or transitional movements. These types of actions are relevant in real-world applications and typically involve more complex motion patterns and inter-class transitions. While this omission was intentional to maintain protocol consistency and recording simplicity, we acknowledge it as a limitation and plan to incorporate such activities in future expansions of the dataset to enhance model generalizability and robustness.
- Class Imbalance Potential: Some activities, such as sitting, have longer duration segments than other activities. While the labeling protocol was designed for balance, users should assess class distribution when training models.
- Sensor-Specific Modalities: Not all sensors provide the same type of data. For example, quaternion orientation is only available from the chest, left hand, and right knee. This design reflects practical constraints and avoids redundancy but may require special handling in models that expect uniform feature vectors across sensors.
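One way to cope with the non-uniform modalities noted above is to map every sensor onto a fixed channel layout and zero-fill the modalities it lacks. A minimal sketch (the per-sensor availability follows the dataset's modality table; function and key names are illustrative):

```python
# Modalities each sensor provides, per the dataset's modality table.
AVAILABLE = {
    "chest":      {"quat", "acc", "gyro"},
    "left_hand":  {"quat"},
    "right_hand": {"acc", "gyro"},
    "left_knee":  {"acc", "gyro"},
    "right_knee": {"quat"},
}
CHANNELS = {"quat": 4, "acc": 3, "gyro": 3}  # 4D orientation, 3D acc/gyro

def uniform_vector(sensor, sample):
    """Return a fixed 10-dim vector, zero-filling modalities the sensor lacks.

    `sample` maps a modality name to its list of channel values.
    """
    vec = []
    for modality in ("quat", "acc", "gyro"):
        if modality in AVAILABLE[sensor]:
            vec.extend(sample[modality])
        else:
            vec.extend([0.0] * CHANNELS[modality])
    return vec

# left_hand contributes orientation only; acc and gyro slots stay zero.
v = uniform_vector("left_hand", {"quat": [1.0, 0.0, 0.0, 0.0]})
```

Zero-filling keeps feature vectors the same length across sensors; alternatives such as per-sensor sub-networks or masking may suit attention-based fusion models better.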
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Conflicts of Interest
Abbreviations
Abbreviation | Meaning |
---|---|
HAR | Human Activity Recognition |
CSV | Comma-Separated Values |
CNN | Convolutional Neural Network |
LSTM | Long Short-Term Memory |
BLE | Bluetooth Low Energy |
UTF | Unicode Transformation Format |
References
- Verleysen, M. (Ed.) Proceedings of the ESANN 2013: 21st European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, Bruges, Belgium, 24–26 April 2013; Ciaco—i6doc.com, 2013.
- Kwapisz, J.R.; Weiss, G.M.; Moore, S.A. Activity Recognition Using Cell Phone Accelerometers. SIGKDD Explor. Newsl. 2011, 12, 74–82.
- Reiss, A.; Stricker, D. Introducing a new benchmarked dataset for activity monitoring. In Proceedings of the International Symposium on Wearable Computers, ISWC, Newcastle, UK, 18–22 June 2012; pp. 108–109.
- Kim, Y.; Toomajian, B. Hand Gesture Recognition Using Micro-Doppler Signatures with Convolutional Neural Network. IEEE Access 2016, 4, 7125–7130.
- Roggen, D.; Calatroni, A.; Rossi, M.; Holleczek, T.; Förster, K.; Tröster, G.; Lukowicz, P.; Bannach, D.; Pirkl, G.; Ferscha, A.; et al. Collecting Complex Activity Datasets in Highly Rich Networked Sensor Environments. In Proceedings of the 7th International Conference on Networked Sensing Systems (INSS 2010), Kassel, Germany, 15–18 June 2010; IEEE: Piscataway, NJ, USA, 2010; pp. 233–240.
- Zappi, P.; Stiefmeier, T.; Farella, E.; Roggen, D.; Benini, L.; Tröster, G. Activity Recognition from on-Body Sensors by Classifier Fusion: Sensor Scalability and Robustness. In Proceedings of the 3rd International Conference on Intelligent Sensors, Sensor Networks and Information (ISSNIP 2007), Melbourne, VIC, Australia, 3–6 December 2007; IEEE: Piscataway, NJ, USA, 2007; pp. 281–286.
- Reyes-Ortiz, J.-L.; Oneto, L.; Samà, A.; Parra, X.; Anguita, D. Transition-Aware Human Activity Recognition Using Smartphones. Neurocomputing 2016, 171, 754–767.
- Tao, W.; Chen, H.; Moniruzzaman, M.; Leu, M.C.; Yi, Z.; Qin, R. Attention-Based Sensor Fusion for Human Activity Recognition Using IMU Signals. arXiv 2021, arXiv:2112.11224.
- Xaviar, S.; Yang, X.; Ardakanian, O. Robust Multimodal Fusion for Human Activity Recognition. arXiv 2023, arXiv:2303.04636.
- Chung, S.; Lim, J.; Noh, K.J.; Kim, G.; Jeong, H. Sensor data acquisition and multimodal sensor fusion for human activity recognition using deep learning. Sensors 2019, 19, 1716.
- Huang, X.; Yuan, Y.; Chang, C.; Gao, Y.; Zheng, C.; Yan, L. Human Activity Recognition Method Based on Edge Computing-Assisted and GRU Deep Learning Network. Appl. Sci. 2023, 13, 9059.
- Teerapittayanon, S.; McDanel, B.; Kung, H.T. Distributed Deep Neural Networks over the Cloud, the Edge and End Devices. arXiv 2017, arXiv:1709.01921.
- MbientLab Inc. MetaMotionRL Sensor Platform. 2024. Available online: https://mbientlab.com/metamotionrl/ (accessed on 8 July 2025).
- World Medical Association. World Medical Association Declaration of Helsinki: Ethical Principles for Medical Research Involving Human Subjects. 2013. Available online: https://www.wma.net/policies-post/wma-declaration-of-helsinki/ (accessed on 7 July 2025).
Dataset | Year | Type of Sensor | No. of Activities | No. of Participants |
---|---|---|---|---|
UCI HAR [1] | 2012 | Accelerometer, Gyroscope | 6 | 30 |
WISDM [2] | 2011 | Accelerometer | 6 | 26 |
PAMAP2 [3] | 2012 | Accelerometer, Gyroscope, Magnetometer | 18 | 9 |
KU-HAR [4] | 2017 | Accelerometer, Gyroscope, Magnetometer | 18 | 18 |
Opportunity [5] | 2010 | Body-worn and Ambient Sensors | 18 | 4 |
Skoda [6] | 2008 | Accelerometer | 10 | 1 |
REALDISP [7] | 2013 | Accelerometer, Gyroscope | 33 | 17 |
Proposed dataset | 2025 | Accelerometer, Gyroscope, Quaternion | 6 | 67 |
Sensor Location | Orientation (4D) | Linear Acceleration (3D) | Angular Velocity (3D) |
---|---|---|---|
Chest | q_w, q_x, q_y, q_z | a_x, a_y, a_z | g_x, g_y, g_z |
Left Hand | q_w, q_x, q_y, q_z | N/A | N/A |
Right Hand | N/A | a_x, a_y, a_z | g_x, g_y, g_z |
Left Knee | N/A | a_x, a_y, a_z | g_x, g_y, g_z |
Right Knee | q_w, q_x, q_y, q_z | N/A | N/A |
Activity | Label | Description |
---|---|---|
Sitting | 0 | The subject is seated in a stationary position. |
Sweeping | 1 | The subject is performing sweeping movements with a broom. |
Folding Clothes | 2 | The subject is performing repetitive folding motions using both arms. |
Walking | 3 | The subject is walking at a comfortable, natural pace. |
Moving Boxes | 4 | The subject is lifting and moving medium-sized boxes repeatedly. |
Riding a Bike | 5 | The subject is pedaling a stationary bicycle. |
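The numeric labels above map naturally to a lookup table, for example when decoding model predictions back to activity names. A minimal sketch (the dictionary and function names are illustrative; the label-to-activity pairing follows the table):

```python
# Label-to-activity mapping, as defined in the dataset's label table.
ACTIVITY_LABELS = {
    0: "Sitting",
    1: "Sweeping",
    2: "Folding Clothes",
    3: "Walking",
    4: "Moving Boxes",
    5: "Riding a Bike",
}

def decode(label):
    """Map a numeric activity label back to its human-readable name."""
    return ACTIVITY_LABELS[label]
```

Keeping this mapping in one place avoids silent label drift between training scripts and evaluation reports.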
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Rivas-Caicedo, J.L.; Saldaña-Aristizabal, L.; Niño-Tejada, K.; Patarroyo-Montenegro, J.F. A Multi-Sensor Dataset for Human Activity Recognition Using Inertial and Orientation Data. Data 2025, 10, 129. https://doi.org/10.3390/data10080129