Real-Time Sensor-Based Human Activity Recognition for eFitness and eHealth Platforms
Abstract
1. Introduction
- mobile applications for remote/at-home monitoring of elderly people or people with chronic diseases (e.g., COPD, long-COVID);
- telerehabilitation (physical, pulmonary, cardiac): feedback and adherence monitoring;
- promoting physical activity for elderly people;
- gamification in telerehabilitation and activity promotion for young people (smart games).
- HAR should be based on wearable IMU sensors (ideally one);
- it should ignore background activities (low rate of false-positive detection);
- it should be easy to tune for users with different levels of physical ability;
- full-body activities should be recognized;
- it should be easy to integrate our framework with mobile applications (iOS, Android) and provide real-time feedback;
- it should be easy to add new exercise/movement patterns with a small number of examples;
- end users are not expected to be IT specialists, so no manual feature engineering should be required if a new exercise is added.
- we use a single sensor placed on the chest, while in the cited papers the sensor is placed on the wrist, or multiple sensors are used;
- our approach works in real time, while repetition counting in the cited papers is performed offline, has high latency, and relies on peak detection over the whole series; to the best of our knowledge, ours is the first evaluation of real-time repetition counting with a chest-placed IMU sensor; on the other hand, the real-time operation mode does not reach the quality reported in [2,11];
- we use one deep network model with encoder–detector architecture for all types of exercises; compared to [2], our solution is easier to extend to new exercises and is suitable for mobile devices.
- a deep neural network model for real-time exercise recognition and repetition counting based on signals from a chest-located IMU sensor;
- a method of false-positive error reduction based on contrastive learning;
- a publicly available dataset, AIDLAB-HAR, to encourage further research on this topic.
2. Materials and Methods
2.1. Data Collection
- differentiate between similar exercises (e.g., crunches vs. abdominal tenses, lunges vs. side lunges);
- full-body exercises (e.g., burpees, the standing-to-plank-downward-dog-to-plank sequence) and exercises used in tests (sit-to-stand).
- (i) the annotator provides 1–3 reference annotations of repetitions of a given exercise;
- (ii) the annotator selects informative signals (one or more);
- (iii) DTW [26] is calculated for each informative signal between the reference annotations and a window sliding over the data series of the given exercise;
- (iv) for each window, the distance is calculated as the median of the values obtained in step (iii);
- (v) for a given series, the threshold is calculated as a fixed fraction of the median of the window distances in this series;
- (vi) for a given series, all distance minima below the threshold are selected as repetitions (a minimal code sketch of this procedure follows the list).
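Below is a minimal sketch of this annotation procedure, under stated assumptions: it uses a plain quadratic-time DTW for clarity in place of the faster approximation of [26], and the threshold fraction (0.8), the window step, and all function names are illustrative placeholders rather than the values used by the annotators.

```python
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Plain O(len(a)*len(b)) dynamic time warping distance between two 1-D signals."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def find_repetitions(signals, references, window_len, step=1, threshold_fraction=0.8):
    """
    signals:    list of 1-D arrays (the informative signals, all the same length).
    references: references[k] holds the 1-3 annotated reference repetitions for signal k.
    Returns window start indices selected as repetitions.
    """
    n = len(signals[0])
    starts = list(range(0, n - window_len + 1, step))
    # steps (iii)-(iv): per-window distance = median of DTW distances
    # over all (informative signal, reference annotation) pairs
    dist = np.array([
        np.median([dtw_distance(sig[s:s + window_len], ref)
                   for sig, refs in zip(signals, references) for ref in refs])
        for s in starts
    ])
    # step (v): series-level threshold as a fixed fraction of the median window distance
    threshold = threshold_fraction * np.median(dist)
    # step (vi): local minima of the distance profile below the threshold are repetitions
    # (simplified minima test; flat regions may need extra handling in practice)
    return [s for i, s in enumerate(starts)
            if dist[i] < threshold
            and (i == 0 or dist[i] <= dist[i - 1])
            and (i == len(dist) - 1 or dist[i] < dist[i + 1])]
```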
- the raw signal collected from the device consists of recorder-frame acceleration along the X, Y, and Z axes and the rotation quaternion; data were collected at 50 Hz;
- for further processing, we take the recorder-frame acceleration and calculate the linear acceleration along the Z axis (of the earth frame), adding the pitch and yaw rotations;
- signals are filtered with a low-pass filter with a cut-off frequency of 10 Hz;
- the filtered signals are arranged into windows of 2.8 s, sliding in 0.1 s steps (see Figure 2);
- the data in each window are given as input to the deep neural network (a preprocessing sketch is given after this list).
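A sketch of this preprocessing chain, assuming NumPy/SciPy, is shown below; the filter order (4) and the array layout are our assumptions, while the 50 Hz sampling rate, the 10 Hz cut-off, and the 2.8 s / 0.1 s windowing come from the description above.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 50               # sampling rate (Hz), as in the data collection setup
CUTOFF = 10           # low-pass cut-off frequency (Hz)
WIN = int(2.8 * FS)   # 2.8 s window -> 140 samples
STEP = int(0.1 * FS)  # 0.1 s stride -> 5 samples

def low_pass(x: np.ndarray, order: int = 4) -> np.ndarray:
    """Zero-phase low-pass filtering of each channel (rows = samples, cols = channels)."""
    b, a = butter(order, CUTOFF, btype="low", fs=FS)
    return filtfilt(b, a, x, axis=0)

def sliding_windows(x: np.ndarray) -> np.ndarray:
    """Arrange a (n_samples, n_channels) series into (n_windows, WIN, n_channels) windows."""
    starts = np.arange(0, len(x) - WIN + 1, STEP)
    return np.stack([x[s:s + WIN] for s in starts])

# usage: filtered = low_pass(signals); batch = sliding_windows(filtered)
# each window in `batch` is one input example for the deep neural network
```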
2.2. Detector
2.3. Training
- Training the encoder with contrastive learning and Euclidean distance, then fixing the encoder parameters and training the classification head with a binary cross-entropy loss. We train one encoder for all exercises and use a dedicated classification head for each type of exercise.
- End-to-end training, where the encoder and the classification head work as a single model and are trained together with a binary cross-entropy loss. In this case, we train a dedicated model for each exercise. A sketch of the first variant is given after this list.
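The following is a minimal PyTorch sketch of the two-stage variant, not the published implementation: the encoder architecture, embedding size, contrastive margin, pair construction, and optimizer settings are our assumptions, since the description above fixes only the losses and the shared-encoder/per-exercise-head split.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Placeholder 1-D convolutional encoder; the real architecture is not reproduced here."""
    def __init__(self, channels: int, emb_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(64, emb_dim),
        )

    def forward(self, x):          # x: (batch, channels, window_len)
        return self.net(x)

def contrastive_loss(z1, z2, same: torch.Tensor, margin: float = 1.0):
    """Contrastive loss on the Euclidean distance between embedding pairs.
    same = 1 for window pairs of the same exercise, 0 otherwise (stage 1 loss)."""
    d = torch.norm(z1 - z2, dim=1)
    return (same * d.pow(2) + (1 - same) * (margin - d).clamp(min=0).pow(2)).mean()

# Stage 1: train the shared encoder on window pairs with contrastive_loss.
# Stage 2: fix the encoder and train one binary classification head per exercise.
encoder = Encoder(channels=3)
for p in encoder.parameters():
    p.requires_grad = False        # encoder parameters are fixed after stage 1

head = nn.Linear(64, 1)            # dedicated head for one exercise type
bce = nn.BCEWithLogitsLoss()
opt = torch.optim.Adam(head.parameters(), lr=1e-3)

def train_head_step(x, y):         # x: windows, y: 0/1 labels for this exercise
    logits = head(encoder(x)).squeeze(1)
    loss = bce(logits, y.float())
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()
```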
3. Results
3.1. Dataset
3.2. Evaluation
3.3. AIDLAB-HAR Dataset
4. Discussion
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
1. Prabhu, G.; O’Connor, N.E.; Moran, K. Recognition and Repetition Counting for Local Muscular Endurance Exercises in Exercise-Based Rehabilitation: A Comparative Study Using Artificial Intelligence Models. Sensors 2020, 20, 4791.
2. Soro, A.; Brunner, G.; Tanner, S.; Wattenhofer, R. Recognition and Repetition Counting for Complex Physical Exercises with Deep Learning. Sensors 2019, 19, 714.
3. Nweke, H.F.; Teh, Y.W.; Al-Garadi, M.A.; Alo, U.R. Deep learning algorithms for human activity recognition using mobile and wearable sensor networks: State of the art and research challenges. Expert Syst. Appl. 2018, 105, 233–261.
4. Jobanputra, C.; Bavishi, J.; Doshi, N. Human Activity Recognition: A Survey. Procedia Comput. Sci. 2019, 155, 698–703.
5. Bouchabou, D.; Nguyen, S.M.; Lohr, C.; LeDuc, B.; Kanellos, I. A Survey of Human Activity Recognition in Smart Homes Based on IoT Sensors Algorithms: Taxonomies, Challenges, and Opportunities with Deep Learning. Sensors 2021, 21, 6037.
6. Fu, B.; Damer, N.; Kirchbuchner, F.; Kuijper, A. Sensing Technology for Human Activity Recognition: A Comprehensive Survey. IEEE Access 2020, 8, 83791–83820.
7. Bisio, I.; Garibotto, C.; Lavagetto, F.; Sciarrone, A. When eHealth Meets IoT: A Smart Wireless System for Post-Stroke Home Rehabilitation. IEEE Wirel. Commun. 2019, 26, 24–29.
8. Prabhu, G.; O’Connor, N.E.; Moran, K. A Deep Learning Model for Exercise-Based Rehabilitation Using Multi-channel Time-Series Data from a Single Wearable Sensor. In Proceedings of the Wireless Mobile Communication and Healthcare, Virtual Event, 13–14 November 2021; pp. 104–115.
9. Um, T.T.; Pfister, F.M.J.; Pichler, D.; Endo, S.; Lang, M.; Hirche, S.; Fietzek, U.; Kulić, D. Data Augmentation of Wearable Sensor Data for Parkinson’s Disease Monitoring Using Convolutional Neural Networks. In Proceedings of the 19th ACM International Conference on Multimodal Interaction (ICMI ’17), Glasgow, UK, 13–17 November 2017; pp. 216–220.
10. Van Lummel, R.C.; Walgaard, S.; Maier, A.B.; Ainsworth, E.; Beek, P.J.; van Dieën, J.H. The instrumented sit-to-stand test (iSTS) has greater clinical relevance than the manually recorded sit-to-stand test in older adults. PLoS ONE 2016, 11, e0157968.
11. Morris, D.; Saponas, T.S.; Guillory, A.; Kelner, I. RecoFit: Using a Wearable Sensor to Find, Recognize, and Count Repetitive Exercises. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Toronto, ON, Canada, 26 April–1 May 2014; pp. 3225–3234.
12. Ahmadi, A.; Mitchell, E.; Destelle, F.; Gowing, M.; O’Connor, N.E.; Richter, C.; Moran, K. Automatic Activity Classification and Movement Assessment During a Sports Training Session Using Wearable Inertial Sensors. In Proceedings of the 2014 11th International Conference on Wearable and Implantable Body Sensor Networks, Zurich, Switzerland, 16–19 June 2014; pp. 98–103.
13. Kondo, Y.; Ishii, S.; Aoyagi, H.; Hossain, T.; Yokokubo, A.; Lopez, G. FootbSense: Soccer Moves Identification Using a Single IMU. In Sensor- and Video-Based Activity and Behavior Computing; Springer: Singapore, 2022; pp. 115–131.
14. Almeida, A.; Alves, A. Activity recognition for movement-based interaction in mobile games. In Proceedings of the 19th International Conference on Human–Computer Interaction with Mobile Devices and Services, Vienna, Austria, 4–7 September 2017; pp. 1–8.
15. Alazba, A.; Al-Khalifa, H.; AlSobayel, H. RabbitRun: An Immersive Virtual Reality Game for Promoting Physical Activities Among People with Low Back Pain. Technologies 2019, 7, 2.
16. Yin, Z.X.; Xu, H.M. A wearable rehabilitation game controller using IMU sensor. In Proceedings of the 2018 IEEE International Conference on Applied System Invention (ICASI), Chiba, Japan, 13–17 April 2018; pp. 1060–1062.
17. O’Reilly, M.; Whelan, D.; Chanialidis, C.; Friel, N.; Delahunt, E.; Ward, T.; Caulfield, B. Evaluating squat performance with a single inertial measurement unit. In Proceedings of the 2015 IEEE 12th International Conference on Wearable and Implantable Body Sensor Networks (BSN), Cambridge, MA, USA, 9–12 June 2015; pp. 1–6.
18. Whelan, D.; O’Reilly, M.; Huang, B.; Giggins, O.; Kechadi, T.; Caulfield, B. Leveraging IMU data for accurate exercise performance classification and musculoskeletal injury risk screening. In Proceedings of the 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, FL, USA, 16–20 August 2016; pp. 659–662.
19. Rikli, R.E.; Jones, C.J. Senior Fitness Test Manual; Human Kinetics: Champaign, IL, USA, 2013.
20. Romaszko-Wojtowicz, A.; Maksymowicz, S.; Jarynowski, A.; Jaskiewicz, L.; Czekaj, L.; Doboszynska, A. Telemonitoring in Long-COVID Patients: Preliminary Findings. Int. J. Environ. Res. Public Health 2022, 19, 5268.
21. Czekaj, L.; Domaszewicz, J.; Radzinski, L.; Jarynowski, A.; Kitlowski, R.; Doboszynska, A. Validation and usability of AIDMED-telemedical system for cardiological and pulmonary diseases. E-Methodology 2020, 7, 125–139.
22. Ismail Fawaz, H.; Forestier, G.; Weber, J.; Idoumghar, L.; Muller, P.A. Deep learning for time series classification: A review. Data Min. Knowl. Discov. 2019, 33, 917–963.
23. Casale, P.; Pujol, O.; Radeva, P. Human activity recognition from accelerometer data using a wearable device. In Proceedings of the Pattern Recognition and Image Analysis: 5th Iberian Conference, IbPRIA 2011, Las Palmas de Gran Canaria, Spain, 8–10 June 2011; Springer: Berlin/Heidelberg, Germany, 2011; pp. 289–296.
24. Seeger, C.; Buchmann, A.; Van Laerhoven, K. myHealthAssistant: A phone-based body sensor network that captures the wearer’s exercises throughout the day. In Proceedings of the 6th International ICST Conference on Body Area Networks, Beijing, China, 7–10 November 2011.
25. Czekaj, L.; Ziembla, W.; Jezierski, P.; Swiniarski, P.; Kolodziejak, A.; Ogniewski, P.; Niedbalski, P.; Jezierska, A.; Wesierski, D. Labeler-hot Detection of EEG Epileptic Transients. In Proceedings of the 2019 27th European Signal Processing Conference (EUSIPCO), A Coruna, Spain, 2–6 September 2019; pp. 1–5.
26. Salvador, S.; Chan, P. Toward Accurate Dynamic Time Warping in Linear Time and Space. Intell. Data Anal. 2004, 11, 70–80.
27. Chen, T.; Guestrin, C. XGBoost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD ’16), San Francisco, CA, USA, 13–17 August 2016; pp. 785–794.
28. Kaya, M.; Bilge, H. Deep Metric Learning: A Survey. Symmetry 2019, 11, 1066.
29. Parnami, A.; Lee, M. Learning from few examples: A summary of approaches to few-shot learning. arXiv 2022, arXiv:2203.04291.
Title | Task | Data Source | Activities | Method | Quality |
---|---|---|---|---|---|
Recognition and repetition counting for complex physical exercises with deep learning [2] | exercise recognition and repetition counting | signals recorded simultaneously from 2 smartwatches | 10 complex full-body exercises typical in CrossFit (e.g., pull-ups, push-ups, burpees) | two separate models for exercise recognition and the start of repetition detection; deep CNN; overlapping 5 s data window; offline | recognition accuracy: 99.96%; repetition counting: ±1 repetition error in 91% of the tests
Recognition and repetition counting for local muscular endurance exercises in exercise-based rehabilitation: A comparative study using artificial intelligence models [1] | exercise recognition and repetition counting | single wrist-worn IMU sensor | 10 endurance-based exercises (e.g., biceps curls, squats, lunges) | recognition task: multi-class classification with a deep CNN based on AlexNet architecture; repetition counting: counts compact segments of detection; offline | recognition F1-score: 97.18%; repetition counting: ±1 repetition error in 90% of the tests |
Human activity recognition from accelerometer data using a wearable device [23] | activity recognition | single IMU sensor located on the chest | 5 activities: regular walking, climbing stairs, talking with a person, staying standing, working at the computer | activity recognition: random forest; 20 features computed for 1 s data windows | activity recognition accuracy: 94% |
myHealthAssistant: a phone-based body sensor network that captures the wearer’s exercises throughout the day [24] | exercise recognition and repetition counting | 3 accelerometers (on the hand, arm, and leg) | 13 exercises | exercise recognition: Bayesian classifier trained on the mean and variance of each accelerometer axis; repetition counting: peak counting on one of the accelerometer axes; offline | recognition accuracy: 92% (subject-specific model)
RecoFit: Using a wearable sensor to find, recognize, and count repetitive exercises [11] | segmenting exercise from intermittent non-exercise/rest periods; exercise recognition and repetition counting | accelerometer on the arm | 26 exercises | segmentation and recognition tasks: linear support vector machines, features from 5 s data window; repetition counting performed offline with peak counting | segmentation precision and recall: >95%; exercise recognition accuracy: 96–99%; repetition counting ±1 repetition in 93% of the tests |
In the table below, (enc) denotes two-stage training with a contrastively trained encoder and (e2e) denotes end-to-end training (Section 2.3); a sketch of how the reported metrics can be computed follows the table.
Exercise (Training) | F1 (%) | MAPE (%) | FPR (Events/s) |
---|---|---|---|
abd. tenses (enc) | 97 (1) | 2 (2) | 0.03 (0.01) |
abd. tenses (e2e) | 97 (1) | 0 (2) | 0.08 (0.02) |
dw.-dog (enc) | 58 (4) | 72 (23) | 0.10 (0.08) |
dw.-dog (e2e) | 64 (6) | 67 (30) | 0.16 (0.11) |
lying hip rises (enc) | 98 (1) | 1 (1) | 0.00 (0.01) |
lying hip rises (e2e) | 99 (1) | 0 (1) | 0.02 (0.01) |
side lunges (enc) | 98 (5) | 4 (5) | 0.00 (0.01) |
side lunges (e2e) | 88 (3) | 13 (6) | 0.02 (0.02) |
sit-to-stands (enc) | 92 (1) | 8 (3) | 0.02 (0.02) |
sit-to-stands (e2e) | 87 (2) | 21 (7) | 0.07 (0.01) |
bends (enc) | 86 (1) | 21 (2) | 0.03 (0.01) |
bends (e2e) | 68 (8) | 41 (11) | 0.13 (0.08) |
broad jumps (enc) | 99 (1) | 1 (1) | 0.02 (0.02) |
broad jumps (e2e) | 99 (1) | 0 (1) | 0.12 (0.03) |
burpees (enc) | 89 (2) | 5 (2) | 0.01 (0.01) |
burpees (e2e) | 87 (6) | 2 (4) | 0.23 (0.05) |
crunches (enc) | 92 (2) | 5 (3) | 0.04 (0.01) |
crunches (e2e) | 93 (1) | 5 (1) | 0.14 (0.02) |
lunges (enc) | 99 (3) | 1 (4) | 0.02 (0.02) |
lunges (e2e) | 99 (2) | 1 (3) | 0.06 (0.02) |
push-ups (enc) | 71 (6) | 36 (8) | 0.04 (0.04) |
push-ups (e2e) | 25 (4) | 81 (4) | 0.32 (0.16) |
squats (enc) | 88 (2) | 7 (5) | 0.04 (0.02) |
squats (e2e) | 77 (4) | 15 (10) | 0.11 (0.07) |
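For completeness, below is a minimal sketch of how the metrics in the table can be computed from matched detection events; the matching of detections to ground-truth repetitions (which yields the TP/FP/FN counts) is not reproduced here, and all function names are illustrative.

```python
import numpy as np

def f1_score(tp: int, fp: int, fn: int) -> float:
    """F1-score from true-positive, false-positive, and false-negative event counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

def mape(true_counts, pred_counts) -> float:
    """Mean absolute percentage error of repetition counts over test series
    (assumes non-zero true counts)."""
    t = np.asarray(true_counts, dtype=float)
    p = np.asarray(pred_counts, dtype=float)
    return 100.0 * np.mean(np.abs(p - t) / t)

def fpr_events_per_second(n_false_positives: int, total_duration_s: float) -> float:
    """False-positive rate expressed as spurious detection events per second."""
    return n_false_positives / total_duration_s
```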