# ExerTrack—Towards Smart Surfaces to Track Exercises


## Abstract


## 1. Introduction

- We propose a floor-based sensing system that uses capacitive proximity sensors to recognize eight strength-based sport activities.
- Given the limited amount of labeled training samples, we demonstrate that data augmentation methods can regularize the classification network against over-fitting.
- We further demonstrate the advantages of our proposed floor-based system over a single arm-worn accelerometer for the targeted set of whole-body activities.
- Owing to the simple network architecture and the sparse resolution of our system, it can run online on a Raspberry Pi 3.

## 2. Related Work

## 3. Hardware Setup

## 4. Software Implementation

#### 4.1. Data Preprocessing and Cleaning

#### 4.2. Data Augmentation

#### 4.3. Processing Pipeline

#### 4.4. Repetition Counting

- We first create templates for each exercise from the training data.
- We then slide each template over the test data and score matches with a similarity measure.
- We detect local maxima in the match score and count them as repetitions.
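The three steps above can be sketched with a normalized cross-correlation matcher. This is an illustrative reconstruction, not the paper's exact code; the function names are hypothetical, and the per-exercise thresholds would be tuned as in the evaluation tables:

```python
import numpy as np

def match_score(signal, template):
    """Slide the template over the signal and return a normalized
    cross-correlation score for every offset (1.0 = perfect match)."""
    t = (template - template.mean()) / (template.std() + 1e-9)
    n = len(template)
    scores = np.empty(len(signal) - n + 1)
    for i in range(len(scores)):
        w = signal[i:i + n]
        w = (w - w.mean()) / (w.std() + 1e-9)
        scores[i] = float(np.dot(w, t)) / n
    return scores

def count_repetitions(signal, template, threshold=0.6):
    """Count local maxima of the match score that exceed the threshold."""
    s = match_score(signal, template)
    reps = 0
    for i in range(1, len(s) - 1):
        if s[i] >= threshold and s[i] > s[i - 1] and s[i] >= s[i + 1]:
            reps += 1
    return reps
```

In practice one such pass is run per sensor channel or on a channel aggregate, with the threshold chosen per exercise.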

## 5. Experiments and Evaluation

#### 5.1. Evaluation on Classification Performance

#### 5.2. Evaluation on Repetition Counting

## 6. Comparison to Acceleration Data from Wearable

## 7. Limitation and Discussion

## 8. Conclusions and Outlook

## Author Contributions

## Funding

## Conflicts of Interest

## Abbreviations

| Abbreviation | Meaning |
|---|---|
| CNN | convolutional neural network |
| GNB | Gaussian naive Bayes |
| HMM | hidden Markov model |
| k-NN | k-nearest neighbors |
| LSTM | long short-term memory |
| PBEF | parallel branch early fusion network |
| PBLF | parallel branch late fusion network |
| PCA | principal component analysis |
| SVM | support vector machine |

## Appendix A

**Table A1.** The extracted features for every sensor channel, resulting in 120 features in total for a time-series segment.

| Feature | Parameters | Definition |
|---|---|---|
| Mean | – | $\overline{x}=\frac{1}{n}\sum x$ |
| Variance | – | $\sigma^{2}=\frac{1}{n}\sum (x-\overline{x})^{2}$ |
| Skewness | – | $G_{1}=\frac{n}{(n-1)(n-2)}\sum_{i=1}^{n}\left(\frac{x_{i}-\overline{x}}{s}\right)^{3}$ |
| Kurtosis | – | $G_{2}=\frac{n}{(n-1)(n-2)}\sum_{i=1}^{n}\left(\frac{x_{i}-\overline{x}}{s}\right)^{4}$ |
| Sample Entropy | – | $-\log\frac{A}{B}$ |
| Absolute Sum of Changes | – | $\sum_{i=0}^{n-1}\lvert x_{i+1}-x_{i}\rvert$ |
| Autocorrelation | lag $l\in\{20,40,60\}$ | $R(l)=\frac{1}{(n-l)\sigma^{2}}\sum_{t=1}^{n-l}(X_{t}-\mu)(X_{t+l}-\mu)$ |
| Number of Peaks | $s\in\{\frac{20\cdot\mathrm{ws}}{4},\frac{20\cdot\mathrm{ws}}{8}\}$ | – |
| FFT Coefficients | real, $k\in\{0\}$; absolute, $k\in\{0,5,15\}$; angle, $k\in\{15\}$ | $A_{k}=\sum_{m=0}^{n-1}a_{m}\exp\left(-2\pi i\frac{mk}{n}\right)$, $k=0,\dots,n-1$ |
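Several of the features in Table A1 can be reproduced in a few lines of numpy. The sketch below is illustrative (the function name and dictionary keys are our own); it covers the moment-based features, the absolute sum of changes, and the autocorrelation, using the same conventions as the table (population variance for $\sigma^2$, sample standard deviation $s$ inside skewness and kurtosis):

```python
import numpy as np

def extract_features(x, lags=(20, 40, 60)):
    """Compute a subset of the per-channel features listed in Table A1."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    mean = x.mean()
    var = x.var()          # population variance (1/n), as in the table
    s = x.std(ddof=1)      # sample std used by skewness and kurtosis
    z = (x - mean) / s
    feats = {
        "mean": mean,
        "variance": var,
        "skewness": n / ((n - 1) * (n - 2)) * np.sum(z ** 3),
        "kurtosis": n / ((n - 1) * (n - 2)) * np.sum(z ** 4),
        "abs_sum_changes": np.sum(np.abs(np.diff(x))),
    }
    for lag in lags:
        if lag < n:
            feats[f"autocorr_{lag}"] = np.sum(
                (x[:n - lag] - mean) * (x[lag:] - mean)
            ) / ((n - lag) * var)
    return feats
```

Applied to all eight sensor channels, such per-channel feature vectors are concatenated into the 120-dimensional representation described in the caption.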

**Figure A1.** Confusion matrices of the best-performing models on the test sets for the single-user evaluation.

**Figure A2.** Confusion matrices of the models on the test sets for the multi-user evaluation.


**Figure 2.** Approximate electric field lines in loading-mode capacitive sensing. If no object is present, the opposite parallel plate can be imagined at infinity. When a person stands above the plate capacitor, the electric field lines terminate on the foot, which acts as the opposite capacitor plate. Electric field lines are always orthogonal to the surface.
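As a first-order intuition for the caption's parallel-plate picture (our own illustrative model, not a formula from the paper), the measured capacitance grows as the body approaches the electrode:

```latex
% Parallel-plate approximation of loading-mode sensing:
% A is the effective overlap area, d the body-electrode distance.
C(d) \approx \varepsilon_0 \varepsilon_r \frac{A}{d}
```

Smaller $d$ (a foot or torso closer to the mat) thus yields a larger sensor reading, which is what makes proximity sensing through the mat surface possible.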

**Figure 3.** The working principle of an active shield, which avoids parasitic coupling from the environment.

**Figure 4.** Prototype of our proposed sensing mat for sport exercise recognition, using eight capacitive proximity sensors. The symmetrical setup enables a simple data augmentation process.

**Figure 5.** A raw sensor input of 6 s length and its corresponding sensor values. The image is simply another representation of the same time-series information. The sensor numbering is shown on the right.

**Figure 7.** The process of applying multiple label-preserving warps to the same sequence. The gray areas on the left denote breaks between exercise repetitions. For the time warp, a polynomial is accumulated and interpreted as time intervals that modify all sensor channels.
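The magnitude and time warps of the figure can be sketched as follows. This is an illustrative reconstruction under our own assumptions: random control points interpolated into a smooth curve stand in for the accumulated polynomial, and the fine/coarse variants ($M_f, M_c, T_f, T_c$) would correspond to different `sigma` and `knots` settings:

```python
import numpy as np

def magnitude_warp(x, sigma=0.2, knots=4, rng=None):
    """Scale each channel of a (time x sensors) array by a smooth random
    curve around 1.0, interpolated from `knots` Gaussian control points."""
    rng = np.random.default_rng(rng)
    n = x.shape[0]
    anchors = np.linspace(0, n - 1, knots)
    curve = np.interp(np.arange(n), anchors, rng.normal(1.0, sigma, knots))
    return x * curve[:, None]

def time_warp(x, sigma=0.2, knots=4, rng=None):
    """Accumulate a smooth random step-size curve into warped time stamps
    and resample every channel at those stamps."""
    rng = np.random.default_rng(rng)
    n = x.shape[0]
    # For small sigma the steps stay positive, so the warped axis is monotone.
    steps = magnitude_warp(np.ones((n, 1)), sigma, knots, rng)[:, 0]
    warped = np.cumsum(steps)
    warped = (warped - warped[0]) / (warped[-1] - warped[0]) * (n - 1)
    return np.stack(
        [np.interp(np.arange(n), warped, x[:, c]) for c in range(x.shape[1])],
        axis=1,
    )
```

Both transformations preserve the class label while perturbing the sequence, which is what makes them usable for augmenting the training data.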

**Figure 8.** The three different kernel structures used to extract different sensor correlations. Along the x-axis, features are extracted across time; along the y-axis, neighboring sensors are combined. These filters are the fundamental building blocks of a convolutional neural network. On the left, the sequential convolutional neural network (CNN) structure is shown with the classification layer on top.
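The effect of the three kernel shapes can be illustrated with a plain "valid" cross-correlation over a (time × sensors) frame. This numpy sketch is not the paper's network implementation; it only shows how each kernel shape mixes time steps, sensors, or both:

```python
import numpy as np

def conv2d_valid(x, kernel):
    """'Valid' cross-correlation of a (time x sensors) frame with a single
    CNN filter: each output cell is the weighted sum of one input patch."""
    th, tw = kernel.shape
    h, w = x.shape
    out = np.empty((h - th + 1, w - tw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + th, j:j + tw] * kernel)
    return out

frame = np.random.default_rng(0).normal(size=(120, 8))  # e.g. 6 s at 20 Hz, 8 sensors
time_kernel = np.ones((5, 1))    # combines time steps of a single sensor
sensor_kernel = np.ones((1, 3))  # combines neighboring sensors at one time step
joint_kernel = np.ones((5, 3))   # combines both dimensions at once
```

A real CNN layer learns the kernel weights and stacks many such filters, but the sliding-window arithmetic is exactly this.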

**Figure 9.** The Parallel Branch Early Fusion convolutional neural network (PBEF-CNN), which concatenates the structural information from spatially neighboring electrodes, is shown on the left. The architecture on the right keeps the low-level feature extraction separated per branch and concatenates the branches only at the final classification stage; we name this structure the Parallel Branch Late Fusion convolutional neural network (PBLF-CNN).

**Figure 10.** Box plots for the person-dependent evaluation, with conventional classifiers in the upper row and CNN models below. The different data augmentation variants are on the x-axis for each model. The + indicates the performance on the test set.

**Figure 11.** Box plots for the person-independent evaluation. The + indicates the performance on the test set.

**Figure 12.** Exercise repetition counting results for each participant and exercise using template selection in the multi-user case.

**Figure 13.** Acceleration data for the eight exercises from an entire sport session of one randomly selected participant. The red, blue, and green curves represent the acceleration in the x, y, and z directions, respectively. The x-axis is the sample time dimension and the y-axis shows the normalized acceleration data. The classes segmental rotation and bridge show the least modulation due to the dormant hand position.

**Table 1.** Comparison of electrode materials. The option to shield is vital for increasing the sensing range and robustness; we therefore chose the double-sided copper plate with a sensing and a shielding side.

| Property | Double-Sided Plate | Sheet | Foil | Cable |
|---|---|---|---|---|
| Shielded | yes | no | no | no |
| Range | 15 cm | 5 cm | 3 cm | 0 cm |
| Robustness | rigid | soft | deformable | solid |
| Signal quality | excellent | noisy | noisy | good |
| Scalability | perfect | good | bad | good |
| Longevity | robust | short | short | robust |

**Table 2.** The data augmentation methods evaluated in this work. $r\in \{0,\dots ,3\}$ denotes the original orientation and the three flips. ${M}_{f},{M}_{c},{T}_{f},{T}_{c}$ denote fine and coarse magnitude and time warps.

| Augmentation Method | Applied Transformations | Data Increase |
|---|---|---|
| Domain | $X_{r}$ with $r\in\{0,\dots,3\}$ | $1:4$ |
| General | $M_{f}(M_{c}(X,\sigma=0.2),\sigma=0.1)$ <br> $T_{f}(T_{c}(M_{f}(M_{c}(X,\sigma=0.1),\sigma=0.2),\sigma=0.2),\sigma=0.2)$ <br> $T_{f}(T_{c}(M_{f}(M_{c}(X,\sigma=0.3),\sigma=0.6),\sigma=0.6),\sigma=0.6)$ | $1:4$ |
| All | $X_{r}$ with $r\in\{0,\dots,3\}$ <br> $T_{f}(T_{c}(M_{f}(M_{c}(X_{r},\sigma=0.1),\sigma=0.2),\sigma=0.2),\sigma=0.2)$ | $1:8$ |
| All Extended | $X_{r}$ with $r\in\{0,\dots,3\}$ <br> $M_{f}(M_{c}(X_{r},\sigma=0.2),\sigma=0.1)$ <br> $T_{f}(T_{c}(M_{f}(M_{c}(X_{r},\sigma=0.1),\sigma=0.2),\sigma=0.2),\sigma=0.2)$ <br> $T_{f}(T_{c}(M_{f}(M_{c}(X_{r},\sigma=0.3),\sigma=0.6),\sigma=0.6),\sigma=0.6)$ | $1:16$ |

| Exercise | Threshold | Precision | Recall | F1 Measure |
|---|---|---|---|---|
| Push-Up | 0.6 | 1.0 | 1.0 | 1.0 |
| Sit-Up | 0.5 | 0.9884 | 1.0 | 0.9942 |
| Quadruped | 0.35 | 0.9839 | 0.9823 | 0.9831 |
| Bridge | 0.45 | 1.0 | 1.0 | 1.0 |
| Trunk Rot. | 0.45 | 0.9638 | 0.8859 | 0.9232 |
| Swim | 0.5 | 0.9841 | 0.9725 | 0.9783 |
| Squat | 0.45 | 0.9224 | 0.5879 | 0.7181 |
| Segmental Rot. | 0.5 | 0.9267 | 0.8932 | 0.9096 |
| Average | – | 0.9712 | 0.9152 | 0.9383 |
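The F1 measure in the table is the harmonic mean of the precision and recall columns; a one-line check (illustrative helper, not from the paper):

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# e.g. the squat row: f1_score(0.9224, 0.5879) rounds to 0.7181
```

The low recall for squats drags its F1 well below the other exercises, which is why the average recall (0.9152) sits noticeably under the average precision (0.9712).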

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Fu, B.; Jarms, L.; Kirchbuchner, F.; Kuijper, A.
ExerTrack—Towards Smart Surfaces to Track Exercises. *Technologies* **2020**, *8*, 17.
https://doi.org/10.3390/technologies8010017
