An Ensemble of Condition Based Classifiers for Device Independent Detailed Human Activity Recognition Using Smartphones †
Abstract
1. Introduction
- (a) We propose an activity recognition framework that uses an ensemble of condition-based classifiers to identify detailed static activities (Sit on chair, Sit on floor, Lying right, Lying left) as well as detailed dynamic activities (Slow walk, Brisk walk). The technique works irrespective of device configuration and usage behavior; a minimal sketch of such an ensemble follows this list.
- (b) The framework relies only on the accelerometer and gyroscope, sensors that nearly every manufacturer builds into its smartphones, which makes the framework ubiquitous.
- (c) The technique can isolate the effect of the accelerometer and the gyroscope on recognizing each individual detailed activity.
2. Related Work
3. Problem Definition
4. Detailed Activity Recognition Framework
4.1. Data Preprocessing and Feature Extraction
4.2. Classification of Detailed Activity
5. Performance Evaluation
5.1. Training Position Selection
5.2. Base Classifier Selection
5.3. Activity Classification Using Only an Accelerometer
5.4. Activity Classification Using Both an Accelerometer and Gyroscope
5.5. Evaluating the Performance of the Proposed System
6. Conclusions
Author Contributions
Conflicts of Interest
References
- Saha, J.; Chakraborty, S.; Chowdhury, C.; Biswas, S.; Aslam, N. Designing Two-Phase Ensemble Classifier for Smartphone based Activity Recognition. In Proceedings of the IEEE eHPWAS'17 Workshop of WiMob, Rome, Italy, 9 October 2017; pp. 257–264.
- Shany, T.; Redmond, S.J.; Narayanan, M.R.; Lovell, N.H. Sensors-Based Wearable Systems for Monitoring of Human Movement and Falls. IEEE Sens. J. 2012, 12, 658–670.
- Banerjee, T.; Keller, J.M.; Skubic, M.; Stone, E. Day or Night Activity Recognition From Video Using Fuzzy Clustering Techniques. IEEE Trans. Fuzzy Syst. 2014, 22, 483–493.
- Maleki Tabar, A.; Keshavarz, A.; Aghajan, H. Smart Home Care Network using Sensor Fusion and Distributed Vision-based Reasoning. In Proceedings of VSSN'06, Santa Barbara, CA, USA, 27 October 2006; pp. 145–154.
- Ren, L.; Shi, W. Chameleon: Personalised and adaptive fall detection of elderly people in home-based environments. Int. J. Sens. Netw. 2016, 20, 142–149.
- Suryadevara, N.K.; Mukhopadhyay, S.C. Determining Wellness through an Ambient Assisted Living Environment. IEEE Intell. Syst. 2014, 29, 30–37.
- Bao, L.; Intille, S.S. Activity recognition from user-annotated acceleration data. Proc. Pervasive 2004, 3001, 1–17.
- Liu, T.; Inoue, Y.; Shibata, K. Development of a wearable sensor system for quantitative gait analysis. Measurement 2009, 42, 978–988.
- Tran, D.N.; Phan, D.D. Human activities recognition in android smartphone using support vector machine. In Proceedings of the 7th International Conference on Intelligent Systems, Modelling and Simulation (ISMS), Bangkok, Thailand, 25–27 January 2016; pp. 64–68.
- Wannenburg, J.; Malekian, R. Physical Activity Recognition From Smartphone Accelerometer Data for User Context Awareness Sensing. IEEE Trans. Syst. Man Cybern. Syst. 2017, 47, 3142–3149.
- De, D.; Bharti, P.; Das, S.K.; Chellappan, S. Multimodal Wearable Sensing for Fine-Grained Activity Recognition in Healthcare. IEEE Internet Comput. 2015, 19, 26–35.
- Pham, C.; Diep, N.N.; Phuong, T.M. A wearable sensor based approach to real-time fall detection and fine-grained activity recognition. J. Mob. Multimed. 2013, 9, 15–26.
- Awan, M.A.; Guangbin, Z.; Kim, H.-C.; Kim, S.-D. Subject-independent human activity recognition using Smartphone accelerometer with cloud support. Int. J. Ad Hoc Ubiquitous Comput. 2015, 20, 172–185.
- Shoaib, M.; Scholten, H.; Havinga, P.J.M. Towards Physical Activity Recognition Using Smartphone Sensors. In Proceedings of the 2013 IEEE 10th International Conference on Ubiquitous Intelligence and Computing and IEEE 10th International Conference on Autonomic and Trusted Computing, Vietri sul Mare, Italy, 18–21 December 2013; pp. 80–87.
- Yuan, Y.; Wang, C.; Zhang, J.; Xu, J.; Li, M. An Ensemble Approach for Activity Recognition with Accelerometer in Mobile-Phone. In Proceedings of the 2014 IEEE 17th International Conference on Computational Science and Engineering, Chengdu, China, 19–21 December 2014; pp. 1469–1474.
- Mo, L.; Liu, S.; Gao, R.X.; Freedson, P.S. Multi-Sensor Ensemble Classifier for Activity Recognition. J. Softw. Eng. Appl. 2012, 5, 113–116.
- Ustev, Y.E.; Incel, O.D.; Ersoy, C. User, device and orientation independent human activity recognition on mobile phones: Challenges and a proposal. In Proceedings of the 2013 ACM Conference on Pervasive and Ubiquitous Computing Adjunct Publication, Zurich, Switzerland, 8–12 September 2013; pp. 1427–1436.
- Yang, R.; Wang, B. PACP: A Position-Independent Activity Recognition Method Using Smartphone Sensors. Information 2016, 7, 72.
- Li, Q.; Stankovic, J.A.; Hanson, M.A.; Barth, A.T.; Lach, J.; Zhou, G. Accurate, Fast Fall Detection Using Gyroscopes and Accelerometer-Derived Posture Information. In Proceedings of the 2009 Sixth International Workshop on Wearable and Implantable Body Sensor Networks, Berkeley, CA, USA, 3–5 June 2009; pp. 138–143.
- Siirtola, P.; Röning, J. Recognizing human activities user-independently on smartphones based on accelerometer data. Int. J. Interact. Multimed. Artif. Intell. 2012, 1, 38–45.
- Anguita, D.; Ghio, A.; Oneto, L.; Parra, X.; Reyes-Ortiz, J.L. A Public Domain Dataset for Human Activity Recognition Using Smartphones. In Proceedings of the 21st European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN 2013), Bruges, Belgium, 24–26 April 2013.
- Bayat, A.; Pomplun, M.; Tran, D.A. A Study on Human Activity Recognition Using Accelerometer Data from Smartphones. Procedia Comput. Sci. 2014, 34, 450–457.
- Mannini, A.; Sabatini, A.M. Machine learning methods for classifying human physical activity from on-body accelerometers. Sensors 2010, 10, 1154–1175.
- Kwapisz, J.R.; Weiss, G.M.; Moore, S.A. Activity recognition using cell phone accelerometers. ACM SIGKDD Explor. Newsl. 2010, 12, 74–82.
- Lara, O.D.; Labrador, M.A. A survey on human activity recognition using wearable sensors. IEEE Commun. Surv. Tutor. 2013, 15, 1192–1209.
- Yurtman, A.; Barshan, B. Activity recognition invariant to sensor orientation with wearable motion sensors. Sensors 2017, 17, 1838.
- Baek, W.-S.; Kim, D.-M.; Bashir, F.; Pyun, J.-Y. Real life applicable fall detection system based on wireless body area network. In Proceedings of the 2013 IEEE 10th Consumer Communications and Networking Conference (CCNC), Las Vegas, NV, USA, 10–13 January 2013; pp. 62–67.
- Ermes, M.; Parkka, J.; Cluitmans, L. Advancing from offline to online activity recognition with wearable sensors. In Proceedings of the 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Vancouver, BC, Canada, 20–25 August 2008; pp. 4451–4454.
- Kao, T.-P.; Lin, C.-W.; Wang, J.-S. Development of a portable activity detector for daily activity recognition. In Proceedings of the 2009 IEEE International Symposium on Industrial Electronics, Seoul, South Korea, 5–8 July 2009; pp. 115–120.
- Lara, O.D.; Perez, A.J.; Labrador, M.A.; Posada, J.D. Centinela: A human activity recognition system based on acceleration and vital sign data. Pervasive Mob. Comput. 2012, 8, 717–729.
- Google APIs for Android. Available online: https://developers.google.com/android/reference/com/google/android/gms/location/DetectedActivity (accessed on 26 February 2018).
- Coley, B.; Najafi, B.; Paraschiv-Ionescu, A.; Aminian, K. Stair climbing detection during daily physical activity using a miniature gyroscope. Gait Posture 2005, 22, 287–294.
- Brezmes, T.; Gorricho, J.-L.; Cotrina, J. Activity recognition from accelerometer data on a mobile phone. In Distributed Computing, Artificial Intelligence, Bioinformatics, Soft Computing, and Ambient Assisted Living; Lecture Notes in Computer Science, Volume 5518; Springer: Berlin/Heidelberg, Germany, 2009; pp. 796–799.
- Lustrek, M.; Kaluza, B. Fall Detection and Activity Recognition with Machine Learning. Informatica 2009, 33, 197–204.
- Anguita, D.; Ghio, A.; Oneto, L.; Parra, X.; Reyes-Ortiz, J.L. Human Activity Recognition on Smartphones Using a Multiclass Hardware-Friendly Support Vector Machine. In Proceedings of the 4th International Conference on Ambient Assisted Living and Home Care (IWAAL'12), Vitoria-Gasteiz, Spain, 3–5 December 2012; Springer: Berlin/Heidelberg, Germany; pp. 216–223.
- Kunze, K.; Bahle, G.; Lukowicz, P.; Partridge, K. Can magnetic field sensors replace gyroscopes in wearable sensing applications? In Proceedings of the 2010 International Symposium on Wearable Computers (ISWC), Seoul, South Korea, 10–13 October 2010; pp. 1–4.
- Wu, W.; Dasgupta, S.; Ramirez, E.E.; Peterson, C.; Norman, G.J. Classification Accuracies of Physical Activities Using Smartphone Motion Sensors. J. Med. Internet Res. 2012, 14, e130.
- Zhou, Z.H. Ensemble Learning. In Encyclopedia of Biometrics; Li, S.Z., Jain, A.K., Eds.; Springer: New York, NY, USA, 2009; pp. 270–273.
- Martin, H.; Bernardos, A.M.; Iglesias, J.; Casar, J.R. Activity logging using lightweight classification techniques in mobile devices. Pers. Ubiquitous Comput. 2013, 17, 675–695.
- Roy, N.; Misra, A.; Cook, D. Ambient and smartphone sensor assisted ADL recognition in multi-inhabitant smart environments. J. Ambient Intell. Humaniz. Comput. 2017, 7, 1–19.
- Miao, F.; He, Y.; Liu, J.; Li, Y.; Ayoola, I. Identifying typical physical activity on smartphone with varying positions and orientations. BioMed Eng. OnLine 2015, 14, 32.
- Boslaugh, S. Statistics in a Nutshell, 2nd ed.; O'Reilly Media, Inc.: Sebastopol, CA, USA, 2012.
- Preece, S.J.; Goulermas, J.Y.; Kenney, L.P.J.; Howard, D. A Comparison of Feature Extraction Methods for the Classification of Dynamic Activities From Accelerometer Data. IEEE Trans. Biomed. Eng. 2009, 56, 871–879.
- Dinakaran, S.; Thangaiah, P.R.J. Role of Attribute Selection in Classification Algorithms. Int. J. Sci. Eng. Res. 2013, 4, 67–71.
- Sensor Kinetics Pro. Available online: https://play.google.com/store/apps/details?id=com.innoventions.sensorkineticspro&hl=en (accessed on 10 January 2018).
- MATLAB 2013. Available online: https://www.mathworks.com (accessed on 10 January 2018).
- Weka—University of Waikato. Available online: www.cs.waikato.ac.nz/ml/weka/ (accessed on 10 January 2018).
- Wang, W.Z.; Guo, Y.W.; Huang, B.Y.; Zhao, G.R.; Liu, B.Q.; Wang, L. Analysis of filtering methods for 3D acceleration signals in body sensor network. In Proceedings of the 2011 International Symposium on Bioelectronics and Bioinformatics, Suzhou, China, 3–5 November 2011; pp. 263–266.
- Wolpert, D.H.; Macready, W.G. No Free Lunch Theorems for Optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82.
Existing Work, Year | Activities Considered | Device and Position | Sensors | Remarks |
---|---|---|---|---|
[7] in 2004 | Walking, Walking carrying items, Sitting & relaxing, Working on computer, Standing still, Eating/drinking, Watching TV, Reading, Running, Bicycling, Stretching, Strength-training, Scrubbing, Vacuuming, Folding laundry, Lying down & relaxing, Brushing teeth, Climbing stairs, Riding elevator, Riding escalator | Wearable sensors worn at right hip, dominant wrist, non-dominant upper arm, dominant ankle, and non-dominant thigh | Five biaxial accelerometers | Using a decision tree classifier, they achieved an overall accuracy of 80% and concluded that the user's thigh and dominant wrist are the best accelerometer positions for distinguishing between activities. |
[19] in 2009 | Climb stairs, walk, sit, jump, lay down, run, run on stairs, fall-like motions (quickly sit down upright, quickly sit down reclined), flat-surface falls (fall forward, fall backward, fall right, fall left), inclined falls (fall on stairs) | TEMPO 3.0 sensor nodes (wearable; thigh, chest) | Accelerometer, Gyroscope | In addition to activities of daily living, falls are detected; the system differentiates intentional from unintentional transitions. |
[23] in 2010 | Lying, Cycling, Climbing, Walking, Running, Sitting, Standing | Wearable sensor | Accelerometer | Uses a wearable accelerometer and introduces classification with a hidden Markov model. |
[24] in 2010 | Walking, Jogging, Climbing stairs, Sitting, Standing | Smartphone placed in pocket | Accelerometer | Achieved 90% accuracy using J48, logistic regression, and a multilayer perceptron. |
[20] in 2012 | Idle, Walking, Cycling, Driving, Running | Smartphone placed in pocket | Accelerometer | The system is implemented both offline and online. In the online mode, recognition on the device used only a limited number of randomly chosen training instances because of the limited computational power of smartphones. Although the online results are almost comparable to the offline ones, the system is not entirely user independent. |
[21] in 2013 | Walking, Climbing upstairs, Climbing downstairs, Sitting, Standing, Laying down | Smartphone, waist position | Accelerometer | One of the best public datasets for smartphone-based HAR to date. An accuracy of 96% is achieved on it using a multiclass SVM, with features such as the energy of different frequency bands, frequency skewness, and the angle between vectors. |
[22] in 2014 | Running, Slow walk, Fast walk, Aerobic dancing, Stairs up, Stairs down | Smartphone, pocket and held in hand | Accelerometer | Introduces "fine-grained" activities for the walk class; the rest are coarse-grained. Position independence is addressed with two positions. An accuracy of 91.5% is achieved. |
[18] in 2016 | Climb upstairs, Climb downstairs, Walking, Running, Standing | Smartphone; coat pocket, trouser pocket, hand, bag | Accelerometer, Gyroscope | Considers four positions and proposes a position-independent system through parameter tuning. |
All Features Considered for Analysis (each feature is calculated for …) | | Important Features Obtained through Feature Selection | |
---|---|---|---|
Time Domain Features | Frequency Domain Features | Time Domain Features | Frequency Domain Features |
Mean | Mean | Min() | Mean() |
Standard Deviation | Median | Max() | Mean() |
Variance | | Max() | Mean() |
Mean Absolute Deviation | | Max() | Median() |
Entropy | | Max() | Median() |
Min | | | |
Max | | | |
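As a rough illustration of how these per-window features could be computed, the sketch below uses NumPy/SciPy. It is not the authors' code: the table's empty parentheses do not specify which signal each selected feature takes, so every feature is simply computed for every axis here, and the 16-bin histogram behind the entropy estimate is likewise our assumption.

```python
# Illustrative per-window feature extraction for the features listed in
# the table above (a sketch, not the authors' exact code). Each feature
# is computed per sensor axis; the histogram binning for entropy is an
# assumption.
import numpy as np
from scipy.stats import entropy


def window_features(window):
    """window: (n_samples, n_axes) array holding one 2 s sensor window."""
    feats = []
    for axis in window.T:
        # Time-domain features: mean, standard deviation, variance,
        # mean absolute deviation, min, max.
        feats += [axis.mean(), axis.std(), axis.var(),
                  np.mean(np.abs(axis - axis.mean())),
                  axis.min(), axis.max()]
        # Entropy of the normalized amplitude histogram (assumed 16 bins).
        hist, _ = np.histogram(axis, bins=16)
        feats.append(entropy(hist + 1e-12))
        # Frequency-domain features: mean and median of the one-sided
        # magnitude spectrum.
        spectrum = np.abs(np.fft.rfft(axis))
        feats += [spectrum.mean(), np.median(spectrum)]
    return np.asarray(feats)
```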
Setup Parameter | Value |
---|---|
Devices | MotoG4 (D1), Pixel (D2), MotoZ2 (D3), RedmiNote4 (D4), Redmi2Prime (D5) |
Data sampling rate | 50 samples/s |
Filtering method | Butterworth and median filters [48] |
Window type and size | 2 s sliding window with 1 s overlap |
Data transformation technique | Base-10 logarithm |
Dataset size | Approximately 540,000 samples |
Number of users | 10 |
Feature selection technique | InfoGainAttributeEval with RankerSearch |
Training datasets | D1-SPT, D1-RPT, D2-SPT, D2-RPT |
Test datasets | D1-SPT-Tst, D3-SPT-Tst, D4-SPT-Tst, D5-SPT-Tst, D4-Hand-Tst |
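The filtering and windowing settings above translate naturally into a short SciPy sketch. The paper cites [48] for the filtering method, but the table does not list the Butterworth order or cutoff, so the order (3) and cutoff (20 Hz) below are our assumptions; applying the base-10 logarithm to window magnitudes is likewise only one plausible reading of the "data transformation" row.

```python
# Sketch of the preprocessing chain in the setup table (assumed
# Butterworth order/cutoff; assumed placement of the log transform).
import numpy as np
from scipy.signal import butter, filtfilt, medfilt

FS = 50            # sampling rate: 50 samples/s
WIN = 2 * FS       # 2 s sliding window
HOP = 1 * FS       # 1 s hop, i.e., consecutive windows overlap by 1 s


def denoise(signal):
    """Median filter followed by a low-pass Butterworth filter."""
    smoothed = medfilt(signal, kernel_size=3)
    b, a = butter(3, 20 / (FS / 2), btype="low")  # assumed order/cutoff
    return filtfilt(b, a, smoothed)


def log_windows(signal):
    """Yield base-10 log-transformed 2 s windows with a 1 s overlap."""
    clean = denoise(np.asarray(signal, dtype=float))
    for start in range(0, len(clean) - WIN + 1, HOP):
        window = clean[start:start + WIN]
        yield np.log10(np.abs(window) + 1e-9)  # assumed form of the transform
```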
Classifier Name | Parameter Name | Parameter Detail | Default Value |
---|---|---|---|
Decision Tree (J48) | minObj | Minimum number of instances per leaf | 2 |
K Nearest Neighbor (IBK) | K | The number of neighbors to use | 1 |
Logistic Regression (LR) | MaxIts | Maximum number of iterations to perform | −1 |
Multilayer Perceptron (MLP) | Seed | The value used to seed the random number generator | 0 |
 | Learning rate | Learning rate for the back-propagation algorithm | 0.3 |
 | Momentum | Momentum rate for the back-propagation algorithm | 0.2 |
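For readers outside the Weka ecosystem, the base learners and defaults above map roughly onto scikit-learn as follows. The implementations are not identical (J48 is not the same tree inducer as `DecisionTreeClassifier`, and Weka's `MaxIts = −1` means "iterate until convergence", which has no direct scikit-learn equivalent), so treat this as an approximation rather than the authors' configuration.

```python
# Rough scikit-learn analogues of the Weka base learners and their
# defaults from the table above (an approximation, not an exact match).
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

base_classifiers = {
    "J48": DecisionTreeClassifier(min_samples_leaf=2),  # minObj = 2
    "IBK": KNeighborsClassifier(n_neighbors=1),         # K = 1
    # Weka's MaxIts = -1 iterates until convergence; scikit-learn needs
    # a finite cap, so a large max_iter stands in for it here.
    "LR": LogisticRegression(max_iter=10000),
    "MLP": MLPClassifier(solver="sgd",
                         random_state=0,                # seed = 0
                         learning_rate_init=0.3,        # learning rate
                         momentum=0.2),                 # momentum
}
```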
Classification performance using only the accelerometer (cf. Section 5.3):

Training Dataset (TrDt) | Test Dataset (TsDt) | Classifier | Parameter Value (PtVl) | Accuracy (Ac) | Error (Er) |
---|---|---|---|---|---|
D1-SPT | D1-SPT-Tst | Classifier1 | 33 | 85% | 8% |
D1-RPT | | Classifier2 | 40 | 62% | 32% |
D2-SPT | | Classifier3 | −1 | 71% | 21% |
D2-RPT | | Classifier4 | 21 | 65% | 32% |
 | | Ensemble | — | 91% | 5% |
D1-SPT | D5-SPT-Tst | Classifier1 | 20 | 67% | 16% |
D1-RPT | | Classifier2 | 22 | 60% | 28% |
D2-SPT | | Classifier3 | 23 | 75% | 18% |
D2-RPT | | Classifier4 | 21 | 68% | 17% |
 | | Ensemble | — | 90% | 5% |
D1-SPT | D3-SPT-Tst | Classifier1 | 40 | 65% | 27% |
D1-RPT | | Classifier2 | 22 | 51% | 34% |
D2-SPT | | Classifier3 | 40 | 68% | 25% |
D2-RPT | | Classifier4 | 22 | 51% | 31% |
 | | Ensemble | — | 85% | 8% |
D1-SPT | D4-SPT-Tst | Classifier1 | −1 | 76% | 11% |
D1-RPT | | Classifier2 | 20 | 60% | 30% |
D2-SPT | | Classifier3 | 23 | 60% | 26% |
D2-RPT | | Classifier4 | 21 | 62% | 36% |
 | | Ensemble | — | 79% | 6% |
D1-SPT | D4-Hand-Tst | Classifier1 | 21 | 64% | 20% |
D1-RPT | | Classifier2 | 35 | 55% | 29% |
D2-SPT | | Classifier3 | 32 | 56% | 25% |
D2-RPT | | Classifier4 | 28 | 54% | 27% |
 | | Ensemble | — | 78% | 12% |
Classification performance using both the accelerometer and gyroscope (cf. Section 5.4):

Training Dataset (TrDt) | Test Dataset (TsDt) | Classifier | Parameter Value (PtVl) | Accuracy (Ac) | Error (Er) |
---|---|---|---|---|---|
D1-SPT | D1-SPT-Tst | Classifier1 | 21 | 88% | 7% |
D1-RPT | | Classifier2 | −1 | 64% | 32% |
D2-SPT | | Classifier3 | 21 | 70% | 24% |
D2-RPT | | Classifier4 | 20 | 61% | 31% |
 | | Ensemble | — | 94% | 4% |
D1-SPT | D5-SPT-Tst | Classifier1 | 23 | 55% | 26% |
D1-RPT | | Classifier2 | 35 | 60% | 24% |
D2-SPT | | Classifier3 | 28 | 79% | 18% |
D2-RPT | | Classifier4 | 30 | 75% | 12% |
 | | Ensemble | — | 93% | 3% |
D1-SPT | D3-SPT-Tst | Classifier1 | 25 | 61% | 26% |
D1-RPT | | Classifier2 | 28 | 50% | 33% |
D2-SPT | | Classifier3 | 32 | 66% | 28% |
D2-RPT | | Classifier4 | 40 | 60% | 29% |
 | | Ensemble | — | 86% | 8% |
D1-SPT | D4-SPT-Tst | Classifier1 | 26 | 64% | 25% |
D1-RPT | | Classifier2 | 31 | 60% | 30% |
D2-SPT | | Classifier3 | 29 | 70% | 12% |
D2-RPT | | Classifier4 | 22 | 68% | 25% |
 | | Ensemble | — | 81% | 7% |
D1-SPT | D4-Hand-Tst | Classifier1 | −1 | 56% | 31% |
D1-RPT | | Classifier2 | 21 | 50% | 34% |
D2-SPT | | Classifier3 | 28 | 65% | 24% |
D2-RPT | | Classifier4 | 26 | 67% | 27% |
 | | Ensemble | — | 80% | 11% |
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Saha, J.; Chowdhury, C.; Roy Chowdhury, I.; Biswas, S.; Aslam, N. An Ensemble of Condition Based Classifiers for Device Independent Detailed Human Activity Recognition Using Smartphones †. Information 2018, 9, 94. https://doi.org/10.3390/info9040094