Open Access Article
Sensors 2018, 18(9), 2892; https://doi.org/10.3390/s18092892

Feature Representation and Data Augmentation for Human Activity Classification Based on Wearable IMU Sensor Data Using a Deep LSTM Neural Network

1 School of Electronics Engineering, Kyungpook National University, 80 Daehak-ro, Buk-gu, Daegu 41566, Korea
2 School of Computing & Informatics Technology, Makerere University, Plot 56, Pool Road, P.O. Box 7062, Kampala, Uganda
* Author to whom correspondence should be addressed.
Received: 16 July 2018 / Revised: 10 August 2018 / Accepted: 27 August 2018 / Published: 31 August 2018
(This article belongs to the Special Issue Wearable Sensors and Devices for Healthcare Applications)

Abstract

Wearable inertial measurement unit (IMU) sensors are powerful enablers for the acquisition of motion data. Specifically, in human activity recognition (HAR), IMU sensor data collected from human motion are combined to form datasets that can be used to learn human activities. However, successful learning of human activities from motion data requires well-designed feature representations of the IMU sensor data and suitable classifiers. Furthermore, the scarcity of labelled data impedes the understanding of the performance capabilities of data-driven learning models. To tackle these challenges, this article makes two primary contributions: first, a spectrogram-based feature extraction approach that operates on raw IMU sensor data; second, an ensemble of feature-space data augmentations that addresses the data scarcity problem. Performance tests were conducted on a deep long short-term memory (LSTM) neural network architecture to explore the influence of the feature representations and augmentations on activity recognition accuracy. The proposed feature extraction approach, combined with the data augmentation ensemble, produces state-of-the-art accuracy in HAR. Each augmentation approach is also evaluated individually to show its influence on classification accuracy. Finally, in addition to our own dataset, the proposed data augmentation technique is evaluated on the University of California, Irvine (UCI) public HAR dataset and yields state-of-the-art accuracy at various learning rates.
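As a rough illustration of the pipeline the abstract describes, the sketch below converts a raw IMU channel into log-scaled spectrogram features, applies a small feature-space augmentation ensemble (Gaussian jitter and random scaling, two common augmentations; the paper's exact ensemble may differ), and stacks two LSTM layers for classification. All function names, hyperparameters (sampling rate, window sizes, layer widths), and the choice of SciPy/TensorFlow are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): spectrogram features from raw IMU
# data, feature-space augmentation, and a stacked LSTM classifier.
import numpy as np
from scipy.signal import spectrogram
import tensorflow as tf

def imu_to_spectrogram(signal, fs=50, nperseg=64, noverlap=32):
    """Convert one IMU channel (1-D array) to a log-magnitude spectrogram.
    fs, nperseg, and noverlap are illustrative values, not the paper's."""
    f, t, Sxx = spectrogram(signal, fs=fs, nperseg=nperseg, noverlap=noverlap)
    return np.log(Sxx + 1e-10)  # log scale stabilises the dynamic range

def augment_features(X, sigma=0.01, scale_range=(0.9, 1.1)):
    """Feature-space augmentation ensemble on a batch X of shape
    (samples, time, features): Gaussian jitter plus random per-sample
    scaling. Labels must be tiled accordingly by the caller."""
    jittered = X + np.random.normal(0.0, sigma, X.shape)
    scales = np.random.uniform(*scale_range, size=(X.shape[0], 1, 1))
    return np.concatenate([X, jittered, X * scales], axis=0)

def build_lstm(input_shape, n_classes):
    """Two stacked LSTM layers followed by a softmax activity classifier."""
    return tf.keras.Sequential([
        tf.keras.layers.LSTM(128, return_sequences=True, input_shape=input_shape),
        tf.keras.layers.LSTM(64),
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])
```

In use, one would compile the model with a categorical cross-entropy loss and train on the augmented spectrogram batches; evaluating at several learning rates, as the abstract reports for the UCI HAR dataset, only requires varying the optimizer's learning-rate argument.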
Keywords: human activity recognition; data augmentation; feature representation; deep learning; long short-term memory; inertial measurement unit sensor
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Cite This Article

MDPI and ACS Style

Steven Eyobu, O.; Han, D.S. Feature Representation and Data Augmentation for Human Activity Classification Based on Wearable IMU Sensor Data Using a Deep LSTM Neural Network. Sensors 2018, 18, 2892.
