Sensor-Based Activity Recognition and Interaction

A special issue of Informatics (ISSN 2227-9709).

Deadline for manuscript submissions: closed (28 February 2018) | Viewed by 106176

Special Issue Editors


Prof. Dr. Thomas Kirste
Guest Editor
Institute of Computer Science, University of Rostock, 18051 Rostock, Germany
Interests: mobile multimedia information systems; intelligent environments; activity recognition and annotation; agent-based approaches; technically-assisted rehabilitation

Prof. Dr. Bodo Urban
Guest Editor
Fraunhofer IGD Rostock, Joachim-Jungius-Straße 11, 18059 Rostock, Germany
Interests: multimedia communication; healthcare analytics; wearable interaction; internet of things; smart factories

Dr. Kristina Yordanova
Guest Editor
Institute of Computer Science, University of Rostock, 18051 Rostock, Germany
Interests: activity and intention recognition; human behavior models; knowledge elicitation; natural language processing; automatic extraction of behavior models from textual sources

Special Issue Information

Dear Colleagues,

Ubiquitous systems are becoming an integral part of our everyday lives. Functionality and user experience often depend on accurate sensor-based activity recognition and interaction. Systems aiming to provide users with assistance or to monitor their behavior and condition rely heavily on sensors and the activities and interactions that they can recognize. Providing adequate activity recognition and interaction requires consideration of various interlocked aspects, such as sensors that are capable of capturing relevant behavior, rigorous methods to reason about sensor readings in the context of these behaviors, and effective approaches for assisting and interacting with the users. Each of these aspects is essential and can influence the quality and suitability of the provided service.

We solicit original submissions that contribute novel computer science methods, innovative software solutions, and compelling use cases on any of the following topics:

  • sensors, sensor infrastructures, and sensing technologies needed to detect user behaviors and to provide relevant interactions between systems and users;
  • data- and model-driven methods for intelligent monitoring and user assistance that support users in everyday settings;
  • novel applications and evaluation studies of methods for intelligent monitoring of everyday user behavior and user assistance using sensing technologies;
  • intelligent methods for synthesizing assistance and interaction strategies using sensing technologies.
Prof. Dr. Thomas Kirste
Prof. Dr. Bodo Urban
Dr. Kristina Yordanova
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Informatics is an international peer-reviewed open access quarterly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Human activity recognition
  • Healthcare systems
  • Cognitive systems
  • Knowledge representation and reasoning
  • Expert systems
  • NLP for intelligent systems
  • Ontologies for intelligent systems
  • Knowledge acquisition for intelligent systems
  • Assistive systems in healthcare and manufacturing
  • Novel applications for assessing everyday behavior
  • Smart homes
  • Behavior monitoring and interpretation
  • Human performance measuring
  • Interaction techniques
  • Intelligent user interfaces
  • Input & output modalities
  • Wearable computing and wearable sensing
  • Context awareness
  • Data mining and machine learning for sensor-based intelligent systems
  • Signal reconstruction and interpolation
  • Innovative wearable sensing technologies
  • Machine learning techniques for interpretation of sensor data

Published Papers (10 papers)


Research

18 pages, 564 KiB  
Article
Self-Adaptive Multi-Sensor Activity Recognition Systems Based on Gaussian Mixture Models
by Martin Jänicke, Bernhard Sick and Sven Tomforde
Informatics 2018, 5(3), 38; https://doi.org/10.3390/informatics5030038 - 19 Sep 2018
Cited by 7 | Viewed by 7021
Abstract
Personal wearables such as smartphones or smartwatches are increasingly utilized in everyday life. Frequently, activity recognition is performed on these devices to estimate the current user status and trigger automated actions according to the user’s needs. In this article, we focus on the creation of a self-adaptive activity recognition system, based on inertial measurement units (IMUs), that includes new sensors during runtime. Starting with a classifier based on Gaussian mixture models (GMMs), the density model is adapted to new sensor data fully autonomously by exploiting the marginalization property of normal distributions. To create a classifier from that, label inference is done, either based on the initial classifier or based on the training data. For evaluation, we used more than 10 h of annotated activity data from the publicly available PAMAP2 benchmark dataset. Using these data, we showed the feasibility of our approach and performed 9720 experiments to obtain robust numbers. One approach performed reasonably well, leading to a system improvement on average, with an increase in the F-score of 0.0053, while the other shows clear drawbacks due to a high loss of information during label inference. Furthermore, a comparison with state-of-the-art techniques shows the necessity for further experiments in this area.
(This article belongs to the Special Issue Sensor-Based Activity Recognition and Interaction)
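The marginalization property mentioned in the abstract is what makes Gaussian models easy to grow or shrink across sensor sets: restricting a multivariate normal to a subset of dimensions simply selects the matching entries of its mean and covariance. A minimal NumPy sketch with toy numbers (not the authors' actual model):

```python
import numpy as np

def marginalize_gaussian(mean, cov, keep):
    """Marginalize a multivariate normal onto the dimensions in `keep`.

    For normal distributions, marginalization amounts to selecting the
    corresponding entries of the mean vector and covariance matrix.
    """
    keep = np.asarray(keep)
    return mean[keep], cov[np.ix_(keep, keep)]

# Toy 3-channel sensor model reduced to its first two channels.
mean = np.array([0.0, 1.0, 2.0])
cov = np.array([[1.0, 0.2, 0.0],
                [0.2, 1.5, 0.3],
                [0.0, 0.3, 2.0]])
m2, c2 = marginalize_gaussian(mean, cov, [0, 1])
```

The same selection applies per mixture component, which is why a GMM defined over an extended sensor set remains a valid GMM over any subset of its channels.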

22 pages, 7091 KiB  
Article
Towards Clustering of Mobile and Smartwatch Accelerometer Data for Physical Activity Recognition
by Chelsea Dobbins and Reza Rawassizadeh
Informatics 2018, 5(2), 29; https://doi.org/10.3390/informatics5020029 - 12 Jun 2018
Cited by 34 | Viewed by 9991
Abstract
Mobile and wearable devices now have a greater capability of sensing human activity ubiquitously and unobtrusively through advancements in miniaturization and sensing abilities. However, outstanding issues remain around the energy restrictions of these devices when processing large sets of data. This paper presents our approach that uses feature selection to refine the clustering of accelerometer data to detect physical activity. This also has a positive effect on the computational burden that is associated with processing large sets of data, as energy consumption and resource use are decreased because less data is processed by the clustering algorithms. Raw accelerometer data, obtained from smartphones and smartwatches, have been preprocessed to extract both time and frequency domain features. Principal component analysis feature selection (PCAFS) and correlation feature selection (CFS) have been used to remove redundant features. The reduced feature sets have then been evaluated against three widely used clustering algorithms, including hierarchical clustering analysis (HCA), k-means, and density-based spatial clustering of applications with noise (DBSCAN). Using the reduced feature sets resulted in improved separability, reduced uncertainty, and improved efficiency compared with the baseline, which utilized all features. Overall, the CFS approach in conjunction with HCA produced higher Dunn Index results of 9.7001 for the phone and 5.1438 for the watch features, which is an improvement over the baseline. This comparative study of feature selection and clustering, with the specific algorithms used, has not been performed previously and provides an optimistic and usable approach to recognizing activities using either a smartphone or smartwatch.
(This article belongs to the Special Issue Sensor-Based Activity Recognition and Interaction)
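The feature-reduction-then-clustering pipeline can be sketched with scikit-learn; this shows only the PCA stage followed by k-means on synthetic data (CFS, HCA/DBSCAN, and the Dunn Index computation are omitted), so the numbers here are illustrative, not the paper's:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic stand-in for windowed accelerometer features:
# 200 windows x 12 features driven by 3 informative latent directions.
latent = rng.normal(size=(200, 3))
X = latent @ rng.normal(size=(3, 12)) + 0.05 * rng.normal(size=(200, 12))

X_std = StandardScaler().fit_transform(X)          # zero mean, unit variance
X_red = PCA(n_components=3).fit_transform(X_std)   # drop redundant directions
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_red)
```

Clustering the 3-dimensional projection instead of the full 12-dimensional feature matrix is what yields the reduced computational burden the abstract describes.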

18 pages, 6117 KiB  
Article
Real-Time and Embedded Detection of Hand Gestures with an IMU-Based Glove
by Chaithanya Kumar Mummadi, Frederic Philips Peter Leo, Keshav Deep Verma, Shivaji Kasireddy, Philipp M. Scholl, Jochen Kempfle and Kristof Van Laerhoven
Informatics 2018, 5(2), 28; https://doi.org/10.3390/informatics5020028 - 11 Jun 2018
Cited by 59 | Viewed by 14900
Abstract
This article focuses on the use of data gloves for human-computer interaction concepts, where external sensors cannot always fully observe the user’s hand. A good concept hereby allows the user to intuitively switch the interaction context on demand by using different hand gestures. The recognition of various, possibly complex hand gestures, however, introduces unintentional overhead to the system. Consequently, we present a data glove prototype comprising a glove-embedded gesture classifier utilizing data from Inertial Measurement Units (IMUs) in the fingertips. In an extensive set of experiments with 57 participants, our system was tested with 22 hand gestures, all taken from the French Sign Language (LSF) alphabet. Results show that our system is capable of detecting the LSF alphabet with a mean accuracy score of 92% and an F1 score of 91%, using a complementary filter with a gyroscope-to-accelerometer ratio of 93%. Our approach has also been compared to the local fusion algorithm on an IMU motion sensor, showing faster settling times and less delay after gesture changes. Real-time performance of the recognition is shown to occur within 63 milliseconds, allowing fluent use of the gestures via Bluetooth-connected systems.
(This article belongs to the Special Issue Sensor-Based Activity Recognition and Interaction)
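A complementary filter of the kind the abstract mentions blends the drift-free but noisy accelerometer-derived angle with the smooth but drifting integrated gyroscope rate; the 0.93 weight below mirrors the reported gyroscope-to-accelerometer ratio. A toy sketch, not the glove firmware:

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.93):
    """One fusion step: trust the integrated gyroscope short-term (alpha)
    and the absolute accelerometer estimate long-term (1 - alpha)."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Toy trace: the gyroscope reports no rotation while the accelerometer
# sees a 10 degree tilt; the estimate converges to the accelerometer.
angle = 0.0
for _ in range(100):                 # 1 s of samples at 100 Hz
    angle = complementary_filter(angle, 0.0, 10.0, 0.01)
```

Higher alpha values settle more slowly but suppress more accelerometer noise, which is the trade-off behind the settling-time comparison in the abstract.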

37 pages, 839 KiB  
Article
A Comprehensive Study of Activity Recognition Using Accelerometers
by Niall Twomey, Tom Diethe, Xenofon Fafoutis, Atis Elsts, Ryan McConville, Peter Flach and Ian Craddock
Informatics 2018, 5(2), 27; https://doi.org/10.3390/informatics5020027 - 30 May 2018
Cited by 102 | Viewed by 15168
Abstract
This paper serves as a survey and empirical evaluation of the state-of-the-art in activity recognition methods using accelerometers. The paper is particularly focused on long-term activity recognition in real-world settings. In these environments, data collection is not a trivial matter; thus, there are performance trade-offs between prediction accuracy, which is not the sole system objective, and keeping the maintenance overhead at minimum levels. We examine research that has focused on the selection of activities, the features that are extracted from the accelerometer data, the segmentation of the time-series data, the locations of accelerometers, the selection and configuration trade-offs, the test/retest reliability, and the generalisation performance. Furthermore, we study these questions from an experimental platform and show, somewhat surprisingly, that many disparate experimental configurations yield comparable predictive performance on testing data. Our understanding of these results is that the experimental setup directly and indirectly defines a pathway for context to be delivered to the classifier, and that, in some settings, certain configurations are better suited than alternatives. We conclude by identifying how the main results of this work can be used in practice, specifically in experimental configurations in challenging experimental conditions.
(This article belongs to the Special Issue Sensor-Based Activity Recognition and Interaction)
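The segmentation and feature-extraction choices the survey examines follow a common pattern: cut the stream into fixed overlapping windows, then compute per-window statistics. A minimal sketch on a synthetic signal (the window length, overlap, and feature set are illustrative choices, not the survey's):

```python
import numpy as np

def sliding_windows(signal, size, step):
    """Segment a 1-D accelerometer stream into overlapping windows."""
    starts = range(0, len(signal) - size + 1, step)
    return np.stack([signal[s:s + size] for s in starts])

def window_features(windows):
    """Per-window mean, standard deviation, and mean absolute value."""
    return np.column_stack([windows.mean(axis=1),
                            windows.std(axis=1),
                            np.abs(windows).mean(axis=1)])

sig = np.sin(np.linspace(0, 20 * np.pi, 1000))  # stand-in accelerometer axis
W = sliding_windows(sig, size=100, step=50)     # 50% overlap
F = window_features(W)
```

Window length and overlap are exactly the kind of configuration parameters the paper finds can vary widely while yielding comparable predictive performance.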

17 pages, 975 KiB  
Article
Convolutional Neural Networks for Human Activity Recognition Using Body-Worn Sensors
by Fernando Moya Rueda, René Grzeszick, Gernot A. Fink, Sascha Feldhorst and Michael Ten Hompel
Informatics 2018, 5(2), 26; https://doi.org/10.3390/informatics5020026 - 25 May 2018
Cited by 125 | Viewed by 16181
Abstract
Human activity recognition (HAR) is a classification task for recognizing human movements. Methods of HAR are of great interest as they have become tools for measuring occurrences and durations of human actions, which are the basis of smart assistive technologies and manual process analysis. Recently, deep neural networks have been deployed for HAR in the context of activities of daily living using multichannel time-series. These time-series are acquired from body-worn devices, which are composed of different types of sensors. The deep architectures process these measurements for finding basic and complex features in human corporal movements, and for classifying them into a set of human actions. As the devices are worn at different parts of the human body, we propose a novel deep neural network for HAR. This network handles sequence measurements from different body-worn devices separately. An evaluation of the architecture is performed on three datasets, the Opportunity, PAMAP2, and an industrial dataset, outperforming the state-of-the-art. In addition, different network configurations are also evaluated. We find that applying convolutions per sensor channel and per body-worn device improves the capabilities of convolutional neural networks (CNNs).
(This article belongs to the Special Issue Sensor-Based Activity Recognition and Interaction)
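The "convolutions per sensor channel" idea can be illustrated without any deep-learning framework: each channel gets its own 1-D filter before any cross-channel mixing. The moving-average kernels below are placeholders for the learned filters of an actual network:

```python
import numpy as np

def conv1d_per_channel(x, kernels):
    """Apply a separate 1-D convolution to every sensor channel.

    x:       (channels, time) multichannel recording
    kernels: (channels, k) one filter per channel
    """
    c, t = x.shape
    k = kernels.shape[1]
    out = np.empty((c, t - k + 1))
    for i in range(c):
        out[i] = np.convolve(x[i], kernels[i], mode="valid")
    return out

x = np.random.default_rng(0).normal(size=(3, 50))  # 3 IMU channels, 50 samples
kernels = np.ones((3, 5)) / 5.0                    # moving-average placeholders
y = conv1d_per_channel(x, kernels)
```

Keeping channels separate in the early layers lets each sensor develop its own temporal features before fusion, which is the architectural choice the abstract reports as beneficial.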

14 pages, 2559 KiB  
Article
Fitness Activity Recognition on Smartphones Using Doppler Measurements
by Biying Fu, Florian Kirchbuchner, Arjan Kuijper, Andreas Braun and Dinesh Vaithyalingam Gangatharan
Informatics 2018, 5(2), 24; https://doi.org/10.3390/informatics5020024 - 04 May 2018
Cited by 18 | Viewed by 9647
Abstract
Quantified Self has seen an increased interest in recent years, with devices including smartwatches, smartphones, or other wearables that allow users to monitor their fitness level. This is often combined with mobile apps that use gamification aspects to motivate the user to perform fitness activities, or to increase the amount of sports exercise. Thus far, most applications rely on accelerometers or gyroscopes that are integrated into the devices. They have to be worn on the body to track activities. In this work, we investigated the use of a speaker and a microphone that are integrated into a smartphone to track exercises performed close to it. We combined active sonar and Doppler signal analysis in the ultrasound spectrum that is not perceivable by humans. We wanted to measure the body-weight exercises bicycles, toe touches, and squats, as these consist of challenging radial movements towards the measuring device. We have tested several classification methods, ranging from support vector machines to convolutional neural networks. We achieved an accuracy of 88% for bicycles, 97% for toe touches and 91% for squats on our test set.
(This article belongs to the Special Issue Sensor-Based Activity Recognition and Interaction)
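The underlying physics is the two-way Doppler shift of a reflected ultrasound tone; the 20 kHz carrier and 343 m/s speed of sound below are assumptions for illustration, not the paper's exact parameters:

```python
def doppler_shift(v, f0=20_000.0, c=343.0):
    """Frequency shift (Hz) of a tone at f0 reflected off a target moving
    radially at v m/s (positive towards the device); two-way factor of 2."""
    return 2.0 * v * f0 / c

# A limb moving at 1 m/s towards the phone shifts a 20 kHz tone by ~117 Hz.
shift = doppler_shift(1.0)
```

Shifts of this magnitude sit well within a smartphone microphone's frequency resolution, which is why radial movements such as squats are detectable from the spectrum around the carrier.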

18 pages, 1681 KiB  
Article
An Internet of Things Based Multi-Level Privacy-Preserving Access Control for Smart Living
by Usama Salama, Lina Yao and Hye-young Paik
Informatics 2018, 5(2), 23; https://doi.org/10.3390/informatics5020023 - 03 May 2018
Cited by 8 | Viewed by 9398
Abstract
The presence of the Internet of Things (IoT) in healthcare through the use of mobile medical applications and wearable devices allows patients to capture their healthcare data and enables healthcare professionals to be up-to-date with a patient’s status. Ambient Assisted Living (AAL), which is considered one of the major applications of IoT, is a home environment augmented with embedded ambient sensors to help improve an individual’s quality of life. This domain faces major challenges in providing safety and security when accessing sensitive health data. This paper presents an access control framework for AAL which considers multi-level access and privacy preservation. We focus on two major points: (1) how to use the data collected from ambient sensors and biometric sensors to perform the high-level task of activity recognition; and (2) how to secure the collected private healthcare data via effective access control. We achieve multi-level access control by extending Public Key Infrastructure (PKI) for secure authentication and utilizing Attribute-Based Access Control (ABAC) for authorization. The proposed access control system regulates access to healthcare data by defining policy attributes over healthcare professional groups and data class classifications. We provide guidelines to classify the data classes and healthcare professional groups and describe security policies to control access to the data classes.
(This article belongs to the Special Issue Sensor-Based Activity Recognition and Interaction)
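At its core, Attribute-Based Access Control reduces to matching subject and resource attributes against policy constraints. The sketch below uses invented attribute names and a hypothetical policy, not the paper's actual classification scheme:

```python
def abac_allow(subject, resource, policies):
    """Grant access if any policy's constraints match both the subject
    (healthcare professional) and the resource (data class)."""
    return any(
        all(subject.get(k) == v for k, v in p["subject"].items())
        and all(resource.get(k) == v for k, v in p["resource"].items())
        for p in policies
    )

# Hypothetical policy: physicians may access clinical-grade sensor data.
policies = [{"subject": {"role": "physician"},
             "resource": {"class": "clinical"}}]

ok = abac_allow({"role": "physician"}, {"class": "clinical"}, policies)
denied = abac_allow({"role": "visitor"}, {"class": "clinical"}, policies)
```

Multi-level access falls out naturally: finer-grained data classes and professional groups just become additional attributes in the policy set.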

19 pages, 1739 KiB  
Article
Recognition of Physical Activities from a Single Arm-Worn Accelerometer: A Multiway Approach
by Lieven Billiet, Thijs Swinnen, Kurt De Vlam, Rene Westhovens and Sabine Van Huffel
Informatics 2018, 5(2), 20; https://doi.org/10.3390/informatics5020020 - 16 Apr 2018
Cited by 4 | Viewed by 7389
Abstract
In current clinical practice, functional limitations due to chronic musculoskeletal diseases are still being assessed subjectively, e.g., using questionnaires and function scores. Performance-based methods, on the other hand, offer objective insights. Hence, they recently attracted more interest as an additional source of information. This work offers a step towards the shift to performance-based methods by recognizing standardized activities from continuous readings using a single accelerometer mounted on a patient’s arm. The proposed procedure consists of two steps. Firstly, activities are segmented, including rejection of non-informative segments. Secondly, the segments are associated to predefined activities using a multiway pattern matching approach based on higher order discriminant analysis (HODA). The two steps are combined into a multi-layered framework. Experiments on data recorded from 39 patients with spondyloarthritis show results with a classification accuracy of 94.34% when perfect segmentation is assumed. Automatic segmentation has 89.32% overlap with this ideal scenario. However, combining both drops performance to 62.34% due to several badly-recognized subjects. Still, these results are shown to significantly outperform a more traditional pattern matching approach. Overall, the work indicates promising viability of the technique to automate recognition and, through future work, assessment of functional capacity.
(This article belongs to the Special Issue Sensor-Based Activity Recognition and Interaction)

18 pages, 5859 KiB  
Article
Detecting Transitions in Manual Tasks from Wearables: An Unsupervised Labeling Approach
by Sebastian Böttcher, Philipp M. Scholl and Kristof Van Laerhoven
Informatics 2018, 5(2), 16; https://doi.org/10.3390/informatics5020016 - 29 Mar 2018
Cited by 1 | Viewed by 7477
Abstract
Authoring protocols for manual tasks such as following recipes, manufacturing processes or laboratory experiments requires significant effort. This paper presents a system that estimates individual procedure transitions from the user’s physical movement and gestures recorded with inertial motion sensors. Combined with egocentric or external video recordings, this facilitates efficient review and annotation of video databases. We investigate different clustering algorithms on wearable inertial sensor data recorded alongside video data, to automatically create transition marks between task steps. The goal is to match these marks to the transitions given in a description of the workflow, thus creating navigation cues to browse video repositories of manual work. To evaluate the performance of unsupervised algorithms, the automatically-generated marks are compared to human expert-created labels on two publicly-available datasets. Additionally, we tested the approach on a novel dataset in a manufacturing lab environment, describing an existing sequential manufacturing process. The results from selected clustering methods are also compared to some supervised methods.
(This article belongs to the Special Issue Sensor-Based Activity Recognition and Interaction)
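The core idea (cluster the sensor stream, then read transition marks off label changes) can be sketched on synthetic data; k-means here stands in for the clustering algorithms the paper actually compares:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Synthetic wearable feature stream: three task steps with distinct means.
stream = np.concatenate([rng.normal(0.0, 0.3, 40),
                         rng.normal(3.0, 0.3, 40),
                         rng.normal(6.0, 0.3, 40)]).reshape(-1, 1)

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(stream)
# A transition mark is any index where the cluster assignment changes.
marks = np.flatnonzero(np.diff(labels)) + 1
```

On real wearable data the steps are far less separable, which is why the paper evaluates the generated marks against expert-created labels rather than assuming clean boundaries.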

25 pages, 791 KiB  
Article
A Hybrid Approach to Recognising Activities of Daily Living from Object Use in the Home Environment
by Isibor Kennedy Ihianle, Usman Naeem, Syed Islam and Abdel-Rahman Tawil
Informatics 2018, 5(1), 6; https://doi.org/10.3390/informatics5010006 - 13 Jan 2018
Cited by 13 | Viewed by 7953
Abstract
Accurate recognition of Activities of Daily Living (ADL) plays an important role in providing assistance and support to the elderly and cognitively impaired. Current knowledge-driven and ontology-based techniques model object concepts from assumptions and everyday common knowledge of object use for routine activities. Modelling activities from such information can lead to incorrect recognition of particular routine activities, resulting in possible failure to detect abnormal activity trends. In cases where such prior knowledge is not available, such techniques become virtually unusable. A significant step in the recognition of activities is the accurate discovery of object usage for specific routine activities. This paper presents a hybrid framework for automatic consumption of sensor data and associating object usage to routine activities using Latent Dirichlet Allocation (LDA) topic modelling. This process enables the recognition of simple activities of daily living from object usage and interactions in the home environment. The evaluation of the proposed framework on the Kasteren and Ordonez datasets shows that it yields better results compared to existing techniques.
(This article belongs to the Special Issue Sensor-Based Activity Recognition and Interaction)
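In this setting LDA treats time slices as "documents" and observed objects as "words". The toy count matrix below (invented object names, scikit-learn's LDA as a stand-in for the paper's pipeline) shows per-slice activity mixtures being recovered:

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

# Toy object-use counts: rows are time slices, columns are objects
# (kettle, cup, pan, plate); two latent "activities" generate them.
X = np.array([[5, 4, 0, 0],   # tea-making slices
              [6, 5, 0, 0],
              [0, 0, 5, 4],   # cooking slices
              [0, 0, 6, 5]])

lda = LatentDirichletAllocation(n_components=2, random_state=0)
theta = lda.fit_transform(X)  # per-slice distribution over latent activities
```

Because the latent topics are learned directly from object-use co-occurrence, no hand-authored ontology of which objects belong to which activity is required, which is the gap the framework targets.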
