Open Access | Feature Paper | Article

Motion Artifact Quantification and Sensor Fusion for Unobtrusive Health Monitoring

Philips Chair for Medical Information Technology, RWTH Aachen University, 52074 Aachen, Germany
* Author to whom correspondence should be addressed.
Sensors 2018, 18(1), 38; https://doi.org/10.3390/s18010038
Received: 30 November 2017 / Revised: 19 December 2017 / Accepted: 20 December 2017 / Published: 25 December 2017
(This article belongs to the Special Issue Sensors for Health Monitoring and Disease Diagnosis)
Sensors integrated into objects of everyday life potentially allow unobtrusive health monitoring at home. However, since the coupling of sensors and subject is not as well-defined as in a clinical setting, the signal quality is much more variable and can be disturbed significantly by motion artifacts. One way of tackling this challenge is the combined evaluation of multiple channels via sensor fusion. For robust and accurate sensor fusion, analyzing the influence of motion on different modalities is crucial. In this work, a multimodal sensor setup integrated into an armchair is presented that combines capacitively coupled electrocardiography, reflective photoplethysmography, two high-frequency impedance sensors and two types of ballistocardiography sensors. To quantify motion artifacts, a motion protocol performed by healthy volunteers is recorded with a motion capture system, and reference sensors perform cardiorespiratory monitoring. The shape-based signal-to-noise ratio SNR_S is introduced and used to quantify the effect of motion on different sensing modalities. Based on this analysis, an optimal combination of sensors and fusion methodology is developed and evaluated. Using the proposed approach, beat-to-beat heart rate is estimated with a coverage of 99.5% and a mean absolute error of 7.9 ms on 425 min of data from seven volunteers in a proof-of-concept measurement scenario.
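The fusion and evaluation summarized in the abstract can be illustrated with a minimal sketch. The paper's actual SNR_S definition and fusion algorithm are not reproduced here; the quality-weighted interval fusion below, the channel names, the quality weights, and the example numbers are assumptions chosen purely for illustration of how per-channel beat-to-beat interval estimates might be combined and how coverage and mean absolute error could then be computed against a reference.

```python
import numpy as np

def fuse_intervals(channel_intervals, channel_quality):
    """Quality-weighted fusion of beat-to-beat interval estimates (in seconds)
    from several channels; NaN marks beats a channel failed to detect.
    (Hypothetical fusion rule, not the paper's algorithm.)"""
    intervals = np.asarray(channel_intervals, dtype=float)   # shape: (channels, beats)
    valid = ~np.isnan(intervals)
    weights = np.asarray(channel_quality, dtype=float)[:, None] * valid
    fused = (weights * np.nan_to_num(intervals)).sum(axis=0) / np.maximum(
        weights.sum(axis=0), 1e-12)
    covered = valid.any(axis=0)                              # at least one channel usable
    return fused, covered

# Example: three chair-integrated channels (e.g., cECG, PPG, BCG) estimating five beats
estimates = [[0.80, 0.81, np.nan, 0.79, 0.80],
             [0.82, 0.80, 0.83, np.nan, 0.81],
             [0.79, np.nan, 0.80, 0.80, 0.80]]
quality = [0.9, 0.6, 0.7]                               # e.g., derived from a per-channel SNR
reference = np.array([0.80, 0.80, 0.81, 0.80, 0.80])    # reference ECG intervals

fused, covered = fuse_intervals(estimates, quality)
mae_ms = 1000 * np.mean(np.abs(fused[covered] - reference[covered]))
coverage_pct = 100 * covered.mean()
print(f"coverage: {coverage_pct:.1f} %, mean absolute error: {mae_ms:.1f} ms")
```

In this toy setup, a beat is "covered" as soon as any channel delivers an estimate, which mirrors how combining complementary modalities can raise coverage even when individual channels drop out during motion.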
Keywords: motion artifacts; unobtrusive sensing; sensor fusion; motion capture; heart rate; medical signal processing; biosignals; ambient assisted living
MDPI and ACS Style

Hoog Antink, C.; Schulz, F.; Leonhardt, S.; Walter, M. Motion Artifact Quantification and Sensor Fusion for Unobtrusive Health Monitoring. Sensors 2018, 18, 38.

