Open Access Article
Sensors 2017, 17(7), 1685; https://doi.org/10.3390/s17071685

Noncontact Sleep Study by Multi-Modal Sensor Fusion

1 Department of Electronics and Computer Engineering, Hanyang University, Seoul 04763, Korea
2 Intelligence Lab, LG Electronics Woomyon Research and Development Campus, Seoul 06763, Korea
3 Department of Otorhinolaryngology-Head and Neck Surgery, College of Medicine, Hanyang University, Seoul 04763, Korea
* Authors to whom correspondence should be addressed.
Received: 28 June 2017 / Revised: 14 July 2017 / Accepted: 20 July 2017 / Published: 21 July 2017
(This article belongs to the Special Issue Multi-Sensor Integration and Fusion)

Abstract

Polysomnography (PSG) is considered the gold standard for determining sleep stages, but because of the obtrusiveness of its sensor attachments, sleep stage classification algorithms using noninvasive sensors have been developed over the years. However, previous studies have not yet proven reliable, and most such products are designed for healthy consumers rather than for patients with sleep disorders. We present a novel approach to classify sleep stages via low-cost, noncontact multi-modal sensor fusion, which extracts sleep-related vital signs from radar signals together with a sound-based context-awareness technique. This work is uniquely designed around the PSG data of sleep disorder patients, received and certified by professionals at Hanyang University Hospital. The proposed algorithm further incorporates medical and statistical knowledge to determine personally adjusted thresholds and to devise post-processing. The efficiency of the proposed algorithm is highlighted by contrasting sleep stage classification performance between single-sensor and sensor-fusion algorithms. To assess the possibility of commercializing this work, the classification results were also compared with those of a commercial sleep monitoring device, the ResMed S+. The proposed algorithm was evaluated on randomly selected patients undergoing PSG examination, and the results show a promising novel approach for determining sleep stages in a low-cost and unobtrusive manner.
Keywords: radar; vital signal; sleep stage; medical device; sensor fusion; microphone
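The fusion idea described in the abstract, combining radar-derived vital signs with a sound-based context flag and personally adjusted thresholds, could be sketched roughly as follows. All names, rules, and threshold values here are illustrative assumptions for exposition; they are not the authors' actual algorithm.

```python
# Illustrative sketch (not the paper's algorithm): classify 30-s epochs
# from radar-derived respiration/movement features plus a sound-based
# snoring flag, using a per-subject (personally adjusted) threshold.
from dataclasses import dataclass

@dataclass
class Epoch:
    resp_rate: float   # breaths per minute, estimated from radar
    movement: float    # normalized body-movement energy, from radar
    snoring: bool      # sound-based context flag from the microphone

def personal_threshold(baseline_resp: float) -> float:
    """Adjust the wake/sleep respiration threshold to the subject
    (assumed 15% margin above the subject's baseline rate)."""
    return baseline_resp * 1.15

def classify(epoch: Epoch, baseline_resp: float) -> str:
    """Assign WAKE / NREM / REM to one epoch with toy decision rules."""
    if epoch.movement > 0.5 or epoch.resp_rate > personal_threshold(baseline_resp):
        return "WAKE"   # large movement or elevated respiration
    if epoch.snoring:
        return "NREM"   # assumed heuristic: snoring is rare in REM
    return "REM"

epochs = [Epoch(18.0, 0.7, False), Epoch(13.0, 0.1, True), Epoch(12.5, 0.05, False)]
stages = [classify(e, baseline_resp=14.0) for e in epochs]
print(stages)  # ['WAKE', 'NREM', 'REM']
```

In a real system the decision rules would be learned from certified PSG annotations and followed by post-processing (e.g., smoothing implausible stage transitions), as the abstract indicates.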
Figures

Figure 1

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

MDPI and ACS Style

Chung, K.-Y.; Song, K.; Shin, K.; Sohn, J.; Cho, S.H.; Chang, J.-H. Noncontact Sleep Study by Multi-Modal Sensor Fusion. Sensors 2017, 17, 1685.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Sensors EISSN 1424-8220, published by MDPI AG, Basel, Switzerland