Open Access Article
Computers 2017, 6(3), 25; doi:10.3390/computers6030025

Conceiving Human Interaction by Visualising Depth Data of Head Pose Changes and Emotion Recognition via Facial Expressions

1 School of Computer Science and Electronic Engineering, University of Essex, Colchester CO4 3SQ, UK
2 Department of Informatics Engineering, Technological Educational Institute of Crete, Stauromenos 71410, Heraklion, Crete, Greece
This paper is an extended version of our paper published in the Proceedings of the 8th Computer Science and Electronic Engineering Conference (CEEC), University of Essex, UK, 28–30 September 2016.
* Author to whom correspondence should be addressed.
Received: 31 May 2017 / Revised: 20 July 2017 / Accepted: 20 July 2017 / Published: 23 July 2017

Abstract

Affective computing in general, and human activity and intention analysis in particular, comprise a rapidly growing field of research. Head pose and emotion changes present serious challenges in applications such as player training and the ludology experience in serious games, the analysis of customer satisfaction with broadcast and web services, and the monitoring of a driver's attention. Given the increasing prominence and utility of depth sensors, it is now feasible to collect three-dimensional (3D) data on a large scale for subsequent analysis. Discriminative random regression forests were selected in order to estimate head pose changes rapidly and accurately in an unconstrained environment. To complete the secondary process of recognising four dominant universal facial expressions (happiness, anger, sadness and surprise), emotion recognition via facial expressions (ERFE) was adopted. A lightweight data exchange format, JavaScript Object Notation (JSON), is then employed to manipulate the data extracted from the two aforementioned settings. Motivated by the need to generate comprehensible visual representations from different sets of data, in this paper we introduce a system capable of monitoring human activity through head pose and emotion changes, using an affordable 3D sensing technology: the Microsoft Kinect sensor.
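To make the data-exchange step above concrete, the sketch below shows how per-frame head pose angles and a recognised expression could be bundled into JSON for the visualisation stage. It is a minimal illustration in Python: the make_frame_record helper and the field names (timestamp, head_pose, emotion) are assumptions for this example, not the schema actually used in the paper.

import json

# Minimal sketch (assumed schema): bundle one frame's head pose estimate
# and dominant recognised expression into a JSON-serialisable record.
def make_frame_record(timestamp, yaw, pitch, roll, label, confidence):
    return {
        "timestamp": timestamp,                # seconds since capture start
        "head_pose": {"yaw": yaw,              # rotation angles in degrees
                      "pitch": pitch,
                      "roll": roll},
        "emotion": {"label": label,            # happiness, anger, sadness or surprise
                    "confidence": confidence}, # classifier score in [0, 1]
    }

# Serialise a short sequence of frames for the visualisation front end.
frames = [
    make_frame_record(0.033, -12.4, 3.1, 0.8, "happiness", 0.87),
    make_frame_record(0.066, -11.9, 2.9, 0.7, "happiness", 0.85),
]
print(json.dumps(frames, indent=2))

Merging the two recognition outputs into one flat record like this illustrates one plausible attraction of JSON here: the visualisation layer can consume both the head pose stream and the expression stream without format-specific parsers.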
Keywords: human activity analysis; affective computing; data visualisation; depth data; head pose estimation; emotion recognition
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Kalliatakis, G.; Stergiou, A.; Vidakis, N. Conceiving Human Interaction by Visualising Depth Data of Head Pose Changes and Emotion Recognition via Facial Expressions. Computers 2017, 6, 25.


