Open Access Article

Emotion Recognition from Skeletal Movements

1. Institute of Mechatronics and Information Systems, Lodz University of Technology, 90-924 Lodz, Poland
2. iCV Lab, Institute of Technology, University of Tartu, 51014 Tartu, Estonia
3. Faculty of Engineering, Hasan Kalyoncu University, 27000 Sahinbey, Gaziantep, Turkey
4. Institute of Digital Technologies, Loughborough University London, London E15 2GZ, UK
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Entropy 2019, 21(7), 646; https://doi.org/10.3390/e21070646
Received: 1 March 2019 / Revised: 24 June 2019 / Accepted: 26 June 2019 / Published: 29 June 2019
(This article belongs to the Special Issue Statistical Machine Learning for Human Behaviour Analysis)
Abstract

Automatic emotion recognition has become an important trend in many artificial intelligence (AI) based applications and has been widely explored in recent years. Most research in the area of automated emotion recognition is based on facial expressions or speech signals. Although the influence of the emotional state on body movements is undeniable, this source of expression is still underestimated in automatic analysis. In this paper, we propose a novel method to recognise seven basic emotional states—namely, happy, sad, surprise, fear, anger, disgust and neutral—utilising body movement. We analyse motion capture data recorded under the seven basic emotional states by professional actors and actresses using a Microsoft Kinect v2 sensor. We propose a new representation of affective movements based on sequences of body joints. The proposed algorithm creates a sequential model of affective movement based on low-level features inferred from the spatial location and the orientation of joints within the tracked skeleton. In the experiments, different deep neural networks were employed and compared for recognising the emotional state of the acquired motion sequences. The results show the feasibility of automatic emotion recognition from sequences of body gestures, which can serve as an additional source of information in multimodal emotion recognition.
Keywords: emotion recognition; gestures; body movements; Kinect sensor; neural networks; deep learning
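The abstract describes building per-frame, low-level features from the spatial location and orientation of tracked skeleton joints and feeding the resulting sequences to neural classifiers. A minimal sketch of such a feature pipeline is shown below; the specific feature choice (spine-centred joint positions plus frame-to-frame displacements) and the `spine_idx` parameter are assumptions for illustration, not the paper's exact method.

```python
import numpy as np

def movement_features(joints, spine_idx=0):
    """Build a per-frame feature sequence from tracked skeleton joints.

    joints: array of shape (T, J, 3) -- T frames, J joints (Kinect v2
    tracks 25 joints), each an (x, y, z) position.
    Returns an array of shape (T-1, J*6): per-joint position relative
    to a reference (spine) joint, concatenated with frame-to-frame
    displacement as a simple proxy for location/orientation features.
    """
    # Centre every frame on the spine joint to remove global translation.
    rel = joints - joints[:, spine_idx:spine_idx + 1, :]
    # Frame-to-frame displacement captures the motion component.
    disp = np.diff(rel, axis=0)
    # Align positions with displacements (drop the first frame) and flatten.
    feats = np.concatenate([rel[1:], disp], axis=-1)   # (T-1, J, 6)
    return feats.reshape(feats.shape[0], -1)

# Example: 30 frames of a 25-joint Kinect v2 skeleton.
seq = np.random.default_rng(0).normal(size=(30, 25, 3))
X = movement_features(seq)
print(X.shape)  # (29, 150)
```

Each row of `X` could then serve as one timestep of input to a recurrent or convolutional sequence classifier of the kind compared in the paper.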
This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited (CC BY 4.0).
Sapiński, T.; Kamińska, D.; Pelikant, A.; Anbarjafari, G. Emotion Recognition from Skeletal Movements. Entropy 2019, 21, 646.

Entropy (EISSN 1099-4300) is published by MDPI AG, Basel, Switzerland.