Open Access Article
Appl. Sci. 2017, 7(6), 567; doi:10.3390/app7060567

A New Framework of Human Interaction Recognition Based on Multiple Stage Probability Fusion

1 School of Automation, Shenyang Aerospace University, Shenyang 110036, China
2 School of Computing, University of Portsmouth, Portsmouth PO1 3HE, UK
* Author to whom correspondence should be addressed.
Academic Editors: Plamen Angelov and José Antonio Iglesias Martínez
Received: 19 February 2017 / Revised: 12 May 2017 / Accepted: 24 May 2017 / Published: 1 June 2017
(This article belongs to the Special Issue Human Activity Recognition)

Abstract

Vision-based human interaction recognition is a challenging research topic in computer vision. Current interaction recognition algorithms suffer from several important problems, such as overly complex feature representations and inaccurate feature extraction caused by incorrect human body segmentation. To address these problems, a novel human interaction recognition method based on multiple stage probability fusion is proposed in this paper. Taking the moment of bodily contact between the interacting persons as the cut-off point, the interaction process is divided into three stages: a start stage, an execution stage and an end stage. In the start and end stages, when there is no contact between the two persons, each person's motion is extracted and recognized separately. In the execution stage, the two persons' motion is extracted and recognized as a whole. In the recognition process, the final result is obtained by the weighted fusion of the probabilities produced in the different stages. The proposed method not only simplifies feature extraction and representation, but also avoids the incorrect feature extraction caused by occlusion. Experimental results on the UT-Interaction dataset demonstrate that the proposed method performs better than other recent interaction recognition methods.
Keywords: human interaction recognition; piecewise fusion; weighted fusing; Hidden Markov Model
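The abstract describes obtaining the final label by a weighted fusion of the class probabilities produced in the start, execution and end stages. The following is a minimal sketch of that fusion step only, assuming each stage classifier (for example, one Hidden Markov Model per interaction class) has already produced per-class probabilities; the stage weights and the fuse_stage_probabilities helper are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fuse_stage_probabilities(stage_probs, stage_weights):
    """Combine per-stage class probabilities with a weighted sum.

    stage_probs   -- dict mapping stage name -> array of shape (n_classes,)
    stage_weights -- dict mapping stage name -> scalar weight
    """
    n_classes = len(next(iter(stage_probs.values())))
    fused = np.zeros(n_classes)
    for stage, probs in stage_probs.items():
        fused += stage_weights[stage] * np.asarray(probs, dtype=float)
    return fused / fused.sum()  # renormalise so the result is a distribution

# Example: three stages over four hypothetical interaction classes.
# In the start and end stages each person is modelled separately; in the
# execution stage the two persons are treated as a whole (values are made up).
probs = {
    "start":     [0.10, 0.60, 0.20, 0.10],
    "execution": [0.05, 0.70, 0.15, 0.10],
    "end":       [0.20, 0.50, 0.20, 0.10],
}
weights = {"start": 0.3, "execution": 0.4, "end": 0.3}  # assumed stage weights

fused = fuse_stage_probabilities(probs, weights)
predicted_class = int(np.argmax(fused))
print(fused, predicted_class)
```

The stage weights here are placeholders; in practice they would be chosen or learned to reflect how discriminative each stage is for the interaction classes.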
This is an open access article distributed under the Creative Commons Attribution (CC BY 4.0) License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Ji, X.; Wang, C.; Ju, Z. A New Framework of Human Interaction Recognition Based on Multiple Stage Probability Fusion. Appl. Sci. 2017, 7, 567.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
