Open Access Article
Multimodal Technologies Interact. 2018, 2(2), 11; https://doi.org/10.3390/mti2020011

Multimodal Observation and Classification of People Engaged in Problem Solving: Application to Chess Players

University Grenoble Alpes, Inria, CNRS, Grenoble INP, LIG, F-38000 Grenoble, France
* Author to whom correspondence should be addressed.
Received: 28 February 2018 / Revised: 22 March 2018 / Accepted: 23 March 2018 / Published: 31 March 2018
(This article belongs to the Special Issue Human Behavior, Emotion and Representation)

Abstract

In this paper, we present the first results of a pilot experiment on the interpretation of multimodal observations of human experts engaged in solving challenging chess problems. Our goal is to investigate the extent to which observations of eye-gaze, posture, emotion, and other physiological signals can be used to model the cognitive state of subjects, and to explore the integration of multiple sensor modalities to improve the reliability of detecting human displays of awareness and emotion. Application domains for such cognitive-model-based systems include healthy autonomous ageing and automated training systems. The ability to observe cognitive states and emotional reactions can allow artificial systems to provide appropriate assistance in such contexts. We observed chess players engaged in problems of increasing difficulty while recording their behavior. Such recordings can be used to estimate a participant's awareness of the current situation and to predict the ability to respond effectively to challenging situations. Feature selection was performed to construct a multimodal classifier relying on the most relevant features from each modality. Initial results indicate that eye-gaze, body posture, and emotion are good features for capturing such awareness. This experiment also validates the use of our equipment as a general and reproducible tool for the study of participants engaged in screen-based interaction and/or problem solving.
Keywords: multimodal perception; affective computing; situation awareness
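
The abstract describes per-modality feature selection feeding a single multimodal classifier. The paper's actual features, data, and model are not given on this page, so the following is only a minimal sketch of that general approach in Python with scikit-learn; the modality dimensions, the synthetic data, the univariate F-test selector, and the SVM are illustrative assumptions, not the authors' pipeline.

```python
# Hypothetical sketch: select the most class-discriminative features across
# concatenated modalities, then train one classifier on the reduced set.
# All feature groups, labels, and model choices below are assumptions.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_samples = 200

# Stand-in feature vectors for three modalities, concatenated per sample.
X = np.hstack([
    rng.normal(size=(n_samples, 10)),  # e.g., eye-gaze fixation statistics
    rng.normal(size=(n_samples, 6)),   # e.g., body-posture cues
    rng.normal(size=(n_samples, 7)),   # e.g., facial-emotion scores
])
y = rng.integers(0, 2, size=n_samples)  # e.g., coping vs. in difficulty

# Univariate F-test keeps the k most relevant features from any modality,
# then an SVM is trained on the reduced representation.
clf = make_pipeline(SelectKBest(f_classif, k=8), SVC(kernel="rbf"))
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

On random stand-in data the score will hover near chance; the point of the sketch is only the structure (per-modality features, selection, single classifier), not the numbers.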
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Cite This Article

MDPI and ACS Style

Guntz, T.; Balzarini, R.; Vaufreydaz, D.; Crowley, J. Multimodal Observation and Classification of People Engaged in Problem Solving: Application to Chess Players. Multimodal Technologies Interact. 2018, 2, 11.


