Open Access Article

Multimodal Observation and Classification of People Engaged in Problem Solving: Application to Chess Players

University Grenoble Alpes, Inria, CNRS, Grenoble INP, LIG, F-38000 Grenoble, France
Author to whom correspondence should be addressed.
Multimodal Technologies Interact. 2018, 2(2), 11
Received: 28 February 2018 / Revised: 22 March 2018 / Accepted: 23 March 2018 / Published: 31 March 2018
(This article belongs to the Special Issue Human Behavior, Emotion and Representation)
In this paper we present the first results of a pilot experiment in the interpretation of multimodal observations of human experts engaged in solving challenging chess problems. Our goal is to investigate the extent to which observations of eye-gaze, posture, emotion and other physiological signals can be used to model the cognitive state of subjects, and to explore the integration of multiple sensor modalities to improve the reliability of detecting human displays of awareness and emotion. Application domains for such cognitive-model-based systems include, for instance, healthy autonomous ageing and automated training systems. The ability to observe cognitive state and emotional reactions can allow artificial systems to provide appropriate assistance in such contexts. We observed chess players engaged in problems of increasing difficulty while recording their behavior. Such recordings can be used to estimate a participant's awareness of the current situation and to predict the ability to respond effectively to challenging situations. Feature selection was performed to construct a multimodal classifier relying on the most relevant features from each modality. Initial results indicate that eye-gaze, body posture and emotion are good features for capturing such awareness. This experiment also validates the use of our equipment as a general and reproducible tool for the study of participants engaged in screen-based interaction and/or problem solving.
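The pipeline described in the abstract, selecting the most relevant features from each modality before fusing them into a single classifier, can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' actual pipeline: the feature dimensions, the labels, and the choice of an ANOVA F-test (`SelectKBest`) with an SVM are all assumptions made for the example.

```python
# Hedged sketch of per-modality feature selection followed by a fused
# classifier. Feature counts, labels, and the SelectKBest + SVM choice
# are illustrative assumptions, not the paper's reported method.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 120  # synthetic observation windows

# Synthetic feature blocks for three modalities.
gaze = rng.normal(size=(n, 8))
posture = rng.normal(size=(n, 6))
emotion = rng.normal(size=(n, 4))
y = rng.integers(0, 2, size=n)  # e.g., coping vs. overwhelmed

# Inject weak signal into a few features so selection has something to find.
gaze[:, 0] += 1.5 * y
posture[:, 1] += 1.0 * y

def select_per_modality(X, y, k):
    """Keep the k features most associated with the label (ANOVA F-test)."""
    return SelectKBest(f_classif, k=k).fit_transform(X, y)

# Fuse the selected features from each modality into one feature vector.
fused = np.hstack([
    select_per_modality(gaze, y, k=3),
    select_per_modality(posture, y, k=2),
    select_per_modality(emotion, y, k=2),
])

scores = cross_val_score(SVC(kernel="rbf"), fused, y, cv=5)
print(fused.shape, round(scores.mean(), 2))
```

Selecting features within each modality before fusion (rather than on the concatenated vector) keeps every modality represented in the final classifier, which matters when one modality has far more raw features than the others.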
Keywords: multimodal perception; affective computing; situation awareness
MDPI and ACS Style

Guntz, T.; Balzarini, R.; Vaufreydaz, D.; Crowley, J. Multimodal Observation and Classification of People Engaged in Problem Solving: Application to Chess Players. Multimodal Technologies Interact. 2018, 2, 11.


