Questionnaires or Inner Feelings: Who Measures the Engagement Better?

Department of Management and Production Engineering, Politecnico di Torino, 10129 Torino, Italy
* Author to whom correspondence should be addressed.
Appl. Sci. 2020, 10(2), 609; https://doi.org/10.3390/app10020609
Received: 24 December 2019 / Revised: 9 January 2020 / Accepted: 10 January 2020 / Published: 15 January 2020
(This article belongs to the Special Issue Applications of Emerging Digital Technologies: Beyond AI & IoT)
This work proposes an innovative method for evaluating users’ engagement that combines the User Engagement Scale (UES) questionnaire with a facial expression recognition (FER) system, two active research topics of increasing interest in the human–computer interaction (HCI) domain. The subject of the study is a 3D simulator that reproduces a virtual FabLab in which users can approach and learn 3D modeling software and 3D printing. During the interaction with the virtual environment, a structured-light camera acquires the participant’s face in real time to capture their spontaneous reactions and compare them with the answers to the UES closed-ended questions. FER methods make it possible to overcome some intrinsic limits of questioning methods, such as insincere answers from interviewees and the lack of correspondence between answers and facial expressions or body language. A convolutional neural network (CNN) was trained on the Bosphorus database (DB) to perform expression recognition and to classify the video frames into three classes of engagement (deactivation, average activation, and activation) according to the model of emotion developed by Russell. The results show that the two methodologies can be integrated to evaluate user engagement, combining weighted answers with spontaneous reactions and increasing the knowledge available for the design of a new product or service.
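As an illustration of the frame-classification step described in the abstract, the following minimal sketch shows a small CNN that maps face crops to the three engagement classes. It is not the authors’ network: the framework (PyTorch), input size, layer dimensions, and class ordering are assumptions made for illustration only, and the training procedure on the Bosphorus database is not reproduced here.

```python
# Illustrative sketch only (not the paper's architecture): a small CNN that
# classifies single face crops into the three engagement classes named in the
# abstract. Input size (64x64 grayscale), layer sizes, and class order are
# assumptions, not details taken from the study.
import torch
import torch.nn as nn

ENGAGEMENT_CLASSES = ["deactivation", "average activation", "activation"]  # assumed order

class EngagementCNN(nn.Module):
    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # grayscale face crop
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 32x32 -> 16x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64),
            nn.ReLU(),
            nn.Linear(64, num_classes),                   # logits for the 3 classes
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = EngagementCNN()
    frame_batch = torch.randn(8, 1, 64, 64)               # 8 dummy face crops
    logits = model(frame_batch)
    predicted = logits.argmax(dim=1)
    print([ENGAGEMENT_CLASSES[i] for i in predicted.tolist()])
```

In such a setup, each video frame captured by the structured-light camera would be cropped to the face region, passed through the network, and assigned to one of the three engagement classes, which can then be compared against the UES questionnaire answers.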
Keywords: user engagement scale; 3D simulator; human-computer interaction; facial expression recognition; deep learning; CNN; user-centered design
Nonis, F.; Olivetti, E.C.; Marcolin, F.; Violante, M.G.; Vezzetti, E.; Moos, S. Questionnaires or Inner Feelings: Who Measures the Engagement Better? Appl. Sci. 2020, 10, 609.