Open Access Article

Evaluating and Validating Emotion Elicitation Using English and Arabic Movie Clips on a Saudi Sample

1 College of Computer and Information Science, Prince Sultan University, Riyadh 11586, Saudi Arabia
2 Research School of Computer Science, Australian National University, Canberra 2600, Australia
3 School of Information Technology & Systems, University of Canberra, Canberra 2600, Australia
4 Center for Complex Engineering Systems, King Abdulaziz City for Science and Technology, Riyadh 11586, Saudi Arabia
* Author to whom correspondence should be addressed.
Sensors 2019, 19(10), 2218; https://doi.org/10.3390/s19102218
Received: 18 April 2019 / Revised: 5 May 2019 / Accepted: 11 May 2019 / Published: 14 May 2019
(This article belongs to the Special Issue Sensors for Affective Computing and Sentiment Analysis)
With the advancement of technology in both hardware and software, estimating human affective states has become possible. Movie clips are currently used because they are a widely accepted method of eliciting emotions in a replicable way. However, cultural differences might influence how effectively some video clips elicit the target emotions. In this paper, we describe several sensors and techniques to measure, validate and investigate the relationship between cultural acceptance and eliciting universal expressions of affect using movie clips. For emotion elicitation, a standardised list of English-language clips and an initial set of Arabic video clips are used for comparison. For validation, bio-signal devices are used to measure the physiological and behavioural responses associated with emotional stimuli. These responses are measured from 29 subjects of Arabic background while they watch the selected clips. For classifying the six emotions, a multiclass (six-class) SVM using the physiological and behavioural measures as input achieves a higher recognition rate for emotions elicited by the Arabic video clips (avg. 60%) than by the English video clips (avg. 52%). These results suggest that video clips from the subjects' own culture are more likely to elicit the target emotions. In addition to measuring physiological and behavioural responses, an online survey was carried out to evaluate the effectiveness of the selected video clips in eliciting the target emotions. The survey, with an average of 220 respondents per clip, supported these findings.
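
The abstract describes a six-class SVM over physiological and behavioural measures. As an illustration only, the following minimal sketch (in Python with scikit-learn, using randomly generated stand-in features and labels rather than the authors' data or pipeline) shows how such a multiclass classifier could be trained and evaluated:

    # Minimal sketch of a six-class SVM over physiological/behavioural features.
    # The feature matrix X and labels y are hypothetical stand-ins, not the
    # authors' actual recordings or feature set.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(174, 20))    # e.g., 29 subjects x 6 clips, 20 features each
    y = rng.integers(0, 6, size=174)  # one of six emotion labels (0..5)

    # SVC handles the multiclass case internally via a one-vs-one scheme;
    # features are standardised before fitting.
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"Mean cross-validated accuracy: {scores.mean():.2f}")

With random stand-in labels the accuracy hovers around chance (~17%); the point of the sketch is only the structure of a multiclass SVM evaluation, not the reported recognition rates.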
Keywords: affective computing; cross-culture; emotion elicitation; emotion recognition; physiological responses; emotion stimuli

MDPI and ACS Style

Alghowinem, S.; Goecke, R.; Wagner, M.; Alwabil, A. Evaluating and Validating Emotion Elicitation Using English and Arabic Movie Clips on a Saudi Sample. Sensors 2019, 19, 2218.

AMA Style

Alghowinem S, Goecke R, Wagner M, Alwabil A. Evaluating and Validating Emotion Elicitation Using English and Arabic Movie Clips on a Saudi Sample. Sensors. 2019; 19(10):2218.

Chicago/Turabian Style

Alghowinem, Sharifa, Roland Goecke, Michael Wagner, and Areej Alwabil. 2019. "Evaluating and Validating Emotion Elicitation Using English and Arabic Movie Clips on a Saudi Sample" Sensors 19, no. 10: 2218.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
