Special Issue "Sensors for Affective Computing and Sentiment Analysis"

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Physical Sensors".

Deadline for manuscript submissions: 30 September 2019.

Special Issue Editors

Guest Editor
Prof. Dr. Antonio Fernández-Caballero

Universidad de Castilla-La Mancha, Ciudad Real, Spain
Interests: image processing; cognitive vision; robot vision; multisensor fusion; multimodal interfaces; ambient intelligence
Guest Editor
Dr. Arturo Martínez-Rodrigo

Universidad de Castilla-La Mancha, Cuenca, Spain
Interests: signal processing; physiological sensors; sensors networks; Internet of Things; embedded systems

Special Issue Information

Dear Colleagues,

Emotions are essential in human–human communication, cognition, learning, and rational decision-making. However, human–machine interfaces (HMIs) are still unable to understand human sentiments and react accordingly. To endow HMIs with the emotional intelligence they lack, affective computing develops artificial intelligence through the analysis of affects and emotions, so that systems and sensors can recognize, interpret, process, and simulate human sentiments.

Nowadays, the evaluation of electrophysiological signals plays a key role in advancing towards that goal, as these signals are an objective representation of an individual's emotional state. Hence, interest in physiological variables such as the electroencephalogram, the electrocardiogram, and electrodermal activity, among many others, has grown notably in the field of affective state detection. Furthermore, emotions have also been widely identified by assessing the speech characteristics and facial gestures of people under different sentimental conditions.
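As a minimal illustration of how such signals are typically turned into affect-related features, the sketch below computes two classic heart-rate-variability (HRV) measures from a series of ECG R-peak times. The function name, the simulated input, and the choice of features are illustrative assumptions, not taken from any specific study in this issue.

```python
import numpy as np

def hrv_features(r_peaks_s: np.ndarray) -> dict:
    """Two classic HRV measures, often used as affect correlates.

    r_peaks_s: times of ECG R-peaks, in seconds (hypothetical input).
    """
    rr = np.diff(r_peaks_s) * 1000.0              # RR intervals in ms
    sdnn = rr.std(ddof=1)                         # overall variability
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))    # beat-to-beat variability
    return {"mean_hr_bpm": 60000.0 / rr.mean(),
            "sdnn_ms": sdnn,
            "rmssd_ms": rmssd}

# Simulated resting recording: ~60 bpm with mild variability.
rng = np.random.default_rng(0)
peaks = np.cumsum(np.r_[0.0, rng.normal(1.0, 0.05, 120)])
print(hrv_features(peaks))
```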

This Special Issue, "Sensors for Affective Computing and Sentiment Analysis", is intended as a venue for researchers interested in the development and/or use of physical sensors in areas of expertise related to sentiment analysis, whether they are starting out or already working on the topic. Hence, manuscripts introducing new proposals based on physical sensors for the analysis of physiological measures, facial recognition, speech recognition, or natural language processing are welcome.

Prof. Dr. Antonio Fernández-Caballero
Dr. Arturo Martínez-Rodrigo
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Sensors for affective computing
  • Sensors for sentiment analysis
  • Sensors for ubiquitous and pervasive computing
  • Sensors for ambient intelligence
  • Sensors for ambient assisted living
  • Sensors for physiological computing
  • Internet of Things sensors
  • Sensors for natural language processing
  • Brain–computer interfaces
  • Biofeedback and neurofeedback systems
  • Wearable systems
  • Applications and case studies

Published Papers (4 papers)


Research

Open Access Article
Game-Calibrated and User-Tailored Remote Detection of Stress and Boredom in Games
Sensors 2019, 19(13), 2877; https://doi.org/10.3390/s19132877
Received: 28 May 2019 / Revised: 21 June 2019 / Accepted: 25 June 2019 / Published: 28 June 2019
Abstract
Emotion detection based on computer vision and remote extraction of user signals commonly relies on stimuli where users have a passive role with limited possibilities for interaction or emotional involvement, e.g., images and videos. Predictive models are also trained on a group level, which potentially excludes or dilutes key individualities of users. We present a non-obtrusive, multifactorial, user-tailored emotion detection method based on remotely estimated psychophysiological signals. A neural network learns the emotional profile of a user during the interaction with calibration games, a novel game-based emotion elicitation material designed to induce emotions while accounting for particularities of individuals. We evaluate our method in two experiments (n = 20 and n = 62) with a mean classification accuracy of 61.6%, which is statistically significantly better than chance-level classification. Our approach and its evaluation present unique circumstances: our model is trained on one dataset (calibration games) and tested on another (evaluation game), while preserving the natural behavior of subjects and using remote acquisition of signals. Results of this study suggest our method is feasible and an initiative to move away from questionnaires and physical sensors towards a non-obtrusive, remote-based solution for detecting emotions in a context involving more naturalistic user behavior and games.
(This article belongs to the Special Issue Sensors for Affective Computing and Sentiment Analysis)
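To make the cross-dataset setup concrete, here is a minimal sketch of training a small neural network on features from calibration games and applying it to an unseen evaluation game. The feature dimensions, labels, and random data are hypothetical placeholders; the actual architecture and signals are those described in the paper.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical per-user data: rows are time windows of remotely
# estimated psychophysiological features; labels are the elicited
# states (0 = boredom, 1 = stress) from the calibration games.
rng = np.random.default_rng(0)
X_calibration = rng.normal(size=(200, 8))
y_calibration = rng.integers(0, 2, size=200)
X_evaluation = rng.normal(size=(50, 8))    # unseen evaluation game

# Train on the calibration dataset, predict on the evaluation game,
# mirroring the cross-dataset evaluation the abstract describes.
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(16,), max_iter=500))
model.fit(X_calibration, y_calibration)
predicted_states = model.predict(X_evaluation)
print(predicted_states[:10])
```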

Open Access Article
Predicting Group Contribution Behaviour in a Public Goods Game from Face-to-Face Communication
Sensors 2019, 19(12), 2786; https://doi.org/10.3390/s19122786
Received: 9 May 2019 / Revised: 12 June 2019 / Accepted: 17 June 2019 / Published: 21 June 2019
Abstract
Experimental economic laboratories run many studies to test theoretical predictions against actual human behaviour, including public goods games. In this experiment, participants in a group have the option to invest money in a public account or to keep it. All the invested money is multiplied and then evenly distributed. This structure incentivizes free riding, resulting in contributions to the public good declining over time. Face-to-face communication (FFC) diminishes free riding and thus positively affects contribution behaviour, but how it does so has remained mostly unknown. In this paper, we investigate two communication channels, aiming to explain what promotes cooperation and discourages free riding. Firstly, the facial expressions of the group in the 3-minute FFC videos are automatically analysed to predict the group behaviour towards the end of the game. The proposed automatic facial expression analysis approach uses a new group activity descriptor and utilises random forest classification. Secondly, the contents of FFC are investigated by categorising strategy-relevant topics and using meta-data. The results show that it is possible to predict whether the group will fully contribute to the end of the games based on facial expression data from three minutes of FFC, but deeper understanding requires a larger dataset. Facial expression analysis and content analysis found that FFC and talking until the very end had a significant, positive effect on the contributions.
(This article belongs to the Special Issue Sensors for Affective Computing and Sentiment Analysis)
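A compact sketch of the classification step described above might look as follows. The descriptor dimensions, group count, and labels are hypothetical stand-ins for the paper's group activity descriptor; only the use of a random forest follows the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical group-level descriptor: e.g., statistics of facial
# expression intensities pooled over the 3-minute FFC video. The
# label marks whether the group fully contributes to the end (1)
# or not (0). Shapes and values are illustrative only.
rng = np.random.default_rng(1)
X_groups = rng.normal(size=(60, 12))     # 60 groups, 12-dim descriptor
y_full_contribution = rng.integers(0, 2, size=60)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X_groups, y_full_contribution, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```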

Open Access Article
Evaluating and Validating Emotion Elicitation Using English and Arabic Movie Clips on a Saudi Sample
Sensors 2019, 19(10), 2218; https://doi.org/10.3390/s19102218
Received: 18 April 2019 / Revised: 5 May 2019 / Accepted: 11 May 2019 / Published: 14 May 2019
Abstract
With the advancement of technology in both hardware and software, estimating human affective states has become possible. Currently, movie clips are used, as they are a widely accepted method of eliciting emotions in a replicable way. However, cultural differences might influence the effectiveness of some video clips in eliciting the target emotions. In this paper, we describe several sensors and techniques to measure, validate and investigate the relationship between cultural acceptance and eliciting universal expressions of affect using movie clips. For emotion elicitation, a standardised list of English-language clips, as well as an initial set of Arabic video clips, is used for comparison. For validation, bio-signal devices are used to measure the physiological and behavioural responses associated with emotional stimuli. Physiological and behavioural responses are measured from 29 subjects of Arabic background while watching the selected clips. For six-emotion classification, a multiclass (six-class) SVM classifier using the physiological and behavioural measures as input yields a higher recognition rate for emotions elicited by the Arabic video clips (avg. 60%) than by the English video clips (avg. 52%). These results might reflect that using video clips from the subjects' culture is more likely to elicit the target emotions. Besides measuring the physiological and behavioural responses, an online survey was carried out to evaluate the effectiveness of the selected video clips in eliciting the target emotions. The online survey, with on average 220 respondents per clip, supported the findings.
(This article belongs to the Special Issue Sensors for Affective Computing and Sentiment Analysis)
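The six-class SVM step could be sketched roughly as below. Feature contents, sample counts, and the emotion label set are assumptions made for illustration; the paper defines the actual physiological and behavioural measures.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

# Hypothetical per-viewing features (e.g., heart rate, skin conductance,
# behavioural measures); 29 subjects x 6 clips is illustrative only.
rng = np.random.default_rng(2)
X = rng.normal(size=(174, 10))
y = rng.integers(0, len(EMOTIONS), size=174)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# SVC handles the six classes via one-vs-one decision functions.
svm = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
svm.fit(X_tr, y_tr)
print(f"held-out accuracy: {svm.score(X_te, y_te):.2f}")
```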

Open Access Article
Using Eye Tracking to Assess Gaze Concentration in Meditation
Sensors 2019, 19(7), 1612; https://doi.org/10.3390/s19071612
Received: 13 February 2019 / Revised: 19 March 2019 / Accepted: 28 March 2019 / Published: 3 April 2019
Abstract
An important component of Heart Chan Meditation is gaze concentration training. Here, we determine whether eye tracking can be used to assess gaze concentration ability. Study participants (n = 306) were requested to focus their gaze on the innermost of three concentric circles for 1 min while their eye movements were recorded. Results suggest that participants with high scores on gaze concentration accuracy and precision had lower systolic blood pressure and higher sleep quality, suggesting that eye tracking may be effective to assess and train gaze concentration within Heart Chan Meditation.
(This article belongs to the Special Issue Sensors for Affective Computing and Sentiment Analysis)
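Gaze accuracy and precision in such a fixation task are commonly defined as the mean offset from the target and the dispersion around the gaze centroid, respectively. The sketch below follows those common definitions; the sampling rate, screen coordinates, and simulated data are assumptions, not parameters from the paper.

```python
import numpy as np

def gaze_concentration(gaze_xy: np.ndarray, target_xy: np.ndarray):
    """Score one fixation trial from eye-tracker samples (pixels).

    accuracy: mean distance of gaze from the target (lower is better)
    precision: RMS distance of samples from their own centroid
    """
    accuracy = np.linalg.norm(gaze_xy - target_xy, axis=1).mean()
    centroid = gaze_xy.mean(axis=0)
    precision = np.sqrt(
        np.mean(np.sum((gaze_xy - centroid) ** 2, axis=1)))
    return accuracy, precision

# Hypothetical trial: 1 min of samples at 60 Hz, target at screen centre.
rng = np.random.default_rng(3)
target = np.array([960.0, 540.0])
samples = rng.normal(loc=target, scale=15.0, size=(3600, 2))
print(gaze_concentration(samples, target))
```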
