
Emotion Sensing and Robotic Emotional Intelligence

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Intelligent Sensors".

Deadline for manuscript submissions: 1 August 2024 | Viewed by 2162

Special Issue Editors


Guest Editor
School of Information Technology, Deakin University, Melbourne, Australia
Interests: intelligent robotics; natural language processing; data mining and knowledge discovery; pattern recognition; computer vision; semi- and unsupervised learning; fuzzy computation; cyber-physical systems and the Internet of Things; affective computing; software testing, verification and validation

Guest Editor
CSIRO, Research Way, Clayton, VIC 3168, Australia
Interests: emotion recognition; facial expression; manifold learning

Guest Editor
Computer Science and Information Technology, La Trobe University, Melbourne, VIC 3086, Australia
Interests: healthcare; big data; autism screening; medical AI; clinical decision support

Guest Editor
Nanyang Technological University, 50 Nanyang Avenue, Singapore 639798, Singapore
Interests: data science and engineering; privacy and trust enhanced computation; knowledge and cognitive computing; cloud and service computing

Special Issue Information

Dear Colleagues,

Machines are increasingly being given the intelligence to perform more complex tasks, but they remain poor at capturing human emotions. Although both sensing and emotion understanding are important for machine automation, the techniques developed for the latter lag far behind those for the former: existing emotion recognition methods are insufficiently accurate for real-world use, which is a serious problem for tasks that require human–machine interaction.

As sensing and artificial intelligence technologies advance, machines are interacting with humans increasingly naturally, recognising human emotions by analysing multiple sensing modes, such as vocalisations and facial expressions. However, machines remain poor at understanding human emotions in uncontrolled conditions because of the gap between laboratory settings and real-world "wild" environments. Existing emotionally intelligent robots can recognise simple emotions (contentment, joy, sadness and anger) by interpreting facial expressions and tone of voice, but they often fail to intelligently recognise and respond to human emotions. Facial expression analysis, the most important emotion measurement instrument, is extremely challenging due to subtle and transient movements in the foreground (people) and complex, noisy environments in the background. That is, in the real world, spontaneous human facial expressions may not be captured in an ideal way; for example, faces may appear obliquely, at a distance and poorly lit, rather than in a close-up frontal view with high-quality illumination.

Intelligent emotion recognition methods that can tackle the challenges of real-world wild scenarios, with varying head poses, illumination quality and subjects, will have a wide range of applications, such as home-care and business robots. This Special Issue seeks original technical and review papers on the latest technologies for vastly improving machine emotional intelligence in human–machine interaction, including but not limited to:

  • machine emotional intelligence for robotics
  • facial expression recognition in uncontrolled conditions
  • multimodal sensor data fusion for emotion recognition
  • emotional cues in speech/language/music
  • automatic emotional labelling for videos
  • user emotional preference prediction

Dr. Guangyan Huang
Dr. Najmeh Samadiani
Dr. Lianhua Chi
Dr. Chi-Hung Chi
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • robotic emotional intelligence
  • wild facial expression recognition
  • multimodal emotional cues
  • emotion sensing
  • automatic emotional labelling
  • user emotional preference

Published Papers (1 paper)

Research

13 pages, 1367 KiB  
Article
A Token Classification-Based Attention Model for Extracting Multiple Emotion–Cause Pairs in Conversations
by Soyeop Yoo and Okran Jeong
Sensors 2023, 23(6), 2983; https://doi.org/10.3390/s23062983 - 09 Mar 2023
Viewed by 1486
Abstract
People exchange emotions through conversations with others and provide different answers depending on the reasons for their emotions. During a conversation, it is important to find not only such emotions but also their cause. Emotion–cause pair extraction (ECPE) is a task used to determine emotions and their causes in a single pair within a text, and various studies have been conducted to accomplish ECPE tasks. However, existing studies have limitations in that some models conduct the task in two or more steps, whereas others extract only one emotion–cause pair for a given text. We propose a novel methodology for extracting multiple emotion–cause pairs simultaneously from a given conversation with a single model. Our proposed model is a token-classification-based emotion–cause pair extraction model, which applies the BIO (beginning–inside–outside) tagging scheme to efficiently extract multiple emotion–cause pairs in conversations. The proposed model showed the best performance on the RECCON benchmark dataset in comparative experiments with existing studies and was experimentally verified to efficiently extract multiple emotion–cause pairs in conversations.
(This article belongs to the Special Issue Emotion Sensing and Robotic Emotional Intelligence)
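To illustrate the BIO tagging scheme the abstract refers to, here is a minimal sketch (not the authors' code) of how BIO tags over conversation tokens can be decoded back into labeled spans. The label names `EMO` and `CAUSE` and the example sentence are illustrative assumptions, not taken from the paper.

```python
def decode_bio(tokens, tags):
    """Group tokens into (label, text) spans from BIO tags such as
    B-EMO / I-EMO / O: 'B-' begins a span, 'I-' continues it, 'O' is outside."""
    spans = []
    current_label, current_tokens = None, []
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            # A new span begins; flush any span in progress first.
            if current_label:
                spans.append((current_label, " ".join(current_tokens)))
            current_label, current_tokens = tag[2:], [token]
        elif tag.startswith("I-") and current_label == tag[2:]:
            # Continuation of the current span.
            current_tokens.append(token)
        else:
            # 'O' tag (or a stray 'I-') ends any span in progress.
            if current_label:
                spans.append((current_label, " ".join(current_tokens)))
            current_label, current_tokens = None, []
    if current_label:
        spans.append((current_label, " ".join(current_tokens)))
    return spans

tokens = ["I", "failed", "the", "exam", "so", "I", "feel", "sad"]
tags   = ["O", "B-CAUSE", "I-CAUSE", "I-CAUSE", "O", "O", "O", "B-EMO"]
print(decode_bio(tokens, tags))
# → [('CAUSE', 'failed the exam'), ('EMO', 'sad')]
```

Because a single tag sequence can contain several `B-` spans, one forward pass over the token classifications can yield multiple emotion and cause spans at once, which is the property the paper exploits to extract multiple emotion–cause pairs with a single model.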