Affective Computing

A special issue of Multimodal Technologies and Interaction (ISSN 2414-4088).

Deadline for manuscript submissions: closed (30 November 2018)

Special Issue Editor


Prof. Dian Tjondronegoro
Guest Editor
Department of Business Strategy and Innovation, Griffith Business School, Griffith University, Queensland 4215, Australia
Interests: mobile and pervasive systems; e-health; affective computing; multimedia analysis; interaction design

Special Issue Information

Dear Colleagues,

With the advancement of artificial intelligence, machines are increasingly expected to have the cognitive and affective abilities to understand and respond to human emotions. Affective computing requires a multi-disciplinary approach to develop novel techniques, tools, and technologies that enable machines to understand human behavior, and to investigate their benefits across a wide range of applications. Future systems and devices should be more fun, engaging, empathetic, and natural, so that they can form positive, productive, and long-lasting partnerships with their users.

The purpose of this Special Issue is to bring together state-of-the-art achievements in affective computing. We encourage authors to submit original research articles, case studies, reviews, theoretical and critical perspectives, and viewpoint articles on topics including, but not limited to, the following:

  • Intelligent sensing of human emotions and affective states using multimodal cues from faces, body gestures, and physiological signals.
  • Extraction of emotional properties from multimedia data, including text, images, sound, and video (e.g., sentiment analysis of text, emotion-related visual attributes), to support enriched captioning and complex queries.
  • New datasets, methods, and approaches for big data that can support deep learning approaches to automatically model and analyze emotions from humans and documents.
  • Theoretical and computational models that describe how human cognitive and decision-making processes account for the influence of emotions.
  • Affective computing applications in areas such as healthcare, customer experience measurement, multimedia retrieval, entertainment, and ambient intelligence.

Prof. Dian Tjondronegoro
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Multimodal Technologies and Interaction is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Sensing and analysis of human emotions
  • Affect-based multimedia analysis and retrieval
  • Big data of emotion corpora
  • Affect-based decision-making process
  • Affective computing applications

Published Papers

There are no accepted submissions to this Special Issue at this time.