
Multisensory AI for Human-Robot Interaction

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Intelligent Sensors".

Deadline for manuscript submissions: 25 August 2025

Special Issue Editors


Dr. Diego R. Faria
Guest Editor
Robotics & Intelligent Adaptive Systems, University of Hertfordshire, Hatfield AL10 9AB, UK
Interests: human–robot interaction; grasping and dexterous manipulation; artificial perception systems/autonomous systems; pattern recognition

Dr. Frank Förster
Guest Editor
Adaptive Systems Research Group, School of Computer Science, University of Hertfordshire, Hatfield AL10 9AB, UK
Interests: intersection of developmental and cognitive robotics; human–robot interaction

Special Issue Information

Dear Colleagues,

Sensors invites submissions for a Special Issue on the cutting-edge advancements in multisensory artificial intelligence (AI) with a particular focus on their applications in human–robot interaction (HRI). This Special Issue aims to explore the latest research and development in this rapidly evolving field, emphasizing the integration of diverse sensory modalities, affective computing, and neuro-inspired approaches to create more intuitive, empathetic, and effective interactions between humans and robots.

Topics of Interest:
We welcome original research articles, reviews, and perspectives on a wide range of topics, including but not limited to the following:

  • Multimodal sensor fusion for HRI: novel methods for integrating data from various sensors (e.g., vision, audio, touch, physiological signals) to enhance robot perception and understanding of human behavior, emotions, and intentions (a minimal late-fusion sketch follows this list).
  • Neuro-affective computing in HRI: development and evaluation of computational models that can recognize, interpret, and respond to human affective states, including emotions, moods, and cognitive load.
  • Large language models (LLMs) for HRI: exploration of the potential of LLMs in facilitating natural language communication, understanding complex social cues, and generating contextually appropriate responses in HRI scenarios.
  • Adaptive and personalized HRI: design and implementation of intelligent systems that can adapt their behavior and communication strategies based on individual user preferences, emotional states, and learning styles.
  • Ethical considerations in neuro-affective HRI: investigation of the ethical implications of using neuro-affective data and AI in HRI, including issues related to privacy, autonomy, and potential biases.
  • Applications of multisensory AI in HRI: case studies and real-world applications of multisensory AI and neuro-affective computing in various domains, such as healthcare, education, therapy, and social robotics.
  • Empathetic robots: development of robots capable of understanding and responding appropriately to human emotions, fostering trust and rapport in HRI.
  • Socially assistive robotics: design and evaluation of robots that can provide social and emotional support to individuals with special needs, such as children with autism or the elderly.
  • Deep reinforcement learning for HRI: application of deep RL techniques to enable robots to learn complex social behaviors and adapt to dynamic environments through interaction with humans.
  • Robot behavior adaptation based on human stimuli: development of algorithms and models that allow robots to adjust their behavior in real-time based on human verbal and non-verbal cues.
  • Child–robot interaction: investigation of the unique challenges and opportunities in designing robots that can interact effectively with children, considering their developmental needs and cognitive abilities.
  • Companion robots for the elderly: exploration of the potential of robots to provide companionship, assistance, and emotional support to older adults, addressing issues of social isolation and loneliness.
  • Speech emotion recognition for HRI communication: development of robust speech emotion recognition systems that can accurately interpret human emotions from vocal cues, enabling more natural and empathetic communication with robots.
  • Natural language processing for HRI: advancements in NLP techniques for understanding and generating human-like language in HRI, including dialogue systems, sentiment analysis, and intent recognition.
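
To make the first topic above concrete, here is a minimal late-fusion sketch in Python: each modality-specific classifier is assumed to output a probability distribution over a shared set of emotion classes, and the distributions are combined by a confidence-weighted average. The label set, the modality weights, and the late_fusion helper are illustrative assumptions for this call, not a reference implementation of any particular system.

    # Minimal late-fusion sketch: combine per-modality emotion posteriors.
    # All names here (the label set, the modality weights, late_fusion) are
    # hypothetical placeholders, not taken from any specific paper or library.
    import numpy as np

    EMOTIONS = ["happy", "neutral", "sad", "angry"]  # assumed label set

    def late_fusion(posteriors, weights):
        """Confidence-weighted average of per-modality class posteriors.

        posteriors: modality name -> probability vector over EMOTIONS
        weights:    modality name -> confidence weight (need not sum to 1)
        """
        total = sum(weights[m] for m in posteriors)
        fused = sum(weights[m] * posteriors[m] for m in posteriors) / total
        return fused / fused.sum()  # renormalize to a proper distribution

    # Example: a vision model leaning "happy" and an audio model leaning
    # "neutral"; fusion trades them off by per-modality confidence.
    vision_p = np.array([0.60, 0.25, 0.10, 0.05])
    audio_p = np.array([0.20, 0.55, 0.15, 0.10])
    fused = late_fusion({"vision": vision_p, "audio": audio_p},
                        {"vision": 0.7, "audio": 0.3})
    print(dict(zip(EMOTIONS, fused.round(3))))

Late fusion of this kind is only one design point; submissions may equally target early (feature-level) fusion, attention-based fusion, or learned fusion architectures.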

Submission Guidelines:
Manuscripts should be prepared according to the journal's guidelines and submitted through the online submission system. Please indicate in your cover letter that your submission is intended for the Special Issue on Multisensory AI for Human–Robot Interaction.

Dr. Diego R. Faria
Dr. Frank Förster
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • human-robot interaction
  • multisensory AI
  • sensors
  • socially assistive robotics
  • affective robotics
  • artificial perception for HRI
  • deep reinforcement learning for HRI
  • LLMs for HRI
  • adaptation for HRI

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found on the MDPI website.

Published Papers (1 paper)


Research

28 pages, 9455 KiB  
Article
Advancing Emotionally Aware Child–Robot Interaction with Biophysical Data and Insight-Driven Affective Computing
by Diego Resende Faria, Amie Louise Godkin and Pedro Paulo da Silva Ayrosa
Sensors 2025, 25(4), 1161; https://doi.org/10.3390/s25041161 - 14 Feb 2025
Abstract
This paper investigates the integration of affective computing techniques using biophysical data to advance emotionally aware machines and enhance child–robot interaction (CRI). By leveraging interdisciplinary insights from neuroscience, psychology, and artificial intelligence, the study focuses on creating adaptive, emotion-aware systems capable of dynamically recognizing and responding to human emotional states. Through a real-world CRI pilot study involving the NAO robot, this research demonstrates how facial expression analysis and speech emotion recognition can be employed to detect and address negative emotions in real time, fostering positive emotional engagement. The emotion recognition system combines handcrafted and deep learning features for facial expressions, achieving an 85% classification accuracy during real-time CRI, while speech emotions are analyzed using acoustic features processed through machine learning models with an 83% accuracy rate. Offline evaluation of the combined emotion dataset using a Dynamic Bayesian Mixture Model (DBMM) achieved a 92% accuracy for facial expressions, and the multilingual speech dataset yielded 98% accuracy for speech emotions using the DBMM ensemble. Observations from psychological and technological aspects, coupled with statistical analysis, reveal the robot’s ability to transition negative emotions into neutral or positive states in most cases, contributing to emotional regulation in children. This work underscores the potential of emotion-aware robots to support therapeutic and educational interventions, particularly for pediatric populations, while setting a foundation for developing personalized and empathetic human–machine interactions. These findings demonstrate the transformative role of affective computing in bridging the gap between technological functionality and emotional intelligence across diverse domains.
(This article belongs to the Special Issue Multisensory AI for Human-Robot Interaction)
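
For readers unfamiliar with the Dynamic Bayesian Mixture Model (DBMM) ensemble mentioned in the abstract above, the sketch below illustrates the general idea in simplified form: posteriors from several base classifiers are mixed with weights that favor more confident (lower-entropy) classifiers, and the fused belief is carried across time steps as a prior. The weighting scheme and update rule shown are simplified assumptions for illustration, not the authors' exact formulation.

    # Simplified DBMM-style ensemble sketch (illustrative, not the paper's code).
    import numpy as np

    def entropy_weights(posteriors):
        """Weight base classifiers inversely to the entropy of their output."""
        ents = np.array([-(p * np.log(p + 1e-12)).sum() for p in posteriors])
        w = 1.0 / (ents + 1e-12)
        return w / w.sum()

    def dbmm_step(prior, posteriors):
        """One time step: belief_t ∝ prior_{t-1} · Σ_i w_i P_i(class | obs_t)."""
        w = entropy_weights(posteriors)
        mixture = sum(wi * p for wi, p in zip(w, posteriors))
        belief = prior * mixture
        return belief / belief.sum()

    # Two hypothetical base classifiers observed over two time frames.
    belief = np.full(4, 0.25)  # uniform initial belief over 4 emotion classes
    frames = [
        [np.array([0.7, 0.1, 0.1, 0.1]), np.array([0.5, 0.3, 0.1, 0.1])],
        [np.array([0.6, 0.2, 0.1, 0.1]), np.array([0.4, 0.4, 0.1, 0.1])],
    ]
    for frame in frames:
        belief = dbmm_step(belief, frame)
    print(belief.round(3))  # belief concentrates on the first class

While highly simplified, this captures the core DBMM idea of mixing classifier confidences and propagating belief over time.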