Towards Socially Intelligent Robots

A special issue of Robotics (ISSN 2218-6581). This special issue belongs to the section "Humanoid and Human Robotics".

Deadline for manuscript submissions: closed (31 March 2024)

Special Issue Editors

Guest Editor
Electronics and Telecommunications Research Institute, Daejeon, Korea
Interests: social intelligence; software frameworks; continual learning; perceptual continuity for social robots

Guest Editor
Robotics Group, Department of Electrical and Computer Engineering, University of Auckland, Auckland, New Zealand
Interests: social robotics; human–robot interaction

Guest Editor
Department of Mechanical & Industrial Engineering, College of Engineering, Qatar University, Doha, Qatar
Interests: healthcare robotics; assistive and social robotics

Special Issue Information

Dear Colleagues, 

As robots become increasingly incorporated into human living spaces, social intelligence has become a concern that cuts across robotics sub-fields. Robots’ capabilities (manipulation, navigation, collaboration, etc.) can no longer be considered complete without careful consideration of their human awareness and social interaction.

For this Special Issue, we welcome submissions of recent research on the provision of social intelligence to robots across various applications, including manufacturing, logistics, retail, human care, entertainment, and education. Topics of interest include the following:

  • Novel approaches to utilizing artificial intelligence models for implementing social intelligence in robots.
  • Evaluation of the level of social intelligence in robots.
  • Effective methodologies for long-term user studies in social robotics.
  • Socially intelligent robots in the real world.
  • Software frameworks for socially intelligent robots.
  • Any topic related to socially intelligent robots.

Dr. Minsu Jang
Dr. Ho Seok Ahn
Dr. John-John Cabibihan
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website and proceeding to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Robotics is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Novel approaches to utilizing artificial intelligence models for implementing social intelligence in robots
  • Evaluation of the level of social intelligence in robots
  • Effective methodologies for long-term user studies in social robotics
  • Socially intelligent robots in the real world
  • Software frameworks for socially intelligent robots

Published Papers (2 papers)


Research

30 pages, 5439 KiB  
Article
Evaluating the Performance of Mobile-Convolutional Neural Networks for Spatial and Temporal Human Action Recognition Analysis
by Stavros N. Moutsis, Konstantinos A. Tsintotas, Ioannis Kansizoglou and Antonios Gasteratos
Robotics 2023, 12(6), 167; https://doi.org/10.3390/robotics12060167 - 08 Dec 2023
Abstract
Human action recognition is a computer vision task that identifies how a person or a group acts in a video sequence. Various methods that rely on deep-learning techniques, such as two- or three-dimensional convolutional neural networks (2D-CNNs, 3D-CNNs), recurrent neural networks (RNNs), and vision transformers (ViT), have been proposed to address this problem over the years. Motivated by the high complexity of most CNNs used for human action recognition and by the need for implementations on mobile platforms with restricted computational resources, in this article we conduct an extensive evaluation of the performance of five lightweight architectures. In particular, we examine how the mobile-oriented CNNs (viz., ShuffleNet-v2, EfficientNet-b0, MobileNet-v3, and GhostNet) perform in spatial analysis compared to a recent tiny ViT, namely EVA-02-Ti, and to a computationally heavier model, ResNet-50. Our models, previously trained on ImageNet and BU101, are evaluated for classification accuracy on HMDB51, UCF101, and six classes of the NTU dataset. Average-score, max-score, and voting fusion approaches are computed over three and fifteen RGB frames of each video, and two different dropout rates were assessed during training. Finally, temporal analysis is examined via multiple types of RNNs that employ features extracted by the trained networks. Our results reveal that EfficientNet-b0 and EVA-02-Ti surpass the other mobile CNNs, achieving comparable or superior performance to ResNet-50.
(This article belongs to the Special Issue Towards Socially Intelligent Robots)
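
The frame-fusion idea summarized in the abstract (classify a few RGB frames with a lightweight CNN, then combine the per-frame scores by averaging, taking the max, or voting) can be illustrated with a short sketch. The code below is not the authors' implementation; it assumes PyTorch/torchvision, uses MobileNet-v3 as a stand-in for any of the mobile backbones, and treats the number of classes and the input frames as placeholders.

```python
# Hypothetical sketch of per-frame spatial classification with score fusion.
import torch
import torchvision

NUM_CLASSES = 51  # e.g., HMDB51

# Lightweight ImageNet-pretrained backbone with its head replaced for action classes.
model = torchvision.models.mobilenet_v3_small(weights="IMAGENET1K_V1")
model.classifier[-1] = torch.nn.Linear(model.classifier[-1].in_features, NUM_CLASSES)
model.eval()

def classify_clip(frames: torch.Tensor, fusion: str = "average") -> int:
    """frames: (N, 3, H, W) tensor of N sampled, preprocessed RGB frames from one video."""
    with torch.no_grad():
        probs = torch.softmax(model(frames), dim=1)   # (N, NUM_CLASSES)
    if fusion == "average":                           # mean score over frames
        return int(probs.mean(dim=0).argmax())
    if fusion == "max":                               # max score over frames
        return int(probs.max(dim=0).values.argmax())
    votes = probs.argmax(dim=1)                       # majority vote per frame
    return int(votes.bincount(minlength=NUM_CLASSES).argmax())

# Example: fuse predictions from 15 sampled 224x224 frames of a single clip.
pred = classify_clip(torch.rand(15, 3, 224, 224), fusion="average")
```

Swapping the backbone for ShuffleNet-v2, EfficientNet-b0, GhostNet, or a tiny ViT only changes the model-construction line; the fusion logic stays the same.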

13 pages, 4197 KiB  
Article
Heart Rate as a Predictor of Challenging Behaviours among Children with Autism from Wearable Sensors in Social Robot Interactions
by Ahmad Qadeib Alban, Ahmad Yaser Alhaddad, Abdulaziz Al-Ali, Wing-Chee So, Olcay Connor, Malek Ayesh, Uvais Ahmed Qidwai and John-John Cabibihan
Robotics 2023, 12(2), 55; https://doi.org/10.3390/robotics12020055 - 01 Apr 2023
Cited by 6
Abstract
Children with autism face challenges in various skills (e.g., communication and social skills) and may exhibit challenging behaviours. These behaviours are difficult for their families, therapists, and caregivers to manage, especially during therapy sessions. In this study, we investigated the potential of several machine learning techniques and data modalities, acquired using wearable sensors worn by children with autism during their interactions with social robots and toys, to detect challenging behaviours. Each child wore a wearable device that collected data, and video annotations of the sessions were used to identify the occurrence of challenging behaviours. Extracted time-domain features (i.e., mean, standard deviation, min, and max) in conjunction with four machine learning techniques were considered to detect challenging behaviours. Changes in heart rate variability (HRV) were also investigated. The XGBoost algorithm achieved the best performance (an accuracy of 99%). Additionally, physiological features outperformed the kinetic ones, with heart rate being the main contributing feature in the prediction performance. One HRV parameter (RMSSD) was found to correlate with the occurrence of challenging behaviours. This work highlights the importance of developing tools and methods to detect challenging behaviours among children with autism during aided sessions with social robots.
(This article belongs to the Special Issue Towards Socially Intelligent Robots)
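
The feature-and-classifier pipeline summarized in the abstract (simple time-domain statistics computed over windows of a wearable signal, fed to an XGBoost classifier) can be illustrated as follows. This is a minimal, hypothetical sketch rather than the study's pipeline: the window length, the synthetic heart-rate trace, and the labels are placeholder assumptions introduced only for the example.

```python
# Hypothetical sketch: windowed (mean, std, min, max) features + XGBoost.
import numpy as np
from xgboost import XGBClassifier

def window_features(signal: np.ndarray, win: int = 50) -> np.ndarray:
    """Stack (mean, std, min, max) for each non-overlapping window of the signal."""
    n_windows = len(signal) // win
    windows = signal[: n_windows * win].reshape(n_windows, win)
    return np.column_stack(
        [windows.mean(axis=1), windows.std(axis=1), windows.min(axis=1), windows.max(axis=1)]
    )

# Synthetic stand-ins for a recorded heart-rate trace and per-window annotations.
rng = np.random.default_rng(0)
heart_rate = rng.normal(90, 10, size=5000)
X = window_features(heart_rate)
y = rng.integers(0, 2, size=len(X))  # 1 = challenging behaviour annotated in the window

clf = XGBClassifier(n_estimators=200, max_depth=4, eval_metric="logloss")
clf.fit(X, y)
print(clf.predict(X[:5]))
```

In practice, the labels would come from the video annotations and the features from the actual wearable recordings rather than the synthetic data used here.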
