Search Results (2)

Search Parameters:
Keywords = facial animacy

27 pages, 8263 KiB  
Article
How the Degree of Anthropomorphism of Human-like Robots Affects Users’ Perceptual and Emotional Processing: Evidence from an EEG Study
by Jinchun Wu, Xiaoxi Du, Yixuan Liu, Wenzhe Tang and Chengqi Xue
Sensors 2024, 24(15), 4809; https://doi.org/10.3390/s24154809 - 24 Jul 2024
Cited by 5 | Viewed by 3704
Abstract
Anthropomorphized robots are increasingly integrated into human social life, playing vital roles across various fields. This study aimed to elucidate the neural dynamics underlying users’ perceptual and emotional responses to robots with varying levels of anthropomorphism. We investigated event-related potentials (ERPs) and event-related spectral perturbations (ERSPs) elicited while participants viewed, perceived, and rated the affection of robots with low (L-AR), medium (M-AR), and high (H-AR) levels of anthropomorphism. EEG data were recorded from 42 participants. Results revealed that H-AR induced a more negative N1 and increased frontal theta power, but decreased P2 in early time windows. Conversely, M-AR and L-AR elicited larger P2 compared to H-AR. In later time windows, M-AR generated greater late positive potential (LPP) and enhanced parietal-occipital theta oscillations than H-AR and L-AR. These findings suggest distinct neural processing phases: early feature detection and selective attention allocation, followed by later affective appraisal. Early detection of facial form and animacy, with P2 reflecting higher-order visual processing, appeared to correlate with anthropomorphism levels. This research advances the understanding of emotional processing in anthropomorphic robot design and provides valuable insights for robot designers and manufacturers regarding emotional and feature design, evaluation, and promotion of anthropomorphic robots.
(This article belongs to the Section Biomedical Sensors)

16 pages, 3626 KiB  
Article
Effects of Robot Animacy and Emotional Expressions on Perspective-Taking Abilities: A Comparative Study across Age Groups
by Xucong Hu and Song Tong
Behav. Sci. 2023, 13(9), 728; https://doi.org/10.3390/bs13090728 - 31 Aug 2023
Cited by 5 | Viewed by 2393
Abstract
The global population is inevitably aging due to increased life expectancy and declining birth rates, leading to an amplified demand for innovative social and healthcare services. One promising avenue is the introduction of companion robots. These robots are designed to provide physical assistance as well as emotional support and companionship, necessitating effective human–robot interaction (HRI). This study explores the role of cognitive empathy within HRI, focusing on the influence of robot facial animacy and emotional expressions on perspective-taking abilities—a key aspect of cognitive empathy—across different age groups. To this end, a director task involving 60 participants (30 young and 30 older adults) with varying degrees of robot facial animacy (0%, 50%, 100%) and emotional expressions (happy, neutral) was conducted. The results revealed that older adults displayed enhanced perspective-taking with higher animacy faces. Interestingly, while happiness on high-animacy faces improved perspective-taking, the same expression on low-animacy faces reduced it. These findings highlight the importance of considering facial animacy and emotional expressions in designing companion robots for older adults to optimize user engagement and acceptance. The study’s implications are pertinent to the design and development of socially effective service robots, particularly for the aging population.
