Intelligent Human–Robot Interaction: 4th Edition

A special issue of Biomimetics (ISSN 2313-7673). This special issue belongs to the section "Locomotion and Bioinspired Robotics".

Deadline for manuscript submissions: 30 November 2025 | Viewed by 4951

Special Issue Editors


Prof. Dr. Jun Huang
Guest Editor
School of Information Engineering, Wuhan University of Technology, Wuhan 430070, China
Interests: intelligent remanufacturing technology; robotics and automation; human–machine collaboration; optical fiber sensing and intelligent sensing technology; mechanical equipment condition monitoring and fault diagnosis

Dr. Ruiya Li
Guest Editor
School of Mechanical and Electronic Engineering, Wuhan University of Technology, Wuhan 430070, China
Interests: fiber optic sensing; robot force/position hybrid control; special robots

Special Issue Information

Dear Colleagues,

Human–robot interaction (HRI) is a multi-disciplinary field that encompasses artificial intelligence, robotics, human–computer interaction, machine vision, natural language understanding, and social science. With the rapid development of AI and robotics, intelligent HRI has become an increasingly active research topic in robotics.

Intelligent HRI involves numerous scientific and technological challenges, particularly in human-centered aspects. These include human expectations of, attitudes towards, and perceptions of robots; the safety, acceptability, and comfort of robotic behaviors; and the physical closeness of robots to humans. Robots, in turn, are expected to understand human attention, intention, and even emotion, and to respond promptly with the support of AI. Achieving excellent intelligent HRI requires R&D in this multi- and cross-disciplinary field, with efforts expected in all relevant aspects, including actuation, sensing, perception, control, recognition, planning, learning, AI algorithms, intelligent I/O, integrated systems, and so on.

The aim of this Special Issue is to reveal new concepts, ideas, findings, and the latest achievements in both theoretical research and technical development in intelligent HRI. We invite scientists and engineers from robotics, AI, computer science, and other relevant disciplines to present the latest results of their research and development in the field of intelligent HRI. The topics of interest include, but are not limited to, the following:

  • Intelligent sensors and systems;
  • Bio-inspired sensing and learning;
  • Multi-modal perception and recognition;
  • Social robotics;
  • Autonomous behaviors of robots;
  • AI algorithms in robotics;
  • Collaboration between humans and robots;
  • Advances and future challenges of HRI.

Prof. Dr. Jun Huang
Dr. Ruiya Li
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Biomimetics is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2200 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • intelligent sensors and systems
  • bio-inspired sensing and learning
  • multi-modal perception and recognition
  • social robotics
  • autonomous behaviors of robots
  • AI algorithms in robotics
  • collaboration between humans and robots
  • advances and future challenges of HRI

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (4 papers)


Research


15 pages, 3926 KB  
Article
Robotic Removal and Collection of Screws in Collaborative Disassembly of End-of-Life Electric Vehicle Batteries
by Muyao Tan, Jun Huang, Xingqiang Jiang, Yilin Fang, Quan Liu and Duc Pham
Biomimetics 2025, 10(8), 553; https://doi.org/10.3390/biomimetics10080553 - 21 Aug 2025
Viewed by 283
Abstract
The recycling and remanufacturing of end-of-life (EoL) electric vehicle (EV) batteries are urgent challenges for a circular economy. Disassembly is crucial for handling EoL EV batteries due to their inherent uncertainties and instability. The human–robot collaborative disassembly of EV batteries as a semi-automated approach has been investigated and implemented to increase flexibility and productivity. Unscrewing is one of the primary operations in EV battery disassembly. This paper presents a new method for the robotic unfastening and collecting of screws, increasing disassembly efficiency and freeing human operators from dangerous, tedious, and repetitive work. The design inspiration for this method originated from how human operators unfasten and grasp screws when disassembling objects with an electric tool, along with the fusion of multimodal perception, such as vision and touch. A robotic disassembly system for screws is introduced, which involves a collaborative robot, an electric spindle, a screw collection device, a 3D camera, a six-axis force/torque sensor, and other components. The process of robotic unfastening and collecting screws is proposed by using position and force control. Experiments were carried out to validate the proposed method. The results demonstrate that the screws in EV batteries can be automatically identified, located, unfastened, and removed, indicating potential for the proposed method in the disassembly of EoL EV batteries. Full article
(This article belongs to the Special Issue Intelligent Human–Robot Interaction: 4th Edition)
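The unscrewing method in the abstract combines position and force control: the tool approaches the screw under position control and switches modes once contact is sensed. A minimal sketch of that mode-switching idea, assuming a hypothetical spring-like contact model and illustrative thresholds (this is not the authors' implementation):

```python
# Sketch of hybrid position/force engagement for a screw head: descend
# under position control until the force/torque reading crosses a contact
# threshold, at which point a force-controlled unfastening phase would begin.
# All parameter names and values are hypothetical.

def engage_screw(contact_height, step=0.5, start=10.0,
                 f_threshold=5.0, stiffness=20.0):
    """Descend in position mode; report where force mode would take over.

    Returns (height_at_mode_switch, descent_steps_taken).
    """
    z = start
    steps = 0
    while True:
        # simulated force sensor: spring-contact model below contact_height
        force = max(0.0, (contact_height - z) * stiffness)
        if force >= f_threshold:
            return z, steps          # contact detected -> switch to force control
        z -= step                    # position-controlled descent
        steps += 1
```

A real system would read the six-axis force/torque sensor instead of the simulated spring model and would then regulate a constant contact force while the spindle unfastens the screw.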

23 pages, 3542 KB  
Article
An Intuitive and Efficient Teleoperation Human–Robot Interface Based on a Wearable Myoelectric Armband
by Long Wang, Zhangyi Chen, Songyuan Han, Yao Luo, Xiaoling Li and Yang Liu
Biomimetics 2025, 10(7), 464; https://doi.org/10.3390/biomimetics10070464 - 15 Jul 2025
Viewed by 444
Abstract
Although artificial intelligence technologies have significantly enhanced autonomous robots’ capabilities in perception, decision-making, and planning, their autonomy may still fail when faced with complex, dynamic, or unpredictable environments. Therefore, it is critical to enable users to take over robot control in real-time and efficiently through teleoperation. The lightweight, wearable myoelectric armband, due to its portability and environmental robustness, provides a natural human–robot gesture interaction interface. However, current myoelectric teleoperation gesture control faces two major challenges: (1) poor intuitiveness due to visual-motor misalignment; and (2) low efficiency from discrete, single-degree-of-freedom control modes. To address these challenges, this study proposes an integrated myoelectric teleoperation interface. The interface integrates the following: (1) a novel hybrid reference frame aimed at effectively mitigating visual-motor misalignment; and (2) a finite state machine (FSM)-based control logic designed to enhance control efficiency and smoothness. Four experimental tasks were designed using different end-effectors (gripper/dexterous hand) and camera viewpoints (front/side view). Compared to benchmark methods, the proposed interface demonstrates significant advantages in task completion time, movement path efficiency, and subjective workload. This work demonstrates the potential of the proposed interface to significantly advance the practical application of wearable myoelectric sensors in human–robot interaction. Full article
(This article belongs to the Special Issue Intelligent Human–Robot Interaction: 4th Edition)
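The abstract's finite state machine (FSM)-based control logic maps discrete gesture events from the myoelectric armband to teleoperation modes. A toy sketch of such a machine, with hypothetical states and gesture labels (the paper's actual state set is not given here):

```python
# Toy FSM for gesture-driven teleoperation: each (state, gesture) pair
# maps to a next state; unrecognized events leave the state unchanged.
# States and gestures are illustrative, not taken from the paper.

TRANSITIONS = {
    ("IDLE", "fist"):       "TRANSLATE",   # clench to start moving the arm
    ("TRANSLATE", "open"):  "IDLE",        # open hand to stop
    ("TRANSLATE", "pinch"): "GRASP",       # pinch to operate the end-effector
    ("GRASP", "open"):      "IDLE",        # release and return to idle
}

def step(state, gesture):
    """Advance the FSM; unknown (state, gesture) pairs keep the state."""
    return TRANSITIONS.get((state, gesture), state)
```

Keeping the state on unknown events is one simple way to make the interface robust to misclassified gestures, which matters for the control smoothness the paper targets.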

22 pages, 5819 KB  
Article
Design of Adaptive LQR Control Based on Improved Grey Wolf Optimization for Prosthetic Hand
by Khaled Ahmed, Ayman A. Aly and Mohamed O. Elhabib
Biomimetics 2025, 10(7), 423; https://doi.org/10.3390/biomimetics10070423 - 30 Jun 2025
Viewed by 433
Abstract
Assistive technologies, particularly multi-fingered robotic hands (MFRHs), are critical for enhancing the quality of life for individuals with upper-limb disabilities. However, achieving precise and stable control of such systems remains a significant challenge. This study proposes an Improved Grey Wolf Optimization (IGWO)-tuned Linear Quadratic Regulator (LQR) to enhance the control performance of an MFRH. The MFRH was modeled using Denavit–Hartenberg kinematics and Euler–Lagrange dynamics, with micro-DC motors selected based on computed torque requirements. The LQR controller, optimized via IGWO to systematically determine weighting matrices, was benchmarked against PID and PID-PSO controllers under diverse input scenarios. For step input, the IGWO-LQR achieved a settling time of 0.018 s with zero overshoot for Joint 1, outperforming PID (settling time: 0.0721 s; overshoot: 6.58%) and PID-PSO (settling time: 0.042 s; overshoot: 2.1%). Similar improvements were observed across all joints, with Joint 3 recording an IAE of 0.001334 for IGWO-LQR versus 0.004695 for PID. Evaluations under square-wave, sine, and sigmoid inputs further validated the controller’s robustness, with IGWO-LQR consistently delivering minimal tracking errors and rapid stabilization. These results demonstrate that the IGWO-LQR framework significantly enhances precision and dynamic response. Full article
(This article belongs to the Special Issue Intelligent Human–Robot Interaction: 4th Edition)
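The core idea in the abstract is that LQR performance hinges on the weighting matrices Q and R, which IGWO searches over. For intuition, a scalar sketch showing how one candidate (q, r) pair maps to a feedback gain via the algebraic Riccati equation (this one-dimensional toy stands in for the multi-joint model in the paper):

```python
import math

# For a scalar plant  x' = a*x + b*u  with cost  ∫ (q*x² + r*u²) dt,
# the continuous algebraic Riccati equation reduces to
#   2*a*p - (b²/r)*p² + q = 0,
# whose positive root gives the LQR gain  k = b*p / r.
# A metaheuristic such as the paper's IGWO would search over (q, r);
# this solver only shows how a candidate pair maps to a gain.

def lqr_scalar(a, b, q, r):
    """Positive Riccati root and state-feedback gain for a 1-D system."""
    p = r * (a + math.sqrt(a * a + q * b * b / r)) / (b * b)
    k = b * p / r
    return p, k
```

For a = 0, b = 1, q = r = 1 this yields p = k = 1, placing the closed-loop pole at a - b*k = -1; an outer optimization loop would score such candidates on settling time, overshoot, or IAE, as the paper does.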

Review


24 pages, 1664 KB  
Review
A Comprehensive Review of Multimodal Emotion Recognition: Techniques, Challenges, and Future Directions
by You Wu, Qingwei Mi and Tianhan Gao
Biomimetics 2025, 10(7), 418; https://doi.org/10.3390/biomimetics10070418 - 27 Jun 2025
Cited by 1 | Viewed by 3496
Abstract
This paper presents a comprehensive review of multimodal emotion recognition (MER), a process that integrates multiple data modalities such as speech, visual, and text to identify human emotions. Grounded in biomimetics, the survey frames MER as a bio-inspired sensing paradigm that emulates the way humans seamlessly fuse multisensory cues to communicate affect, thereby transferring principles from living systems to engineered solutions. By leveraging various modalities, MER systems offer a richer and more robust analysis of emotional states compared to unimodal approaches. The review covers the general structure of MER systems, feature extraction techniques, and multimodal information fusion strategies, highlighting key advancements and milestones. Additionally, it addresses the research challenges and open issues in MER, including lightweight models, cross-corpus generalizability, and the incorporation of additional modalities. The paper concludes by discussing future directions aimed at improving the accuracy, explainability, and practicality of MER systems for real-world applications. Full article
(This article belongs to the Special Issue Intelligent Human–Robot Interaction: 4th Edition)
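One family of fusion strategies such a review covers is decision-level ("late") fusion, where each modality produces class probabilities that are combined afterwards. A toy sketch with illustrative emotion labels and scores (not data from the paper):

```python
# Toy decision-level fusion for multimodal emotion recognition: each
# modality (speech, face, text, ...) emits a dict of class probabilities,
# and the fused prediction is their weighted average. Labels, scores,
# and weights are illustrative only.

def late_fusion(modality_probs, weights=None):
    """Return the class with the highest weighted-average probability."""
    weights = weights or [1.0] * len(modality_probs)
    total = sum(weights)
    fused = {}
    for probs, w in zip(modality_probs, weights):
        for label, p in probs.items():
            fused[label] = fused.get(label, 0.0) + w * p / total
    return max(fused, key=fused.get)
```

Feature-level (early) fusion and model-level fusion, also surveyed in such reviews, instead combine representations before classification; late fusion trades some cross-modal interaction for modularity and robustness to a missing modality.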
