Smart Human-Robot Interaction

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Physical Sensors".

Deadline for manuscript submissions: closed (30 November 2023) | Viewed by 10306

Special Issue Editors


Prof. Dr. Krzysztof Grudzień
Guest Editor
Institute of Applied Computer Science, Lodz University of Technology, 90-537 Lodz, Poland
Interests: computer science; data processing and analysis; HCI; industrial process imaging

Dr. Magdalena Wróbel-Lachowska
Guest Editor
Institute of Applied Computer Science, Lodz University of Technology, 90-537 Lodz, Poland
Interests: ergonomics; human factors; knowledge management in education; HCI; UCD

Dr. Zbigniew Chaniecki
Guest Editor
Institute of Applied Computer Science, Lodz University of Technology, 90-537 Lodz, Poland
Interests: human–robot interaction; tomography; computer-assisted image processing

Prof. Dr. Andrzej Romanowski
Guest Editor
Institute of Applied Computer Science, Lodz University of Technology, 90-537 Lodz, Poland
Interests: human–computer interaction; industrial tomography; crowdsourcing; future of work; Industry 4.0

Special Issue Information

Dear Colleagues,

Robots are needed to substitute for humans in dangerous missions and in heavy, mundane, and repetitive industrial work. Further examples include specialized robots that support remote medical operations when specialist doctors cannot be present at the surgery site. Moreover, robots are developing into social companions: they are starting to play an integral role in the everyday lives of older adults, relieving them of many activities, helping them spend their time, providing periodic supervision and care, and supporting lonely and dependent people. They can aid learning processes as well as monitor individuals' health and well-being, ensuring safety. Yet while the presence of robots is growing pervasively, most of these support services are still novel to individuals and society, and there are few grounded interaction guidelines that lead to performance that is at once natural, socially accepted, and efficient.

Therefore, it is crucial to develop human–robot interaction (HRI) paradigms that help design robots for human benefit within users' natural environments, whatever the setting, from everyday and professional applications to special-requirement applications. We need to study HRI from people's perspectives, considering cooperation with cyber–physical systems, orientation in the environment, awareness of the context of use and its conditions, and individuals' attitudes, behaviors, habits, and emotions.

This Special Issue welcomes a broad spectrum of research contributions, from control and engineering to social studies, including innovative designs, theories, methods, and mechanisms, as well as exploratory research in the field of HRI whose results will form the basis for effective, positive, and ethical interactions between humans and intelligent robots.

Keywords:

  • Human–robot interaction (HRI)
  • Robotic sensors and actuators
  • Social impact of human–robot interaction
  • Data processing and analysis algorithms
  • Artificial intelligence in HRI
  • Collaboration between (smart) robots and users
  • Robotics for everyday life, healthcare, and professional development
  • Smart robots for human development, well-being, and healthcare
  • (Semi-)autonomous agents, vehicles, drones, and robotic prototypes for smart HRI
  • Smart HRI for professional and amateur applications

The topic covers a wide area of science and technology regarding sensor-system applications and measurement data processing, focusing on the interaction between humans and robots in a variety of contexts, from everyday life to specialized applications.

This Special Issue aims to support the multidisciplinary HRI community and to foster problem-oriented research that integrates design, technical, and behavioral perspectives while facilitating the transfer of results into practice.

Prof. Dr. Krzysztof Grudzień
Dr. Magdalena Wróbel-Lachowska
Dr. Zbigniew Chaniecki
Prof. Dr. Andrzej Romanowski
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (4 papers)


Research

15 pages, 2451 KiB  
Article
Touching a Mechanical Body: The Role of Anthropomorphic Framing in Physiological Arousal When Touching a Robot
by Konrad Maj, Paulina Grzybowicz, Wiktoria Laura Drela and Michał Olszanowski
Sensors 2023, 23(13), 5954; https://doi.org/10.3390/s23135954 - 27 Jun 2023
Viewed by 1878
Abstract
The growing prevalence of social robots in various fields necessitates a deeper understanding of touch in Human–Robot Interaction (HRI). This study investigates how human-initiated touch influences physiological responses during interactions with robots, considering factors such as anthropomorphic framing of robot body parts and attributed gender. Two types of anthropomorphic framings are applied: the use of anatomical body part names and assignment of male or female gender to the robot. Higher physiological arousal was observed when touching less accessible body parts than when touching more accessible body parts in both conditions. Results also indicate that using anatomical names intensifies arousal compared to the control condition. Additionally, touching the male robot resulted in higher arousal in all participants, especially when anatomical body part names were used. This study contributes to the understanding of how anthropomorphic framing and gender impact physiological arousal in touch interactions with social robots, offering valuable insights for social robotics development.
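As a hedged illustration of how physiological arousal is commonly quantified in touch studies of this kind: one standard operationalisation is electrodermal activity (EDA), where arousal appears as skin conductance response (SCR) peaks. The sketch below is not the authors' pipeline; the sensor modality, sampling rate, filter cutoff, and peak threshold are all illustrative assumptions.

```python
# Hedged sketch: estimating arousal as skin conductance response (SCR)
# amplitudes from an electrodermal activity (EDA) trace. The paper does not
# publish its pipeline; sampling rate, cutoff, and threshold are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def scr_amplitudes(eda, fs=32.0, min_peak=0.01):
    """Return SCR peak amplitudes (microsiemens) found in an EDA trace."""
    # Low-pass filter (1 Hz cutoff, assumed) to suppress sensor noise.
    b, a = butter(2, 1.0 / (fs / 2), btype="low")
    smooth = filtfilt(b, a, eda)
    # Phasic component: subtract a slow moving-average tonic baseline.
    win = int(4 * fs)
    tonic = np.convolve(smooth, np.ones(win) / win, mode="same")
    phasic = smooth - tonic
    # Transient peaks in the phasic component are the SCRs of interest.
    peaks, props = find_peaks(phasic, height=min_peak)
    return props["peak_heights"]
```

Mean SCR amplitude per touch trial could then be compared across framing conditions.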

18 pages, 20901 KiB  
Article
Artistic Robotic Arm: Drawing Portraits on Physical Canvas under 80 Seconds
by Shady Nasrat, Taewoong Kang, Jinwoo Park, Joonyoung Kim and Seung-Joon Yi
Sensors 2023, 23(12), 5589; https://doi.org/10.3390/s23125589 - 14 Jun 2023
Cited by 1 | Viewed by 2082
Abstract
In recent years, the field of robotic portrait drawing has garnered considerable interest, as evidenced by the growing number of researchers focusing on either the speed or quality of the output drawing. However, the pursuit of either speed or quality alone has resulted in a trade-off between the two objectives. Therefore, in this paper, we propose a new approach that combines both objectives by leveraging advanced machine learning techniques and a variable line width Chinese calligraphy pen. Our proposed system emulates the human drawing process, which entails planning the sketch and creating it on the canvas, thus providing a realistic and high-quality output. One of the main challenges in portrait drawing is preserving the facial features, such as the eyes, mouth, nose, and hair, which are crucial for capturing the essence of a person. To overcome this challenge, we employ CycleGAN, a powerful technique that retains important facial details while transferring the visualized sketch onto the canvas. Moreover, we introduce the Drawing Motion Generation and Robot Motion Control Modules to transfer the visualized sketch onto a physical canvas. These modules enable our system to create high-quality portraits within seconds, surpassing existing methods in terms of both time efficiency and detail quality. Our proposed system was evaluated through extensive real-life experiments and showcased at the RoboWorld 2022 exhibition. During the exhibition, our system drew portraits of more than 40 visitors, yielding a survey outcome with a satisfaction rate of 95%. This result indicates the effectiveness of our approach in creating high-quality portraits that are not only visually pleasing but also accurate.
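As a rough sketch of the sketch-to-trajectory step such a system needs (the paper's actual Drawing Motion Generation module is not reproduced here), the snippet below converts a binarised sketch image into ordered pixel waypoints via OpenCV contour tracing; the function name and the subsampling step are illustrative assumptions.

```python
# Hedged sketch: turning a binarised portrait sketch into polyline strokes a
# drawing robot could follow. This stands in for, and is not, the paper's
# Drawing Motion Generation module; parameters are illustrative assumptions.
import cv2

def sketch_to_strokes(sketch_path, step=3):
    """Extract strokes as (N, 2) arrays of (x, y) pixel waypoints."""
    img = cv2.imread(sketch_path, cv2.IMREAD_GRAYSCALE)
    # Dark pen lines on a white canvas: invert, then binarise.
    _, binary = cv2.threshold(img, 127, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(binary, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)
    strokes = []
    for contour in contours:
        pts = contour.reshape(-1, 2)
        # Keep every `step`-th point so waypoint lists stay short without
        # visibly distorting the line shape.
        strokes.append(pts[::step])
    return strokes
```

Each stroke would then be mapped from pixel to canvas coordinates and handed to the robot motion controller.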

21 pages, 7096 KiB  
Article
Hand Gesture Interface for Robot Path Definition in Collaborative Applications: Implementation and Comparative Study
by Aleš Vysocký, Tomáš Poštulka, Jakub Chlebek, Tomáš Kot, Jan Maslowski and Stefan Grushko
Sensors 2023, 23(9), 4219; https://doi.org/10.3390/s23094219 - 23 Apr 2023
Cited by 2 | Viewed by 3496
Abstract
The article explores the possibilities of using hand gestures as a control interface for robotic systems in a collaborative workspace. The development of hand gesture control interfaces has become increasingly important in everyday life as well as in professional contexts such as manufacturing processes. We present a system designed to facilitate collaboration between humans and robots in manufacturing processes that require frequent revisions of the robot path; it allows direct definition of the waypoints, which differentiates our system from existing ones. We introduce a novel and intuitive approach to human–robot cooperation through the use of simple gestures. As part of a robotic workspace, the proposed interface was developed and implemented utilising three RGB-D sensors for monitoring the operator's hand movements within the workspace. The system employs distributed data processing through multiple Jetson Nano units, with each unit processing data from a single camera. The MediaPipe solution is utilised to localise the hand landmarks in the RGB image, enabling gesture recognition. We compare conventional methods of defining robot trajectories with the developed gesture-based system in an experiment with 20 volunteers. The experiment involved verification of the system under realistic conditions in a real workspace closely resembling the intended industrial application. Data collected during the experiment included both objective and subjective parameters. The results indicate that the gesture-based interface enables users to define a given path objectively faster than conventional methods. We critically analyse the features and limitations of the developed system and suggest directions for future research. Overall, the experimental results indicate the usefulness of the developed system, as it can speed up the definition of the robot's path.
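The hand-landmark stage the authors describe builds on the publicly available MediaPipe Hands solution; a minimal single-camera sketch is shown below. The detection threshold and the downstream use of the fingertip coordinate are illustrative assumptions (the actual system distributes this processing over three RGB-D cameras and Jetson Nano units).

```python
# Minimal single-camera sketch of MediaPipe hand-landmark localisation, the
# step the paper builds its gesture recognition on. Thresholds and the use
# of the fingertip coordinate below are illustrative assumptions.
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=1,
                                 min_detection_confidence=0.7)  # assumed

cap = cv2.VideoCapture(0)  # one webcam stands in for the three RGB-D sensors
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB input; OpenCV delivers BGR frames.
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        tip = results.multi_hand_landmarks[0].landmark[8]  # index fingertip
        # Normalised (x, y) would feed the gesture / waypoint logic here.
        print(f"index fingertip at ({tip.x:.2f}, {tip.y:.2f})")
cap.release()
```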

30 pages, 44844 KiB  
Article
Analysis of the Snake Robot Kinematics with Virtual Reality Visualisation
by Anna Sibilska-Mroziewicz, Ayesha Hameed, Jakub Możaryn, Andrzej Ordys and Krzysztof Sibilski
Sensors 2023, 23(6), 3262; https://doi.org/10.3390/s23063262 - 20 Mar 2023
Cited by 1 | Viewed by 2066
Abstract
In this article, we present a novel approach to performing engineering simulation in an interactive environment. A synesthetic design approach is employed, which enables the user to gather information about the system's behaviour more holistically while facilitating interaction with the simulated system. The system considered in this work is a snake robot moving on a flat surface. The dynamic simulation of the robot's movement is realised in dedicated engineering software, which exchanges information with 3D visualisation software and a Virtual Reality (VR) headset. Several simulation scenarios are presented, comparing the proposed method with standard ways of visualising the robot's motion, such as 2D plots and 3D animations on a computer screen. This illustrates how, in the engineering context, a more immersive experience that allows the viewer to observe simulation results and modify simulation parameters within the VR environment can facilitate the analysis and design of systems.
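For readers new to snake robot kinematics, the usual starting point for planar motion is the serpenoid gait, in which joint i follows phi_i(t) = alpha*sin(omega*t + i*beta) + gamma. The sketch below is a generic textbook model with assumed parameter values, not the authors' simulation.

```python
# Generic serpenoid-gait sketch for a planar snake robot; a textbook model
# with assumed parameters, not the simulation used in the paper.
import numpy as np

def serpenoid_angles(t, n_joints=8, alpha=0.5, omega=2.0, beta=0.6, gamma=0.0):
    """Joint angles (rad) at time t: alpha*sin(omega*t + i*beta) + gamma.

    alpha sets the wave amplitude, omega its temporal frequency, beta the
    phase shift between neighbouring joints, and a nonzero gamma makes the
    robot turn instead of moving straight.
    """
    i = np.arange(n_joints)
    return alpha * np.sin(omega * t + i * beta) + gamma

# Sample the gait, e.g. to drive an external visualisation each frame.
for t in np.linspace(0.0, 1.0, 5):
    print(np.round(serpenoid_angles(t), 2))
```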
