Review

A Comprehensive Narrative Review of Abrupt Movements in Human–Robot Interaction

Department of Mechanical and Aerospace Engineering, Politecnico di Torino, 10129 Turin, Italy
* Author to whom correspondence should be addressed.
Appl. Sci. 2026, 16(7), 3350; https://doi.org/10.3390/app16073350
Submission received: 2 March 2026 / Revised: 26 March 2026 / Accepted: 27 March 2026 / Published: 30 March 2026
(This article belongs to the Special Issue Latest Advances and Prospects of Human-Robot Interaction (HRI))

Abstract

Human–robot interaction (HRI) takes place in dynamic environments where both humans and robots act as active agents, making the system inherently unpredictable. Abrupt movements can originate from either side and include human reflexes, fatigue, or unexpected reactions, as well as robot malfunctions, control errors, or task changes. These unpredictable events generate significant risks for both interaction fluency and safety, affecting not only the physical domain (e.g., collisions, excessive forces) but also cognitive aspects such as trust and predictability. Although different application areas present domain-specific challenges, a comprehensive overview of abrupt movements in HRI is still lacking, especially in the industrial scenario. This review aims to consolidate current knowledge regarding how abrupt phenomena are analyzed, prevented, and mitigated across various contexts and to offer new insights for researchers. In detail, after describing the literature search and the screening process, the review categorizes abrupt events, highlights key methodological approaches, and identifies gaps and future directions. By providing a structured synthesis of existing strategies, this work guides researchers in developing safer and more adaptive HRI frameworks capable of handling unpredictability.

1. Introduction

To survive in an ever-changing environment, humans rely on their ability to extract rules and patterns from dynamic surroundings, enabling them to anticipate and adapt to potential changes [1]. In human–robot interaction (HRI), however, the dynamics are even more complex, since the environment is shaped not only by external factors but also by the interplay of two active agents: the human and the robot [2,3]. HRI is inherently characterized by bidirectional influence, where perception, decision making, and motion generation continuously evolve through mutual adaptation between partners [4]. Unlike traditional automation scenarios, collaborative settings require robots not only to execute tasks accurately but also to interpret human behavior and adjust their actions in real time [5]. In this context, sudden and unexpected changes in action, trajectory, or intent, defined as abrupt movements, may occur. Abrupt movements can arise independently of the interaction between the operator and the robot: (i) human abrupt movements can be provoked by unexpected reactions, fatigue, or reflexive responses; (ii) robot abrupt movements can derive from control errors, malfunctions, or sudden task variations. Moreover, unexpected reactions can also be triggered reciprocally, with the action of one agent provoking an abrupt response in the other.
Such deviations from predictable patterns challenge not only the fluency of interaction but also its safety. Unlike in many other robotic domains, human–robot interaction always involves direct human presence, which makes any loss of stability or unexpected behavior potentially dangerous [6]. Therefore, safety becomes a primary requirement, including both physical (avoiding collisions, excessive forces, or unintended contacts) and cognitive (trust, predictability, and the human ability to understand and anticipate robot actions) aspects. Trust calibration and shared situational awareness have been identified as fundamental requirements for long-term acceptance of collaborative robotic systems [7,8]. In this context, abrupt or unpredictable robot movements may strongly influence operator perception of the system, potentially generating uncertainty, stress, or loss of confidence in the robot behavior. To address these challenges, adaptive mechanisms are needed to recognize abrupt events, ensuring both human safety and effective collaboration [9,10].
Unpredictability, arising from both humans and robots during their interaction, occurs differently across the most common application domains proposed by ISO/TS 15066:2016 for industrial, medical and service robotics [11]. In industrial collaboration, humans may unexpectedly reach into the robot workspace or change their task strategy, while robots may generate sudden motions due to trajectory replanning, controller instability, or sensor faults [12]. The most popular methods to ensure human safety focus on combining obstacle avoidance with polynomial-based sensory planning and a sensory system capable of mapping the robot workspace [13]. In addition to perception systems used for workspace monitoring, force sensing technologies also play an important role in collaborative robotics. Force sensors enable robots to measure interaction forces during contact with objects or the environment, allowing them to regulate gripping forces and safely perform tasks such as precision assembly while maintaining controlled interaction conditions [14]. In medical scenarios, interaction becomes even more critical, as humans can come into direct physical contact with robots [15]. In a surgical operation scenario, abrupt movements may arise from the human agent (i.e., both patient and surgeon) and from the robot, potentially compromising surgical accuracy. For this reason, robotic systems incorporate safety mechanisms to maintain surgeon engagement and prevent unexpected motions [16]. Moreover, unexpected robot behavior can derive from control inaccuracies or incorrect estimation of human motion intention [17]. Finally, abrupt movements can also occur in service and assistive robotics, which covers operational areas such as healthcare, education, leisure, smart cities, and the economy [18].
Even though the concept of abrupt movements in human–robot interaction is highly relevant for safety, the existing literature still shows a clear gap regarding research on this topic. To the best of the authors’ knowledge, there are no reviews that comprehensively cover this topic in terms of summarizing the different methodologies adopted to analyze abrupt phenomena and the strategies proposed to prevent or respond to them. Accordingly, the present review is a unified overview developed not only to consolidate current knowledge but also to guide future research toward safer and more reliable interactive frameworks. The structure of the paper is as follows: (i) first, the literature search process is presented in detail, including the search string, the screening phase, and the set of selected studies; (ii) then, based on the main outcomes of this investigation, the focus is progressively narrowed, and the most relevant results are examined in depth; (iii) moreover, particular attention is given to identifying the key aspects that characterize abrupt movements in human–robot interaction, along with the different approaches used to address them; (iv) finally, the paper concludes with a summary of the main findings and guidelines to support future research in advancing the field.

2. Literature Investigation

To conduct the analysis, the concepts of human–robot interaction and abrupt movements were considered. Accordingly, the following search string was used in the Scopus electronic database on 2 March 2026, applying an additional filter to include only articles written in English:
TITLE-ABS-KEY ((human robot collaboration OR human robot interaction OR HRI OR HRC OR collaborative robot* OR cobot*) AND (abrupt movement* OR abrupt motion* OR sudden movement* OR sudden motion* OR jerky movement* OR jerky motion* OR impulsive movement* OR impulsive motion* OR quick movement* OR quick motion* OR unexpected movement* OR unexpected motion* OR rapid movement* OR rapid motion* OR unpredict* movement* OR unpredict* motion*))
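For transparency, the boolean string above can be reconstructed programmatically from its two concept groups (interaction-related terms and abruptness-related terms). The following Python sketch is purely illustrative and was not part of the original search workflow:

```python
# Illustrative reconstruction of the Scopus TITLE-ABS-KEY query
# from its two concept groups (not part of the review's tooling).

hri_terms = [
    "human robot collaboration", "human robot interaction", "HRI", "HRC",
    "collaborative robot*", "cobot*",
]

# The motion terms are the Cartesian product of abruptness descriptors
# (including the wildcard form "unpredict*") and two motion nouns.
adjectives = ["abrupt", "sudden", "jerky", "impulsive", "quick",
              "unexpected", "rapid", "unpredict*"]
nouns = ["movement*", "motion*"]
motion_terms = [f"{a} {n}" for a in adjectives for n in nouns]

query = "TITLE-ABS-KEY (({}) AND ({}))".format(
    " OR ".join(hri_terms), " OR ".join(motion_terms)
)
print(query)
```

Generating the string this way makes it straightforward to audit the term list or extend it (e.g., with additional descriptors) in future updates of the search.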
A total of fifty-six results were obtained, which were then manually reviewed. A first group of studies focuses on tracking rapid human motions, which are treated as high-speed but continuous movements within predictive modeling frameworks. These works propose data-driven or hybrid approaches to forecast human or coupled human–robot movements, including LSTM-based and autoregressive neural models, as well as EMG-based decoding strategies [19,20,21]. Other contributions interpret the concept of rapid or sudden motion as the capability of the robotic control architecture to react quickly to dynamic changes or disturbances. These works investigate adaptive or robust control strategies designed to improve responsiveness under dynamic conditions, such as variable admittance control, optimization-based trajectory planning, observer-based control architectures, and dual-quaternion kinematic modeling [22,23,24,25,26]. Several works explicitly consider fast or sudden movements of tracked objects or agents, addressing perception and tracking in dynamic environments through approaches such as collaborative SLAM, real-time face and object tracking, hybrid tool tracking systems, biologically inspired visual stabilization mechanisms, and sensor metrological validation [27,28,29,30,31,32]. In other studies, descriptors such as quick, rapid, or sudden movements are mainly used to characterize the speed or expressiveness of human or robot actions within interaction scenarios. These works explore socially aware navigation and crowd interaction [33], motion design and perceptual evaluation of robot behavior [34,35], collaborative lifting and human-centered interaction [36,37], as well as gesture-based interaction, creative collaboration, rehabilitation monitoring, and virtual or haptic interfaces [38,39,40,41,42]. 
Although the twenty-four studies described above were retrieved due to the presence of terms included in the search string, the analyzed motions are typically modeled as high-speed yet continuous and expected behaviors rather than as genuinely unforeseen interaction events. Since this review focuses on abrupt movements understood as unexpected deviations from the ongoing interaction dynamics, these studies were not included in the final set of reviewed articles.
Accordingly, the remaining thirty-two articles were included in the subsequent analysis. For each article, the following aspects were identified: (i) year of publication, (ii) document type, (iii) robot classification (industrial, service, or medical robot [43]), and (iv) robot type (based on morphology). Figure 1 shows the logical flow of the analysis, while Table 1 lists all articles, one per row, in descending order by year of publication.
The publication years of the selected papers are summarized in Figure 2. Although no temporal constraints were applied during the search process, the earliest relevant studies date back to 2009. Between 2009 and 2013, only one paper per year was published, except in 2012, when no publications were recorded, and no studies appeared from 2014 to 2016. From 2017 to 2021, the number of studies remained relatively stable, fluctuating between one and two articles per year. This was followed by a sharp increase starting in 2022, reaching a peak in 2025 with seven articles. This upward trajectory highlights the increasing scientific attention devoted to abrupt movement phenomena in human–robot interaction over the past few years.
For each selected article, the type of robot was first identified according to its intended use, following the ISO 8373:2021 definitions [43]. An industrial robot is defined as a robot employed in automatic applications within an industrial environment [43]. A service robot refers to a robot intended for personal or professional use and designed to perform useful tasks for humans or equipment [43]. A medical robot is intended for use as medical electrical equipment or for a medical system [43]. In addition to the classification by intended use, the morphological characteristics of the robots used in the selected papers were also examined. Six different robot types were identified: manipulator, exoskeleton, mobile robot, humanoid, parallel robot, and haptic device. Manipulators are characterized as having three or more revolute joints [43], while exoskeletons are wearable devices designed to support or enhance human motor functions, such as strength and endurance [47]. Mobile robots are able to travel under their own control [43], and humanoids are designed with a body, head, and limbs to mimic human appearance and motion [43]. Parallel robots have arms forming a closed-loop structure [43], and haptic devices are robotic devices that enable information and action exchange between human and robot during human–robot interaction [43].
Across the different robot types considered in this review, human–robot cooperation is realized through different interaction modalities. Mobile robots operate in shared environments, requiring spatial coordination with humans in close proximity. Manipulators are typically used in collaborative workspaces, where humans and robots coordinate actions during task execution. Humanoid robots combine physical interaction and social cues to enable more natural interaction. Exoskeletons involve direct physical interaction, supporting or guiding human motion, while haptic devices rely on force feedback to assist user actions. These modalities illustrate how cooperation can range from proximity-based coordination to direct physical coupling, depending on the robot type.
As illustrated in Figure 3, the radar chart highlights the distribution of robot intended uses (identified by colors) and morphological types (represented by the polygon vertices) in the selected papers.
Regarding service robots, mobile robots are the most investigated type, with five related studies, while only two studies focus on manipulators and two on humanoid robots. Mobile robots operate in environments such as offices and airports, supporting humans in tasks like guiding or following people, transporting goods, and recognizing gestures or activities. Reliable people detection and interaction are essential to ensure safe and effective human–robot collaboration. One of the main challenges in this area lies in navigating safely in highly dynamic environments, where human movements can be sudden and unpredictable. This unpredictability often leads to inefficient or abrupt robot behaviors, reducing the fluidity and reliability of the interaction [33,49,52,73,74].
The types of medical robots identified in the selected papers include exoskeletons, discussed in three related studies [47,48,68], and haptic devices, discussed in two related studies [66,70]. Exoskeletons are widely used in rehabilitation to help individuals relearn walking patterns and rebuild muscle strength [48,75]. However, even when velocity and range of motion are constrained, unforeseen human–robot interactions may still occur. For instance, the device may unexpectedly deliver excessive torque or resist the user’s intended movement. These events can result in negative outcomes such as increased stress and muscular effort, which may lead to fatigue and additional strain [47]. Haptic devices, on the other hand, are frequently integrated into medical training simulators for procedures such as minimally invasive surgery. By modeling expert motion patterns and providing real-time haptic guidance, these systems help correct unpredictable movements, enhancing both safety and learning outcomes [66,70].
Exoskeletons [47,68] and haptic devices [72] are also employed in industrial contexts. In detail, exoskeletons are applied to enhance worker strength and endurance, increasing overall productivity [76]. Haptic devices are employed to assist operators during precision tasks such as welding by allowing the operator to control the main trajectory and speed, while the device suppresses sudden or abrupt motions [72]. Among industrial robots, manipulators represent by far the most common category, with thirteen studies. Manipulators are typically employed in industrial settings to work collaboratively with humans, performing tasks such as handover, pick-and-place, assembly/disassembly, grasping and moving objects, and other shared operations. The core goal is to create a shared space where humans and robots can exchange information and collaborate, complementing each other’s strengths. Robots can assist workers with tasks that require precision, strength, or speed, while humans provide supervision, decision-making, and problem-solving skills [77]. In such collaborative scenarios, several conditions may lead to abrupt movements.
Although studies were identified across different robotic domains (industrial, medical, and service robotics), the literature addressing abrupt movements in collaborative scenarios is predominantly focused on industrial manipulators. Therefore, the subsequent analysis concentrates on this category to provide a more detailed examination of the most representative studies.

3. Industrial Manipulator Robots

Following the initial literature analysis, industrial manipulators emerged as the most frequently investigated robotic systems in the context of abrupt movements in HRI. In order to analyze the selected studies in a structured way, the literature was interpreted according to several key dimensions: (i) the agent generating the abrupt motion, (ii) the cause triggering the unexpected behavior, (iii) the temporal phase in which the abrupt event is analyzed, (iv) the practical approaches and (v) the supporting technologies employed to address abrupt movements. Figure 4 illustrates this analytical framework adopted to examine abrupt motions in industrial human–robot collaboration scenarios.
In particular, thirteen articles were identified (Table 2). According to Figure 4, these studies were analyzed considering the following aspects: year of publication, aim of the work, agent performing the abrupt movement, cause of the abrupt movement, focus of the analysis, practical approach to address the abrupt movement, and supporting technology.

3.1. Publications per Year

Considering the publication year of the articles related to manipulators in industrial contexts, a clear upward trend can be observed, particularly after 2021 (Figure 5). This increase can be associated with the emergence of Industry 5.0 [78]. Introduced in 2021, Industry 5.0 promotes a human-centered, sustainable, and resilient industrial model in which workers are considered key assets, and technologies are designed to support physical and mental well-being while improving system adaptability and sustainability [79]. Within this paradigm, human–robot interaction becomes a central element of innovation, which helps explain the growing scientific attention observed in recent years. In such collaborative environments, unexpected or abrupt robot motions may directly affect both operator safety and the perceived reliability of the robotic system. Consequently, the ability to detect, anticipate, and properly manage abrupt movements becomes an important requirement for designing collaborative systems aligned with Industry 5.0 principles, where safety, trust, and human well-being are central elements of the interaction.

3.2. Agent Performing the Abrupt Movement

In the context of human–robot collaboration, abrupt movements can be performed by either the human or the robot. The analysis of these events is important for enhancing collaboration efficiency while ensuring physical and cognitive safety. The literature shows a balanced focus on this topic, with seven studies addressing robot abrupt movements and six studies examining human abrupt movements (Figure 5). The majority of the studies investigating abrupt movements performed by humans (five out of six) were conducted by the same research group, which primarily focuses on human–robot interaction and human movement analysis [54,57,60,61,63]. However, the distribution according to the agent performing the abrupt movement highlights the need to study such movements in both humans and robots to enhance overall collaboration.
As shown in Figure 5, there is a clear difference in the temporal distribution of articles focusing on robot abrupt movements and human abrupt movements. At least one article per year between 2019 and 2024 addresses robot abrupt movements, whereas studies focusing on human abrupt movements appear only from 2022 onward. This trend is consistent with the principles of Industry 5.0, which place humans at the center of production systems, thereby motivating research to increasingly investigate human-related aspects of human–robot interaction.

3.3. Cause of the Abrupt Movement

In a collaborative environment, identifying the cause of an abrupt movement is particularly important. Unexpected movements may originate either from the agent itself, from the action of the other agent, or from external factors in the surrounding environment. Figure 6 shows that, among the seven studies focusing on robot abrupt movements, five address unexpected motions caused by the robot itself, while two focus on movements triggered by the operator.
When an unexpected robot motion originates from the robot itself, it is essential to analyze possible causes such as malfunction or control errors. Abrupt movements may result from technical failures, which can lead to hazardous situations [56], or from the control strategy adopted in collision-avoidance algorithms. For example, a robot that suddenly stops to avoid an object performs an abrupt motion. Regulating this behavior results in smoother motions, making robot movements more understandable for the human counterpart [67]. Similarly, malfunctions in algorithms that directly use the coordinates of objects within the robot workspace can cause abrupt variations in kinematic distances, leading to unstable and abrupt robot movements [62]. From the operator perspective, the speed and direction of robot movement strongly influence how its motion is perceived. Understanding this perception is crucial for effective HRI. For instance, an end effector approaching from the side at low speed can be perceived as more unpredictable than one approaching from the front [53]. Other unexpected movements include the robot rapidly moving towards the human and making contact or suddenly dropping both arms from above the participant’s head to table level [64].
When focusing on robot abrupt movements caused by humans, unexpected robot motions may emerge from the misinterpretation of human actions. In particular, the two reviewed studies propose frameworks for human–robot co-carry tasks that rely on models sensitive to human false positives [58,65]. If the operator performs shaky movements, the robot may misinterpret them as a carrying intention and initiate a co-carrying action, potentially causing discomfort or injury to the human operator.
Notably, none of the reviewed studies addresses robot abrupt motions caused by external factors. For example, industrial robots may be affected by changes in the robot operating conditions [80].
In HRI, operators typically perform repetitive and controlled gestures related to the task. However, sudden or unexpected movements may occur, deviating from the predefined pattern. Human abrupt movements may be caused by factors not related to the operator but instead induced by the robot or by external factors in the surrounding environment. As shown in Figure 6, all six studies addressing human abrupt movements focus on motions triggered by the robot or by external factors, without considering abrupt movements generated by the operator themselves. Recognizing these highly variable gestures is essential to ensure the safety and efficiency of human–robot collaboration [54,57,60,61,63]. Moreover, when predicting human movements, it is necessary to account for these sudden changes and to understand the operator’s intentions [59].
Human abrupt movements may also result from reactions to internal stimuli, such as sneezing or reduced vigilance, which are often highly interrelated with operator fatigue, workload, and situation awareness [81]. None of the reviewed studies explicitly address this class of human abrupt movements.

3.4. Analysis Focus

An important aspect in the evaluation of the selected articles is the focus of the analysis with respect to the occurrence of the abrupt movement. In the context of human–robot interaction, abrupt movements can be interpreted through unexpected measurable variations in motion-related quantities rather than only through qualitative descriptions. When referring to human motion, abrupt movements are typically characterized by rapid changes in kinematic variables such as velocity, acceleration, or higher-order derivatives of motion (e.g., jerk), which may appear as peaks or sudden discontinuities in the temporal evolution of motion signals acquired through sensing technologies such as inertial measurement units or motion capture systems. Among the results retrieved through the specific search string adopted in this review, the studies addressing abrupt human gestures in collaborative environments characterize such movements by analyzing features derived from acceleration signals or deviations from expected motion patterns [54,57,60,61,63]. In the case of robot motion, abrupt movements may arise from sudden variations in the robot trajectory, velocity profile, or control response. These variations can be associated with events such as trajectory replanning, collision avoidance strategies, or unexpected disturbances affecting the control system. From a kinematic and dynamic perspective, abrupt robot motions may therefore be reflected in rapid changes in velocity, acceleration, jerk, or interaction forces generated by the robot during collaborative tasks [53,58,62,67]. Therefore, as illustrated in Figure 7, an impulsive movement can be generally schematized as a temporal trend characterized by an onset phase, a peak, and a subsequent decay. Based on this temporal structure, the reviewed articles can be classified into four categories, i.e., prevention, prediction, detection, and reaction, according to the moment they address the abrupt event. 
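This kinematic characterization can be made concrete with a minimal sketch: given a sampled acceleration signal, an abrupt event may be flagged when the jerk (the finite-difference derivative of acceleration) exceeds a threshold. The sampling rate, threshold, and synthetic signal below are illustrative assumptions, not values or methods taken from the reviewed studies:

```python
import numpy as np

def detect_abrupt_events(accel, fs, jerk_threshold):
    """Flag samples where |jerk| exceeds a threshold.

    accel: 1-D acceleration signal [m/s^2], sampled at fs [Hz].
    jerk_threshold: illustrative limit [m/s^3]; a real system would tune
    this per task and sensor (e.g., IMU or motion capture).
    Returns the indices of samples classified as abrupt.
    """
    jerk = np.gradient(accel, 1.0 / fs)   # finite-difference derivative
    return np.flatnonzero(np.abs(jerk) > jerk_threshold)

# Synthetic example: smooth sinusoidal motion with one injected spike
# mimicking the onset-peak-decay structure of an abrupt event.
fs = 100.0                                # assumed 100 Hz sampling
t = np.arange(0, 2, 1 / fs)
accel = 0.5 * np.sin(2 * np.pi * 1.0 * t) # slow, predictable motion
accel[120:123] += 8.0                     # sudden transient at ~1.2 s

events = detect_abrupt_events(accel, fs, jerk_threshold=200.0)
```

In practice, simple thresholding like this competes with the learning-based detectors discussed above: it runs with negligible latency but is far more sensitive to sensor noise and threshold choice, which is one facet of the accuracy-versus-real-time trade-off noted in Section 3.4.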
The prevention phase corresponds to the interval preceding the onset and is addressed by five studies. During this phase, the primary objective is to avoid the occurrence of the abrupt motion altogether. All five studies focus on robot abrupt motion and aim to intervene directly on the robot to minimize the likelihood of such events [56,58,62,65,67]. Once the movement begins, the onset marks the transition to the prediction phase, which extends until just before the signal reaches its peak. Only one article addresses this phase, focusing on human motion with the aim of predicting an abrupt movement shortly before it occurs [59]. The detection phase spans from just before the peak, partially overlapping with the prediction phase, until the end of the movement. In this case, five studies analyze human abrupt motion, with the objective of identifying the impulsive movement as it occurs [54,57,60,61,63]. This real-time identification is intended to support the implementation of safety systems. Finally, after an abrupt movement has occurred, post-event analysis can be conducted in what is defined as the reaction phase. The two reviewed articles in this category focus on analyzing human reactions following a robot abrupt motion [53,64]. Understanding these reactions is crucial for optimizing human perception of the robot and, ultimately, improving HRI.
All these analyses are developed with respect to the timing of the abrupt movement, which represents a critical event that can compromise safety and reduce the efficiency of HRI. To optimize interaction, early intervention is essential, either by preventing robot abrupt movements or by predicting human abrupt motions. In some cases, recognizing a human abrupt movement as it occurs allows for the application of appropriate safety measures. In these situations, the capability of the system to operate in real time becomes a critical requirement. Safety mechanisms must respond with minimal latency to allow timely intervention when unexpected motions arise in shared workspaces. However, achieving high detection accuracy may require more complex computational models, such as machine learning or deep learning approaches, which can increase processing time. As a result, a trade-off may emerge between detection accuracy and real-time performance [5]. In industrial human–robot collaboration, where workers and robots operate in close proximity, ensuring a rapid system response is essential to promptly mitigate potentially hazardous situations.
Actions across these stages aim to minimize or manage abrupt movements so that they do not negatively affect the interaction. However, when a robot abrupt movement occurs, it is essential to consider the worker’s physical, mental, and emotional well-being, particularly within the Industry 5.0 framework. For this reason, a deeper understanding of the mental stress experienced by human workers and their reactions to unpredictable robot movements is necessary [53]. Such understanding can help improve collaboration by optimizing robot behavior and reducing the occurrence of unexpected motions [53,64].
Overall, a clear trend emerges from the reviewed literature. Studies on robot abrupt motion primarily address the phases before and after the event (prevention and reaction), while studies on human abrupt motion mainly focus on the phases occurring during the movement, specifically prediction and detection.

3.5. Practical Approaches and Supporting Technologies

Two additional aspects analyzed in this literature review concern the practical approaches and the supporting technologies used to address abrupt motions. Regarding the implemented approaches, four different categories were identified in the reviewed articles: feature extraction, motion control, AI-based identification, and augmented reality/virtual reality (AR/VR) exploitation. The supporting technologies employed in the selected studies can be associated either with the collaborative workspace or with the human operator. For the collaborative workspace, video systems, such as web cameras or RGB-D cameras, are employed. These devices enable visualization of the workspace, accurate capture of HRI, and the acquisition of relevant data.
Conversely, when the technology is associated with the human operator, wearable devices, such as inertial sensors and physiological parameter sensors, are used for (i) detecting abrupt human movements, (ii) controlling the robot to prevent its unexpected movements, and (iii) evaluating the human responses to abrupt robot motions. When a technology is referred to as simulation, it means that no physical technology is involved and that the implementation is based on software simulations.
As shown in Figure 8, these approaches and technologies can be applied at different phases of the analysis.
For prevention, two main approaches are employed: motion control and AR. Motion control is used exclusively for preventing abrupt robot movements by improving robot control and trajectory smoothness [58,62,65,67]. Two of these studies rely on cameras placed above the collaborative workspace to capture human motion, with the acquired information sent to the local robot controller [58,65]. These two studies investigate abrupt robot movements caused by interaction with the human operator; the supporting technology therefore captures human movements in order to improve robot motion. One study uses a video system based on RGB-D cameras to locate the human operator relative to the robot [67]. This information is exploited to smoothly modulate the robot velocity by considering both the human presence and the current and intended robot trajectory. Instead of abruptly stopping or deviating from its predefined path, the system adjusts the robot speed to maintain continuous and smooth motion. As a result, robot behavior becomes more predictable and easier for the human operator to understand [67]. The remaining study relies on a software-based simulation to test and discuss the proposed control method for smoothing abrupt robot movements [62]. AR, instead, is explored in one study with the aim of developing a gesture control human–robot interface [56]. The system relies on an AR headset as the supporting technology, which allows the operator to set the reference trajectories of the robot, preventing potentially hazardous situations due to unexpected robot movements [56].
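As a concrete illustration of this kind of proximity-based velocity modulation, the sketch below scales a nominal speed with a smooth cosine ramp over the human–robot distance. The function, distance thresholds, and ramp shape are illustrative assumptions for exposition, not the controllers proposed in the cited studies.

```python
import numpy as np

def speed_scale(distance, d_stop=0.3, d_full=1.2):
    """Smooth speed-scaling factor from human-robot distance (m).

    Below d_stop the robot halts; above d_full it runs at nominal
    speed; in between, a cosine ramp avoids abrupt velocity jumps.
    (Thresholds are illustrative, not from the reviewed studies.)
    """
    if distance <= d_stop:
        return 0.0
    if distance >= d_full:
        return 1.0
    x = (distance - d_stop) / (d_full - d_stop)
    return 0.5 * (1.0 - np.cos(np.pi * x))  # C1-continuous ramp

# The commanded speed is the nominal trajectory speed scaled by this
# factor, so the robot slows down instead of stopping abruptly.
v_command = speed_scale(0.75) * 0.5  # nominal speed 0.5 m/s
```

Because the ramp is continuous and differentiable at its endpoints, the commanded velocity profile itself stays smooth as the operator approaches or retreats, which is the property that makes the motion legible to the human.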
Approaches such as AR and VR are particularly relevant when abrupt movements are performed by the robot. These approaches can be applied not only in the prevention phase, but also after the occurrence of such movements. As shown in Figure 8, one study that employs VR focuses on human reactions following an abrupt event and investigates how to preserve user trust in the robot to ensure continued collaboration [64]. The ultimate goal is to optimize the perception of robot motions in industrial HRI scenarios, simulated through VR combined with a headset device [64].
Within the reaction phase, one study employs a feature extraction approach [53]. To assess user perception after unexpected robot movements, physiological parameter sensors, such as galvanic skin response and electromyography sensors, are used in combination with self-reports. From these signals, the most relevant features are identified to analyze the operator’s reactions to the robot’s motion. The resulting information contributes to understanding how collaboration can be improved by reducing the operators’ mental stress and provides insights into the design of more effective HRI tasks.
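A feature-extraction step of this kind can be sketched as follows. The window length and the specific time-domain features (mean, standard deviation, RMS, peak) are illustrative choices, not the feature set used in the cited study.

```python
import numpy as np

def window_features(signal, fs, win_s=2.0):
    """Split a 1-D physiological signal (e.g., GSR or an EMG envelope)
    sampled at fs Hz into non-overlapping windows and compute simple
    time-domain features per window: mean, std, RMS, and peak value.
    """
    n = int(win_s * fs)
    feats = []
    for start in range(0, len(signal) - n + 1, n):
        w = signal[start:start + n]
        feats.append([w.mean(), w.std(),
                      np.sqrt(np.mean(w ** 2)), np.abs(w).max()])
    return np.array(feats)

# Example: 10 s of a synthetic signal at 100 Hz -> 5 windows x 4 features
sig = np.sin(np.linspace(0, 20 * np.pi, 1000))
features = window_features(sig, fs=100)
```

The resulting feature matrix can then be ranked (e.g., by statistical tests or model-based importance scores) to keep only the features most informative about the operator's response.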
Regarding human abrupt movements, the only study addressing the prediction stage employs a simulation-based AI approach [59]. Specifically, it uses a Gaussian mixture model to estimate the uncertainty of human motion extrapolations, which can be incorporated into robot motion planning. The online estimation of forecast uncertainties enables the robot to adjust its trajectory, increasing the distance from the human operator and decreasing the danger index. This approach is beneficial in collaborative scenarios, as it reduces the risk of collisions between humans and robots [59].
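A minimal sketch of Gaussian-mixture-based uncertainty estimation is given below, using scikit-learn's `GaussianMixture` on a hypothetical dataset of hand displacements; the data, the two-component setup, and the trace-of-covariance uncertainty measure are assumptions for illustration, not the model of the cited study.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Hypothetical training data: short-horizon hand displacements (m),
# mixing a tight "slow reach" mode and a dispersed "erratic" mode.
slow = rng.normal([0.05, 0.00], 0.01, size=(200, 2))
erratic = rng.normal([0.25, 0.10], 0.06, size=(200, 2))
gmm = GaussianMixture(n_components=2, random_state=0).fit(
    np.vstack([slow, erratic]))

def forecast_uncertainty(x):
    """Scalar uncertainty of a motion extrapolation at x: trace of
    the covariance of the most likely mixture component."""
    k = int(gmm.predict(np.atleast_2d(x))[0])
    return float(np.trace(gmm.covariances_[k]))
```

A motion planner could scale the human–robot safety margin with this value, keeping a larger distance when the forecast of the operator's motion is less reliable.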
The detection of human abrupt movements, on the other hand, is conducted only through inertial sensors placed on the human body [54,57,60,61,63]. The five analyzed studies aim not only to identify abrupt movements as they occur but also to minimize the impact of the setup on the user, thereby preserving natural motion while ensuring that the necessary data are accurately captured [54,57,60,61,63]. One study adopts a feature extraction approach, analyzing wrist acceleration to better characterize the patterns of abrupt movements and thus improve their detection accuracy [63]. In the other four studies, inertial sensors are combined with AI techniques, demonstrating their effectiveness in detecting abrupt movements and enhancing safety systems for human–robot interaction in industrial settings [54,57,60,61]. Despite their promising performance, AI-based approaches may present some limitations when applied to abrupt movement detection. In particular, these methods often rely on the availability of sufficiently large and representative datasets for training. In industrial environments, collecting such datasets may be challenging due to the variability of tasks and operator behaviors. Moreover, models trained under specific experimental conditions may face difficulties in generalizing to different workspaces or interaction scenarios. Finally, the limited interpretability of some machine learning models may reduce the transparency of the decision-making process, which can be a relevant aspect when safety-critical events are involved [82].
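For intuition, a minimal threshold-based detector over wrist acceleration might look like the sketch below. The jerk criterion and the threshold value are illustrative assumptions rather than the algorithms of the cited studies, which rely on richer features or learned models.

```python
import numpy as np

def detect_abrupt(acc, fs, jerk_thresh=80.0):
    """Flag samples of a tri-axial wrist acceleration stream as abrupt.

    acc: (N, 3) array in m/s^2, fs: sampling rate in Hz.
    A sample is flagged when the jerk magnitude (numerical derivative
    of acceleration) exceeds a threshold; the threshold is an
    illustrative assumption, not taken from the reviewed studies.
    """
    jerk = np.gradient(acc, 1.0 / fs, axis=0)
    return np.linalg.norm(jerk, axis=1) > jerk_thresh

# Smooth motion stays below threshold; a sudden spike is flagged.
fs = 100.0
t = np.arange(0, 2, 1 / fs)
acc = np.column_stack([np.sin(2 * np.pi * t),
                       np.zeros_like(t), np.zeros_like(t)])
acc[120:125, 0] += 15.0  # injected abrupt burst
flags = detect_abrupt(acc, fs)
```

A fixed threshold of this kind is simple enough to run on the wearable itself, but it is exactly the setting where the learned models discussed above tend to generalize better across operators and tasks.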
Clear patterns emerge: the choice of approach and technology is strongly influenced by both the focus of the analysis and the agent responsible for the abrupt movement. Video systems, for example, are used exclusively during the prevention stage of robot abrupt movements, as they provide valuable information about the shared workspace. In contrast, inertial sensors are used solely for the detection of human abrupt movements, enabling real-time monitoring that does not interfere with the operator’s tasks or natural motion. During the reaction phase, physiological sensors become crucial to monitor the human response to robot abrupt motion. Motion control strategies, instead, are employed only for the prevention of robot abrupt motions, since they directly act on the robot to avoid unexpected behavior. AR/VR headsets are adopted only when the abrupt movement originates from the robot, supporting its control while also assessing how humans perceive unexpected robot actions. Finally, AI-based identification is applied exclusively in the context of abrupt human movements, as it enables the real-time identification of characteristic movement patterns.
It is worth noting that the reviewed studies exhibit different levels of technological maturity, ranging from simulation-based approaches [59,62,64] to controlled laboratory experiments [53,54,56,57,58,60,61,63,65,67]. All studies so far focus on these early-stage evaluations, with none addressing real-world validation in collaborative scenarios. This highlights a clear direction for future research to test these approaches and technologies in realistic industrial environments.

4. Conclusions

This narrative review provides a comprehensive overview of abrupt movements in human–robot interaction scenarios. A total of thirty-two papers addressing the concept of unexpected motion from different perspectives were identified through a structured literature investigation and categorized according to their intended use and the morphological type of the robot. Based on this classification, the scope was further narrowed to thirteen articles specifically focusing on abrupt movements occurring during interactions between human operators and manipulators in industrial settings. From these studies, the main aspects associated with abrupt movements were identified and analyzed, including publication year, research objective, agent performing the abrupt movement, causes of the movement, focus of the analysis, practical approaches adopted, and supporting technologies. This structured analysis allowed for the systematic characterization of the current research landscape on abrupt movements in collaborative human–robot environments.
The literature review highlights that abrupt movements, whether generated by humans or robots, play a crucial role in ensuring safety within shared workspaces. However, the analysis also reveals that existing studies tend to investigate robot abrupt motions mainly from a unilateral perspective. In particular, most research focuses on causes not directly related to the interaction with the human operator. Investigating how abrupt movements may arise from mutual dependencies between human and robot behavior represents an important research opportunity for future studies. Moreover, none of the reviewed studies explicitly considers the influence of environmental factors on robot abrupt motions. Since collaborative workspaces are dynamic environments in which multiple elements coexist, understanding how external conditions may affect robot behavior could provide valuable insights for improving the robustness of collaborative systems.
Another relevant observation concerns the different strategies adopted to address abrupt movements depending on the agent involved. In the case of robot abrupt motions, most studies focus either on preventing the unexpected movement before it occurs or on analyzing the reaction of the human operator after its occurrence. Consequently, research efforts typically follow two complementary directions. On the one hand, preventive strategies aim to reduce the likelihood of unexpected robot behavior through improved control algorithms and motion planning techniques. On the other hand, when abrupt robot movements occur, it becomes essential to analyze their impact on human perception and interaction quality. Robot abrupt motions not only represent a physical safety issue but may also influence operator perception and trust toward robotic systems. Understanding these psychological responses is therefore essential for designing collaborative systems that remain both safe and acceptable for human workers.
Studies addressing human abrupt movements adopt a different perspective, typically focusing on detecting the motion shortly before or during its occurrence. Since early intervention is preferable from a safety perspective, detection approaches should operate in real time in order to enable prompt and effective responses. However, only one study attempts to predict abrupt human movements, indicating that prediction remains a largely unexplored aspect that deserves further investigation. At present, the combination of artificial intelligence techniques and inertial sensing technologies represents the most frequently adopted solution for real-time detection. Although these technologies are non-invasive and suitable for industrial environments, their potential for detecting and predicting abrupt movements could be further explored, particularly through the integration of multiple sensing modalities and analytical approaches.
The conducted analysis also highlights several technological and methodological challenges that open important directions for future research. One key challenge concerns the development of reliable detection and prediction methods capable of operating in real-time collaborative environments. Achieving an effective balance between detection accuracy and computational efficiency remains particularly critical in industrial scenarios where safety mechanisms must respond with minimal latency. In addition, the increasing use of artificial intelligence techniques introduces challenges related to data availability, model generalization across different working conditions, and the interpretability of AI-based decision processes, especially when safety-critical events are involved.
Beyond the technical challenges associated with detecting and managing abrupt movements, future research should also consider the broader ethical and regulatory implications of unexpected behavior in human–robot collaboration. Abrupt robot motions may generate safety-critical situations, potentially leading to collisions or hazardous interactions, raising questions about responsibility attribution among system designers, operators, and manufacturers. Current safety frameworks for collaborative robotics, such as ISO 10218-2:2025, define safety requirements for the integration and application of industrial robot systems, providing guidelines for safe HRI [83]. However, the increasing autonomy and adaptability of collaborative robots may require further refinement of these standards to address unpredictable behaviors and their implications for liability and trust in shared workspaces.
Finally, the growing relevance of Industry 5.0 reinforces the importance of human-centered industrial systems in which safety, transparency, and worker well-being represent fundamental design principles. In this context, future research would benefit from standardized experimental protocols specifically designed to generate and assess abrupt events in collaborative tasks, for example, through controlled perturbations, sudden robot trajectory changes, or task-induced human reactions under realistic industrial constraints. Moreover, the development of multimodal sensing frameworks combining inertial, vision-based, force, and physiological data could improve both the robustness and interpretability of abrupt-event analysis. On the methodological side, promising directions include the use of hybrid model-based and data-driven approaches, capable of combining physical consistency with adaptive learning, as well as closed-loop control architectures that directly link abrupt-event detection to real-time robot behavior adaptation. Finally, validation in ecologically valid industrial scenarios remains essential to assess the real applicability of these solutions in human-centered collaborative environments.

Author Contributions

Conceptualization, G.D.V., E.D., V.C., L.G. and S.P.; methodology, G.D.V., E.D. and V.C.; investigation, G.D.V.; writing—original draft preparation, G.D.V. and E.D.; writing—review and editing, V.C.; supervision, L.G. and S.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Data sharing is not applicable.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Khouri, L.; Nelken, I. Detecting the Unexpected. Curr. Opin. Neurobiol. 2015, 35, 142–147. [Google Scholar] [CrossRef]
  2. Han, J.; Conti, D. Recent Advances in Human–Robot Interactions. Appl. Sci. 2025, 15, 6850. [Google Scholar] [CrossRef]
  3. Wang, T.; Zheng, P.; Li, S.; Wang, L. Multimodal Human–Robot Interaction for Human-Centric Smart Manufacturing: A Survey. Adv. Intell. Syst. 2024, 6, 2300359. [Google Scholar] [CrossRef]
  4. Goodrich, M.A.; Schultz, A.C. Human–Robot Interaction: A Survey. Found. Trends Hum.-Comput. Interact. 2008, 1, 203–275. [Google Scholar] [CrossRef]
  5. Lasota, P.A.; Fong, T.; Shah, J.A. A Survey of Methods for Safe Human-Robot Interaction. Found. Trends Robot. 2017, 5, 261–349. [Google Scholar] [CrossRef]
  6. Dragan, A.D.; Lee, K.C.T.; Srinivasa, S.S. Legibility and Predictability of Robot Motion. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction, Tokyo, Japan, 3–6 March 2013; pp. 301–308. [Google Scholar] [CrossRef]
  7. Haddadin, S.; Croft, E. Physical Human–Robot Interaction. In Springer Handbook of Robotics; Springer: Berlin/Heidelberg, Germany, 2016; pp. 1835–1874. [Google Scholar] [CrossRef]
  8. Hancock, P.A.; Billings, D.R.; Schaefer, K.E.; Chen, J.Y.C.; De Visser, E.J.; Parasuraman, R. A Meta-Analysis of Factors Affecting Trust in Human-Robot Interaction. Hum. Factors 2011, 53, 517–527. [Google Scholar] [CrossRef]
  9. Di Vincenzo, G.; Polito, M.; Digo, E.; Gastaldi, L.; Pastorelli, S. Improvement of Safety in Collaborative Robotics through Prompt Detection of Abrupt Movements. Hum. Factors Wearable Technol. 2025, 175, 41–48. [Google Scholar] [CrossRef]
  10. Digo, E.; Polito, M.; Caselli, E.; Gastaldi, L.; Pastorelli, S. A Dataset of Standard and Abrupt Industrial Gestures Recorded Through MIMUs. Robotics 2025, 14, 176. [Google Scholar] [CrossRef]
  11. ISO/TS 15066:2016; Robots and Robotic Devices—Collaborative Robots. International Organization for Standardization: Geneva, Switzerland, 2016.
  12. Villani, V.; Pini, F.; Leali, F.; Secchi, C. Survey on Human–Robot Collaboration in Industrial Settings: Safety, Intuitive Interfaces and Applications. Mechatronics 2018, 55, 248–266. [Google Scholar] [CrossRef]
  13. Reddy, A.; Bright, G.; Padayachee, J. A Review of Safety Methods for Human-Robot Collaboration and a Proposed Novel Approach. In ICINCO 2019—Proceedings of the 16th International Conference on Informatics in Control, Automation and Robotics; Springer: Berlin/Heidelberg, Germany, 2019; Volume 1, pp. 243–248. [Google Scholar] [CrossRef]
  14. Peta, K.; Wiśniewski, M.; Kotarski, M.; Ciszak, O. Comparison of Single-Arm and Dual-Arm Collaborative Robots in Precision Assembly. Appl. Sci. 2025, 15, 2976. [Google Scholar] [CrossRef]
  15. Yang, G.Z.; Cambias, J.; Cleary, K.; Daimler, E.; Drake, J.; Dupont, P.E.; Hata, N.; Kazanzides, P.; Martel, S.; Patel, R.V.; et al. Medical Robotics—Regulatory, Ethical, and Legal Considerations for Increasing Levels of Autonomy. Sci. Robot. 2017, 2, eaam8638. [Google Scholar] [CrossRef] [PubMed]
  16. Abdelaal, A.E.; Mathur, P.; Salcudean, S.E. Robotics In Vivo: A Perspective on Human-Robot Interaction in Surgical Robotics. Annu. Rev. Control Robot. Auton. Syst. 2020, 3, 221–242. [Google Scholar] [CrossRef]
  17. Liu, T.; Wang, J.; Wong, S.; Razjigaev, A.; Beier, S.; Peng, S.; Do, T.N.; Song, S.; Chu, D.; Wang, C.H.; et al. A Review on the Form and Complexity of Human–Robot Interaction in the Evolution of Autonomous Surgery. Adv. Intell. Syst. 2024, 6, 2400197. [Google Scholar] [CrossRef]
  18. Gonzalez-Aguirre, J.A.; Osorio-Oliveros, R.; Rodríguez-Hernández, K.L.; Lizárraga-Iturralde, J.; Menendez, R.M.; Ramírez-Mendoza, R.A.; Ramírez-Moreno, M.A.; Lozoya-Santos, J.d.J. Service Robots: Trends and Technology. Appl. Sci. 2021, 11, 10702. [Google Scholar] [CrossRef]
  19. Yao, J.; He, C.; Li, K.; Su, R.; Ling, K.V. Hybridizing Long Short-Term Memory Network and Inverse Kinematics for Human Manipulation Prediction in Smart Manufacturing. In Proceedings of the 2024 18th International Conference on Control, Automation, Robotics and Vision, ICARCV 2024, Dubai, United Arab Emirates, 12–15 December 2024; pp. 769–774. [Google Scholar]
  20. Atkins, J.; Lee, H. MIntNet: Rapid Motion Intention Forecasting of Coupled Human-Robot Systems With Simulation-to-Real Autoregressive Neural Networks. IEEE Robot. Autom. Lett. 2023, 8, 6363–6370. [Google Scholar] [CrossRef]
  21. Feleke, A.G.; Bi, L.; Fei, W. Emg-Based 3d Hand Motor Intention Prediction for Information Transfer from Human to Robot. Sensors 2021, 21, 1316. [Google Scholar] [CrossRef]
  22. Guo, B.; Feng, J.; Zhong, Y.; Zhu, Y.; Dian, S. Variable Admittance Control With Human Intention Estimation for Physical Human–Robot Interaction. IEEE Trans. Ind. Electron. 2025, 73, 5816–5825. [Google Scholar] [CrossRef]
  23. Zhang, S.; Zanchettin, A.M.; Villa, R.; Dai, S. Real-Time Trajectory Planning Based on Joint-Decoupled Optimization in Human-Robot Interaction. Mech. Mach. Theory 2020, 144, 103664. [Google Scholar] [CrossRef]
  24. Ji, P.; Min, F.; Zhang, F.; Ma, F. Tele-Aiming Control Design for Reconnaissance Robot Using a Strong Tracking Multi-Model Extended Super-Twisting Observer. IET Control Theory Appl. 2023, 17, 696–712. [Google Scholar] [CrossRef]
  25. Dalvi, M.; Chiddarwar, S.S.; Rahul, M.R.; Sahoo, S.R. Kinematic Modelling of UR5 Cobot Using Dual Quaternion Approach. In Machines, Mechanism and Robotics; Springer: Singapore, 2022; pp. 1077–1085. [Google Scholar] [CrossRef]
  26. Dalvi, M.; Chiddarwar, S.S.; Sahoo, S.R.; Rahul, M.R. Dual Quaternion-Based Kinematic Modelling of Serial Manipulators. In Machines, Mechanism and Robotics; Springer: Singapore, 2021; pp. 1–7. [Google Scholar] [CrossRef]
  27. Liu, X.; Wen, S.; Liu, H.; Yu, F.R. CPL-SLAM: Centralized Collaborative Multirobot Visual-Inertial SLAM Using Point-and-Line Features. IEEE Internet Things J. 2025, 12, 21866–21875. [Google Scholar] [CrossRef]
  28. Iovene, E.; Cattaneo, D.; Fu, J.; Ferrigno, G.C.; De Momi, E. Hybrid Tracking Module for Real-Time Tool Tracking for an Autonomous Exoscope. IEEE Robot. Autom. Lett. 2024, 9, 6067–6074. [Google Scholar] [CrossRef]
  29. Fransen, B.R.; Herbst, E.V.; Harrison, A.M.; Adams, W.; Trafton, J.G. Real-Time Face and Object Tracking. In Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2009, St. Louis, MO, USA, 11–15 October 2009; pp. 2483–2488. [Google Scholar]
  30. Zelinsky, A.; Heinzmann, J. Human-Robot Interaction Using Facial Gesture Recognition. In Proceedings of the Robot and Human Communication—Proceedings of the IEEE International Workshop, Tsukuba, Japan, 11–14 November 1996; pp. 256–261. [Google Scholar]
  31. Avelino, J.; de Figueiredo, R.P.; Moreno, P.; Bernardino, A.J.M. On the Perceptual Advantages of Visual Suppression Mechanisms for Dynamic Robot Systems. Procedia Comput. Sci. 2016, 88, 505–511. [Google Scholar] [CrossRef]
  32. Carfagni, M.; Furferi, R.; Governi, L.; Santarelli, C.; Servi, M.; Uccheddu, F.; Volpe, Y. Metrological and Critical Characterization of the Intel D415 Stereo Depth Camera. Sensors 2019, 19, 489. [Google Scholar] [CrossRef]
  33. Kastner, L.; Fatloun, B.; Shen, Z.; Gawrisch, D.; Lambrecht, J. Human-Following and -Guiding in Crowded Environments Using Semantic Deep-Reinforcement-Learning for Mobile Service Robots. In Proceedings of the 2022 International Conference on Robotics and Automation (ICRA), Philadelphia, PA, USA, 23–27 May 2022; pp. 833–839. [Google Scholar] [CrossRef]
  34. Adachi, M.; Kakio, M.; Shiomi, M.; Miyashita, T. Virtual Robot Riding with a Person in a Real Elevator—Mixed Reality System for Robot Behavior Design. In Proceedings of the IEEE International Workshop on Robot and Human Communication, RO-MAN, Eindhoven, The Netherlands, 25–29 August 2025; pp. 508–513. [Google Scholar]
  35. Quintana, J.J.; Ferrer, M.A.; Diaz, M.; Feo, J.J.; Wolniakowski, A.; Miatluk, K. Uniform vs. Lognormal Kinematics in Robots: Perceptual Preferences for Robotic Movements. Appl. Sci. 2022, 12, 12045. [Google Scholar] [CrossRef]
  36. Schorsch, J.F.; Keemink, A.Q.L.; Stienen, A.H.A.; Van Der Helm, F.C.T.; Abbink, D.A. The Influence of Human-Robot Interaction Order during Fast Lifting Tasks for Different Levels of Weight Compensation. In Proceedings of the IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics, Sao Paulo, Brazil, 12–15 August 2014; pp. 426–431. [Google Scholar]
  37. Ralph, M.B.; Moussa, M.A. On the Effect of the User’s Background on Communicating Grasping Commands. In Proceedings of the HRI 2006: Proceedings of the 2006 ACM Conference on Human-Robot Interaction, Salt Lake City, UT, USA, 2–3 March 2006; pp. 353–354. [Google Scholar]
  38. Prathyakshini; Prathwini. Hand Gesture Controlled Video Player Application. In Proceedings of the 7th International Conference on Electronics, Communication and Aerospace Technology, ICECA 2023—Proceedings, Coimbatore, India, 22–24 November 2023; pp. 67–72. [Google Scholar]
  39. Jochum, E.A.; Derks, J. Tonight We Improvise!: Real-Time Tracking for Human-Robot Improvisational Dance; ACM International Conference Proceeding Series; Association for Computing Machinery: New York, NY, USA, 2019. [Google Scholar]
  40. Theofanidis, M.; Lioulemes, A.; Makedon, F.S. A Motion and Force Analysis System for Human Upper-Limb Exercises; ACM International Conference Proceeding Series; Association for Computing Machinery: New York, NY, USA, 2016. [Google Scholar]
  41. Guda, V.K.; Mugisha, S.; Chevallereau, C.; Chablat, D. Introduction of a Cobot as Intermittent Haptic Contact Interfaces in Virtual Reality; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2023; Volume 14028, pp. 497–509. [Google Scholar] [CrossRef]
  42. He, A.; Herron, C.W.; Kalita, B.; Leonessa, A. In-House Built Robust and Adaptable System Architecture for Virtual Reality Haptic Interface. In Proceedings of the ASME International Mechanical Engineering Congress and Exposition, Proceedings (IMECE), Columbus, OH, USA, 30 October–3 November 2022; Volume 5. [Google Scholar]
  43. ISO 8373:2021; Robotics—Vocabulary. International Organization for Standardization: Geneva, Switzerland, 2021.
  44. Jiang, X.; Zhao, S.; Zhu, Y.; Li, Q.; Zhang, J. A Two-Stage Reinforcement Learning Framework for Humanoid Robot Sitting and Standing-Up. Biomimetics 2025, 10, 783. [Google Scholar] [CrossRef]
  45. Jia, T.; Long, H.; McGeady, C.; Yang, X.; Colacrai, F.; Wang, J.; Ji, L.; Li, C.; Farina, D. Physiology-Inspired EEG Transformer for Predicting Movement Transitions in Bimanual Tasks. IEEE J. Biomed. Health Inform. 2025, 24, 1–12. [Google Scholar] [CrossRef]
  46. Nguyen, H.H.; Vu, M.N.; Beck, F.; Ebmer, G.; Nguyen, A.; Kemmetmueller, W.; Kugi, A. Language-Driven Closed-Loop Grasping with Model-Predictive Trajectory Optimization. Mechatronics 2025, 109, 103335. [Google Scholar] [CrossRef]
  47. Nasr, A.; Inkol, K.; McPhee, J. Safety in Wearable Robotic Exoskeletons: Design, Control, and Testing Guidelines. J. Mech. Robot. 2025, 17, 050801. [Google Scholar] [CrossRef]
  48. Reza Mohamadi, S.; Khorashadizadeh, S. Adaptive Fuzzy Control of Time-Varying Impedance in Rehabilitation Exercises. Trans. Inst. Meas. Control 2025, 47, 854–868. [Google Scholar] [CrossRef]
  49. Kim, S.; Jang, H.; Ha, J.; Lee, D.; Ha, Y.; Song, Y. Time-Interval-Based Collision Detection for 4WIS Mobile Robots in Human-Shared Indoor Environments. Sensors 2025, 25, 890. [Google Scholar] [CrossRef] [PubMed]
  50. Li, P.; Feng, S.; Yu, H. Research on Target Localization and Adaptive Scrubbing of Intelligent Bathing Assistance System. Front. Bioeng. Biotechnol. 2025, 13, 1550875. [Google Scholar] [CrossRef]
  51. Maehigashi, A.; Kubo, K.; Nungduk, Y.; Yamada, S. Effects of Robot Bowing during Apology on Trust Repair. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction, Melbourne, Australia, 4–6 March 2025; pp. 1478–1482. [Google Scholar] [CrossRef]
  52. Neamah, H.A.; Mayorga Mayorga, O.A. Optimized TD3 Algorithm for Robust Autonomous Navigation in Crowded and Dynamic Human-Interaction Environments. Results Eng. 2024, 24, 102874. [Google Scholar] [CrossRef]
  53. Lu, L.; Xie, Z.; Wang, H.; Su, B.; Jung, S.; Xu, X. Factors Affecting Workers’ Mental Stress in Handover Activities During Human–Robot Collaboration. Hum. Factors 2024, 66, 2621–2635. [Google Scholar] [CrossRef] [PubMed]
  54. Digo, E.; Polito, M.; Pastorelli, S.; Gastaldi, L. Detection of Upper Limb Abrupt Gestures for Human–Machine Interaction Using Deep Learning Techniques. J. Braz. Soc. Mech. Sci. Eng. 2024, 46, 227. [Google Scholar] [CrossRef]
  55. Sun, D.; Wang, J.; Xu, Z.; Bao, J.; Lu, H. Research on Human-Robot Collaboration Method for Parallel Robots Oriented to Segment Docking. Sensors 2024, 24, 1747. [Google Scholar] [CrossRef]
  56. Hariharasudhan, A.; Tang, G.; Webb, P. The Development of an Augmented Reality Gesture Control Human-Robot Interface. In Proceedings of the 2024 12th International Conference on Control, Mechatronics and Automation, ICCMA 2024, London, UK, 11–13 November 2024; pp. 330–335. [Google Scholar] [CrossRef]
  57. Digo, E.; Caselli, E.; Polito, M.; Antonelli, M.; Gastaldi, L.; Pastorelli, S. Test–Retest Repeatability of Human Gestures in Manipulation Tasks. Appl. Sci. 2023, 13, 7808. [Google Scholar] [CrossRef]
  58. Hannum, C.; Li, R.; Wang, W. A Trust-Assist Framework for Human–Robot Co-Carry Tasks. Robotics 2023, 12, 30. [Google Scholar] [CrossRef]
  59. Renz, H.; Krämer, M.; Bertram, T. Uncertainty Estimation for Predictive Collision Avoidance in Human-Robot Collaboration. In Proceedings of the 2023 IEEE International Conference on Robotics and Biomimetics, ROBIO 2023, Koh Samui, Thailand, 4–9 December 2023. [Google Scholar] [CrossRef]
  60. Polito, M.; Digo, E.; Pastorelli, S.; Gastaldi, L. Abrupt Movements Assessment of Human Arms Based on Recurrent Neural Networks for Interaction with Machines. Mech. Mach. Sci. 2023, 147, 143–151. [Google Scholar] [CrossRef]
  61. Polito, M.; Digo, E.; Pastorelli, S.; Gastaldi, L. Deep Learning Technique to Identify Abrupt Movements in Human-Robot Collaboration. Mech. Mach. Sci. 2023, 134, 73–80. [Google Scholar] [CrossRef]
  62. Simas, H.; Di Gregorio, R. Collision Avoidance for Redundant 7-DOF Robots Using a Critically Damped Dynamic Approach. Robotics 2022, 11, 93. [Google Scholar] [CrossRef]
  63. Rosso, V.; Gastaldi, L.; Pastorelli, S. Detecting Impulsive Movements to Increase Operators’ Safety in Manufacturing. Mech. Mach. Sci. 2022, 108, 174–181. [Google Scholar] [CrossRef]
  64. Fratczak, P.; Goh, Y.M.; Kinnell, P.; Justham, L.; Soltoggio, A. Robot Apology as a Post-Accident Trust-Recovery Control Strategy in Industrial Human-Robot Interaction. Int. J. Ind. Ergon. 2021, 82, 103078. [Google Scholar] [CrossRef]
  65. Hannum, C.; Li, R.; Wang, W. Trust or Not?: A Computational Robot-Trusting-Human Model for Human-Robot Collaborative Tasks. In Proceedings of the 2020 IEEE International Conference on Big Data, Big Data 2020, Atlanta, GA, USA, 10–13 December 2020; pp. 5689–5691. [Google Scholar] [CrossRef]
  66. Zahedi, E.; Khosravian, F.; Wang, W.; Armand, M.; Dargahi, J.; Zadeh, M. Towards Skill Transfer via Learning-Based Guidance in Human-Robot Interaction: An Application to Orthopaedic Surgical Drilling Skill. J. Intell. Robot. Syst. Theory Appl. 2020, 98, 667–678. [Google Scholar] [CrossRef]
  67. Zardykhan, D.; Svarny, P.; Hoffmann, M.; Shahriari, E.; Haddadin, S. Collision Preventing Phase-Progress Control for Velocity Adaptation in Human-Robot Collaboration. In Proceedings of the IEEE-RAS International Conference on Humanoid Robots, Toronto, ON, Canada, 15–17 October 2019; pp. 266–273. [Google Scholar] [CrossRef]
  68. Esmaeili, B.; Beyramzad, J.; Seyyedrasuli, M.; Noorani, M.R.S.; Ghanbari, A. Using Fuzzy Neural Network Sliding Mode Control for Human-Exoskeleton Interaction Forces Minimization. In Proceedings of the 2018 IEEE International Conference on Mechatronics and Automation, ICMA 2018, Changchun, China, 5–8 August 2018; pp. 403–410. [Google Scholar] [CrossRef]
  69. Stark, C.; Pereira, A.; Althoff, M. Reachset Conformance Testing of Human Arms with a Biomechanical Model. In Proceedings of the 2nd IEEE International Conference on Robotic Computing, IRC 2018, Laguna Hills, CA, USA, 31 January–2 February 2018; pp. 209–216. [Google Scholar] [CrossRef]
  70. Zahedi, E.; Dargahi, J.; Kia, M.; Zadeh, M. Gesture-Based Adaptive Haptic Guidance: A Comparison of Discriminative and Generative Modeling Approaches. IEEE Robot. Autom. Lett. 2017, 2, 1015–1022. [Google Scholar] [CrossRef]
  71. Quesque, F.; Lewkowicz, D.; Delevoye-Turrell, Y.N.; Coello, Y. Effects of Social Intention on Movement Kinematics in Cooperative Actions. Front. Neurorobot. 2013, 7, 14. [Google Scholar] [CrossRef]
  72. Erden, M.S.; Marić, B. Assisting Manual Welding with Robot. Robot. Comput. Integr. Manuf. 2011, 27, 818–828. [Google Scholar] [CrossRef]
  73. Méndez-Polanco, J.A.; Muñoz-Meléndez, A.; Morales-Manzanares, E.F. Detection of Multiple People by a Mobile Robot in Dynamic Indoor Environments; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2010; Volume 6433, pp. 522–531. [Google Scholar] [CrossRef]
  74. Méndez-Polanco, J.A.; Muñoz-Meléndez, A.; Morales, E.F. People Detection by a Mobile Robot Using Stereo Vision in Dynamic Indoor Environments; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2009; Volume 5845, pp. 349–359. [Google Scholar] [CrossRef]
  75. Mubin, O.; Alnajjar, F.; Jishtu, N.; Alsinglawi, B.; Al Mahmud, A. Exoskeletons With Virtual Reality, Augmented Reality, and Gamification for Stroke Patients’ Rehabilitation: Systematic Review. JMIR Rehabil. Assist. Technol. 2019, 6, e12010. [Google Scholar] [CrossRef] [PubMed]
  76. de Looze, M.P.; Bosch, T.; Krause, F.; Stadler, K.S.; O’Sullivan, L.W. Exoskeletons for Industrial Application and Their Potential Effects on Physical Work Load. Ergonomics 2016, 59, 671–681. [Google Scholar] [CrossRef] [PubMed]
  77. Zafar, M.H.; Langås, E.F.; Sanfilippo, F. Exploring the Synergies between Collaborative Robotics, Digital Twins, Augmentation, and Industry 5.0 for Smart Manufacturing: A State-of-the-Art Review. Robot. Comput. Integr. Manuf. 2024, 89, 102769. [Google Scholar] [CrossRef]
  78. Publications Office of the European Union. Industry 5.0: Towards a Sustainable, Human-Centric and Resilient European Industry; Publications Office of the European Union: Luxembourg, 2021. [Google Scholar] [CrossRef]
  79. Xu, X.; Lu, Y.; Vogel-Heuser, B.; Wang, L. Industry 4.0 and Industry 5.0—Inception, Conception and Perception. J. Manuf. Syst. 2021, 61, 530–535. [Google Scholar] [CrossRef]
  80. Raviola, A.; Guida, R.; De Martin, A.; Pastorelli, S.; Mauro, S.; Sorli, M. Effects of Temperature and Mounting Configuration on the Dynamic Parameters Identification of Industrial Robots. Robotics 2021, 10, 83. [Google Scholar] [CrossRef]
  81. Hopko, S.; Wang, J.; Mehta, R. Human Factors Considerations and Metrics in Shared Space Human-Robot Collaboration: A Systematic Review. Front. Robot. AI 2022, 9, 799522. [Google Scholar] [CrossRef]
  82. Le, T.T.H.; Prihatno, A.T.; Oktian, Y.E.; Kang, H.; Kim, H. Exploring Local Explanation of Practical Industrial AI Applications: A Systematic Literature Review. Appl. Sci. 2023, 13, 5809. [Google Scholar] [CrossRef]
  83. ISO 10218-2:2025; Robotics—Safety Requirements—Part 2: Industrial Robot Applications and Robot Cells. International Organization for Standardization: Geneva, Switzerland, 2025.
Figure 1. Literature analysis flow chart.
Figure 2. Publication year of the selected articles included in the literature analysis.
Figure 3. Radar chart showing the different robot morphologies at each vertex. The three overlapping areas represent the classifications according to their intended use. Concentric rings represent reference values corresponding to the number of reviewed papers.
Figure 4. Analytical framework for examining abrupt movements in industrial human–robot collaboration. Abbreviations: AI = artificial intelligence; AR/VR = augmented reality/virtual reality; PP = physiological parameters.
Figure 5. Yearly distribution of the reviewed papers on industrial manipulator robots (n = 13), grouped by the agent performing the abrupt movement (robot or human).
Figure 6. Distribution of articles based on the agent performing the abrupt movement (rows) and the cause (columns).
Figure 7. Example of an abrupt event illustrating the four categories of papers based on the temporal relationship of the analysis to the movement. The number inside each band indicates the number of papers in that category.
Figure 8. Number of classified articles based on the adopted practical approach (four colors) and the employed supporting technologies (five patterns). Articles are grouped by the specific focus of the analysis.
Table 1. Selected articles included in the literature analysis, specifying the publication year, the document type, the robot classification according to ISO 8373:2021 [43], and the robot type.
| Study | Year | Document Type | Robot Classification | Robot Type |
|---|---|---|---|---|
| Jiang et al. [44] | 2025 | Article | Service | Humanoid |
| Jia et al. [45] | 2025 | Article | Industrial/Medical/Service | Not Specified |
| Nguyen et al. [46] | 2025 | Article | Service | Manipulator |
| Nasr et al. [47] | 2025 | Article | Industrial/Medical | Exoskeleton |
| Reza Mohamadi et al. [48] | 2025 | Article | Medical | Exoskeleton |
| Kim et al. [49] | 2025 | Article | Service | Mobile Robot |
| Li et al. [50] | 2025 | Article | Service | Manipulator |
| Maehigashi et al. [51] | 2025 | Conference Paper | Service | Humanoid |
| Neamah et al. [52] | 2024 | Article | Service | Mobile Robot |
| Lu et al. [53] | 2024 | Article | Industrial | Manipulator |
| Digo et al. [54] | 2024 | Article | Industrial | Manipulator |
| Sun et al. [55] | 2024 | Article | Industrial | Parallel Robot |
| Hariharasudhan et al. [56] | 2024 | Conference Paper | Industrial | Manipulator |
| Digo et al. [57] | 2023 | Article | Industrial | Manipulator |
| Hannum et al. [58] | 2023 | Article | Industrial | Manipulator |
| Renz et al. [59] | 2023 | Conference Paper | Industrial | Manipulator |
| Polito et al. [60] | 2023 | Conference Paper | Industrial | Manipulator |
| Polito et al. [61] | 2023 | Conference Paper | Industrial | Manipulator |
| Simas et al. [62] | 2022 | Article | Industrial | Manipulator |
| Kästner et al. [33] | 2022 | Conference Paper | Service | Mobile Robot |
| Rosso et al. [63] | 2022 | Conference Paper | Industrial | Manipulator |
| Frątczak et al. [64] | 2021 | Article | Industrial | Manipulator |
| Hannum et al. [65] | 2020 | Conference Paper | Industrial | Manipulator |
| Zahedi et al. [66] | 2020 | Article | Medical | Haptic Device |
| Zardykhan et al. [67] | 2019 | Conference Paper | Industrial | Manipulator |
| Esmaeili et al. [68] | 2018 | Conference Paper | Industrial/Medical | Exoskeleton |
| Stark et al. [69] | 2018 | Conference Paper | Industrial | Not Specified |
| Zahedi et al. [70] | 2017 | Article | Medical | Haptic Device |
| Quesque et al. [71] | 2013 | Article | Service | Not Specified |
| Erden et al. [72] | 2011 | Conference Paper | Industrial | Haptic Device |
| Méndez-Polanco et al. [73] | 2010 | Conference Paper | Service | Mobile Robot |
| Méndez-Polanco et al. [74] | 2009 | Conference Paper | Service | Mobile Robot |
Table 2. Summary of the selected articles included in the subsequent analysis of manipulator robots in industry.
| Study | Year | Aim | Agent | Cause | Analysis Focus | Approach | Supporting Technology |
|---|---|---|---|---|---|---|---|
| [53] | 2024 | Analysis of workers’ mental stress during human–robot handover using combined objective and subjective measures. | Robot | Robot | Reaction | Feature Extraction | Physiological Parameters Sensors |
| [54] | 2024 | Definition of a method to identify human abrupt movements in the workplace. | Human | External/Robot | Detection | AI-Based Identification | Inertial Sensors |
| [56] | 2024 | Development of an augmented reality-based gesture interface to improve HRI. | Robot | Robot | Prevention | Augmented/Virtual Reality Exploitation | Headset |
| [57] | 2023 | Evaluation of repeatability of normal and abrupt pick-and-place gestures. | Human | External/Robot | Detection | AI-Based Identification | Inertial Sensors |
| [58] | 2023 | Development of a robot system that adapts actions based on human trust levels. | Robot | Human | Prevention | Motion Control | Web Camera |
| [59] | 2023 | Evaluation of a GMM-based method for estimating human motion uncertainties in collaborative scenarios. | Human | External/Robot | Prediction | AI-Based Identification | Simulation |
| [60] | 2023 | Detection of abrupt movements via forearm acceleration using inertial sensors. | Human | External/Robot | Detection | AI-Based Identification | Inertial Sensors |
| [61] | 2023 | Detection of abrupt movements via forearm acceleration using inertial sensors. | Human | External/Robot | Detection | AI-Based Identification | Inertial Sensors |
| [62] | 2022 | Reduction of sudden robot avoidance maneuvers using a digital filtering approach. | Robot | Robot | Prevention | Motion Control | Simulation |
| [63] | 2022 | Identification of impulsive movements through four kinematic features. | Human | External/Robot | Detection | Feature Extraction | Inertial Sensors |
| [64] | 2021 | Evaluation of trust-repair strategies after unexpected robot actions. | Robot | Robot | Reaction | Augmented/Virtual Reality Exploitation | Headset |
| [65] | 2020 | Evaluation of a computational trust model to prevent unpredictable robot behavior. | Robot | Human | Prevention | Motion Control | Web Camera |
| [67] | 2019 | Development of a smooth collision-avoidance control through robot velocity modulation. | Robot | Robot | Prevention | Motion Control | RGB-D Camera |

Share and Cite

Di Vincenzo, G.; Digo, E.; Cornagliotto, V.; Gastaldi, L.; Pastorelli, S. A Comprehensive Narrative Review of Abrupt Movements in Human–Robot Interaction. Appl. Sci. 2026, 16, 3350. https://doi.org/10.3390/app16073350
