Robotic Motion Techniques for Socially Aware Navigation: A Scoping Review
Abstract
1. Introduction
1.1. Roadmap
1.2. Contributions
- 1. To identify the type of robot and sensor technology used in the studies (e.g., locomotion, drive system, type of sensor).
- 2. To present the behavioral cues processed in the studies (location of the processing, behavioral cues, biological signals).
- 3. To describe the type of environment and situation employed in the experiments (e.g., environment type, location, experimental scenario).
- 4. To describe the motion and interaction behaviors that drive the robot motion to assess socially aware navigation.
- 5. To describe the evaluation metrics used in the studies (e.g., quantitative and qualitative metrics, statistical analysis, safety considerations).
- 6. To report the participant feedback considered in the studies (e.g., sample size, gender, age, feedback).
2. Methods
2.1. Identification of Research Questions
- RQ1. What type of robot is used in the studies? As addressed by Lawrence et al. [17], it is important to select appropriate robot platforms to study social norms in HRI. Therefore, the type of robot covers the aspects describing the robot platform employed to address SAN.
- RQ2. What sensors are used to collect data from behavioral cues to enable robot motion in socially aware navigation? Identifying the sensors used during interaction is relevant because it reveals potential emerging technologies as well as areas of opportunity.
- RQ3. How are behavioral cues processed to enable robot motion in socially aware navigation? The processed behavioral cue is connected to the sensors used, but the focus is on the interpretation of the behavioral cue rather than the hardware. The processing or interpretation of these cues or signals has been addressed through research questions in previous reviews [1,18,19].
- RQ4. To what type of environment and situation is the robot exposed in the experiments conducted to assess robot motion in socially aware navigation? Identifying the environment makes it possible to contrast how robot behavior generalizes across environments and to examine the differences between simulation and the real world [20].
- RQ5. What motion and interaction behaviors drive the robot's motion to assess socially aware navigation? Robot motion is a key aspect for studying the effect of different navigation policies [20], as well as which social norms have been incorporated into the robot's behavior [17] and which behaviors a robot should exhibit to ensure compliance with social norms [1].
- RQ6. Which evaluation metrics were used to assess the robot's social performance? As stated by Francis et al. [20], it is fundamental to identify the methods, metrics, and safety factors in order to gain insight into the effects of different robot motion techniques.
- RQ7. Were participants involved in the experiments? Conducting experiments with volunteers adds a dimension that goes beyond the evaluation of the experiments themselves and focuses on reporting sample characteristics and post-experiment feedback.
2.2. Identification of Relevant Studies
2.3. Study Selection
2.4. Charting the Data
- RQ1. What type of robot is used in the studies? Despite the existence of common robotic features in SAN, there are several platforms and systems. The configuration of these platforms determines the locomotion mechanism and drive system. These aspects are key to analyzing robot motion because they condition the navigation strategy employed and introduce limitations. In this regard, the theoretical framework used to describe the locomotion and drive system is based on the frameworks proposed by Jahanian O. and Karimi G. [22] and Rubio F. et al. [23]. Additional aspects describing the type of robot in the context of SAN are its appearance and commercialization. The definition and classification of appearance follow the proposal of Baraka K. et al. [24]. Accordingly, RQ1 analyzes the following parameters:
- 1. Locomotion: Locomotion mechanism that allows the robot to move through the environment (e.g., wheeled, legged, rotary-wing), following Jahanian O. and Karimi G. [22].
- 2. Drive system: Drive configuration and constraints related to the maneuverability, controllability, and stability of the robot (e.g., differential, omnidirectional, bipedal), as presented by Rubio F. et al. [23]; a minimal kinematic sketch follows this list.
- 3. Appearance: According to Baraka K. et al. [24], appearance is defined as the “physical presence of robots in a shared time and space with humans” and is classified as bio-inspired, artifact-shaped, or functional.
- 4. Commercialization: Indicates whether the robot is a commercial platform available for purchase.
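The drive system constrains how velocity commands map to motion, which in turn shapes the navigation strategies available for SAN. As a minimal illustration only (not code from any reviewed study; all names and values are assumptions), the sketch below integrates one control step for a differential-drive base, which cannot translate sideways, and an omnidirectional base, which can.

```python
import math

def step_differential(x, y, theta, v, omega, dt):
    """Differential drive: only forward speed v and yaw rate omega are
    available (no lateral motion), which restricts reachable poses."""
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

def step_omnidirectional(x, y, theta, vx, vy, omega, dt):
    """Omnidirectional base: body-frame velocities along x and y plus a yaw
    rate, allowing sideways adjustments (e.g., small proxemic corrections)."""
    x += (vx * math.cos(theta) - vy * math.sin(theta)) * dt
    y += (vx * math.sin(theta) + vy * math.cos(theta)) * dt
    theta += omega * dt
    return x, y, theta

if __name__ == "__main__":
    # Illustrative commands: both robots try to drift 0.2 m to their left.
    print(step_differential(0.0, 0.0, 0.0, v=0.4, omega=0.0, dt=1.0))
    print(step_omnidirectional(0.0, 0.0, 0.0, vx=0.4, vy=0.2, omega=0.0, dt=1.0))
```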
- RQ2. What sensors are used to collect data from behavioral cues to enable robot motion in socially aware navigation? Sensors can be classified according to criteria such as the level of processing complexity, passive versus active sensing, and proprioceptive versus exteroceptive sensing [25]. There are also taxonomies that group sensors with similar functionalities; for instance, Martinez-Gomez J. et al. [26] propose a taxonomy of visual systems. Accordingly, to identify the sensors employed, the following parameters are defined:
- 1. Sensor technology: Sensors used to collect data from behavioral cues, grouped by the level of processing complexity (e.g., force-sensitive sensors, joysticks, monocular cameras).
- 2. Sensor specifications: Details of the features associated with the sensors (e.g., latency, range, resolution).
- RQ3. How are behavioral cues processed to enable robot motion in socially aware navigation? After describing the sensor technologies used to collect data from behavioral cues, it is necessary to identify the data processing that enables robot motion. Existing frameworks quantify behavioral cues through several approaches, for instance, psychophysiological measurements [27], social cues [5], and physical interactions [28,29]. In this regard, RQ3 examines how behavioral cues are processed through the following parameters:
- 1. Location of the processing unit: The location of data processing, whether outside the robot, inside the robot, or under hybrid management.
- 2. Behavioral cues: According to Vinciarelli et al. [6], behavioral cues “describe a set of temporal changes in neuromuscular and physiological activity that last for short intervals of time”, and they are grouped into several classes. For this scoping review, physical appearance, gesture and posture, face and eye behavior, vocal behavior, and space and environment are used as behavioral cue classes.
- 3. Biological signals: Information related to the monitoring of participants’ biological signals.
- RQ4. To what type of environment and situation is the robot exposed in the experiments conducted to assess robot motion in socially aware navigation? The contexts in which the experiments were conducted must be analyzed in order to replicate the results and understand how the environments influenced them. Consequently, the experimental environment is described using three parameters: (i) the environment type (real, simulated, or hybrid) where the experiments are conducted; (ii) the experiment location [30]; and (iii) the experimental scenario, to understand the properties of the interaction [31].
- 1. Environment type: Classifies the nature of the human–robot interaction environment as real, simulated, or hybrid.
- 2. Location: According to [30], location “characterizes with more detail where the task takes place (setting) and may define some of the requirements for the robot and the task” (e.g., a kitchen, an elevator, a shopping mall).
- 3. Experimental scenario: Describes the activity or instructions followed by the robot and participants during the interaction.
- RQ5. What motion and interaction behaviors drive the robot’s motion to assess socially aware navigation? The concept of behavior primitives can be used to describe the elemental or low-level actions that a robot can perform; it has been explored as intent-expressing behavior [32] or as primitives [33]. The navigation goal seeks to provide a clear description of the target condition or the requirements to finish the interaction [34]. To analyze the social dimension of the robot’s motion, it is crucial to identify (i) the social conventions used by the robot and (ii) the theoretical framework that explains how the robot’s motion and behavior are modeled [35]. These parameters are defined as follows:
- 1. Behavior primitives: Description of the low-level actions or behaviors that a robot can perform, divided into physical, auditory, and visual primitives.
- 2. Navigation goal: Physical motion or sequence of behavior primitives that the robot must perform to fulfill the experimental scenario.
- 3. Social conventions: According to [5], social conventions are defined as “behaviors created and accepted by the society that help humans to understand intentions of others and facilitate the communication” (e.g., personal space management, legible navigation, understandable navigation); a toy proxemic cost sketch follows this list.
- 4. Theoretical framework: Core theory (or theories) that govern the overall robot behavior.
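Personal space management, one of the social conventions listed above, is commonly modeled with a cost that penalizes robot positions close to, and especially in front of, a person. The sketch below is a toy anisotropic-Gaussian formulation under assumed parameters; it illustrates the idea rather than the specific theoretical frameworks used in the reviewed studies.

```python
import math

def personal_space_cost(robot_xy, person_xy, person_heading,
                        sigma_front=1.2, sigma_side=0.8):
    """Toy proxemic cost: an anisotropic Gaussian centred on the person,
    stretched along the person's heading so that standing in front is
    penalised more than standing behind or beside. Parameter values are
    illustrative assumptions, not values reported in the reviewed studies."""
    dx = robot_xy[0] - person_xy[0]
    dy = robot_xy[1] - person_xy[1]
    # Rotate the offset into the person's frame (x ahead, y to the left).
    ahead = dx * math.cos(person_heading) + dy * math.sin(person_heading)
    side = -dx * math.sin(person_heading) + dy * math.cos(person_heading)
    sigma_a = sigma_front if ahead >= 0 else 0.5 * sigma_front  # smaller rear zone
    return math.exp(-(ahead ** 2 / (2 * sigma_a ** 2) +
                      side ** 2 / (2 * sigma_side ** 2)))

if __name__ == "__main__":
    # Cost of a candidate robot position 1 m in front of a person facing +x.
    print(personal_space_cost((1.0, 0.0), (0.0, 0.0), person_heading=0.0))
```

A navigation planner can add such a cost to its objective so that trajectories that cut through personal space become more expensive than slightly longer detours.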
- RQ6. Which evaluation metrics were used to assess the robot’s social performance? The evaluation of the experiments may involve the robot’s performance, instruments measuring participants’ perceptions [20], and metrics or mechanisms designed to guarantee participants’ safety [36]. Metrics can be quantitative [37] or qualitative [38]. Additionally, statistical tests are an essential methodological instrument for evaluating the results [39,40]. Based on this, the following parameters are proposed:
- 1. Quantitative metrics: Metrics that evaluate objective aspects; their measurements are repeatable and verifiable (an illustrative computation follows this list).
- 2. Qualitative metrics: Metrics that evaluate subjective aspects; their measurements are based on observations and human perception.
- 3. Statistical validation: The process of applying a set of statistical tests to verify the reliability of the data and validate the analytical method.
- 4. Safety considerations: Mechanisms or considerations related to the safety of participants while interacting with the robot.
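To make the quantitative-metric parameter concrete, the sketch below computes a few commonly reported SAN measures (time to goal, path length, minimum human–robot distance, and time spent inside a personal-space radius) from synchronized robot and human trajectories. The metric set and the comfort radius are illustrative assumptions, not a benchmark taken from the reviewed studies.

```python
import numpy as np

def san_metrics(robot_traj, human_traj, dt=0.1, comfort_radius=1.2):
    """Illustrative quantitative SAN metrics over synchronised trajectories
    (N x 2 arrays of planar positions sampled every dt seconds). The metric
    names and the comfort radius are assumptions made for this sketch."""
    dists = np.linalg.norm(robot_traj - human_traj, axis=1)
    path_length = np.sum(np.linalg.norm(np.diff(robot_traj, axis=0), axis=1))
    return {
        "time_to_goal_s": (len(robot_traj) - 1) * dt,
        "path_length_m": float(path_length),
        "min_human_distance_m": float(dists.min()),
        "time_in_personal_space_s": float(np.sum(dists < comfort_radius) * dt),
    }

if __name__ == "__main__":
    t = np.linspace(0, 1, 50)[:, None]
    robot = np.hstack([5 * t, np.zeros_like(t)])            # robot moves along +x
    human = np.hstack([5 - 5 * t, 1.0 + np.zeros_like(t)])  # human walks opposite, 1 m offset
    print(san_metrics(robot, human))
```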
- RQ7. Were participants involved in the experiments? According to [41,42], a key aspect in assessing robot performance in HRI is the feedback of the volunteers who participated in the experiments. Based on this, the following data were extracted for each article:
- 1. Sample: Number of volunteers who participated in the experiments.
- 2. Gender: The gender of the volunteers.
- 3. Age: The ages of the participants.
- 4. Participant’s background: Information provided by the authors on participants’ features (e.g., educational level, medical condition, nationality).
- 5. Participant’s feedback: Participants’ opinions on the robot’s performance, collected via questionnaires and interviews.
3. Results
3.1. RQ1 What Type of Robot Is Used in the Studies?
3.1.1. Locomotion
3.1.2. Drive System
3.1.3. Robot Appearance
3.1.4. Commercial Robots
3.2. RQ2 What Sensors Are Used to Collect Data from Behavioral Cues to Enable Robot Motion in Socially Aware Navigation?
3.2.1. Sensor Technology
3.2.2. Sensor Specifications
3.3. RQ3 How Are Behavioral Cues Processed to Enable Robot Motion in Socially Aware Navigation?
3.3.1. Location of the Processing Unit
3.3.2. Behavioral Cues
3.3.3. Biological Signals
3.4. RQ4 To What Type of Environment and Situation Is the Robot Exposed in the Experiments Conducted to Assess Robot Motion in Socially Aware Navigation?
3.4.1. Environment Type
3.4.2. Location
3.4.3. Experimental Scenario
3.5. RQ5 What Motion and Interaction Behaviors Drive the Robot’s Motion to Assess Socially Aware Navigation?
3.5.1. Behavior Primitives
3.5.2. Navigation Goal
3.5.3. Social Conventions
3.5.4. Theoretical Framework
3.6. RQ6 Which Evaluation Metrics Were Used to Assess the Robot’s Social Performance?
3.6.1. Qualitative Metrics
3.6.2. Quantitative Metrics
3.6.3. Statistical Analysis
3.6.4. Safety Considerations
3.7. RQ7 Were Participants Involved in the Experiments?
3.7.1. Sample
3.7.2. Gender
3.7.3. Age
3.7.4. Participant’s Background
3.7.5. Participant’s Feedback
3.8. Additional Results
4. Discussion and Future Directions
4.1. RQ1 What Type of Robot Is Used in the Studies?
- To explore underrepresented locomotion mechanisms (e.g., quadrupeds, cobots, drones) to evaluate how embodiment affects the interaction.
- To design multi-factorial experiments to compare how functional and bio-inspired appearances influence user perceptions.
- To establish minimum robot specification reporting standards (e.g., locomotion, drive system, speed/acceleration limits) for SAN research to improve reproducibility.
4.2. RQ2 What Sensors Are Used to Collect Data from Behavioral Cues to Enable Robot Motion in Socially Aware Navigation?
- To combine multimodal inputs (e.g., audio, tactile, haptic) using sensor fusion, so that robots can draw on several behavioral cues when interacting with people, as humans do (a minimal fusion sketch follows this list).
- To integrate wearable technologies, such as VR headsets.
- To develop and integrate benchmarks for multimodal SAN sensing, including ground-truth reports.
- To require systematic reporting of sensor specifications (e.g., latency, frame rate, resolution) to improve methodological reproducibility.
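As a rough illustration of the multimodal direction suggested above, the sketch below combines per-modality confidences with a simple weighted (late) fusion. The modalities, weights, and the interaction-intent interpretation are assumptions made for illustration only, not a recommended design from the reviewed studies.

```python
import numpy as np

def late_fusion(vision_conf, audio_conf, tactile_conf, weights=(0.5, 0.3, 0.2)):
    """Minimal late-fusion sketch: per-modality confidences that a person
    intends to interact are combined by a normalised weighted average."""
    confs = np.array([vision_conf, audio_conf, tactile_conf])
    w = np.array(weights)
    return float(np.dot(w, confs) / w.sum())

if __name__ == "__main__":
    # Person looks at the robot (vision high), speaks softly, no touch.
    print(late_fusion(vision_conf=0.9, audio_conf=0.4, tactile_conf=0.0))
```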
4.3. RQ3 How Are Behavioral Cues Processed to Enable Robot Motion in Socially Aware Navigation?
- To explore additional behavioral cues, such as voice intensity or facial features.
- To include biosignals to contrast volunteers’ answers with physiological data and cross-check participants’ perceptions.
- To develop hybrid processing architectures, balancing robot autonomy with cloud-based heavy processing.
4.4. RQ4 To What Type of Environment and Situation Is the Robot Exposed in the Experiments Conducted to Assess SAN?
- To design standardized experimental scenarios with systematically varied social density or interaction types.
- To conduct multi-location studies to validate findings across different contexts.
- To explore simulation-to-real navigation technologies (e.g., digital twins, virtual reality, cloud-based technologies) to increase the variability of scenarios.
4.5. RQ5 What Motion and Interaction Behaviors Drive the Robot’s Motion to Assess Socially Aware Navigation?
- To establish a taxonomy of SAN behavior primitives in line with recent trends and with a clear mapping to the navigation goals involved.
- To develop multimodal strategies combining behavior primitives, such as motion, audio, and visual cues, to improve legibility.
- To implement multi-objective navigation planners, balancing operational efficiency, comfort, and safety under social constraints.
- To perform studies to measure the contribution of different primitives in human-to-human interaction.
- To increase the development of cross-cultural SAN models.
- To combine rule-based safety constraints with machine learning frameworks for more flexible and robust navigation (see the sketch after this list).
- To establish safety frameworks or protocols to improve SAN in general scenarios.
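As a sketch of the suggestion above to combine rule-based safety constraints with learned navigation, the snippet below wraps a hypothetical learned policy's velocity command in a hard rule-based filter. The thresholds and the interface are illustrative assumptions, not an implementation from the reviewed studies.

```python
def safety_filter(cmd_v, cmd_omega, min_obstacle_dist,
                  stop_dist=0.4, slow_dist=1.0, v_max=0.6):
    """Rule-based safety layer around a learned policy's command: the planner
    proposes (cmd_v, cmd_omega); hard rules then cap or zero the linear
    velocity near obstacles or people. Thresholds are illustrative."""
    if min_obstacle_dist < stop_dist:
        return 0.0, 0.0  # hard stop inside the safety margin
    if min_obstacle_dist < slow_dist:
        scale = (min_obstacle_dist - stop_dist) / (slow_dist - stop_dist)
        return min(cmd_v, v_max) * scale, cmd_omega
    return min(cmd_v, v_max), cmd_omega

if __name__ == "__main__":
    # A learned policy (not shown) requested 0.8 m/s; nearest person is 0.7 m away.
    print(safety_filter(0.8, 0.3, min_obstacle_dist=0.7))
```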
4.6. RQ6 Which Evaluation Metrics Were Used to Assess the Robot’s Social Performance?
- To develop standardized instruments based on fundamental social conventions and promote the use of already standardized instruments (e.g., RoSAS, Godspeed).
- To promote robust statistical practices, including power analysis, effect sizes, and methodological pipelines (see the sketch after this list).
- To improve safety evaluation mechanisms to include hazards or undesired behaviors beyond collision avoidance.
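As an illustration of the statistical reporting practice suggested above, the sketch below pairs a non-parametric comparison of two groups of synthetic questionnaire scores with effect-size estimates. The data, group labels, and sample sizes are assumptions for illustration, not results from any reviewed study.

```python
import numpy as np
from scipy import stats

def compare_conditions(scores_a, scores_b):
    """Report a Mann-Whitney U test together with effect sizes
    (rank-biserial correlation and Cohen's d) for two independent groups."""
    u, p = stats.mannwhitneyu(scores_a, scores_b, alternative="two-sided")
    rank_biserial = 1.0 - 2.0 * u / (len(scores_a) * len(scores_b))
    pooled_sd = np.sqrt((np.var(scores_a, ddof=1) + np.var(scores_b, ddof=1)) / 2)
    cohens_d = (np.mean(scores_a) - np.mean(scores_b)) / pooled_sd
    return {"U": float(u), "p": float(p),
            "rank_biserial": float(rank_biserial), "cohens_d": float(cohens_d)}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    baseline = rng.normal(3.2, 0.8, size=20)        # e.g., comfort ratings, condition A
    socially_aware = rng.normal(3.9, 0.8, size=20)  # e.g., comfort ratings, condition B
    print(compare_conditions(socially_aware, baseline))
```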
4.7. RQ7 Were Participants Involved in the Experiments?
- To improve demographic reporting standards (e.g., gender, age, cultural background) in SAN studies. Although sociodemographic analysis is not required for research design in human–robot interaction, demographic variables play a key role in analyzing the sociability in human–robot interaction and in comparing results of studies across populations to identify sociocultural differences. Therefore, the results of this review suggest that studies should report demographic variables such as age, gender, and country of residence, as well as whether the participant has previous experience with robots.
- To design questionnaires that collect the participants’ feedback during the interaction with the robot.
- To collect biosignals from the participants (e.g., electroencephalography signals, electrodermal activity, respiration) in order to analyze their emotions and reactions while they are interacting with the robot. None of the studies included in this review has used sensors to collect biological activity from the participants during the interaction.
- To promote workshops in the scientific, academic, and industrial communities in order to explore experimental designs for human–robot interaction in real scenarios and to collect feedback from participants and stakeholders, so that guidelines could be obtained for further research in terms of SAN strategies and experimental design.
4.8. Limitations
- Databases (DBs): The search was limited to a specific set of databases. Although these cover several relevant studies, it is possible that studies published in other sources may have been excluded from the analysis.
- Keywords: The keywords include “behavior” in the title and “robot” and “interaction” in the document. These keywords were chosen based on the previous works presented in Table 1. Additionally, based on [4,5,21], the keywords “human”, “aware”, and “navigation” were included. Nevertheless, studies using related terms or variations such as “human-computer”, “social awareness”, or “social behavior” may have been excluded.
- Access: Only studies that were open access or accessible through the authors’ institution were analyzed. This may have introduced selection bias by excluding works published in non-accessible sources.
5. Conclusions and Future Work
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Mavrogiannis, C.; Baldini, F.; Wang, A.; Zhao, D.; Trautman, P.; Steinfeld, A.; Oh, J. Core Challenges of Social Robot Navigation: A Survey. J. Hum.-Robot Interact. 2023, 12, 1–39. [Google Scholar] [CrossRef]
- de Graaf, M.M.; Ben Allouch, S. Exploring influencing variables for the acceptance of social robots. Robot. Auton. Syst. 2013, 61, 1476–1486. [Google Scholar] [CrossRef]
- Schneiders, E.; Kanstrup, A.M.; Kjeldskov, J.; Skov, M.B. Domestic Robots and the Dream of Automation: Understanding Human Interaction and Intervention. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI ’21), Yokohama, Japan, 8–13 May 2021. [Google Scholar] [CrossRef]
- Kruse, T.; Pandey, A.K.; Alami, R.; Kirsch, A. Human-aware robot navigation: A survey. Robot. Auton. Syst. 2013, 61, 1726–1743. [Google Scholar] [CrossRef]
- Rios-Martinez, J.; Spalanzani, A.; Laugier, C. From Proxemics Theory to Socially-Aware Navigation: A Survey. Int. J. Soc. Robot. 2015, 7, 137–153. [Google Scholar] [CrossRef]
- Vinciarelli, A.; Pantic, M.; Bourlard, H.; Pentland, A. Social signal processing: State-of-the-art and future perspectives of an emerging domain. In Proceedings of the 16th ACM International Conference on Multimedia (MM ’08), Vancouver, BC, Canada, 26–31 October 2008; pp. 1061–1070. [Google Scholar] [CrossRef]
- Klančar, G.; Zdešar, A.; Blažič, S.; Škrjanc, I. Chapter 2—Motion Modeling for Mobile Robots. In Wheeled Mobile Robotics; Klančar, G., Zdešar, A., Blažič, S., Škrjanc, I., Eds.; Butterworth-Heinemann: Oxford, UK, 2017; pp. 13–59. [Google Scholar] [CrossRef]
- Schulz, T.; Soma, R.; Holthaus, P. Movement acts in breakdown situations: How a robot’s recovery procedure affects participants’ opinions. Paladyn J. Behav. Robot. 2021, 12, 336–355. [Google Scholar] [CrossRef]
- Mahdi, H.; Akgun, S.A.; Saleh, S.; Dautenhahn, K. A survey on the design and evolution of social robots — Past, present and future. Robot. Auton. Syst. 2022, 156, 104193. [Google Scholar] [CrossRef]
- Venture, G.; Kulić, D. Robot Expressive Motions: A Survey of Generation and Evaluation Methods. J. Hum.-Robot Interact. 2019, 8, 1–17. [Google Scholar] [CrossRef]
- Pascher, M.; Gruenefeld, U.; Schneegass, S.; Gerken, J. How to Communicate Robot Motion Intent: A Scoping Review. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI ’23), Hamburg, Germany, 23–28 April 2023. [Google Scholar] [CrossRef]
- Nocentini, O.; Fiorini, L.; Acerbi, G.; Sorrentino, A.; Mancioppi, G.; Cavallo, F. A Survey of Behavioral Models for Social Robots. Robotics 2019, 8, 54. [Google Scholar] [CrossRef]
- Tricco, A.C.; Lillie, E.; Zarin, W.; O’Brien, K.K.; Colquhoun, H.; Levac, D.; Moher, D.; Peters, M.D.J.; Horsley, T.; Weeks, L.; et al. PRISMA Extension for Scoping Reviews (PRISMA-ScR): Checklist and Explanation. Ann. Intern. Med. 2018, 169, 467–473. [Google Scholar] [CrossRef] [PubMed]
- Arksey, H.; O’Malley, L. Scoping studies: Towards a methodological framework. Int. J. Soc. Res. Methodol. 2005, 8, 19–32. [Google Scholar] [CrossRef]
- Guillén Ruiz, S.; Calderita, L.V.; Hidalgo-Paniagua, A.; Bandera Rubio, J.P. Measuring Smoothness as a Factor for Efficient and Socially Accepted Robot Motion. Sensors 2020, 20, 6822. [Google Scholar] [CrossRef]
- Babel, F.; Kraus, J.; Baumann, M. Findings From A Qualitative Field Study with An Autonomous Robot in Public: Exploration of User Reactions and Conflicts. Int. J. Soc. Robot. 2022, 14, 1625–1655. [Google Scholar] [CrossRef]
- Lawrence, S.; Jouaiti, M.; Hoey, J.; Nehaniv, C.L.; Dautenhahn, K. The Role of Social Norms in Human–Robot Interaction: A Systematic Review. J. Hum.-Robot Interact. 2025, 14, 1–44. [Google Scholar] [CrossRef]
- Zhou, Y. Perceived Appropriateness: A Novel View for Remediating Perceived Inappropriate Robot Navigation Behaviors. In Proceedings of the Companion of the 2023 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’23), Stockholm, Sweden, 13–16 March 2023; pp. 781–783. [Google Scholar] [CrossRef]
- Chen, H.; Chan, I.Y.; Dong, Z.; Guo, Q.; Hong, J.; Twum-Ampofo, S. Biosignal measurement for human-robot collaboration in construction: A systematic review. Adv. Eng. Inform. 2025, 68, 103652. [Google Scholar] [CrossRef]
- Francis, A.; Pérez-D’Arpino, C.; Li, C.; Xia, F.; Alahi, A.; Alami, R.; Bera, A.; Biswas, A.; Biswas, J.; Chandra, R.; et al. Principles and Guidelines for Evaluating Social Robot Navigation Algorithms. J. Hum.-Robot Interact. 2025, 14, 1–65. [Google Scholar] [CrossRef]
- Charalampous, K.; Kostavelis, I.; Gasteratos, A. Recent trends in social aware robot navigation: A survey. Robot. Auton. Syst. 2017, 93, 85–104. [Google Scholar] [CrossRef]
- Jahanian, O.; Karimi, G. Locomotion Systems in Robotic Application. In Proceedings of the 2006 IEEE International Conference on Robotics and Biomimetics, Kunming, China, 17–20 December 2006; pp. 689–696. [Google Scholar] [CrossRef]
- Rubio, F.; Valero, F.; Llopis-Albert, C. A review of mobile robots: Concepts, methods, theoretical framework, and applications. Int. J. Adv. Robot. Syst. 2019, 16, 1729881419839596. [Google Scholar] [CrossRef]
- Baraka, K.; Alves-Oliveira, P.; Ribeiro, T. An Extended Framework for Characterizing Social Robots. In Human-Robot Interaction: Evaluation Methods and Their Standardization; Springer International Publishing: Cham, Switzerland, 2020; pp. 21–64. [Google Scholar] [CrossRef]
- Acharya, V.R.; Rao, V.S. Exploring Modern Sensor in Robotics: A review. In Proceedings of the 2024 Asia Pacific Conference on Innovation in Technology (APCIT), Mysore, India, 26–27 July 2024; pp. 1–5. [Google Scholar] [CrossRef]
- Martinez-Gomez, J.; Fernandez-Caballero, A.; Garcia-Varea, I.; Rodriguez, L.; Romero-Gonzalez, C. A Taxonomy of Vision Systems for Ground Mobile Robots. Int. J. Adv. Robot. Syst. 2014, 11, 111. [Google Scholar] [CrossRef]
- Bethel, C.L.; Salomon, K.; Murphy, R.R.; Burke, J.L. Survey of Psychophysiology Measurements Applied to Human-Robot Interaction. In Proceedings of the RO-MAN 2007-The 16th IEEE International Symposium on Robot and Human Interactive Communication, Jeju, Republic of Korea, 26–29 August 2007; pp. 732–737. [Google Scholar] [CrossRef]
- Hu, Y.; Abe, N.; Benallegue, M.; Yamanobe, N.; Venture, G.; Yoshida, E. Toward Active Physical Human–Robot Interaction: Quantifying the Human State During Interactions. IEEE Trans. Hum.-Mach. Syst. 2022, 52, 367–378. [Google Scholar] [CrossRef]
- Abdulazeem, N.; Hu, Y. Human Factors Considerations for Quantifiable Human States in Physical Human–Robot Interaction: A Literature Review. Sensors 2023, 23, 7381. [Google Scholar] [CrossRef]
- Rodrigues, P.B.; Singh, R.; Oytun, M.; Adami, P.; Woods, P.J.; Becerik-Gerber, B.; Soibelman, L.; Copur-Gencturk, Y.; Lucas, G.M. A multidimensional taxonomy for human-robot interaction in construction. Autom. Constr. 2023, 150, 104845. [Google Scholar] [CrossRef]
- Onnasch, L.; Roesler, E. A Taxonomy to Structure and Analyze Human–Robot Interaction. Int. J. Soc. Robot. 2021, 13, 833–849. [Google Scholar] [CrossRef]
- Fujioka, Y.; Liu, Y.; Kanda, T. I Need to Pass Through! Understandable Robot Behavior for Passing Interaction in Narrow Environment. In Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’24), Boulder, CO, USA, 11–15 March 2024; pp. 213–221. [Google Scholar] [CrossRef]
- Scales, P.; Aubergé, V.; Aycard, O. Inducing Social Perceptions of a Mobile Robot for Human-Aware Navigation. In Proceedings of the 2024 18th International Conference on Control, Automation, Robotics and Vision (ICARCV), Dubai, United Arab Emirates, 12–15 December 2024; pp. 530–536. [Google Scholar] [CrossRef]
- Karwowski, J.; Szynkiewicz, W.; Niewiadomska-Szynkiewicz, E. Bridging Requirements, Planning, and Evaluation: A Review of Social Robot Navigation. Sensors 2024, 24, 2794. [Google Scholar] [CrossRef]
- Shi, H.; Yu, W.; Madani, K. A Review on Social Awareness Navigation for Service Robots. In Proceedings of the Social Robotics, Naples, Italy, 10–12 September 2025; Ge, S.S., Luo, Z., Wang, Y., Samani, H., Ji, R., He, H., Eds.; Springer International Publishing: Singapore, 2025; pp. 143–152. [Google Scholar]
- Oruma, S.O.; Ayele, Y.Z.; Sechi, F.; Rødsethol, H. Security Aspects of Social Robots in Public Spaces: A Systematic Mapping Study. Sensors 2023, 23, 8056. [Google Scholar] [CrossRef] [PubMed]
- Karwowski, J.; Szynkiewicz, W. Quantitative Metrics for Benchmarking Human-Aware Robot Navigation. IEEE Access 2023, 11, 79941–79953. [Google Scholar] [CrossRef]
- Krägeloh, C.U.; Bharatharaj, J.; Sasthan Kutty, S.K.; Nirmala, P.R.; Huang, L. Questionnaires to Measure Acceptability of Social Robots: A Critical Review. Robotics 2019, 8, 88. [Google Scholar] [CrossRef]
- Doğan, S.; Çolak, A. Social robots in the instruction of social skills in autism: A comprehensive descriptive analysis of single-case experimental designs. Disabil. Rehabil. Assist. Technol. 2024, 19, 325–344. [Google Scholar] [CrossRef] [PubMed]
- Stower, R.; Calvo-Barajas, N.; Castellano, G.; Kappas, A. A Meta-analysis on Children’s Trust in Social Robots. Int. J. Soc. Robot. 2021, 13, 1979–2001. [Google Scholar] [CrossRef]
- Akalin, N.; Kristoffersson, A.; Loutfi, A. The Influence of Feedback Type in Robot-Assisted Training. Multimodal Technol. Interact. 2019, 3, 67. [Google Scholar] [CrossRef]
- Mirnig, N.; Tan, Y.K.; Chang, T.W.; Chua, Y.W.; Dung, T.A.; Li, H.; Tscheligi, M. Screen feedback in human-robot interaction: How to enhance robot expressiveness. In Proceedings of the 23rd IEEE International Symposium on Robot and Human Interactive Communication, Edinburgh, UK, 25–29 August 2014; pp. 224–230. [Google Scholar] [CrossRef]
- Babel, F.; Kraus, J.; Miller, L.; Kraus, M.; Wagner, N.; Minker, W.; Baumann, M. Small Talk with a Robot? The Impact of Dialog Content, Talk Initiative, and Gaze Behavior of a Social Robot on Trust, Acceptance, and Proximity. Int. J. Soc. Robot. 2021, 13, 1485–1498. [Google Scholar] [CrossRef]
- Maroto-Gómez, M.; Malfaz, M.; Castro-González, Á.; Salichs, M.Á. Deep Reinforcement Learning for the Autonomous Adaptive Behavior of Social Robots. In Proceedings of the Social Robotics, Florence, Italy, 13 December 2022; Cavallo, F., Cabibihan, J.J., Fiorini, L., Sorrentino, A., He, H., Liu, X., Matsumoto, Y., Ge, S.S., Eds.; Springer International Publishing: Cham, Switzerland, 2022; pp. 208–217. [Google Scholar]
- Maj, K.; Grzybowicz, P.; Kopeć, J. “No, I Won’t Do That.” Assertive Behavior of Robots and its Perception by Children. Int. J. Soc. Robot. 2024, 16, 1489–1507. [Google Scholar] [CrossRef]
- Chien, S.Y.; Lin, Y.L.; Lee, P.J.; Han, S.; Lewis, M.; Sycara, K. Attention allocation for human multi-robot control: Cognitive analysis based on behavior data and hidden states. Int. J. Hum.-Comput. Stud. 2018, 117, 30–44. [Google Scholar] [CrossRef]
- Xie, L.; Liu, C.; Li, D. Proactivity or passivity? An investigation of the effect of service robots’ proactive behaviour on customer co-creation intention. Int. J. Hosp. Manag. 2022, 106, 103271. [Google Scholar] [CrossRef]
- Adami, P.; Rodrigues, P.B.; Woods, P.J.; Becerik-Gerber, B.; Soibelman, L.; Copur-Gencturk, Y.; Lucas, G. Effectiveness of VR-based training on improving construction workers’ knowledge, skills, and safety behavior in robotic teleoperation. Adv. Eng. Inform. 2021, 50, 101431. [Google Scholar] [CrossRef]
- Shi, H.; Li, J.; Li, Z. A Distributed Strategy for Cooperative Autonomous Robots Using Pedestrian Behavior for Multi-Target Search in the Unknown Environment. Sensors 2020, 20, 1606. [Google Scholar] [CrossRef]
- Tan, Y.Z.; Tran, T.; Lin, S.; Zhao, F.; Ng, Y.S.; Ma, D.; Balan, R. How is our mobility affected as we age? Findings from a 934 users field study of older adults conducted in an urban Asian city. In Proceedings of the Behavior Transformation by IoT International Workshop, New York, NY, USA, 3–7 June 2024; Association for Computing Machinery: New York, NY, USA, 2024. [Google Scholar]
- Bobu, A.; Scobee, D.R.R.; Fisac, J.F.; Sastry, S.S.; Dragan, A.D. LESS is More: Rethinking Probabilistic Models of Human Behavior. In Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’20), Cambridge, UK, 23–26 March 2020; pp. 429–437. [Google Scholar] [CrossRef]
- Briggs, G.; Chita-Tegmark, M.; Krause, E.; Bridewell, W.; Bello, P.; Scheutz, M. A Novel Architectural Method for Producing Dynamic Gaze Behavior in Human-Robot Interactions. In Proceedings of the 2022 17th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Sapporo, Japan, 7–10 March 2022; pp. 383–392. [Google Scholar] [CrossRef]
- Ceha, J.; Chhibber, N.; Goh, J.; McDonald, C.; Oudeyer, P.Y.; Kulić, D.; Law, E. Expression of Curiosity in Social Robots: Design, Perception, and Effects on Behaviour. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19), Glasgow, Scotland, UK, 4–9 May 2019; pp. 1–12. [Google Scholar] [CrossRef]
- Chesser, M.; Chea, L.; Van Nguyen, H.; Ranasinghe, D.C. bTracked: Highly Accurate Field Deployable Real-Time Indoor Spatial Tracking for Human Behavior Observations. In Proceedings of the 15th EAI International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services (MobiQuitous ’18), New York, NY, USA, 5–7 November 2018; pp. 1–10. [Google Scholar] [CrossRef]
- Doering, M.; Kanda, T.; Ishiguro, H. Neural-network-based Memory for a Social Robot: Learning a Memory Model of Human Behavior from Data. J. Hum.-Robot Interact. 2019, 8, 1–27. [Google Scholar] [CrossRef]
- Dudzik, B.; Columbus, S.; Hrkalovic, T.M.; Balliet, D.; Hung, H. Recognizing Perceived Interdependence in Face-to-Face Negotiations through Multimodal Analysis of Nonverbal Behavior. In Proceedings of the 2021 International Conference on Multimodal Interaction (ICMI ’21), Montréal, QC, Canada, 18–22 October 2021; pp. 121–130. [Google Scholar] [CrossRef]
- Duvivier Victor Roger, Y.; Perreira da Silva, M.; Prié, Y. AI-Human Collaboration for in Situ Interactive Exploration of Behaviours From Immersive Environment. In Proceedings of the 2023 ACM International Conference on Interactive Media Experiences (IMX ’23), Nantes, France, 12–15 June 2023; pp. 427–430. [Google Scholar] [CrossRef]
- Elgarf, M.; Calvo-Barajas, N.; Paiva, A.; Castellano, G.; Peters, C. Reward Seeking or Loss Aversion? Impact of Regulatory Focus Theory on Emotional Induction in Children and Their Behavior Towards a Social Robot. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI ’21), Yokohama, Japan, 8–13 May 2021. [Google Scholar] [CrossRef]
- Erel, H.; Oberlender, A.; Khatib, J.; Freund, N.; Sadeh, O.; Waksberg, J.; Carsenti, E. The Power of Opening Encounters in HRI: How Initial Robotic Behavior Shapes the Interaction that Follows. In Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’24), Boulder, CO, USA, 11–15 March 2024; pp. 203–212. [Google Scholar] [CrossRef]
- Fischer, K. Tracking Anthropomorphizing Behavior in Human-Robot Interaction. J. Hum.-Robot Interact. 2021, 11, 1–28. [Google Scholar] [CrossRef]
- Gali-Perez, O.; Sayis, B.; Pares, N. Effectiveness of a Mixed Reality system in terms of social interaction behaviors in children with and without Autism Spectrum Condition. In Proceedings of the XXI International Conference on Human Computer Interaction (Interacción ’21), Málaga, Spain, 22–24 September 2021. [Google Scholar] [CrossRef]
- Gallo, D.; Gonzalez-Jimenez, S.; Grasso, M.A.; Boulard, C.; Colombino, T. Exploring Machine-like Behaviors for Socially Acceptable Robot Navigation in Elevators. In Proceedings of the 2022 17th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Sapporo, Japan, 7–10 March 2022; pp. 130–138. [Google Scholar] [CrossRef]
- Green, N.; Works, K. Measuring Users’ Attitudinal and Behavioral Responses to Persuasive Communication Techniques in Human Robot Interaction. In Proceedings of the 2022 17th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Sapporo, Japan, 7–10 March 2022; pp. 778–782. [Google Scholar] [CrossRef]
- Guzzi, J.; Giusti, A.; Gambardella, L.M.; Di Caro, G.A. A model of artificial emotions for behavior-modulation and implicit coordination in multi-robot systems. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO ’18), Lisbon, Portugal, 15–19 July 2018; pp. 21–28. [Google Scholar] [CrossRef]
- Han, Z.; Yanco, H. Communicating Missing Causal Information to Explain a Robot’s Past Behavior. J. Hum.-Robot Interact. 2023, 12, 1–45. [Google Scholar] [CrossRef]
- Hu, Y.; Feng, L.; Mutlu, B.; Admoni, H. Exploring the Role of Social Robot Behaviors in a Creative Activity. In Proceedings of the 2021 ACM Designing Interactive Systems Conference (DIS ’21), Virtual, 28 June–2 July 2021; pp. 1380–1389. [Google Scholar] [CrossRef]
- Ishino, T.; Goto, M.; Kashihara, A. A Robot for Reconstructing Presentation Behavior in Lecture. In Proceedings of the 6th International Conference on Human-Agent Interaction (HAI ’18), Southampton, UK, 15–18 December 2018; pp. 67–75. [Google Scholar] [CrossRef]
- Jamshad, R.; Haripriyan, A.; Sonti, A.; Simkins, S.; Riek, L.D. Taking Initiative in Human-Robot Action Teams: How Proactive Robot Behaviors Affect Teamwork. In Proceedings of the Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’24), Boulder, CO, USA, 11–15 March 2024; pp. 559–562. [Google Scholar] [CrossRef]
- Jelínek, M.; Nichols, E.; Gomez, R. Developing Autonomous Robot-Mediated Behavior Coaching Sessions with Haru. In Proceedings of the Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’24), Boulder, CO, USA, 11–15 March 2024; pp. 573–577. [Google Scholar] [CrossRef]
- Kubota, A.; Peterson, E.I.C.; Rajendren, V.; Kress-Gazit, H.; Riek, L.D. JESSIE: Synthesizing Social Robot Behaviors for Personalized Neurorehabilitation and Beyond. In Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’20), Cambridge, UK, 23–26 March 2020; pp. 121–130. [Google Scholar] [CrossRef]
- Kubota, A.; Riek, L.D. Behavior Adaptation for Robot-assisted Neurorehabilitation. In Proceedings of the Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’21 Companion), Boulder, CO, USA, 8–11 March 2021; pp. 565–567. [Google Scholar] [CrossRef]
- Leonardi, N.; Manca, M.; Paternò, F.; Santoro, C. Trigger-Action Programming for Personalising Humanoid Robot Behaviour. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19), Glasgow, Scotland, UK, 4–9 May 2019; pp. 1–13. [Google Scholar] [CrossRef]
- Lin, T.C.; Unni Krishnan, A.; Li, Z. Perception-Motion Coupling in Active Telepresence: Human Behavior and Teleoperation Interface Design. J. Hum.-Robot Interact. 2023, 12, 1–24. [Google Scholar] [CrossRef]
- Moorman, N.; Hedlund-Botti, E.; Schrum, M.; Natarajan, M.; Gombolay, M.C. Impacts of Robot Learning on User Attitude and Behavior. In Proceedings of the 2023 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’23), Stockholm, Sweden, 13–16 March 2023; pp. 534–543. [Google Scholar] [CrossRef]
- Moro, C.; Nejat, G.; Mihailidis, A. Learning and Personalizing Socially Assistive Robot Behaviors to Aid with Activities of Daily Living. J. Hum.-Robot Interact. 2018, 7, 1–25. [Google Scholar] [CrossRef]
- Olafsson, S.; O’Leary, T.; Bickmore, T. Coerced Change-talk with Conversational Agents Promotes Confidence in Behavior Change. In Proceedings of the 13th EAI International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth’19), Trento, Italy, 20–23 May 2019; pp. 31–40. [Google Scholar] [CrossRef]
- Parreira, M.T.; Gillet, S.; Vázquez, M.; Leite, I. Design Implications for Effective Robot Gaze Behaviors in Multiparty Interactions. In Proceedings of the 2022 17th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Sapporo, Japan, 7–10 March 2022; pp. 976–980. [Google Scholar] [CrossRef]
- Parreira, M.T.; Gillet, S.; Winkle, K.; Leite, I. How Did We Miss This? A Case Study on Unintended Biases in Robot Social Behavior. In Proceedings of the Companion of the 2023 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’23), Stockholm, Sweden, 13–16 March 2023; pp. 11–20. [Google Scholar] [CrossRef]
- Pöppel, J.; Kopp, S. Satisficing Models of Bayesian Theory of Mind for Explaining Behavior of Differently Uncertain Agents: Socially Interactive Agents Track. In Proceedings of the 17th International Conference on Autonomous Agents and MultiAgent Systems (AAMAS ’18), Stockholm, Sweden, 10–15 July 2018; pp. 470–478. [Google Scholar]
- Sharma, K.; Papavlasopoulou, S.; Giannakos, M.; Jaccheri, L. Kid Coding Games and Artistic Robots: Attitudes and Gaze Behavior. In Proceedings of the Conference on Creativity and Making in Education (FabLearn Europe’18), Trondheim, Norway, 18 June 2018; pp. 64–71. [Google Scholar] [CrossRef]
- Strohkorb Sebo, S.; Traeger, M.; Jung, M.; Scassellati, B. The Ripple Effects of Vulnerability: The Effects of a Robot’s Vulnerable Behavior on Trust in Human-Robot Teams. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’18), Chicago, IL, USA, 5–8 March 2018; pp. 178–186. [Google Scholar] [CrossRef]
- Sun, X. Conversational Interface cooperating with AI and Monitoring Technology adopting Human-in-the-Loop Interaction for Intelligent Behavioral Intervention. In Proceedings of the Companion Proceedings of the 28th International Conference on Intelligent User Interfaces (IUI ’23 Companion), Sydney, NSW, Australia, 27–31 March 2023; pp. 243–245. [Google Scholar] [CrossRef]
- Wieringa, M.S.; Müller, B.C.; Bijlstra, G.; Bosse, T. The Peg-Turning Dilemma: An Experimental Framework for Measuring Altruistic Behaviour Towards Robots. In Proceedings of the Companion of the 2023 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’23), Stockholm, Sweden, 13–16 March 2023; pp. 351–354. [Google Scholar] [CrossRef]
- Wintersberger, P.; Shahu, A.; Reisinger, J.; Alizadeh, F.; Michahelles, F. Self-Balancing Bicycles: Qualitative Assessment and Gaze Behavior Evaluation. In Proceedings of the 21st International Conference on Mobile and Ubiquitous Multimedia (MUM ’22), Lisbon, Portugal, 27–30 November 2022; pp. 189–199. [Google Scholar] [CrossRef]
- Yoon, S.; Kim, S.; Park, G.; Lim, H. Evaluating How Desktop Companion Robot Behaviors Influence Work Experience and Robot Perception. In Proceedings of the Extended Abstracts of the CHI Conference on Human Factors in Computing Systems (CHI EA ’24), Honolulu, HI, USA, 11–16 May 2024. [Google Scholar] [CrossRef]
- Zhou, J.; Hu, Y. Beyond Words: Infusing Conversational Agents with Human-like Typing Behaviors. In Proceedings of the 6th ACM Conference on Conversational User Interfaces (CUI ’24), Luxembourg, 8–10 July 2024. [Google Scholar] [CrossRef]
- Söderlund, M. Service robots and artificial morality: An examination of robot behavior that violates human privacy. J. Serv. Theory Pract. 2023, 33, 52–72. [Google Scholar] [CrossRef]
- Gong, T. The effect of service robots on employees’ customer service performance and service-oriented organizational citizenship behavior. J. Serv. Theory Pract. 2025, 35, 319–347. [Google Scholar] [CrossRef]
- Gong, T. The dark side of fairness: How perceived fairness in service robot implementation leads to employee dysfunctional behavior. J. Serv. Mark. 2025, 39, 347–364. [Google Scholar] [CrossRef]
- Lai, C.C.; Yang, B.J.; Lin, C.J. Applying Reinforcement Learning for AMR’s Docking and Obstacle Avoidance Behavior Control. Appl. Sci. 2025, 15, 3773. [Google Scholar] [CrossRef]
- Borsuk, A.; Chybicki, A.; Zieliński, M. Classification of User Behavior Patterns for Indoor Navigation Problem. Sensors 2025, 25, 4673. [Google Scholar] [CrossRef]
- He, C.; Duhan, T.; Tulsyan, P.; Kim, P.; Sartoretti, G. Social behavior as a key to learning-based multi-agent pathfinding dilemmas. Artif. Intell. 2025, 348, 104397. [Google Scholar] [CrossRef]
- Merino-Fidalgo, S.; Sánchez-Girón, C.; Zalama, E.; Gómez-García-Bermejo, J.; Duque-Domingo, J. Behavior tree generation and adaptation for a social robot control with LLMs. Robot. Auton. Syst. 2025, 194, 105165. [Google Scholar] [CrossRef]
- Axelsson, M.; Churamani, N.; Çaldır, A.; Gunes, H. Participant Perceptions of a Robotic Coach Conducting Positive Psychology Exercises: A Qualitative Analysis. J. Hum.-Robot Interact. 2025, 14, 1–27. [Google Scholar] [CrossRef]
- Wang, M.; Yu, K.; Zhang, Y.; Fan, M. Challenges in Adopting Companion Robots: An Exploratory Study of Robotic Companionship Conducted with Chinese Retirees. Proc. ACM Hum.-Comput. Interact. 2025, 9, 1–27. [Google Scholar] [CrossRef]
- Fiorini, L.; D’Onofrio, G.; Sorrentino, A.; Cornacchia Loizzo, F.G.; Russo, S.; Ciccone, F.; Giuliani, F.; Sancarlo, D.; Cavallo, F. The Role of Coherent Robot Behavior and Embodiment in Emotion Perception and Recognition During Human-Robot Interaction: Experimental Study. JMIR Hum. Factors 2024, 11, e45494. [Google Scholar] [CrossRef]
- Angelopoulos, G.; Rossi, A.; Napoli, C.D.; Rossi, S. You Are In My Way: Non-verbal Social Cues for Legible Robot Navigation Behaviors. In Proceedings of the 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Kyoto, Japan, 23–27 October 2022; pp. 657–662. [Google Scholar] [CrossRef]
- Yang, F.; Yin, W.; Björkman, M.; Peters, C. Impact of Trajectory Generation Methods on Viewer Perception of Robot Approaching Group Behaviors. In Proceedings of the 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Naples, Italy, 31 August–4 September 2020; pp. 509–516. [Google Scholar] [CrossRef]
- Chapa Sirithunge, H.P.; Bandara, H.M.R.T.; Jayasekara, A.G.B.P.; Chandima, D.P.; Abeykoon, H.M.H.S. A Study on Robot-Initiated Interaction: Toward Virtual Social Behavior. In Social Robots: Technological, Societal and Ethical Aspects of Human-Robot Interaction; Springer International Publishing: Cham, Switzerland, 2019; pp. 37–60. [Google Scholar] [CrossRef]
- Samarakoon, S.M.B.P.; Muthugala, M.A.V.J.; Jayasekara, A.G.B.P.; Elara, M.R. Adapting approaching proxemics of a service robot based on physical user behavior and user feedback. User Model.-User-Adapt. Interact. 2023, 33, 195–220. [Google Scholar] [CrossRef]
- Alcubilla Troughton, I.; Baraka, K.; Hindriks, K.; Bleeker, M. Robotic Improvisers: Rule-Based Improvisation and Emergent Behaviour in HRI. In Proceedings of the 2022 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’22), Sapporo, Japan, 7–10 March 2022; IEEE Press: New York, NY, USA, 2022; pp. 561–569. [Google Scholar]
- Dautzenberg, P.; Ladwig, S.; Rosenthal-von der Pütten, A.M. Follow Me: Anthropomorphic Appearance and Communication Impact Social Perception and Joint Navigation Behavior. In Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’24), Boulder, CO, USA, 11–15 March 2024; pp. 175–183. [Google Scholar] [CrossRef]
- Gallo, D.; Bioche, P.L.; Willamowski, J.K.; Colombino, T.; Gonzalez-Jimenez, S.; Poirier, H.; Boulard, C. Investigating the Integration of Human-Like and Machine-Like Robot Behaviors in a Shared Elevator Scenario. In Proceedings of the 2023 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’23), Stockholm, Sweden, 13–16 March 2023; pp. 192–201. [Google Scholar] [CrossRef]
- Jung, Y.; Jung, G.; Jeong, S.; Kim, C.; Woo, W.; Hong, H.; Lee, U. “Enjoy, but Moderately!”: Designing a Social Companion Robot for Social Engagement and Behavior Moderation in Solitary Drinking Context. Proc. ACM Hum.-Comput. Interact. 2023, 7, 1–24. [Google Scholar] [CrossRef]
- Karreman, D.E.; Ludden, G.D.S.; Evers, V. Beyond R2D2: Designing Multimodal Interaction Behavior for Robot-specific Morphology. J. Hum.-Robot Interact. 2019, 8, 1–32. [Google Scholar] [CrossRef]
- Kayukawa, S.; Ishihara, T.; Takagi, H.; Morishima, S.; Asakawa, C. Guiding Blind Pedestrians in Public Spaces by Understanding Walking Behavior of Nearby Pedestrians. In Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Virtual, 12–17 September 2020; Volume 4. [Google Scholar] [CrossRef]
- M. Faas, S.; Kraus, J.; Schoenhals, A.; Baumann, M. Calibrating Pedestrians’ Trust in Automated Vehicles: Does an Intent Display in an External HMI Support Trust Calibration and Safe Crossing Behavior? In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI ’21), Yokohama, Japan, 8–13 May 2021. [Google Scholar] [CrossRef]
- Mavrogiannis, C.; Hutchinson, A.M.; Macdonald, J.; Alves-Oliveira, P.; Knepper, R.A. Effects of distinct robot navigation strategies on human behavior in a crowded environment. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI ’19), Cambridge, UK, 23–26 March 2020; IEEE Press: New York, NY, USA, 2020; pp. 421–430. [Google Scholar]
- Moolchandani, P.; Hayes, C.J.; Marge, M. Evaluating Robot Behavior in Response to Natural Language. In Proceedings of the Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’18), Chicago, IL, USA, 5–8 March 2018; pp. 197–198. [Google Scholar] [CrossRef]
- Shen, S.; Tennent, H.; Claure, H.; Jung, M. My Telepresence, My Culture? An Intercultural Investigation of Telepresence Robot Operators’ Interpersonal Distance Behaviors. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI ’18), Montreal, QC, Canada, 21–26 April 2018; pp. 1–11. [Google Scholar] [CrossRef]
- Zojaji, S.; Matviienko, A.; Leite, I.; Peters, C. Join Me Here if You Will: Investigating Embodiment and Politeness Behaviors When Joining Small Groups of Humans, Robots, and Virtual Characters. In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems (CHI ’24), Honolulu, HI, USA, 11–16 May 2024. [Google Scholar] [CrossRef]
- Weatherwax, K.; Dooley, D.; Carstensdottir, E.; Takayama, L. The Case of the Curious Robot: On the Social Viability of Curious Behavior in Non-Human Agents. In Proceedings of the 24th ACM International Conference on Intelligent Virtual Agents (IVA ’24), Glasgow, UK, 16–19 September 2024. [Google Scholar] [CrossRef]
- Bretin, R.; Cross, E.S.; Khamis, M. Co-existing With a Drone: Using Virtual Reality to Investigate the Effect of the Drone’s Height and Cover Story on Proxemic Behaviours. In Proceedings of the Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems (CHI EA ’22), New Orleans, LA, USA, 29 April–5 May 2022. [Google Scholar] [CrossRef]
- Lingam, S.N.; Petermeijer, S.M.; Torre, I.; Bazilinskyy, P.; Ljungblad, S.; Martens, M. Behavioral Effects of a Delivery Drone on Feelings of Uncertainty: A Virtual Reality Experiment. J. Hum.-Robot Interact. 2025, 14, 1–27. [Google Scholar] [CrossRef]
- Prakash, V.G. Behavior-Aware Robot Navigation with Deep Reinforcement Learning. In Proceedings of the 2022 6th International Conference on Computation System and Information Technology for Sustainable Solutions (CSITSS), Bangalore, India, 21–23 December 2022; pp. 1–8. [Google Scholar] [CrossRef]
- Samsani, S.S.; Muhammad, M.S. Socially Compliant Robot Navigation in Crowded Environment by Human Behavior Resemblance Using Deep Reinforcement Learning. IEEE Robot. Autom. Lett. 2021, 6, 5223–5230. [Google Scholar] [CrossRef]
- Petrak, B.; Sopper, G.; Weitz, K.; André, E. Do You Mind if I Pass Through? Studying the Appropriate Robot Behavior when Traversing two Conversing People in a Hallway Setting. In Proceedings of the 2021 30th IEEE International Conference on Robot & Human Interactive Communication (RO-MAN), Vancouver, BC, Canada, 8–12 August 2021; pp. 369–375. [Google Scholar] [CrossRef]
- Sasiadek, J. Sensor fusion. Annu. Rev. Control 2002, 26, 203–228. [Google Scholar] [CrossRef]





















| Study | Keywords |
|---|---|
| [10] | affective-expressive-emotion-style, motion-movement-robot, expression-generation-behavior-design |
| [11] | (robot* OR cobot* OR drone*) AND (intent* OR intend*) (communicat* OR visual* OR feedback*) AND (motion* OR movement* OR interaction*) |
| [12] | (cult*) AND (adapt*) AND (behavio*) AND (model* OR system*) AND (robot*) (cult*) AND (adapt*) AND (cognitive) AND (model* OR architecture*) AND (robot*) affordance* AND (behavio*) AND (adapt*) AND (robot*) affordance* AND (cognitive) AND (model* OR architecture*) AND (robot*) fac* AND expression AND cognitive AND (model* OR architecture*) AND (robot*) fac* AND expression AND (behavio*) AND (model* OR system*) AND (robot*) cognitive AND robot* AND architecture* learning AND assistive AND robot* affective AND robot* AND behavio* empathy AND social AND robot* |
| Study | Robot | Locomotion | Drive System | Appearance | Commercialized |
|---|---|---|---|---|---|
| [115] | NR | NR | Omnidirectional | NR | NR |
| [116] | NR | NR | Omnidirectional | NR | NR |
| [97] | Pepper | Wheeled | Omnidirectional | Bio-inspired | ✓ |
| [117] | Viva | NR | NR | Bio-inspired | ✗ |
| [98] | Pepper | Wheeled | Omnidirectional | Bio-inspired | ✓ |
| [99] | Mirob | Wheeled | Differential | Functional | ✗ |
| [100] | Mirob | Wheeled | Differential | Functional | ✗ |
| [101] | Pepper | Wheeled | Omnidirectional | Bio-inspired | ✓ |
| [113] | Parrot | Rotary-wing | Multi-rotor | Functional | ✓ |
| [102] | NR | Wheeled | Rocker-bogie | Functional | NR |
| [32] | Robovie | Wheeled | Omnidirectional | Bio-inspired | ✗ |
| [103] | NR | Wheeled | NR | Functional | NR |
| [104] | Jetbot | Wheeled | Differential | Functional | ✗ |
| [106] | NR | Wheeled | Omnidirectional | Artifact | ✗ |
| [105] | Frog | Wheeled | Differential | Functional | ✗ |
| [107] | NR | Wheeled | Ackermann | Artifact | ✗ |
| [108] | Beam Pro | Wheeled | Differential | Functional | ✓ |
| [109] | Jackal | Wheeled | Differential | Functional | NR |
| [110] | Beam Pro | Wheeled | Differential | Functional | ✓ |
| [111] | Pepper | Wheeled | Omnidirectional | Bio-inspired | ✓ |
| [112] | Kuri | Wheeled | Differential | Bio-inspired | ✓ |
| [114] | NR | Rotary-wing | Multi-rotor | Functional | ✗ |
| Study | Environment Type | Location | Experimental Scenario |
|---|---|---|---|
| [115] | Simulation | Virtual environment | An area without obstacles |
| [116] | Simulation | Virtual environment | Crossing scenario |
| [97] | Real-world | University campus | Hallway setting |
| [117] | Simulation | Virtual environment | Corridor setting |
| [98] | Real-world | Research laboratory | Conversational group setting |
| [99] | Real-world | Research laboratory | An area of domestic activities |
| [100] | Real-world | Research laboratory | An area of domestic activities |
| [101] | Real-world | Research laboratory | A dance floor |
| [113] | Simulation | Virtual environment | An empty room |
| [102] | Real-world | Warehouse | An obstacle course |
| [32] | Real-world | Shopping mall | A hat store with narrow aisles |
| [103] | Real-world | Work office | An office with an elevator |
| [104] | Real-world | Research laboratory | A drinking experience at home |
| [105] | Real-world | Historical palace | An art exposition |
| [106] | Real-world | Work office | A moderately crowded scene |
| [107] | Simulation | Virtual environment | Crossing scenario |
| [108] | Real-world | Research laboratory | A moderately crowded scene |
| [109] | Simulation | Virtual environment | An area of domestic activities |
| [110] | Real-world | Work office | A hallway |
| [111] | Hybrid | Research laboratory | Conversational group setting |
| [112] | Simulation | Virtual environment | A mail delivery task |
| [114] | Simulation | Virtual environment | A medical package delivery task |
| Study | Statistical Test | Type | Description |
|---|---|---|---|
| [116] | Mann–Whitney U, Rank-Biserial Correlation, CLES | NP | Compare safety metrics between navigation strategies |
| [97] | Wilcoxon signed-rank, | NP | Compare responses across navigation cue conditions |
| [117] | Paired t-tests | P | Compare effects of robot interaction types on user perception |
| [98] | Repeated Measures ANOVA | P | Evaluate influence of control method, perspective, and approach direction on subjective responses |
| [99] | NR | NP | Assess agreement between system judgment and human annotations |
| [100] | t-test, Wilcoxon, | B | Compare satisfaction scores between learning and non-learning systems |
| [113] | ANOVA (2 × 3), Paired t-tests | P | Evaluate effects of drone height and distance |
| [102] | ANOVA (2 × 3, 2 × 4), t-test, | P | Evaluate effects of robot design and exposure level |
| [32] | test | NP | Evaluate hypothesis that situation-aware robot would pass aisles more often than baseline |
| [103] | Pairwise t-tests | P | Test questionnaire data for normality and homogeneity, then compare subscales across conditions |
| [105] | Mann–Whitney U | NP | Compare user responses and behavioral data between human-translated and robot-optimized conditions |
| [106] | Wilcoxon signed-rank | NP | Compare task completion times |
| [107] | One-way and Mixed ANOVA, Fisher’s exact | B | Compare group scale answers and test trust development |
| [108] | One-way ANOVA, Tukey’s HSD | P | Compare robot strategies on trajectory quality and user impressions |
| [110] | 2 × 2 ANOVA | P | Compare cultural effects on interpersonal distance |
| [111] | ART ANOVA, Spearman’s ρ | NP | Analyze how embodiment and politeness influence joining behavior |
| [112] | ANOVA | P | Test the effect of curiosity level and expectation priming |
| [114] | Two-way ANOVA | P | Test the effect of feelings of uncertainty during delivering |
| Study | Assumption Tests | Reliability and Power | Effect Sizes |
|---|---|---|---|
| [100] | NR | Cronbach’s α, Power analysis | Cohen’s d |
| [99] | NR | Cohen’s κ | NR |
| [102] | Welch’s t-test (variance correction) | NR | NR |
| [116] | NR | NR | RBC, CLES |
| [111] | Bonferroni correction | NR | Spearman’s ρ |
| [97] | NR | Cronbach’s α, Power analysis | NR |
| [117] | NR | NR | Cohen’s d |
| [103] | Jarque-Bera | NR | NR |
| [105] | NR | Cronbach’s α, Fleiss’ κ | NR |
| [110] | Bonferroni correction | NR | NR |
| Study | Participant’s Feedback | Description |
|---|---|---|
| [101] | Qualitative comments | Identified areas of improvement and cultural implications |
| [113] | Interview | Reflect on the robot’s behavior |
| [102] | Open-ended comments | Identified which configuration is better |
| [32] | Interview | Reflect on the robot’s behavior |
| [103] | Interview | Improve legibility of intent, communication of intent, priority, and positioning |
| [104] | Interview | Identified behavior suggestions |
| [105] | Interview | Reflect on the robot’s behavior and provide suggestions |
| [106] | Open-ended comments | Identified behavior and interface suggestions |
| [108] | Interview | Reflect on the robot’s behavior |
| [110] | Open-ended comments | Reflect on previous experiences |
| [111] | Questionnaire | Reflect on the overall experience |
| [112] | Open-ended comments | Identified behavior suggestions |
| [114] | Interview | Reflect on the robot’s behavior |

