Search Results (11)

Search Parameters:
Keywords = mixed-reality tele-operation

18 pages, 2944 KiB  
Article
The Teleoperation of Robot Arms by Interacting with an Object’s Digital Twin in a Mixed Reality Environment
by Yan Wu, Bin Zhao and Qi Li
Appl. Sci. 2025, 15(7), 3549; https://doi.org/10.3390/app15073549 - 24 Mar 2025
Viewed by 842
Abstract
The teleoperation of robot arms can keep users out of hazardous environments, but current teleoperation typically relies on a 2D display and direct control of the robot arm's end effector, which leads to a limited view and complex operations. In this study, a teleoperation method for robot arms is proposed that controls the robot arm through interaction with the digital twins of objects. Based on the objects in the workspace, this method generates a virtual scene containing their digital twins. Users can observe the virtual scene from any direction and move the digital twins of the objects at will to control the robot arm. This study compared the proposed method with the traditional method, which uses a 2D display and a game controller, in a pick-and-place task. The proposed method achieved 45% lower NASA-TLX scores and 31% higher SUS scores than the traditional method. The results indicate that the proposed method reduces workload and improves the usability of teleoperation.
(This article belongs to the Section Computing and Artificial Intelligence)
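For readers unfamiliar with the SUS figure cited in this abstract, the System Usability Scale uses a fixed scoring rule (the standard published formula, not a detail from this article): odd-numbered items contribute their score minus 1, even-numbered items contribute 5 minus their score, and the sum is multiplied by 2.5 to yield a 0-100 score.

```python
def sus_score(responses):
    """Standard System Usability Scale scoring.

    responses: 10 Likert-scale answers, each 1-5, in questionnaire order.
    Odd-numbered items are positively worded (score - 1), even-numbered
    items are negatively worded (5 - score); the total is scaled by 2.5.
    """
    assert len(responses) == 10, "SUS has exactly 10 items"
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Usage: a best-possible response pattern scores 100; all-neutral scores 50.
best = sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1])
neutral = sus_score([3] * 10)
```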

24 pages, 46034 KiB  
Article
Immersive Robot Teleoperation Based on User Gestures in Mixed Reality Space
by Hibiki Esaki and Kosuke Sekiyama
Sensors 2024, 24(15), 5073; https://doi.org/10.3390/s24155073 - 5 Aug 2024
Cited by 1 | Viewed by 1903
Abstract
Recently, research has been conducted on mixed reality (MR), which provides immersive visualization and interaction experiences, and on mapping human motions directly onto a robot in MR space to achieve a high level of immersion. However, even when the robot is mapped into the MR space, its surrounding environment is often not mapped sufficiently; this makes it difficult to comfortably perform tasks that require precise manipulation of objects that are hard to see from the human perspective. Therefore, we propose a system that allows users to operate a robot in real space by mapping the task environment around the robot onto the MR space and performing operations within the MR space.

28 pages, 3902 KiB  
Review
Integrating Virtual, Mixed, and Augmented Reality into Remote Robotic Applications: A Brief Review of Extended Reality-Enhanced Robotic Systems for Intuitive Telemanipulation and Telemanufacturing Tasks in Hazardous Conditions
by Yun-Peng Su, Xiao-Qi Chen, Cong Zhou, Lui Holder Pearson, Christopher G. Pretty and J. Geoffrey Chase
Appl. Sci. 2023, 13(22), 12129; https://doi.org/10.3390/app132212129 - 8 Nov 2023
Cited by 24 | Viewed by 8154
Abstract
There is an increasingly urgent need for humans to interactively control robotic systems that perform ever more precise remote operations, concomitant with the rapid development of space exploration, deep-sea discovery, nuclear rehabilitation and management, and robotic-assisted medical devices. The potential high value of medical telerobotic applications was also evident during the recent coronavirus pandemic and will grow in the future. Robotic teleoperation satisfies the demands of scenarios in which human access carries measurable risk but human intelligence is required. An effective teleoperation system not only enables intuitive human-robot interaction (HRI) but also ensures the robot can be operated in a way that allows the operator to experience the "feel" of the robot working on the remote side, gaining a "sense of presence". Extended reality (XR) technology integrates real-world information with computer-generated graphics and has the potential to enhance the effectiveness and performance of HRI by providing depth perception and enabling judgment and decision making while operating the robot in a dynamic environment. This review examines novel approaches to the development and evaluation of an XR-enhanced telerobotic platform for intuitive remote teleoperation applications in dangerous and difficult working conditions. It presents a broad review of XR-enhanced telerobotics for remote robotic applications; a particular focus is the use of integrated 2D/3D mixed reality with haptic interfaces to perform intuitive remote operations that remove humans from dangerous conditions. This review also covers primary studies proposing Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) solutions where humans can better control or interact with real robotic platforms using these devices and systems to extend the user's reality and provide a more intuitive interface. The objective of this article is to present recent, relevant, common, and accessible frameworks implemented in research articles published on XR-enhanced telerobotics for industrial applications. Finally, we present and classify the application context of the reviewed articles in two groups: mixed reality-enhanced robotic telemanipulation and mixed reality-enhanced robotic tele-welding. The review thus addresses all elements in the state of the art for these systems and ends with recommended research areas and targets. The application range of these systems and the resulting recommendations are readily extensible to other application areas, such as remote robotic surgery in telemedicine, where surgeons are scarce and need is high, and other potentially high-risk/high-need scenarios.
(This article belongs to the Special Issue Extended Reality Applications in Industrial Systems)

14 pages, 7850 KiB  
Article
A Mixed-Reality-Based Unknown Space Navigation Method of a Flexible Manipulator
by Ronghui Chen, Xiaojun Zhu, Zhang Chen, Yu Tian, Lunfei Liang and Xueqian Wang
Sensors 2023, 23(8), 3840; https://doi.org/10.3390/s23083840 - 9 Apr 2023
Viewed by 2505
Abstract
A hyper-redundant flexible manipulator is characterized by a high number of degrees of freedom (DoF), flexibility, and environmental adaptability. It has been used for missions in complex and unknown spaces, such as debris rescue and pipeline inspection, where the manipulator is not intelligent enough to handle complex situations on its own. Therefore, human intervention is required to assist in decision-making and control. In this paper, we designed an interactive navigation method based on mixed reality (MR) for a hyper-redundant flexible manipulator in an unknown space. A novel teleoperation system framework is put forward. An MR-based interface was developed to provide a virtual model of the remote workspace and a virtual interactive interface, allowing the operator to observe the real-time situation from a third-person perspective and issue commands to the manipulator. For environmental modeling, a simultaneous localization and mapping (SLAM) algorithm based on an RGB-D camera is applied. Additionally, a path-finding and obstacle-avoidance method based on an artificial potential field (APF) is introduced to ensure that the manipulator can move automatically in the remote space, following operator commands, without collision. The results of the simulations and experiments validate that the system exhibits good real-time performance, accuracy, security, and user-friendliness.
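The artificial potential field approach named in this abstract can be illustrated with a minimal 2D sketch (a generic textbook formulation; all parameter values and function names are illustrative, not taken from the paper's implementation): the goal exerts an attractive force, obstacles within an influence radius exert repulsive forces, and the robot follows the capped net force.

```python
import numpy as np

def apf_step(q, goal, obstacles, k_att=1.0, k_rep=1.0, rho0=1.0,
             step=0.02, f_max=2.0):
    """One gradient-descent step on an artificial potential field.

    The attractive term pulls q toward the goal; each obstacle closer
    than the influence radius rho0 adds a repulsive term. The net force
    is capped at f_max so a single step never jumps too far.
    """
    q, goal = np.asarray(q, dtype=float), np.asarray(goal, dtype=float)
    force = k_att * (goal - q)        # -grad of 0.5 * k_att * ||q - goal||^2
    for obs in obstacles:
        diff = q - np.asarray(obs, dtype=float)
        rho = np.linalg.norm(diff)
        if 0.0 < rho < rho0:          # repulsion acts only inside rho0
            force += k_rep * (1.0 / rho - 1.0 / rho0) / rho**3 * diff
    norm = np.linalg.norm(force)
    if norm > f_max:
        force *= f_max / norm
    return q + step * force

# Usage sketch: walk from the origin to (2, 2), skirting one obstacle.
q = np.array([0.0, 0.0])
for _ in range(5000):
    q = apf_step(q, goal=[2.0, 2.0], obstacles=[[1.0, 0.5]])
```

A known limitation of plain APF (and one reason real systems add a planner on top) is that attractive and repulsive forces can cancel at local minima away from the goal.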

18 pages, 18974 KiB  
Article
A 3D-Printed Soft Haptic Device with Built-in Force Sensing Delivering Bio-Mimicked Feedback
by Rahim Mutlu, Dilpreet Singh, Charbel Tawk and Emre Sariyildiz
Biomimetics 2023, 8(1), 127; https://doi.org/10.3390/biomimetics8010127 - 22 Mar 2023
Cited by 6 | Viewed by 5172
Abstract
Haptics plays a significant role not only in the rehabilitation of neurological disorders, such as stroke, by substituting necessary cognitive information but also in human–computer interfaces (HCIs), which are now an integral part of the recently launched metaverse. This study proposes a unique, soft, monolithic haptic feedback device (SoHapS) that was directly manufactured using a low-cost and open-source fused deposition modeling (FDM) 3D printer by employing a combination of soft conductive and nonconductive thermoplastic polyurethane (TPU) materials (NinjaTek, USA). SoHapS consists of a soft bellow actuator and a soft resistive force sensor, which are optimized using finite element modeling (FEM). SoHapS was characterized both mechanically and electrically to assess its performance, and a dynamic model was developed to predict its force output with given pressure inputs. We demonstrated the efficacy of SoHapS in substituting biofeedback with tactile feedback, such as gripping force, and proprioceptive feedback, such as finger flexion–extension positions, in the context of teleoperation. With its intrinsic properties, SoHapS can be integrated into rehabilitation robots and robotic prostheses, as well as augmented, virtual, and mixed reality (AR/VR/MR) systems, to induce various types of bio-mimicked feedback.
(This article belongs to the Special Issue Biorobotics)

18 pages, 1554 KiB  
Article
Integrating Virtual, Mixed, and Augmented Reality to Human–Robot Interaction Applications Using Game Engines: A Brief Review of Accessible Software Tools and Frameworks
by Enrique Coronado, Shunki Itadera and Ixchel G. Ramirez-Alpizar
Appl. Sci. 2023, 13(3), 1292; https://doi.org/10.3390/app13031292 - 18 Jan 2023
Cited by 43 | Viewed by 13212
Abstract
This article identifies and summarizes software tools and frameworks proposed in the Human–Robot Interaction (HRI) literature for developing extended reality (XR) experiences using game engines. This review includes primary studies proposing Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) solutions where humans can control or interact with real robotic platforms using devices that extend the user’s reality. The objective of this article is not to present an extensive list of applications and tools. Instead, we present recent, relevant, common, and accessible frameworks and software tools implemented in research articles published in high-impact robotics conferences and journals. For this, we searched papers published during a seven-year period, between 2015 and 2022, in relevant databases for robotics (Science Direct, IEEE Xplore, ACM Digital Library, Springer Link, and Web of Science). Additionally, we present and classify the application context of the reviewed articles in four groups: social robotics, programming of industrial robots, teleoperation of industrial robots, and Human–Robot Collaboration (HRC).
(This article belongs to the Special Issue Advanced Human-Robot Interaction)

16 pages, 28468 KiB  
Article
A Mixed-Reality Tele-Operation Method for High-Level Control of a Legged-Manipulator Robot
by Christyan Cruz Ulloa, David Domínguez, Jaime Del Cerro and Antonio Barrientos
Sensors 2022, 22(21), 8146; https://doi.org/10.3390/s22218146 - 24 Oct 2022
Cited by 20 | Viewed by 4543
Abstract
In recent years, legged (quadruped) robots have been the subject of technological study and continuous development. These robots have a leading role in applications that require high mobility skills in complex terrain, as is the case in Search and Rescue (SAR). They stand out for their ability to adapt to different terrains, overcome obstacles, and move within unstructured environments. Most recently developed implementations focus on data collection with sensors such as LiDAR or cameras. This work seeks to integrate a 6-DoF arm manipulator into the quadruped robot ARTU-R (A1 Rescue Tasks UPM Robot) by Unitree to perform manipulation tasks in SAR environments. The main contribution of this work is the high-level control of the robotic set (legged robot + manipulator) using Mixed Reality (MR). For the implementation, an optimization phase of the robotic set's workspace was first carried out in Matlab, followed by a simulation phase in Gazebo to verify the dynamic functionality of the set in reconstructed environments. The first and second generations of HoloLens glasses were used and contrasted with a conventional interface to develop the MR control part of the proposed method. Manipulations of first-aid equipment were carried out to evaluate the proposed method. The main results show that the proposed method allows better control of the robotic set than conventional interfaces, improving operator efficiency in robotic handling tasks and increasing confidence in decision-making. In addition, HoloLens 2 showed a better user experience in terms of graphics and latency.
(This article belongs to the Special Issue Sensors for Robotic Applications in Europe)

17 pages, 2775 KiB  
Article
Mixed-Reality-Enhanced Human–Robot Interaction with an Imitation-Based Mapping Approach for Intuitive Teleoperation of a Robotic Arm-Hand System
by Yun-Peng Su, Xiao-Qi Chen, Tony Zhou, Christopher Pretty and Geoffrey Chase
Appl. Sci. 2022, 12(9), 4740; https://doi.org/10.3390/app12094740 - 8 May 2022
Cited by 27 | Viewed by 9748
Abstract
This paper presents an integrated motion-mapping and visualization scheme based on a Mixed Reality (MR) subspace approach for the intuitive and immersive telemanipulation of robotic arm-hand systems. The effectiveness of different control-feedback methods for the teleoperation system is validated and compared. The robotic arm-hand system consists of a 6-Degrees-of-Freedom (DOF) industrial manipulator and a low-cost 2-finger gripper, which can be manipulated in a natural manner by novice users physically distant from the working site. By incorporating MR technology, the user is fully immersed in a virtual operating space augmented by real-time 3D visual feedback from the robot working site. Imitation-based, velocity-centric motion mapping is implemented via the MR subspace to accurately track operator hand movements for robot motion control, and it enables spatial velocity-based control of the robot Tool Center Point (TCP). The user control space and robot working space are overlaid through the MR subspace, and the local user and a digital twin of the remote robot share the same environment in the MR subspace. The MR-based motion and visualization mapping scheme for telerobotics is compared to conventional 2D Baseline and MR tele-control paradigms over two tabletop object manipulation experiments. A user survey of 24 participants was conducted to demonstrate the effectiveness and performance enhancements enabled by the proposed system. The MR-subspace-integrated 3D motion and visualization mapping scheme reduced the aggregate task completion time by 48% compared to the 2D Baseline module and by 29% compared to the MR SpaceMouse module. The perceived workload decreased by 32% and 22% compared to the 2D Baseline and MR SpaceMouse approaches, respectively.
(This article belongs to the Topic Virtual Reality, Digital Twins, the Metaverse)
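The velocity-centric motion mapping described in the abstract above can be sketched in its simplest form: differentiate the tracked hand position, scale the result by a motion gain into a TCP velocity command, and clamp the commanded speed (a generic sketch with hypothetical names and gains, not the authors' implementation):

```python
import numpy as np

def map_hand_to_tcp_velocity(hand_pos, prev_hand_pos, dt, gain=1.5, v_max=0.25):
    """Velocity-centric mapping sketch: differentiate the tracked hand
    position, scale it by a motion gain, and clamp the resulting TCP
    (tool center point) speed for safety. Names and gains are hypothetical.
    """
    v_hand = (np.asarray(hand_pos, dtype=float)
              - np.asarray(prev_hand_pos, dtype=float)) / dt
    v_tcp = gain * v_hand                 # amplify operator motion
    speed = np.linalg.norm(v_tcp)
    if speed > v_max:                     # never command more than v_max m/s
        v_tcp *= v_max / speed
    return v_tcp

# Usage: a 1 cm hand move over 0.1 s maps to a 0.15 m/s TCP command.
cmd = map_hand_to_tcp_velocity([0.01, 0.0, 0.0], [0.0, 0.0, 0.0], dt=0.1)
```

Commanding velocity rather than absolute pose is what lets the operator's control space and the robot's workspace differ in scale and origin, as the abstract's overlaid-subspace description implies.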

11 pages, 745 KiB  
Article
Does One Size Fit All? A Case Study to Discuss Findings of an Augmented Hands-Free Robot Teleoperation Concept for People with and without Motor Disabilities
by Stephanie Arévalo Arboleda, Marvin Becker and Jens Gerken
Technologies 2022, 10(1), 4; https://doi.org/10.3390/technologies10010004 - 6 Jan 2022
Cited by 4 | Viewed by 3446
Abstract
Hands-free robot teleoperation and augmented reality have the potential to create an inclusive environment for people with motor disabilities, allowing them to teleoperate robotic arms to manipulate objects. However, the experiences evoked by the same teleoperation concept and augmented reality can vary significantly for people with motor disabilities compared to those without disabilities. In this paper, we report the experiences of Miss L., a person with multiple sclerosis, when teleoperating a robotic arm in a hands-free multimodal manner using a virtual menu and visual hints presented through the Microsoft HoloLens 2. We discuss our findings and compare her experiences to those of people without disabilities using the same teleoperation concept. Additionally, we present three learning points drawn from comparing these experiences: re-evaluating the metrics used to measure performance, being aware of bias, and considering variability in abilities, which evokes different experiences. We believe these learning points can be extrapolated to carrying out human–robot interaction evaluations with mixed groups of participants with and without disabilities.
(This article belongs to the Collection Selected Papers from the PETRA Conference Series)

17 pages, 3878 KiB  
Article
Mixed Reality-Enhanced Intuitive Teleoperation with Hybrid Virtual Fixtures for Intelligent Robotic Welding
by Yun-Peng Su, Xiao-Qi Chen, Tony Zhou, Christopher Pretty and Geoffrey Chase
Appl. Sci. 2021, 11(23), 11280; https://doi.org/10.3390/app112311280 - 29 Nov 2021
Cited by 23 | Viewed by 4946
Abstract
This paper presents an integrated scheme based on a mixed reality (MR) and haptic feedback approach for intuitive and immersive teleoperation of robotic welding systems. By incorporating MR technology, the user is fully immersed in a virtual operating space augmented by real-time visual feedback from the robot working space. The proposed robotic tele-welding system features imitative motion mapping from the user’s hand movements to the welding robot motions, and it enables spatial velocity-based control of the robot tool center point (TCP). The proposed mixed reality virtual fixture (MRVF) integration approach implements hybrid haptic constraints to guide the operator’s hand movements along the conical guidance, effectively aligning the welding torch for welding and constraining the welding operation within a collision-free area. Onsite welding and tele-welding experiments identify the operational differences between professional and unskilled welders and demonstrate the effectiveness of the proposed MRVF tele-welding framework for novice welders. The MRVF-integrated visual/haptic tele-welding scheme reduced torch alignment times by 56% and 60% compared to the MRnoVF and baseline cases, respectively, with minimized cognitive workload and optimal usability. The MRVF scheme effectively stabilized welders’ hand movements and eliminated undesirable collisions while generating smooth welds.
(This article belongs to the Topic Extended Reality (XR): AR, VR, MR and Beyond)
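The conical guidance constraint mentioned in the abstract above can be illustrated geometrically: a commanded direction that falls outside a cone around the desired approach axis is rotated back onto the cone boundary (an illustrative sketch only; the function name and the choice to preserve command magnitude are assumptions, not details of the paper's haptic implementation):

```python
import numpy as np

def constrain_to_cone(v, axis, half_angle):
    """Conical virtual-fixture sketch: if the commanded vector v lies
    outside a cone of the given half-angle (radians) around `axis`,
    rotate it onto the cone boundary, preserving its magnitude.
    """
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    v = np.asarray(v, dtype=float)
    speed = np.linalg.norm(v)
    if speed == 0.0:
        return v
    if np.dot(v, axis) / speed >= np.cos(half_angle):
        return v                              # already inside the cone
    lateral = v - np.dot(v, axis) * axis      # component orthogonal to the axis
    l_norm = np.linalg.norm(lateral)
    if l_norm == 0.0:                         # exactly anti-parallel: snap to axis
        return speed * axis
    u = lateral / l_norm
    return speed * (np.cos(half_angle) * axis + np.sin(half_angle) * u)

# Usage: a command 60 degrees off-axis is pulled back to the 30-degree boundary.
out = constrain_to_cone([np.sin(np.radians(60)), 0.0, np.cos(np.radians(60))],
                        [0.0, 0.0, 1.0], np.radians(30))
```

In a haptic rendering loop, the difference between the raw and constrained commands would typically drive a restoring force on the stylus rather than a hard clamp.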

17 pages, 29746 KiB  
Article
Design of a Hyper-Redundant Robot and Teleoperation Using Mixed Reality for Inspection Tasks
by Andrés Martín-Barrio, Juan Jesús Roldán-Gómez, Iván Rodríguez, Jaime del Cerro and Antonio Barrientos
Sensors 2020, 20(8), 2181; https://doi.org/10.3390/s20082181 - 12 Apr 2020
Cited by 26 | Viewed by 7029
Abstract
Hyper-redundant robots are highly articulated devices that present numerous technical challenges in their design, control, and remote operation. However, they offer superior kinematic skills compared to traditional robots for multiple applications. This work proposes an original, custom-made design for a discrete hyper-redundant manipulator. It comprises seven cable-actuated sections with 14 degrees of freedom and has been optimized to be robust, accurate, and capable of moving payloads with high dexterity. Furthermore, it is efficiently controlled at every level, from the actuators up to high-level strategies based on the management of its shape. However, such highly articulated systems often exhibit complex shapes that complicate their spatial understanding. Immersive technologies emerge as a good solution for remotely and safely teleoperating the presented robot in an inspection task in a hazardous environment. Experimental results validate the proposed robot design and control strategies. It is concluded that hyper-redundant robots and immersive technologies should play an important role in the near future of automated and remote applications.
(This article belongs to the Collection Robotics, Sensors and Industry 4.0)
