Search Results (32)

Search Parameters:
Keywords = intuitive teleoperation

32 pages, 6323 KiB  
Article
Design, Implementation and Evaluation of an Immersive Teleoperation Interface for Human-Centered Autonomous Driving
by Irene Bouzón, Jimena Pascual, Cayetana Costales, Aser Crespo, Covadonga Cima and David Melendi
Sensors 2025, 25(15), 4679; https://doi.org/10.3390/s25154679 - 29 Jul 2025
Viewed by 315
Abstract
As autonomous driving technologies advance, the need for human-in-the-loop systems becomes increasingly critical to ensure safety, adaptability, and public confidence. This paper presents the design and evaluation of a context-aware immersive teleoperation interface that integrates real-time simulation, virtual reality, and multimodal feedback to support remote interventions in emergency scenarios. Built on a modular ROS2 architecture, the system allows seamless transition between simulated and physical platforms, enabling safe and reproducible testing. The experimental results show a high task success rate and user satisfaction, highlighting the importance of intuitive controls, gesture recognition accuracy, and low-latency feedback. Our findings contribute to the understanding of human-robot interaction (HRI) in immersive teleoperation contexts and provide insights into the role of multisensory feedback and control modalities in building trust and situational awareness for remote operators. Ultimately, this approach is intended to support the broader acceptability of autonomous driving technologies by enhancing human supervision, control, and confidence.
(This article belongs to the Special Issue Human-Centred Smart Manufacturing - Industry 5.0)

23 pages, 3542 KiB  
Article
An Intuitive and Efficient Teleoperation Human–Robot Interface Based on a Wearable Myoelectric Armband
by Long Wang, Zhangyi Chen, Songyuan Han, Yao Luo, Xiaoling Li and Yang Liu
Biomimetics 2025, 10(7), 464; https://doi.org/10.3390/biomimetics10070464 - 15 Jul 2025
Viewed by 307
Abstract
Although artificial intelligence technologies have significantly enhanced autonomous robots’ capabilities in perception, decision-making, and planning, their autonomy may still fail when faced with complex, dynamic, or unpredictable environments. Therefore, it is critical to enable users to take over robot control in real-time and efficiently through teleoperation. The lightweight, wearable myoelectric armband, due to its portability and environmental robustness, provides a natural human–robot gesture interaction interface. However, current myoelectric teleoperation gesture control faces two major challenges: (1) poor intuitiveness due to visual-motor misalignment; and (2) low efficiency from discrete, single-degree-of-freedom control modes. To address these challenges, this study proposes an integrated myoelectric teleoperation interface. The interface integrates the following: (1) a novel hybrid reference frame aimed at effectively mitigating visual-motor misalignment; and (2) a finite state machine (FSM)-based control logic designed to enhance control efficiency and smoothness. Four experimental tasks were designed using different end-effectors (gripper/dexterous hand) and camera viewpoints (front/side view). Compared to benchmark methods, the proposed interface demonstrates significant advantages in task completion time, movement path efficiency, and subjective workload. This work demonstrates the potential of the proposed interface to significantly advance the practical application of wearable myoelectric sensors in human–robot interaction.
(This article belongs to the Special Issue Intelligent Human–Robot Interaction: 4th Edition)
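FSM-based control logic of the kind this abstract describes maps discrete gesture events to teleoperation modes. The sketch below is purely illustrative: the state names, gesture labels, and transition table are assumptions, not the paper's actual design.

```python
# Minimal finite-state-machine sketch for gesture-driven teleoperation modes.
# States, gestures, and transitions are hypothetical examples.

TRANSITIONS = {
    ("IDLE", "fist"): "TRANSLATE",
    ("TRANSLATE", "fist"): "ROTATE",
    ("ROTATE", "fist"): "IDLE",
    ("IDLE", "pinch"): "GRIP",
    ("GRIP", "open_hand"): "IDLE",
}

class TeleopFSM:
    def __init__(self):
        self.state = "IDLE"

    def on_gesture(self, gesture):
        # Unknown (state, gesture) pairs leave the state unchanged,
        # so a spurious gesture classification cannot switch modes.
        self.state = TRANSITIONS.get((self.state, gesture), self.state)
        return self.state
```

Making every unlisted transition a no-op is one common way such interfaces stay robust to misclassified gestures.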

28 pages, 2221 KiB  
Article
Navigating the Last Mile: A Stakeholder Analysis of Delivery Robot Teleoperation
by Avishag Boker, Einat Grimberg, Felix Tener and Joel Lanir
Sustainability 2025, 17(13), 5925; https://doi.org/10.3390/su17135925 - 27 Jun 2025
Viewed by 559
Abstract
The market share of Last-Mile Delivery Robots (LMDRs) has grown rapidly over the past few years. These robots are mostly autonomous and supported remotely by human operators. As part of a broader shift toward sustainable urban logistics, LMDRs are seen as a promising low-emission alternative to conventional delivery vehicles. While there is a large body of literature about the technology, little is known about the real-world experiences of operating these robots. This study investigates the operational challenges faced by remote operators (ROs) of LMDRs, aiming to enhance their efficiency and safety. Through interviews with industry professionals, we explore the scenarios requiring human intervention, the strategies employed by ROs, and the unique challenges they encounter. Our findings not only identify key intervention scenarios but also provide a thorough examination of the teleoperation ecosystem, operational workflows, and how they affect the ways the ROs manage their interactions with robots. We found that ROs’ involvement varies from monitoring to active intervention to support the robots in completing their tasks when they face connectivity issues, blocked routes, and various other interruptions on their journeys. The findings highlight the importance of intuitive user interfaces (UIs) and decision-support systems to reduce cognitive load and improve situational awareness. This research contributes to the literature by offering a detailed examination of real-world teleoperation practices and focusing on the human factors influencing LMDR scalability, sustainability, and integration into future-ready logistics systems.

17 pages, 1929 KiB  
Article
Bio-Signal-Guided Robot Adaptive Stiffness Learning via Human-Teleoperated Demonstrations
by Wei Xia, Zhiwei Liao, Zongxin Lu and Ligang Yao
Biomimetics 2025, 10(6), 399; https://doi.org/10.3390/biomimetics10060399 - 13 Jun 2025
Viewed by 489
Abstract
Robot learning from human demonstration pioneers an effective mapping paradigm for endowing robots with human-like operational capabilities. This paper proposes a bio-signal-guided robot adaptive stiffness learning framework grounded in the conclusion that muscle activation of the human arm is positively correlated with the endpoint stiffness. First, we propose a human-teleoperated demonstration platform enabling real-time modulation of robot end-effector stiffness by human tutors during operational tasks. Second, we develop a dual-stage probabilistic modeling architecture employing the Gaussian mixture model and Gaussian mixture regression to model the temporal–motion correlation and the motion–sEMG relationship, successively. Third, a real-world experiment was conducted to validate the effectiveness of the proposed skill transfer framework, demonstrating that the robot achieves online adaptation of Cartesian impedance characteristics in contact-rich tasks. This paper provides a simple and intuitive way to plan the Cartesian impedance parameters, transcending the classical method that requires complex human arm endpoint stiffness identification before human demonstration or compensation for the difference in human–robot operational effects after human demonstration.
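The GMM/GMR pipeline the abstract mentions conditions a joint Gaussian mixture on an input variable (e.g., time) to regress an output (e.g., motion or stiffness). Below is a minimal one-input, one-output sketch; the mixture parameters are hand-specified for illustration, not fitted to any data from the paper.

```python
import numpy as np

def gmr(t, weights, means, covs):
    """Gaussian mixture regression: E[x | t] for a joint GMM over (t, x).

    means[k] = [mu_t, mu_x]; covs[k] is the 2x2 joint covariance of component k.
    """
    # Responsibility of each component given the input t
    h = np.array([
        w * np.exp(-0.5 * (t - m[0]) ** 2 / c[0, 0]) / np.sqrt(2 * np.pi * c[0, 0])
        for w, m, c in zip(weights, means, covs)
    ])
    h /= h.sum()
    # Conditional mean of x given t for each component
    cond = np.array([m[1] + c[1, 0] / c[0, 0] * (t - m[0])
                     for m, c in zip(means, covs)])
    # Blend component predictions by responsibility
    return float(h @ cond)
```

In a full pipeline the weights, means, and covariances would come from fitting a GMM to demonstration data (e.g., with an EM implementation), and the same conditioning rule extends to vector-valued inputs and outputs.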

19 pages, 28961 KiB  
Article
Human-like Dexterous Grasping Through Reinforcement Learning and Multimodal Perception
by Wen Qi, Haoyu Fan, Cankun Zheng, Hang Su and Samer Alfayad
Biomimetics 2025, 10(3), 186; https://doi.org/10.3390/biomimetics10030186 - 18 Mar 2025
Cited by 2 | Viewed by 1313
Abstract
Dexterous robotic grasping with multifingered hands remains a critical challenge in non-visual environments, where diverse object geometries and material properties demand adaptive force modulation and tactile-aware manipulation. To address this, we propose the Reinforcement Learning-Based Multimodal Perception (RLMP) framework, which integrates human-like grasping intuition through operator-worn gloves with tactile-guided reinforcement learning. The framework’s key innovation lies in its Tactile-Driven DCNN architecture—a lightweight convolutional network achieving 98.5% object recognition accuracy using spatiotemporal pressure patterns—coupled with an RL policy refinement mechanism that dynamically correlates finger kinematics with real-time tactile feedback. Experimental results demonstrate reliable grasping performance across deformable and rigid objects while maintaining force precision critical for fragile targets. By bridging human teleoperation with autonomous tactile adaptation, RLMP eliminates dependency on visual input and predefined object models, establishing a new paradigm for robotic dexterity in occlusion-rich scenarios.
(This article belongs to the Special Issue Biomimetic Innovations for Human–Machine Interaction)

23 pages, 15527 KiB  
Article
Foundations for Teleoperation and Motion Planning Towards Robot-Assisted Aircraft Fuel Tank Inspection
by Adrián Ricárdez Ortigosa, Marc Bestmann, Florian Heilemann, Johannes Halbe, Lewe Christiansen, Rebecca Rodeck and Gerko Wende
Aerospace 2025, 12(2), 156; https://doi.org/10.3390/aerospace12020156 - 18 Feb 2025
Cited by 2 | Viewed by 1302
Abstract
The aviation industry relies on continuous inspections to ensure infrastructure safety, particularly in confined spaces like aircraft fuel tanks, where human inspections are labor-intensive, risky, and expose workers to hazardous conditions. Robotic systems present a promising alternative to these manual processes but face significant technical and operational challenges, including technological limitations, retraining requirements, and economic constraints. Additionally, existing prototypes often lack open-source documentation, which restricts researchers and developers from replicating setups and building on existing work. This study addresses some of these challenges by proposing a modular, open-source framework for robotic inspection systems that prioritizes simplicity and scalability. The design incorporates a robotic arm and an end-effector equipped with three RGB-D cameras to enhance the inspection process. The primary contribution lies in the development of decentralized software modules that facilitate integration and future advancements, including interfaces for teleoperation and motion planning. Preliminary results indicate that the system offers an intuitive user experience, while also enabling effective 3D reconstruction for visualization. However, improvements in incremental obstacle avoidance and path planning inside the tank interior are still necessary. Nonetheless, the proposed robotic system promises to streamline development efforts, potentially reducing both time and resources for future robotic inspection systems.
(This article belongs to the Section Aeronautics)

18 pages, 9899 KiB  
Article
A Robotic Teleoperation System with Integrated Augmented Reality and Digital Twin Technologies for Disassembling End-of-Life Batteries
by Feifan Zhao, Wupeng Deng and Duc Truong Pham
Batteries 2024, 10(11), 382; https://doi.org/10.3390/batteries10110382 - 30 Oct 2024
Cited by 3 | Viewed by 2834
Abstract
Disassembly is a key step in remanufacturing, especially for end-of-life (EoL) products such as electric vehicle (EV) batteries, which are challenging to dismantle due to uncertainties in their condition and potential risks of fire, fumes, explosions, and electrical shock. To address these challenges, this paper presents a robotic teleoperation system that leverages augmented reality (AR) and digital twin (DT) technologies to enable a human operator to work away from the danger zone. By integrating AR and DTs, the system not only provides a real-time visual representation of the robot’s status but also enables remote control via gesture recognition. A bidirectional communication framework established within the system synchronises the virtual robot with its physical counterpart in an AR environment, which enhances the operator’s understanding of both the robot and task statuses. In the event of anomalies, the operator can interact with the virtual robot through intuitive gestures based on information displayed on the AR interface, thereby improving decision-making efficiency and operational safety. The application of this system is demonstrated through a case study involving the disassembly of a busbar from an EoL EV battery. Furthermore, the performance of the system in terms of task completion time and operator workload was evaluated and compared with that of AR-based control methods without informational cues and ‘smartpad’ controls. The findings indicate that the proposed system reduces operation time and enhances user experience, demonstrating its broad application potential in complex industrial settings.
(This article belongs to the Section Battery Processing, Manufacturing and Recycling)

14 pages, 13034 KiB  
Article
Learning Underwater Intervention Skills Based on Dynamic Movement Primitives
by Xuejiao Yang, Yunxiu Zhang, Rongrong Li, Xinhui Zheng and Qifeng Zhang
Electronics 2024, 13(19), 3860; https://doi.org/10.3390/electronics13193860 - 29 Sep 2024
Viewed by 984
Abstract
Improving the autonomy of underwater interventions by remotely operated vehicles (ROVs) can help mitigate the impact of communication delays on operational efficiency. Currently, underwater interventions for ROVs usually rely on real-time teleoperation or preprogramming by operators, which is time-consuming, increases the cognitive burden on operators, and requires extensive specialized programming. Instead, this paper uses the intuitive learning from demonstrations (LfD) approach that uses operator demonstrations as inputs and models the trajectory characteristics of the task through the dynamic movement primitive (DMP) approach for task reproduction as well as the generalization of knowledge to new environments. Unlike existing applications of DMP-based robot trajectory learning methods, we propose the underwater DMP (UDMP) method to address the problem that the complexity and stochasticity of underwater operational environments (e.g., current perturbations and floating operations) diminish the representativeness of the demonstrated trajectories. First, the Gaussian mixture model (GMM) and Gaussian mixture regression (GMR) are used for feature extraction of multiple demonstration trajectories to obtain typical trajectories as inputs to the DMP method. The UDMP method is more suitable for the LfD of underwater interventions than the method that directly learns the nonlinear terms of the DMP. In addition, we improve the commonly used homomorphic-based teleoperation mode to heteromorphic mode, which allows the operator to focus more on the end-operation task. Finally, the effectiveness of the developed method is verified by simulation experiments.
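A discrete DMP of the kind referenced above combines a critically damped spring–damper system pulling toward the goal with a phase-gated forcing term that shapes the trajectory. The following 1-D sketch uses common DMP conventions; the gains, basis placement, and weights are illustrative (in practice, and in the paper's UDMP variant, the weights are learned from demonstrations, there preprocessed with GMM/GMR).

```python
import numpy as np

def dmp_rollout(x0, g, w, tau=1.0, dt=0.001, K=100.0, alpha_s=4.0):
    """Roll out a 1-D discrete dynamic movement primitive.

    w: weights of Gaussian basis functions shaping the forcing term.
    The forcing term is gated by the phase s and scaled by (g - x0),
    so the system converges to the goal g as s decays to zero.
    """
    D = 2.0 * np.sqrt(K)                             # critical damping
    centers = np.exp(-alpha_s * np.linspace(0, 1, len(w)))
    widths = len(w) ** 1.5 / centers                 # common width heuristic
    x, v, s = x0, 0.0, 1.0
    traj = [x]
    for _ in range(int(1.0 / dt)):
        psi = np.exp(-widths * (s - centers) ** 2)   # basis activations
        f = (psi @ w) / (psi.sum() + 1e-10) * s * (g - x0)
        v += dt / tau * (K * (g - x) - D * v + f)    # transformation system
        x += dt / tau * v
        s += dt / tau * (-alpha_s * s)               # canonical system
        traj.append(x)
    return np.array(traj)
```

Changing `g` or `tau` generalizes a learned motion to new goals and durations, which is the property that makes DMPs attractive for reproducing demonstrated skills.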

19 pages, 6710 KiB  
Article
The Influence of the Operator’s Perception on the Energy Demand for a Hydraulic Manipulator with a Large Working Area
by Karol Cieślik, Piotr Krogul, Marian Janusz Łopatka, Mirosław Przybysz and Rafał Typiak
Appl. Sci. 2024, 14(5), 1800; https://doi.org/10.3390/app14051800 - 22 Feb 2024
Viewed by 1068
Abstract
The efficient operation of hydraulic manipulators with expansive working areas is crucial in various applications such as construction, rescue services, and the military. These machines are characterized by having more capabilities than humans, and they perform tasks that are not repeated in the same environment. For this reason, they are most often controlled by a human in a teleoperation system. This research investigates the influence of the operator’s perception on the energy demand of such manipulators. Specifically, the research focused on assessing how intuitive control systems, such as primary–secondary solutions, impact the energy consumption. Understanding the relation between the operator’s perception and the energy demand is essential for optimizing manipulator design and operation. Experimental research was conducted to analyze the velocity and acceleration of the manipulator’s effector, which is controlled by human operators under different movement ranges and size ratios. The obtained test results allow for the assessment of the dynamic loads, velocity, and energy consumption of the movement of a manipulator with a large working area due to the limitations resulting from the operator’s perception.
(This article belongs to the Special Issue Advances in Robotic Manipulators and Their Applications)

26 pages, 8866 KiB  
Article
Research on Teleoperated Virtual Reality Human–Robot Five-Dimensional Collaboration System
by Qinglei Zhang, Qinghao Liu, Jianguo Duan and Jiyun Qin
Biomimetics 2023, 8(8), 605; https://doi.org/10.3390/biomimetics8080605 - 13 Dec 2023
Cited by 9 | Viewed by 2875
Abstract
In the realm of industrial robotics, there is a growing challenge in simplifying human–robot collaboration (HRC), particularly in complex settings. The demand for more intuitive teleoperation systems is on the rise. However, optimizing robot control interfaces and streamlining teleoperation remains a formidable task due to the need for operators to possess specialized knowledge and the limitations of traditional methods regarding operational space and time constraints. This study addresses these issues by introducing a virtual reality (VR) HRC system with five-dimensional capabilities. Key advantages of our approach include: (1) real-time observation of robot work, whereby operators can seamlessly monitor the robot’s real-time work environment and motion during teleoperation; (2) leveraging VR device capabilities, whereby the strengths of VR devices are harnessed to simplify robot motion control, significantly reducing the learning time for operators; and (3) adaptability across platforms and environments, whereby our system effortlessly adapts to various platforms and working conditions, ensuring versatility across different terminals and scenarios. This system represents a significant advancement in addressing the challenges of HRC, offering improved teleoperation, simplified control, and enhanced accessibility, particularly for operators with limited prior exposure to robot operation. It elevates the overall HRC experience in complex scenarios.
(This article belongs to the Special Issue Bio-Inspired and Biomimetic Intelligence in Robotics)

28 pages, 3902 KiB  
Review
Integrating Virtual, Mixed, and Augmented Reality into Remote Robotic Applications: A Brief Review of Extended Reality-Enhanced Robotic Systems for Intuitive Telemanipulation and Telemanufacturing Tasks in Hazardous Conditions
by Yun-Peng Su, Xiao-Qi Chen, Cong Zhou, Lui Holder Pearson, Christopher G. Pretty and J. Geoffrey Chase
Appl. Sci. 2023, 13(22), 12129; https://doi.org/10.3390/app132212129 - 8 Nov 2023
Cited by 26 | Viewed by 8269
Abstract
There is an increasingly urgent need for humans to interactively control robotic systems to perform increasingly precise remote operations, concomitant with the rapid development of space exploration, deep-sea discovery, nuclear rehabilitation and management, and robotic-assisted medical devices. The potential high value of medical telerobotic applications was also evident during the recent coronavirus pandemic and will grow in future. Robotic teleoperation satisfies the demands of the scenarios in which human access carries measurable risk, but human intelligence is required. An effective teleoperation system not only enables intuitive human-robot interaction (HRI) but ensures the robot can also be operated in a way that allows the operator to experience the “feel” of the robot working on the remote side, gaining a “sense of presence”. Extended reality (XR) technology integrates real-world information with computer-generated graphics and has the potential to enhance the effectiveness and performance of HRI by providing depth perception and enabling judgment and decision making while operating the robot in a dynamic environment. This review examines novel approaches to the development and evaluation of an XR-enhanced telerobotic platform for intuitive remote teleoperation applications in dangerous and difficult working conditions. It presents a strong review of XR-enhanced telerobotics for remote robotic applications; a particular focus of the review includes the use of integrated 2D/3D mixed reality with haptic interfaces to perform intuitive remote operations to remove humans from dangerous conditions. This review also covers primary studies proposing Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) solutions where humans can better control or interact with real robotic platforms using these devices and systems to extend the user’s reality and provide a more intuitive interface. 
The objective of this article is to present recent, relevant, common, and accessible frameworks implemented in research articles published on XR-enhanced telerobotics for industrial applications. Finally, we present and classify the application context of the reviewed articles in two groups: mixed reality–enhanced robotic telemanipulation and mixed reality–enhanced robotic tele-welding. The review thus addresses all elements in the state of the art for these systems and ends with recommended research areas and targets. The application range of these systems and the resulting recommendations is readily extensible to other application areas, such as remote robotic surgery in telemedicine, where surgeons are scarce and need is high, and other potentially high-risk/high-need scenarios.
(This article belongs to the Special Issue Extended Reality Applications in Industrial Systems)

19 pages, 2174 KiB  
Article
Teleoperated Surgical Robot with Adaptive Interactive Control Architecture for Tissue Identification
by Yubo Sheng, Haoyuan Cheng, Yiwei Wang, Huan Zhao and Han Ding
Bioengineering 2023, 10(10), 1157; https://doi.org/10.3390/bioengineering10101157 - 2 Oct 2023
Cited by 3 | Viewed by 3067
Abstract
The remote perception of teleoperated surgical robotics has been a critical issue for surgeons in fulfilling their remote manipulation tasks. In this article, an adaptive teleoperation control framework is proposed. It provides a physical human–robot interaction interface to enhance the ability of the operator to intuitively perceive the material properties of remote objects. The recursive least square (RLS) is adopted to estimate the required human hand stiffness that the operator can achieve to compensate for the contact force. Based on the estimated stiffness, a force feedback controller is designed to avoid the induced motion and to convey the haptic information of the slave side. The passivity of the proposed teleoperation system is ensured by the virtual energy tank. A stable contact test validated that the proposed method achieved stable contact between the slave robot and the hard environment while ensuring the transparency of the force feedback. A series of human subject experiments was conducted to empirically verify that the proposed teleoperation framework can provide a smoother, more dexterous, and more intuitive user experience with a more accurate perception of the mechanical property of the interacted material on the slave side, compared to the baseline method. After the experiment, the design rationale for the force feedback controller in bilateral teleoperation is discussed.
(This article belongs to the Special Issue Robotics in Medical Engineering)
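Recursive least squares of the kind this abstract mentions can be illustrated on the simplest case: estimating a scalar stiffness k in f = k·Δx from streaming force–displacement samples. This is a generic RLS sketch, not the paper's controller; the forgetting factor and initialization are illustrative.

```python
import numpy as np

def rls_stiffness(displacements, forces, lam=0.99):
    """Estimate scalar stiffness k in f = k * dx via recursive least squares.

    lam is the forgetting factor; values below 1 let the estimate track
    slowly varying stiffness. P is the estimate covariance, started large
    to reflect an uninformative initial guess.
    """
    k, P = 0.0, 1e6
    for dx, f in zip(displacements, forces):
        gain = P * dx / (lam + dx * P * dx)   # Kalman-style gain
        k += gain * (f - dx * k)              # correct by prediction error
        P = (P - gain * dx * P) / lam         # update covariance
    return k
```

The same update generalizes to vector regressors (P becomes a matrix), which is how multi-axis stiffness or impedance parameters are typically estimated online.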

19 pages, 4561 KiB  
Article
Design and Evaluation of an Intuitive Haptic Teleoperation Control System for 6-DoF Industrial Manipulators
by Ivo Dekker, Karel Kellens and Eric Demeester
Robotics 2023, 12(2), 54; https://doi.org/10.3390/robotics12020054 - 1 Apr 2023
Cited by 13 | Viewed by 4632
Abstract
Industrial robots are capable of performing automated tasks repeatedly, reliably and accurately. However, in some scenarios, human-in-the-loop control is required. In this case, having an intuitive system for moving the robot within the working environment is crucial. Additionally, the operator should be aided by sensory feedback to obtain a user-friendly robot control system. Haptic feedback is one way of achieving such a system. This paper designs and assesses an intuitive teleoperation system for controlling an industrial 6-DoF robotic manipulator using a Geomagic Touch haptic interface. The system utilises both virtual environment-induced and physical sensor-induced haptic feedback to provide the user with both a higher amount of environmental awareness and additional safety while manoeuvering the robot within its working area. Different tests show that the system is capable of fully stopping the manipulator without colliding with the environment, and preventing it from entering singularity states with Cartesian end effector velocities of up to 0.25 m/s. Additionally, an operator is capable of executing low-tolerance end effector positioning tasks (∼0.5 mm) with high-frequency control of the robot (∼100 Hz). Fourteen inexperienced volunteers were asked to perform a typical object removal and writing task to gauge the intuitiveness of the system. It was found that when repeating the same test for a second time, the participants performed 22.2% faster on average. The results for the second attempt also became significantly more consistent between participants, as the interquartile range dropped by 82.7% (from 52 s on the first attempt to 9 s on the second).
(This article belongs to the Special Issue Immersive Teleoperation and AI)
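The abstract mentions limiting Cartesian end-effector velocities to 0.25 m/s. One straightforward way to enforce such a limit is a direction-preserving saturation of the commanded velocity; the sketch below is illustrative and not the paper's implementation.

```python
import numpy as np

V_MAX = 0.25  # Cartesian speed limit in m/s, matching the figure in the abstract

def clamp_velocity(v, v_max=V_MAX):
    """Scale a Cartesian velocity command so its norm never exceeds v_max.

    Scaling the whole vector (rather than clipping each axis independently)
    preserves the commanded direction of motion.
    """
    v = np.asarray(v, dtype=float)
    speed = np.linalg.norm(v)
    if speed <= v_max:
        return v
    return v * (v_max / speed)
```

Per-axis clipping would change the motion direction near the limit, which is why norm-based scaling is the usual choice for teleoperated Cartesian control.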

24 pages, 10232 KiB  
Article
A Wearable Upper Limb Exoskeleton for Intuitive Teleoperation of Anthropomorphic Manipulators
by Liang Zhao, Tie Yang, Yang Yang and Peng Yu
Machines 2023, 11(4), 441; https://doi.org/10.3390/machines11040441 - 30 Mar 2023
Cited by 9 | Viewed by 3971
Abstract
Teleoperation technology combines the strength and accuracy of robots with the perception and cognition abilities of human experts, allowing the robots to work as an avatar of the operator in dangerous environments. The motion compatibility and intuitiveness of the human–machine interface directly affect the quality of teleoperation. However, many motion capture methods require special working environments or need bulky mechanisms. In this research, we proposed a wearable, lightweight, and passive upper limb exoskeleton, which takes intuitiveness and human–machine compatibility as major concerns. The upper limb pose estimation and teleoperation mapping control methods based on the exoskeleton are also discussed. Experimental results showed that, with the help of the upper limb exoskeleton, people can reach most of the normal range of motion. The proposed mapping control methods were verified on a 14-DOF anthropomorphic manipulator and showed good performance in teleoperation tasks.
(This article belongs to the Section Robotics, Mechatronics and Intelligent Machines)

21 pages, 9316 KiB  
Article
Teleoperated Locomotion for Biobot between Japan and Bangladesh
by Mochammad Ariyanto, Chowdhury Mohammad Masum Refat, Xiaofeng Zheng, Kazuyoshi Hirao, Yingzhe Wang and Keisuke Morishima
Computation 2022, 10(10), 179; https://doi.org/10.3390/computation10100179 - 10 Oct 2022
Cited by 12 | Viewed by 3288
Abstract
Insect-based biobots have been investigated for various applications such as search and rescue operations and environmental monitoring and exploration. These applications need strong international collaboration to complete the tasks. However, during the COVID-19 pandemic, most people could not easily move from one country to another because of travel bans. In addition, controlling biobots is challenging because only experts can manage cockroach behavior with and without stimulated responses. To solve this issue, we proposed a user-friendly teleoperation user interface (UI) to monitor and control the biobot between Japan and Bangladesh without onsite operation by experts. This study used Madagascar hissing cockroaches (MHC) as a biobot hybrid robot. A multithreading algorithm was implemented to run multiple parallel computations concurrently on the UI. Virtual network computing (VNC) was implemented on the teleoperation UI as remote communication for streaming real-time video from Japan to Bangladesh and sending remote commands from Bangladesh to Japan. In the experiments, a remote operator successfully steered the biobot along a predetermined path through the developed teleoperation UI with a time delay of 275 ms. The proposed interactive and intuitive UI enables a promising and reliable system for teleoperated biobots between two remote countries.
(This article belongs to the Section Computational Engineering)
