Search Results (69)

Search Parameters:
Keywords = tele-operation assistance

18 pages, 8744 KiB  
Article
A User-Centered Teleoperation GUI for Automated Vehicles: Identifying and Evaluating Information Requirements for Remote Driving and Assistance
by Maria-Magdalena Wolf, Henrik Schmidt, Michael Christl, Jana Fank and Frank Diermeyer
Multimodal Technol. Interact. 2025, 9(8), 78; https://doi.org/10.3390/mti9080078 - 31 Jul 2025
Viewed by 193
Abstract
Teleoperation has emerged as a promising fallback for situations beyond the capabilities of automated vehicles. Nevertheless, teleoperation still faces challenges, such as reduced situational awareness. Since situational awareness is built primarily through the remote operator’s visual perception, the design of the graphical user interface (GUI) is critical. In addition to the video feed, supplemental informational elements are crucial, not only for the predominantly studied remote driving but also for emerging desk-based remote assistance concepts. This work develops a GUI for different teleoperation concepts by identifying key informational elements in the teleoperation process through expert interviews (N = 9). Following this, a static and a dynamic GUI prototype were developed and evaluated in a click-dummy study (N = 36); the dynamic GUI adapts the number of displayed elements to the current teleoperation phase. Results show that both GUIs achieve good System Usability Scale (SUS) ratings, with the dynamic GUI significantly outperforming the static version in both usability and task completion time. However, these results might be attributable to a learning effect due to the lack of randomization. The User Experience Questionnaire (UEQ) score shows potential for improvement. To enhance the user experience, the GUI should be evaluated in a follow-up study that includes interaction with a real vehicle. Full article
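For readers unfamiliar with the SUS metric cited in this abstract, the standard scoring formula is simple enough to show directly. The sketch below is a generic illustration of that formula only; the example responses are invented and are not taken from the study.

```python
def sus_score(responses):
    """Compute the System Usability Scale score from ten 1-5 responses.

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response).
    The raw sum (0-40) is scaled by 2.5 to give a 0-100 score.
    """
    assert len(responses) == 10
    raw = sum(
        (r - 1) if (i % 2 == 0) else (5 - r)  # i = 0 is item 1 (odd-numbered)
        for i, r in enumerate(responses)
    )
    return raw * 2.5

# Example with invented responses on the 1-5 scale
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0
```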

18 pages, 16108 KiB  
Article
Development of roCaGo for Forest Observation and Forestry Support
by Yoshinori Kiga, Yuzuki Sugasawa, Takumi Sakai, Takuma Nemoto and Masami Iwase
Forests 2025, 16(7), 1067; https://doi.org/10.3390/f16071067 - 26 Jun 2025
Viewed by 289
Abstract
This study addresses the ‘last-mile’ transportation challenges that arise in steep and narrow forest terrain by proposing a novel robotic palanquin system called roCaGo. It is inspired by the mechanical principles of two-wheel-steering and two-wheel-drive (2WS/2WD) bicycles. The roCaGo system integrates front- and rear-wheel-drive mechanisms, as well as a central suspension structure for carrying loads. Unlike conventional forestry machinery, which requires wide, well-maintained roads or permanent rail systems, the roCaGo system enables flexible, operator-assisted transport along narrow, unprepared mountain paths. A dynamic model of the system was developed to design a stabilization control strategy, enabling roCaGo to maintain transport stability and assist the operator during navigation. Numerical simulations and preliminary physical experiments demonstrate its effectiveness in challenging forest environments. Furthermore, the applicability of roCaGo has been extended to include use as a mobile third-person viewpoint platform to support the remote operation of existing forestry equipment, specifically the LV800 crawler vehicle equipped with a front-mounted mulcher. Field tests involving LiDAR sensors mounted on roCaGo were conducted to verify its ability to capture the environmental data necessary for non-line-of-sight teleoperation. The results show that roCaGo is a promising solution for improving labor efficiency and ensuring operator safety in forest logistics and remote-controlled forestry operations. Full article

27 pages, 10314 KiB  
Article
Immersive Teleoperation via Collaborative Device-Agnostic Interfaces for Smart Haptics: A Study on Operational Efficiency and Cognitive Overflow for Industrial Assistive Applications
by Fernando Hernandez-Gobertti, Ivan D. Kudyk, Raul Lozano, Giang T. Nguyen and David Gomez-Barquero
Sensors 2025, 25(13), 3993; https://doi.org/10.3390/s25133993 - 26 Jun 2025
Viewed by 482
Abstract
This study presents a novel investigation into immersive teleoperation systems using collaborative, device-agnostic interfaces for advancing smart haptics in industrial assistive applications. The research focuses on evaluating the quality of experience (QoE) of users interacting with a teleoperation system comprising a local robotic arm, a robot gripper, and heterogeneous remote tracking and haptic feedback devices. By employing a modular device-agnostic framework, the system supports flexible configurations, including one-user-one-equipment (1U-1E), one-user-multiple-equipment (1U-ME), and multiple-users-multiple-equipment (MU-ME) scenarios. The experimental set-up involves participants manipulating predefined objects and placing them into designated baskets by following specified 3D trajectories. Performance is measured using objective QoE metrics, including temporal efficiency (time required to complete the task) and spatial accuracy (trajectory similarity to the predefined path). In addition, subjective QoE metrics are assessed through detailed surveys, capturing user perceptions of presence, engagement, control, sensory integration, and cognitive load. To ensure flexibility and scalability, the system integrates various haptic configurations, including (1) a Touch kinaesthetic device for precision tracking and grounded haptic feedback, (2) a DualSense tactile joystick as both a tracker and mobile haptic device, (3) a bHaptics DK2 vibrotactile glove with a camera tracker, and (4) a SenseGlove Nova force-feedback glove with VIVE trackers. The modular approach enables comparative analysis of how different device configurations influence user performance and experience. The results indicate that the objective QoE metrics varied significantly across device configurations, with the Touch and SenseGlove Nova set-ups providing the highest trajectory similarity and temporal efficiency. Subjective assessments revealed a strong correlation between presence and sensory integration, with users reporting higher engagement and control in scenarios utilizing force feedback mechanisms. Cognitive load varied across the set-ups, with more complex configurations (e.g., 1U-ME) requiring longer adaptation periods. This study contributes to the field by demonstrating the feasibility of a device-agnostic teleoperation framework for immersive industrial applications. It underscores the critical interplay between objective task performance and subjective user experience, providing actionable insights into the design of next-generation teleoperation systems. Full article
(This article belongs to the Special Issue Recent Development of Flexible Tactile Sensors and Their Applications)
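The objective QoE metrics described in this abstract (temporal efficiency and spatial trajectory similarity) can be illustrated with a short sketch. The article does not specify its similarity measure, so mean nearest-point distance to the reference path is assumed here purely for illustration.

```python
import numpy as np

def temporal_efficiency(start_time, end_time):
    """Task completion time in seconds (lower is better)."""
    return end_time - start_time

def trajectory_similarity(executed, reference):
    """Mean distance from each executed point to its nearest reference point.

    Both arrays are (N, 3) position samples; lower values mean the executed
    trajectory follows the predefined path more closely.
    """
    executed = np.asarray(executed)
    reference = np.asarray(reference)
    # Pairwise distances between executed samples and reference samples
    dists = np.linalg.norm(executed[:, None, :] - reference[None, :, :], axis=-1)
    return dists.min(axis=1).mean()
```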

23 pages, 15527 KiB  
Article
Foundations for Teleoperation and Motion Planning Towards Robot-Assisted Aircraft Fuel Tank Inspection
by Adrián Ricárdez Ortigosa, Marc Bestmann, Florian Heilemann, Johannes Halbe, Lewe Christiansen, Rebecca Rodeck and Gerko Wende
Aerospace 2025, 12(2), 156; https://doi.org/10.3390/aerospace12020156 - 18 Feb 2025
Cited by 2 | Viewed by 1309
Abstract
The aviation industry relies on continuous inspections to ensure infrastructure safety, particularly in confined spaces like aircraft fuel tanks, where human inspections are labor-intensive, risky, and expose workers to hazardous conditions. Robotic systems present a promising alternative to these manual processes but face significant technical and operational challenges, including technological limitations, retraining requirements, and economic constraints. Additionally, existing prototypes often lack open-source documentation, which restricts researchers and developers from replicating setups and building on existing work. This study addresses some of these challenges by proposing a modular, open-source framework for robotic inspection systems that prioritizes simplicity and scalability. The design incorporates a robotic arm and an end-effector equipped with three RGB-D cameras to enhance the inspection process. The primary contribution lies in the development of decentralized software modules that facilitate integration and future advancements, including interfaces for teleoperation and motion planning. Preliminary results indicate that the system offers an intuitive user experience, while also enabling effective 3D reconstruction for visualization. However, improvements in incremental obstacle avoidance and path planning inside the tank interior are still necessary. Nonetheless, the proposed robotic system promises to streamline development efforts, potentially reducing both time and resources for future robotic inspection systems. Full article
(This article belongs to the Section Aeronautics)

22 pages, 7758 KiB  
Article
Haptic Guidance System for Teleoperation Based on Trajectory Similarity
by Hikaru Nagano, Tomoki Nishino, Yuichi Tazaki and Yasuyoshi Yokokohji
Robotics 2025, 14(2), 15; https://doi.org/10.3390/robotics14020015 - 30 Jan 2025
Cited by 1 | Viewed by 1496
Abstract
Teleoperation technology enables remote control of machines, but often requires complex manoeuvres that pose significant challenges for operators. To mitigate these challenges, assistive systems have been developed to support teleoperation. This study presents a teleoperation guidance system that provides assistive force feedback to help operators align more accurately with desired trajectories. Two key issues remain: (1) the lack of a flexible, real-time approach to defining desired trajectories and calculating assistive forces, and (2) uncertainty about the effects of forward motion assistance within the assistive forces. To address these issues, we propose a novel approach that captures the posture trajectory of the local control interface, statistically generates a reference trajectory, and incorporates forward motion as an adjustable parameter. In Experiment 1, which involved simulating an object transfer task, the proposed method significantly reduced the operator’s workload compared to conventional techniques, especially in dynamic target scenarios. Experiment 2, which involved more complex paths, showed that assistive forces with forward assistance significantly improved manoeuvring performance. Full article
(This article belongs to the Special Issue Robot Teleoperation Integrating with Augmented Reality)
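As a rough illustration of trajectory-based assistive force feedback with an adjustable forward-motion gain, the sketch below pulls the interface toward the nearest reference waypoint and adds a tangential forward term. The structure and gain values are assumptions for illustration, not the authors' formulation.

```python
import numpy as np

def assistive_force(pos, reference, k_corrective=50.0, k_forward=5.0):
    """Corrective force toward the reference trajectory plus a forward term.

    `reference` is an (N, 3) array of waypoints; `k_forward` plays the role of
    the adjustable forward-motion parameter (all values here are placeholders).
    """
    pos = np.asarray(pos, dtype=float)
    reference = np.asarray(reference, dtype=float)
    dists = np.linalg.norm(reference - pos, axis=1)
    i = int(dists.argmin())                      # nearest reference waypoint
    corrective = k_corrective * (reference[i] - pos)
    # Forward direction: toward the next waypoint along the reference path
    if i + 1 < len(reference):
        tangent = reference[i + 1] - reference[i]
        norm = np.linalg.norm(tangent)
        forward = k_forward * tangent / norm if norm > 0 else np.zeros(3)
    else:
        forward = np.zeros(3)
    return corrective + forward
```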

11 pages, 3982 KiB  
Proceeding Paper
Remote Control of ADAS Features: A Teleoperation Approach to Mitigate Autonomous Driving Challenges
by İsa Karaböcek, Batıkan Kavak and Ege Özdemir
Eng. Proc. 2024, 82(1), 36; https://doi.org/10.3390/ecsa-11-20449 - 25 Nov 2024
Cited by 1 | Viewed by 1218
Abstract
This paper presents a novel approach to enhancing the safety of Advanced Driver Assistance Systems (ADAS) by integrating teleoperation for the remote control of ADAS features in a vehicle. The primary contribution of this research is the development and implementation of a teleoperation system that allows human operators to take control of the vehicle’s ADAS features, enabling timely intervention in critical situations where autonomous functions may be insufficient. While the concept of teleoperation has been explored in the literature, with several implementations focused on the direct control of vehicles, there are relatively few examples of teleoperation systems designed specifically to utilize ADAS features. This research addresses this gap by exploring teleoperation as a supplementary mechanism that allows human intervention in critical driving situations, particularly where autonomous systems may encounter limitations. The teleoperation system was tested under two critical ADAS scenarios, cruise control and lane change assist, chosen for their importance in real-world driving conditions. These scenarios demonstrate how teleoperation can complement and enhance the performance of ADAS features. The experiments reveal the effectiveness of remote control in providing precise control, allowing for swift and accurate responses in scenarios where the autonomous system might face challenges. The novelty of this work lies in its application of teleoperation to ADAS features, offering a new perspective on how human intervention can enhance vehicle safety. The findings provide valuable insights into optimizing teleoperation for real-world driving scenarios. As a result of the experiments, it was demonstrated that integrating teleoperation with ADAS features offers a more reliable solution compared to standalone ADAS driving. Full article

26 pages, 9199 KiB  
Article
Wireless PID-Based Control for a Single-Legged Rehabilitation Exoskeleton
by Rabé Andersson, Mikael Cronhjort and José Chilo
Machines 2024, 12(11), 745; https://doi.org/10.3390/machines12110745 - 22 Oct 2024
Cited by 2 | Viewed by 1576
Abstract
The demand for remote rehabilitation is increasing, opening up convenient and effective home-based therapy for the sick and elderly. In this study, we use AnyBody simulations to analyze muscle activity and determine key parameters for designing a rehabilitation exoskeleton, as well as selecting the appropriate motor torque to assist patients during rehabilitation sessions. The exoskeleton was designed with a PID control mechanism for the precise management of motor positions and joint torques, and it operates in both automated and teleoperation modes. Hip and knee movements are monitored via smartphone-based IMU sensors, enabling real-time feedback. Bluetooth communication ensures seamless control during various training scenarios. Our study demonstrates that remotely controlled rehabilitation systems can be implemented effectively, offering vital support not only during global health crises such as pandemics but also in improving the accessibility of rehabilitation services in remote or underserved areas. This approach has the potential to transform the way physical therapy can be delivered, making it more accessible and adaptable to the needs of a larger patient population. Full article
(This article belongs to the Section Robotics, Mechatronics and Intelligent Machines)
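The PID position control described in this abstract follows the textbook form; a minimal single-joint sketch is shown below. The gains and setpoint are illustrative placeholders, not the values used in the exoskeleton.

```python
class PID:
    """Minimal discrete PID controller for a single exoskeleton joint."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: drive a knee joint toward 30 degrees at 100 Hz (illustrative gains)
controller = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
torque_command = controller.update(setpoint=30.0, measurement=25.0)
```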

18 pages, 3659 KiB  
Article
Enabling Pandemic-Resilient Healthcare: Edge-Computing-Assisted Real-Time Elderly Caring Monitoring System
by Muhammad Zubair Islam, A. S. M. Sharifuzzaman Sagar and Hyung Seok Kim
Appl. Sci. 2024, 14(18), 8486; https://doi.org/10.3390/app14188486 - 20 Sep 2024
Cited by 3 | Viewed by 1794
Abstract
Over the past few years, life expectancy has increased significantly. However, elderly individuals living independently often require assistance due to mobility issues, symptoms of dementia, or other health-related challenges. In these situations, high-quality elderly care systems for the aging population require innovative approaches to guarantee Quality of Service (QoS) and Quality of Experience (QoE). Traditional remote elderly care methods face several challenges, including high latency and poor service quality, which affect their transparency and stability. This paper proposes ECI-TeleCaring, a haptic-driven system based on Edge Computational Intelligence (ECI), for the remote caring and monitoring of elderly people. It utilizes a Software-Defined Network (SDN) and Mobile Edge Computing (MEC) to reduce latency and enhance responsiveness. Dual Long Short-Term Memory (LSTM) models are deployed at the edge to enable real-time location-aware activity prediction to ensure QoS and QoE. The simulation results demonstrate that the proposed system manages real-time data transmission, without and with the activity recognition and location-aware model, with communication latency under 2.5 ms (more than 60%) and of 11∼12 ms (60∼95%), respectively, for 10 to 1000 data packets. The results also show that the proposed system ensures a trade-off between the transparency and stability of the system from the QoS and QoE perspectives. Moreover, the proposed system serves as a testbed for implementing, investigating, and managing elderly telecaring services for QoS/QoE provisioning. It facilitates real-time monitoring of the deployed technological parameters along with network delay and packet loss, and it oversees data exchange between the master domain (human operator) and slave domain (telerobot). Full article
(This article belongs to the Special Issue Advances in Intelligent Communication System)
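A minimal sketch of an edge-deployed LSTM classifier of the kind this abstract describes is shown below, using PyTorch. The feature dimensions, class counts, and the pairing of one model for activities and one for location are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn

class ActivityLSTM(nn.Module):
    """Minimal LSTM classifier over a window of sensor samples."""

    def __init__(self, n_features=9, hidden=64, n_classes=6):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):          # x: (batch, time, n_features)
        _, (h_n, _) = self.lstm(x)
        return self.head(h_n[-1])  # logits over classes

# Dual deployment: one model for activity recognition, one for location awareness
activity_model = ActivityLSTM(n_classes=6)
location_model = ActivityLSTM(n_classes=4)
logits = activity_model(torch.randn(1, 50, 9))  # one window of 50 samples
```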

19 pages, 27719 KiB  
Article
Assistive Control through a Hapto-Visual Digital Twin for a Master Device Used for Didactic Telesurgery
by Daniel Pacheco Quiñones, Daniela Maffiodo and Med Amine Laribi
Robotics 2024, 13(9), 138; https://doi.org/10.3390/robotics13090138 - 11 Sep 2024
Cited by 3 | Viewed by 1196
Abstract
This article explores the integration of a hapto-visual digital twin on a master device used for bilateral teleoperation. The device, known as a quasi-spherical parallel manipulator, is currently employed for remote center of motion control in teleoperated minimally invasive surgery. After providing detailed insights into the device’s kinematics, including its geometric configuration, Jacobian, and reachable workspace, the paper illustrates the overall control system, encompassing both hardware and software components. The article describes how a digital twin, which implements a haptic assistive control and a visually enhanced representation of the device, was integrated into the system. The digital twin was then tested with the device: in the experiments, a “student” end-user had to follow a predefined “teacher” trajectory. Preliminary results demonstrate that the overall system can serve as a good starting point for didactic telesurgery operations. The control action, which has yet to be optimized and tested on different subjects, appears to provide satisfactory performance and accuracy. Full article
(This article belongs to the Special Issue Digital Twin-Based Human–Robot Collaborative Systems)

28 pages, 29843 KiB  
Article
JVC-02 Teleoperated Robot: Design, Implementation, and Validation for Assistance in Real Explosive Ordnance Disposal Missions
by Luis F. Canaza Ccari, Ronald Adrian Ali, Erick Valdeiglesias Flores, Nicolás O. Medina Chilo, Erasmo Sulla Espinoza, Yuri Silva Vidal and Lizardo Pari
Actuators 2024, 13(7), 254; https://doi.org/10.3390/act13070254 - 2 Jul 2024
Cited by 2 | Viewed by 2761
Abstract
Explosive ordnance disposal (EOD) operations are hazardous due to the volatile and sensitive nature of these devices. EOD robots have improved these tasks, but their high cost limits accessibility for security institutions that do not have sufficient funds. This article presents the design, implementation, and validation of a low-cost EOD robot named JVC-02, specifically designed for use in explosive hazardous environments to safeguard the safety of police officers of the Explosives Disposal Unit (UDEX) of Arequipa, Peru. To achieve this goal, the essential requirements for this type of robot were compiled, referencing the capabilities of Rescue Robots from RoboCup. Additionally, the Quality Function Deployment (QFD) methodology was used to identify the needs and requirements of UDEX police officers. Based on this information, a modular approach to robot design was developed, utilizing commercial off-the-shelf components to facilitate maintenance and repair. The JVC-02 was integrated with a 5-DoF manipulator and a two-finger mechanical gripper to perform dexterity tasks, along with a tracked locomotion mechanism, which enables effective movement, and a three-camera vision system to facilitate exploration tasks. Finally, field tests were conducted in real scenarios to evaluate and experimentally validate the capabilities of the JVC-02 robot, assessing its mobility, dexterity, and exploration skills. Additionally, real EOD missions were carried out in which UDEX agents intervened and controlled the robot. The results demonstrate that the JVC-02 robot possesses strong capabilities for real EOD applications, excelling in intuitive operation, low cost, and ease of maintenance. Full article
(This article belongs to the Section Actuators for Robotics)

15 pages, 35607 KiB  
Article
A Lightweight and Affordable Wearable Haptic Controller for Robot-Assisted Microsurgery
by Xiaoqing Guo, Finn McFall, Peiyang Jiang, Jindong Liu, Nathan Lepora and Dandan Zhang
Sensors 2024, 24(9), 2676; https://doi.org/10.3390/s24092676 - 23 Apr 2024
Cited by 3 | Viewed by 2585
Abstract
In robot-assisted microsurgery (RAMS), surgeons often face the challenge of operating with minimal sensory feedback, particularly the lack of haptic feedback. However, most traditional desktop haptic devices have restricted operational areas and limited dexterity. This report describes a novel, lightweight, and low-budget wearable haptic controller for teleoperated microsurgical robotic systems. We designed a wearable haptic interface made entirely from an off-the-shelf material, PolyJet Photopolymer, and fabricated using liquid and solid hybrid 3D co-printing technology. This interface was designed to resemble human soft tissues and can be wrapped around the fingertips, offering direct contact feedback to the operator. We also demonstrated that the device can be easily integrated with our motion tracking system for remote microsurgery. Two motion tracking methods, marker-based and marker-less, were compared in trajectory-tracking experiments at different depths to find the most effective motion tracking method for our RAMS system. The results indicate that within the 4 to 8 cm tracking range, the marker-based method achieved exceptional detection rates. Furthermore, the performance of three fusion algorithms was compared to establish the unscented Kalman filter as the most accurate and reliable. The effectiveness of the wearable haptic controller was evaluated through user studies focusing on the usefulness of haptic feedback. The results revealed that haptic feedback significantly enhances depth perception for operators during teleoperated RAMS. Full article

21 pages, 4986 KiB  
Article
Optimization Approach for Multisensory Feedback in Robot-Assisted Pouring Task
by Mandira S. Marambe, Bradley S. Duerstock and Juan P. Wachs
Actuators 2024, 13(4), 152; https://doi.org/10.3390/act13040152 - 18 Apr 2024
Cited by 2 | Viewed by 3237
Abstract
Individuals with disabilities and persons operating in inaccessible environments can greatly benefit from the aid of robotic manipulators in performing daily living activities and other remote tasks. Users relying on robotic manipulators to interact with their environment are restricted by the lack of sensory information available through traditional operator interfaces. These interfaces deprive users of somatosensory feedback that would typically be available through direct contact. Multimodal sensory feedback can bridge these perceptual gaps effectively. Given a set of object properties (e.g., temperature, weight) to be conveyed and sensory modalities (e.g., visual, haptic) available, it is necessary to determine which modality should be assigned to each property for an effective interface design. The goal of this study was to develop an effective multisensory interface for robot-assisted pouring tasks, which delivers nuanced sensory feedback while permitting the high visual demand necessary for precise teleoperation. To that end, an optimization approach was employed to generate a combination of feedback properties to modality assignments that maximizes effective feedback perception and minimizes cognitive load. A set of screening experiments tested twelve possible individual assignments to form this optimal combination. The resulting perceptual accuracy, load, and user preference measures were input into a cost function. Formulating and solving as a linear assignment problem, a minimum cost combination was generated. Results from experiments evaluating efficacy in practical use cases for pouring tasks indicate that the solution was significantly more effective than no feedback and had considerable advantage over an arbitrary design. Full article
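The linear assignment formulation mentioned in this abstract can be solved directly with the Hungarian algorithm; the sketch below uses SciPy's linear_sum_assignment with an invented cost matrix standing in for the combined accuracy, load, and preference costs described in the study.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Rows: object properties to convey; columns: available feedback modalities.
# Each entry is the cost of one property-to-modality assignment, e.g. combining
# (1 - perceptual accuracy), cognitive load, and inverted preference scores.
properties = ["temperature", "weight", "fill level"]   # illustrative labels
modalities = ["visual overlay", "vibrotactile", "auditory"]
cost = np.array([
    [0.6, 0.2, 0.5],
    [0.4, 0.3, 0.7],
    [0.1, 0.6, 0.4],
])

rows, cols = linear_sum_assignment(cost)   # minimum-cost one-to-one assignment
for r, c in zip(rows, cols):
    print(f"{properties[r]} -> {modalities[c]} (cost {cost[r, c]:.2f})")
```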

28 pages, 3902 KiB  
Review
Integrating Virtual, Mixed, and Augmented Reality into Remote Robotic Applications: A Brief Review of Extended Reality-Enhanced Robotic Systems for Intuitive Telemanipulation and Telemanufacturing Tasks in Hazardous Conditions
by Yun-Peng Su, Xiao-Qi Chen, Cong Zhou, Lui Holder Pearson, Christopher G. Pretty and J. Geoffrey Chase
Appl. Sci. 2023, 13(22), 12129; https://doi.org/10.3390/app132212129 - 8 Nov 2023
Cited by 26 | Viewed by 8286
Abstract
There is an increasingly urgent need for humans to interactively control robotic systems to perform increasingly precise remote operations, concomitant with the rapid development of space exploration, deep-sea discovery, nuclear rehabilitation and management, and robotic-assisted medical devices. The potential high value of medical telerobotic applications was also evident during the recent coronavirus pandemic and will grow in future. Robotic teleoperation satisfies the demands of the scenarios in which human access carries measurable risk, but human intelligence is required. An effective teleoperation system not only enables intuitive human-robot interaction (HRI) but ensures the robot can also be operated in a way that allows the operator to experience the “feel” of the robot working on the remote side, gaining a “sense of presence”. Extended reality (XR) technology integrates real-world information with computer-generated graphics and has the potential to enhance the effectiveness and performance of HRI by providing depth perception and enabling judgment and decision making while operating the robot in a dynamic environment. This review examines novel approaches to the development and evaluation of an XR-enhanced telerobotic platform for intuitive remote teleoperation applications in dangerous and difficult working conditions. It presents a strong review of XR-enhanced telerobotics for remote robotic applications; a particular focus of the review includes the use of integrated 2D/3D mixed reality with haptic interfaces to perform intuitive remote operations to remove humans from dangerous conditions. This review also covers primary studies proposing Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) solutions where humans can better control or interact with real robotic platforms using these devices and systems to extend the user’s reality and provide a more intuitive interface. The objective of this article is to present recent, relevant, common, and accessible frameworks implemented in research articles published on XR-enhanced telerobotics for industrial applications. Finally, we present and classify the application context of the reviewed articles in two groups: mixed reality–enhanced robotic telemanipulation and mixed reality–enhanced robotic tele-welding. The review thus addresses all elements in the state of the art for these systems and ends with recommended research areas and targets. The application range of these systems and the resulting recommendations is readily extensible to other application areas, such as remote robotic surgery in telemedicine, where surgeons are scarce and need is high, and other potentially high-risk/high-need scenarios. Full article
(This article belongs to the Special Issue Extended Reality Applications in Industrial Systems)

28 pages, 56756 KiB  
Review
Robotic Systems and Navigation Techniques in Orthopedics: A Historical Review
by Teng Li, Armin Badre, Farshid Alambeigi and Mahdi Tavakoli
Appl. Sci. 2023, 13(17), 9768; https://doi.org/10.3390/app13179768 - 29 Aug 2023
Cited by 31 | Viewed by 11502
Abstract
Since the da Vinci surgical system was approved by the Food and Drug Administration (FDA) in 2000, the development and deployment of various robot-assisted minimally invasive surgery (MIS) systems have been largely expedited and boomed. With the rapid advancement of robotic techniques in recent decades, robot-assisted systems have been widely used in various surgeries including orthopedics. These robot-related techniques are transforming the conventional ways to conduct surgical procedures. Robot-assisted orthopedic surgeries have become more and more popular due to their potential benefits of increased accuracy and precision in surgical outcomes, enhanced reproducibility, reduced technical variability, decreased pain, and faster recovery time. In this paper, robotic systems and navigation techniques in typical orthopedic surgeries are reviewed, especially for arthroplasty. From the perspective of robotics and engineering, the systems and techniques are divided into two main categories, i.e., robotic systems (RSs), and computer-aided navigation systems (CANSs). The former is further divided into autonomous RS, hands-on RS, and teleoperated RS. For the latter, three key elements in CANS are introduced, including 3D modeling, registration, and navigation. Lastly, the potential advantages and disadvantages of the RS and CANS are summarized and discussed. Future perspectives on robotics in orthopedics, as well as the challenges, are presented. Full article
(This article belongs to the Special Issue Surgical Robotics Design and Clinical Applications)

23 pages, 5036 KiB  
Article
The Translation of Mobile-Exoneuromusculoskeleton-Assisted Wrist–Hand Poststroke Telerehabilitation from Laboratory to Clinical Service
by Wanyi Qing, Ching-Yi Nam, Harvey Man-Hok Shum, Marko Ka-Leung Chan, King-Pong Yu, Serena Sin-Wah Ng, Bibo Yang and Xiaoling Hu
Bioengineering 2023, 10(8), 976; https://doi.org/10.3390/bioengineering10080976 - 18 Aug 2023
Cited by 5 | Viewed by 2066
Abstract
Rehabilitation robots are helpful in poststroke telerehabilitation; however, their feasibility and rehabilitation effectiveness in clinical settings have not been sufficiently investigated. A non-randomized controlled trial was conducted to investigate the feasibility of translating a telerehabilitation program assisted by a mobile wrist/hand exoneuromusculoskeleton (WH-ENMS) into routine clinical services and to compare the rehabilitative effects achieved in the hospital-service-based group (n = 12, clinic group) with the laboratory-research-based group (n = 12, lab group). Both groups showed significant improvements (p ≤ 0.05) in clinical assessments of behavioral motor functions and in muscular coordination and kinematic evaluations after the training and at the 3-month follow-up, with the lab group demonstrating better motor gains than the clinic group (p ≤ 0.05). The results indicated that the WH-ENMS-assisted tele-program was feasible and effective for upper limb rehabilitation when integrated into routine practice, and the quality of patient–operator interactions physically and remotely affected the rehabilitative outcomes. Full article
(This article belongs to the Special Issue Bioengineering for Physical Rehabilitation)
