Article

Development of a Virtual Reality-Based Environment for Telerehabilitation

1 Research Center for Industrial Robots Simulation and Testing—CESTER, Faculty of Industrial Engineering, Robotics and Production Management, Technical University of Cluj-Napoca, 400114 Cluj-Napoca, Romania
2 Technical Sciences Academy of Romania, B-dul Dacia, 26, 030167 Bucharest, Romania
* Authors to whom correspondence should be addressed.
Appl. Sci. 2024, 14(24), 12022; https://doi.org/10.3390/app142412022
Submission received: 14 November 2024 / Revised: 13 December 2024 / Accepted: 21 December 2024 / Published: 22 December 2024

Abstract

The paper presents an innovative virtual reality (VR)-based environment for personalized telerehabilitation programs. This environment integrates a parallel robotic structure designed for the lower limb rehabilitation of patients with neuromotor disabilities and a virtual patient. The robotic structure is controlled via a user interface (UI) that communicates with the VR environment via the TCP/IP protocol. The robotic structure can also be operated using two controllers that communicate with a VR headset via the Bluetooth protocol. Through these two controllers, the therapist demonstrates to the patient various exercises that the robotic system can perform. With the right-hand controller, the therapist guides exercises for the hip and knee, while the left-hand controller manages ankle exercises. The therapist remotely designs a rehabilitation plan for patients at home, defining exercises, interacting with the rehabilitation robot in real-time via the VR headset and the two controllers, and initiating therapy sessions. The user interface allows monitoring of patient progress through video feedback, electromyography (EMG) sensors, and session recording.

1. Introduction

Recent technological advancements have led to a significant increase in telemedicine applications. This expansion has been possible due to the rapid evolution of computer technologies and accelerated innovations in the field of telemedical devices, particularly during the COVID-19 pandemic. These developments significantly improved accessibility and efficiency in the provision of medical services, opening new horizons in the field of digital health [1]. Telecare and telerehabilitation have garnered attention from the medical community for enabling patients to connect with their doctors from the comfort of their homes. These proposed technologies not only facilitate access to healthcare but also improve patients’ quality of life by reducing the need for frequent clinic visits [2,3]. Rehabilitation exercises, in particular, prove indispensable in addressing a wide range of conditions, providing symptom relief, and enhancing patient functionality and quality of life. Through rehabilitation, patients can recover lost skills and adopt strategies to adapt to limitations caused by their conditions. Additionally, rehabilitation exercises can play a preventive role, helping to prevent the recurrence or worsening of existing conditions [4,5]. The leading cause of mobility loss in the lower and upper limbs is stroke, which occurs when blood flow to part of the brain is interrupted, either because of a blood clot (ischemic stroke) or because of bleeding in the brain (hemorrhagic stroke). In both cases, the brain cells are deprived of oxygen and essential nutrients, leading to cell death. This cell loss can impair functions controlled by the affected area of the brain. For example, if the stroke affects the part of the brain responsible for controlling limb movements, the patient may lose the ability to move one or more limbs. 
Recovering from a stroke can be a lengthy and difficult process that varies widely among individuals, depending on the severity of the stroke, the person’s overall health, and the rehabilitation therapy received. Physical and occupational therapy are often key components of the rehabilitation plan, helping patients regain strength, coordination, and independence [6,7,8].
There are a variety of physiotherapeutic rehabilitation systems specifically designed for people with different neuro-motor disorders. These systems also support applications in the field of telerehabilitation, thus enabling patients to benefit from treatment from home, which expands therapy access and can improve patients’ quality of life [9]. In [10], the authors present the development of a control system for a parallel robot designed for lower limb rehabilitation of stroke survivors. In [11], the authors conducted an in-depth study of various telerehabilitation systems, concluding that VR-based telerehabilitation offers an effective alternative for treating stroke patients. This option becomes especially valuable given the limitations of traditional rehabilitation methods. Bo et al. introduced two innovative methods for manipulating a Stewart–Gough platform-based walking simulator. The first method allows therapists to directly control each platform using movement data collected from wearable sensors. The second method is designed to increase patient engagement by providing limited control based on trunk tilt. Experiments have shown that both control interfaces are feasible in terms of system performance and subjective user feedback [12]. In [13], the authors developed a VR-based simulator for a robotic system designed for ankle rehabilitation (drop-foot syndrome) in stroke patients. In [14], the authors proposed an innovative research model that combines the unified theory of technology acceptance and use with protection motivation theory to identify the factors influencing the adoption of VR in telerehabilitation. In [15], the author reviewed studies evaluating the combined effects of robotic systems and VR on lower limb function post-stroke, offering recommendations for future research in rehabilitation.
In [16], the authors analyzed the benefits of a master–slave robotic rehabilitation therapy, where the patient receives real-time assistance from a therapist. They also investigated whether this approach can be effectively implemented in a telerehabilitation context. A pilot study involving 10 patients was conducted to test this method. The study involved a point-to-point rehabilitation exercise that was supported by three types of assistance: fixed assistance (with no therapist interaction), assistance provided by a physically present therapist, and assistance from a remote therapist in a simulated telerehabilitation environment. The aim was to assess the effectiveness and applicability of these assistance methods in the rehabilitation process. The patients received assistance through force fields using a robotic device for upper limb rehabilitation. The results indicated that the personalized assistance provided by the therapist better matched the patient’s needs compared to fixed assistance. In [17], the authors explored the efficacy of combining repetitive transcranial magnetic stimulation with adaptive gait training to improve lower limb function and regulatory mechanisms in subacute stroke. Miao et al. described a study in which a robot designed for lower limb rehabilitation in stroke patients can be used for both on-site rehabilitation and telerehabilitation [3]. In [18], the authors presented a study describing a new remote rehabilitation system that integrates an IoT-based connected robot intended for wrist and forearm rehabilitation. Artificial intelligence techniques are used to facilitate the work of doctors in telerehabilitation and to find methods for treating patients more effectively [19]. Clemente et al. conducted a study demonstrating the potential of human pose estimation from monocular 2D videos as a marker-free, accessible solution for musculoskeletal telerehabilitation approaches [20].
In [21], the authors presented an efficient model for tracking a person for a certain range of motion using artificial intelligence technology.
This paper presents the development of a VR-based environment for personalized telerehabilitation. Compared to existing robotic rehabilitation systems, this system allows the therapist to test different rehabilitation training exercises on a virtual patient before implementing them with the actual patient. The patient is informed and can view in VR the exercises to be performed. This approach enhances the safety of the robotic rehabilitation system, especially for home-based training. Thus, the therapist can create, test, and implement a training plan via a multimodal control interface, then send it to the real robot located remotely or in the patient’s home environment. An assistant supervises the rehabilitation process through a webcam, which streams video to the robotic control interface. Section 2 describes the proposed VR environment for the safe use of the lower limb rehabilitation robot and the multimodal user interface, Section 3 presents the initial validation test results with healthy subjects, while Section 4 concludes the paper.

2. Materials and Methods

Figure 1 shows the general architecture of the proposed LegUp parallel telerehabilitation system. In collaboration with the patient, the therapist remotely develops a rehabilitation plan. These exercises are designed to help the patient improve lower limb flexibility, strength, and mobility while minimizing pain and discomfort. Before sending the parameters of the rehabilitation exercises to the LegUp experimental robotic structure, the therapist tests the rehabilitation exercises on a virtual patient using a VR simulator with a virtual model of the LegUp robot. This preliminary testing ensures the patient’s safety. Next, the therapist demonstrates the rehabilitation exercises to the patient using the VR application. To enter the immersive VR environment, the therapist wears a VR headset and uses two controllers to operate the virtual LegUp robotic structure. After this stage, the real patient is placed on the experimental LegUp robotic structure with the help of an assistant while the therapist monitors them via a video camera and an EMG sensor attached to the patient’s lower limb to track muscle condition. At this point, the therapist can send the rehabilitation exercise parameters to the LegUp robotic structure to start performing the exercises for the patient. The EMG sensor used is the model EN0240 from DFRobot (DFROBOT, Timisoara, Romania) [22]. Its integrated filter is essential for reducing noise and interference, especially at the power-line frequency (50 Hz or 60 Hz), which can affect the accuracy of EMG measurements; the filter yields a clearer, more precise signal that better reflects muscle activity. The filter can be used at a sampling frequency of 500 Hz or 1000 Hz, with an ADC resolution greater than 8 bits. For this application, 1000 Hz was chosen to preserve the original information.
To retrieve and transmit data packets from the EMG sensor, an ESP32 microcontroller (Espressif Systems, Targu Jiu, Gorj, Romania) was used, which has the ability to transmit data via the Wi-Fi protocol.
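The power-line interference removal described above can be illustrated with a simple digital notch filter. The sketch below is a minimal Python illustration (the actual system uses the sensor's integrated filter and an ESP32, not this code); the biquad coefficients follow the standard RBJ audio-EQ-cookbook notch design, and the 1000 Hz sampling rate matches the value chosen for the sensor. The quality factor Q = 5 is an assumption for illustration, not a value from the paper.

```python
import math

def notch_coefficients(f0_hz, fs_hz, q):
    """Biquad notch filter (RBJ cookbook): zeros on the unit circle at f0."""
    w0 = 2.0 * math.pi * f0_hz / fs_hz
    alpha = math.sin(w0) / (2.0 * q)
    cos_w0 = math.cos(w0)
    a0 = 1.0 + alpha  # normalization factor so that a[0] == 1
    b = [1.0 / a0, -2.0 * cos_w0 / a0, 1.0 / a0]
    a = [1.0, -2.0 * cos_w0 / a0, (1.0 - alpha) / a0]
    return b, a

def filter_signal(x, b, a):
    """Direct-form I IIR filtering of a sample sequence."""
    y = []
    x1 = x2 = y1 = y2 = 0.0
    for xn in x:
        yn = b[0] * xn + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        x2, x1 = x1, xn
        y2, y1 = y1, yn
        y.append(yn)
    return y

def rms(x):
    return math.sqrt(sum(v * v for v in x) / len(x))

FS = 1000.0  # Hz, the sampling rate chosen for the EN0240 sensor
B, A = notch_coefficients(50.0, FS, q=5.0)

# Synthetic signals: 50 Hz mains interference vs. a slow 10 Hz component
mains = [math.sin(2 * math.pi * 50.0 * n / FS) for n in range(2000)]
slow = [math.sin(2 * math.pi * 10.0 * n / FS) for n in range(2000)]

# After the initial transient, the 50 Hz tone is strongly attenuated
# while the 10 Hz tone passes almost unchanged.
mains_out = rms(filter_signal(mains, B, A)[500:])
slow_out = rms(filter_signal(slow, B, A)[500:])
```

A production implementation would use a vetted DSP routine (e.g., `scipy.signal.iirnotch`) rather than hand-rolled coefficients; the point here is only the shape of the computation.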

2.1. The Telerehabilitation Robotic System

Following a medical protocol [23], the proposed parallel robotic structure LegUp [24] was developed for spatial lower limb rehabilitation in patients with various neuro-motor disabilities, targeting major joints: hip, knee, and ankle [25,26]. The new design targets bedridden patients through easy attachment to the bed. Additionally, the parallel robotic structure underwent comprehensive mathematical analysis and multi-criteria dimensional optimization to ensure effective operation within its workload range, free of singularities [27]. Figure 2a presents the kinematic scheme of the LegUp parallel robot. The robot is actuated using five active joints: prismatic joint q1 is used to perform the flexion and extension of the knee, prismatic joint q2 is used to perform the flexion and extension of the hip joint, while the prismatic joint q3 is used to perform the abduction and adduction of the hip. The ankle rehabilitation module is placed at the end of the Ll segment and is actuated by active joint q4, used to perform the flexion and extension of the ankle, and the prismatic joint q5, used to perform the ankle inversion/eversion. Each anatomical joint can be stimulated individually with respect to the medical protocol, or a combined motion may be created in order to simulate gait. The parallel robot also uses several passive revolute joints and prismatic joints in order to recreate the real motion of the targeted joint. Figure 2b presents the experimental model of the robot. The hardware control unit is equipped with a PLC (programmable logic controller), model X20CP3586 [28], dedicated drivers for each servo motor, model 80VD100PD.C000-01 [29], a power supply unit, and various input/output modules for sensors. The PLC features an Intel Atom 1.6 GHz processor, an additional I/O-processor, Onboard Ethernet, POWERLINK with poll response chaining, USB connectivity, CompactFlash for removable application memory, and 512 MB of DDR2 SDRAM.
The driver connects two servo motors in parallel, each with EnDat encoders (ver. 2.2), and integrates a PowerLink interface with two ports (to ensure a serial connection with the PLC and other drives). For robot actuation, two types of servomotors were selected: 3 8LVA23B1030-D000-0 motors for the linear units of the hip-knee module [30] and 2 8LVA13B1030-D000-0 motors for the rotational units of the ankle module [31].
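The joint-to-motion assignments described above can be summarized in a small data structure. The sketch below is illustrative Python, not part of the system: joint names and motion assignments follow the kinematic description, the ankle joints are modeled as rotational following the servomotor selection for the ankle module, and the range-of-motion values are hypothetical placeholders, not values from the paper.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ActiveJoint:
    name: str        # q1 .. q5, as in the kinematic scheme (Figure 2a)
    joint_type: str  # "prismatic" (linear unit) or "rotational" (ankle unit)
    motion: str      # anatomical motion driven by this joint
    rom_deg: tuple   # illustrative (min, max) anatomical range, degrees

# Mapping of the five active joints to the anatomical motions they drive.
LEGUP_JOINTS = [
    ActiveJoint("q1", "prismatic", "knee flexion/extension", (0, 90)),
    ActiveJoint("q2", "prismatic", "hip flexion/extension", (0, 60)),
    ActiveJoint("q3", "prismatic", "hip abduction/adduction", (0, 30)),
    ActiveJoint("q4", "rotational", "ankle flexion/extension", (-20, 30)),
    ActiveJoint("q5", "rotational", "ankle inversion/eversion", (-15, 15)),
]

JOINT_BY_NAME = {j.name: j for j in LEGUP_JOINTS}
```

A table like this lets higher-level code validate an exercise command (is the requested amplitude inside the joint's range?) before sending it to the drives.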

2.2. The Main User Console

The user console consists of a set of software and hardware elements used to control the robotic system. A multimodal control interface allows the user to interact with the robotic system in both a VR environment and with the experimental model.

2.2.1. The Multimodal Control Architecture

The block diagram in Figure 3 illustrates the interconnections between the logical components of the control system, clearly depicting the information flow and the relationships among various system parts to facilitate understanding of the system’s overall functionality. The telerehabilitation robotic system is managed through two controllers that communicate with the VR headset via the Bluetooth protocol, allowing intuitive and natural user interaction with the VR environment. In addition, the VR headset communicates with the computer via the USB port, ensuring fast and efficient data transmission for real-time response to user actions, creating an immersive and interactive experience. Moreover, the C# (C Sharp) application communicates with the VR environment through the TCP/IP protocol, ensuring a stable and reliable connection between the two. This enables efficient data transmission between the application and the VR environment, which is essential for a smooth and responsive VR experience. The C# application also communicates with a user interface specifically developed for the therapist. This interface allows the therapist to create personalized exercises tailored to each patient’s needs, review the patient’s rehabilitation history, analyze the collected data, and make adjustments based on results. This approach facilitates a more efficient and personalized rehabilitation process, thus improving therapy outcomes. Ultimately, all these features create a flexible, adaptable rehabilitation environment that can be adjusted to fit each patient’s unique requirements.
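The TCP/IP link between the therapist's application and the VR environment can be sketched as a simple request/acknowledge exchange. The snippet below is a minimal Python stand-in (the actual implementation is in C# and Unity); the message format, a newline-terminated JSON exercise command, and the port number are assumptions for illustration only.

```python
import json
import socket
import threading

HOST, PORT = "127.0.0.1", 55001  # illustrative address, not from the paper

def vr_server(ready, received):
    """Stand-in for the VR side: accept one command and acknowledge it."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()                    # signal that the server is listening
        conn, _ = srv.accept()
        with conn:
            line = conn.makefile().readline()
            received.append(json.loads(line))
            conn.sendall(b"ACK\n")

def send_exercise(command):
    """Stand-in for the UI side: send a command, wait for the ACK."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall((json.dumps(command) + "\n").encode())
        return cli.makefile().readline().strip()

ready, received = threading.Event(), []
t = threading.Thread(target=vr_server, args=(ready, received))
t.start()
ready.wait()
ack = send_exercise({"exercise": "HipFlexion", "amplitude_deg": 30,
                     "speed_rpm": 5, "repetitions": 3})
t.join()
```

The acknowledgement step matters in this context: the UI should not mark an exercise as delivered until the VR side confirms receipt.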

2.2.2. Software Application Development

The Unified Modeling Language (UML) development diagram provides a clear visualization of how physical software modules are mapped to hardware resources. This diagram, represented in Figure 4, consists of 3 nodes connected through the TCP/IP protocol. It shows that the application implemented in C#, the VR application, and the robotic structure control application all run on the Windows operating system.
The artifact enabling the execution of the C# application is the executable file UI_LegUp.exe, which uses exercise data from the Exercices.csv file and interacts with components specific to the 6 implemented classes.
As for the VR application, the executable file VR_LegUP.exe is the artifact that allows it to run, interacting with 2 components.

Software Analysis

In order to graphically visualize all the functionalities offered by the software application, developed using the C# 12 programming language [32] and the Unity 2022.3 engine [33], the UML use case diagram [34,35,36,37] shown in Figure 5 was created. This diagram includes the following:
  • Eighteen use cases detailing the functionalities of the software application: twelve associated with the application implemented in the C# programming language and six related to the VR application;
  • Three actors: the human user (physical therapist), the controllers, and the control software of the robotic rehabilitation structure;
  • Eleven association relationships between actors and use cases;
  • Three dependency relationships between use cases.
A detailed description of all processes and algorithms [38,39,40] used to achieve the goal outlined by each use case is provided through activity diagrams. Figure 6 presents the UML activity diagram corresponding to the “Monitoring (graphical and tabular) of patient progress” use case, which includes 8 activities required to accomplish the goal associated with this case.
Considering the 12 C# application-specific functionalities outlined in the use case diagram, six classes were designed and implemented to meet the proposed specifications.
The class diagram in Figure 7 illustrates the six classes, the relationships between them, and the predefined packages used from the .NET Framework [41]. The purpose of the six classes is as follows:
  • The ExercisesGUI class allows the therapist to interact with the application’s GUI and is derived from the Form class in the System.Windows.Forms package;
  • The HistoryGUI class enables the therapist to monitor the patient’s progress and is also derived from the Form class in the System.Windows.Forms package;
  • The RobotGUI class facilitates interaction with the robotic structure control software and is derived from the Form class in the System.Windows.Forms package;
  • The RobotConn class establishes connections between the C# application and the robotic structure control software;
  • The UnityConn class manages connections between the C# graphical interface and the VR component implemented in Unity;
  • The C#MainFile class represents the core class of the C# application, consisting of objects from the ExercisesGUI, HistoryGUI, RobotGUI, RobotConn, and UnityConn classes, according to the composition relationships in the diagram.
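The composition relationships in the class diagram can be summarized in a brief skeleton. The sketch below uses Python rather than C# purely for compactness; the class bodies are empty placeholders, while the class names and the composition (the main class owning one instance of each of the other five) follow the diagram.

```python
class ExercisesGUI:   # therapist-facing exercise-configuration form
    pass

class HistoryGUI:     # patient-progress monitoring form
    pass

class RobotGUI:       # robot-control and video-monitoring form
    pass

class RobotConn:      # connection to the robot control software
    pass

class UnityConn:      # connection to the Unity VR environment
    pass

class MainApp:
    """Mirrors the composition of the C#MainFile class: it owns one
    instance of each GUI and connection class."""
    def __init__(self):
        self.exercises = ExercisesGUI()
        self.history = HistoryGUI()
        self.robot_gui = RobotGUI()
        self.robot_conn = RobotConn()
        self.unity_conn = UnityConn()
```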

User Interface

The parallel robotic rehabilitation system is controlled through a user interface, which consists of three menus, namely:
Exercises menu (Figure 8), which enables the therapist to customize the types of exercises according to the patient’s needs in the lower limb rehabilitation process. Once the exercises are configured, they are sent to the real robotic system for execution.
This menu includes the following commands:
  • To initiate the connection between the user interface and the VR application using the TCP/IP protocol, the therapist must press the “Connect” button, which has a blue background. Once the connection is established, the button’s background turns red, and the text changes to “Disconnect” (Figure 8 (1)).
  • Data transfer between the user interface and the VR application begins only after the “Start” button is pressed, at which point the button’s background color turns from blue to red, and the text changes from “Start” to “Stop” (Figure 8 (1)).
  • To command the parallel robot to perform different types of rehabilitation exercises, the following commands are used:
    The parallel robotic system allows the performance of 5 types of rehabilitation exercises (HipAbduction, Dorsiflexion, Inversion, HipFlexion, KneeFlexion). For each exercise, the therapist can set parameters such as amplitude, speed, and number of repetitions. After configuring the necessary settings, the therapist selects the desired type of exercise by pressing the corresponding button (Figure 8 (2)). Once the desired exercise button is pressed, it is added to a list (Figure 8 (3)) along with the order of the exercise, the execution amplitude, the speed, and the number of repetitions. After selecting and adding the rehabilitation exercises to the list, the therapist can save them for future use or load them into the list (Figure 8 (4)).
  • To initiate the rehabilitation exercises, the therapist must press the “Start exercises” button (Figure 8 (5)). At this point, the robotic rehabilitation structure begins performing the selected rehabilitation exercises for the patient, and the elapsed time for each individual exercise is also displayed.
  • An EMG (electromyography) sensor is used during patient monitoring to track muscle activity in rehabilitation exercises. This sensor measures the electrical signals (Figure 8 (6)) generated by muscles when they are activated. Increased muscle activity can indicate improvements in muscle strength and function.
  • The therapist can demonstrate to the patient the types of exercises that the parallel robotic rehabilitation system can perform using two controllers (Figure 8 (7)).
  • To close the user interface, the “Exit” button (Figure 8 (8)) must be pressed.
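The save/load workflow for the configured exercise list (persisted in the application's Exercices.csv file) amounts to a CSV round trip. The Python snippet below is an illustration only; the column names are assumptions based on the parameters named above (order, exercise type, amplitude, speed, repetitions), not the actual file schema.

```python
import csv
import io

FIELDS = ["order", "exercise", "amplitude_deg", "speed_rpm", "repetitions"]

def save_exercises(rows, stream):
    """Write the configured exercise list in execution order."""
    writer = csv.DictWriter(stream, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)

def load_exercises(stream):
    """Read a previously saved plan back into the exercise list."""
    return list(csv.DictReader(stream))

plan = [
    {"order": "1", "exercise": "HipFlexion", "amplitude_deg": "30",
     "speed_rpm": "5", "repetitions": "3"},
    {"order": "2", "exercise": "Dorsiflexion", "amplitude_deg": "20",
     "speed_rpm": "4", "repetitions": "5"},
]

# Round trip through an in-memory buffer standing in for the CSV file
buffer = io.StringIO()
save_exercises(plan, buffer)
buffer.seek(0)
restored = load_exercises(buffer)
```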
Video monitoring & Robot control menu (Figure 9), where the therapist can control the experimental robot and monitor the patient while the robotic system performs various lower limb recovery exercises. The interface includes the following commands:
  • Connect button—establishes a connection between the C# application and the control computer of the experimental rehabilitation robot.
  • Homing button—initializes the servomotors on the experimental robotic structure when pressed.
  • Start button—by pressing this button, the experimental robotic system will start performing the rehabilitation exercises.
  • Emergency STOP—an emergency button that cuts the power supply to the experimental robotic system.
Session history menu (Figure 10), which enables the therapist to analyze the patient’s progress over time by accessing recorded data from each exercise session. The interface includes the following features:
  • When pressing the “Display values” button (Figure 10 (1)), the recorded data for the rehabilitation exercises (Figure 10 (2)) and the data from the EMG sensor (Figure 10 (3)) for each session are displayed.
  • To delete the session data from the file, the “Delete values” button (Figure 10 (4)) must be pressed.
To perform a detailed analysis of the patient’s overall condition after using the robotic rehabilitation system, the therapist uses the session history menu of the user interface (Figure 10). This menu provides essential information for assessing the patient’s progress. It enables monitoring of the patient’s muscle activity, capturing data on muscle tone, activation level, and any variations observed during rehabilitation sessions. Additionally, the menu tracks the number of completed rehabilitation exercise sessions, which is crucial for assessing training consistency and frequency, while also promoting adherence to the rehabilitation plan. The sequence in which the exercises were performed is recorded as well, providing further insights into the patient’s patterns or preferences. This information allows the rehabilitation program to be adjusted to maximize effectiveness. The menu details the type of exercises performed, allowing the therapist to see the variety of activities and ensure that all the necessary muscle groups are adequately trained. The number of repetitions for each exercise is another key parameter, helping to assess the patient’s workload and adjust exercise intensity based on their progress. The exercise amplitude provides insight into the patient’s range of motion, indicating their flexibility and mobility. The speed of exercise execution is also monitored to ensure movements are performed at an appropriate pace, thus minimizing the risk of injury. Additionally, the exercise time is recorded to assess the overall duration of the rehabilitation sessions and to ensure that the patient is training long enough to achieve effective results. This combined data provides a complete picture of the patient’s progress and allows the therapist to tailor the rehabilitation program to meet the patient’s specific needs.
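A per-session summary of the quantities discussed above (repetitions, amplitude, speed, duration) can be computed directly from the recorded data. The sketch below is an illustrative Python aggregation; the field names and sample values are hypothetical, not the application's actual record format.

```python
def summarize_session(records):
    """Aggregate one session's recorded exercises into headline figures."""
    total_reps = sum(r["repetitions"] for r in records)
    total_time_s = sum(r["duration_s"] for r in records)
    mean_amplitude = sum(r["amplitude_deg"] for r in records) / len(records)
    return {
        "exercises_performed": [r["exercise"] for r in records],
        "total_repetitions": total_reps,
        "total_time_s": total_time_s,
        "mean_amplitude_deg": round(mean_amplitude, 1),
    }

# Hypothetical recorded session, in the order the exercises were performed
session = [
    {"exercise": "HipFlexion", "repetitions": 3, "amplitude_deg": 30,
     "speed_rpm": 5, "duration_s": 68},
    {"exercise": "KneeFlexion", "repetitions": 5, "amplitude_deg": 45,
     "speed_rpm": 4, "duration_s": 112},
]
summary = summarize_session(session)
```

Comparing such summaries across sessions is what lets the therapist adjust intensity and amplitude as the patient progresses.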

2.2.3. Using the VR Environment for the LegUp Robotic System

The Unity 3D platform was chosen for the VR environment for its accessibility, ease of use, support for multiple platforms, and flexibility in programming. The asset compression and decompression algorithm used in Unity 3D is LZ4, which is efficient and fast. Creating a three-dimensional VR experience in Unity 3D relies on various algorithms and techniques, including the following:
  • To handle complex graphics and rendering tasks, Unity uses the Universal Render Pipeline (URP) or High Definition Render Pipeline (HDRP), ensuring high-quality images in VR.
  • For realistic physical simulations and collision detection, essential in interactive VR environments, algorithms such as PhysX are used.
  • The A* (A-star) algorithm is frequently used for path identification in 3D spaces, facilitating the navigation of characters or objects in the VR environment.
  • The implementation of spatial audio algorithms ensures that sound sources are perceived as coming from precise locations in 3D space, thus enhancing immersion.
The use of robotic systems in physical rehabilitation has gained significant attention due to their ability to address motor deficits and alleviate the increased strain on healthcare systems [42]. The LegUp parallel robotic system is used for telerehabilitation applications, enabling therapists to remotely control the robotic system. In this setup, the patient is at home, and the therapist communicates with the patient from their office to develop personalized patient rehabilitation plans. During these planning sessions, the therapist and patient collaborate to establish a tailored plan that incorporates rehabilitation exercises using the LegUp robotic system. These sessions are designed to help the patient improve lower limb strength, flexibility, and mobility while focusing on reducing pain and discomfort. Once the rehabilitation plan is established, the therapist dons the VR headset (Figure 11a) and holds the controllers (Figure 11b) in both hands. After putting on the VR headset, the therapist enters an immersive, three-dimensional environment that facilitates deep interaction and active engagement, closely mimicking physical presence within that context. Through the two controllers, the therapist operates the robotic rehabilitation system to demonstrate to the patient the types of exercises that the robotic system can perform.
For an optimal viewing angle, the robotic structure, along with the patient on the table, is rotated using two buttons located on the controller held in the left hand (Figure 11b (4)). While using the two controllers, the user interface displays the type of exercise that is being performed at that moment. The background color, initially blue, changes to yellow to indicate the active exercise (Figure 8 (7)).
Once the therapist has presented the patient with the types of exercises that the robotic rehabilitation system can perform, they proceed to the next stage. Here, the therapist customizes the types of rehabilitation exercises to suit the patient’s specific needs using the user interface (Figure 8). Once tailored, these exercises are sent to the robotic structure to start the rehabilitation procedure for the patient’s lower limb, with the patient positioned on the robotic bed.

3. Results and Discussion

The use of VR technologies in rehabilitation has proven to have clear advantages, among which is a higher degree of patient engagement within the rehabilitation process, which further boosts efficiency [43]. This paper explored how VR can improve the rehabilitation experience for patients in remote/home settings by introducing additional safety features designed to facilitate the adoption of specific training devices. The multimodal, VR-based training interface developed here significantly improves the customization of different training exercises, tailoring them to the individual needs and rehabilitation requirements of the patient. In the virtual environment, a range of rehabilitation exercises is proposed and tested using the virtual robot and patient. Figure 12 illustrates the types of exercises that can be performed using the LegUp rehabilitation robot in the developed virtual environment:
  • HipAbduction—performed using the joystick (X axis) on the controller held in the right hand (Figure 11b (1));
  • HipFlexion—performed using the joystick (Y axis) on the controller held in the right hand (Figure 11b (1));
  • KneeFlexion—executed using two buttons located on the controller held in the right hand (Figure 11b (2));
  • Ankle Dorsiflexion—executed using the joystick (X axis) on the controller held in the left hand (Figure 11b (3));
  • Ankle Inversion—executed using the joystick (Y axis) on the controller held in the left hand (Figure 11b (3)).
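The controller-to-exercise assignments listed above can be captured in a small dispatch table. This Python sketch mirrors that mapping (right-hand joystick and buttons for hip/knee, left-hand joystick for ankle); the tuple keys are an illustrative encoding, not the actual Unity input API.

```python
# (hand, control, axis) -> exercise demonstrated in the VR environment
CONTROLLER_MAP = {
    ("right", "joystick", "X"): "HipAbduction",
    ("right", "joystick", "Y"): "HipFlexion",
    ("right", "buttons", None): "KneeFlexion",
    ("left", "joystick", "X"): "Dorsiflexion",
    ("left", "joystick", "Y"): "Inversion",
}

def exercise_for_input(hand, control, axis=None):
    """Resolve a controller event to the exercise it demonstrates,
    or None if the input is unmapped."""
    return CONTROLLER_MAP.get((hand, control, axis))
```

Keeping the mapping in one table makes it easy to highlight the active exercise in the UI (the blue-to-yellow change described for Figure 8 (7)) from the same lookup.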
After performing the initial rehabilitation training exercises in the virtual environment, the proposed training program is forwarded to the LegUp experimental model, located remotely in a clinical or home setting. Experimental tests with three healthy subjects have been performed using the current setup. Figure 10 presents each subject’s programmed rehabilitation plan. For example, subject 1 (S1) performs the first exercise, called Dorsiflexion, three times, which has a starting amplitude (Amp.S) of 30 degrees and a final amplitude (Amp.F) of 330 degrees, with a speed of 5 rpm in a time of 11:08 s. Figure 13 illustrates the user interface with video streaming for remote monitoring of the rehabilitation process. In this instance, the LegUp rehabilitation robot was placed in a clinical environment for closer monitoring, while the VR application and the user interface were installed on a remote PC. Encoder signals from the robot confirmed the accurate execution of the programmed exercises, validating the input from both the multimodal interface and the VR application. These systems are now ready for patient testing.
The LegUp robotic recovery system is currently in experimental testing at the Neurology Clinic 1, where it is being evaluated by the hospital’s ethics committee for inclusion in a battery of clinical trials with patients. We expect the trials to last 1 year, and upon completion, we will make the results available in a scientific article.
The system was tested in the laboratory with healthy subjects, and the results were very good: no incidents were recorded, the robot avoided singular configurations, and the perceived ergonomics were adequate. However, we anticipate a major difference between the healthy subjects, all young adults of normal weight, and future patients, who are often overweight and less able to adapt to the positions imposed by the use of the robot; this represents a limitation of the laboratory tests performed.

4. Summary and Conclusions

This article presents the development process of a VR-based environment tailored to meet the demands of stroke patients in personalized telerehabilitation programs. The VR environment provides an effective platform for validating the various features of the parallel robotic structure and for evaluating the various interfaces. In addition, this environment enables the simulation of complex scenarios and testing of control algorithms in a safe, controlled setting, thereby reducing the risks associated with direct physical testing. The robotic system used in this telerehabilitation application is designed to rehabilitate the lower limbs of bedridden patients, providing personalized support and exercises to accelerate recovery and improve the patient’s quality of life. The telerehabilitation system also provides compassionate, long-term medical support to disabled individuals in remote areas, significantly easing the burden on patients’ families and reducing medical expenses. From the input console, the physical therapist efficiently interacts with the robotic system through two controllers and an intuitive user interface, allowing precise monitoring and adjustment of rehabilitation parameters to ensure personalized and effective treatment for each patient. The robot has some limitations: it trains only in the supine position (not standing), so it mobilizes the limb without the dynamic component essential for walking recovery, which will be addressed at a later stage if necessary. The relatively large dimensions of the robot also make it more difficult to transport and install in confined spaces. Future work will focus on initial tests with patients with various impairments to validate both the VR environment and the LegUp experimental model. A UAT (User Acceptance Test) will also be performed, involving all stakeholders within this development stage, including physical therapists, patients, and neurologists.

Author Contributions

Conceptualization, F.C. and B.G.; methodology, F.C. and P.T.; software, F.C.; validation, D.P., B.G. and C.V.; formal analysis, A.P.; investigation, C.V. and P.T.; resources, D.P.; data curation, B.G.; writing—original draft preparation, F.C.; writing—review and editing, B.G.; visualization, A.P.; supervision, D.P. and P.T.; project administration, D.P.; funding acquisition, D.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the project “New frontiers in adaptive modular robotics for patient-centered medical rehabilitation—ASKLEPIOS”, funded by the European Union—NextGenerationEU and the Romanian Government, under the National Recovery and Resilience Plan for Romania, contract no. 760071/23.05.2023, code CF 121/15.11.2022, with the Romanian Ministry of Research, Innovation, and Digitalization, within Component 9, investment I8.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and involved only healthy subjects.

Informed Consent Statement

Written informed consent was obtained from the healthy subjects to publish this paper.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

Figure 1. General architecture of the system.
Figure 2. Parallel robotic system for lower limb rehabilitation: (a) kinematic scheme; (b) experimental model.
Figure 3. The interconnections between components.
Figure 4. UML deployment diagram.
Figure 5. UML use case diagram.
Figure 6. UML activity diagram.
Figure 7. UML class diagram.
Figure 8. User interface: Exercises.
Figure 9. User interface: Video monitoring & Robot control.
Figure 10. User interface: Session history.
Figure 11. Devices used in the control of the robotic system: (a) VR headset; (b) controllers.
Figure 12. Rehabilitation exercises in the virtual environment using the LegUp robot: (a,b) Hip Abduction; (c,d) Hip Flexion; (e,f) Knee Flexion; (g,h) Ankle Dorsiflexion; (i,j) Ankle Inversion.
Figure 13. Remote control through the user interface with integrated video streaming and monitoring.