A Taxonomy for Augmented and Mixed Reality Applications to Support Physical Exercises in Medical Rehabilitation—A Literature Review

Institute for Innovation Research and Management, Westphalian University of Applied Sciences, 44801 Bochum, Germany
Human-Computer Interaction Group, Westphalian University of Applied Sciences, 45897 Gelsenkirchen, Germany
Computer Graphics Group, Westphalian University of Applied Sciences, 45897 Gelsenkirchen, Germany
Author to whom correspondence should be addressed.
Healthcare 2022, 10(4), 646;
Received: 26 January 2022 / Revised: 16 March 2022 / Accepted: 26 March 2022 / Published: 30 March 2022
(This article belongs to the Special Issue Physical Activity and Rehabilitation Practice)


In the past 20 years, a vast amount of research has shown that Augmented and Mixed Reality applications can support physical exercises in medical rehabilitation. In this paper, we contribute a taxonomy providing an overview of the current state of research in this area. It is based on a comprehensive literature review conducted on the five databases Web of Science, ScienceDirect, PubMed, IEEE Xplore, and ACM up to July 2021. Out of 776 identified references, a final selection of 91 papers was made, discussing the usage of visual stimuli delivered by AR/MR or similar technology to enhance the performance of physical exercises in medical rehabilitation. The taxonomy bridges the gap between a medical perspective (Patient Type, Medical Purpose) and the Interaction Design, focusing on Output Technologies and Visual Guidance. Most approaches aim to improve autonomy in the absence of a therapist and to increase motivation to improve adherence. Technology is still focused on screen-based approaches, while the deeper analysis of Visual Guidance revealed 13 distinct, recurring abstract types of elements. Based on the analysis, implications and research opportunities are presented to guide future work.

1. Introduction

As people age, serious illnesses and injuries requiring medical rehabilitation become more common [1]. The demographic shift towards a higher mean age and life expectancy in many industrial nations means that the demand for rehabilitation increases every year. Types of rehabilitation include medical, vocational, and residential/community rehabilitation. Vocational rehabilitation focuses on enabling persons to return to, or remain in, employment; residential rehabilitation services offer group living settings that enable disabled people to live as independently as possible; medical rehabilitation aims at minimizing the consequences of disease, illness, injury, aging, and congenital factors. As we focus on the use of augmented and mixed reality in physical exercises, the area of medical rehabilitation is most relevant to our research. It consists of a variety of treatment methods, including the fields of physiotherapy, occupational therapy, and physical therapy [2]. These three types of therapy are closely related. Therefore, we will refer to all types of physical exercises performed by patients within these fields by the umbrella term physical rehabilitation. The three kinds of therapy are mostly used for similar clinical pictures, such as physical damage and disabilities due to illness or accidents. The spectrum ranges from damage to the spine and joints to diseases of the central nervous system or the effects of a stroke. Expanding on the latter, according to the World Health Organization (WHO), strokes are not only often fatal but also the third leading cause of disability worldwide [3], reducing mobility in half of the stroke survivors aged 65 and above [4].
Patients who have suffered a stroke often have long-term impairments that restrict them from performing activities of daily life such as work, leisure, or sports [5], requiring a rehabilitation process to recover from illness and regain a healthy lifestyle, specifically improving mobility, coordination, strength, and endurance. Besides strokes, many other health conditions, injuries, and disabilities require intensive rehabilitation programs which are constructed similarly and share common challenges, such as the active performance of physical exercises, tasks, or activities by the patient [6,7,8]. Therefore, this paper does not focus on a specific condition or type of therapy, but on physical rehabilitation processes in general.
Conventionally, specialized prevention and rehabilitation facilities provide in-clinic physical therapy, which is quite demanding in terms of time and cost, requiring supervision as well as individual treatment plans and modifications as per the patient’s progress [2]. Simultaneously, the number of physical therapists employed at these facilities has been reduced over the last decade [1]. Within both in- and outpatient rehabilitation facilities, a single therapist commonly supervises multiple patients simultaneously, which makes it quite difficult for them to support each patient adequately. Considering the high rehabilitation costs and the limited availability of physical therapists, patients often do not receive the close supervision needed. Further, rehabilitation programs take place not only in specialized facilities but also require repetitive exercising at home for several months, if possible without supervision by a therapist [2]. This repetition makes the process quite lengthy and boring. Consequently, patients often lose motivation to continue the program. Additionally, the pandemic caused by COVID-19, a highly infectious respiratory disease, has led to restrictions on outdoor mobility and gatherings, including therapy sessions, resulting in a decline in overall activity [9], especially among older people. This further emphasizes the importance of exercises that can be performed at home, requiring less or only remote supervision by a therapist.
A vast amount of research has aimed to employ and adapt technology to support the rehabilitation process and overcome these problems. In particular, different implementations of Augmented and Mixed Reality (AR/MR) technologies have been applied and, as we will outline in more detail in this paper, these show that such approaches might indeed help to support the rehabilitation process. The advantage of AR/MR is that it allows merging visual digital cues with the real environment of the patient, thus letting the user focus on the physical task or exercise while still providing visual guidance in a digital form which is anchored to the real world to provide context. In their systematic literature review, Cavalcanti et al. approach the relationship between Augmented Reality and motor rehabilitation from a different angle [10]. They analyze the existing literature to provide an overview of the different types of assessment methods to evaluate the usability of such technologies. While not the primary focus, they also give an overview of affected body parts and medical conditions that were addressed in the past. However, their review does not detail the medical purpose nor interaction design of the individual approaches.
The current literature, as will be outlined in examples later, sheds light on significant aspects of visual elements that have helped patients benefit from this technology in the rehabilitation process and that can be investigated further. For example, different forms of instructions or feedback delivered by visual cues (text, image, 3D visualization), sometimes accompanied by audio, can be utilized to communicate the correct movements and thereby enable the patient to perform the exercises correctly [11]. Besides that, SleeveAR [12], an interesting novel approach, provides AR projection-based real-time performance feedback that can assist patients in performing exercises autonomously, without direct supervision by a therapist. The work by Garcia et al. [13] uses mobile AR and game-based approaches to treat people suffering from ankle sprains. The game-based approach offers interesting visual elements that increase the patient’s motivation and adherence to continue the rehabilitation program. AR/MR systems such as [14,15] offer interesting visualizations that may prove more effective than traditional therapy approaches.
Due to these potentially beneficial properties of AR/MR in the context of physical rehabilitation and the vast amount of existing work in this area, we identified the need for a comprehensive literature review. From the perspective of HCI, an in-depth examination of visual guidance also seems necessary and can benefit HCI researchers and practitioners in the long run. The categorization will not only help to provide a better understanding of the visual guidance elements used in rehabilitation applications but will also uncover interesting insights and identify research gaps. Thus, this paper aims at providing a taxonomy which:
  • Identifies different forms of Visual Guidance aimed to help patients in their rehabilitation, for example, by allowing them to perform exercises autonomously at home without any supervision by a therapist;
  • Provides an overview of the variety of AR/MR Technologies currently used for visual output in the field of physical rehabilitation;
  • Derives insights regarding the relations between Patient Types, Medical Purposes, Technologies, and specific Visual Guidance approaches used to support the patient in their rehabilitation process.

2. Materials and Methods

To gain a thorough understanding of the different AR/MR visual guidance approaches that assist patients in the rehabilitation process, the relevant literature was retrieved from different databases, containing a mixture of medical and HCI literature. For the chosen research question, we decided to perform the literature review in the form of a scoping review. Unlike systematic reviews, scoping reviews aim to address broader topics and sources employing different methods and study designs, while focusing less on narrow questions or on the assessment of study quality. They are most useful for providing an overview of the extent, range, and nature of currently evolving areas of research, to summarize research findings, or to identify research gaps [16,17]. In our review, we aim to thoroughly explore these aspects with a focus on research on augmented and mixed reality applications to support physical exercises in medical rehabilitation. Therefore, we consider a scoping review the appropriate method for the purpose of developing a taxonomy in our case.
The analysis process began with a prescreening focusing on the abstract of each paper, followed by a full-text assessment of a relevant subset. To the resulting set of references, we applied an open-coding scheme, meaning that relevant categories and codes were not predefined but identified based on the screening results and adjusted through the thorough analysis of the reviewed studies. This way, a meaningful segmentation could be derived in relation to the research question. Three main and eight subcategories evolved during the coding process and were subsequently applied to the whole dataset. Codes within a category were not mutually exclusive. The classification of the final set of papers along the coding scheme was carried out by the first three coauthors of this review. As with the selection process, each application was reviewed by two researchers. Discrepancies were resolved in discussion with the third reviewer. A detailed description of the eligibility criteria and the study selection process is given below.

2.1. Eligibility Criteria

As motivated in the introduction, our main focus is on the following items:
  • The performance of physical exercises in medical rehabilitation. As mentioned in the introduction, medical rehabilitation consists of a variety of treatment methods, including the fields of physiotherapy, occupational therapy, and physical therapy [2]. As these types of therapy are interrelated, we will refer to all kinds of physical exercises performed by patients within these fields by the term physical rehabilitation.
  • The output technologies Augmented and Mixed Reality (AR/MR) as means to support the different types of physical exercises. We think that the vast amount of available research, combined with the potential that future developments of the technology offer, merits such a rather strict technological focus of this literature review. As has been the approach of other researchers (e.g., [18]), we will not distinguish between AR and MR but use the terms interchangeably or refer to them as AR/MR. As a foundation, we mostly follow the definition of Azuma, who summarized in his early survey of 1997 that AR (a) combines the real and virtual, (b) is interactive in real-time, and (c) is registered in 3D [19]. In particular, the last requirement is not always met by all types of output technology. For example, smart glasses often superimpose a digital layer without registering it in 3D. For the sake of thoroughness, and because the practical definitions of AR and MR are still evolving [20], we included such approaches in our review as well. It is noteworthy that we did not actively exclude applications employing Virtual Reality (VR) as long as our eligibility criteria were met. However, the number of VR applications found may not be representative, since we did not explicitly search for the term Virtual Reality during our paper selection process.
  • The actual use of visual stimuli or visual guidance to support patients during the rehabilitation task. Here, we did not limit our review to graphical stimuli but included representations with text as well.

2.2. Literature Sources and Search Strategies

We based our review on the following five databases: Web of Science, ScienceDirect, PubMed, IEEE Xplore, and ACM. This diverse focus is necessary considering the multidisciplinary nature of the research question, connecting information technology, especially the area of human–computer interaction, with the field of medical rehabilitation. Web of Science and ScienceDirect offer a broad, multidisciplinary overview of the scientific spectrum. PubMed is the leading database within the medical field, including research on physical therapy from both the patients’ and therapists’ viewpoints. IEEE Xplore and ACM, on the other hand, provide a stronger technological perspective.
We chose search terms in accordance with the research question and iteratively expanded them based on the keywords of relevant references found. To be shortlisted in the database search, the following two requirements had to be met:
  • The research must take place in the context of physical rehabilitation, i.e., physical exercises in medical rehabilitation. Therefore, at least one of the following search terms had to be included in the title, keywords, or abstract: “Physical Therapy”, “Physiotherapy”, “Physical exercise”, “Physical rehabilitation”, “Occupational Therapy”, “Stroke rehabilitation”.
  • Furthermore, the research must use some kind of AR-/MR-based visual guidance or stimuli to support the patient in their rehabilitation. Consequently, one of these search terms must be found in the title, keywords, or abstract in addition to the first requirement: “Mixed Reality”, “Extended Reality”, “Augmented Reality”, “Visual Cues”, “Visualization”, “Visualisation”, “Video-based”.
Early database searches in the field of medical rehabilitation emphasized the importance of research aimed at the rehabilitation of stroke patients, resulting in an explicit mention of this condition in the search terms. In addition, the search did not strictly require AR/MR terms but also allowed papers to be examined which used some other kind of visual guidance.
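As an illustration only, the two-part shortlisting requirement above can be sketched as a simple boolean filter. The term lists are those given in this section; the helper `is_shortlisted` and its matching logic are a simplification we introduce for illustration, since each database uses its own query syntax.

```python
# Illustrative sketch of the two-part search requirement (not the actual
# database queries): a record qualifies only if at least one term from
# EACH group appears in its title, keywords, or abstract.

REHAB_TERMS = [
    "physical therapy", "physiotherapy", "physical exercise",
    "physical rehabilitation", "occupational therapy", "stroke rehabilitation",
]
VISUAL_TERMS = [
    "mixed reality", "extended reality", "augmented reality",
    "visual cues", "visualization", "visualisation", "video-based",
]

def is_shortlisted(title: str, abstract: str, keywords: str) -> bool:
    """Check the AND of the two OR-groups over the combined record text."""
    text = f"{title} {abstract} {keywords}".lower()
    return (any(term in text for term in REHAB_TERMS)
            and any(term in text for term in VISUAL_TERMS))

# A record matching both groups is shortlisted; one matching neither is not.
print(is_shortlisted(
    "Augmented Reality Feedback for Stroke Rehabilitation",
    "We present visual cues for upper-limb exercises.",
    "AR; physiotherapy"))
```

In a real database query, the same structure would be expressed as two OR-clauses joined by AND over the title/abstract/keyword fields.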
We included original articles, conference papers, and demonstrations in the search, as well as literature reviews, as these may reference interesting papers otherwise not found in the search. In order to illustrate both the progression and the state of the art of research in the highlighted area, no restrictions were placed on the publication year. Due to the terms chosen, the focus of the search was on English publications, but encountered references in other languages were translated and considered as well. The last search took place on 27 July 2021.

2.3. Study Selection

A total of 776 references were found. An overview of the selection process is depicted in Figure 1 and summarized below. First, 217 duplicates were removed. The abstracts of the remaining 559 papers were each screened independently by two different researchers. Discrepancies were resolved by discussion with a third researcher, who would make the final call. Another 395 references were excluded in this step in accordance with the eligibility criteria due to the following reasons:
  • The reference addresses physical exercises in medical rehabilitation, but without the use of AR/MR-based or comparable visual guidance or support for the patient (104).
  • The reference addresses medical rehabilitation but not physical exercise (38).
  • The reference addresses the use of visual support in medical education or training, aimed at the medical professional (29).
  • The reference uses no distinct visualization, but instead either considers the feasibility of conventional rehabilitation delivered via video calls or the effect of commercially available mobile games on general activity (17).
  • The reference explicitly considers psychotherapy or psychiatry without physical exercise (15).
  • The reference addresses other medical fields, such as medical imaging or surgery (125).
  • The reference is off-topic in other ways, for example, including other data visualization in Machine Learning, Artificial Intelligence, Veterinary Sciences, applications not employing AR/MR or VR, and game reviews (67).
This resulted in 164 references which were assessed in more detail, i.e., via a full-text analysis, employing the same dual control principle mentioned before. Another 80 references were excluded in this step for the reasons stated below:
  • The references utilize no or an insufficient level of visualization (14).
  • The references mention a system using visualization but do not explain it in enough detail, which would help other researchers and practitioners to build upon their results (13).
  • The references utilize a system or application which was already (and better) explained in another paper included in the dataset. This occurred many times, as research groups would commonly deploy a single application across a number of distinct papers, such as a preliminary report, a feasibility study, a paper explaining the technical development, and a clinical long-term study (33).
  • The references are literature reviews showing no distinct visualization, but refer to other potentially interesting papers (9).
  • The references are off-topic in other ways (5).
  • In addition, one single reference could not be procured (1).
This assessment left 84 studies to be included in the review. Despite our focus on AR/MR technology, we did not exclude papers employing Virtual Reality, as long as they fit our search terms, explored the use of visualization to support patients within the field of physical rehabilitation, and sufficiently explained their visualization. In addition to the initial search, we screened the references cited by the set of 164 in-depth assessed papers and identified 7 further references, which we included as well. This led to a final set of 91 references, which were examined in the following sections. The publication years of the considered papers range from 2002 to current releases.
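The selection counts reported above can be checked with one line of arithmetic per step; the following is purely a sanity check of the reported figures, not part of the review method itself.

```python
# Sanity check of the selection-flow arithmetic reported in this section.
identified = 776
duplicates = 217
screened = identified - duplicates              # abstracts screened
excluded_on_abstract = 395
full_text = screened - excluded_on_abstract     # full-text assessments
excluded_on_full_text = 80
included = full_text - excluded_on_full_text    # studies retained
from_reference_lists = 7                        # found via citation screening
final_set = included + from_reference_lists     # references in the review
print(screened, full_text, included, final_set)
```

Running this reproduces the 559 screened abstracts, 164 full-text assessments, 84 included studies, and the final set of 91 references.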

3. Results

3.1. Taxonomy Construction

Among the 91 references, some include two or more diverse approaches or applications, for example, for means of comparison, for distinct tasks of different body parts, or to allow the user to choose out of a set. In cases where there was a clear distinction between applications, all were considered individually in our review. However, very similar applications within a single reference, such as slightly modified pick-and-place tasks, were treated as one. As a result, we identified 114 distinct applications, which form the basis of our taxonomy. An overview of the 114 applications, structured by reference and the coding scheme of the taxonomy, is provided in Appendix A. Overall, we were able to identify three main categories and eight sub-categories, which form the basic hierarchical structure of the taxonomy (see Figure 2). Further items within sub-categories were introduced where a larger number of matching applications with distinct features could be identified.
Firstly, we identified the main category Patient Type, as the 114 applications focused on different patient groups, which could further be separated into different Medical Conditions as well as Affected Body Parts. While these may not be mutually exclusive from a medical point of view, not all references provided the necessary medical details and rigor, which made us opt for such a rather practical separation.
The most prevalent Medical Conditions considered in the references were of a Neurological nature (78%, 89 applications), predominantly Stroke (73%, 83 applications), with only a few mentions of other conditions such as Parkinson’s disease (3%, 3 applications). Further, interventions following Injury or Surgery (9%, 10 applications) were considered as well. Finally, a set of Other (25%, 28 applications) with either unspecific or diverse conditions was identified. For example, we found a single study examining neck pain [21] as well as an article focusing on the technical motion-capturing aspect of a proposed full-body physical therapy system, without further specifying any medical condition [22].
Regarding the Affected Body Parts, 81 (71%) of the applications especially aimed at Upper Limb, out of which 52 (46%) were focused on the Hands, Fingers, or Wrist, 20 (18%) on the Shoulder, and 21 (18%) on the Elbow. A total of 33 (29%) applications considered the Lower Limb, while only 4 (4%) of these further emphasized either the Knee or Ankle. Another three (3%) applications regarded Other parts of the body, namely the neck once [21], and movement of the body in general twice [23,24].
Secondly, we identified the main category Medical Purpose, covering the reasoning behind the different applications from a medical perspective. We further subdivided this category into Bodily Functions and Added Value.
The Bodily Function describes the primary, functional purpose of the physical rehabilitation. Four types of functions were identified. In total, 96 (84%) of the applications were aimed at restoring or improving Motor Functions of certain joints or limbs. A further 26 (23%) aimed to improve Gait or Balance, while 24 (21%) aimed at building Strength. Only 13 (11%) applications explicitly addressed the enhancement of Cognitive Skills.
The Added Value captures the benefits that researchers aim to achieve through the means of technology used in the applications, going beyond what conventional physical therapy provides. Most of the applications aim to provide several such benefits. The most common added value is an increase in Autonomy, allowing patients to perform the exercise without close supervision by a therapist, which applies to 91 (80%) of the applications. An additional goal is to overcome the repetitive and boring nature of many therapeutic exercises by increasing the motivation of patients. Here, two aspects play a role. First, 87 (76%) applications aim to improve patients’ Adherence, meaning their likelihood to continue with their rehabilitation program over a prolonged period, while 47 (41%) aim at boosting their Effort during a session. Furthermore, 75 (66%) of the applications assist the patient during an exercise by providing real-time Performance Feedback, whereas 46 (40%) allow for an Analysis of the performance after the session. Finally, a noteworthy added value, which a technological solution may provide, is an increase in patients’ Accuracy in performing the exercises, which is considered by 39 (34%) of the identified applications.
The third and the last main category, Interaction Design, revolves around the technical solutions as implemented in the different applications. It, therefore, comprises the used Output Technology, the Application Type, the concrete kind of Visual Guidance employed to aid the patient, as well as the Input Technology required for the AR/MR technology to work.
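For reference, the hierarchy of three main and eight sub-categories described so far can be summarized in a small data structure; this is an illustrative sketch only, and, as noted above, codes within a category were not mutually exclusive.

```python
# Illustrative summary of the taxonomy's category hierarchy
# (three main categories, eight sub-categories, as described in the text).
TAXONOMY = {
    "Patient Type": ["Medical Condition", "Affected Body Part"],
    "Medical Purpose": ["Bodily Function", "Added Value"],
    "Interaction Design": [
        "Output Technology", "Application Type",
        "Visual Guidance", "Input Technology",
    ],
}

# Cross-check against the counts stated in the text.
assert len(TAXONOMY) == 3
assert sum(len(subs) for subs in TAXONOMY.values()) == 8
```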
As for the Output Technology, 72 (63%) of the applications can be considered screen-based, i.e., making use of a type of 2D display, mobile phone, or projector image to superimpose digital content on video-based real-world imagery. Only 8 (7%) of these applied a handheld Mobile Phone-based AR/MR approach, similar to Pokémon Go and the like. Another 30 (26%) of the applications used a Head-Mounted Display (HMD), out of which 13 (11%) used a VR HMD and 17 (15%) an AR/MR HMD. A further 14 (12%) of the applications delivered the application via a Spatial-AR/MR approach, which uses a projector to display digital content onto the patient and their surroundings, thereby registering the virtual content in 2D space [25].
The Application Type can be either Game-based, signifying playful and fun solutions employing gamification elements, as 55 (48%) of the applications were, or Task-based, as were 53 (46%). Compared to their game-based counterparts, Task-based applications take a more serious approach, focusing on the execution of simple, repetitive exercises rather than motivational aspects. As another item in this category, we extracted Mirror Therapy as a highly specific type found in 12 (11%) of the applications. In mirror therapy, the brain is tricked into thinking that an affected limb, for example, an amputated arm, is able to move without pain. This is achieved by placing a mirror in a specific spatial configuration next to the healthy arm. The brain then connects the mirrored image with the affected limb, so that when the patient tries to move both arms (and only the healthy one fully responds), it assumes that the affected limb is also moving correctly, helping to train the relevant function and mitigate pain [26].
The main emphasis of our analysis lies in the identification of the Visual Guidance elements. In total, 13 recurring abstract concepts were found here, which we classified into three groups: Guided Interaction, Demonstrated Interaction, and Feedback. The concepts are explained in the following.
The group Guided Interaction includes visual support elements that aid the patient by demanding immediate interplay. We identified the following elements:
  • Target: A total of 64 (56%) of the applications use some form of Target, which we define as a spatial destination to be reached by the patient. For example, in Garcia and Navarro [13], the user controls a ball to hit bricks that act as a target with a paddle controlled by the foot movement (see Figure 3a). Similarly, another game in Bouteraa et al. [27] shows randomly appearing targets that patients need to grasp with their hand and the help of an exoskeleton.
  • Path: Related to targets are Paths, which not only show the destination but also a trajectory to be followed. Compared to targets, paths provide information during the entire execution of the exercise. In total, 21 (18%) of the applications use this type of visualization. In SleeveAR [12], a path is shown to direct the arm movement in order to execute the exercise (see Figure 3b). Physio@Home [28] visualizes a wedge that includes a movement arc for proper arm guidance (see Figure 3f).
  • Direction: Overall, 15 (13%) applications do not show an entire path but indicate a Direction to be followed instead, usually visualized by an arrow. In the ARDance game [29], the user has to move the AR marker towards the left or right diagonal directions to complete dance moves. LightGuide [30] uses a simple 3D arrow projected onto the user’s hand to visualize the direction of movement. It also uses the Hue cue, a visual hint that uses spatial coloring to indicate direction (see Figure 3d). Tang et al. [28] also visualize such an arrow in addition to their path (see Figure 3f).
  • Object: The most common element is Objects, found in 69 (61%) of the analyzed applications. Objects, unlike targets, can be interacted with in some way. For example, in the Ocean Catch Game by Park et al. [31], the user trains grasping movements by catching virtual fish. Similarly, Alamri et al. created a Shelf Exercise [32], which reenacts the motion of placement by showing a virtual cup that can be placed on different spots in a shelf by the user.
  • Racket/Cursor: A Racket or Cursor is an object directly controlled by the patient which allows them to manipulate other objects; 21 (18%) of the analyzed solutions utilize such a racket. For example, with the help of a marker, the bar is used to hit the ball in a game of Pong as shown by Dinevan et al. [33]. FruitNinja as applied by Seyedebrahimi et al. [34] shows a circle to depict the current position of the hand to control fruit targets appearing on the screen.
  • Obstacle: Obstacles describe hindrances to be avoided or bypassed. They are employed in 11 (10%) of the applications. To facilitate practicing gait and balance exercises, the obstacles can be projected directly onto the treadmill [35] (see Figure 3c) or shown in the form of virtual objects such as blocks or tree trunks [36] to prompt certain leg movements.
Visual Guidance can also be provided by Demonstrated Interaction of the correct execution of the exercise. Demonstrations can be delivered in the ways mentioned below.
  • Recording: The demonstration may be delivered by a prerecorded Video, Animation, or Picture, as was the case in 23 (20%) of the analyzed applications. For instance, UNICARE Home+ by Yeo et al. [38] shows a guide video on the side of the screen to depict the exercise movement. Another example by Khan et al. [39] uses the “follow the leader” approach by showing an animated virtual avatar to be followed by the user.
  • Virtual Coach: Unlike an unchanging recording, a Virtual Coach provides interaction with the user. Such a coach was used in seven (6%) of the analyzed solutions. The holographic representation of a virtual coach as depicted in Mostajeran et al. [37] allows users to look at the coach from different angles and see instructions from the same perspective, fostering interaction (see Figure 3e).
  • Overlay: An Overlay displays a visual instruction directly perceived on the user’s body. In total, 15 (13%) applications employed such an Overlay. In a Kinect-based system as shown by Pachoulakis et al. [40], the user’s body is overlaid with skeleton joints displaying the trainer’s movement, allowing them to follow the exercise.
The last group of visual guidance contains elements providing Feedback. These elements guide the patient indirectly by adding another layer of information related to the task or exercise. Often, these elements inform the patient about the quality of their performance.
  • Text: In total, 23 (20%) applications deliver Text-Based Instructions or Feedback. ARKanoidAR [11] uses text-based instructions to guide the user in performing the exercise correctly (see Figure 3g). Additionally, a virtual piano game [41] motivates patients by showing text-based feedback such as “Well Done” based on the performance of the user.
  • Score/Graph: A total of 60 (53%) of the analyzed solutions provide the patients with some kind of informative Graph, Score or Progress Bar related to their performance. When the task is performed correctly, the score is increased, as depicted in FruitNinja [34] or ARKanoidAR [11] games. Such game elements increase the user’s motivation to keep performing the exercise.
  • Color-code: Information can also be conveyed using a Color-Code, which is employed by 23 (20%) applications. To indicate whether the user has reached the target appropriately, interACTION [42] uses a color-coded mechanism: the Target is initially displayed in red, changes to yellow once the user approaches it, and turns green once it is reached. Another approach, used by [15], compares the user’s pose and the desired pose with a color code: once the user’s pose matches the desired pose, the circular glyphs are highlighted in green.
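The traffic-light logic described for interACTION can be sketched as a simple threshold function. The function name, the normalized-distance input, and the two thresholds below are illustrative assumptions, not details taken from the original system:

```python
def target_color(distance: float, near: float = 0.25, hit: float = 0.05) -> str:
    """Map the remaining (normalized) distance to the target to a color,
    mirroring the red -> yellow -> green scheme reported for interACTION.
    The thresholds are illustrative, not taken from the original app."""
    if distance <= hit:
        return "green"   # target reached
    if distance <= near:
        return "yellow"  # approaching the target
    return "red"         # not yet near the target
```

In a real system the distance would be derived from motion tracking data, e.g., the angular difference between the current and target joint angle.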
  • Self-Evaluation: Altogether, 26 (23%) applications grant the patient some form of improved Self-Evaluation, delivered either by an additional camera angle or a mirror to better visualize the affected body part, as shown in Physio@Home [28] (see Figure 3f), or by displaying an avatar or skeleton based on motion tracking data to provide information on the patient’s own movement. One such example is depicted in [26] for gait symmetry, where the user’s whole-body movement is shown via an avatar with different views for a better understanding of gait deviations.
Lastly, the Input Technology utilized to allow interaction between the patient and the system is considered part of the Interaction Design category. Most applications require some form of motion capture, the most common method being Optical Motion Tracking, employed by 48 (43%) applications. Examples include the work of [40,43], which is based on Kinect motion tracking, and games for hand rehabilitation that employ Leap Motion tracking [44]. Also quite popular is Marker-based Motion Tracking, utilized by 38 (34%) solutions. A few prototypes, such as Physio@Home [28] and the gait symmetry application by [26], used Vicon motion tracking cameras for precise tracking; the Vicon system requires wearing explicit markers on different body parts. Another 19 (17%) depend on nonoptical Sensor-based Motion Tracking, such as Xsens used by [36]. Nine (8%) applications used some form of Haptic Device to simulate tactile sensation in the AR environment when users touch virtual objects, as in [45], and 29 (26%) employed an Additional Device, such as the treadmill used by [46] for gait rehabilitation.
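The grouping of Visual Guidance elements used throughout this section can be summarized in a small lookup structure. This is an illustrative sketch of the three subgroups as named in the surrounding text, listing only the elements discussed in this excerpt, not code from any of the analyzed systems:

```python
# Visual Guidance subgroups and the elements discussed in this section.
VISUAL_GUIDANCE = {
    "Guided Interaction": ["Target", "Path", "Object", "Racket/Cursor", "Obstacle"],
    "Demonstrated Interaction": ["Recording", "Virtual Coach", "Overlay"],
    "Feedback": ["Text", "Score/Graph", "Color-code", "Self-Evaluation"],
}

def subgroup_of(element: str) -> str:
    """Return the subgroup an element belongs to,
    e.g., 'Obstacle' -> 'Guided Interaction'."""
    for group, elements in VISUAL_GUIDANCE.items():
        if element in elements:
            return group
    raise KeyError(element)
```

Such a structure could, for instance, be used to tally which subgroups a given application covers when coding the literature.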

3.2. Time-Based Analysis

We identified 91 references published between 2002 and the first half of 2021 related to the field of AR/MR visual guidance in physical rehabilitation. Only 8 of these were published prior to 2010, followed by a significant increase in research interest, peaking at 20 publications in 2020 alone. The detailed distribution per year is illustrated in Figure 4. The most commonly used keywords provided by this set of references are displayed in Figure 5. The data were harmonized in this step to group different representations, such as Mixed-Reality and Mixed Reality, as well as plural forms. A significant bias towards the term Augmented Reality across the entire time period is apparent, outnumbering Mixed Reality 10-fold. Virtual Reality was mentioned in 18 references despite not being included in the search terms. In contrast, seemingly general terms such as Visual Cues, Visualization, and Visual Feedback see much less use.
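The keyword harmonization step described above, grouping variants such as Mixed-Reality and Mixed Reality as well as plural forms, can be sketched as a small normalization function. The exact procedure used in the review is not specified, so the rules below (lowercasing, hyphen removal, naive depluralization of the final word) are assumptions:

```python
import re

def harmonize_keyword(raw: str) -> str:
    """Collapse spelling variants of a keyword before counting frequencies,
    e.g., 'Mixed-Reality', 'mixed reality', and 'Mixed Realities' all map
    to the same normalized form. The rules are illustrative assumptions."""
    kw = re.sub(r"\s+", " ", raw.strip().lower().replace("-", " "))
    head, _, last = kw.rpartition(" ")
    # naive depluralization of the final word only
    if last.endswith("ies"):
        last = last[:-3] + "y"
    elif last.endswith("s") and not last.endswith("ss"):
        last = last[:-1]
    return (head + " " + last).strip()
```

Counting the harmonized forms with `collections.Counter` would then yield the keyword frequencies visualized in Figure 5.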

3.3. Patient Type and Bodily Functions

As became apparent during the open coding process, a large majority (73%) of the identified references explicitly focus on rehabilitation following Stroke. We did not expect such a prevalence, as our search encompassed five other, broader search terms as well. In contrast, rehabilitation following Injuries or Surgery is explicitly the focus of only 10 (9%) of the references. A similar yet less pronounced imbalance is the focus on the Upper Limbs compared to the Lower Limbs, with 81 (71%) versus 33 (29%) references. Distinct differences become apparent between research in these areas. These findings coincide with those of Cavalcanti et al., who, in their literature review of studies employing AR in rehabilitation, found a strong prevalence of Upper Limb research and a severe yet less pronounced focus on Stroke [10].
  • First, studies aimed at the Upper Limb predominantly focus on individual parts of the limb, sometimes multiple, as seen in Figure 6. Among the 81 references covering the Upper Limb, 51 (63%) target the Hand, Fingers or Wrist, 21 (26%) the Elbow, and 20 (25%) the Shoulder, leaving only 19 (23%) references without a specific focus. In contrast, the 33 references regarding the Lower Limb are overwhelmingly nonspecific (79%), with only 7 applications (21%) focusing on the Knee, Ankle or both.
  • Further, there is a difference in the targeted Bodily Functions depending on the considered Body Part. Out of the 81 Upper Limb applications, 79 (98%) target the restoration or improvement of specific Motor Functions, while 19 (23%) aim to increase the patient’s muscle Strength. In comparison, the largest fraction of the 33 Lower Limb references, 24 (73%), addresses the patient’s Gait or Balance, especially in order to prevent future falls. Improvement of specific Motor Functions, however, is targeted in only 19 (58%) references here. Muscle Strength is also considered less often, finding explicit mention in five (15%) of the references.
  • Lastly, research focused on both the Lower Limb and Gait has increased immensely in recent years, as illustrated in Figure 7. Prior to 2019, we identified a total of 13 references targeting the Lower Limb, representing only 18% of the research papers published in this time frame. Further, only eight of these (62%) aim at Gait or Balance. In comparison, 20 out of 43 applications published since 2019 target the Lower Limb (47%). Additionally, the share of Lower Limb research focusing on Gait or Balance rose to 80%.

3.4. Medical Purpose and Application Types

Differences become apparent between the variants of Application Type and their relation with Medical Purpose:
  • Regarding the Bodily Functions, 11 out of the 13 (85%) applications aiming to provide Cognitive training do so by employing Game elements.
  • The 55 Game-based applications are especially often used to increase the patients’ motivation, with 51 (93%) of them targeting increased Adherence. For the 53 Task-based applications, Adherence is a much less prominent goal, which is only addressed by 31 (58%) applications. In total, 30 (55%) of the Game-based applications further aim to increase the patients’ Effort during the session, compared to only 16 (30%) within the set of Task-based approaches.
  • Instead, 28 (53%) Task-based applications focus on increased Accuracy, compared to only 11 (20%) within the set of Game-based applications.
Figure 8 shows the usage of Visual Guidance elements by Application Type and targeted Added Value. The main insights are summarized below.
  • In general, Task-based applications are more balanced in their use of visual elements, while Game-based are more focused on certain elements, namely Objects (91%), Scores or Graphs (80%), and Targets (65%).
  • The most common visual elements implemented in Game-based applications are from the Guided Interaction group. Elements from this group are used in 53 Game-based applications (96%) and make up 59% of all elements used here. In comparison, they are used in 38 Task-based applications (72%) and account for 46% of elements used.
  • In contrast, only nine Game-based applications (16%) utilize one form of Demonstrated Interaction each, accounting for 4% of elements. Within Task-based applications, 24 (47%) employ at least one Demonstrated Interaction element, accounting for 19% of all elements.

4. Discussion

In this section, we will take a closer look into the Interaction Design approaches and in particular discuss different approaches and implications within the categories Visual Guidance and Output Technology.

4.1. Implications for Visual Guidance

Garcia and Navarro were able to show that the MobileReh app [13] enhanced overall user interaction for ankle sprain rehabilitation by providing several Guided Interaction elements, such as Targets, Objects, Paths, and a Racket/Cursor. The game required no controllers or sensors to operate. The main challenge the authors reported was keeping the game engaging, which they tried to tackle by providing higher difficulty levels. Alamri et al. found that virtual Objects, such as the cup in their Shelf Exercise approach [32], tend to motivate users and keep the exercises engaging. Simple activities of daily life, such as pick-and-place movements, can be made more interesting if real-world objects are overlaid with virtual objects. The seamless integration of virtual objects into the real environment can be achieved by advanced AR/MR technology, such as HMD-AR/MR or Spatial-AR/MR, and the authors conclude that these can enhance the user experience and increase patients’ participation in the rehabilitation program. For applications aiming to improve gait adaptability, Fonteyn et al. and subsequently Timmermans et al. were able to show that Guided Interaction involving Obstacles, i.e., avoiding/dodging hindrances, can indeed improve gait performance [35,47].
Seyedebrahimi et al. conducted a study to compare different implementations of the FruitNinja game [34]. They compared what they call an AR and a VR mode. Given the description in their work, however, we regard these two variants as corresponding to the Spatial-AR and Screen-based categories, respectively. The authors observed that the Spatial-AR mode afforded better hand-eye coordination, which plays a significant role in completing tasks. The Objects were projected directly onto the tabletop, which made it easier for users to interact. In the Screen-based mode, by contrast, where the output was displayed on a monitor, visual and proprioceptive feedback were decoupled because the gaze was directed at the monitor and not the hand, resulting in poorer motor performance.
Guided Interaction vs. Demonstration and Virtual Coach: Several researchers compared their Guided Interaction approaches to Video Demonstration. For example, Sousa et al. found that the movement Path as shown in SleeveAR [12] yielded better exercise performance than a Video Demonstration alone. The Path not only guided the user on how to move, but drawing the trajectories over the original exercises also helped users understand the areas of improvement. In a similar manner, Tang et al. performed an experiment to investigate whether the wedge visualization (including a movement arc and a Directional Arrow), as shown in Physio@Home [28], helped users perform the exercise better than a Video Demonstration. It was shown that the movement Path guided users better in performing the steps of the hand exercise correctly. For the elbow exercise, however, the results were no better than the Video Demonstration due to a technical limitation. Overall, the corrective guides helped people realign themselves when needed. In their work, Sodhi et al. explored on-body projected hints (Overlay). In their approach, LightGuide [30], these helped users perform the movement more accurately than a simple Video Demonstration. It was reported that Video Demonstration is unable to adequately show the next steps. In comparison, visual hints such as a 3D Arrow helped users exercise at their own tempo and guided them to move in a particular direction. A 3D Pathlet, on the other hand, served as a feedforward component, indicating in advance what lies ahead on the Path. It remains an open question whether such visual hints on a Screen allow users to perform the movement as accurately as when they are projected onto the body.
While these comparisons with Demonstrated Interaction mostly chose Video Recordings, other researchers have explored Virtual Coaches as a digital representation of, or surrogate for, a real therapist. Mostajeran et al. [48] conducted a study to find out whether Virtual Coaches in the form of holographic representations have the potential to help older adults in Balance training. The results suggest that the Virtual Coach was well received by the users, and its holographic representation made it easy to look at it from different perspectives, something that is difficult to achieve with Screen-based systems. The Virtual Coach was able to interact with users with the help of audio and speech feedback. Users reported that the visual representation of the avatar should be more human-like to make it feel real and alive. The authors conclude that such a personalized coaching system has the potential to be used in home-based therapy settings. Another form of Virtual Coach was used by Han et al. [49]. They Overlaid the Virtual Coach’s guiding bone structure onto the user’s body to perform Tai-Chi maneuvers recorded by a motion capture system. The participants reported that they were able to perform tasks well using this immersive technique, and better than with simple instructions via pictures or words.
Feedback: Another approach to addressing gait performance was investigated, for example, by Liu et al., focusing on avatar-based feedback [26]. They found that such an approach provides many advantages, as it helps users Self-Evaluate their movements. They showed that the availability of multiple views of an avatar allows different perceptions depending on the user’s current needs, which makes it quite versatile. Additionally, it provided unique concurrent information on walking performance and showed improvements compared to communicating such information directly through clinicians or therapists. Bell et al. evaluated the real-time visual Feedback in their interACTION app, which was well received by users in a study conducted to check the feasibility of the system [42]. It was reported that the visual Feedback helped users perform the exercises correctly. The real-time qualitative and quantitative Feedback was displayed using a 2D animation and a numerical counter depicting the joint angle. Moreover, a Color-Coded mechanism helped users identify whether they were within range of the Target, enabling them to perform the range-of-motion exercise as accurately as possible.
Investigating different forms and modalities of Feedback, Cavalcanti et al. identified multiple aspects that might help in designing future rehabilitation applications [11]. For example, they were able to show that audio feedback was well received by users because it did not interfere with other visual information. However, the efficacy of audio feedback depends on the environment and may not work in clinical practice, e.g., due to disruption caused by external sound. Text Feedback was perceived better than Image Feedback; however, it was reported as critical that the text be displayed long enough for the message to be understood clearly. For Image-based Feedback, the authors used a GIF illustration, which failed to convey the exact beginning and end of a movement. Scores, on the other hand, helped to increase Adherence, i.e., to continue playing the game. Overall, Cavalcanti et al. reported that all forms of Feedback helped in some way to make correct movements and have the potential to motivate users in a physical rehabilitation setting.
Overall, the current state of research makes it difficult to draw generalizing conclusions. Many factors influence the choice and suitability of certain Visual Guidance elements. What worked well for one application might not work at all for another. Our main insights can be summarized as follows:
  • Even for non-game-like applications, the use of AR/MR still provides a certain fascination and helps to keep users engaged. Whether this is merely a short-lived novelty effect or something that can be sustained over time remains to be seen.
  • There is some evidence of a close relation between specific types of Visual Guidance elements and certain Medical Purposes, such as Gait & Balance being addressed through Obstacle elements. However, the current body of work does not provide systematic studies of such aspects but bases this conclusion on the overall evaluation of an application.
  • There is growing evidence that simple video-based tele-rehabilitation approaches cannot compete with AR/MR approaches, which should provide an interesting opportunity for future business models, as tele-rehabilitation and tele-fitness courses currently dominate the home market.

4.2. Implications for Output Technology

The choice of Output Technology is seemingly unrelated to the publication date, as seen in Figure 9. A relatively low-tech Screen remains the predominant choice of display across the entire period under review. In total, Screen-based approaches account for 56% of output devices, while Head-Mounted Displays (HMD) are only used in 26% of the applications. The immense technological advances over the last 10 years for HMDs, and in particular AR/MR-HMDs, including lighter hardware, more user-friendly interfaces, and increased immersive potential due to greatly improved specifications, have only very recently begun to emerge in the field of physical rehabilitation. From 2019 on, the number of such applications has risen and reached a consistent level of use, surprisingly not to the detriment of Screen-based approaches, which remained the most-used technology even in 2019 and 2020.
A certain level of influence on the Output Technology can be found regarding the Affected Body Part as well as Bodily Function as shown in Figure 10. However, upon closer inspection of the relevant data, we could not find a clear pattern that would explain, for example, the heavier reliance on Screen-based approaches for upper body parts compared to Lower Body parts. In addition, we also could not find any influence of the potential Added Value on the choice of Output Technology.
Reasoning behind Technology Choices: To gain a better understanding of the reasoning behind technology choices for Output Technology, we conducted an additional in-depth qualitative analysis among the most recent references from our data-set, published in 2019 and after. The analysis showed that only a fraction of the considered references provides any justification for the selection of the Output Technology. These justifications then differ again between the type of visualization technology chosen, AR/MR or VR, and the type of hardware selected, e.g., HMD or Screen-based.
The choice of VR: The use of VR, for example, is discussed controversially. As a major disadvantage of VR, it is argued that patients do not receive feedback from the real world, which can lead to great insecurity and even health risks, especially for stroke patients, who often suffer from balance problems [39,50,51]. In other papers, the separation of the patient from their real environment is considered a major advantage, because it prevents the patient from being distracted by the real environment [52,53] or allows the patient to be immersed in a completely virtual world, which can be specifically designed to support the exercises [43].
The choice of AR/MR: There are also differing opinions on whether and where AR/MR is suitable for use, some of which may be caused by unclear definitions of AR and/or MR. For example, Theunissen et al. claim that a property of MR is the filtering out of real visual information, which can then lead to safety risks due to the shielding from the real world, similar to VR [50]. However, most consider AR/MR to have the great advantage of allowing patients to be immersed in a virtual environment while still being able to perceive the real world [54]. It is noteworthy that no disadvantage of AR was explicitly mentioned in the selected papers.
The choice of Screens: The main reasons given for selecting either Mobile Phones or other Screens were that these are inexpensive and commercially available. In addition, patients are usually familiar with their operation [15,50]. Reference was also made to other work showing that tele-rehabilitation has proven itself effective in the past [42], and especially that Mobile Phones and other Screens are commonly found in households. One disadvantage of screens mentioned was the more difficult hand-eye coordination and split attention, since visual attention cannot be on both the user’s own body and the screen simultaneously [34].
The choice of HMDs: HMDs primarily offer an advantage in representing the depth of motion in 3-dimensional space [38,48]. They are also significantly less expensive than having to set up a complete multisensory room [53]. A major disadvantage of HMDs is the impairment of head movement or an entire range of motion due to the inherent weight of the HMD and patients not being accustomed to carrying anything on their heads during movements or exercises [50].
A general implication to be taken from this analysis is that the choice of output technology depends on specific characteristics of the patient’s Condition and age, e.g., limitations in Gait functionality, Balance, and prior experience using technology [37,51]. Further, the scientific aim of the research plays an important role in the selection of technology. For example, some papers identify research gaps and target them specifically [43], which may have increased the focus on emerging AR/MR technology in recent years.

4.3. Research Opportunities

Based on our analysis, we identified a number of trends and open issues in AR/MR research within the field of Physical Rehabilitation, which we summarize below and which we think can help to guide future research:
  • The analyzed literature reveals a strong focus on rehabilitation following Neurological conditions, especially Stroke. This may be because research grants are more readily provided to researchers who tackle such universal problems. Still, it would help if more research attempted to generalize the findings to other Medical Conditions. In particular, research dealing with AR/MR-based interventions following Injury or physical Disabilities is severely lacking.
  • Research on the Upper Limb is both more prevalent in general and more diverse, routinely focusing on individual joints and muscles, which can at least partly be explained by the importance of the Hands (and related body parts) for everyday activities and independence. In contrast, a majority of Lower Limb research targets Gait or Balance. Research on the use of AR/MR technology in the rehabilitation and exercise of individual joints and muscles of the Lower Limb remains scarce. Additionally, research aimed at muscle Strength is severely underrepresented as well, in particular for the Lower Limb.
  • Regarding the utilized Output Technology, Screens still account for almost half of all display types used, even in most recent years. In contrast, there is ample research opportunity for more advanced AR/MR technologies such as HMDs and Spatial-AR through projection. HMDs might have an advantage for home-based exercise, but Spatial-AR could be installed in facilities and provide much of the same experience without the drawbacks of a (currently) heavy-weight head-worn device.
  • The Visual Guidance elements Virtual Coach and on-body Overlay are relatively seldom used types of demonstration, although the existing research shows promising results (e.g., [48,49]). While both Virtual Coaches and on-body Overlays are quite complex to realize and require tracking of the environment or body, we see an opportunity here, in particular in combination with modern AR/MR-HMDs and their integrated inside-out tracking functionality.
  • Furthermore, there is a lack of research explicitly comparing the effectiveness and advantages of selected Visual Guidance elements. While a few references did compare elements [14,28,30], they did so only for very specific use cases. Therefore, this literature review cannot provide more guidance towards the usefulness and appropriateness of certain elements. Still, our review can serve as an inspirational starting point for practitioners and other researchers alike.

4.4. Limitations and Weaknesses

There are some limitations and potential weaknesses of this literature review that could be addressed in future research.
  • Firstly, in this scoping literature review, information was gathered from different studies and sources without assessing or weighing the quality, accuracy, and validity of the papers. Instead, the focus was on collecting evidence to provide an overview of the research topic and assist researchers. Therefore, no assertions or suggestions can be made about the feasibility of the employed visualizations, nor is it possible to derive a fair general comparison between them.
  • Furthermore, the review was exploratory, employing an open-coding process. Therefore, the chosen categories, codes, and assignments could be considered arbitrary. In order to mitigate this factor, all steps leading to and including the final coding process were performed by at least two independent researchers and discussed with a third reviewer before a final call was made.
  • The selection of search keywords is, as with every literature review, a topic for debate. We deliberately did not impose a predefined set of keywords on the search process, but rather applied an iterative process, adding relevant keywords along the way.
  • Lastly, within this review, we identified distinct applications. Occasionally, this led to an inclusion of several such applications presented within a single reference or by the same research group. Although distinct, these routinely share many similarities, such as using the same technology, thereby distorting results. Therefore, the results do not accurately describe the amount and quality of research in each category. In order to limit this bias, we did not include papers describing an identical system, but only the one best representing it.

4.5. Strengths

Besides the above-mentioned limitations, our approach offers a number of strengths summarized in this section.
  • The review addresses the broad topic of the use of Augmented and Mixed Reality applications in several areas of medical rehabilitation and for multiple conditions and affected body parts. Further, it takes into consideration a broad spectrum of sources employing different methods and study designs. It is therefore able to provide an overview of the extent, range, and nature of currently evolving areas of research as well as to summarize research findings.
  • In doing so, it also helps to identify trends, needs, and research gaps to aid future research.
  • The review further provides a first taxonomy to classify the research area. The advantage of the exploratory approach employing an open-coding method lies in its bottom-up construction of categories and items based on actual research and relevant items within the sources.
  • Additionally, several in-depth qualitative comparisons of selected recent studies were conducted to provide a better understanding of the reasoning behind technology choices for Output Technology, thereby further guiding future research and experiment set-ups.

5. Conclusions

We analyzed 114 applications presented in 91 papers within the field of AR/MR research regarding physical exercises in medical rehabilitation. Employing an open coding process, we derived a taxonomy that distinguishes between different Patient Types, the Medical Purpose, and the Interaction Design of the proposed applications. In particular, the latter explores the relationship between Output Technologies, Application Types, Input Technologies, and Visual Guidance elements. We identified 13 distinct, recurring abstract types of elements providing Visual Guidance, which we classified further into the three subgroups Guided Interaction, Demonstrated Interaction, and Feedback. A deeper analysis of the selected data-set revealed the positive relationship between elements and the Medical Purpose as well as some insights into the effectiveness of certain Visual Guidance elements. We also presented qualitative evidence for researchers’ reasons for selecting certain Output Technologies. In summary, the taxonomy provides an overview of relevant attributes of AR/MR applications for physical rehabilitation. It bridges the gap between a medical/physiological perspective (Patient Type, Medical Purpose) and the Technology and Interaction Design. We are confident that it can serve as a reference for new researchers and practitioners alike who aim to develop new approaches or gain an understanding of existing work in a specific subarea. In particular, the set of Visual Guidance elements can serve as an inspiration for future Interaction Designs. The taxonomy also revealed the predominant motivations for developing technological solutions for the physical rehabilitation process; the two biggest problems, based on what has been addressed in research, appear to be the lack of autonomy (or reliance on the presence of a therapist) and the lack of motivation (which may often be correlated with the absence of a therapist).
Most researchers do not want to replace the therapist but either see the need for additional physical exercise or think that additional technology would help therapists to focus more on individual patients. The lack of motivation is mainly tackled by Game-like Application designs or gamification elements, such as Performance Feedback. Although our focus was on AR/MR applications, which are nowadays often naturally associated with HMDs, such as a Microsoft HoloLens device, most applications still use screens in a variety of form factors to provide visual guidance. Even though some researchers acknowledge the difficulties of split-attention and the less immersive experience of such Screen-based approaches, the wide availability, low cost, and technical limitations of HMD-based approaches (i.e., limited field of view) may help to explain this situation.
Given that, we pointed out a variety of research opportunities. In particular, we want to highlight the following three areas we see as critical for future research: (1) applying and studying the efficacy of advanced AR/MR technologies to overcome the limitations of Screen-based approaches; (2) studies comparing and understanding the efficacy of different forms of Visual Guidance and Application Types; (3) tackling currently underrepresented Body Parts or Bodily Functions.
Finally, we want to stress the importance of multidisciplinary research in this area. Only a combination of experts from the medical field as well as HCI (design, technology, user research) is able to address the taxonomy as a whole and make sensible technology and medical application choices.

Author Contributions

Conceptualization, all authors; methodology, B.B., A.J. and A.R.; software, B.B.; validation, B.B., A.J. and A.R.; formal analysis, B.B., A.J. and A.R.; investigation, B.B.; data curation, B.B.; writing—original draft preparation, B.B., A.J. and A.R.; writing—review and editing, J.G. and G.L.; visualization, B.B.; supervision, J.G. and G.L.; project administration, B.B.; funding acquisition, J.G. and G.L. All authors have read and agreed to the published version of the manuscript.


Funding

This research was funded by the Ministerium für Wirtschaft, Innovation, Digitalisierung und Energie des Landes Nordrhein-Westfalen. We acknowledge support from the Open Access Publication Fund of the Westfälische Hochschule, University of Applied Sciences.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The datasets used in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.


Abbreviations

The following abbreviations are used in this manuscript:
ACM = Association for Computing Machinery
AR = Augmented Reality
HCI = Human Computer Interaction
HMD = Head-Mounted Display
IEEE = Institute of Electrical and Electronics Engineers
MR = Mixed Reality
VR = Virtual Reality
WHO = World Health Organization

Appendix A

Table A1. Overview of the selected references—author names A–H, part 1/2, Abbr. in Table A7. Distinct applications within a reference are marked with lowercase letters.
1st Author | Ref. | Med. Condition | Aff. Body Part | Bodily Function | Added Value
Achanccaray | [52] | NE, NS | UL, FH | MF | AC, AH, EF
Agopyan | [55] | NE, NS | LL, KN | MF, GB | AC, EF
Alamri | [32] a | NE, NS | UL, FH | MF | AH, AU, PA
Alamri | [32] b | NE, NS | UL, FH | MF | AH, AU, PA, PF
Alexandre | [56] | NE, NS, CO | UL, FH | MF, ST | AC, AH, AU, PA, PF
Aung | [57] a | NE, NS | UL | MF, ST | AH, AU, EF, PF
Aung | [57] b | NE, NS | UL, FH, SH, EL | MF, ST | AH, AU, EF, PF
Bakker | [23] | NE, NS | BO | CG | AH, AU, EF, PF
Bouteraa | [27] | NE, NS | UL, FH | MF | AH, PA, PF
Broeren | [60] | NE, NS | UL | MF | AH, AU, PF
Broeren | [61] | NE, NS | UL, FH | MF, CG | AH, AU, EF
Burke | [62] a | NE, NS | UL | MF, ST | AH, AU, EF, PF
Burke | [62] b | NE, NS | UL | MF, ST, CG | AH, AU, PF
Camporesi | [63] | CO | UL | MF | AH, AU, PA, PF
Cavalcanti | [11] | NE | UL, SH, EL | MF | AC, AH, AU, EF, PA, PF
Colomer | [14] a | NE, NS | UL, FH | MF | AH, AU, EF, PF
Colomer | [14] b | NE, NS | UL, FH, SH, EL | MF | AH, AU, EF, PF
Corrêa | [64] | CO | UL, FH, EL | MF | AH, EF
Desai | [68] a | NE, NS | LL | GB | AH, AU, EF, PA, PF
Desai | [68] b | NE, NS | UL, EL | MF | AH, AU, EF, PA, PF
Desai | [68] c | NE, NS | UL, EL | MF, CG | AH, AU, EF, PA, PF
Dinevan | [33] a | NE, NS | UL, FH | MF | AH, AU, PF
Dinevan | [33] b | NE, NS | UL, SH | MF, ST | AH, AU, PF
Fonteyn | [47] | NE | LL | MF, GB | AU, PA, PF
Garcia | [13] | IS | LL, AN | MF | AH, AU, EF, PF
Gauthier | [70] | NE, NS | UL, FH, SH, EL | MF, ST, CG | AH, AU, EF
Hacioglu | [73] | CO | UL, FH | MF | AH
Mohd Hashim | [53] | NE, NS | LL | MF, CG | AH, AU
Hoermann | [75] | NE, NS | UL, FH | MF, CG | AH, AU, EF
Hoermann | [76] | NE, NS | UL, FH | MF | AH, EF
Table A2. Overview of the selected references—author names A–H, part 2/2, Abbr. in Table A7. Distinct applications within a reference are marked with lowercase letters.
1st Author | Ref. | Output Tech. | Application Type | Visual Guidance | Input Tech.
Achanccaray | [52] | HVR | TB, MI | OB, RE | AD
Agopyan | [55] | SC | TB | SE | MBT, AD
Alamri | [32] a | HVR | TB | TA, OB | MBT
Alamri | [32] b | HVR | TB | OB, PT | MBT
Alexandre | [56] | SC | GA | TA, OB, RC, GS | SBT, HD
Bouteraa | [27] | SC | GA | TA, OB, SE, GS | OT, AD
Broeren | [60] | SC | GA | TA, OB, GS | HD
Broeren | [61] | SC | GA | TA, OB, RC, OS, GS | HD
Burke | [62] a | SC | GA | TA, OB, RC, GS | MBT
Burke | [62] b | SC | TB | TA, OB, GS, CC | MBT
Camporesi | [63] | SC | TB | VT, RE, SE, GS | OT, MBT
Cavalcanti | [11] | SC | GA | TA, OB, RC, RE, GS, TE | OT
Colomer | [14] a | SAR | GA | TA, OB, GS, TE | OT
Colomer | [14] b | SAR | GA | OB, RC, GS | OT
Desai | [68] a | SC | GA | OB, SE, GS | OT
Desai | [68] b | SC | GA | OB, SE, GS | OT
Desai | [68] c | SC | GA | TA, OB, SE, GS, TE | OT
Dinevan | [33] a | SC | GA | OB, RC, GS | MBT
Dinevan | [33] b | SC | GA | OB, RC, GS, TE | MBT, AD
Gauthier | [70] | SC | GA | TA, OB, OS, SE, GS | OT
Hacioglu | [73] | SC | GA | OB, PT | OT
Mohd Hashim | [53] | HVR | GA | OB, GS, TE | OT
Hoermann | [75] | SC | GA, MI | OB, GS, CC | OT, AD
Table A3. Overview of the selected references—author names I–Sr, part 1/2, Abbr. in Table A7. Distinct applications within a reference are marked with lowercase letters.
1st Author | Ref. | Med. Condition | Aff. Body Part | Bodily Function | Added Value
Jung | [79] | NE, NS | LL, AN | MF, GB, ST | –
Keskin | [44] a | NE, NS | UL, FH | MF | AH, AU, EF, PA
Keskin | [44] b | NE, NS | UL, FH, EL | MF | AH, AU, EF, PA, PF
Kloster | [21] | IS | BO | – | AH, AU, PF
Koroleva | [82] a | NE, NS | LL | MF, GB | AC, PA, PF
Koroleva | [82] b | NE, NS | UL, FH | MF, ST | AC, PA, PF
Kowatsch | [24] | CO | UL, LL, BO | ST | AH, AU, EF, PF
Loureiro | [87] | NE, NS | UL, FH, SH, EL | MF | AC, AH
Manuli | [89] | NE, NS | LL | MF, GB, ST, CG | AH, EF, PF
Mostajeran | [48] | CO | – | GB | AU, PA, PF
Mostajeran | [37] | CO | – | MF, GB, CG | AC, AH, AU, PA, PF
Mouraux | [91] | NE, IS | UL | MF | AH, AU
Nanapragasam | [93] | NE, NS | LL | GB | –
Pachoulakis | [40] | NE, NP | UL, FH, SH, EL, LL | MF | AC, AH, AU, EF, PA, PF
Paredes | [54] | CO | LL | GB, CG | AH, AU, EF, PA, PF
Phongamwong | [46] | NE, NS | LL | MF, GB | AC, PA, PF
Ramirez | [94] | NE, NS | LL | ST | AH, AU, EF
Regenbrecht | [95] | NE, NS | UL, FH | MF | AH
Saraee | [96] | NE, NS, IS, CO | UL | MF, ST | AH, EF, PA
Seyedebrahimi | [34] | NE, NS, NP | UL, FH | MF, CG | EF
Sodhi | [30] a | CO | UL, FH | MF | AC, AU
Sodhi | [30] b | CO | UL, FH | MF | AC, AU
Sodhi | [30] c | CO | UL, FH | MF | AC, AU
Sodhi | [30] d | CO | UL, FH | MF | AC, AU, PF
Song | [97] a | NE, NS | UL | MF | AH, AU, EF, PF
Song | [97] b | NE, NS | UL | MF, CG | AH, AU, PF
Table A4. Overview of the selected references—author names I–Sr, part 2/2, Abbr. in Table A7. Distinct applications within a reference are marked with lowercase letters.
1st Author | Ref. | Output Tech. | Application Type | Visual Guidance | Input Tech.
Keskin | [44] a | SC | TB | TA, OB | OT
Keskin | [44] b | SC | GA | OB, RC, GS | OT
Koroleva | [82] a | SC | TB | PT, OS, GS | OT
Koroleva | [82] b | SC | GA | OB, PT, GS, CC | OT
Kowatsch | [24] | HAR, SC | TB | VT, TE | SBT
Loureiro | [87] | SC | TB | TA, OB, PT | HD
Manuli | [89] | SC | TB | TA, OS | AD
Mostajeran | [37] | HAR | GA | TA, OB, PT, VT | OT, AD
Mouraux | [91] | SC | GA, MI | TA, OB, SE, CC | OT
Nanapragasam | [93] | SC | – | TA, SE | MBT, AD
Pachoulakis | [40] | SC | TB | PT, DI, RE, OV, GS, TE | OT
Phongamwong | [46] | SC | TB | SE, GS | MBT, AD
Saraee | [96] | SC | TB | TA, OB, RC, PT | HD
Seyedebrahimi | [34] | SAR, SC | GA | TA, RC, GS | MBT
Sodhi | [30] a | SAR | TB | DI, OV | OT
Sodhi | [30] b | SAR | TB | DI, OV | OT
Sodhi | [30] c | SAR | TB | TA, PT, OV, CC | OT
Sodhi | [30] d | SAR | TB | DI, OV, CC | OT
Table A5. Overview of the selected references—author names Sy–Z, part 1/2, Abbr. in Table A7. Distinct applications within a reference are marked with lowercase letters.
1st Author | Ref. | Med. Condition | Aff. Body Part | Bodily Function | Added Value
Syed Ali Fathima | [99] | CO | UL, FH, SH, EL | MF | AH, AU, PA, PF
Theunissen | [50] a | NE, NS | LL | MF, GB | AC, AH, AU, EF, PA, PF
Theunissen | [50] b | NE, NS | LL | MF, GB | AC, AH, AU, EF, PA, PF
Theunissen | [50] c | NE, NS | LL | MF, GB | AC, AH, AU, EF, PA, PF
Thikey | [100] | NE, NS | LL, AN, KN | MF, GB | AC, AH, PA, PF
Timmermans | [35] a | NE, NS | LL | GB | AH, AU, PF
Timmermans | [35] b | NE, NS | LL | GB | AH, AU, PF
Timmermans | [35] c | NE, NS | LL | GB | AH, AU, PF
Trojan | [101] a | NE, NS, CO | UL, FH | MF | AH, AU, PA, PF
Trojan | [101] b | NE, NS, CO | UL, FH | MF | AH, AU, PA, PF
Trojan | [101] c | NE, NS, CO | UL, FH | MF | AH, AU, PA, PF
Trojan | [101] d | NE, NS, CO | UL, FH | MF | AH, AU, PA, PF
Velloso | [22] | CO | – | ST | AC, AU, EF, PA, PF
Vidrios-Serrano | [45] | NE, NS | UL | MF | AH, AU, PA
Voinea | [102] | NE, NS | UL, FH | MF | AH, AU
Table A6. Overview of the selected references—author names Sy–Z, part 2/2, Abbr. in Table A7. Distinct applications within a reference are marked with lowercase letters.
1st Author | Ref. | Output Tech. | Application Type | Visual Guidance | Input Tech.
Syed Ali Fathima | [99] | SC | GA | TA, OB, GS | OT
Theunissen | [50] a | SC | GA | GS | MBT, SBT, AD
Theunissen | [50] b | MAR | GA | TA, OB, GS | SBT
Theunissen | [50] c | SC | TB | – | SBT
Thikey | [100] | SC | TB | TA, SE, GS, CC | MBT
Timmermans | [35] a | SAR | TB | OS | AD
Timmermans | [35] b | SAR | TB | TA | AD
Timmermans | [35] c | SAR | GA | OB | AD
Trojan | [101] a | HAR | MI | TA, RE, CC | OT
Trojan | [101] b | HAR | MI | TA, RE | OT
Trojan | [101] c | HAR | GA, MI | TA, OB, RE | OT
Trojan | [101] d | HAR | MI | OB, RE | OT
Velloso | [22] | SC | TB | DI, RE, OV, SE, GS, TE, CC | OT, SBT
Vidrios-Serrano | [45] | HAR | TB | TA, OB, RC, CC | MBT, HD
Voinea | [102] | HVR | TB, MI | VT, RE, SE | –
Table A7. Abbreviations for Table A1, Table A2, Table A3, Table A4, Table A5, Table A6.
Medical Condition: NE = Neurological; NS = Stroke; NP = Parkinson; IS = Injury/Surgery; CO = Other Condition
Affected Body Part: UL = Upper Limb; FH = Hand/Fingers/Wrists; SH = Shoulder; EL = Elbow; LL = Lower Limb; AN = Ankle; KN = Knee; BO = Other
Bodily Function: MF = Motor function; GB = Gait/Balance; ST = Strength; CG = Cognitive function
Added Value: AH = Adherence; PA = Post-session Analysis; AC = Accuracy; AU = Autonomy; PF = Performance Feedback; EF = Effort
Output Technology: HVR = VR HMD; HAR = AR/MR HMD; SAR = Spatial AR/MR; MAR = Mobile AR/MR; SC = Other Screen-based
Application Type: GA = Game-based; TB = Task-based; MI = Mirror Therapy
Visual Guidance: TA = Target; OB = Object; RC = Racket/Cursor; PT = Path; OS = Obstacle; DI = Direction; VT = Virtual Coach; RE = Recording; OV = Overlay; SE = Self-Evaluation; GS = Score/Graph; TE = Text; CC = Color-code
Input Technology: OT = Optical Tracking; MBT = Marker-based Tracking; AD = Additional Device; HD = Haptic Device; SBT = Sensor-based Tracking
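The coded rows in Tables A1–A6 can be read as records over the categories defined in Table A7. As a purely illustrative sketch (not part of the original study; the class and field names are our own), such records could be represented and filtered programmatically, e.g., to find all screen-based applications addressing gait/balance:

```python
from dataclasses import dataclass

# Hypothetical encoding of the taxonomy rows; abbreviations follow Table A7.
@dataclass
class TaxonomyEntry:
    author: str
    ref: str
    medical_condition: tuple   # e.g., ("NE", "NS")
    body_parts: tuple          # e.g., ("UL", "FH")
    bodily_functions: tuple    # e.g., ("MF",)
    added_value: tuple         # e.g., ("AC", "AH", "EF")
    output_tech: tuple         # e.g., ("SC",)
    application_type: tuple    # e.g., ("TB",)
    visual_guidance: tuple     # e.g., ("SE",)
    input_tech: tuple          # e.g., ("MBT", "AD")

# Two rows transcribed from Tables A1/A2 for illustration.
entries = [
    TaxonomyEntry("Achanccaray", "[52]", ("NE", "NS"), ("UL", "FH"), ("MF",),
                  ("AC", "AH", "EF"), ("HVR",), ("TB", "MI"), ("OB", "RE"), ("AD",)),
    TaxonomyEntry("Agopyan", "[55]", ("NE", "NS"), ("LL", "KN"), ("MF", "GB"),
                  ("AC", "EF"), ("SC",), ("TB",), ("SE",), ("MBT", "AD")),
]

# Example query: screen-based (SC) applications targeting gait/balance (GB).
screen_gait = [e.author for e in entries
               if "SC" in e.output_tech and "GB" in e.bodily_functions]
```

Such a machine-readable form of the taxonomy would make cross-dimension queries (e.g., which Visual Guidance elements co-occur with which Output Technologies) straightforward to reproduce.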


References

  1. Gimigliano, F.; Negrini, S. The World Health Organization “Rehabilitation 2030: A call for action”. Eur. J. Phys. Rehabil. Med. 2017, 53, 155–168. [Google Scholar] [CrossRef] [PubMed]
  2. Lenz, G. Rehabilitation Delivery. In Encyclopedia of Public Health; Springer: Dordrecht, The Netherlands, 2008; pp. 1243–1246. [Google Scholar] [CrossRef]
  3. WHO. Global Health Estimates: Life Expectancy and Leading Causes of Death and Disability; WHO: Geneva, Switzerland, 2021. [Google Scholar]
  4. Virani, S.S.; Alonso, A.; Benjamin, E.J.; Bittencourt, M.S.; Callaway, C.W.; Carson, A.P.; Chamberlain, A.M.; Chang, A.R.; Cheng, S.; Delling, F.N.; et al. Heart disease and stroke statistics—2020 update: A report from the American Heart Association. Circulation 2020, 141, 139–596. [Google Scholar] [CrossRef] [PubMed]
  5. McKenna, C.; Chen, P.; Barrett, A.M. Stroke: Impact on Life and Daily Function. In Changes in the Brain: Impact on Daily Life; Springer: Berlin, Germany, 2017; pp. 87–115. [Google Scholar] [CrossRef]
  6. Sjölund, B.H. Physical Medicine and Rehabilitation. In Encyclopedia of Pain; Springer: Berlin/Heidelberg, Germany, 2013; Volume 4, pp. 2912–2916. [Google Scholar] [CrossRef]
  7. Kirch, W. Physical Therapy. In Encyclopedia of Public Health; Springer: Dordrecht, The Netherlands, 2008; pp. 1108–1109. [Google Scholar]
  8. Bulat, P. Occupational Therapy. In Encyclopedia of Public Health; Springer: Dordrecht, The Netherlands, 2008. [Google Scholar] [CrossRef]
  9. Langer, A.; Gassner, L.; Flotz, A.; Hasenauer, S.; Gruber, J.; Wizany, L.; Pokan, R.; Maetzler, W.; Zach, H. How COVID-19 will boost remote exercise-based treatment in Parkinson’s disease: A narrative review. NPJ Parkinson’s Dis. 2021, 7, 1–9. [Google Scholar] [CrossRef] [PubMed]
  10. Cavalcanti, V.C.; Santana, M.I.D.; Gama, A.E.D.; Correia, W.F. Usability assessments for augmented reality motor rehabilitation solutions: A systematic review. Int. J. Comput. Games Technol. 2018, 2018, 5387896. [Google Scholar] [CrossRef]
  11. Cavalcanti, V.C.; Ferreira, M.I.d.S.; Teichrieb, V.; Barioni, R.R.; Correia, W.F.M.; Da Gama, A.E.F. Usability and effects of text, image and audio feedback on exercise correction during augmented reality based motor rehabilitation. Comput. Graph. 2019, 85, 100–110. [Google Scholar] [CrossRef]
  12. Sousa, M.; Vieira, J.; Medeiros, D.; Arsénio, A.; Jorge, J. SleeveAR: Augmented reality for rehabilitation using realtime feedback. In Proceedings of the International Conference on Intelligent User Interfaces, Sonoma, CA, USA, 7–10 March 2016; pp. 175–185. [Google Scholar] [CrossRef]
  13. Garcia, J.A.; Navarro, K.F. The mobile RehApp™: An AR-based mobile game for ankle sprain rehabilitation. In Proceedings of the SeGAH 2014—IEEE 3rd International Conference on Serious Games and Applications for Health, Rio de Janeiro, Brazil, 14–16 May 2014. [Google Scholar] [CrossRef]
  14. Colomer, C.; Llorens, R.; Noé, E.; Alcañiz, M. Effect of a mixed reality-based intervention on arm, hand, and finger function on chronic stroke. J. NeuroEng. Rehabil. 2016, 13, 45. [Google Scholar] [CrossRef][Green Version]
  15. Yu, X.; Angerbauer, K.; Mohr, P.; Kalkofen, D.; Sedlmair, M. Perspective Matters: Design Implications for Motion Guidance in Mixed Reality. In Proceedings of the 2020 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2020, Virtual, 9–13 November 2020; pp. 577–587. [Google Scholar] [CrossRef]
  16. von Elm, E.; Schreiber, G.; Haupt, C.C. Methodische Anleitung für Scoping Reviews (JBI-Methodologie). Z. Evid. Fortbild. Qual. Gesundhwes. 2019, 143, 1–7. [Google Scholar] [CrossRef]
  17. Arksey, H.; O’Malley, L. Scoping studies: Towards a methodological framework. Int. J. Soc. Res. Methodol. Theory Pract. 2005, 8, 19–32. [Google Scholar] [CrossRef][Green Version]
  18. Merino, L.; Schwarzl, M.; Kraus, M.; Sedlmair, M.; Schmalstieg, D.; Weiskopf, D. Evaluating Mixed and Augmented Reality: A Systematic Literature Review (2009–2019). arXiv 2020, arXiv:2010.05988. [Google Scholar]
  19. Azuma, R.T. A Survey of Augmented Reality. Presence Teleoperators Virtual Environ. 1997, 6, 355–385. [Google Scholar] [CrossRef]
  20. Speicher, M.; Hall, B.D.; Nebeling, M. What is Mixed Reality? In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; Volume 15. [Google Scholar] [CrossRef]
  21. Kloster, M.; Babic, A. Mobile VR-Application for Neck Exercises. In Studies in Health Technology and Informatics; IOS Press: Amsterdam, The Netherlands, 2019; Volume 262, pp. 206–209. [Google Scholar] [CrossRef]
  22. Velloso, E.; Bulling, A.; Gellersen, H. MotionMA: Motion modelling and analysis by demonstration. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Paris, France, 27 April–2 May 2013; ACM: New York, NY, USA, 2013; pp. 1309–1318. [Google Scholar] [CrossRef]
  23. Bakker, M.; Boonstra, N.; Nijboer, T.; Holstege, M.; Achterberg, W.; Chavannes, N. The design choices for the development of an Augmented Reality game for people with visuospatial neglect. Clin. eHealth 2020, 3, 82–88. [Google Scholar] [CrossRef]
  24. Kowatsch, T.; Lohse, K.M.; Erb, V.; Schittenhelm, L.; Galliker, H.; Lehner, R.; Huang, E.M. Hybrid ubiquitous coaching with a novel combination of mobile and holographic conversational agents targeting adherence to home exercises: Four design and evaluation studies. J. Med. Internet Res. 2021, 23, e23612. [Google Scholar] [CrossRef] [PubMed]
  25. Raskar, R.; Welch, G.; Fuchs, H. Spatially augmented reality. In Augmented Reality: Placing Artificial Objects in Real Scenes; CRC Press: Boca Raton, FL, USA, 1999; pp. 64–71. [Google Scholar]
  26. Liu, L.Y.; Sangani, S.; Patterson, K.K.; Fung, J.; Lamontagne, A. Real-Time Avatar-Based Feedback to Enhance the Symmetry of Spatiotemporal Parameters after Stroke: Instantaneous Effects of Different Avatar Views. IEEE Trans. Neural Syst. Rehabil. Eng. 2020, 28, 878–887. [Google Scholar] [CrossRef] [PubMed]
  27. Bouteraa, Y.; Abdallah, I.B.; Elmogy, A.M. Training of Hand Rehabilitation Using Low Cost Exoskeleton and Vision-Based Game Interface. J. Intell. Robot. Syst. Theory Appl. 2019, 96, 31–47. [Google Scholar] [CrossRef]
  28. Tang, R.; Yang, X.D.; Bateman, S.; Jorge, J.; Tang, A. Physio@Home: Exploring visual guidance and feedback techniques for physiotherapy exercises. In Proceedings of the Conference on Human Factors in Computing Systems, Seoul, Korea, 18–23 April 2015; pp. 4123–4132. [Google Scholar] [CrossRef]
  29. Toh, A.; Jiang, L.; Keong Lua, E. Augmented Reality Gaming for [email protected]; Technical report; ACM: New York, NY, USA, 2011. [Google Scholar]
  30. Sodhi, R.; Benko, H.; Wilson, A. LightGuide: Projected visualizations for hand movement guidance. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Austin, TX, USA, 5–10 May 2012; pp. 179–188. [Google Scholar] [CrossRef]
  31. Park, W.; Kim, J.; Kim, M. Efficacy of virtual reality therapy in ideomotor apraxia rehabilitation. Medicine 2021, 100, e26657. [Google Scholar] [CrossRef]
  32. Alamri, A.; Kim, H.N.; El Saddik, A. A decision model of stroke patient rehabilitation with augmented reality-based games. In Proceedings of the IEEE 2010 International Conference on Autonomous and Intelligent Systems, Povoa de Varzim, Portugal, 21–23 June 2010. [Google Scholar] [CrossRef]
  33. Dinevan, A.; Aung, Y.M.; Al-Jumaily, A. Human computer interactive system for fast recovery based stroke rehabilitation. In Proceedings of the 2011 11th International Conference on Hybrid Intelligent Systems (HIS), Malacca, Malaysia, 5–8 December 2011; pp. 647–652. [Google Scholar] [CrossRef]
  34. Seyedebrahimi, A.; Khosrowabadi, R.; Hondori, H.M. Brain Mechanism in the Human-Computer Interaction Modes Leading to Different Motor Performance. In Proceedings of the 2019 27th Iranian Conference on Electrical Engineering (ICEE), Yazd, Iran, 30 April–2 May 2019; pp. 1802–1806. [Google Scholar] [CrossRef]
  35. Timmermans, C.; Roerdink, M.; van Ooijen, M.W.; Meskers, C.G.; Janssen, T.W.; Beek, P.J. Walking adaptability therapy after stroke: Study protocol for a randomized controlled trial. Trials 2016, 17, 425. [Google Scholar] [CrossRef][Green Version]
  36. Held, J.P.O.; Yu, K.; Pyles, C.; Veerbeek, J.M.; Bork, F.; Heining, S.M.; Navab, N.; Luft, A.R. Augmented reality-based rehabilitation of gait impairments: Case report. JMIR mHealth uHealth 2020, 8, e17804. [Google Scholar] [CrossRef]
  37. Mostajeran, F.; Steinicke, F.; Ariza Nunez, O.J.; Gatsios, D.; Fotiadis, D. Augmented Reality for Older Adults: Exploring Acceptability of Virtual Coaches for Home-based Balance Training in an Aging Population. In Proceedings of the Conference on Human Factors in Computing Systems, Online, 23 April 2020. [Google Scholar] [CrossRef]
  38. Yeo, S.M.; Lim, J.Y.; Do, J.G.; Lim, J.Y.; In Lee, J.; Hwang, J.H. Effectiveness of interactive augmented reality-based telerehabilitation in patients with adhesive capsulitis: Protocol for a multi-center randomized controlled trial. BMC Musculoskelet. Disord. 2021, 22, 386. [Google Scholar] [CrossRef]
  39. Khan, O.; Ahmed, I.; Cottingham, J.; Rahhal, M.; Arvanitis, T.N.; Elliott, M.T. Timing and correction of stepping movements with a virtual reality avatar. PLoS ONE 2020, 15, e0229641. [Google Scholar] [CrossRef][Green Version]
  40. Pachoulakis, I.; Xilourgos, N.; Papadopoulos, N.; Analyti, A. A Kinect-Based Physiotherapy and Assessment Platform for Parkinson’s Disease Patients. J. Med. Eng. 2016, 2016, 9413642. [Google Scholar] [CrossRef]
  41. Shen, Y.; Gu, P.W.; Ong, S.K.; Nee, A.Y. A novel approach in rehabilitation of hand-eye coordination and finger dexterity. Virtual Real. 2012, 16, 161–171. [Google Scholar] [CrossRef]
  42. Bell, K.M.; Onyeukwu, C.; McClincy, M.P.; Allen, M.; Bechard, L.; Mukherjee, A.; Hartman, R.A.; Smith, C.; Lynch, A.D.; Irrgang, J.J. Verification of a portable motion tracking system for remote management of physical rehabilitation of the knee. Sensors 2019, 19, 1021. [Google Scholar] [CrossRef] [PubMed][Green Version]
  43. Chen, S.; Hu, B.; Gao, Y.; Liao, Z.; Li, J.; Hao, A. Lower Limb Balance Rehabilitation of Post-stroke Patients Using an Evaluating and Training Combined Augmented Reality System. In Proceedings of the 2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), Recife, Brazil, 9–13 November 2020; pp. 217–218. [Google Scholar] [CrossRef]
  44. Keskin, Y.; Gürcan Atçi, A.; Ürkmez, B.; Akgül, Y.S.; Özaras, N.; Aydin, T. Efficacy of a video-based physical therapy and rehabilitation system in patients with post-stroke hemiplegia: A randomized, controlled, pilot study. Turk Geriatr Dergisi 2020, 23, 118–128. [Google Scholar] [CrossRef]
  45. Vidrios-Serrano, C.; Bonilla, I.; Vigueras-Gomez, F.; Mendoza, M. Development of a haptic interface for motor rehabilitation therapy using augmented reality. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Milan, Italy, 25–29 August 2015; pp. 1156–1159. [Google Scholar] [CrossRef]
  46. Phongamwong, C.; Rowe, P.; Chase, K.; Kerr, A.; Millar, L. Treadmill training augmented with real-time visualisation feedback and function electrical stimulation for gait rehabilitation after stroke: A feasibility study. BMC Biomed. Eng. 2019, 1, 20. [Google Scholar] [CrossRef][Green Version]
  47. Fonteyn, E.M.; Heeren, A.; Engels, J.J.C.; Boer, J.J.; van de Warrenburg, B.P.; Weerdesteyn, V. Gait adaptability training improves obstacle avoidance and dynamic stability in patients with cerebellar degeneration. Gait Posture 2014, 40, 247–251. [Google Scholar] [CrossRef]
  48. Mostajeran, F.; Katzakis, N.; Ariza, O.; Freiwald, J.P.; Steinicke, F. Welcoming a holographic virtual coach for balance training at home: Two focus groups with older adults. In Proceedings of the 26th IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 1465–1470. [Google Scholar] [CrossRef][Green Version]
  49. Han, P.H.; Chen, K.W.; Hsieh, C.H.; Huang, Y.J.; Hung, Y.P. AR-Arm: Augmented visualization for guiding arm movement in the first-person perspective. In Proceedings of the ACM International Conference Proceeding Series. Association for Computing Machinery, Geneva, Switzerland, 25–27 February 2016. [Google Scholar] [CrossRef]
  50. Theunissen, T.; Ensink, C.; Bakker, R.; Keijsers, N. Movin(g) Reality: Rehabilitation after a CVA with Augmented Reality. In Proceedings of the European Conference on Pattern Languages of Programs, Virtual, 1–4 July 2020. [Google Scholar] [CrossRef]
  51. Chang, W.C.; Ko, L.W.; Yu, K.H.; Ho, Y.C.; Chen, C.H.; Jong, Y.J.; Huang, Y.P. EEG analysis of mixed-reality music rehabilitation system for post-stroke lower limb therapy. J. Soc. Inf. Disp. 2019, 27, 372–380. [Google Scholar] [CrossRef]
  52. Achanccaray, D.; Izumi, S.I.; Hayashibe, M. Visual-Electrotactile Stimulation Feedback to Improve Immersive Brain-Computer Interface Based on Hand Motor Imagery. Comput. Intell. Neurosc. 2021, 2021, 8832686. [Google Scholar] [CrossRef]
  53. Mohd Hashim, S.H.; Ismail, M.; Manaf, H.; Hanapiah, F.A. Development of dual cognitive task virtual reality game addressing stroke rehabilitation. In Proceedings of the 2019 3rd International Conference on Virtual and Augmented Reality, Perth, WA, Australia, 23–25 February 2019; pp. 21–25. [Google Scholar] [CrossRef]
  54. Paredes, T.V.; Postolache, O.; Monge, J.; Girao, P.S. Gait rehabilitation system based on mixed reality. In Proceedings of the 2021 Telecoms Conference (ConfTELE), Leiria, Portugal, 11–12 February 2021. [Google Scholar] [CrossRef]
  55. Agopyan, H.; Griffet, J.; Poirier, T.; Bredin, J. Modification of knee flexion during walking with use of a real-time personalized avatar. Heliyon 2019, 5, e02797. [Google Scholar] [CrossRef]
  56. Alexandre, R.; Postolache, O.; Girao, P.S. Physical Rehabilitation based on Smart Wearable and Virtual Reality Serious Game. In Proceedings of the 2019 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), Auckland, New Zealand, 20–23 May 2019; pp. 1–6. [Google Scholar] [CrossRef]
  57. Aung, Y.M.; Al-Jumaily, A. AR based upper limb rehabilitation system. In Proceedings of the 2012 4th IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), Rome, Italy, 24–27 June 2012; pp. 213–218. [Google Scholar] [CrossRef]
  58. Aung, Y.M.; Al-Jumaily, A.; Anam, K. A novel upper limb rehabilitation system with self-driven virtual arm illusion. Annu. Int. Conf. IEEE Eng. Med. Biol. Soc. 2014, 2014, 3614–3617. [Google Scholar] [CrossRef]
  59. Baran, M.; Lehrer, N.; Siwiak, D.; Chen, Y.; Duff, M.; Ingalls, T.; Rikakis, T. Design of a home-based adaptive mixed reality rehabilitation system for stroke survivors. Annu. Int. Conf. IEEE Eng. Med. Biol. Soc. 2011, 2011, 7602–7605. [Google Scholar] [CrossRef]
  60. Broeren, J.; Rydmark, M.; Björkdahl, A.; Sunnerhagen, K.S. Assessment and training in a 3-dimensional virtual environment with haptics: A report on 5 cases of motor rehabilitation in the chronic stage after stroke. Neurorehabil. Neural Repair 2007, 21, 180–189. [Google Scholar] [CrossRef] [PubMed]
  61. Broeren, J.; Claesson, L.; Goude, D.; Rydmark, M.; Sunnerhagen, K.S. Virtual rehabilitation in an activity centre for community-dwelling persons with stroke: The possibilities of 3-dimensional computer games. Cerebrovasc. Dis. 2008, 26, 289–296. [Google Scholar] [CrossRef] [PubMed]
  62. Burke, J.W.; McNeill, M.D.; Charles, D.K.; Morrow, P.J.; Crosbie, J.H.; McDonough, S.M. Augmented reality games for upper-limb stroke rehabilitation. In Proceedings of the 2010 Second International Conference on Games and Virtual Worlds for Serious Applications, Braga, Portugal, 25–26 March 2010; pp. 75–78. [Google Scholar] [CrossRef]
  63. Camporesi, C.; Kallmann, M.; Han, J.J. VR solutions for improving physical therapy. In Proceedings of the 2013 IEEE Virtual Reality (VR), Lake Buena Vista, FL, USA, 18–20 March 2013; pp. 77–78. [Google Scholar] [CrossRef][Green Version]
  64. Corrêa, D.; Ficheman, I.K.; De, R.; Lopes, D.; Klein, A.N.; Nakazune, S.J. Augmented Reality in Occupational Therapy. In Proceedings of the 2013 8th Iberian Conference on Information Systems and Technologies (CISTI), Lisboa, Portugal, 19–22 June 2013. [Google Scholar]
  65. Da Gama, A.E.F.; Chaves, T.M.; Figueiredo, L.S.; Baltar, A.; Meng, M.; Navab, N.; Teichrieb, V.; Fallavollita, P. MirrARbilitation: A clinically-related gesture recognition interactive tool for an AR rehabilitation system. Comput. Methods Programs Biomed. 2016, 135, 105–114. [Google Scholar] [CrossRef] [PubMed]
  66. Dancu, A. Motor learning in a mixed reality environment. In Proceedings of the 7th Nordic Conference on Human-Computer Interaction Making Sense Through Design—NordiCHI ’12, Copenhagen, Denmark, 14–17 October 2012; ACM Press: New York, NY, USA, 2012; p. 811. [Google Scholar] [CrossRef]
  67. David, L.; Bouyer, G.; Otmane, S. Towards an upper limb self-rehabilitation assistance system after stroke. In Proceedings of the Virtual Reality International Conference—Laval Virtual 2017, Laval, France, 22–24 March 2017; ACM: New York, NY, USA, 2017; pp. 1–4. [Google Scholar] [CrossRef]
  68. Desai, K.; Bahirat, K.; Ramalingam, S.; Prabhakaran, B.; Annaswamy, T.; Makris, U.E. Augmented reality-based exergames for rehabilitation. In Proceedings of the 7th International Conference on Multimedia Systems, MMSys 2016, Klagenfurt, Austria, 10–13 May 2016; Association for Computing Machinery, Inc.: New York, NY, USA, 2016; pp. 232–241. [Google Scholar] [CrossRef]
  69. Enam, N.; Veerubhotla, A.; Ehrenberg, N.; Kirshblum, S.; Nolan, K.J.; Pilkar, R. Augmented-reality guided treadmill training as a modality to improve functional mobility post-stroke: A proof-of-concept case series. Top. Stroke Rehabil. 2021, 28, 624–630. [Google Scholar] [CrossRef] [PubMed]
  70. Gauthier, L.V.; Kane, C.; Borstad, A.; Strahl, N.; Uswatte, G.; Taub, E.; Morris, D.; Hall, A.; Arakelian, M.; Mark, V. Video Game Rehabilitation for Outpatient Stroke (VIGoROUS): Protocol for a multi-center comparative effectiveness trial of in-home gamified constraint-induced movement therapy for rehabilitation of chronic upper extremity hemiparesis. BMC Neurol. 2017, 17, 109. [Google Scholar] [CrossRef]
  71. Grimm, F.; Naros, G.; Gharabaghi, A. Compensation or restoration: Closed-loop feedback of movement quality for assisted reach-to-grasp exercises with a multi-joint arm exoskeleton. Front. Neurosci. 2016, 10, 280. [Google Scholar] [CrossRef][Green Version]
  72. Guo, G.; Segal, J.; Zhang, H.; Xu, W. ARMove: A smartphone augmented reality exergaming system for upper and lower extremities stroke rehabilitation: Demo abstract. In SenSys 2019—Proceedings of the 17th Conference on Embedded Networked Sensor Systems; Association for Computing Machinery, Inc.: New York, NY, USA, 2019; pp. 384–385. [Google Scholar] [CrossRef]
  73. Hacioglu, A.; Ozdemir, O.F.; Sahin, A.K.; Akgul, Y.S. Augmented reality based wrist rehabilitation system. In Proceedings of the 2016 24th Signal Processing and Communication Application Conference (SIU), Zonguldak, Turkey, 16–19 May 2016; pp. 1869–1872. [Google Scholar] [CrossRef]
  74. Halic, T.; Kockara, S.; Demirel, D.; Willey, M.; Eichelberger, K. MoMiReS: Mobile mixed reality system for physical & occupational therapies for hand and wrist ailments. In Proceedings of the 2014 IEEE Innovations in Technology Conference, Warwick, RI, USA, 16 May 2014. [Google Scholar] [CrossRef]
  75. Hoermann, S.; Santos, L.F.D.; Morkisch, N.; Jettkowski, K.; Sillis, M.; Cutfield, N.J.; Schmidt, H.; Hale, L.; Kruger, J.; Regenbrecht, H.; et al. Computerized mirror therapy with augmented reflection technology for stroke rehabilitation: A feasibility study in a rehabilitation center. In Proceedings of the 2015 International Conference on Virtual Rehabilitation (ICVR), Valencia, Spain, 9–12 June 2015; pp. 199–206. [Google Scholar] [CrossRef]
  76. Hoermann, S.; Ferreira dos Santos, L.; Morkisch, N.; Jettkowski, K.; Sillis, M.; Devan, H.; Kanagasabai, P.S.; Schmidt, H.; Krüger, J.; Dohle, C.; et al. Computerised mirror therapy with Augmented Reflection Technology for early stroke rehabilitation: Clinical feasibility and integration as an adjunct therapy. Disabil. Rehabil. 2017, 39, 1503–1514. [Google Scholar] [CrossRef]
  77. Ines, D.L.; Abdelkader, G.; Hocine, N. Mixed reality serious games for post-stroke rehabilitation. In Proceedings of the 2011 5th International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth) and Workshops, Dublin, Ireland, 23–26 May 2011; pp. 530–537. [Google Scholar] [CrossRef][Green Version]
  78. Jin, Y.; Monge, J.; Postolache, O.; Niu, W. Augmented Reality with Application in Physical Rehabilitation. In Proceedings of the 2019 International Conference on Sensing and Instrumentation in IoT Era (ISSI), Lisbon, Portugal, 29–30 August 2019; pp. 1–6. [Google Scholar] [CrossRef]
  79. Jung, G.U.; Moon, T.H.; Park, G.W.; Lee, J.Y.; Lee, B.H. Use of Augmented Reality-Based Training with EMG-Triggered Functional Electric Stimulation in Stroke Rehabilitation. J. Phys. Therapy Sci. 2013, 25, 147–151. [Google Scholar] [CrossRef][Green Version]
  80. King, M.; Hale, L.; Pekkari, A.; Persson, M. An affordable, computerized, table-based exercise system for stroke survivors. Disabil. Rehabil. Assist. Technol. 2010, 5, 288–293. [Google Scholar] [CrossRef]
  81. Klein, A.; De Assis, G.A. A markeless augmented reality tracking for enhancing the user interaction during virtual rehabilitation. In Proceedings of the 2013 XV Symposium on Virtual and Augmented Reality, Cuiaba, Brazil, 28–31 May 2013; pp. 117–124. [Google Scholar] [CrossRef]
  82. Koroleva, E.S.; Tolmachev, I.V.; Alifirova, V.M.; Boiko, A.S.; Levchuk, L.A.; Loonen, A.J.M.; Ivanova, S.A. Serum BDNF’s Role as a Biomarker for Motor Training in the Context of AR-Based Rehabilitation after Ischemic Stroke. Brain Sci. 2020, 10, 623. [Google Scholar] [CrossRef]
  83. LaPiana, N.; Duong, A.; Lee, A.; Alschitz, L.; Silva, R.M.; Early, J.; Bunnell, A.; Mourad, P. Acceptability of a mobile phone–based augmented reality game for rehabilitation of patients with upper limb deficits from stroke: Case study. JMIR Rehabil. Assist. Technol. 2020, 7, e17822. [Google Scholar] [CrossRef] [PubMed]
  84. Lee, A.; Hellmers, N.; Vo, M.; Wang, F.; Popa, P.; Barkan, S.; Patel, D.; Campbell, C.; Henchcliffe, C.; Sarva, H. Can google glass™ technology improve freezing of gait in parkinsonism? A pilot study. Disabil. Rehabil. Assist. Technol. 2020, 1–11. [Google Scholar] [CrossRef] [PubMed]
  85. Liu, J.; Mei, J.; Zhang, X.; Lu, X.; Huang, J. Augmented reality-based training system for hand rehabilitation. Multimed. Tools Appl. 2017, 76, 14847–14867. [Google Scholar] [CrossRef]
  86. Lledó, L.D.; Díez, J.A.; Bertomeu-Motos, A.; Ezquerro, S.; Badesa, F.J.; Sabater-Navarro, J.M.; García-Aracil, N. A comparative analysis of 2D and 3D tasks for virtual reality therapies based on robotic-assisted neurorehabilitation for post-stroke patients. Front. Aging Neurosci. 2016, 8, 205. [Google Scholar] [CrossRef] [PubMed][Green Version]
  87. Loureiro, R.; Amirabdollahian, F.; Topping, M.; Driessen, B.; Harwin, W. Upper Limb Robot Mediated Stroke Therapy-GENTLE/s Approach. Auton. Rob. 2003, 15, 35–51. [Google Scholar] [CrossRef][Green Version]
  88. Luo, X.; Kline, T.; Fischer, H.C.; Stubblefield, K.A.; Kenyon, R.V.; Kamper, D.G. Integration of augmented reality and assistive devices for post-stroke hand opening rehabilitation. Conf. Proc. IEEE Eng. Med. Biol. Soc. 2005, 2005, 6855–6858. [Google Scholar] [CrossRef]
  89. Manuli, A.; Maggio, M.G.; Latella, D.; Cannavò, A.; Balletta, T.; De Luca, R.; Naro, A.; Calabrò, R.S. Can robotic gait rehabilitation plus Virtual Reality affect cognitive and behavioural outcomes in patients with chronic stroke? A randomized controlled trial involving three different protocols. J. Stroke Cerebrovasc. Dis. 2020, 29, 104994. [Google Scholar] [CrossRef]
  90. Monge, J.; Postolache, O. Augmented Reality and Smart Sensors for Physical Rehabilitation. In Proceedings of the 2018 International Conference and Exposition on Electrical And Power Engineering (EPE), Iasi, Romania, 18–19 October 2018; pp. 1010–1014. [Google Scholar] [CrossRef]
  91. Mouraux, D.; Brassinne, E.; Sobczak, S.; Nonclercq, A.; Warzée, N.; Sizer, P.S.; Tuna, T.; Penelle, B. 3D augmented reality mirror visual feedback therapy applied to the treatment of persistent, unilateral upper extremity neuropathic pain: A preliminary study. J. Man. Manip. Ther. 2017, 25, 137–143. [Google Scholar] [CrossRef]
  92. Muñoz, G.F.; Mollineda, R.A.; Casero, J.G.; Pla, F. A rgbd-based interactive system for gaming-driven rehabilitation of upper limbs. Sensors 2019, 19, 3478. [Google Scholar] [CrossRef][Green Version]
  93. Nanapragasam, A.; Pelah, A.; Cameron, J.; Lasenby, J. Visualizations for locomotor learning with real time feedback in VR. In Proceedings of the 6th Symposium on Applied Perception in Graphics and Visualization—APGV ’09, Chania, Crete, Greece, 30 September–2 October 2009; ACM Press: New York, NY, USA, 2009; p. 139. [Google Scholar] [CrossRef]
  94. Ramírez, E.R.; Petrie, R.; Chan, K.; Signal, N. A Tangible Interface and Augmented Reality Game for Facilitating Sit-to-Stand Exercises for Stroke Rehabilitation. In Proceedings of the ACM International Conference Proceeding Series—IOT ’18, Santa Barbara, CA, USA, 15–18 October 2018; Association for Computing Machinery: New York, NY, USA, 2018. [Google Scholar] [CrossRef]
  95. Regenbrecht, H.; Collins, J.; Hoermann, S. A leap-supported, hybrid AR interface approach. In Proceedings of the 25th Australian Computer-Human Interaction Conference on Augmentation, Application, Innovation, Collaboration—OzCHI ’13, Adelaide, Australia, 25–29 November 2013; ACM Press: New York, NY, USA, 2013; pp. 281–284. [Google Scholar] [CrossRef]
  96. Saraee, E.; Betke, M. Dynamic Adjustment of Physical Exercises Based on Performance Using the Proficio Robotic Arm. In Proceedings of the ACM International Conference Proceeding Series—PETRA’16, Corfu Island, Greece, 29 June–1 July 2016; Association for Computing Machinery: New York, NY, USA, 2016. [Google Scholar] [CrossRef]
  97. Song, X.; Ding, L.; Zhao, J.; Jia, J.; Shull, P. Cellphone Augmented Reality Game-based Rehabilitation for Improving Motor Function and Mental State after Stroke. In Proceedings of the 2019 IEEE 16th International Conference on Wearable and Implantable Body Sensor Networks (BSN), Chicago, IL, USA, 19–22 May 2019; pp. 1–4. [Google Scholar] [CrossRef]
  98. Sror, L.; Vered, M.; Treger, I.; Levy-Tzedek, S.; Levin, M.F.; Berman, S. A virtual reality-based training system for error-augmented treatment in patients with stroke. In Proceedings of the 2019 International Conference on Virtual Rehabilitation (ICVR), Tel Aviv, Israel, 21–24 July 2019; pp. 1–2. [Google Scholar] [CrossRef]
  99. Syed Ali Fathima, S.J.; Shankar, S. AR using NUI based physical therapy rehabilitation framework with mobile decision support system: A global solution for remote assistance. J. Glob. Inf. Manag. 2018, 26, 36–51. [Google Scholar] [CrossRef]
  100. Thikey, H.; Grealy, M.; van Wijck, F.; Barber, M.; Rowe, P. Augmented visual feedback of movement performance to enhance walking recovery after stroke: Study protocol for a pilot randomised controlled trial. Trials 2012, 13, 163. [Google Scholar] [CrossRef] [PubMed][Green Version]
  101. Trojan, J.; Diers, M.; Fuchs, X.; Bach, F.; Bekrater-Bodmann, R.; Foell, J.; Kamping, S.; Rance, M.; Maaß, H.; Flor, H. An augmented reality home-training system based on the mirror training and imagery approach. Behav. Res. Methods 2014, 46, 634–640. [Google Scholar] [CrossRef] [PubMed][Green Version]
  102. Voinea, A.; Moldoveanu, A.; Moldoveanu, F. 3D visualization in IT systems used for post stroke recovery: Rehabilitation based on virtual reality. In Proceedings of the 2015 20th International Conference on Control Systems and Computer Science, Bucharest, Romania, 27–29 May 2015; pp. 856–862. [Google Scholar] [CrossRef]
  103. Wei, X.; Chen, Y.; Jia, X.; Chen, Y.; Xie, L. Muscle Activation Visualization System Using Adaptive Assessment and Forces-EMG Mapping. IEEE Access 2021, 9, 46374–46385. [Google Scholar] [CrossRef]
  104. Xue, Y.; Zhao, L.; Xue, M.; Fu, J. Gesture Interaction and Augmented Reality based Hand Rehabilitation Supplementary System. In Proceedings of the 2018 IEEE 3rd Advanced Information Technology, Electronic and Automation Control Conference (IAEAC), Chongqing, China, 12–14 October 2018; pp. 2572–2576. [Google Scholar] [CrossRef]
  105. Yang, U.; Kim, G.J. Implementation and Evaluation of “Just Follow Me”: An Immersive, VR-Based, Motion-Training System. Presence Teleoperators Virtual Environ. 2002, 11, 304–323. [Google Scholar] [CrossRef]
  106. Yeh, S.C.; Rizzo, A.; McLaughlin, M.; Parsons, T. VR enhanced upper extremity motor training for post-stroke rehabilitation: Task design, clinical experiment and visualization on performance and progress. Stud. Health Technol. Inform. 2007, 125, 506–511. [Google Scholar]
Figure 1. Flow chart of the study selection process, own representation based on [16].
Figure 2. Overview of the Taxonomy.
Figure 3. Illustrative examples of visual guidance within the selected references: (a) game-based ankle exercise using a racket to direct an object into a target [13]; (b) task-based elbow and shoulder exercise guiding the patient’s cursor along a color-coded path [12]; (c) task-based gait training on a treadmill displaying an obstacle to be avoided by the patient [35]; (d) arrow overlay projected onto the patient’s hand, guiding them in the right direction [30]; (e) a virtual coach demonstrating the correct performance of an exercise [37]; (f) an alternative camera angle displayed on the screen in front of the patient, allowing them to evaluate their own movements [28]; (g) game screen extended with text-based instructions, a score, and an animation demonstrating the correct performance of the exercise [11].
Figure 4. Distribution of the selected references by publication year.
Figure 5. Usage of visualization-related keywords in the selected references.
Figure 6. Connections between selected elements of the reviewed literature.
Figure 7. Share of research aimed at Lower Limb.
Figure 8. Usage of Visual Guidance elements by Application Type and targeted Added Value. Numbers on the bars represent the percentage of applications utilizing a Visual Guidance element. The bar’s width represents the relative frequency within the respective Added Value.
Figure 9. Used Output Technologies over the years.
Figure 10. Differences in used Output Technology in relation to Affected Body Part and Bodily Function. Numbers represent percentages.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Butz, B.; Jussen, A.; Rafi, A.; Lux, G.; Gerken, J. A Taxonomy for Augmented and Mixed Reality Applications to Support Physical Exercises in Medical Rehabilitation—A Literature Review. Healthcare 2022, 10, 646.