A Novel Augmented Reality Mobile-Based Application for Biomechanical Measurement

Human posture and range of motion (ROM) measurements are important health indicators for identifying abnormalities from various disorders (e.g., scoliosis, musculoskeletal disorders, pain syndromes). A viable real-time mobile application for measuring body posture and ROM is currently lacking. To address this need, a novel Android smartphone augmented-reality-based application was developed and evaluated to enable real-time AprilTag2 marker measurement at the point of patient contact (Biomechanical Augmented Reality-Marker, BAR-M). Mobile app performance was evaluated on a body opponent bag (BOB) and 15 healthy participants by comparing smartphone app and Vicon motion analysis output (pelvis, shoulder, arm, torso angles). A Samsung Galaxy smartphone recorded live video, calculated AprilTag orientations and the angle of a line connecting the centers of two tags, and displayed outcomes in real time. For the BOB test, the absolute difference between Vicon and smartphone angles was 0.09° ± 0.05° for hip, 0.09° ± 0.06° for shoulder, and 0.69° for arm abduction. For the participant test, the absolute mean angle differences were 1.70° ± 0.23° for hip, 1.34° ± 0.27° for shoulder, and 11.18° ± 3.68° for arm abduction. Overall, the app obtained valid and reliable angle measurements for postural and ROM assessments using the smartphone's front camera. Arm abduction results were affected by clothing movement that caused Vicon markers to move differently from AprilTag markers. Thus, with appropriate measurement methods, this real-time smartphone app is a viable tool to facilitate immediate clinical decision making based on human posture and ROM assessments.


Introduction
Posture is the body's position or bearing for a special purpose, and can be related to health [1]. When considering movement, joint range of motion (ROM) is often used in clinical assessments. Currently, several methods can be used to quantify human posture and ROM measurements in clinical practice. A conventional approach is to make measurements manually, with a goniometer, tape, etc. However, visual observation can be affected by human error [2], which can lead to improper clinical decisions and thereby long-term issues for the patient (e.g., improper pelvis alignment leading to back problems for prosthesis users).
To mitigate human error, advanced technologies can be used to assist with accurate quantification of human body postural measurement. Technology-based methods include inertial measurement unit (IMU) sensors, marker-based systems, and markerless motion analysis systems.
Marker-based systems capture the movement of markers placed on the person. Systems use active, magnetic, or passive markers. Passive marker systems typically use retro-reflective balls that are illuminated using infrared (IR) lights mounted on the cameras, such as Vicon (Oxford, England). While Vicon systems are accurate, using them in clinical practice is limited due to motion lab space requirements, prohibitive system cost, and the time required for patient setup and data processing [3,4]. Despite the high accuracy, passive markers present challenges in postprocessing 3D data for markers close to each other [5], often requiring manual editing to relabel markers [6,7]. Reflective surfaces in the background may also be mistaken for body markers, requiring manual background masking or relabeling [8].
Active marker systems use light-emitting diodes (LEDs) as markers, with each marker having a predefined frequency to assist in marker differentiation [9]. However, individuals need to carry several cables and other components that may affect their movements. Active marker systems include Codamotion (Rothley, England), Optotrak (Northern Digital, Inc., Waterloo, Ontario, Canada), Qualisys (Göteborg, Sweden), and Selcon (Selspot Systems, Ltd., Southfield, Michigan) [10]. Active markers can eliminate errors due to marker misidentification and therefore reduce marker sorting time during postprocessing [11].
In magnetic systems, low-frequency quasi-static magnetic fields can be used to determine position and orientation of a sensor relative to a source [12]. While these 3D measurement systems are portable, they are more cumbersome to employ due to the system's power, size, and weight [12] and are susceptible to metal within the capture volume.
Markerless systems could be used in movement analysis to reduce participant preparation time and encourage natural movement, since tracking markers are not required. Posture can be qualitatively and quantitatively assessed by interpreting photographs. For example, recent developments in pose estimation algorithms, particularly in the field of artificial intelligence, have improved posture prediction from videos by converting video frames into body component coordinates. Numerous algorithms for estimating poses have been presented [13][14][15], with deep learning algorithms shown to provide the best human pose estimation performance [16]. Pose estimation algorithms have potential for measuring human movement; however, accurate human pose quantification requires a substantial amount of training data, and specific landmarks not included in the pose estimation model cannot be identified. Therefore, a marker-based approach could be more beneficial in many instances.
Another example of a markerless system is obtaining body angle measurements by manually post-processing each frame of a recorded video. Angle measurements on stored video have been implemented as mobile phone apps or web-based applications [17,18] by selecting three points on the video frame (i.e., an online goniometer). While the angle calculation is accurate for the selected points, the selected points might not be at the exact anatomical positions. Additional drawbacks include the time required for capturing video, processing markers, and reporting results. However, this approach allows clinicians to step through the stored video frames, which can aid observational movement analysis.
Current motion analysis lab methods can have high purchase and installation costs, time-consuming data collection and analysis, and require a permanent, large footprint. Therefore, there is a need for a reliable, affordable, and easily accessible system to quantify posture measurement, body segment positioning, and symmetry at the point of patient contact. As a result, recent studies have developed mobile phone applications to provide an affordable solution for human angle measurement.
Biomechanics Augmented Reality (BAR) is an example of an AR-based smartphone app for real-time human angle measurement and result reporting [19]. BAR measures angles using the phone's orientation to gravity via the smartphone sensors and displays results over live video in real time. The app can also show a grid overlay as a visualization aid. Real-time angle measurements on a smartphone facilitate immediate clinical decision making; however, BAR cannot measure ROM and specific body landmarks.
OpenPose-based markerless keypoint identification has also been implemented on a smartphone to measure ankle, knee, and hip flexion/extension angles by analyzing images taken from video recorded with the smartphone's camera. The OpenPose 2D real-time multi-person keypoint detection technology matches linked parts of the human body with people in the video using Convolutional Neural Networks (CNNs) [20]. However, the smartphone application has not been validated against a gold standard system such as Vicon. OpenPose was shown to provide better results than BlazePose (a smartphone-enabled model for the Google ML Kit) for typical movements used in clinical analysis [21]. While these smartphone tools could be used anywhere, analyzing human anatomical points beyond the available model is not feasible.
Considering the drawbacks of these methods, we propose a prototype, novel AR-based smartphone application using fiducial markers (AprilTag2) [22,23] to enable mobile human posture, body symmetry, and angle measurement in real time with inexpensive and accurate computational processes. Fiducial marker systems are defined objects in an optical imaging device's field of view that can be dynamically detected in software. These markers are best applied when a relative pose between a source and object is needed; for example, in robotics, AR, or human-computer interaction. Ideally, a clinician will be able to hold markers on a patient's anatomical locations and see the outcome measurement immediately on the mobile device.
In this research, we developed and evaluated the Biomechanics Augmented Reality Marker (BAR-M) smartphone app to fill the gap between cost, accessibility, and system accuracy for functionality in clinical situations, specifically human posture, body symmetry, and ROM measurement. After successful evaluation of this smartphone app in clinical situations, this AR-based digital measurement tool could be used by clinicians to obtain real-time angle measurements to assist in clinical decision making at the point of patient contact.

BAR-M System Development
The BAR-M system consists of three main components to be used in clinical practice: (1) AprilTag fiducial markers; (2) the mobile application; and (3) the BAR-M adapter.

AprilTag
AprilTag2 was used for the BAR-M application markers. AprilTag2 improved on the original AprilTag, providing greater reliability than other fiducial systems, with better detection performance, fewer false positives, and shorter computational time [22,23]. The lexicode-based generation process reduced false positive rates without hindering location accuracy [22,24]. AprilTag works robustly under lens distortion, occlusion, and warping [25], but with increased sensitivity to edge occlusions, limiting its effectiveness for cases where occlusion occurs [26]. Upon occlusion of the internal portion of the tags, AprilTag markers performed at satisfactory levels, yielding a detection rate of 50-100% depending on the tag ID [26]. AprilTag showed strong resistance to lateral and normal rotations, since markers were detected and recognized at 0°, 10°, 20°, 30°, 45°, 55°, and 65° in both rotation directions (clockwise and counterclockwise) [26]. Another study assessed AprilTag2 performance on a smartphone [27]. The results supported robust tag detection on smartphones and other computation-constrained devices, substantially enhancing their application for real-time tag tracking. Markers were identified and recognized at 0°, 5°, 10°, 15°, 20°, and 25° in both clockwise and counterclockwise rotations. Moreover, these markers demonstrated resistance to camera yaw and tilt rotations. Thus, even when the tags are not completely parallel and the mobile device screen is not vertical to gravity, AprilTag2 reliability on smartphones was supported [27].

BAR-M Application Design
The novel BAR-M Android app tracks AprilTag2 orientation and 2D position in real time. The core AprilTag2 image processing library (C language) was not modified and Java language was used for the AprilTag2 library with the Java Native Interface (JNI).
A background thread receives frames captured by the smartphone camera (Figure 1). To use the AprilTag2 library, this thread parses frame data and delivers the data to the JNI. Through the JNI, the AprilTag library returns (x, y) coordinates for marker corners and tag centers. This is done using regional slope equations and a scale factor (a ratio between marker real-world dimensions and marker coordinates in pixels) [28].
Overlayed visuals are rendered by OpenGL in the correct perspective relative to the smartphone and marker, along with marker coordinates transformed into a global affine coordinate frame (Figure 1).
In addition to displaying a box around each marker, the graphical overlay draws a line connecting marker centers. The app stores frame timings, marker corner coordinates, marker center coordinates, and marker side lengths in a .csv file (units in pixels). The user can choose between two live AR views: individual marker angle to horizontal (determined by the camera sensor gravity vector) or the angle between horizontal and a line connecting the centers of two markers (Figure 2).
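The two live AR views reduce to simple 2D image geometry. The sketch below (Python for illustration; the actual app uses the C AprilTag2 library through Java and the JNI) shows how the two-marker view's angle could be computed from the tag-center pixel coordinates the library returns. The function name is ours, not the app's:

```python
import math

def line_angle_to_horizontal(c1, c2):
    """Angle (degrees) between horizontal and the line joining two tag centers.

    c1, c2: (x, y) AprilTag center coordinates in pixels. Image y grows
    downward, so the y difference is negated to report a conventional
    counterclockwise-positive angle.
    """
    dx = c2[0] - c1[0]
    dy = -(c2[1] - c1[1])  # flip the image y-axis
    return math.degrees(math.atan2(dy, dx))
```

For example, two centers at the same image height give 0°, while a second center 200 px to the right and 100 px higher gives about 26.6°.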

BAR-M Adapter Design
AprilTag2 markers were mounted on custom 3D-printed adapters to enable positioning at anatomical locations (Figures 3 and 4):

Methods
This study was designed to assess BAR-M app accuracy compared with a Vicon motion capture system (Vicon MX40; 4 megapixels; Vicon Motion Systems Ltd., Oxford, UK). Vicon was used as a gold standard because of its high accuracy and precision. Distances between markers have been used to evaluate both the precision (0.015 mm) and accuracy (0.15 mm) of motion capture systems [7,29].
To evaluate BAR-M functionality for clinical situations, pelvis obliquity, shoulder position, and arm-abduction measurements were obtained for a body opponent bag (BOB) and human.

Body Opponent Bag
A BOB mannequin was the static human surrogate for initial evaluation, since anatomical-based measurements could be made without human movement and tissue variability. Reflective markers were attached to the BOB pelvis (superior iliac spines), shoulder (acromioclavicular joint), arm, and torso (xiphoid process) to enable 3D marker tracking with a 10-camera Vicon motion analysis system (Figure 4). Angles between the reflective markers were calculated as a gold standard comparator.
For BAR-M, a Samsung S6 smartphone was set up on a tripod such that the screen was 1 m in front of and parallel to the BOB frontal plane. The application tracked AprilTag2 coordinates, calculated the angle formed by the tag centers, displayed the angle on-screen in real time (i.e., the angle between a line connecting the two tags and the phone's orientation to gravity, reported as an angle from the horizontal), and stored tag coordinates, angles, and time for further analysis.
Vicon and app data were collected simultaneously for all trials. The app and Vicon system were synchronized by an auditory start cue. Vicon marker data were collected at 100 Hz. The BAR-M data collection rate was approximately 19 Hz, varying from 18 to 21 Hz depending on settings. One second of steady-state data (i.e., the one-second window with the lowest standard deviation) was averaged for each comparative measure.
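The steady-state selection can be viewed as a sliding-window search for the one-second segment with the lowest standard deviation. The following is an illustrative sketch of that analysis step (Python; function and parameter names are ours, not the study's actual processing code):

```python
import statistics

def steadiest_window_mean(angles, fs, window_s=1.0):
    """Mean of the window (default 1 s) with the lowest standard deviation.

    angles: angle samples in degrees; fs: sample rate in Hz
    (approximately 19 Hz for BAR-M, 100 Hz for Vicon).
    """
    n = max(2, round(fs * window_s))
    if len(angles) < n:
        raise ValueError("recording shorter than the averaging window")
    best = min((angles[i:i + n] for i in range(len(angles) - n + 1)),
               key=statistics.stdev)
    return statistics.fmean(best)
```

Applying the same rule to both data streams gives one comparable scalar per trial per system.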
AprilTag markers were co-located beside reflective markers. Two post adapters were used for pelvic obliquity measurements (Figure 4). Flat mounts were held in a diamond position (i.e., corner vertical) for shoulder angle measurement. Velcro adapter markers were secured to the upper arm and torso for arm abduction angle measurement (i.e., angle between torso and arm).
All measurements were made by the same person, who stood behind the BOB and held adapters next to the reflective markers. Since BOB arm abduction was constant with the Velcro attachment, 1 trial was recorded. Pelvis and shoulder angles were measured 10 times since BAR-M markers were hand held. Hand holding markers replicates a clinical use scenario where the clinician holds the markers on the patient's body and then reads the measurement from the screen.

Human Testing
A convenience sample of 15 healthy adults was recruited (14 male, 1 female). All measurements were made by one evaluator. All participants provided informed consent and signed a consent form (uOttawa Research Ethics Board approved). Exclusion criteria were balance problems that affected safe standing and cognitive problems that made following instructions difficult.
An eight-marker set was affixed to each participant: acromia, anterior superior iliac crests, torso, and arms. The participant was positioned 1 m in front of the phone, such that the body frontal plane was parallel to the phone screen.
For pelvis obliquity, the participant stood still with their right leg on a 2 mm thick plate, simulating hip misalignment, with arms at their sides. The evaluator stood behind the participant and held post adapters on the left and right posterior superior iliac crests for at least 3 s (Figure 4). In addition to the same saved data as the BOB test, the person making the measurement recorded the most consistent angle from the app's real-time display (i.e., angle that a clinician would select in practice). This procedure was repeated 10 times. Angles from the anterior superior iliac crests' reflective markers were used as comparators.
For shoulder angle, the participant stood with arms at their sides. The evaluator stood behind the participant and held two marker adaptors in a diamond orientation on top of the reflective markers on the acromioclavicular joints for at least 3 s (Figure 4). Saved data and evaluator recorded angle were logged. This procedure was repeated 10 times.
For arm abduction angle, the participant stood with arms at their sides and facing the camera. Velcro straps secured AprilTags to the participant. The evaluator positioned the torso AprilTag vertically at the center of the participant's front. Two reflective markers were positioned on the chest above and below the adaptor. The upper arm adaptor was aligned to the arm's long axis and two reflective markers were positioned beside the adaptor, along this axis. The participant abducted their arm to their comfortable range and held the position for 3 s (Figure 4). Angles of each AprilTag orientation with respect to the horizontal were displayed on the app screen and saved on the phone.
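Given per-tag orientations derived from corner coordinates, the arm abduction angle is the difference between the arm and torso tag angles. A minimal illustration of this geometry (Python; the helper names are hypothetical and the app's own Java/JNI pipeline is not shown):

```python
import math

def tag_angle_to_horizontal(corners):
    """Orientation of one tag edge relative to horizontal (degrees).

    corners: four (x, y) pixel corner coordinates in consistent order;
    the first two define one tag edge. Image y grows downward, so the
    y difference is negated.
    """
    (x0, y0), (x1, y1) = corners[0], corners[1]
    return math.degrees(math.atan2(-(y1 - y0), x1 - x0))

def abduction_angle(arm_corners, torso_corners):
    """Angle between the arm and torso tags (degrees), folded into [0, 180)."""
    return abs(tag_angle_to_horizontal(arm_corners)
               - tag_angle_to_horizontal(torso_corners)) % 180.0
```

With the torso tag vertical (90° to horizontal) and the arm tag at 60° to horizontal, the computed abduction is 30°.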
The absolute mean difference between the value read from the app AR display and data stored on the phone was 0.19° ± 0.09° for pelvis measurement (range 0.10° to 0.29°, p < 0.001) and 0.12° ± 0.09° for shoulder angle (range 0.03° to 0.23°, p = 0.83). Detailed participant test results are given in Table 1.

Arm Abduction
The difference between app and Vicon arm abduction measurements was not satisfactory. To investigate this further, the arm abduction angle was divided into: (a) angle between torso and horizontal, (b) angle between arm and horizontal, and (c) angle between arm and torso. Figure 5 shows mean differences for all participants and Figure 6 shows mean differences for participant 4, highlighting differences between participants in both mean and standard deviation. Torso standard deviations (SD) were consistent, but SD differed between trials for the arm-to-horizontal angle measurements. Figure 7 demonstrates these differences, since the line created from the arm reflective markers (green line) had a different angle than a line passing through the AprilTag's center (orange line). The difference between lines was smaller for the torso reflective markers (yellow line) and a line passing through the AprilTag marker (red line).

Discussion
The smartphone biomechanics augmented reality app was successfully developed, implemented, and evaluated. The app tracked and displayed accurate angles, based on evaluation with a Body Opponent Bag where anthropometric measures were not affected by clothing or participant movement. Therefore, these viable angle measurements can be used for point-of-contact assessments. Evaluations with humans also demonstrated viability for posture, body symmetry, and ROM assessment, but results were not as good as the BOB analyses, indicating that measurement method improvements could be made to improve human measurement accuracy. The ability to work in real time with the AR display enables clinicians to hand hold markers for efficient measurements while interacting with patients.
Various sources of marker-based measurement error have been reported for human movement analysis, and BAR-M was also adversely affected by these errors. These include skin movement over bone, clothing movement, and difficulty locating anatomical landmarks in areas with excessive tissue [30][31][32]. For this study, participants wore regular clothing since this better reflected an actual-use scenario. One of the limitations of the BAR-M application is the inability to automatically correct for errors caused by factors such as clothing movement while quantifying human posture and ROM. Autocorrection could be one of the future research directions to reduce systematic errors caused by external factors.

Pelvis and Shoulder Measurement
For shoulder and pelvis measurements, most participants had differences between Vicon and BAR-M of less than 2°. However, two people had differences of more than 3°. The greater differences could be due to difficulties positioning both marker sets (AprilTag and reflective markers) on the anatomical landmarks. For shoulder measurements, not holding the AprilTag flat mount in an exact vertical diamond shape on top of the reflective markers would cause an error, although the vertical diamond position enhances stability when holding a marker and aids positioning. For pelvis measurement, the greater differences could be due to the Vicon markers being taped onto clothing, since using the post adapter to locate anatomical landmarks could move clothing and thereby affect reflective marker locations. Vicon measurement standard deviation was greater than that of the mobile app, especially for people with more fat in the pelvic area, since body mass could move and affect Vicon marker locations when the evaluator located the landmarks with the post adapter. Therefore, this research highlights the practical issues for using marker-based analysis at the point of patient contact. Since the AprilTag mount is held at the appropriate location, regardless of clothing, this approach may be more appropriate than other data-collection-based approaches where markers experience clothing-movement error (i.e., when making real-time measurements during a clinical encounter).

Arm Abduction
Arm abduction differences between Vicon and app measurements were greater than the shoulder and pelvis measurements. In some cases, arm abduction differences were more than 10°, with the maximum being 28.46°. As discussed in the previous section, this error involved both the arm and chest markers; hence, error summation contributed to the larger differences (unlike the single-segment pelvis and shoulder measurements). All errors were systematic. Chest errors could be resolved by an appropriate Velcro strap and locating tags on the participant's back rather than the chest (i.e., a flatter surface). Moreover, securing the mount with tape on the skin could help, since strap movement due to breathing or torso movement would be reduced. The torso location would typically be used as a reference for limb angle measurement, so this marker would benefit from being secured on the person rather than hand held.
For the arm, errors occurred due to changes in alignment between the reflective markers and AprilTags, since the reflective markers moved when a participant moved their arm, predominantly due to clothing movement. Therefore, the Vicon marker-based measurement had large errors and was the main contributor to differences between the smartphone output and comparator data. Clothing movement errors are well known for marker-based systems [33]. In situations where participants are not comfortable removing their clothes (e.g., for cultural reasons or measurement in public), measurement errors caused by clothing are likely for adhesive-based approaches [34]. An appropriate strap-based AprilTag marker approach, or holding the marker on the person, could minimize clothing-based errors.
In the literature, marker-based systems were compared with bi-planar videoradiography, reporting joint center position errors of up to 30 mm and joint rotation errors of up to 14°, with averages of 9 to 19 mm and 2.2 to 5.5°, respectively [35,36]. All motion capture systems yield the largest errors in rotation along the longitudinal axis when measuring devices such as markers placed on the skin are closer to the axis of rotation, with hip internal-external rotation errors as high as 21.8° [37,38].
Reading the most consistent angle from the mobile screen in real time provided accurate results, supporting the AR approach for pose measurement. The average difference between reading the on-screen value and calculating the average angle from saved data was less than 0.2°. Therefore, reading instant measurements from the mobile screen is appropriate.
Fiducial markers were essential to avoid marker loss in environments with complex backgrounds and to avoid confusion between markers, since each fiducial marker is unique and more easily separated from the background. Another benefit of AprilTag markers is that marker corner coordinates are provided, which can be used to calculate a linear scale factor and enable distance measurements.
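To illustrate the scale-factor idea, a tag's known printed edge length and its detected corner pixel coordinates yield a millimetres-per-pixel ratio that converts pixel distances to real-world distances. This sketch assumes an undistorted, roughly fronto-parallel view, and the function names are illustrative:

```python
import math

def scale_mm_per_pixel(corners, tag_size_mm):
    """Millimetres per pixel from a tag's known physical edge length.

    corners: four (x, y) pixel corners in order around the tag;
    tag_size_mm: printed tag edge length in millimetres.
    Averaging the four edge lengths reduces perspective error slightly.
    """
    edges = [math.dist(corners[i], corners[(i + 1) % 4]) for i in range(4)]
    return tag_size_mm / (sum(edges) / 4)

def distance_mm(p1, p2, scale):
    """Real-world distance between two image points using the scale factor."""
    return math.dist(p1, p2) * scale
```

For a 50 mm tag imaged as a 100 px square, the scale is 0.5 mm/px, so two points 200 px apart are 100 mm apart in the world.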
With a smartphone, a trade-off exists between image resolution and AprilTag sample rate, with sample rate decreasing for higher resolutions. The current configuration enabled pose measurement with the entire body in the field of view. Newer mobile devices with faster processors will provide faster sample rates and/or higher video resolutions, which would then enable the camera to be located further away or smaller AprilTags to be used. The BAR-M app can set the appropriate resolution for a functional frame rate across phones with differing processing capacities.

Conclusions
The "Biomechanics Augmented Reality-Marker" (BAR-M) application was developed and evaluated for functional application. The BAR-M application detected, tracked, and calculated anatomical angles between fiducial landmarks in real time. Angles were displayed on-screen in real time and detailed data were stored on the mobile device. This AR smartphone app was viable for range of motion, body symmetry, and postural measurements required by clinicians and researchers, both from reading the real-time angle and from postprocessing saved data. The smartphone app approach could help clinicians assess human body posture and range of motion at the point of patient contact. A successful app opens the door to many new research opportunities for 3D and dynamic postural measurement. Additionally, AprilTag3 was recently introduced, providing better tag detection and reduced processing time, so the Z direction (depth) can be computed live without delay [39]. Addition of the depth component could improve error correction for hand-held markers or for patient movement out of the camera plane.
Measurement methodologies could be improved to compensate for clothing and other factors that can introduce measurement error. This includes developing straps that can be quickly applied to the body for a consistent fiducial marker location and expanding the current set of adapters to make human measurement efficient for the clinician and the person being measured (e.g., curved, square, foldable adapters).
Author Contributions: All authors were fully involved in the study and preparation of the manuscript and the material within has not been and will not be submitted for publication elsewhere. All authors have read and agreed to the published version of the manuscript.