
A Neural Network Approach for Inertial Measurement Unit-Based Estimation of Three-Dimensional Spinal Curvature

Department of Computer Science, The University of Hong Kong, Pokfulam, Hong Kong, China
Laboratory for Artificial Intelligence in Design, Hong Kong Science Park, New Territories, Hong Kong, China
School of Fashion and Textiles, The Hong Kong Polytechnic University, Hung Hom, Hong Kong, China
Authors to whom correspondence should be addressed.
Sensors 2023, 23(13), 6122;
Submission received: 23 May 2023 / Revised: 27 June 2023 / Accepted: 30 June 2023 / Published: 3 July 2023
(This article belongs to the Section Biomedical Sensors)


The spine is an important part of the human body. Thus, its curvature and shape are closely monitored, and treatment is required if abnormalities are detected. However, the current methods of spinal examination mostly rely on two-dimensional static imaging, which does not provide real-time information on dynamic spinal behaviour. Therefore, this study explored an easier and more efficient method, based on machine learning and sensors, to determine the curvature of the spine. Fifteen participants were recruited and performed test movements to generate data for training a neural network. The trained network estimated the spinal curvature from the readings of three inertial measurement units, with an average absolute error of 0.261161 cm.

1. Introduction

The spine, or vertebral column, is an important bony structure that connects the head to the pelvis [1]. Abnormal spinal curvature may cause lower back pain [2] and neck pain [3,4,5,6] and, in severe cases, obvious body imbalance and even cardiorespiratory complications [7]. However, abnormal curvature is considered a condition that can be diagnosed and then treated. Methods of diagnosing abnormal curvature include the forward bend test and radiography [8,9,10].
School screening was conducted in the United States as far back as the 1960s to detect scoliosis, a spinal deformity that is estimated to affect 2–3 percent of the general population [11]. Initially, postural tests were conducted, and poor posture was considered a health hazard. Following this, in 1965, the English physician William Adams devised the Adam’s forward bend test (AFBT) [12], which, owing to its effectiveness and convenience, remains a popular and useful method of screening for scoliosis [13]. The AFBT has recently been modified, with the newer technique extending the evaluation of rotational flexibility and the classification of the type of spinal curvature [14]. Nevertheless, the AFBT is not the only practical examination method. A few years after the AFBT was devised, Moiré topography was developed to analyse the asymmetry of the trunk to screen for structural scoliosis [15]. Later, Drerup and Hierholzer [16] developed a three-dimensional (3D) surface screening technique called raster stereography to automatically define anatomical landmarks on the surface of the back and thereby measure spinal deformity. However, although this technique is accurate for early detection, it cannot be used to monitor the progression of spinal curvature [17].
In fact, none of the above-mentioned examination methods are as precise as radiographic examination. Thus, plain radiography is now considered the gold-standard technique for measuring spinal curvature and detecting spinal deformities [18]. This approach involves passing ionising radiation (X-rays) through the body to obtain a two-dimensional (2D) image or, with computed tomography, a 3D scan [19]. Owing to the risk of radiation exposure, alternative technologies that reduce this exposure have been devised. One successfully applied alternative is magnetic resonance imaging (MRI), which allows 3D reconstruction of the spine. However, MRI has a limited scanning volume and requires professional operators. Additionally, a disadvantage of these three technologies is their high financial cost, which may place a heavy economic burden on patients and their families [20,21]. Another successfully applied technology is the DIERS Formetric 4D scanner, which creates a 3D image rather than a radiograph of the spine [22]. However, this type of scanner is not universally used, owing to the expertise required to operate the equipment and software and the need for the patient’s back to be bare. Additionally, with the development of computer vision, artificial intelligence has been used to analyse images of the naked back to determine the spinal curvature [23]. The technologies described above are used for static evaluation of the spine. However, dynamic assessment should also be considered, because it is more closely related to patients’ health-related quality of life [24]. To this end, motion capture has been verified as an accurate and reliable method of measuring dynamic spinal alignment [18].
Currently, in the healthcare industry, new sensor-based applications are being developed to help people understand their bodies, e.g., sensors that monitor the heart rate [25,26,27] or blood sugar concentration [28], thermometers that measure body temperature [29], and sensors that monitor muscle activity [30]. In particular, inertial measurement units (IMUs) have been found to be effective for such sensing [31,32,33]. In terms of the clinical use of IMUs for examining the spine, the authors of [34] validated the reliability of IMU sensors used on the lumbar part of the spine. They also stated the need for further research evaluating the performance of IMU sensors for examining healthy populations and other parts of the spine that they did not study.
In addition to sensors, data-driven approaches, such as machine learning, are necessary for training a model that relates data obtained from IMU sensors to the degree of spinal curvature [33]. Recently, neural network (NN) techniques and data from IMUs have been combined in human motion analyses. The motion information of multiple joints during walking was accurately predicted using only a single IMU sensor and an artificial NN model, which demonstrated the feasibility of applying this combination in the analysis of biomechanical dynamics [35]. Similarly, a convolutional NN method was successfully adopted to estimate golf swing phases using data collected from an IMU sensor placed on different body parts [36]. However, it does not seem possible to estimate the degree of spinal curvature using only one IMU sensor, as spinal curvature varies from person to person and usually has two apexes. Hence, more sensors are needed for such applications.
Accordingly, the purpose of this paper is to describe a new method to estimate and monitor spinal curvature in real time using three IMU sensors and machine learning technology. Hardware with three IMUs was developed and an NN model was built to estimate the curvature from the hardware readings. The estimates were compared with the results of a motion capture system, which were also the ground truth for the machine learning model, to evaluate the accuracy and reliability of this new method.

2. Materials and Methods

The NN was designed to estimate the spinal curvature from the orientations of the IMUs in three different locations. The ground-truth spinal curvature was obtained using a Vicon motion capture system and a custom-developed sensor strip, via the following five steps: (1) sensor strip development, (2) data collection, (3) data processing, (4) neural network training, and (5) cross-validation.
Spinal curvature is defined as a line representing the curve of the spine, which can be interpolated from 10 points along the line. Each point is represented by a 3D vector with a position in the x-, y-, and z-dimensions. The points are ordered from bottom to top, with the last point representing the location of the vertebra prominens (C7).
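The definition above (a curve recovered from 10 ordered 3D points) can be sketched in code. This is an illustrative implementation only — the text does not specify the interpolation scheme, so a cubic B-spline via SciPy is an assumption, and the function name is ours:

```python
import numpy as np
from scipy.interpolate import splprep, splev

def interpolate_spine_curve(points, n_samples=100):
    """Interpolate a smooth 3D curve through ordered spine points.

    points: (10, 3) array ordered bottom to top, with the last point at C7.
    Returns an (n_samples, 3) array of points along the fitted curve.
    """
    points = np.asarray(points, dtype=float)
    # Fit an exact (s=0) cubic B-spline parameterised along the point sequence.
    tck, _ = splprep(points.T, s=0)
    u = np.linspace(0.0, 1.0, n_samples)
    return np.stack(splev(u, tck), axis=1)
```

With `s=0` the spline passes exactly through each of the 10 input points, so the sampled curve starts and ends at the first and last markers.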

2.1. Sensor Attachment

A custom-made vest and strip were created for mounting the three IMUs along the spine. The vest was tightly fitted to each participant’s body with elastic belts that prevented the dislocation of the vest during body movement. The strip was a 7 cm wide and 50 cm long piece of elastic fabric that had electric components sewn onto its surface.
The sensor strip used in the study consisted of five components: three IMUs, a microcontroller unit, and a power-delivery module. Two of the IMUs were MPU-9250 modules from InvenSense, while the third was an LSM6DS3 module from STMicroelectronics on an Arduino Nano 33 IoT board. Both types of IMU module had an integrated 3D digital accelerometer and a 3D digital gyroscope. The MPU-9250 is a small multi-chip module that combines a 3-axis gyroscope, a 3-axis accelerometer, and a 3-axis magnetometer, and includes a Digital Motion Processor for complete 9-axis MotionFusion output, making it well suited to motion-tracking applications. The device also features 16-bit ADCs, programmable digital filters, a precision clock, an embedded temperature sensor, and programmable interrupts, and supports both I2C and SPI serial interfaces. The sensitivity scale factors of the gyroscope, accelerometer, and magnetometer are 16.4 LSB/(°/s), 2048 LSB/g, and 0.6 µT/LSB, respectively. The IMUs were positioned at three different distances from the top of the fabric strap, and the printed circuit boards were aligned so that the centre of each IMU integrated circuit lay on the centre line of the strap with the IMU orientations aligned, to ensure accurate measurements. A schematic of the sensor strip is shown in Figure 1.
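For illustration, the quoted sensitivity scale factors convert raw signed 16-bit counts into physical units as sketched below. The full-scale ranges implied by these factors (±2000 °/s for the gyroscope and ±16 g for the accelerometer) are an assumption on our part, as is the function name:

```python
# Sensitivity scale factors quoted in the text for the MPU-9250.
GYRO_LSB_PER_DPS = 16.4    # LSB per °/s (implies ±2000 °/s full scale)
ACCEL_LSB_PER_G = 2048.0   # LSB per g (implies ±16 g full scale)
MAG_UT_PER_LSB = 0.6       # µT per LSB

def raw_to_physical(gyro_raw, accel_raw, mag_raw):
    """Convert raw ADC counts (x, y, z triples) to °/s, g, and µT."""
    gyro_dps = tuple(v / GYRO_LSB_PER_DPS for v in gyro_raw)   # divide by LSB/unit
    accel_g = tuple(v / ACCEL_LSB_PER_G for v in accel_raw)    # divide by LSB/unit
    mag_ut = tuple(v * MAG_UT_PER_LSB for v in mag_raw)        # multiply by unit/LSB
    return gyro_dps, accel_g, mag_ut
```

Note the asymmetry: the gyroscope and accelerometer factors are given in LSB per unit (so the raw count is divided), whereas the magnetometer factor is given in µT per LSB (so the raw count is multiplied).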

2.2. Data Collection

2.2.1. Participants

Fifteen healthy participants without physical or mobility impairment were recruited for the study. The details of the research work were explained to all of the participants, and they gave written informed consent prior to the start of the experiment. The research was given ethical approval by the Hong Kong Polytechnic University (reference number HSEARS20171214002). The demographic information of the participants is summarised in Table 1.

2.2.2. Instrumentation

The spinal curvature of the participants was measured using a Vicon motion capture system. The data were collected with software developed for the study using Vicon’s DataStream SDK. Ten 14 mm reflective markers were attached to the sensor strip at 5 cm intervals, starting from the top of the strip. The participants were instructed to wear a tight vest with Velcro on the back. The sensor strip was then attached to the back of the vest, with the top of the strip aligned to C7 of the spine. The participants were then asked to stand at the centre of the measurement chamber. A system check was performed before the data collection began.

2.2.3. Motion Capture Experiment

Two motion sets were required for all of the participants. The first motion set had 10 steps, which covered eight directions of movement that a human typically performs. The first step was flexion, which meant that the participants bent forwards to 90 degrees. The participants were then asked to extend backwards as far as possible. The third step was left lateral flexion, during which the participants bent to the left side without raising their pelvises or feet. In the fourth step, the participants bent to the right side with the same requirements. In the fifth step, the participants were asked to bend to the left side and forwards at the same time. The sixth step was bending to the right and forwards at the same time. The seventh step was a combination of bending to the left and extending backwards as far as possible, and the eighth step was a combination of bending to the right and extending backwards. The ninth and tenth steps were rotation to both sides to slightly twist the spine. The participants were instructed to return to a normal standing posture between steps and to ensure each movement was slow and steady. The complete motion set is shown in Figure 2.
The second motion set covered the spinal curve between the forward bending motions in the first motion set and comprised nine steps. The participants first bent forwards at 90 degrees and then extended to the right side. This posture was the starting position for the motion set. Next, the participants were asked to slowly move to the left side without raising or lowering their bodies, which was the first sweep. The third step was raising their heads approximately 22.5 degrees from horizontal, which was equal to one-fourth of the vertical standing posture. In the fourth step, the participants swept to the right side without raising or lowering their bodies. In the fifth and sixth steps, they raised their heads again to approximately 30 degrees and slowly moved to the left, respectively. In the seventh step, they raised their heads once again to approximately 45 degrees, and in the eighth step they swept to the right side. Finally, the participants slowly returned to the vertical standing posture. Figure 3 shows the movements for the second motion set. The first and second motion sets were performed four times and twice, respectively, to ensure the reliability of the collected data.

2.3. Data Processing

The raw data collected were processed before being used to train the NN. The data collected from the motion capture system and the sensor strip were processed separately and then combined to construct the dataset.

2.3.1. Motion Capture Data Processing

The Vicon motion capture system captured the locations of the markers in a stream of frames. One or more markers were not visible in some of the frames, so these incomplete frames were eliminated. The recognised markers in a frame were not ordered, so they were sorted by repeatedly selecting the marker closest to the previous one, with the marker closest to the ground taken as the first marker.
The reported locations of the markers were relative to the centre of the measurement chamber. The origin was moved to the first marker by subtracting the position of the first marker from the positions of all of the sorted markers.
The locations were expressed as 3D vectors with forward, rightward, and upward axes in units of millimetres. The vectors were scaled towards the range of −1 to 1 by dividing the forward and rightward components by 50 mm and the upward component by 500 mm. The data were not clipped to this range, so values exceeding −1 and 1 were possible.
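The motion-capture processing steps above (discarding incomplete frames, nearest-neighbour ordering from the lowest marker, shifting the origin, and scaling the axes) can be sketched as follows; the function name is illustrative:

```python
import numpy as np

def process_frame(markers):
    """markers: iterable of 3D [forward, rightward, upward] positions in mm,
    in arbitrary order.  Returns normalised, ordered positions, or None if
    the frame is incomplete (fewer than 10 visible markers)."""
    markers = [np.asarray(m, dtype=float) for m in markers]
    if len(markers) < 10:
        return None  # a marker was occluded; discard the frame
    # Start from the marker closest to the ground (smallest upward value),
    # then repeatedly pick the nearest remaining marker.
    ordered = [min(markers, key=lambda m: m[2])]
    remaining = [m for m in markers if m is not ordered[0]]
    while remaining:
        nxt = min(remaining, key=lambda m: np.linalg.norm(m - ordered[-1]))
        remaining = [m for m in remaining if m is not nxt]
        ordered.append(nxt)
    ordered = np.stack(ordered)
    ordered = ordered - ordered[0]  # move the origin to the first marker
    # Scale forward/rightward by 50 mm and upward by 500 mm (not clipped).
    return ordered / np.array([50.0, 50.0, 500.0])
```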

2.3.2. Sensor Strip Data Processing

The sensor strip provided data from the accelerometer, also in a stream of frames. However, the sensor strip and the motion-capture system worked asynchronously at the hardware level. Hence, the frames reported by the sensor strip were not synchronised with those reported by the motion-capture system. The frames were therefore synchronised by linear interpolation.
Each motion-capture frame’s timestamp t was checked against the IMU data frames IMU(t). IMU data frames whose timestamps exactly matched a motion-capture data frame were used without interpolation. Otherwise, the frames immediately before (IMU(t0)) and after (IMU(t1)) timestamp t were used to interpolate the value of IMU(t), where t0 and t1 are the timestamps of those frames. The interpolation equation is written as

IMU(t) = IMU(t), if IMU(t) exists;
IMU(t) = IMU(t0) + (IMU(t1) − IMU(t0)) × (t − t0) / (t1 − t0), otherwise.
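The synchronisation rule above can be sketched in code; the function name is illustrative, and t is assumed to lie within the range of the IMU timestamps:

```python
import bisect

def sync_imu_frame(t, imu_times, imu_values):
    """Linearly interpolate the IMU stream at motion-capture timestamp t.

    imu_times: sorted IMU timestamps; imu_values: matching readings (one
    scalar per frame here; the rule extends component-wise to vectors)."""
    i = bisect.bisect_left(imu_times, t)
    if i < len(imu_times) and imu_times[i] == t:
        return imu_values[i]  # exact timestamp match: use the frame as-is
    # Otherwise interpolate between the frames straddling t.
    t0, t1 = imu_times[i - 1], imu_times[i]
    v0, v1 = imu_values[i - 1], imu_values[i]
    return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
```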

2.4. Neural Network

An NN was trained to estimate the spinal curvature from an input of the three IMUs’ accelerometer readings, each a 3D vector. The estimate was expressed as nine points along the curvature, as 3D vectors originating from the first point, which was fixed at (0, 0, 0). See Figure 4 for the input and output of the NN’s training and inference.
The network had nine inputs, representing the x-, y-, and z-axis readings of the three accelerometers. There were 27 nodes at the output, corresponding to the x-, y-, and z-locations of the nine points along the spinal curvature. The dense layers in between had an hourglass shape of 27, 18, 9, and 18 nodes, respectively, and all were fully connected.
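The layer sizes described above can be expressed as a plain NumPy forward pass. This is an illustrative sketch of the architecture only: the study presumably used a deep learning framework, and the weights here are random placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hourglass architecture from the text: 9 inputs -> 27 -> 18 -> 9 -> 18
# hidden nodes -> 27 outputs (x, y, z of nine curve points).
LAYER_SIZES = [9, 27, 18, 9, 18, 27]
weights = [rng.normal(0.0, 0.1, (m, n))
           for m, n in zip(LAYER_SIZES, LAYER_SIZES[1:])]
biases = [np.zeros(n) for n in LAYER_SIZES[1:]]

def forward(x):
    """Fully connected forward pass: ReLU on hidden layers, linear output."""
    for i, (w, b) in enumerate(zip(weights, biases)):
        x = x @ w + b
        if i < len(weights) - 1:
            x = np.maximum(x, 0.0)  # rectified linear unit activation
    return x
```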
The collected data were divided into training, validation, and testing datasets in proportions of 50%, 25%, and 25%, respectively. All of the data frames collected in the same session were placed into the same dataset, so that data from the same session were not separated into training and testing datasets. The network was trained using a rectified linear unit as the activation function and the mean squared error as the loss function. RMSprop was used as the optimiser, with Rho = 0.9 and Momentum = 0. See Table 2 for all of the parameters. The training converged quickly and stopped at Epoch = 250.
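The session-wise split described above can be sketched as follows. Note that this assigns whole sessions to each subset, so the 50%/25%/25% ratio is approximate rather than exact at the frame level, and the helper name is illustrative:

```python
import random

def split_by_session(frames_by_session, seed=0):
    """Assign whole sessions to train/val/test so that frames from one
    session never straddle the split.

    frames_by_session: {session_id: [frame, ...]}."""
    ids = sorted(frames_by_session)
    random.Random(seed).shuffle(ids)
    n = len(ids)
    cut1, cut2 = n // 2, (3 * n) // 4  # ~50% / 25% / 25% of sessions
    groups = {"train": ids[:cut1], "val": ids[cut1:cut2], "test": ids[cut2:]}
    return {name: [f for sid in sids for f in frames_by_session[sid]]
            for name, sids in groups.items()}
```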
The trained NN was cross validated to detect overfitting or memorising. The data of one male and one female participant were randomly excluded from the collected data and formed a cross-validation dataset. Hence, the NN had not seen the data used in cross-validation. See Table 3 for the participant profiles. The performance of the trained NN was evaluated using the cross-validation dataset, with this being undertaken only once, after training. This procedure evaluated the performance of the NN using unseen data.

3. Results

There were 476,977 frames collected in 15 data sessions. The frames were cleaned and processed before being used in network training. The numbers of frames used in NN training, testing, validation, and cross-validation were 250,900, 83,633, 83,633, and 58,810, respectively. The training, validation, and testing errors were 0.0251, 0.0244, and 0.0366, respectively.
The collected dataset is available on (accessed on 22 May 2023) for public access.
The predictions of the trained NN were compared with the ground-truth data. The average component-wise error of the estimated marker positions was −0.261161 ± 2.510505 cm. Table A1 in Appendix A provides the error in each component.
The trained neural network was then applied to two scoliosis patients to evaluate its applicability to subjects with spinal deformities. Three IMUs were attached along the spine of each patient using adhesive tape, at 11.5 cm, 32 cm, and 44 cm from C7, respectively. The patients were instructed to sit upright while the IMU readings were recorded. The spinal curvature was then obtained by applying the neural network to the recorded IMU readings. See Figure 5 for the estimated spinal curvatures, and Table A2 in Appendix B for the detailed estimated data. The Cobb angle was measured as the angle between the most and least tilted points along the estimated spinal curvature. See Table 4 for the Cobb angles measured from the patients’ X-ray images and from the estimated spinal curves.
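The Cobb-angle measurement described above can be sketched in code. Treating “tilt” as the tangent angle away from the vertical axis in the coronal plane is our reading of the text, and the function name is illustrative:

```python
import math

def cobb_angle(points):
    """Estimate the Cobb angle (degrees) from ordered 3D curve points.

    The tilt of each segment is its tangent angle away from the vertical
    (y) axis in the coronal (x-y) plane; the Cobb angle is the difference
    between the most and least tilted segments."""
    tilts = []
    for (x0, y0, _), (x1, y1, _) in zip(points, points[1:]):
        tilts.append(math.degrees(math.atan2(x1 - x0, y1 - y0)))
    return max(tilts) - min(tilts)
```

A perfectly straight vertical curve yields a Cobb angle of zero, and any lateral deviation increases the angle.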

4. Discussion

This paper proposes a neural network approach for monitoring spinal curvature in real time, which successfully predicts the degree of spinal curvature from the readings of three IMUs with an average absolute error of 0.261161 cm. The results of our study show that this technology can provide accurate and reliable measurements of spinal curvature and offers a promising solution for the real-time monitoring of spinal curvature.
This method only requires a sensor strip and an electronic device, such as a laptop or smartphone, so its convenience and cost-effectiveness are unrivalled. Specifically, spinal curvature measurements can be undertaken in a patient’s home or a clinic, reducing the need for frequent hospital visits and lowering overall healthcare costs.
Another key advantage of this method is its ability to provide real-time monitoring of spinal curvature. This feature allows for continuous monitoring of changes in spinal curvature, which can increase the effectiveness of treatment. Patients can track their treatment progress and see its effects, which can increase their engagement and motivation to continue with the treatment. Medical professionals such as clinicians will find that this method offers more accurate data for optimising the treatment plan in a timely manner. Additionally, the method enables real-time continuous applications that rely on spinal curvature data, which cannot be realised with traditional methods such as X-ray imaging. For example, the method is well suited to biofeedback training, including postural training, where monitoring of the posture is necessary to provide immediate feedback on whether correction is required. Further applications will therefore be developed in subsequent work.
One limitation of this study is that the sensor strip is designed for laboratory use and may not be as convenient to use outside the laboratory setting. Therefore, improvements to the design of the sensor strip are necessary to enable its application in other settings, such as clinics. For instance, it is necessary to cover the electric components when the sensor strip is being used by doctors or patients. Furthermore, outliers are observed in the mid-range of the estimated spinal curvature values. The accuracy and performance can perhaps be further improved by increasing the number of participants and balancing the ratio of male to female participants. This, together with the recruitment of participants with larger age and body-size ranges, should be the focus of future work.
Another limitation is that an accumulation of error along the markers is observed: the error of a previous marker is carried over to the subsequent markers. The errors in the x- and z-axes are also significantly larger than those in the y-axis. This is believed to be due to the lack of information in the accelerometer readings about the world orientation perpendicular to gravity. However, it is assumed that accuracy in estimating the curvature is more important than restoring the correct world orientation: although the error in marker positions for the same curvature facing south versus east can be very large, the curvature itself remains the same. For applications that also require an absolute world orientation, further research is needed on including relevant sensors, for example, a compass.

5. Conclusions

This study demonstrates the potential of machine learning technology and IMU-based systems for the real-time monitoring of spinal curvatures. This method offers several advantages, including dynamic assessment, convenience, cost-effectiveness, and reduced radiation exposure. However, further research is needed to validate these findings and optimize the design.

Author Contributions

Conceptualization, T.H.A.M. and J.Y.; methodology, J.Y. and T.W.C.; software, T.H.A.M.; validation, T.H.A.M. and R.L.; formal analysis, R.L.; investigation, T.H.A.M.; resources, J.Y. and T.W.C.; data curation, T.H.A.M.; writing—original draft preparation, T.H.A.M. and R.L.; writing—review and editing, R.L.; visualization, T.H.A.M.; supervision, J.Y.; project administration, T.W.C. and J.Y.; funding acquisition, J.Y. All authors have read and agreed to the published version of the manuscript.


Funding

This research was funded by the Laboratory for Artificial Intelligence in Design (Project Code: Rp1-4) under the InnoHK Research Cluster, Hong Kong Special Administrative Region Government on 1 August 2020. The principal investigator is Dr. Joanne Yip. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Institutional Review Board Statement

The experimental procedure was approved by the Human Subjects Ethics Subcommittee of The Hong Kong Polytechnic University, reference number HSEARS20171214002.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study. Written informed consent has been obtained from the patient(s) to publish this paper.

Data Availability Statement

The collected dataset is available on (accessed on 22 May 2023) for public access.


Acknowledgments

We express our appreciation to the participants for their time and effort in the data collection process.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Errors (cm) in each component of estimated marker positions.
Marker | Error in x-Axis (cm) | Error in y-Axis (cm) | Error in z-Axis (cm)
(values are mean ± SD)
Marker 2 | −0.039246 ± 0.654629 | 0.009011 ± 0.407979 | 0.009011 ± 0.407979
Marker 3 | −0.082154 ± 1.458481 | 0.101323 ± 0.535569 | −0.018093 ± 1.402238
Marker 4 | −0.144836 ± 1.789642 | 0.046761 ± 0.779287 | −0.166073 ± 1.779907
Marker 5 | −0.291647 ± 2.414006 | 0.099789 ± 0.898368 | −0.245607 ± 2.405230
Marker 6 | −0.469228 ± 3.085831 | 0.176705 ± 1.103621 | −0.440173 ± 3.037856
Marker 7 | −0.632811 ± 3.805284 | 0.213977 ± 1.293863 | −0.593567 ± 3.683613
Marker 8 | −0.821146 ± 4.575898 | 0.298431 ± 1.531894 | −0.721798 ± 4.285308
Marker 9 | −0.995856 ± 5.349810 | 0.353473 ± 1.820733 | −0.879281 ± 4.881974
Marker 10 | −1.187598 ± 6.222460 | 0.343759 ± 2.205439 | −1.053195 ± 5.482679

Appendix B

Table A2. Detailed data of estimated spinal curvature.
Item | Subject #1 | Subject #2
IMU 1—Acceleration X | −0.04517 | 0.04333
IMU 1—Acceleration Y | 0.61755 | 0.89661
IMU 1—Acceleration Z | −0.67151 | −0.44287
IMU 2—Acceleration X | −0.05713 | 0.00903
IMU 2—Acceleration Y | 0.98938 | 0.99658
IMU 2—Acceleration Z | −0.00684 | 0.03027
IMU 3—Acceleration X | 0.01709 | −0.02991
IMU 3—Acceleration Y | 0.97375 | 0.98401
IMU 3—Acceleration Z | 0.04199 | −0.17786
Marker 2—X | −2.4248275756835938 | −2.441648244857788
Marker 2—Y | 27.026416778564453 | 34.43146133422852
Marker 2—Z | −16.348039627075195 | −8.949712753295898
Marker 3—X | 2.98522424697876 | 1.0537102222442627
Marker 3—Y | 81.97868347167969 | 88.9962158203125
Marker 3—Z | −26.029380798339844 | −15.354696273803713
Marker 4—X | 5.769716739654541 | 2.9824023246765137
Marker 4—Y | 110.39649963378906 | 124.36925506591795
Marker 4—Z | −23.595386505126957 | −6.963764190673828
Marker 5—X | 8.40865707397461 | 5.85613489151001
Marker 5—Y | 160.68186950683594 | 177.68435668945312
Marker 5—Z | −10.85873794555664 | 6.670775413513184
Marker 6—X | 8.916449546813965 | 6.876500129699707
Marker 6—Y | 210.945068359375 | 230.6820373535156
Marker 6—Z | −8.876883506774902 | 12.186483383178713
Marker 7—X | 8.91054630279541 | 7.606303215026855
Marker 7—Y | 258.7609558105469 | 281.3783569335937
Marker 7—Z | −10.991703033447266 | 12.565881729125977
Marker 8—X | 11.808712005615234 | 11.296859741210938
Marker 8—Y | 310.0574035644531 | 334.2227478027344
Marker 8—Z | −14.924169540405272 | 9.177355766296388
Marker 9—X | 13.446818351745604 | 13.42614459991455
Marker 9—Y | 358.4640197753906 | 384.51568603515625
Marker 9—Z | −24.18433952331543 | 1.3085908889770508
Marker 10—X | 12.886392593383787 | 13.651782035827637
Marker 10—Y | 403.5655212402344 | 431.4049072265625
Marker 10—Z | −47.075775146484375 | −20.73443984985352
Cobb Angle | 10.957183742351745 | 13.13111123849918


References

  1. Frost, B.A.; Camarero-Espinosa, S.; Foster, E.J. Materials for the Spine: Anatomy, Problems, and Solutions. Materials 2019, 12, 253.
  2. Kostuik, J.P.; Bentivoglio, J. The Incidence of Low-Back Pain in Adult Scoliosis. Spine 1981, 6, 268–273.
  3. Coenen, P.; Smith, A.; Paananen, M.; O’Sullivan, P.; Beales, D.; Straker, L. Trajectories of Low Back Pain from Adolescence to Young Adulthood. Arthritis Care Res. 2017, 69, 403–412.
  4. Alshami, A.M. Prevalence of Spinal Disorders and Their Relationships with Age and Gender. Saudi Med. J. 2015, 36, 725–730.
  5. Borzì, F.; Szychlinska, M.; Rosa, M.D.; Musumeci, G. A Short Overview of the Effects of Kinesio Taping for Postural Spine Curvature Disorders. J. Funct. Morphol. Kinesiol. 2018, 3, 59.
  6. Raciborski, F.; Gasik, R.; Klak, A. Disorders of the spine. A major health and social problem. Reumatol./Rheumatol. 2016, 4, 196–200.
  7. Teixeira da Silva, L.E.C.; de Barros, A.G.C.; de Azevedo, G.B.L. Management of severe and rigid idiopathic scoliosis. Eur. J. Orthop. Surg. Traumatol. 2015, 25 (Suppl. S1), S7–S12.
  8. Kwok, G.; Yip, J.; Yick, K.L.; Cheung, M.C.; Tse, C.Y.; Ng, S.P.; Luximon, A. Postural Screening for Adolescent Idiopathic Scoliosis with Infrared Thermography. Sci. Rep. 2017, 7, 14431.
  9. Akazawa, T.; Torii, Y.; Ueno, J.; Saito, A.; Niki, H. Mobile Application for Scoliosis Screening Using a Standard 2D Digital Camera. Cureus 2021, 13, e13944.
  10. Oakley, P.A.; Ehsani, N.N.; Harrison, D.E. The Scoliosis Quandary: Are Radiation Exposures from Repeated X-Rays Harmful? Dose-Response 2019, 17, 155932581985281.
  11. Tambe, A.D.; Panikkar, S.J.; Millner, P.A.; Tsirikos, A.I. Current concepts in the surgical management of adolescent idiopathic scoliosis. Bone Jt. J. 2018, 100-B, 415–424.
  12. Fairbank, J. Historical Perspective. Spine 2004, 29, 1953–1955.
  13. Plaszewski, M.; Nowobilski, R.; Kowalski, P.; Cieslinski, M. Screening for scoliosis. Int. J. Rehabil. Res. 2012, 35, 13–19.
  14. Senkoylu, A.; Ilhan, M.N.; Altun, N.; Samartzis, D.; Luk, K.D.K. A simple method for assessing rotational flexibility in adolescent idiopathic scoliosis: Modified Adam’s forward bending test. Spine Deform. 2020, 9, 333–339.
  15. Willner, S. Moiré Topography for the Diagnosis and Documentation of Scoliosis. Acta Orthop. Scand. 1979, 50, 295–302.
  16. Drerup, B.; Hierholzer, E. Automatic localization of anatomical landmarks on the back surface and construction of a body-fixed coordinate system. J. Biomech. 1987, 20, 961–970.
  17. Bassani, T.; Stucovitz, E.; Galbusera, F.; Brayda-Bruno, M. Is Rasterstereography a Valid Noninvasive Method for the Screening of Juvenile and Adolescent Idiopathic Scoliosis? Eur. Spine J. 2019, 28, 526–535.
  18. Severijns, P.; Overbergh, T.; Thauvoye, A.; Baudewijns, J.; Monari, D.; Moke, L.; Desloovere, K.; Scheys, L. A subject-specific method to measure dynamic spinal alignment in adult spinal deformity. Spine J. 2020, 20, 934–946.
  19. Michaud, F.; Lugrís, U.; Cuadrado, J. Determination of the 3D Human Spine Posture from Wearable Inertial Sensors and a Multibody Model of the Spine. Sensors 2022, 22, 4796.
  20. Janssen, M.; Nabih, A.; Moussa, W.; Kawchuk, G.N.; Carey, J.P. Evaluation of diagnosis techniques used for spinal injury related back pain. Pain Res. Treat. 2011, 2011, 478798.
  21. Faria, R.; McKenna, C.; Wade, R.; Yang, H.; Woolacott, N.; Sculpher, M. The EOS 2D/3D X-ray imaging system: A cost-effectiveness analysis quantifying the health benefits from reduced radiation exposure. Eur. J. Radiol. 2013, 82, e342–e349.
  22. Kumar, V.; Cole, A.; Breakwell, L.; Michael, A.L.R. Comparison of the DIERS Formetric 4D Scanner and Plain Radiographs in Terms of Accuracy in Idiopathic Scoliosis Patients. Glob. Spine J. 2016, 6 (Suppl. 1), s-0036.
  23. Yang, J.; Zhang, K.; Fan, H.; Huang, Z.; Xiang, Y.; Yang, J.; He, L.; Zhang, L.; Yang, Y.; Li, R.; et al. Development and validation of deep learning algorithms for scoliosis screening using back images. Commun. Biol. 2019, 2, 390.
  24. Moke, L.; Severijns, P.; Schelfaut, S.; Van de Loock, K.; Hermans, L.; Molenaers, G.; Jonkers, I.; Scheys, L. Performance on Balance Evaluation Systems Test (BESTest) Impacts Health-Related Quality of Life in Adult Spinal Deformity Patients. Spine 2018, 43, 637–646.
  25. Brage, S.; Brage, N.; Franks, P.W.; Ekelund, U.; Wareham, N.J. Reliability and validity of the combined heart rate and movement sensor Actiheart. Eur. J. Clin. Nutr. 2005, 59, 561–570.
  26. Irawan, Y.; Fernando, Y.; Wahyuni, R. Detecting Heart Rate Using Pulse Sensor As Alternative Knowing Heart Condition. J. Appl. Eng. Technol. Sci. 2019, 1, 30–42.
  27. Maity, K.; Mondal, A.; Saha, M.C. Cellulose Nanocrystal-Based All-3D-Printed Pyro-Piezoelectric Nanogenerator for Hybrid Energy Harvesting and Self-Powered Cardiorespiratory Monitoring toward the Human–Machine Interface. ACS Appl. Mater. Interfaces 2023, 15, 13956–13970.
  28. Aly, A.H.; Zaky, Z.A.; Shalaby, A.S.; Ahmed, A.M.; Vigneswaran, D. Theoretical study of hybrid multifunctional one-dimensional photonic crystal as a flexible blood sugar sensor. Phys. Scr. 2020, 95, 035510.
  29. Wijaya, N.H.; Oktavihandani, Z.; Kunal, K.; Thelmy, E.; Nguyen, P.T. The Design of Tympani Thermometer Using Passive Infrared Sensor. J. Robot. Control 2020, 1, 27–30.
  30. Szumilas, M.; Władziński, M.; Wildner, K. A Coupled Piezoelectric Sensor for MMG-Based Human-Machine Interfaces. Sensors 2021, 21, 8380.
  31. Connolly, J.; Condell, J.; O’Flynn, B.; Sanchez, J.T.; Gardiner, P. IMU Sensor-based Electronic Goniometric Glove (iSEG-Glove) for clinical finger movement analysis. IEEE Sens. J. 2017, 18, 1273–1281.
  32. Kim, M.; Cho, J.; Lee, S.; Jung, Y. IMU Sensor-Based Hand Gesture Recognition for Human-Machine Interfaces. Sensors 2019, 19, 3827.
  33. Brabandere, A.D.; Emmerzaal, J.; Timmermans, A.; Jonkers, I.; Vanwanseele, B.; Davis, J. A Machine Learning Approach to Estimate Hip and Knee Joint Loading Using a Mobile Phone-Embedded IMU. Front. Bioeng. Biotechnol. 2020, 8, 320.
  34. Beange, K.H.E.; Chan, A.D.C.; Beaudette, S.M.; Graham, R.B. Concurrent validity of a wearable IMU for objective assessments of functional movement quality and control of the lumbar spine. J. Biomech. 2019, 97, 109356. [Google Scholar] [CrossRef]
  35. Lim, K.; Park, S. Prediction of Lower Limb Kinetics and Kinematics during Walking by a Single IMU on the Lower Back Using Machine Learning. Sensors 2019, 20, 130. [Google Scholar] [CrossRef] [Green Version]
  36. Kim, M.; Park, S. Golf Swing Segmentation from a Single IMU Using Machine Learning. Sensors 2020, 20, 4466. [Google Scholar] [CrossRef]
Figure 1. Schematic of the sensor strip with three IMUs, one battery, one microcontroller unit and eight retro-reflective markers.
Figure 2. Data collection from motion sets. First set of motions, covering eight directions of common human movements: (a) flexion and extension, (b) lateral flexion, (c) combined flexion and lateral flexion, (d) combined extension and lateral flexion, and (e) rotation.
Figure 3. Second set of motions, covering the spinal curvatures intermediate between those involved in the forward bending motions performed in Motion Set 1: (a) starting position, (b) first sweep, (c) first raising, (d) second sweep, (e) second raising, (f) third sweep, (g) third raising, (h) final sweep, and (i) standing posture.
Figure 4. The framework of the training scheme: (a) the inputs and output of the neural network during training; (b) the input and output of the neural network during inference.
Figure 5. Estimated spinal curvature of scoliosis patients by the neural network: (a) Subject #1, (b) Subject #2. The estimated marker positions are indicated by green dots, and the spinal curvature is drawn as a black line. The orientations of each marker are indicated by red (right, x-axis), green (up, y-axis), and blue (forward, z-axis), respectively.
Table 1. Demographic information of the participants.
Number of participants: 15
Number of male/female participants: 4/11
Height (cm) (Mean ± SD): 163.53 ± 7.74
Length of spine (cm) (C7 to S2) (Mean ± SD): 50.33 ± 3.96
Age (Mean ± SD): 25.40 ± 3.85
Table 2. Parameters of RMSProp optimiser.
Learning Rate | Rho | Momentum | Epochs | Batch Size
Table 3. Profile of participants chosen for neural network evaluation.
No. of Participants | Gender | Age (Years Old) | Height (cm) | Spine Length (cm)
Table 4. Cobb angles of patients measured from X-ray images and estimated spinal curves.
Subject Code | Measured Cobb Angle, X-Ray Image (°) | Measured Cobb Angle, Estimated Spinal Curve (°)
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Mak, T.H.A.; Liang, R.; Chim, T.W.; Yip, J. A Neural Network Approach for Inertial Measurement Unit-Based Estimation of Three-Dimensional Spinal Curvature. Sensors 2023, 23, 6122.
