1. Introduction
With recent advances in medical and health sciences and the accessibility of information about healthy living, awareness of the importance of well-being at work is rising. However, the stresses and commitments of modern living can hinder daily efforts towards better health and well-being. In particular, the increase in desk-bound work and the use of computers and hand-held devices such as smartphones and tablets have exacerbated the problems of sedentary lifestyles and poor sitting posture [1]. A systematic review based on accelerometry measurements from 11 large-scale population studies found that adults spend approximately 10 h a day, or 65–80% of the day, performing sedentary behaviors [2]. Common sedentary behaviors occur while working, commuting, and during leisure activities that require prolonged sitting. Slouching in particular has been termed “the new smoking” [3]. Slouching while sitting is a state in which the person’s posture is imbalanced forward or to the sides, combined with any of rounded shoulders, forward head posture, and an angled neck or lumbar spine. A vast body of research has shown that poor sitting posture and prolonged sitting lead to a wide range of physical health issues such as lower back pain, neck pain, headaches, respiratory, cardiovascular, and digestive issues, and an overall higher risk of disease and death [4,5,6,7]. They also contribute to multiple mental health issues such as poor mood, fatigue, low productivity, and depression [8,9]. Furthermore, the lockdown measures introduced by many governments in response to the COVID-19 pandemic led to a surge in working from home (WFH), which had a serious impact on the sedentary and sitting habits of remote workers. While many workplaces are well equipped with ergonomic chairs and desks, many home settings are far from ideal for prolonged sitting and working. Moretti et al. (2020) reported that this lack of ergonomic office furniture in work-from-home settings may be linked with poorer posture and the onset of musculoskeletal disorders (MSDs) [10].
The problems of poor sitting posture and prolonged sitting need to be addressed more rigorously to alleviate the impact of their associated health risks on individuals and their economic footprint in lost productivity and national health spending. To encourage the adoption of an upright posture, occupational health awareness programs often include incentives for workers to stand up, take short and frequent breaks, and perform regular stretching. Frequent postural transitions and regular stretching are important aspects of good posture awareness. Recent studies have shown that incorporating stretching exercises in the training programs of office workers is effective in preventing MSDs in the long term and in reducing pain and discomfort [11,12]. Stretching at the workplace has further been found to increase flexibility, prevent injuries due to muscle strain, and improve personal perception of attractiveness and confidence [13]. Therefore, to ensure the adoption of good sitting posture habits, it is important to actively correct poor sitting posture, reduce the time spent sitting and, in particular, the time spent in various slouching positions, and integrate frequent activity breaks and proper stretching. Evidently, there are many factors to actively keep track of, especially while performing other tasks that require focus, which is why we propose a solution using the LifeChair, an Internet of Things (IoT) cushion for real-time posture and activeness tracking.
Various systems have previously been proposed to monitor sitting posture and encourage an upright posture, using both passive approaches (ergonomics, materials, and fabrics) and active approaches (IoT and sensors). Passive solutions include ergonomic chairs, cushions, elastic bands, and foot rests. Active solutions track sitting posture and include smart cushions, wearable point trackers, and smartphone applications. Recent advances in Artificial Intelligence (AI) and ubiquitous sensing have highlighted the practicality and effectiveness of collecting and mining human-health-related data in real time for the assessment and improvement of human health and well-being [14]. Better sensing technologies and the large quantities of data they generate have facilitated the application of machine learning to detect and monitor various problems related to well-being, such as poor sitting posture. Real-time sitting posture recognition and prolonged sitting monitoring are challenging tasks that require accurate tracking of sitting posture and seated behavior. Sitting is a dynamic task with wide inter-individual variability in body characteristics and differences in working environments, sitting habits, and various other user-specific parameters, which current active posture tracking solutions have yet to address. Moreover, the lack of a standard source of sitting posture and seated behavior data hinders progress towards active and accurate sitting posture monitoring. Accurate posture tracking enables effective feedback for active posture correction and the promotion of good sedentary habits. The empowerment of human well-being through posture tracking and correction has important benefits in many domains, including the workplace, personal fitness, driver assistance, and entertainment.
In this work, we propose an active posture training solution based on a combination of machine learning and full-back pressure sensing using an IoT cushion called LifeChair for both sitting posture and seated stretch recognition. Our main contributions are as follows:
We designed an experimental setup for collecting real-world sitting posture and seated stretch pose data from a diverse participant group using a novel pressure sensing IoT cushion.
We built sitting posture and seated stretch databases comprising real-time user back pressure sensor data, using an active posture labeling method based on a biomechanics posture model and on user body characteristic data (BMI).
We applied and compared the performance of several machine learning classifiers in a sitting posture recognition task and achieved an accuracy of 98.82% in detecting 15 different sitting postures using an easily deployable machine learning algorithm, outperforming previous efforts in human sitting posture recognition, which targeted on average between five and seven sitting postures.
We applied and compared the performance of several machine learning classifiers in a seated stretch recognition task and achieved an accuracy of 97.94% in detecting six common physiotherapist-recommended chair-bound stretches, which have not been investigated in related works. While previous works focused on sitting posture recognition alone, we extend our method to include specific chair-bound stretches.
In the context of AI-powered device personalization, we show that user body mass index (BMI) is an important parameter to consider in sitting posture recognition and propose a novel strategy for a user-based optimization of the LifeChair system.
We also demonstrate the portability and adaptability of our machine-learning-based posture classification in five different environments and discuss deployment strategies for handling new environments, which previous works, focused on a single use case of their proposed systems, have not investigated. In addition, we demonstrate the impact of local sensor ablations on the performance of the machine learning models in sitting posture recognition.
We propose, to the best of our knowledge, the first posture data-driven stretch pose recommendation system for personalized well-being guidance.
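The classifier-comparison workflow described in the contributions above can be sketched as follows. This is a minimal illustration, not the LifeChair pipeline: the sensor count, the number of posture classes, the synthetic pressure patterns, and the appended BMI column are all assumptions made for demonstration with off-the-shelf scikit-learn models.

```python
# Hedged sketch of comparing classifiers on pressure-map feature vectors.
# All data here is synthetic; sensor count and class count are illustrative.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_sensors, n_postures, n_per_class = 16, 15, 40

# Each posture class gets its own mean pressure pattern plus noise.
X, y = [], []
for c in range(n_postures):
    center = rng.uniform(0.0, 1.0, n_sensors)
    X.append(center + 0.05 * rng.standard_normal((n_per_class, n_sensors)))
    y.append(np.full(n_per_class, c))
X, y = np.vstack(X), np.concatenate(y)

# Optional user feature: append BMI (random here) as an extra column.
bmi = rng.uniform(18.0, 32.0, len(X))[:, None]
X = np.hstack([X, bmi])

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

models = {
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "SVM (RBF)": SVC(kernel="rbf", gamma="scale"),
}
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
for name, acc in scores.items():
    print(f"{name}: {acc:.3f}")
```

A stratified split is used because class counts are balanced by construction; with real sensor data, per-user splits would be needed to test cross-user generalization.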
The rest of this paper is organized as follows:
Section 2 reviews the related works in sitting posture monitoring and machine-learning-based sitting posture recognition;
Section 3 details our proposed framework for intelligent posture training using machine learning for sitting posture and stretch pose recognition based on a pressure sensing IoT cushion;
Section 4 presents the results and discussion of our machine-learning-based sitting posture and stretch pose recognition in the context of the aforementioned contributions;
Section 5 concludes this paper with a summary and our future work.
2. Related Works
Posture monitoring and correction have received special attention in the last few years, and many different types of posture monitoring devices have been proposed. Broadly, two main types of posture monitoring devices can be found in the scientific literature and in the industry: passive posture training devices and active posture training devices. Passive posture training devices rely on ergonomics and materials, acting as chair add-ons that target a specific body part of the user to promote a healthy posture form. These include ergonomic chairs, cushions, elastic bands, and foot rests, for example, MTG’s Style, Better Back, and Backpod [15,16,17], the Embody Chair by Herman Miller, and ReGeneration by Knoll [18,19]. However, these passive posture training solutions have significant shortcomings due to their rigidity and lack of sensing capabilities. They also do not guarantee that users adopt a good posture, as users may still slouch while using them or sit for too long, unaware of their poor posture.
Active solutions aim to address these issues by tracking sitting posture with active components such as sensors and software. They include smart cushions, wearables, point trackers, cameras, robots, and smartphone applications. However, the active solutions available today share many shortcomings with the passive solutions, including limited sensing capabilities, inaccurate posture detection mechanisms, and ineffective feedback schemes. Cushion-type solutions such as Darma, Cushionware, and e-Cushion have been proposed to track sitting posture using a bottom-rest pressure sensing interface [20,21,22]. These solutions disregard many important aspects of posture training and focus on a basic indication of pressure balance as the user sits on them. They can also be uncomfortable and unreliable, since they easily shift out of their optimal position as the user sits on them throughout the day. Wearable posture solutions target body tilt and orientation using accelerometers or gyroscopes; examples include the Upright Go, Waiston, and Lumo Lift [23,24,25]. However, these solutions do not account for full-posture tracking and neglect critical postures such as forward head posture and rounded shoulders. They are also invasive and intrusive, often requiring direct skin contact or changes in appearance. Moreover, these active systems face challenges related to data quality, which hinders their scalable deployment and their integration with modern approaches to posture modeling and detection such as machine learning and edge computing.
Previous studies in human activity recognition (HAR) have explored the application of machine learning to human sitting posture recognition. Various types of data have been considered for applying machine learning techniques to detect a user’s sitting posture, including camera recordings, depth sensor data, accelerometer and gyroscope data, strain sensor data, and pressure sensing data. Computer vision for human pose detection is a well-established sub-domain of HAR, and many studies have proposed using depth image processing to capture a visual snapshot of the user’s front and machine learning to classify poses [26,27,28]. However, vision-based methods for posture recognition are limited by field-of-view constraints, interference and occlusion, sensitivity to lighting conditions, and motion artifacts, in addition to many issues relating to privacy invasion and user trust, which hinder widespread deployment. Wireless methods such as radio frequency identification (RFID) have also been used to detect passive sitting postures, but remain proof-of-concept solutions prone to inaccuracies in real-world scenarios, in addition to raising important privacy challenges [29].
In broader HAR studies, Anguita et al. (2012) used support vector machines (SVM) for activity recognition with accelerometer data from a mobile phone and achieved an accuracy of 89% [30]. Wu et al. (2012) found that k-nearest neighbors (k-NN) achieved the best accuracies in detecting different activities such as sitting, walking, and jogging, among others, based on iPod touch sensor data [31]. However, these studies target human poses that diverge significantly from each other, such as standing, sitting, stooping, kneeling, and lying down, and do not address finer sub-classes within each pose, such as different sitting postures. Cerqueira et al. (2020) used inertial data from IMUs mounted on a garment to detect six main static upper-body postures using various machine learning models including quadratic SVM, k-NN, and linear discriminant analysis (LDA) [32]. The postures targeted in this work do not include sitting postures and only represent broad tilts of the upper body or arm position. Earlier studies achieved fair accuracies in detecting sitting postures, ranging from 78% to 88%, using principal component analysis (PCA), hidden Markov models, and naive Bayes (NB) [33,34,35].
Table 1 summarizes the key recent works in sitting posture recognition using user sensing and machine learning. These works share many limitations, such as proof-of-concept designs not intended for real-world use and sensing interfaces that do not account for full-posture tracking. Furthermore, the machine learning models they apply for posture recognition are too computationally intensive for real-time deployment, and the datasets used to train their classifiers are limited in size and user group diversity. They also target a limited range of sitting postures that does not reflect the dynamicity of seated behavior.
Roh et al. (2018) used a low-cost system of four load cells mounted on the bottom rest of a chair [36]. They explored SVM, LDA, quadratic discriminant analysis (QDA), NB, and random forest, achieving an accuracy of 97.20% with an RBF-kernel SVM on a weight sensor dataset obtained from guided experiments.
Zemp et al. (2016) collected data from 20 pressure sensors mounted on a chair and its arm rests, in addition to accelerometers, gyroscopes, and magnetometers attached to the rear of the backrest [37]. They trained several machine learning classifiers, including SVMs, multinomial regression (MNR), NN, and random forest, on manually labeled sensor data obtained from guided experiments to detect seven sitting postures. Their best-performing algorithm was random forest boosted by bagging and ensemble techniques, achieving an accuracy of 90.9%.
A study by Ma et al. (2017) used 12 textile pressure sensors mounted on the bottom rest and backrest of a wheelchair and implemented J48 decision trees, SVM, multilayer perceptron (MLP), NB, and k-NN to classify five wheelchair-specific sitting postures [38]. They achieved an accuracy of 99.51% using J48 decision trees, but used more sensors than our study and detected only five wheelchair-specific postures.
Hu et al. (2020) used six flex sensors mounted on a regular chair and artificial neural networks (ANNs) implemented on a field programmable gate array (FPGA) to detect seven basic sitting postures, with an accuracy of 97.78% for the floating-point evaluation and 97.43% for the 9-bit fixed-point implementation [39].
Luna-Perejón et al. (2021) used force-sensitive resistors (FSRs) mounted on the bottom rest of a chair and ANNs to classify seven sitting postures with an accuracy of 81% [40].
Jeong et al. (2021) combined FSRs and distance sensors embedded in an office chair to detect 11 sitting postures using k-NN and achieved accuracies of 59%, 82%, and 92% using the pressure sensors only, the distance sensors only, and the mixed sensor system, respectively [41].
Farhani et al. (2022) used FSRs attached to the seat pan of a Formid dynamic chair and RF, SVM, and GDTs to classify seven basic sitting postures with an accuracy of around 90% [42].
Stretch pose detection has received little attention in the literature, especially with methods such as machine learning. Li et al. (2021) applied a badge-reel stretch sensor to detect spinal bending or stretching; however, this sensing interface is invasive and relies on a simple, rigid displacement-change computation to detect spine stretching [43].
Previous studies have pointed to a potential effect of user BMI on recognition performance when using pressure sensors to detect sitting posture [38,44]. However, none of these studies fully investigated the importance of BMI in posture recognition, and they showed conflicting results regarding its impact, as detailed in Section 4.2. We investigate the impact of taking BMI into account in posture recognition and discuss its importance for the LifeChair in Section 4.2.
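For reference, the BMI feature itself is computed with the standard WHO formula (weight in kilograms divided by height in meters squared). The snippet below is a plain illustration of that input feature and the WHO adult bands, not of how BMI is integrated into the LifeChair models, which is described in Section 4.2.

```python
# Standard BMI computation and WHO adult categories (illustrative only).
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height squared (m^2)."""
    return weight_kg / (height_m ** 2)

def bmi_category(value: float) -> str:
    """WHO adult BMI bands."""
    if value < 18.5:
        return "underweight"
    if value < 25.0:
        return "normal"
    if value < 30.0:
        return "overweight"
    return "obese"

print(round(bmi(70.0, 1.75), 1))      # 70 / 1.75^2 = 22.9
print(bmi_category(bmi(70.0, 1.75)))  # normal
```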
Notably, our proposed system for posture monitoring outperforms the works discussed above and is based on a sensing interface that is not fixed: it is non-invasive, portable, and lightweight, and can be fitted to various chairs for active posture recognition. Our machine learning models are built on data obtained from a biomechanics-based model that covers all areas of the user’s back, including the shoulders, lumbar regions, center of the back, and bottom of the neck. This accounts for critical points of posture monitoring such as vertical and horizontal pressure symmetry, shoulder contact, and lumbar contact, in addition to neck and head position, which have been neglected to varying degrees in related works. Furthermore, the type of data collected by our system is directly relevant to many aspects of user well-being, from posture to behavior, and is suitable for the design of posture-data-driven tools for improving posture habits, such as personalized stretch recommendation systems.