Article

Validation of an Algorithm for Measurement of Sedentary Behaviour in Community-Dwelling Older Adults

1 School of Population Health, Faculty of Medical and Health Sciences, University of Auckland, Auckland 1023, New Zealand
2 Translational and Clinical Research Institute, Faculty of Medical Sciences, Newcastle University, Newcastle upon Tyne NE2 4HH, UK
3 Janssen Research & Development, High Wycombe HP12 4EG, UK
4 School of Clinical Sciences, Auckland University of Technology, Auckland 1010, New Zealand
5 National Institute for Health and Care Research (NIHR), Newcastle Biomedical Research Centre (BRC), Newcastle University, The Newcastle upon Tyne Hospitals NHS Foundation Trust, Newcastle upon Tyne NE2 4HH, UK
* Author to whom correspondence should be addressed.
Sensors 2023, 23(10), 4605; https://doi.org/10.3390/s23104605
Submission received: 9 April 2023 / Revised: 28 April 2023 / Accepted: 8 May 2023 / Published: 9 May 2023
(This article belongs to the Special Issue Wearable Sensors and Mobile Apps in Human Health Monitoring)

Abstract

Accurate measurement of sedentary behaviour in older adults is informative and relevant. Yet, activities such as sitting are not accurately distinguished from non-sedentary activities (e.g., upright activities), especially in real-world conditions. This study examines the accuracy of a novel algorithm to identify sitting, lying, and upright activities in community-dwelling older people in real-world conditions. Eighteen older adults wore a single triaxial accelerometer with an onboard triaxial gyroscope on their lower back and performed a range of scripted and non-scripted activities in their homes/retirement villages whilst being videoed. A novel algorithm was developed to identify sitting, lying, and upright activities. The algorithm’s sensitivity, specificity, positive predictive value, and negative predictive value for identifying scripted sitting activities ranged from 76.9% to 94.8%. For scripted lying activities: 70.4% to 95.7%. For scripted upright activities: 75.9% to 93.1%. For non-scripted sitting activities: 92.3% to 99.5%. No non-scripted lying activities were captured. For non-scripted upright activities: 94.3% to 99.5%. The algorithm could, at worst, overestimate or underestimate sedentary behaviour bouts by ±40 s, which is within a 5% error for sedentary behaviour bouts. These results indicate good to excellent agreement for the novel algorithm, providing a valid measure of sedentary behaviour in community-dwelling older adults.

1. Introduction

Sedentary behaviour (SB) is defined as “any waking activity characterised by an energy expenditure ≤ 1.5 metabolic equivalents (METs), whilst in a sitting, reclining or lying posture” [1]. SB is distinct from physical inactivity and is differentially associated with health risks. High levels of SB are unfavourably related to cognitive function, depression, functional status, and disability in older adults [2,3,4]. For example, increased sitting duration, especially if associated with more screen time (e.g., watching television or use of mobile phones, but not computer or internet use time [2]), is detrimental to sleep health [5,6] and social connectedness [7], which could increase the risk of disability, loneliness, and depression in adults [8]. Traditional methods of quantifying SB (e.g., questionnaires and diaries) have the potential for inaccuracy and inherent bias (e.g., recall bias) [9]; thus, the use of wearable devices (wearables) to objectively quantify SB is a welcome advancement in the field.
There has been a dramatic increase in studies that employ wearables to investigate SB, including some large-scale longitudinal population studies [10,11]. Wearable devices allow objective, continuous, and unobtrusive tracking of movement and posture and provide refined and accurate data on sedentary activities [12]. However, only a limited number of studies have investigated the validity of accelerometry-based algorithms for SB in older populations, who are the most sedentary of all age groups [13,14]. Of the 15 studies reported, only 7 investigated the accuracy of SB detection in real-world environments [14]. Studies employing machine learning techniques fared better than those relying on other techniques in detecting real-world SB, but more rigorous field-based research is still warranted [14,15].
Furthermore, algorithms designed to identify SB are limited in scope and, overall, perform inconsistently [16]. For example, discerning sitting from standing is problematic [17,18]. Studies that employ multiple sensor configurations report better accuracy [19,20,21], but such configurations increase the wearability burden, especially if used for longer periods of time. Studies that employ a single wearable device usually use the thigh as the preferred site (e.g., [22,23]), given that the wrist is preferred for monitoring physical activities [24]. However, a single wearable limits the ability to accurately distinguish sedentary activities such as sitting from lying (e.g., afternoon napping, which is common in older adults), both of which are important in the case of older adults [17,25]. In addition, algorithms based on machine learning and artificial intelligence techniques that rely solely on fixed cut-off points (primarily based on activity counts/step counts or METs) to classify SB are usually difficult to generalise to a population the algorithm was not trained on [26], limiting their widespread utility.
Accurate and reliable measurement of sedentary behaviour in older adults is informative and relevant and will allow us to plan appropriate intervention strategies. A single open-source, accelerometer-based wearable—the Axivity monitor—attached to the lower back has been recently validated to detect a comprehensive battery of real-world gait characteristics in older adults [27,28]. Whether the same configuration can be used to detect SB remains to be investigated. In this study, we developed an algorithm that uses specific characteristics of the participants to detect SB.
Thus, the main objective of this study was to validate the performance of a customised algorithm based on a single wearable device placed at the L5 position of the lower back (chosen to increase usability and acceptability) to identify SB and to discriminate its domains (sitting versus lying versus upright) in community-dwelling older people aged 75 years and above, in real-world conditions.

2. Materials and Methods

2.1. Participants

This study was embedded in the Ageing Well Through Eating, Sleeping, Socialising and Mobile (AWESSoM) study [10]. Older adults participating in AWESSoM were invited to take part, alongside participants who met the following inclusion criteria but did not enrol in AWESSoM. Inclusion criteria were (1) age of 75 years or over; (2) able to ambulate a minimum of 15 m independently, with or without walking aids; (3) able to stand, with or without walking aids, for a minimum of 60 s. Exclusion criteria were (1) any significant medical, orthopaedic, or neurological conditions that would contraindicate normal activity; (2) allergy to surgical adhesive tape. All subjects gave their informed consent for inclusion before they participated in this study. This study was conducted in accordance with the Declaration of Helsinki, and the protocol was approved by the New Zealand Ministry of Health and Disability Ethics Committee (2021 AM 9955).

2.2. Experimental Protocol

The Axivity monitor (AX6) is a wearable device incorporating a triaxial accelerometer and gyroscope, with a sampling frequency of 100 Hz, an accelerometer range of ±8 g, and a gyroscope range of 2000 degrees per second (dps). It is firmly established as a robust single-wearable device, used extensively to measure continuous real-world activity across the age range (e.g., younger adults [29], older adults [27]). For this study, we secured the AX6 onto the lower back at the fifth lumbar vertebra (L5) of each participant, using a hydrogel adhesive covered with a surgical-grade adhesive dressing (OPSITE Flexifix™ or Hypafix™, Smith+Nephew Ltd., Watford, UK). A handheld tablet (Galaxy Tab A, SM-P555, Samsung; sampling frequency: 30 FPS (frames per second); resolution: 1280 × 720) was used to video all movements of the participants. The AX6 and the handheld tablet were time synchronised to network time (https://nist.time.gov/) via a laptop connected to the internet, with both devices connected to the laptop using their respective USB cables.

2.3. Procedure

Both scripted and non-scripted sedentary activities (sitting and lying bouts) were defined based on prior research [30,31,32]. Participants undertook these activities in their own homes or retirement villages (Supplementary Figure S1). Both tasks were video recorded by a research assistant using a handheld tablet; recording was restricted to the trunk and lower limbs, and recognisable features (e.g., the face) were avoided.

2.3.1. Scripted Activities

To indicate the start of the scripted activities and for synchronisation purposes, the AX6 was tapped by the research assistant three times at approximately one-second intervals. Participants then completed the following activities sequentially: (a) from a standing position, sit on a lounge/sofa chair for approximately one minute; (b) stand up and walk at a comfortable pace (with or without walking aids) to the dining area and sit on a dining chair for one minute; (c) stand up, walk to the bedroom, and lie on their back on the bed for one minute; (d) sit up on the edge of the bed for approximately three seconds, then stand up (with or without support) beside the bed for one minute; (e) walk towards the dining area; (f) when about to reach the dining area, return to the bedroom; (g) when about to reach the bedroom, return to the lounge area; (h) sit down on the lounge chair for one minute; (i) stand up and stand still for one minute. The AX6 was then tapped by the research assistant three times at approximately one-second intervals. This completed the scripted activities. Participants were asked to rest before performing the non-scripted activities.

2.3.2. Non-Scripted Activities

Participants were then instructed to continue their activities as normal for up to eight minutes. They were requested to avoid sitting or lying for too long during this period. To indicate the end of the non-scripted activities, they were instructed to return to their lounge area, sit on their lounge chair for approximately 60 s, and thereafter stand up. The AX6 was then tapped by the research assistant three times at approximately one-second intervals for synchronisation purposes. This completed the non-scripted activities.

2.3.3. Data Management

Data from the wearable device were downloaded to a computer using the OmGui software (Version 1.0.0.43, Open Movement, Newcastle, UK). Data segments selected from the start and end timings of the scripted and non-scripted activities, respectively, were exported as raw comma-separated values (CSV) files, with the timestamps option set to “Fractional days (MATLAB)”. The data were then resampled to 100 Hz using piecewise cubic Hermite interpolating polynomial (pchip) interpolation in MATLAB (R2022a) to address real-time clock drift within the AX6. The video and the resampled data were frame synchronised using the ELAN software (Version 6.2, Nijmegen: Max Planck Institute for Psycholinguistics, The Language Archive) by identifying the exact start frame of the first tap on the AX6 (see Section 2.3.1 and Section 2.3.2). Participants’ activities (see Table 1) were coded from the video recordings by an observer (KAJ). From the coded information, the durations of sitting, lying, and upright activities were calculated based on the start and end frame of each activity.
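As a rough illustration of this resampling step, the following MATLAB sketch converts the exported fractional-day timestamps to seconds and interpolates the triaxial signal onto a uniform 100 Hz time base with pchip. The file name, column order, and variable names are assumptions for illustration only and are not taken from the study’s code.

% Hypothetical export: columns assumed to be [timestamp (fractional days), ax, ay, az, gx, gy, gz].
raw   = readmatrix('ax6_export.csv');          % hypothetical file name
tDays = raw(:, 1);                             % timestamps in fractional days (MATLAB datenum)
tSec  = (tDays - tDays(1)) * 24 * 3600;        % elapsed time in seconds
fs    = 100;                                   % target sampling frequency (Hz)
tUniform = (0:1/fs:tSec(end))';                % uniform 100 Hz time base

% Piecewise cubic Hermite (pchip) interpolation onto the uniform time base,
% compensating for real-time clock drift in the recorded samples.
accResampled = interp1(tSec, raw(:, 2:4), tUniform, 'pchip');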

2.3.4. Algorithm Implementation

The algorithm used in this study is described in Algorithm 1 (see Figure 1 for flow). The key phases of the algorithm are data preparation, classification, and detection. The offset (mean acceleration) was first removed from the raw triaxial accelerometry data, which were then passed through a second-order low-pass Butterworth filter (applied in two passes) with a cut-off frequency of 17 Hz [33]. A moving window of 0.1 s (i.e., 10 data samples) was then used [34] to identify upright activities based on whether the participant was a more “upright” (likely to spend more time in upright activities) or less “upright” (likely to spend more time in sitting and lying activities) candidate. Mediolateral and anteroposterior tilt thresholds were estimated based on upright versus non-upright postures. The vertical tilt threshold was estimated from earlier studies on gait and sit-to-stand movements in older adults [35,36,37,38]. The start and end of each potential upright bout were then identified and stored. Thereafter, appropriate thresholds were applied to the filtered anterior–posterior tilt angles to confirm the upright bouts. The next part of the algorithm identified the activities between any two consecutive upright bouts. For this, we assumed that the filtered mean anterior–posterior tilt angle of any lying activity should be at least 2.5 times lower than that of the preceding upright bout. If this was not the case, we checked whether the former was less than the filtered mean anterior–posterior tilt angle of the preceding upright bout; if so, the current non-upright bout was classified as a sitting bout, otherwise as an upright bout.
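Before the formal pseudocode below, the data-preparation phase can be illustrated with the following simplified MATLAB sketch. It assumes accResampled is the 100 Hz triaxial signal [ax ay az] from the previous step; the variable names are ours, the moving standard deviation approximates the per-0.1 s standard deviation of Algorithm 1, and tilt angles are computed sample-wise rather than per window.

fs  = 100;                                       % sampling frequency (Hz)
win = 0.1 * fs;                                  % 0.1 s window (10 samples)

% Offset removal and two-pass (zero-phase) 2nd-order low-pass Butterworth filtering at 17 Hz.
accZero = accResampled - mean(accResampled, 1);
[b, a]  = butter(2, 17/(fs/2), 'low');
accF    = filtfilt(b, a, accZero);

% Movement variability: summed moving standard deviation of the filtered axes.
stdSum = movstd(accF(:,1), win) + movstd(accF(:,2), win) + movstd(accF(:,3), win);

% Tilt angles (degrees) from the gravity-containing signal, as in Algorithm 1.
vecNorm = sqrt(sum(accResampled.^2, 2));
tiltVT  = acosd(accResampled(:,1) ./ vecNorm);   % vertical
tiltML  = acosd(accResampled(:,2) ./ vecNorm);   % mediolateral
tiltAP  = acosd(accResampled(:,3) ./ vecNorm);   % anterior-posterior

% Smooth the variability (1 Hz) and tilt angles (0.25 Hz) as in Algorithm 1.
[bs, as] = butter(2, 1/(fs/2), 'low');
stdSumF  = filtfilt(bs, as, stdSum);
[bt, at] = butter(2, 0.25/(fs/2), 'low');
tiltVTf  = filtfilt(bt, at, tiltVT);
tiltMLf  = filtfilt(bt, at, tiltML);
tiltAPf  = filtfilt(bt, at, tiltAP);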
Algorithm 1 Pseudocode for sitting, lying, and upright bouts
Data: acc = [ax, ay, az] 1
axfiltered ← butterworth (ax, order = 2, cutoff = 17 Hz)
ayfiltered ← butterworth (ay, order = 2, cutoff = 17 Hz)
azfiltered ← butterworth (az, order = 2, cutoff = 17 Hz)
for (every 0.1 s)
std_axfiltered ← stdev(axfiltered)
std_ayfiltered ← stdev(ayfiltered)
std_azfiltered ← stdev(azfiltered)
tilt_angle_VT ← arccos(ax / √(ax² + ay² + az²)) · (180/π)
tilt_angle_ML ← arccos(ay / √(ax² + ay² + az²)) · (180/π)
tilt_angle_AP ← arccos(az / √(ax² + ay² + az²)) · (180/π)
end
std_sum ← std_axfiltered + std_ayfiltered + std_azfiltered
std_sumfiltered ← butterworth (std_sum, order = 2, cutoff = 1 Hz)
tilt_angle_VTfiltered ← butterworth (tilt_angle_VT, order = 2, cutoff = 0.25 Hz)
tilt_angle_MLfiltered ← butterworth (tilt_angle_ML, order = 2, cutoff = 0.25 Hz)
tilt_angle_APfiltered ← butterworth (tilt_angle_AP, order = 2, cutoff = 0.25 Hz)
for (every 0.1 s)
    create empty array to store upright movement
end
if ceiling(mean(tilt_angle_VTfiltered)) ≥ 150 2
    if ceiling(mean(tilt_angle_MLfiltered)) ≥ 90 2
       if ceiling(mean(tilt_angle_APfiltered)) ≥ 90 2
         for (every 0.1 s)
           if std_sumfiltered ≥ mean(std_sumfiltered)
              assign 1 to the array 3
            end
         end
       end
    end
else if tilt_angle_VTfiltered ≥ 140 2 and tilt_angle_APfiltered ≥ 75 2
     for (every 0.1 s)
       assign 1 to the array 3
    end
end
find start_frame and end_frame of potential upright bouts and store in an array
Result: MoveArray [start_frame, end_frame] 4
for every two consecutive potential upright bouts
    if mean(tilt_angle_APfiltered) of current potential upright bout < 40 5
       label current potential upright bout as “Lying”
    else if mean(tilt_angle_APfiltered) of current potential upright bout < 80 5
       label current potential upright bout as “Sitting”
    else
       label current potential upright bout as “Upright”
    end
    if mean(tilt_angle_APfiltered) of current non-upright bout < mean(tilt_angle_APfiltered)/2.5 of preceding upright bout
       label current non-upright bout as “Lying”
    else if mean(tilt_angle_APfiltered) of current non-upright bout < mean(tilt_angle_APfiltered) of preceding upright bout
       label current non-upright bout as “Sitting”
    else
       label current non-upright bout as “Upright”
    end
end
Result: Data_Label = [array of labelled bouts]
In Algorithm 1: 1: ax = vertical axis, ay = mediolateral axis, az = anterior–posterior axis. 2: These thresholds were estimated based on earlier studies [35,36,37,38]. 3: “1” indicates “upright”. 4: Array with the start and end frame numbers of each “upright” movement; the interval from the end_frame of the current upright bout to the start_frame of the next upright bout was considered a “non-upright” bout. 5: These thresholds were estimated based on the whole dataset.
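A condensed MATLAB sketch of the classification phase is given below. It continues from the data-preparation sketch above, mirrors the thresholds in Algorithm 1, and simplifies the bout bookkeeping; the variable names are ours and edge cases (e.g., empty gaps between bouts) are only minimally handled.

% Candidate upright samples, following the two branches of Algorithm 1.
if ceil(mean(tiltVTf)) >= 150 && ceil(mean(tiltMLf)) >= 90 && ceil(mean(tiltAPf)) >= 90
    % Participant likely to spend more time upright: use movement variability.
    isUpright = stdSumF >= mean(stdSumF);
else
    % Otherwise rely on vertical and anterior-posterior tilt thresholds.
    isUpright = tiltVTf >= 140 & tiltAPf >= 75;
end

% Start and end samples of each potential upright bout.
d        = diff([0; isUpright(:); 0]);
startIdx = find(d == 1);
endIdx   = find(d == -1) - 1;

boutLabel = strings(numel(startIdx), 1);   % label of each potential upright bout
gapLabel  = strings(numel(startIdx), 1);   % label of the gap after each bout (last entry unused)

for k = 1:numel(startIdx)
    apBout = mean(tiltAPf(startIdx(k):endIdx(k)));
    if apBout < 40
        boutLabel(k) = "Lying";
    elseif apBout < 80
        boutLabel(k) = "Sitting";
    else
        boutLabel(k) = "Upright";
    end
    % Classify the non-upright bout between this bout and the next one.
    if k < numel(startIdx)
        gap = endIdx(k)+1 : startIdx(k+1)-1;
        if isempty(gap)
            continue
        end
        apGap = mean(tiltAPf(gap));
        if apGap < apBout / 2.5
            gapLabel(k) = "Lying";
        elseif apGap < apBout
            gapLabel(k) = "Sitting";
        else
            gapLabel(k) = "Upright";
        end
    end
end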

2.3.5. Data Analysis

To determine inter-rater reliability, ten video recordings were randomly selected, and the start and end frames of sitting, lying, and upright activities (see Table 1 for definitions) in both scripted and non-scripted activities were independently annotated using the ELAN software by two investigators (SL and KAJ). The results were presented as an intra-class correlation coefficient (two-way random, absolute agreement) [ICC(2,1)]. Linear relationships between the durations of activities derived from the algorithm and from the observer were also investigated using ICC(2,1) to establish levels of agreement. Criterion validity between the analysed accelerometer data and the corresponding video observations (considered the “gold standard”), at a time resolution of 0.01 s (based on the 100 Hz sampling frequency of the AX6), was assessed using sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV). These measures are defined as follows:
Sensitivity = TP / (TP + FN),
Specificity = TN / (TN + FP),
Positive predictive value (PPV) = TP / (TP + FP),
Negative predictive value (NPV) = TN / (TN + FN).
True positives (TP), true negatives (TN), false positives (FP), and false negatives (FN) are illustrated in Figure 2 below. Sensitivity describes how well the algorithm correctly identifies each observed category of activities (i.e., sitting, lying, and upright). Specificity describes how well the algorithm correctly identifies the absence of each observed category of activities (i.e., not sitting, not lying, and not upright). PPV describes the probability that an activity identified as present by the algorithm was actually present. NPV describes the probability that an activity identified as absent by the algorithm was actually absent.
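As a small worked example, the four measures can be computed in MATLAB from per-window labels as follows; the label vectors below are hypothetical and serve only to make the calculation concrete.

% Hypothetical per-window labels (reference = video annotation, predicted = algorithm).
reference = ["Sitting" "Sitting" "Upright" "Upright" "Sitting" "Lying"]';
predicted = ["Sitting" "Upright" "Upright" "Upright" "Sitting" "Lying"]';

category = "Sitting";                                        % category being evaluated
TP = sum(predicted == category & reference == category);     % 2
TN = sum(predicted ~= category & reference ~= category);     % 3
FP = sum(predicted == category & reference ~= category);     % 0
FN = sum(predicted ~= category & reference == category);     % 1

sensitivity = TP / (TP + FN);   % 0.67
specificity = TN / (TN + FP);   % 1.00
ppv         = TP / (TP + FP);   % 1.00
npv         = TN / (TN + FN);   % 0.75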
Bland–Altman plots were used to investigate the limits of agreement for the total duration of each activity [39]. The absolute error (AE) of each activity was calculated as the absolute difference between the algorithm-derived and video-observed durations, and the absolute percentage error (APE) expressed this difference as a percentage of the video-observed duration. Statistical and graphical analyses were performed in R Studio (Version 3.6.1).
AE = (1/n) Σ_{t=1}^{n} |R_t − A_t|,
APE = (100%/n) Σ_{t=1}^{n} (|R_t − A_t| / R_t),
where R_t is the duration of each individual activity based on the reference (video observation), A_t is the duration of the same activity based on the algorithm, and n is the number of bouts.
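For completeness, a minimal MATLAB sketch of AE, APE, and the Bland–Altman bias and limits of agreement is shown below; the per-bout durations are hypothetical, and in the study these analyses were performed in R.

% Hypothetical per-bout durations (seconds) for one activity type.
R = [60.2; 58.7; 61.5];                     % reference (video observation)
A = [57.4; 59.9; 55.0];                     % algorithm
n = numel(R);

AE  = sum(abs(R - A)) / n;                  % absolute error (s)
APE = (100 / n) * sum(abs(R - A) ./ R);     % absolute percentage error (%)

% Bland-Altman statistics: bias (mean difference) and 95% limits of agreement.
d    = A - R;
bias = mean(d);
loa  = bias + 1.96 * std(d) * [-1, 1];      % [lower, upper] limits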

3. Results

3.1. Participants

A total of twenty older adults participated in this study. Of these, nineteen were also part of the AWESSoM study. All participants completed both scripted and non-scripted activities. Data for two participants could not be processed because of synchronisation and scripted-task errors. The mean ± SD age of the remaining 18 participants was 81.1 ± 6.2 years, and more than 60% were female (Table 2). A total of 293.27 min of activity data (scripted: 89.59 min, non-scripted: 203.68 min) was analysed. No lying activities were captured during the non-scripted activities (Table 3).

3.2. Inter-Rater Reliability

Inter-rater reliability, ICC(2,1), for both investigators (SL, KAJ) was calculated based on nine videos because one of the videos had synchronisation issues. The ICC(2,1) for sitting, lying, and upright activities was 0.999, 0.985, and 0.999, respectively. The intra-class correlation between the algorithm and the observer was good to excellent for all activities. The ICC(2,1) [scripted, non-scripted] for sitting activities was 0.888 and 0.981, for upright activities it was 0.946 and 0.997, and for scripted lying activities it was 0.858 [40] (Table 4).

3.3. Criterion Validity

Table 5 shows the sensitivity, specificity, PPV, and NPV for both scripted and non-scripted activities. Sensitivity, specificity, PPV, and NPV for scripted activities were lower than for non-scripted activities. Sensitivity was lowest (70.39%) for scripted lying activity and highest (96.94%) for non-scripted upright activity. Specificity was high (≥90%) for both scripted and non-scripted activities. The algorithm correctly identified sitting, lying, and upright activities with a probability of ≥83% for scripted and ≥95% for non-scripted activities (based on PPV). It identified non-sitting and non-upright non-scripted activities with probabilities above 90% (based on NPV). The corresponding probabilities were lower for scripted activities, especially for scripted upright activities (NPV of 75.89%).

3.4. Limits of Agreement

Bland–Altman plots are shown in Figure 3. The absolute mean difference (bias) between the algorithm and the video annotation was less than 10 s for all activities (range: 0.75 s to 9.61 s). The algorithm overestimated scripted lying activities by 9.61 s. It underestimated sitting by more than 3 s in both scripted and non-scripted conditions. The limits of agreement were widest for (individual) non-scripted sitting activities (−30.49 s to 23.27 s) and narrowest for scripted upright activities (−12.73 s to 11.12 s). The absolute percentage errors were relatively low (≤16.1%) for sitting and upright activities but not for lying activities (22.4%) (Table 6).

4. Discussion

The main goal of this study was to validate the real-world performance of a novel customised algorithm for identifying SB (sitting, lying, and upright activities) in community-dwelling adults aged 75 years and over. The PPV (>80%) and NPV (>75%) indicated good agreement between the algorithm and video observations for all activities, although the algorithm generally fared better in non-scripted activities than scripted activities. The limits of agreement (Bland–Altman plots) suggested that the algorithm could, at worst, overestimate or underestimate sitting, lying, or upright activities by ±40 s, which is within a 5% error for the average duration of bouts for sitting [41], lying (daytime napping) [42], and upright [43] in generally healthy community-dwelling older adults.
The PPV for all three activities surpassed that reported by Dijkstra et al. [44] (Table 7) by at least 10%. Although the current algorithm performed better in detecting sitting and upright activities compared to those reported by Taylor et al., it compared unfavourably for lying activities [17]. This could be due to the difference in age. Taylor et al. investigated an older age group (88.1 ± 5.0 years) which included long-term care participants, and the duration and number of occurrences of lying activities were greater than in the current study [17]. Other notable differences between the present study and that of Taylor et al. were that their real-world tasks included lying as a prescribed activity. In addition, the sensor used in their study (DynaPort MoveMonitor, McRoberts, The Hague, the Netherlands) differed from ours. However, similar to our results, Taylor et al. also reported that their algorithm performed better in non-scripted activities than in scripted activities. This could be due to the lower number of transitions and longer duration of activities within the non-scripted activities compared to the scripted activities [17].
The current algorithm emphasises the posture of the trunk rather than the intensity of movement (i.e., raw accelerometry data) and looks for SB between two identified upright activities. The algorithm uses accelerometry data from the whole dataset to estimate the mean tilt angles (vertical, anterior–posterior, medial–lateral) and, based on these angles and their respective (fixed) thresholds, classifies the participant as “more likely to spend more time upright” or “less likely to spend more time upright”. If the participant is “more likely to spend more time upright”, it then uses the standard deviation of the triaxial acceleration (signal vector magnitude) [29] to classify the upright activity. Otherwise, it uses fixed thresholds to classify upright activities based on the vertical and anterior–posterior tilt angles alone. Because the algorithm for the “more likely to spend more time upright” scenario does not incorporate postural information, it is more sensitive to gait but less so to upright standing activities, which are at times misclassified. The other issue with the current algorithm is that it overestimates lying durations. The algorithm includes postural transitions (i.e., stand to sit, sit to lie, lie to sit, and sit to stand) within the duration of lying activities, which inflates the actual durations of lying activities. All thresholds for this study were tuned to improve the detection of sitting activities rather than lying activities because we anticipated more bouts of sitting activities for this cohort of older adults. This could be a plausible reason why the current algorithm failed to perform well in detecting lying bouts when compared to earlier studies [17,18].
The configuration purposefully adopted in this study used only the lower back with a single wearable to minimise the wearability burden on its users. Although the main objective of this study was to quantify SB, information on physical activity (PA), including gait and turning, is important for understanding change in functional decline in older adults. The lower back and the hip are recommended for gait-related activities as these locations are closest to the centre of gravity [45]. Even with this limitation, the current algorithm generally fared better than previously published algorithms with similar configurations. The key improvement in the current algorithm is the semi-adaptive approach to understanding the user. It tries to classify the user based on the amount of time spent in upright versus non-upright activities. This step helps the algorithm to use an appropriate threshold (standard deviation of accelerations versus tilt angles) to better identify upright bouts. Furthermore, the algorithm also uses the participant’s own postural information to estimate the thresholds for tilt angles, rather than fixed thresholds, to classify sitting and lying bouts.

Study Limitations

We acknowledge that our reference (video observation), although considered the “gold standard”, is still prone to subjectivity. More costly but better alternatives, such as optical-based systems, are available. The current algorithm relied mainly on tilt angles and accelerometry data to classify the activities. It also used fixed thresholds (in addition to customised thresholds) to differentiate activities. Newer research-grade wearables have an in-built triaxial gyroscope that provides additional (trunk) rotational information about the participant and may therefore classify sedentary and non-sedentary activities better. Accurately identifying key events and postural transitions, such as the initiation of gait or the start and end of a sit-to-stand transition, may inform the precise timing of when one activity ends and a new activity begins. Furthermore, postural transitions, although considerably shorter than sitting or lying bouts, were not identified as separate activities in this study. These factors have limited the ability and performance of the algorithm. Some newer algorithms incorporate machine learning and artificial intelligence to improve their accuracy [20,21], although they might lack the generalisability needed for wider populations. Such approaches could be used to estimate customised thresholds and to use additional signal-related features, not only to identify SB and PA but also to accurately classify the key postural transitions.

5. Conclusions

This study investigated the ability of a semi-personalised algorithm to identify SB and to discriminate sitting, lying, and upright activities. This was conducted in real-world conditions with minimal experimental setup and constraints. The importance of measuring SB in addition to PA in older adults is well recognised, but accurate and reliable measurement in daily life is challenging. The current algorithm provides a valid measure for identifying SB in community-dwelling older adults in real-world conditions, which could give researchers in this field a better and clearer understanding of the role SB plays in healthy ageing. However, the algorithm’s accuracy, especially for lying activities, could be improved if postural transitions were separately classified.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/s23104605/s1, Supplementary Figure S1: Flow diagram—scripted and non-scripted activities.

Author Contributions

Conceptualisation, K.A.J., N.K., S.D.D., S.L. and R.T.; methodology, K.A.J.; software, J.S., K.A.J. and R.Z.U.R.; validation, J.S., K.A.J., R.Z.U.R. and S.L.; data analysis, K.A.J.; investigation, K.A.J., J.S., R.Z.U.R., S.D.D. and S.L.; resources, N.K. and R.T.; data curation, N.K. and R.T.; writing—original draft preparation, K.A.J., N.K., S.D.D., S.L. and R.T.; writing—review and editing, J.S., K.A.J., N.K., R.Z.U.R., S.D.D., S.L. and R.T.; visualisation, K.A.J.; supervision, N.K., S.D.D., S.L. and R.T.; project administration, N.K., S.D.D., S.L. and R.T.; funding acquisition, N.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Ageing Well National Science Challenge, New Zealand. NK was supported as the Joyce Cook Chair in Ageing Well funded by the Joyce Cook Family. KAJ was financially supported by HOPE Foundation for Research on Ageing Scholarship. SDD was supported by the Mobilise-D project that has received funding from the Innovative Medicines Initiative 2 Joint Undertaking (JU) under grant agreement No. 820820. This JU receives support from the European Union’s Horizon 2020 research and innovation programme and the European Federation of Pharmaceutical Industries and Associations (EFPIA). SDD was also supported by the Innovative Medicines Initiative 2 Joint Undertaking (IMI2 JU) project IDEA-FAST—Grant Agreement 853981. SDD was supported by the National Institute for Health Research (NIHR) Newcastle Biomedical Research Centre (BRC) based at Newcastle upon Tyne Hospital NHS Foundation Trust and Newcastle University, and by the NIHR/Wellcome Trust Clinical Research Facility (CRF) infrastructure based at Newcastle Upon Tyne Hospital NHS Foundation Trust, Newcastle University and the Cumbria, Northumberland and Tyne and Wear (CNTW) NHS Foundation Trust. All opinions are those of the authors and not the funders.

Institutional Review Board Statement

Ethics approval was granted by the New Zealand Ministry of Health and Disability Ethics Committee (2021 AM 9955).

Informed Consent Statement

Informed written consent was obtained from all participants involved in this study.

Data Availability Statement

Data will be available upon reasonable request by contacting the principal investigator of the study, Professor Ngaire Kerse.

Acknowledgments

We want to acknowledge and thank the contributions of Karen Campbell, Christin Fuchs, Kirsten van Dommelen, Jo Parsons, Lynne Taylor, and Jochen Klenk towards this project. We would also like to acknowledge and thank all the participants who contributed their valuable time to this work.

Conflicts of Interest

SDD reports consultancy activity with Hoffman La Roche Ltd. outside of this study.

References

  1. Tremblay, M.S.; Aubert, S.; Barnes, J.D.; Saunders, T.J.; Carson, V.; Latimer-Cheung, A.E.; Chastin, S.F.; Altenburg, T.M.; Chinapaw, M.J. Sedentary behavior research network (SBRN)–terminology consensus project process and outcome. Int. J. Behav. Nutr. Phys. Act. 2017, 14, 75. [Google Scholar] [CrossRef] [PubMed]
  2. Saunders, T.J.; McIsaac, T.; Douillette, K.; Gaulton, N.; Hunter, S.; Rhodes, R.E.; Prince, S.A.; Carson, V.; Chaput, J.-P.; Chastin, S. Sedentary behaviour and health in adults: An overview of systematic reviews. Appl. Physiol. Nutr. Metab. 2020, 45, S197–S217. [Google Scholar] [CrossRef] [PubMed]
  3. Mitsutake, S.; Shibata, A.; Ishii, K.; Amagasa, S.; Kikuchi, H.; Fukushima, N.; Inoue, S.; Oka, K. Clustering of domain-specific sedentary behaviors and their association with physical function among community-dwelling older adults. J. Phys. Act. Health 2020, 17, 709–714. [Google Scholar] [CrossRef] [PubMed]
  4. Falck, R.S.; Davis, J.C.; Liu-Ambrose, T. What is the association between sedentary behaviour and cognitive function? A systematic review. Br. J. Sport. Med. 2017, 51, 800–811. [Google Scholar] [CrossRef]
  5. Lakerveld, J.; Mackenbach, J.D.; Horvath, E.; Rutters, F.; Compernolle, S.; Bardos, H.; De Bourdeaudhuij, I.; Charreire, H.; Rutter, H.; Oppert, J.M.; et al. The relation between sleep duration and sedentary behaviours in European adults. Obes. Rev. 2016, 17 (Suppl. S1), 62–67. [Google Scholar] [CrossRef]
  6. Madden, K.M.; Ashe, M.C.; Lockhart, C.; Chase, J.M. Sedentary behavior and sleep efficiency in active community-dwelling older adults. Sleep Sci. 2014, 7, 82–88. [Google Scholar] [CrossRef]
  7. Van Holle, V.; Van Cauwenberg, J.; De Bourdeaudhuij, I.; Deforche, B.; Van de Weghe, N.; Van Dyck, D. Interactions between neighborhood social environment and walkability to explain Belgian older adults’ physical activity and sedentary time. Int. J. Environ. Res. Public Health 2016, 13, 569. [Google Scholar] [CrossRef]
  8. Vancampfort, D.; Hallgren, M.; Schuch, F.; Stubbs, B.; Smith, L.; Rosenbaum, S.; Firth, J.; Van Damme, T.; Koyanagi, A. Sedentary behavior and depression among community-dwelling adults aged ≥50 years: Results from the irish longitudinal study on ageing. J. Affect. Disord. 2020, 262, 389–396. [Google Scholar] [CrossRef]
  9. Dowd, K.P.; Szeklicki, R.; Minetto, M.A.; Murphy, M.H.; Polito, A.; Ghigo, E.; van der Ploeg, H.; Ekelund, U.; Maciaszek, J.; Stemplewski, R. A systematic literature review of reviews on techniques for physical activity measurement in adults: A DEDIPAC study. Int. J. Behav. Nutr. Phys. Act. 2018, 15, 15. [Google Scholar] [CrossRef]
  10. Lord, S.; Teh, R.; Gibson, R.; Smith, M.; Wrapson, W.; Thomson, M.; Rolleston, A.; Neville, S.; McBain, L.; Del Din, S. Optimising function and well-being in older adults: Protocol for an integrated research programme in Aotearoa/New Zealand. BMC Geriatr. 2022, 22, 215. [Google Scholar] [CrossRef]
  11. Chastin, S.; McGregor, D.; Palarea-Albaladejo, J.; Diaz, K.M.; Hagströmer, M.; Hallal, P.C.; van Hees, V.T.; Hooker, S.; Howard, V.J.; Lee, I.-M. Joint association between accelerometry-measured daily combination of time spent in physical activity, sedentary behaviour and sleep and all-cause mortality: A pooled analysis of six prospective cohorts using compositional analysis. Br. J. Sport. Med. 2021, 55, 1277–1285. [Google Scholar] [CrossRef] [PubMed]
  12. Shephard, R.J. Limits to the measurement of habitual physical activity by questionnaires. Br. J. Sport. Med. 2003, 37, 197–206. [Google Scholar] [CrossRef] [PubMed]
  13. Harvey, J.A.; Chastin, S.F.; Skelton, D.A. How sedentary are older people? A systematic review of the amount of sedentary behavior. J. Aging Phys. Act. 2015, 23, 471–487. [Google Scholar] [CrossRef] [PubMed]
  14. Heesch, K.C.; Hill, R.L.; Aguilar-Farias, N.; Van Uffelen, J.G.; Pavey, T. Validity of objective methods for measuring sedentary behaviour in older adults: A systematic review. Int. J. Behav. Nutr. Phys. Act. 2018, 15, 119. [Google Scholar] [CrossRef]
  15. Sasaki, J.E.; Hickey, A.; Staudenmayer, J.; John, D.; Kent, J.A.; Freedson, P.S. Performance of activity classification algorithms in free-living older adults. Med. Sci. Sport. Exerc. 2016, 48, 941. [Google Scholar] [CrossRef]
  16. Shei, R.-J.; Holder, I.G.; Oumsang, A.S.; Paris, B.A.; Paris, H.L. Wearable activity trackers–advanced technology or advanced marketing? Eur. J. Appl. Physiol. 2022, 122, 1975–1990. [Google Scholar] [CrossRef]
  17. Taylor, L.M.; Klenk, J.; Maney, A.J.; Kerse, N.; MacDonald, B.M.; Maddison, R. Validation of a body-worn accelerometer to measure activity patterns in octogenarians. Arch. Phys. Med. Rehabil. 2014, 95, 930–934. [Google Scholar] [CrossRef]
  18. Dijkstra, B.; Kamsma, Y.P.; Zijlstra, W. Detection of gait and postures using a miniaturized triaxial accelerometer-based system: Accuracy in patients with mild to moderate Parkinson’s disease. Arch. Phys. Med. Rehabil. 2010, 91, 1272–1277. [Google Scholar] [CrossRef]
  19. Jung, S.; Michaud, M.; Oudre, L.; Dorveaux, E.; Gorintin, L.; Vayatis, N.; Ricard, D. The use of inertial measurement units for the study of free living environment activity assessment: A literature review. Sensors 2020, 20, 5625. [Google Scholar] [CrossRef]
  20. Awais, M.; Chiari, L.; Ihlen, E.A.; Helbostad, J.L.; Palmerini, L. Classical machine learning versus deep learning for the older adults free-living activity classification. Sensors 2021, 21, 4669. [Google Scholar] [CrossRef]
  21. Ustad, A.; Logacjov, A.; Trollebø, S.Ø.; Thingstad, P.; Vereijken, B.; Bach, K.; Maroni, N.S. Validation of an Activity Type Recognition Model Classifying Daily Physical Behavior in Older Adults: The HAR70+ Model. Sensors 2023, 23, 2368. [Google Scholar] [CrossRef] [PubMed]
  22. Wullems, J.A.; Verschueren, S.M.; Degens, H.; Morse, C.I.; Onambele, G.L. Performance of thigh-mounted triaxial accelerometer algorithms in objective quantification of sedentary behaviour and physical activity in older adults. PLoS ONE 2017, 12, e0188215. [Google Scholar] [CrossRef] [PubMed]
  23. Giurgiu, M.; Bussmann, J.B.; Hill, H.; Anedda, B.; Kronenwett, M.; Koch, E.D.; Ebner-Priemer, U.W.; Reichert, M. Validating accelerometers for the assessment of body position and sedentary behavior. J. Meas. Phys. Behav. 2020, 3, 253–263. [Google Scholar] [CrossRef]
  24. Soltani, A.; Paraschiv-Ionescu, A.; Dejnabadi, H.; Marques-Vidal, P.; Aminian, K. Real-world gait bout detection using a wrist sensor: An unsupervised real-life validation. IEEE Access 2020, 8, 102883–102896. [Google Scholar] [CrossRef]
  25. Chigateri, N.G.; Kerse, N.; Wheeler, L.; MacDonald, B.; Klenk, J. Validation of an accelerometer for measurement of activity in frail older people. Gait Posture 2018, 66, 114–117. [Google Scholar] [CrossRef]
  26. Farrahi, V.; Niemelä, M.; Kangas, M.; Korpelainen, R.; Jämsä, T. Calibration and validation of accelerometer-based activity monitors: A systematic review of machine-learning approaches. Gait Posture 2019, 68, 285–299. [Google Scholar] [CrossRef]
  27. Del Din, S.; Galna, B.; Godfrey, A.; Bekkers, E.M.; Pelosin, E.; Nieuwhof, F.; Mirelman, A.; Hausdorff, J.M.; Rochester, L. Analysis of free-living gait in older adults with and without Parkinson’s disease and with and without a history of falls: Identifying generic and disease-specific characteristics. J. Gerontol. Ser. A 2019, 74, 500–506. [Google Scholar] [CrossRef]
  28. Del Din, S.; Godfrey, A.; Rochester, L. Validation of an accelerometer to quantify a comprehensive battery of gait characteristics in healthy older adults and Parkinson’s disease: Toward clinical and at home use. IEEE J. Biomed. Health Inform. 2015, 20, 838–847. [Google Scholar] [CrossRef]
  29. Hickey, A.; Del Din, S.; Rochester, L.; Godfrey, A. Detecting free-living steps and walking bouts: Validating an algorithm for macro gait analysis. Physiol. Meas. 2016, 38, N1. [Google Scholar] [CrossRef]
  30. Geraedts, H.A.; Zijlstra, W.; Van Keeken, H.G.; Zhang, W.; Stevens, M. Validation and User Evaluation of a Sensor-Based Method for Detecting Mobility-Related Activities in Older Adults. PLoS ONE 2015, 10, e0137668. [Google Scholar] [CrossRef]
  31. Bourke, A.K.; Ihlen, E.A.F.; Bergquist, R.; Wik, P.B.; Vereijken, B.; Helbostad, J.L. A physical activity reference data-set recorded from older adults using body-worn inertial sensors and video technology—The ADAPT study data-set. Sensors 2017, 17, 559. [Google Scholar] [CrossRef] [PubMed]
  32. Bourke, A.K.; Ihlen, E.A.F.; Helbostad, J.L. Development of a gold-standard method for the identification of sedentary, light and moderate physical activities in older adults: Definitions for video annotation. J. Sci. Med. Sport 2019, 22, 557–561. [Google Scholar] [CrossRef] [PubMed]
  33. Godfrey, A.; Bourke, A.K.; Olaighin, G.M.; van de Ven, P.; Nelson, J. Activity classification using a single chest mounted tri-axial accelerometer. Med. Eng. Phys. 2011, 33, 1127–1135. [Google Scholar] [CrossRef] [PubMed]
  34. Lyons, G.; Culhane, K.; Hilton, D.; Grace, P.; Lyons, D. A description of an accelerometer-based mobility monitoring technique. Med. Eng. Phys. 2005, 27, 497–504. [Google Scholar] [CrossRef] [PubMed]
  35. Kinoshita, S.; Kiyama, R.; Yoshimoto, Y. Effect of handrail height on sit-to-stand movement. PLoS ONE 2015, 10, e0133747. [Google Scholar] [CrossRef]
  36. Ishigaki, N.; Kimura, T.; Usui, Y.; Aoki, K.; Narita, N.; Shimizu, M.; Hara, K.; Ogihara, N.; Nakamura, K.; Kato, H. Analysis of pelvic movement in the elderly during walking using a posture monitoring system equipped with a triaxial accelerometer and a gyroscope. J. Biomech. 2011, 44, 1788–1792. [Google Scholar] [CrossRef]
  37. Atrsaei, A.; Dadashi, F.; Hansen, C.; Warmerdam, E.; Mariani, B.; Maetzler, W.; Aminian, K. Postural transitions detection and characterization in healthy and patient populations using a single waist sensor. J. NeuroEngineering Rehabil. 2020, 17, 70. [Google Scholar] [CrossRef]
  38. Pham, M.H.; Warmerdam, E.; Elshehabi, M.; Schlenstedt, C.; Bergeest, L.-M.; Heller, M.; Haertner, L.; Ferreira, J.J.; Berg, D.; Schmidt, G. Validation of a lower back “wearable”-based sit-to-stand and stand-to-sit algorithm for patients with Parkinson’s disease and older adults in a home-like environment. Front. Neurol. 2018, 9, 652. [Google Scholar] [CrossRef]
  39. Bland, J.M.; Altman, D. Statistical methods for assessing agreement between two methods of clinical measurement. Lancet 1986, 327, 307–310. [Google Scholar] [CrossRef]
  40. Fleiss, J.L. Design and Analysis of Clinical Experiments; John Wiley & Sons: Hoboken, NJ, USA, 2011. [Google Scholar]
  41. Rosenberg, D.; Walker, R.; Greenwood-Hickman, M.A.; Bellettiere, J.; Xiang, Y.; Richmire, K.; Higgins, M.; Wing, D.; Larson, E.B.; Crane, P.K. Device-assessed physical activity and sedentary behavior in a community-based cohort of older adults. BMC Public Health 2020, 20, 1256. [Google Scholar] [CrossRef]
  42. Zhang, Z.; Xiao, X.; Ma, W.; Li, J. Napping in older adults: A review of current literature. Curr. Sleep Med. Rep. 2020, 6, 129–135. [Google Scholar] [CrossRef] [PubMed]
  43. Grant, P.M.; Granat, M.H.; Thow, M.K.; Maclaren, W.M. Analyzing free-living physical activity of older adults in different environments using body-worn activity monitors. J. Aging Phys. Act. 2010, 18, 171–184. [Google Scholar] [CrossRef] [PubMed]
  44. Dijkstra, B.; Kamsma, Y.; Zijlstra, W. Detection of gait and postures using a miniaturised triaxial accelerometer-based system: Accuracy in community-dwelling older adults. Age Ageing 2010, 39, 259–262. [Google Scholar] [CrossRef] [PubMed]
  45. Trost, S.G.; Mciver, K.L.; Pate, R.R. Conducting accelerometer-based activity assessments in field-based research. Med. Sci. Sport. Exerc. 2005, 37, S531–S543. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Flowchart showing key phases of the algorithm.
Figure 2. Example of sitting identification. Each square represents a window of 0.01 s. The reference here refers to the video observations.
Figure 3. Bland–Altman plots of the duration of sitting, lying, and upright activities. Each point in the graph represents an individual’s activities. (a) Duration of scripted sitting bouts; (b) duration of non-scripted sitting bouts; (c) duration of scripted lying bouts; (d) duration of scripted upright bouts; (e) duration of non-scripted upright bouts.
Table 1. Definitions of SB and upright activities.

Sitting: when the participant’s buttock is fully in contact with the seat of the chair/bed/stool (i.e., not on the ground) (adapted from [32]).
Lying: when the participant’s trunk and thigh are in a relatively horizontal posture with the back and stomach or side touching a horizontal underground (adapted from [30,31,32]).
Upright: any activity undertaken with only the feet (and the use of assisted devices, if applicable) touching the ground. This would include standing without upper body movement, standing with upper body movement, walking, running, shuffling, stair climbing, and stair descending (adapted from [30,31,32]).
Table 2. Characteristics of participants [Mean ± SD].

                Scripted (n = 18)    Non-Scripted (n = 17)
Age (yrs.)      81.1 ± 6.2           80.5 ± 5.9
Female          12 (66.7%)           11 (64.7%)
Weight (kg)     71.2 ± 13.1          72.2 ± 12.7
Height (cm)     163.1 ± 9.4          163.8 ± 9.2
BMI             26.6 ± 3.3           26.7 ± 3.3
Table 3. Total duration, average duration [Mean ± SD].

            Total Duration (in secs)                  Average Duration 1 (in secs)
Activity    Scripted    Non-Scripted    Overall       Scripted      Non-Scripted     Overall
Sitting     1654.8      4111.2          5766.0        61.3 ± 16.0   89.4 ± 72.5      79.0 ± 59.7
Lying       879.7       NA 2            879.7         58.6 ± 23.4   NA 2             58.6 ± 23.4
Upright     2840.9      8109.6          10,950.4      40.0 ± 35.6   180.2 ± 217.1    94.4 ± 153.3

1 Based on the number of activities. 2 No non-scripted lying activities were captured.
Table 4. ICC(2,1) between the duration of activity [Mean ± SD] of video reference and algorithm.

            Video Reference (in secs)                      Algorithm (in secs)                            ICC(2,1)
Activity    Scripted      Non-Scripted    Overall          Scripted      Non-Scripted    Overall          Scripted   Non-Scripted   Overall
Sitting     61.3 ± 16.0   89.4 ± 72.5     80.0 ± 59.7      57.4 ± 18.1   85.8 ± 70.4     75.3 ± 58.3      0.888      0.981          0.923
Lying       58.6 ± 23.4   NA              58.6 ± 23.4      68.3 ± 24.4   NA              68.3 ± 24.4      0.858      NA             0.858
Upright     37.4 ± 33.7   180.2 ± 217.1   102.4 ± 163.9    39.4 ± 32.3   183.0 ± 219.0   104.7 ± 165.1    0.946      0.997          0.997
Table 5. Sensitivity, specificity, positive predictive value [PPV], and negative predictive value [NPV]. Scripted activities (n = 18) and non-scripted activities (n = 17).

            Sensitivity (%)             Specificity (%)             PPV (%)                     NPV (%)
Activity    Scripted    Non-Scripted    Scripted    Non-Scripted    Scripted    Non-Scripted    Scripted    Non-Scripted
Sitting     76.90       92.25           94.84       99.48           89.17       98.99           88.14       95.83
Lying       70.39       NA              95.71       NA              83.10       NA              91.51       NA
Upright     77.30       96.94           92.54       98.99           93.07       99.48           75.89       94.25
Table 6. Absolute percentage error (APE) and absolute error (AE) [Mean ± SD].

            APE (%)                       AE (in secs)
Activity    Scripted      Non-Scripted    Scripted     Non-Scripted
Sitting     10.8 ± 11.8   5.0 ± 7.6       5.9 ± 5.8    4.9 ± 13.3
Lying       22.4 ± 11.9   NA              12.1 ± 5.6   NA
Upright     16.1 ± 16.9   10.0 ± 13.5     4.5 ± 4.2    6.4 ± 14.8
Table 7. Comparison of PPV (%) of current algorithm (free-living data only).

Activity     Current    Dijkstra et al. [44] 1    Taylor et al. [17] 2
Sitting      89.2       76.8                      85.2
Lying        83.1       64.6                      98.0
Upright 1    93.1       80.2                      56.1

1 Dijkstra et al. and Taylor et al. reported standing bouts. 2 Taylor et al. reported overall agreement, which was the total duration that the video observation and the accelerometer corresponded for each activity divided by the total duration each activity was observed on video.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
