With the advent of the digital age, time on screens (screen time) across multiple digital devices (computers, mobile phones, tablets, television screens, etc.) and for varied motivations (work, leisure, entertainment, gaming, etc.) has become an integral part of population behaviour [1]. Screen time research has been widely incorporated across multiple disciplines, including public health, epidemiology and psychology, with emerging evidence suggesting that screen time is associated with health issues such as poor sleep [1], overweight/obesity [2], physical inactivity [4] and poor cognitive development [5]. However, these findings are not always consistent [1], and it is important to separate the construct of screen time across different behaviours (i.e., entertainment, video gaming, etc.) [6] and digital devices.
As screen time is predominantly assessed using self-report questionnaires that primarily focus on television [7], video gaming, texting [9] and computer time [10], without taking into account the variation of these behaviours across different digital devices, there is a significant gap in evidence on screen time accumulated over mobile devices such as smartphones. Smartphones have become truly ubiquitous devices, with some estimates projecting that 6 billion smartphones would be in circulation by the year 2020 [15]. Thus, it is critical to understand not only the health impact of smartphone-based screen time accumulation, but also how smartphone-based screen time, i.e., smartphone usage, influences behaviours such as active living and dietary intake [16].
Traditional self-report and cross-sectional research designs have previously been used to quantify smartphone screen time [14]. Self-reporting of smartphone screen time is prone to significant measurement errors, with evidence indicating that self-reported findings are inconsistent [13]. Using ecological momentary assessments, Kobayashi and Boase (2012) [21] showed that smartphone users over-report the frequency of communication usage on their devices. Similarly, Moreno et al. (2012) [22] found low correlation between ecological momentary assessment of Internet use and self-reported Internet use, concluding that self-reporting of smartphone-based screen time via traditional validated surveys results in overestimation.
Thus, evidence indicates that existing self-report measures, especially for smartphone-based screen time, may result in significant bias owing to the consistent dependence on this ubiquitous device. A novel approach is needed to collect real-time data with multiple time-stamped assessments over time to minimise recall bias and errors of over- and/or underestimation of smartphone use. Smartphones are equipped with built-in sensors that can objectively capture smartphone screen time (i.e., screen-state), and although one smartphone addiction study showed that self-reported time is significantly lower than objectively measured smartphone use [23], to our knowledge a replicable methodology to derive objective smartphone screen-state does not exist.
As part of the SMART Platform, a citizen science and mobile health initiative [24], this study aimed to develop an accurate, reliable, and replicable methodology to derive prospective objective screen-state usage from all types of citizen-owned smartphones functioning on both Android and iOS platforms. Moreover, this study also aimed to compare prospectively obtained objective screen-state with retrospectively obtained validated self-reported measures of screen time that were adapted to capture different user behaviours (Internet surfing, texting, etc.) on smartphones.
2. Materials and Methods
2.1. Study Design
Data were obtained from the adult cohort of the SMART Platform (Regina, SK, Canada); a detailed description of the methods, including the recruitment and data collection strategy, has been provided in the methodology of the SMART Platform [25]. In brief, SMART Adults [24] is a prospective cohort study designed to obtain longitudinal data from a convenience sample of adults (≥18 years). The study was designed to capture data across different seasons (winter, spring, summer, and autumn) during 8 consecutive days in each cycle. All subjective (including screen time) and objective (including screen-state) data related to physical activity, sedentary behaviour, perception of environment, individual motivation, health outcomes, and eudaimonic well-being are obtained through citizen-owned smartphones during each cycle (Figure 1). The data used for this study were collected as part of pilots conducted between April 1 and May 31, 2017, and between January 4 and March 31, 2018.
2.2. Study Recruitment and Participants
Participants in the SMART Platform are termed “citizen scientists” as they can engage with the researchers at all stages of the research process. Citizen scientists were recruited online through social media or in person from the universities of Regina and Saskatchewan, as well as community centres located in different neighbourhoods in Regina and Saskatoon, to capture a representative sample (Figure 2). Citizen scientists were guided to download Ethica (Ethica Data Services Inc., Waterloo, ON, Canada), an epidemiological smartphone application (app) specifically adapted for the SMART Platform, which captures data through both Android and iOS platforms.
All citizen scientists provided informed consent through the app (Figure 3) and confirmed their age (≥18 years) before being recruited. Ethics approval was obtained from the universities of Regina and Saskatchewan through a synchronised review protocol (REB # 2017-29).
2.3. Data Collection Tools
Objective time-stamped smartphone screen-state was captured through the screen-state sensor, which recorded every ON and OFF screen notification, i.e., all smartphone screen time accumulated by participants over 8 consecutive days. Subjective screen time data were obtained via a modified sedentary behaviour questionnaire [26] (University of California, San Diego version) that captured not only the various motivations for screen time accumulation (gaming, texting, Internet surfing, etc.), but also the varied devices (desktops, laptops, tablets) across which screen time can be accumulated:
On a typical WEEKDAY (from when you wake up until you go to bed), how much time do you spend watching TELEVISION?
On a typical WEEKDAY, how much time do you spend doing INTERNET SURFING (watching videos, reading news, etc.) or GENERAL WORK (office work, emails, paying bills, etc.) on a DESKTOP/LAPTOP/TABLET?
On a typical WEEKDAY, how much time do you spend doing INTERNET SURFING (watching videos, reading news, etc.) or GENERAL WORK (office work, emails, paying bills, etc.) on a SMARTPHONE?
On a typical WEEKDAY, how much time do you spend playing games on a DESKTOP/LAPTOP or TELEVISION SCREEN?
On a typical WEEKDAY, how much time do you spend playing games on a SMARTPHONE or a HANDHELD VIDEO GAME CONSOLE?
On a typical WEEKDAY, how much time do you spend TEXTING?
On a typical WEEKDAY, how much time do you spend SITTING and READING a paper-based BOOK/MAGAZINE?
On a typical WEEKDAY, how much time do you spend SITTING and READING an ELECTRONIC BOOK/MAGAZINE on a DESKTOP/LAPTOP/TABLET?
On a typical WEEKDAY, how much time do you spend SITTING and READING an ELECTRONIC BOOK/MAGAZINE on a SMARTPHONE?
On a typical WEEKDAY, how much time do you spend SITTING and LISTENING to MUSIC?
On a typical WEEKDAY, how much time do you spend SITTING and TALKING on the PHONE?
On a typical WEEKDAY, how much time do you spend SITTING and PLAYING a MUSICAL INSTRUMENT?
On a typical WEEKDAY, how much time do you spend SITTING and doing ARTWORK/CRAFT?
On a typical WEEKDAY, how much time do you spend DRIVING/RIDING in a CAR/BUS/TRAIN, or any other mode of MOTORISED TRANSPORTATION?
Screen time accumulated over smartphones was derived using the questions relevant to smartphone screen time accumulation.
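As an illustration, deriving subjective smartphone screen time from the smartphone-relevant items could look like the following sketch; the variable names, item keys, and minute values are hypothetical and do not correspond to the survey's actual coding:

```python
# Hypothetical responses to the adapted questionnaire (minutes on a typical
# weekday); keys are illustrative, not the survey's actual variable names.
response = {
    "television": 60,
    "internet_desktop_laptop_tablet": 120,
    "internet_smartphone": 90,
    "games_desktop_tv": 30,
    "games_smartphone_handheld": 20,
    "texting": 45,
    "reading_ebook_smartphone": 15,
}

# Items judged relevant to smartphone screen time accumulation
# (an assumption for this sketch).
SMARTPHONE_ITEMS = (
    "internet_smartphone",
    "games_smartphone_handheld",
    "texting",
    "reading_ebook_smartphone",
)

# Subjective smartphone screen time is the sum of the smartphone items.
subjective_smartphone_minutes = sum(response[item] for item in SMARTPHONE_ITEMS)
print(subjective_smartphone_minutes)  # 170
```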
The following were the inclusion criteria:
Participants provided objective screen-state data on at least 2 weekdays and 1 weekend day;
Participants completed the adapted sedentary behaviour questionnaire [26];
Participants did not switch off the app during the data collection period.
A description of the data derivation process to arrive at the final sample is shown in Figure 4.
2.4. Methodology to Derive Objective Screen-State
A series of data processing techniques were applied to derive objective screen-state. After applying the inclusion criterion of deriving data only from those participants who provided data on at least 2 weekdays and 1 weekend day, screen-state was defined as “valid” when the smartphones went into the “ON-state” (i.e., when the smartphone screens were activated). To take into account numerous real-life scenarios, such as auto-notifications that activate smartphone screens without participants actually viewing them, a series of “notification” cut-points for the ON-state were developed: 20, 15, 10, and 5 seconds.
Similarly, to account for invalid data due to smartphone errors that leave screens in the ON-state longer than expected, “continuous usage” cut-points were developed with a no-threshold category and six thresholds: 6, 5, 4, 3, 2, and 1 hours. These cut-points allowed the segregation of data by the longest duration of continuous smartphone usage. Finally, another cut-point to account for potentially invalid data due to long smartphone use was developed with two thresholds: smartphone usage of less than or equal to 10 hours/day, and more than 10 hours/day.
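As a minimal sketch, the notification and continuous-usage cut-point logic described above might be implemented as follows; the event representation, function name, and parameters are illustrative assumptions, not the SMART Platform's actual code:

```python
from datetime import datetime

def valid_screen_time(events, notification_cutoff_s=10, continuous_cutoff_h=None):
    """Sum valid screen-ON time (seconds) from time-stamped screen-state events.

    events: chronologically ordered (timestamp, state) tuples, state in {"ON", "OFF"}.
    notification_cutoff_s: discard ON periods shorter than this many seconds.
    continuous_cutoff_h: if set, discard ON periods longer than this many hours.
    """
    total_s = 0.0
    on_since = None
    for timestamp, state in events:
        if state == "ON":
            on_since = timestamp
        elif state == "OFF" and on_since is not None:
            duration_s = (timestamp - on_since).total_seconds()
            on_since = None
            # "Notification" cut-point: activations too brief for the
            # participant to have actually viewed the screen are excluded.
            if duration_s < notification_cutoff_s:
                continue
            # "Continuous usage" cut-point: implausibly long ON periods,
            # likely caused by smartphone errors, are excluded.
            if continuous_cutoff_h is not None and duration_s > continuous_cutoff_h * 3600:
                continue
            total_s += duration_s
    return total_s

# Hypothetical events: a 3-second auto-notification and a 5-minute session.
events = [
    (datetime(2017, 4, 1, 9, 0, 0), "ON"),
    (datetime(2017, 4, 1, 9, 0, 3), "OFF"),
    (datetime(2017, 4, 1, 10, 0, 0), "ON"),
    (datetime(2017, 4, 1, 10, 5, 0), "OFF"),
]
print(valid_screen_time(events, notification_cutoff_s=10))  # 300.0
```

With a 10-second notification cut-point, the 3-second activation is discarded and only the 5-minute session counts; tightening or relaxing either cut-point changes which ON periods are treated as valid, which is exactly the sensitivity the study's cut-point series explores.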
2.5. Statistical Analyses
Continuous estimates were reported as means with standard deviations (SD). After deriving objective screen-state data using the above methodology, average screen-state time (in minutes) per weekday and per weekend day was calculated for all users across a range of notification and continuous usage cut-points. Wilcoxon signed rank tests were conducted to compare the mean values of subjective screen time with the mean values of objective screen-state across different cut-points on both weekdays and weekend days. Correlation coefficients (Spearman’s rank or Pearson’s, where applicable) were calculated to assess the correlation between subjective screen time and objective screen-state across different cut-points on both weekdays and weekend days. The Statistical Package for the Social Sciences for Windows v.24 software (IBM SPSS, Inc., Chicago, IL, USA) was used to conduct all analyses. The significance level was set at p < 0.05 (2-tailed).
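Assuming per-participant mean values are available as paired arrays, the comparisons described above could be sketched with SciPy as follows; the arrays below are invented for illustration and are not the study's data:

```python
import numpy as np
from scipy.stats import wilcoxon, spearmanr

# Hypothetical per-participant weekday means (minutes/day) for one
# cut-point combination; all values are invented for illustration.
subjective = np.array([120, 95, 60, 200, 150, 80, 110, 90, 45, 170])
objective = np.array([185, 140, 75, 260, 170, 130, 150, 95, 70, 240])

# Wilcoxon signed rank test: paired, non-parametric comparison of
# subjective screen time against objective screen-state.
w_stat, w_p = wilcoxon(subjective, objective)

# Spearman's rank correlation between the two measures.
rho, rho_p = spearmanr(subjective, objective)

print(f"Wilcoxon p = {w_p:.4f}, Spearman rho = {rho:.2f}")
```

The Wilcoxon test is appropriate here because the paired differences need not be normally distributed, and Spearman's rank correlation is robust to the skew typical of usage-time data.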
3. Results
Out of 538 adult citizen scientists recruited in two different seasons (spring 2017 and winter 2018), 38 participants dropped out of the study. After applying the inclusion criterion that data be derived only from those participants who provided objective screen-state data on at least 2 weekdays and 1 weekend day and completed the adapted sedentary behaviour questionnaire [26], the final sample for this study comprised 54 adult citizen scientists (≥18 years; males = 35.1% (n = 19); females = 64.9% (n = 35)).
Table 1, Table 2, Table 3 and Table 4 show the results of the Wilcoxon signed rank tests comparing weekday and weekend mean durations of subjectively reported screen time and objectively derived screen-state (minutes per day), stratified by a series of “notification” and “continuous usage” cut-points. Table 1 and Table 2 show that on weekdays, although the differences were not statistically significant, irrespective of length of smartphone usage per day (>10 or ≤10 hours), participants overall underreported their screen time when compared to sensor-determined screen-state.
Similarly, on weekend days (Table 3 and Table 4), the underreporting trend for subjective screen time continued. When stratified by a series of “notification” and “continuous usage” cut-points, irrespective of length of smartphone usage per day (>10 or ≤10 hours), participants overall underreported their screen time when compared to sensor-determined screen-state, with significant differences observed within the no-threshold category for continuous usage.
In terms of subjective screen time measurement, participants underreported usage more on weekends than on weekdays, resulting in significant differences between objective screen-state and subjective screen time on weekends. It must be noted that all significant results were observed within the no-threshold continuous usage cut-point category, irrespective of the range of the notification cut-points.
Correlation analysis results are shown in Figure 5. In general, correlations between screen-state and screen time were either weak or not statistically significant. For example, on weekdays, low but statistically significant correlations were found only for “continuous usage” cut-points from 1 to 4 hours, and the overall trend was that as the “continuous usage” cut-points increased, the correlation coefficients decreased.
4. Discussion
The purpose of this study was to develop an accurate, reliable, and replicable methodology to derive objective screen-state usage from all types of citizen-owned smartphones and, within the same sample of participants, compare objective screen-state with validated self-reported measures of screen time that were adapted to capture different user behaviours (Internet surfing, texting, etc.) on smartphones.
After conducting a series of sensitivity analyses by testing “notification” and “continuous usage” cut-points to derive real-time screen-state, the findings showed that there are some statistically significant differences between objectively derived screen-state and subjectively reported screen time. The main results indicate that participants generally underreport their screen time compared to objective measures, both during weekdays and weekend days. These results were corroborated by correlation analyses, which showed low correlation between objective and subjective measures.
As all significant results were observed within the no-threshold continuous usage cut-point category, irrespective of the range of the notification cut-points, it shows that the difference between objective screen-state and subjective screen time exists even after taking into consideration real-life scenarios of participants receiving notifications on their smartphones throughout the day, with or without their knowledge. Moreover, these results also indicate that participants who are continuously using their smartphones for very long periods are significantly contributing to the difference between objective screen-state and subjective screen time.
Objective derivation of screen-state data from apps is not immune to measurement errors [27], which can occur when apps are actively turned on and off, when the screen switches on without the user’s knowledge (i.e., automated notifications), or when multiple users use the same device. One of the inclusion criteria in this study was that participants had not switched off their apps during the data collection period. More importantly, a series of sensitivity analyses were conducted that accounted for both automated notifications and continuous usage of smartphones.
To our knowledge, this is the first study to develop a replicable methodology by examining optimal cut-points/thresholds to derive objective smartphone screen-state and to compare it with a validated adapted survey that captured comprehensive screen time behaviours across multiple digital devices. Even with a small sample, and after taking into account multiple behaviours on smartphones (Internet surfing, texting, etc.), the results indicate underreporting via subjective measures, which challenges the meagre existing evidence on smartphone screen time reporting via subjective measures.
This methodological study is part of the SMART Platform [24], which combines citizen science and community-based participatory research to engage with participants through their smartphones to understand various behaviours, including screen time accumulation across multiple digital devices. Thus, apart from the replicable potential of this methodology to derive objective screen-state from smartphones, the larger approach of combining citizen science and community-based participatory research could help to engage large populations across jurisdictions to understand patterns of smartphone-based screen time accumulation.
Another approach this study offers is linking objective and subjective measures of smartphone screen time to capture not only the accurate measurement of screen-state, but also the essential context of behaviours (Internet surfing, texting, etc.). This approach can be used both for accurate surveillance of behaviours such as active living and dietary patterns, and to develop interventions using mobile technology [16]. Beyond the traditional linkage of objective and subjective measures, smartphone-based data collection also provides the opportunity to link big data obtained via sensors (such as accelerometers, pedometers, global positioning systems, Wi-Fi, and battery) with the qualitative perceptions of citizen scientists captured through smartphone audio, to accurately understand smartphone-based screen time behaviours.
More importantly, as there is no indication that digital mobile device usage will decline in the future, and as these devices are increasingly leveraged for mobile health applications [28], it is critical to understand both the positive and negative impacts of this technology on human health. This study is part of a mobile health platform that utilises smartphones for both surveillance and interventions [25], and this novel methodology can be replicated in similar mobile health platforms not only to assess the link between smartphone usage and health outcomes, but also to inform interventions that minimise the negative effects of smartphone usage, especially among participants in mobile health interventions.
Limitations and Strengths
The sample size of the study was limited, but the study nevertheless had sufficient power to detect statistically significant results, and we expect the findings to be replicated when the methodology is applied to a larger sample. With a larger sample, we would expect to observe significant results across several cut-point categories, which would allow the modelling of health outcomes across different types of smartphone users, i.e., individuals who use smartphones rarely, intermittently, or constantly. This study is one of the first to measure smartphone screen time exposure prospectively [31], and this approach can help to track screen time in real time and longitudinally. Moreover, by adapting a validated recall survey, this study also serves as an example of modifying existing self-report measures to reflect technological advancements.