Review

A Review on Smartphone Keystroke Dynamics as a Digital Biomarker for Understanding Neurocognitive Functioning

1 Department of Psychiatry, University of Illinois at Chicago, Chicago, IL 60612, USA
2 Department of Biomedical Engineering, University of Illinois at Chicago, Chicago, IL 60607, USA
3 Department of Computer Science, University of Illinois at Chicago, Chicago, IL 60607, USA
* Author to whom correspondence should be addressed.
Brain Sci. 2023, 13(6), 959; https://doi.org/10.3390/brainsci13060959
Submission received: 4 May 2023 / Revised: 7 June 2023 / Accepted: 14 June 2023 / Published: 16 June 2023
(This article belongs to the Section Computational Neuroscience and Neuroinformatics)

Abstract

Can digital technologies provide a passive, unobtrusive means to observe and study cognition outside of the laboratory? Previously, cognitive assessments and monitoring were conducted in a laboratory or clinical setting, allowing for a cross-sectional glimpse of cognitive states. In the last decade, researchers have been utilizing technological advances and devices to explore ways of assessing cognition in the real world. We propose that the virtual keyboard of smartphones, an increasingly ubiquitous digital device, can provide the ideal conduit for passive data collection to study cognition. Passive data collection occurs without the active engagement of a participant and allows for near-continuous, objective data collection. Most importantly, this data collection can occur in the real world, capturing authentic datapoints. This method of data collection and its analyses provide a more comprehensive and potentially more suitable insight into cognitive states, as intra-individual cognitive fluctuations over time have been shown to be an early manifestation of cognitive decline. We review the different ways that passive data centered around keystroke dynamics, collected from smartphones, have been used to assess and evaluate cognition. We also discuss gaps in the literature where future directions of utilizing passive data can continue to provide inferences into cognition and elaborate on the importance of digital data privacy and consent.

1. Introduction

In 2021, 97% of Americans owned a cellphone, and 85% owned a smartphone [1]. In the developing world, 45% of people have smartphones, with the number growing daily [1]. Until recently, cognitive testing has been conducted within a laboratory or clinical setting, but with the advent of technological advances, smartphones and other wearable technologies have provided new tools for remote cognitive testing in the real world. As smartphones become truly ubiquitous devices worldwide, the research potential for longitudinal active and passive data collection increases proportionally. Active data collection occurs when participants are prompted to perform a task, whereas passive data are collected unobtrusively, without participants being aware of the collection; definitions and examples for both active and passive data are given in Table 1. Ecological momentary assessments (EMAs) are an example of active data collection and have gained considerable traction within the last decade as they are increasingly administered via smartphones. Participants receive notifications on their smartphones at specified times during the day to complete surveys and other tasks. EMAs offer researchers a method to assess participants (e.g., their thoughts and feelings, motor/cognitive/mood assessments) in real time and in their natural environment, which decreases the probability of recall bias [2]. However, the active participation needed for EMAs may yield less data over time, as participants eventually stop using, or participate less in, application-based activities and interventions [3]. To augment the research capabilities of smartphones, researchers have turned to passive data collection, which increases the amount of information acquired while decreasing the burden on participants. Passive data collection is when data from smartphone sensors (e.g., GPS, accelerometer, keystroke dynamics) are acquired from participants unobtrusively, yet consensually, and thus go unnoticed by participants. Current smartphone sensors can include a precision dual-frequency global positioning system (GPS), a digital compass, iBeacon microlocation, a barometer, a high dynamic range gyroscope, a high-g accelerometer, proximity sensors, dual ambient light sensors, and temperature sensors [4]. Figure 1 depicts the types of passive data that can be collected via a smartphone and its sensors. Using smartphone applications that can passively register activity in the background during usage, but not record the content itself, provides researchers with unparalleled access to data while still allowing for privacy. Additionally, it allows for objective data collection, as some self-reported measures have been shown to be less accurate when compared to passively collected data [5]. Passive data, because of their unobtrusive, longitudinal, objective, and near-continuous collection, can provide researchers with insights into cognition and cognitive fluctuations outside of a laboratory setting and can reveal potential biomarkers for neuropsychiatric disorders. Moreover, keystroke dynamics and other passive data may provide better insight into cognition, as cognitive tests in a laboratory setting offer only limited "snapshots" of cognition compared to the more complete picture obtainable outside of the laboratory. For example, some cognitive tests focus on speed and reaction time, which may not realistically reflect how different cognitive processes relate to or modulate one another in real life.
During these types of tests, participants are placed in a controlled environment devoid of usual day-to-day distractions while, at the same time, being cognizant of being observed and under the additional stress of needing to perform well [6,7]. In addition, patients who participate in clinical research may require periodic testing to monitor disease course, or they may wish to participate in research but be hindered by the number of required visits. Using smartphones to monitor disease progression and conduct research would decrease this burden. Passive data collection via smartphones provides a way to circumvent this barrier to long-term participation and makes research more accessible to a greater number of participants.
Within the last decade, researchers have used passive data collection via smartphones to investigate cognition. Preliminary results have shown that passive data collection can possibly be used in lieu of laboratory-based neuropsychological assessments [8]. Currently, bedside clinical screening tools for cognitive assessment may include the mini mental state examination (MMSE) [9], the abbreviated mental test [10], the mental status questionnaire [11], the short portable mental status questionnaire [12], and the Montreal cognitive assessment [13]. These rapid assessments are meant to be quick, cost-effective evaluations of cognition but can be limited in their specificity. Positive results on these screening tools then lead to additional in-depth neuropsychological assessments, which require in-person visits and yield only a cross-sectional view of cognition at the time of testing. Smartphones would allow for not only accessible, longitudinal remote monitoring and assessment of intra-individual cognitive fluctuations, but also passive, unobtrusive data collection, where participants are unaware that objective data are being collected.
One of the primary ways users actively interact with their smartphone (instead of merely passively browsing) is through keypresses and related keyboard dynamics (referred to simply as keystroke dynamics hereafter), which can be passively collected via a modern smartphone’s virtual keyboard. Keystroke dynamics refer to keypress-related metadata (e.g., the general category of the keypress, corresponding timing information such as key press and release times, incidences of autocorrect, etc.) on the smartphone keyboard but not the actual text. Intuitively, typing on a smartphone keyboard utilizes multiple cognitive domains. Articulating thoughts by typing on a smartphone keyboard requires awareness of both psychomotor and visuospatial processes [14]. Given the cognitive and motor processes that must be engaged to type efficiently on a smartphone keyboard, it is plausible that cognitive deficits or dysfunction could be detectable via keystroke dynamics. In addition, fine, individualized motor movements can be sampled by triggering the accelerometer and/or gyroscope, opening up the possibility of detecting subtle motor anomalies before clinically diagnosable symptoms arise [15] and potentially providing important digital biomarkers that serve as advance warnings of brain dysfunction. Moreover, quantitatively characterizing cognitive processes is particularly important given that their dysfunction underlies a plethora of disorders. With smartphones being increasingly used in data collection and research, we sought to summarize current research using keystroke dynamics to elucidate processes within the cognitive domain, as defined by the Research Domain Criteria (RDoC), as well as to discuss future directions and the ethics of using passive data collected from smartphones and other wearable technologies.
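As a concrete illustration, the timing metadata described above can be reduced to simple per-keypress features. The sketch below is a minimal, hypothetical example (the field names are our own, not those of any particular keyboard application) of deriving hold times and inter-key (flight) latencies from logged key-down/key-up timestamps.

```python
# Minimal sketch (hypothetical field names): deriving two common keystroke
# features, hold time and flight time, from passively logged keypress metadata.
from dataclasses import dataclass
from typing import List

@dataclass
class Keypress:
    press_time: float    # key-down timestamp, in seconds
    release_time: float  # key-up timestamp, in seconds
    key_type: str        # general category only (e.g., "alphanumeric", "backspace"); never the text

def hold_times(events: List[Keypress]) -> List[float]:
    """Duration each key is held down (release minus press)."""
    return [e.release_time - e.press_time for e in events]

def flight_times(events: List[Keypress]) -> List[float]:
    """Latency between releasing one key and pressing the next."""
    return [nxt.press_time - cur.release_time for cur, nxt in zip(events, events[1:])]

# Three keypresses from one hypothetical typing session
session = [Keypress(0.00, 0.08, "alphanumeric"),
           Keypress(0.25, 0.31, "alphanumeric"),
           Keypress(0.60, 0.70, "backspace")]
print(hold_times(session))    # approx. [0.08, 0.06, 0.10]
print(flight_times(session))  # approx. [0.17, 0.29]
```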

1.1. Defining Cognition and Its Domains

There are a multitude of ways to consider, study, and understand cognition. Relevant to this review, the National Institute of Mental Health created a research framework to collectively understand neuropsychiatric disorders: the Research Domain Criteria (RDoC) [16]. RDoC introduces one way of understanding cognition, in which cognition consists of six domains: attention, working memory, declarative memory, language, cognitive control, and perception [17]. Attention refers to “a range of processes that regulate access to capacity-limited systems, such as awareness, higher perceptual processes, and motor action” [16], and perception is defined as the intake of sensory information which can then guide action [16]. Articulating thoughts by typing on a smartphone keyboard requires awareness of both psychomotor and visuospatial processes [14]. Neurodegenerative disorders have been shown to involve both non-spatial and spatial deficits of visual attention and perception [18,19], and neuropsychiatric disorders have been shown to involve perceptual deficits [20,21] and attentional deficits in general [22]. Language is equally important for typing coherently on a smartphone keyboard. Language is defined as “a system of shared symbolic representations of the world, the self and abstract concepts that supports thought and communication” [16]. The ability to communicate and to express desires and thoughts is vital for both physical and mental well-being. Given the complex networks within the brain that facilitate communication, language is an important marker of cognitive functioning, with semantic dysfunction manifesting earlier than other symptoms in certain neurodegenerative disorders [23,24,25]. Cognitive control is also important to cognition, as it regulates the operation of cognitive and emotional systems to accomplish goals [16]. Neurodegenerative disorders can be distinguished by a progressive decline in gross and fine motor control and cognitive control, with evidence indicating that disturbances in cognitive processes are associated with motor deficits [26], which may be evident via changes in keystroke dynamics on smartphones. Lastly, declarative memory refers to the acquisition, consolidation, and retrieval of facts and events [16], while working memory is defined as “the active maintenance and flexible updating of goal/task relevant information (items, goals, strategies, etc.) in a form that has limited capacity and resists interference” [16]. Working memory is required when multitasking to keep certain tasks in mind (e.g., switching between smartphone applications for multiple tasks). Deficits of both types of memory have been shown in neuropsychiatric and neurodegenerative disorders [27,28]. Executing a series of keypresses efficiently and without errors may utilize all cognitive domains. For example, someone navigating public transit on the way to work while simultaneously trying to type long, complex sentences with difficult-to-spell words would likely employ all six cognitive domains to accomplish their task. As motor and cognitive deficits increase in certain disorders, smartphone measurements can potentially monitor symptoms and provide information regarding the state of cognitive domains.

1.2. Intraindividual Variability

Intraindividual variability (IIV) refers to fluctuations in cognitive performance for tasks repeated over time. IIV consists of two categories: (1) inconsistency, the variability of cognitive performance in a single task over a short period of time, and (2) dispersion, the variability of cognitive performance across different tasks over time [29]. Reaction time, finger tapping, and memory capacity can be measured from these assessments and compared longitudinally. Previous research has shown possible links between IIV and neurological dysfunction, with IIV being a potential early marker for the initial cognitive changes associated with the onset of neurodegenerative diseases, such as Alzheimer’s disease [30,31] and multiple sclerosis [32,33,34,35,36], as well as mild cognitive impairment [37,38,39]. IIV is also a more sensitive marker during the prodromal stages of neurodegenerative disorders and is a strong predictor of progressive cognitive decline [40,41,42]. Previously, smartphones have been used to measure cognition and IIV in participants with neurodegenerative disorders using EMAs [43,44,45], with some participants being asked up to six times a day to complete assessments and with varying adherence rates. By using keystroke dynamics to measure IIV and infer cognitive states, researchers would lessen the burden that active data collection places on participants and would be able to obtain data from all participants, given the nature of passive data collection.
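To make the inconsistency facet of IIV concrete, the sketch below shows one common way to quantify it, assuming a list of per-session scores (e.g., median reaction times) is already available; the coefficient-of-variation measure is an illustrative choice, not the specific metric used in the studies cited above.

```python
# Minimal sketch (assumed data layout): IIV inconsistency quantified as the
# coefficient of variation of one repeated measure across many sessions.
import statistics

def iiv_inconsistency(session_scores):
    """Coefficient of variation (SD / mean) of a measure across repeated sessions."""
    mean = statistics.mean(session_scores)
    return statistics.stdev(session_scores) / mean

# Hypothetical median reaction times (ms) from ten repeated smartphone sessions
sessions = [412, 398, 455, 430, 401, 470, 415, 392, 440, 408]
print(round(iiv_inconsistency(sessions), 3))
```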

2. Methods

2.1. Search Strategy, Eligibility Criteria, and Selection Process

We conducted a number of searches to identify studies that used keystroke dynamics and other passive data types to assess neurocognitive functioning, following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, with a flow diagram depicted in Figure 2. We initialized the search using these terms: “passive data” AND “cognition” AND/OR “smartphone” AND/OR “keystroke dynamics” AND/OR “keyboard dynamics” AND/OR “accelerometer” on PubMed and Embase with no filters. Google Scholar was additionally searched. Searches were conducted until April 2023. Only studies written in English were included in this review. Studies were only included if they fulfilled the following criteria: (1) collected and analyzed passive data, specifically keystroke dynamics; (2) used smartphones to collect data; and (3) assessed cognitive functioning. Studies were then excluded if they: (1) did not assess cognition as defined by RDoC; (2) evaluated cognition but did not employ keystroke dynamics; or (3) used devices other than smartphones to collect keyboard dynamics. We initially screened all search results by title, abstract, and keywords for relevance and eligibility, followed by a full-text evaluation.

2.2. Data Extraction

The search yielded ten studies that met the inclusion criteria, which are summarized in Table 2. The following data were extracted from these studies into Microsoft Excel and Word tables: types of active and passive data collected, digital technologies used, overall findings, and types of analyses conducted.

3. Keystroke Dynamics and Affected Cognitive Domains in Neurodegenerative Disorders

3.1. Alzheimer’s Disease and Mild Cognitive Impairment

Alzheimer’s disease (AD) is a progressive neurodegenerative disease characterized by the gradual loss of motor function and cognitive faculties [56]. Studies have shown that changes in language and speech can manifest as early signs of mild cognitive impairment (MCI) and other prodromal stages of AD while correlating with declines in episodic and semantic memory [23,24]. These studies have also indicated that there may be a preclinical AD stage in which cognitive, behavioral, sensory, and motor changes can precede clinical manifestations of AD by years [24]. Researchers have examined how language characteristics change in participants with AD and found that, even in its early stages, AD can influence temporal characteristics of spontaneous speech (e.g., increased hesitations) and performance in reading-aloud and spoken tasks (e.g., verbal fluency difficulties) [57]. These speech characteristic changes may translate through to keystroke dynamics as well. Researchers using passive data can measure the frequency of text messages, the time taken to type text messages, and other keystroke dynamics to infer these changes. In one study, symptomatic participants with MCI or AD received and sent fewer text messages than healthy controls [46]. Additionally, these symptomatic participants with MCI or AD had slower and more variable typing and tracing outcomes in different tasks on an assessment application. Another study, using an application that replaces the built-in keyboard with the application’s own custom keyboard to collect passive data, asked participants to complete structured assignments [47]. These assignments were to type paragraph-length texts on their smartphones in response to a prompt. The assignments were performed in a non-clinical setting without autocorrect or a time limit, and participants were then asked to send these texts to the researchers for analysis. Researchers found that participants with MCI used fewer nouns than verbs in the structured assignment. Additionally, using six months of passively acquired keystroke data along with natural language processing, researchers were able to detect mild cognitive impairment in patients and distinguish them from controls [47]. By discerning these subtle changes in texting, smartphones provide a potential way to detect MCI and monitor cognitive fluctuations, allowing treatments or close monitoring to be implemented earlier to improve the quality of life of patients with neurodegenerative disorders.
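As an illustration of the kind of linguistic feature described above, the following sketch computes a noun-to-verb ratio from a typed passage using off-the-shelf part-of-speech tagging. It is a simplified stand-in for, not a reproduction of, the natural language processing pipeline in the cited study, and it assumes the text itself is available (as in the structured typing assignments).

```python
# Minimal sketch (illustrative only): noun-to-verb ratio of a typed passage.
# Requires: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")

def noun_verb_ratio(text: str) -> float:
    """Count nouns and verbs via part-of-speech tags and return their ratio."""
    doc = nlp(text)
    nouns = sum(1 for tok in doc if tok.pos_ == "NOUN")
    verbs = sum(1 for tok in doc if tok.pos_ == "VERB")
    return nouns / verbs if verbs else float("inf")

sample = "I walked the dog to the park and then bought coffee at the corner shop."
print(noun_verb_ratio(sample))
```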

3.2. Multiple Sclerosis

Multiple sclerosis (MS) is a neurodegenerative, immune-mediated disorder causing mobility and cognitive impairment as immune cells attack neurons in the central nervous system [58,59]. These impairments can be present early in the disease course, and atrophy captured by MRI can also be seen early on [60]. Given that cognitive impairment is evident early on, being able to detect MS at or near its onset provides a crucial window to stem the further progression of the disease. Thus far, studies have used smartphone applications (e.g., elevateMS) to assess motor and cognitive functions in patients with MS [61], but were impeded by incomplete data assessments that required active participation from patients in order to monitor symptoms and disease burden. Using passive data allows researchers to obtain data longitudinally, near-continuously, and unobtrusively, thus bypassing these obstacles. Indeed, by using longitudinal keystroke dynamics, researchers have been able to extract potential biomarkers for multiple sclerosis [49]. In one study, typing sessions were initially aggregated per day to obtain five summary statistics: mean, median, standard deviation, minimum, and maximum. Patients with MS had, on average, significantly higher keystroke latencies compared to controls. These keypress latencies were positively correlated with the expanded disability status scale (EDSS), while key release latency was positively correlated with the nine-hole peg test (NHPT). All keystroke features were negatively correlated with the symbol digit modalities test (SDMT). Within this cohort, the median disease duration in patients was 5.7 years and the median disease severity on the EDSS was 3.5. Even with mild disease severity and a shorter disease duration in patients with MS, distinctions between controls and patients were already apparent. Another study also examined the relationship between keystroke dynamics and cognitive functioning in participants with MS [48]. They found that typing speed and use of the backspace key, along with autocorrection events, correlated with better cognitive functioning and less severe symptoms. These correlations imply that participants with MS who have milder symptoms could potentially be better at monitoring and correcting their mistakes. Another study was able to group participants by detecting bradykinesia and rigidity in users’ dominant hands using machine learning algorithms on keystroke features [50]. Using one year’s worth of data, researchers found that participants with MS who had worse arm motor function had a higher latency between keypresses, and participants with MS who had decreased processing speed showed higher latency when using punctuation and backspace keys [50]. Using the same dataset, researchers were also able to estimate the levels of disease severity, manual dexterity, and cognitive capability from keystroke dynamics using a machine learning model with three predictors (a time-related cluster, a cognitive-related cluster, and the number of times autofill was used) [51]. Participants with MS who were quicker to correct and adjust their texting had higher SDMT scores, an indicator of cognitive functioning, which helped with model predictions [51]. These studies show that keystroke dynamics could serve as potential biomarkers for MS before significant disease progression, which would allow for earlier treatments and preventative care.
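The daily aggregation described above can be sketched as follows, assuming a table of per-keypress latencies with timestamps (the column names and values are hypothetical); the five summary statistics mirror those reported in the cited study.

```python
# Minimal sketch (hypothetical column names): per-day summary statistics of
# keystroke latencies, as used to derive candidate biomarkers.
import pandas as pd

# One row per keypress latency (seconds), with its timestamp
df = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2023-01-01 09:12", "2023-01-01 09:13", "2023-01-01 21:40",
        "2023-01-02 08:05", "2023-01-02 19:22",
    ]),
    "latency": [0.21, 0.35, 0.28, 0.19, 0.42],
})

# Aggregate all typing events per calendar day into five summary statistics
daily = (df.set_index("timestamp")["latency"]
           .resample("D")
           .agg(["mean", "median", "std", "min", "max"]))
print(daily)
```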

4. Keystroke Dynamics and Affected Cognitive Domains in Mood Disorders

Certain mood disorders are associated with cognitive deficits [62,63,64], with cognitive deficits having been established through neuropsychological tests for bipolar disorder [65,66,67] and depression [63,68,69]. The cognitive deficits found in patients with mood disorders imply a disruption in cognitive control [70,71]. Cognitive control is the ability to flexibly alter and guide behavior in the face of constantly changing circumstances, and it is hindered in those with mood disorders. To examine cognitive control, task-switching paradigms (e.g., the trail-making test part B) test cognitive flexibility [72], processing speed [73], and executive control [74]. Previously, these tests were administered in person via pencil and paper, but they have now been adapted and validated for digital devices (e.g., smartphones) [45,75]. Recently, researchers used smartphones and passive data collection to examine cognitive control in participants with mood disorders [52]. They found that participants with mood disorders not only showed lower cognitive performance on the trail-making test part B but also had diurnal pattern differences in their keystroke dynamics compared to healthy controls, with individuals with higher cognitive performance having faster keystrokes and more consistent typing speeds throughout the day [52]. Another study examined processing speed and executive function in patients with bipolar disorder by comparing keystroke dynamics with a smartphone-based version of a task-switching paradigm and a depression rating scale [53]. Researchers found that typing speed derived from keystroke dynamics, especially when considered alongside mood ratings, could potentially capture features of cognition and cognitive control, such as visual attention, processing speed, and task switching.
Changes in linguistic patterns can reflect certain mood states [76], and smartphones can provide a way to measure these changes in mood in a non-clinical setting as well as to provide objective measurements. Previously, patients with bipolar disorder in a depressive state were shown to have an impairment in phonemic fluency, while patients with bipolar disorder in a manic state were shown to have a moderate-to-large deficit in language with respect to letter fluency and semantic fluency [66]. Recently, using smartphones and passive data collection, researchers examined keystroke dynamics and found that participants with bipolar disorder who had more depressive symptoms had increased autocorrect rates, while participants with bipolar disorder who were in a potentially more manic state used the backspace key less [54]. This can possibly be accounted for by the decreased ability to concentrate in depressed states and by the decreased self-monitoring known to accompany higher mania scores. In another study, researchers investigating keystroke dynamics in patients with depression found that these patients had longer intervals both between pressing and releasing a key and between releasing a key and pressing the next one [55]. Distilling these subtle changes in keystroke dynamics, especially in conjunction with depression scores, would allow researchers and clinicians to monitor potential cognitive dysfunction, which would in turn allow early intervention or treatment for particular mood disorders. Early intervention could be crucial and provide life-saving treatment.
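One simple way to expose the diurnal patterns mentioned above is to bin inter-key intervals by hour of day and summarize both typing speed and its consistency within each bin; the sketch below assumes hypothetical column names and data and is illustrative only.

```python
# Minimal sketch (hypothetical data): hour-of-day typing profile summarizing
# median inter-key interval (speed) and its standard deviation (consistency).
import pandas as pd

events = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2023-03-01 08:15", "2023-03-01 08:16", "2023-03-01 13:02",
        "2023-03-01 22:45", "2023-03-02 08:20", "2023-03-02 22:50",
    ]),
    "inter_key_interval": [0.22, 0.25, 0.30, 0.41, 0.24, 0.44],
})

profile = (events.assign(hour=events["timestamp"].dt.hour)
                 .groupby("hour")["inter_key_interval"]
                 .agg(["median", "std"]))  # per-hour speed and variability
print(profile)
```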

5. Discussion

The aim of this review was to examine how researchers have been using keystroke dynamics from smartphones to examine cognition. Keystroke dynamics have provided potential digital biomarkers to infer cognitive functioning outside of the laboratory. In general, smartphones allow for unobtrusive, near-continuous, and longitudinal passive data collection, which provides a unique means for future research directions. From capturing individual cognitive footprints [77] to predicting mood states [54], keystroke dynamics coupled with accelerometry appear to provide sufficiently informative data to distinguish healthy controls from people with neuropsychiatric disorders. Currently, most studies implementing passive data to investigate cognition have done so through the lens of neuropsychiatric disorders, extracting potential biomarkers from keystroke dynamics and other passive data. Although these biomarkers show promising results, more research must be conducted before remote diagnoses or disease monitoring can replace expert evaluation. Additionally, as passive data collection and analyses become more advanced, the lens examining disorders can perhaps be expanded, and the biomarkers found from examining disorders can be applied universally for preventative measures and early diagnoses. However, many obstacles to assessing neurocognitive functioning and predicting cognitive fluctuations currently remain. One limitation we encountered was that some studies we reviewed found potential biomarkers through detecting statistically significant group differences in keystroke dynamics, while other studies used predictive models utilizing features from keystroke dynamics. Studies that used only statistical analyses could apply prospective biomarkers to predictive models and examine the capability of said biomarkers. Another limitation, for studies that specifically used mixed-effects models, was that no effect sizes were reported for these analyses, given the complexity of the different variances at each level. Additionally, sample sizes were often small or skewed toward those with disorders. We would suggest implementing a longitudinal follow-up of cohorts consisting of those with neuropsychiatric disorders to observe whether the same digital passive biomarkers can indeed predict cognition in subsequent in-person evaluations. Another potential biomarker from passive data would be GPS location entropy, a measure of movement regularity, used in conjunction with keystroke dynamics. This combination could potentially yield new predictive biomarkers that provide warnings before significant mood or cognitive changes, as decreased entropy in GPS location data has previously been associated with depression [78,79,80].
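As a sketch of the GPS location entropy measure suggested above, Shannon entropy can be computed over the fraction of tracked time a participant spends in each visited location cluster; lower values indicate more routine movement. The input format and numbers below are hypothetical.

```python
# Minimal sketch (hypothetical input): GPS location entropy as a regularity measure.
import math

def location_entropy(time_per_cluster):
    """Shannon entropy of the distribution of tracked time across location clusters."""
    total = sum(time_per_cluster)
    probs = [t / total for t in time_per_cluster if t > 0]
    return -sum(p * math.log(p) for p in probs)

# Hypothetical hours spent at home, work, and two other places over one week
print(location_entropy([90.0, 45.0, 20.0, 13.0]))
```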
As previously mentioned, cognition, as defined by RDoC, has six domains [16], but not all of them were quantifiable in the reviewed studies. Some domains were not elaborated on due to a current lack of research comparing passive data measurements to the gold-standard measurements of these domains. This is a gap in the literature that can be improved upon and is an essential area to investigate, given the negative implications of declines in attention, perception, working memory, and declarative memory. With innovative large language models being created and updated, there may soon be ways to combine them with keyboard dynamics to examine cognition and to further research into specific cognitive domains.
Ethical concerns certainly arise regarding informed consent, smartphone usage, passive data collection, and privacy. Data literacy can be a key component of informed consent for certain research studies. For longitudinal studies of some neuropsychiatric disorders, obtaining participant consent multiple times throughout the study may be necessary and would be ethically important as the disease course progresses. As some neuropsychiatric disorders progress, participants may reconsider their participation in research studies and may want to opt out of studies, or may even lack the cognitive faculties to make informed choices. Because of the dynamic nature of cognition in certain neuropsychiatric disorders, researchers must be cognizant of these possibilities and may want to periodically confirm participant consent, potentially using over-the-air updates via smartphones for this purpose. Researchers also need to ensure that participants understand the purpose of the research being conducted, and they need to be able to explain the technology being used in layman’s terms to ensure full comprehension.
In terms of passive data collection and privacy, with the enormous amounts of data being continually collected from participants, researchers not only need to ensure that participants understand the extent of data collection, but are also obligated to handle the data securely and to anonymize the data when necessary. In 2018, the European Union (EU) passed the General Data Protection Regulation (GDPR) to protect the right to privacy of its residents [81]. The GDPR strengthens the data rights of all EU residents and holds data controllers accountable for keeping digital data private. In comparison, the USA does not currently have an equivalent comprehensive law that protects consumer data. There are niche laws that protect certain types of personal data (e.g., the Health Insurance Portability and Accountability Act (HIPAA), the Electronic Communications Privacy Act (ECPA), the Children’s Online Privacy Protection Rule (COPPA), and the Family Educational Rights and Privacy Act (FERPA)) and currently provide some protection for digital privacy. However, several comprehensive acts of data privacy legislation, at both the state and federal levels, are underway and are meant to protect individual data privacy. As passive data collection becomes more widely utilized, government oversight (e.g., akin to the GDPR) may be necessary to establish a universally accepted method of storing and encrypting data to ensure the privacy of participants.
In the realm of digital privacy, all smartphone users emit digital exhaust. Digital exhaust encompasses all the information that smartphone users create and leave behind as they browse websites and applications. By data mining digital exhaust, researchers could study individuals and their unique patterns and potentially distinguish individuals from their data. Researchers could then derive more information regarding specific individuals, jeopardizing their right to privacy, especially in a medical context, as being able to identify individuals could be harmful if the data were breached and then searchable. As increasing amounts of data can be extracted from smartphone usage, the distinction between personal digital data and health data becomes increasingly blurred, as evidenced by the studies reviewed herein that demonstrated the utility of keystroke dynamics in passively inferring cognition. It then becomes the responsibility of researchers to ensure the privacy of their research participants. With participants’ privacy in mind, researchers could examine smartphone users’ digital exhaust to explore cognitive domains. However, smartphone usage, which generates this digital exhaust, can itself modulate users’ cognitive performance, as discussed below.
Arguably, the relationship between smartphone usage and cognition is a complicated one, with research suggesting not only negative effects but also positive and potentially neutral ones. Smartphone usage, especially excessive use and addiction, has garnered a negative reputation for being a distraction from tasks at hand [82] and potentially decreasing concentration levels and worsening impulse control [83]. Excessive smartphone usage is also potentially associated with higher rates of depression, anxiety, and smartphone addiction [84,85], which can negatively affect cognition and lower academic performance [86]. Additionally, smartphone users may perceive their smartphone usage to be less than it actually is [5,87,88]. Furthermore, smartphone notifications can be distracting by provoking phone checking, which might lead to habitual checking [89,90] and, potentially, smartphone addiction [91]. This in turn can interrupt attention and focus on other tasks [92,93], decreasing executive function [94] and increasing cognitive failures [95]. Moreover, social media applications on smartphones can also have a negative impact on cognitive control [96], especially in adolescents [97], and increased use of these applications can lead to cognitive failures [95,98], along with being a risk factor for worsening mental health [99,100]. In addition, available cognitive capacity can decline in the mere presence of a smartphone [101], and having a smartphone within view can cause distraction from the task at hand and impair productivity, despite not being actively checked [92,102]. In contrast to these negative findings, some studies could not find strong evidence demonstrating detrimental effects of smartphones on cognition [103,104]. Furthermore, some research has also shown that smartphones may aid cognition and memory through various means. Some smartphone applications may help decrease cognitive load through certain tools (e.g., writing down tasks in a list, setting calendar reminders for appointments, registering contacts’ phone numbers) [95], freeing that cognitive capacity for other purposes. In one study, participants were able to complete a task more accurately by using smartphones to help record and remember parts of the task [105], but when smartphones were taken away in a subsequent task, participants fared worse than when they had not depended on a smartphone originally. Smartphones can also offer applications geared toward effectively training cognitive function, especially for the elderly [106]. As digital health technologies become more widespread, researchers using passive data collection to study cognition should be cognizant of this complex relationship and find ways to account for any incongruencies.
In conclusion, over the last decade, smartphones have provided researchers with a new device and avenue for health technologies. As smartphones become even more ubiquitous, their impact on health research increases as well. The features that researchers have been able to extract thus far from passively collected digital data already have numerous clinical applications. In this review, we have shown the utility of using keystroke dynamics and the richness they can provide for data analyses, especially in conjunction with other passively and actively collected data. However, there remain numerous additional features that can be extracted from passive data, and ensuring digital data privacy for participants and obtaining their consent for longitudinal studies, especially for those with neuropsychiatric and neurodegenerative disorders, is key.

Author Contributions

Conceptualization, A.D.L.; writing—original draft preparation, T.M.N.; writing—review and editing, T.M.N., A.D.L. and O.A.; supervision, A.D.L. and O.A. All authors have read and agreed to the published version of the manuscript.

Funding

This work was funded by the National Institute of Mental Health of the National Institutes of Health (NIMH-NIH), 5T32MH067631 and R01MH120168.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

Alex D. Leow reports being a cofounder of KeyWise AI, serving on the medical advisory board for digital medicine for Otsuka, USA. Olusola Ajilore reports being a cofounder of KeyWise AI and serving on the advisory board of Embodied Labs and Blueprint Health.

References

  1. Pew Research Center. Mobile Fact Sheet; Pew Research Center: Washington, DC, USA, 2021. [Google Scholar]
  2. Shiffman, S.; Stone, A.A.; Hufford, M.R. Ecological Momentary Assessment. Annu. Rev. Clin. Psychol. 2008, 4, 1–32. [Google Scholar] [CrossRef]
  3. Meyerowitz-Katz, G.; Ravi, S.; Arnolda, L.; Feng, X.; Maberly, G.; Astell-Burt, T. Rates of Attrition and Dropout in App-Based Interventions for Chronic Disease: Systematic Review and Meta-Analysis. J. Med. Internet Res. 2020, 22, e20283. [Google Scholar] [CrossRef]
  4. Khokhlov, I.; Reznik, L.; Ajmera, S. Sensors in Mobile Devices Knowledge Base. IEEE Sens. Lett. 2020, 4, 1–4. [Google Scholar] [CrossRef]
  5. Parry, D.A.; Davidson, B.I.; Sewall, C.J.R.; Fisher, J.T.; Mieczkowski, H.; Quintana, D.S. A systematic review and meta-analysis of discrepancies between logged and self-reported digital media use. Nat. Hum. Behav. 2021, 5, 1535–1547. [Google Scholar] [CrossRef] [PubMed]
  6. Slegers, K.; van Boxtel, M.; Jolles, J. Effects of computer training and internet usage on cognitive abilities in older adults: A randomized controlled study. Aging Clin. Exp. Res. 2009, 21, 43–54. [Google Scholar] [CrossRef] [PubMed]
  7. Kliegel, M.; McDaniel, M.A.; Einstein, G.O. Prospective Memory: Cognitive, Neuroscience, Developmental, and Applied Perspectives; Psychology Press: London, UK, 2007. [Google Scholar]
  8. Dagum, P. Digital biomarkers of cognitive function. NPJ Digit. Med. 2018, 1, 10. [Google Scholar] [CrossRef]
  9. Folstein, M.F.; Folstein, S.E.; McHugh, P.R. “Mini-mental state”: A practical method for grading the cognitive state of patients for the clinician. J. Psychiatr. Res. 1975, 12, 189–198. [Google Scholar] [CrossRef]
  10. Hodkinson, H.M. Evaluation of a Mental Test Score for Assessment of Mental Impairment in the Elderly. Age Ageing 1972, 1, 233–238. [Google Scholar] [CrossRef]
  11. Kahn, R.L.; Goldfarb, A.I.; Pollack, M.A.X.; Peck, A. Brief Objective Measures for the Determination of Mental Status in the Aged. Am. J. Psychiatry 1960, 117, 326–328. [Google Scholar] [CrossRef]
  12. Pfeiffer, E. A Short Portable Mental Status Questionnaire for the Assessment of Organic Brain Deficit in Elderly Patients. J. Am. Geriatr. Soc. 1975, 23, 433–441. [Google Scholar] [CrossRef]
  13. Nasreddine, Z.S.; Phillips, N.A.; Bedirian, V.; Charbonneau, S.; Whitehead, V.; Collin, I.; Cummings, J.L.; Chertkow, H. The Montreal Cognitive Assessment, MoCA: A brief screening tool for mild cognitive impairment. J. Am. Geriatr. Soc. 2005, 53, 695–699. [Google Scholar] [CrossRef] [PubMed]
  14. Jiang, X.; Jokinen, J.P.P.; Oulasvirta, A.; Ren, X. Learning to type with mobile keyboards: Findings with a randomized keyboard. Comput. Hum. Behav. 2022, 126, 106992. [Google Scholar] [CrossRef]
  15. Alfalahi, H.; Khandoker, A.H.; Chowdhury, N.; Iakovakis, D.; Dias, S.B.; Chaudhuri, K.R.; Hadjileontiadis, L.J. Diagnostic accuracy of keystroke dynamics as digital biomarkers for fine motor decline in neuropsychiatric disorders: A systematic review and meta-analysis. Sci. Rep. 2022, 12, 7690. [Google Scholar] [CrossRef] [PubMed]
  16. National Institute of Mental Health. Definitions of the RDoC Domains and Constructs; National Institute of Mental Health: Bethesda, MD, USA, 2022.
  17. Sanislow, C.A.; Pine, D.S.; Quinn, K.J.; Kozak, M.J.; Garvey, M.A.; Heinssen, R.K.; Wang, P.S.; Cuthbert, B.N. Developing constructs for psychopathology research: Research domain criteria. J. Abnorm. Psychol. 2010, 119, 631–639. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  18. Bublak, P.; Redel, P.; Finke, K. Spatial and non-spatial attention deficits in neurodegenerative diseases: Assessment based on Bundesen’s theory of visual attention (TVA). Restor. Neurol. Neurosci. 2006, 24, 287–301. [Google Scholar] [PubMed]
  19. Uc, E.Y.; Rizzo, M. Driving and neurodegenerative diseases. Curr. Neurol. Neurosci. Rep. 2008, 8, 377–383. [Google Scholar] [CrossRef] [Green Version]
  20. O’Callaghan, C.; Kveraga, K.; Shine, J.M.; Adams, R.B., Jr.; Bar, M. Predictions penetrate perception: Converging insights from brain, behaviour and disorder. Conscious. Cogn. 2017, 47, 63–74. [Google Scholar] [CrossRef] [Green Version]
  21. Lazar, S.M.; Evans, D.W.; Myers, S.M.; Moreno-De Luca, A.; Moore, G.J. Social cognition and neural substrates of face perception: Implications for neurodevelopmental and neuropsychiatric disorders. Behav. Brain Res. 2014, 263, 1–8. [Google Scholar] [CrossRef]
  22. Robbins, T.W.; Arnsten, A.F. The neuropsychopharmacology of fronto-executive function: Monoaminergic modulation. Annu. Rev. Neurosci. 2009, 32, 267–287. [Google Scholar] [CrossRef] [Green Version]
  23. Ahmed, S.; Haigh, A.M.; de Jager, C.A.; Garrard, P. Connected speech as a marker of disease progression in autopsy-proven Alzheimer’s disease. Brain 2013, 136, 3727–3737. [Google Scholar] [CrossRef] [Green Version]
  24. Sperling, R.; Mormino, E.; Johnson, K. The evolution of preclinical Alzheimer’s disease: Implications for prevention trials. Neuron 2014, 84, 608–622. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  25. Chan, J.C.S.; Stout, J.C.; Vogel, A.P. Speech in prodromal and symptomatic Huntington’s disease as a model of measuring onset and progression in dominantly inherited neurodegenerative diseases. Neurosci. Biobehav. Rev. 2019, 107, 450–460. [Google Scholar] [CrossRef]
  26. Montero-Odasso, M.; Verghese, J.; Beauchet, O.; Hausdorff, J.M. Gait and cognition: A complementary approach to understanding brain function and the risk of falling. J. Am. Geriatr. Soc. 2012, 60, 2127–2136. [Google Scholar] [CrossRef] [Green Version]
  27. Nestor, P.J.; Fryer, T.D.; Hodges, J.R. Declarative memory impairments in Alzheimer’s disease and semantic dementia. Neuroimage 2006, 30, 1010–1020. [Google Scholar] [CrossRef] [PubMed]
  28. Christopher, G.; MacDonald, J. The impact of clinical depression on working memory. Cogn. Neuropsychiatry 2005, 10, 379–399. [Google Scholar] [CrossRef] [PubMed]
  29. Hultsch, D.F.; MacDonald, S.W.; Hunter, M.A.; Levy-Bencheton, J.; Strauss, E. Intraindividual variability in cognitive performance in older adults: Comparison of adults with mild dementia, adults with arthritis, and healthy adults. Neuropsychology 2000, 14, 588–598. [Google Scholar] [CrossRef] [PubMed]
  30. Christ, B.U.; Combrinck, M.I.; Thomas, K.G.F. Both Reaction Time and Accuracy Measures of Intraindividual Variability Predict Cognitive Performance in Alzheimer’s Disease. Front. Hum. Neurosci. 2018, 12, 124. [Google Scholar] [CrossRef] [Green Version]
  31. Duchek, J.M.; Balota, D.A.; Tse, C.S.; Holtzman, D.M.; Fagan, A.M.; Goate, A.M. The utility of intraindividual variability in selective attention tasks as an early marker for Alzheimer’s disease. Neuropsychology 2009, 23, 746–758. [Google Scholar] [CrossRef] [Green Version]
  32. Mazerolle, E.L.; Wojtowicz, M.A.; Omisade, A.; Fisk, J.D. Intra-individual variability in information processing speed reflects white matter microstructure in multiple sclerosis. Neuroimage Clin. 2013, 2, 894–902. [Google Scholar] [CrossRef] [Green Version]
  33. Holtzer, R.; Foley, F.; D’Orio, V.; Spat, J.; Shuman, M.; Wang, C. Learning and cognitive fatigue trajectories in multiple sclerosis defined using a burst measurement design. Mult. Scler. 2013, 19, 1518–1525. [Google Scholar] [CrossRef]
  34. De Meijer, L.; Merlo, D.; Skibina, O.; Grobbee, E.J.; Gale, J.; Haartsen, J.; Maruff, P.; Darby, D.; Butzkueven, H.; Van der Walt, A. Monitoring cognitive change in multiple sclerosis using a computerized cognitive battery. Mult. Scler. J. Exp. Transl. Clin. 2018, 4, 2055217318815513. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  35. Riegler, K.E.; Cadden, M.; Guty, E.T.; Bruce, J.M.; Arnett, P.A. Perceived Fatigue Impact and Cognitive Variability in Multiple Sclerosis. J. Int. Neuropsychol. Soc. 2022, 28, 281–291. [Google Scholar] [CrossRef]
  36. Wojtowicz, M.; Mazerolle, E.L.; Bhan, V.; Fisk, J.D. Altered functional connectivity and performance variability in relapsing-remitting multiple sclerosis. Mult. Scler. 2014, 20, 1453–1463. [Google Scholar] [CrossRef] [PubMed]
  37. Chow, R.; Rabi, R.; Paracha, S.; Vasquez, B.P.; Hasher, L.; Alain, C.; Anderson, N.D. Reaction Time Intraindividual Variability Reveals Inhibitory Deficits in Single- and Multiple-Domain Amnestic Mild Cognitive Impairment. J. Gerontol. B Psychol. Sci. Soc. Sci. 2022, 77, 71–83. [Google Scholar] [CrossRef] [PubMed]
  38. Haynes, B.I.; Bauermeister, S.; Bunce, D. A Systematic Review of Longitudinal Associations Between Reaction Time Intraindividual Variability and Age-Related Cognitive Decline or Impairment, Dementia, and Mortality. J. Int. Neuropsychol. Soc. 2017, 23, 431–445. [Google Scholar] [CrossRef] [PubMed]
  39. Christensen, H.; Dear, K.B.; Anstey, K.J.; Parslow, R.A.; Sachdev, P.; Jorm, A.F. Within-occasion intraindividual variability and preclinical diagnostic status: Is intraindividual variability an indicator of mild cognitive impairment? Neuropsychology 2005, 19, 309–317. [Google Scholar] [CrossRef]
  40. Lovden, M.; Li, S.C.; Shing, Y.L.; Lindenberger, U. Within-person trial-to-trial variability precedes and predicts cognitive decline in old and very old age: Longitudinal data from the Berlin Aging Study. Neuropsychologia 2007, 45, 2827–2838. [Google Scholar] [CrossRef] [Green Version]
  41. Bielak, A.A.; Hultsch, D.F.; Strauss, E.; Macdonald, S.W.; Hunter, M.A. Intraindividual variability in reaction time predicts cognitive outcomes 5 years later. Neuropsychology 2010, 24, 731–741. [Google Scholar] [CrossRef]
  42. Scott, B.M.; Austin, T.; Royall, D.R.; Hilsabeck, R.C. Cognitive intraindividual variability as a potential biomarker for early detection of cognitive and functional decline. Neuropsychology 2023, 37, 52–63. [Google Scholar] [CrossRef]
  43. Weizenbaum, E.L.; Fulford, D.; Torous, J.; Pinsky, E.; Kolachalama, V.B.; Cronin-Golomb, A. Smartphone-Based Neuropsychological Assessment in Parkinson’s Disease: Feasibility, Validity, and Contextually Driven Variability in Cognition. J. Int. Neuropsychol. Soc. 2022, 28, 401–413. [Google Scholar] [CrossRef]
  44. Moore, R.C.; Ackerman, R.A.; Russell, M.T.; Campbell, L.M.; Depp, C.A.; Harvey, P.D.; Pinkham, A.E. Feasibility and validity of ecological momentary cognitive testing among older adults with mild cognitive impairment. Front. Digit. Health 2022, 4, 946685. [Google Scholar] [CrossRef]
  45. Moore, R.C.; Swendsen, J.; Depp, C.A. Applications for self-administered mobile cognitive assessments in clinical research: A systematic review. Int. J. Methods Psychiatr. Res. 2017, 26, e1562. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  46. Chen, R.; Jankovic, F.; Marinsek, N.; Foschini, L.; Kourtis, L.; Signorini, A.; Pugh, M.; Shen, J.; Yaari, R.; Maljkovic, V.; et al. Developing Measures of Cognitive Impairment in the Real World from Consumer-Grade Multimodal Sensor Streams. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, Anchorage, AK, USA, 4–8 August 2019; pp. 2145–2155. [Google Scholar]
  47. Ntracha, A.; Iakovakis, D.; Hadjidimitriou, S.; Charisis, V.S.; Tsolaki, M.; Hadjileontiadis, L.J. Detection of Mild Cognitive Impairment Through Natural Language and Touchscreen Typing Processing. Front. Digit. Health 2020, 2, 567158. [Google Scholar] [CrossRef] [PubMed]
  48. Chen, M.H.; Leow, A.; Ross, M.K.; DeLuca, J.; Chiaravalloti, N.; Costa, S.L.; Genova, H.M.; Weber, E.; Hussain, F.; Demos, A.P. Associations between smartphone keystroke dynamics and cognition in MS. Digit. Health 2022, 8, 20552076221143234. [Google Scholar] [CrossRef] [PubMed]
  49. Lam, K.H.; Meijer, K.A.; Loonstra, F.C.; Coerver, E.M.E.; Twose, J.; Redeman, E.; Moraal, B.; Barkhof, F.; de Groot, V.; Uitdehaag, B.M.J.; et al. Real-world keystroke dynamics are a potentially valid biomarker for clinical disability in multiple sclerosis. Mult. Scler. J. 2020, 27, 1421–1431. [Google Scholar] [CrossRef]
  50. Lam, K.H.; Twose, J.; Lissenberg-Witte, B.; Licitra, G.; Meijer, K.; Uitdehaag, B.; De Groot, V.; Killestein, J. The Use of Smartphone Keystroke Dynamics to Passively Monitor Upper Limb and Cognitive Function in Multiple Sclerosis: Longitudinal Analysis. J. Med. Internet Res. 2022, 24, e37614. [Google Scholar] [CrossRef] [PubMed]
  51. Hoeijmakers, A.; Licitra, G.; Meijer, K.; Lam, K.H.; Molenaar, P.; Strijbis, E.; Killestein, J. Disease severity classification using passively collected smartphone-based keystroke dynamics within multiple sclerosis. Sci. Rep. 2023, 13, 1871. [Google Scholar] [CrossRef]
  52. Ning, E.; Cladek, A.T.; Ross, M.K.; Kabir, S.; Barve, A.; Kennelly, E.; Hussain, F.; Duffecy, J.; Langenecker, S.L.; Nguyen, T.; et al. Smartphone-derived Virtual Keyboard Dynamics Coupled with Accelerometer Data as a Window into Understanding Brain Health: Smartphone Keyboard and Accelerometer as Window into Brain Health. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, Hamburg, Germany, 23–28 April 2023; p. 326. [Google Scholar]
  53. Ross, M.K.; Demos, A.P.; Zulueta, J.; Piscitello, A.; Langenecker, S.A.; McInnis, M.; Ajilore, O.; Nelson, P.C.; Ryan, K.A.; Leow, A. Naturalistic smartphone keyboard typing reflects processing speed and executive function. Brain Behav. 2021, 11, e2363. [Google Scholar] [CrossRef]
  54. Zulueta, J.; Piscitello, A.; Rasic, M.; Easter, R.; Babu, P.; Langenecker, S.A.; McInnis, M.; Ajilore, O.; Nelson, P.C.; Ryan, K.; et al. Predicting Mood Disturbance Severity with Mobile Phone Keystroke Metadata: A BiAffect Digital Phenotyping Study. J. Med. Internet Res. 2018, 20, e241. [Google Scholar] [CrossRef]
  55. Mastoras, R.E.; Iakovakis, D.; Hadjidimitriou, S.; Charisis, V.; Kassie, S.; Alsaadi, T.; Khandoker, A.; Hadjileontiadis, L.J. Touchscreen typing pattern analysis for remote detection of the depressive tendency. Sci. Rep. 2019, 9, 13414. [Google Scholar] [CrossRef] [Green Version]
  56. Blennow, K.; de Leon, M.J.; Zetterberg, H. Alzheimer’s disease. Lancet 2006, 368, 387–403. [Google Scholar] [CrossRef] [PubMed]
  57. Szatloczki, G.; Hoffmann, I.; Vincze, V.; Kalman, J.; Pakaski, M. Speaking in Alzheimer’s Disease, is That an Early Sign? Importance of Changes in Language Abilities in Alzheimer’s Disease. Front. Aging Neurosci. 2015, 7, 195. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  58. McFarlin, D.E.; McFarland, H.F. Multiple Sclerosis (first of two parts). N. Engl. J. Med. 1982, 307, 1183–1188. [Google Scholar] [CrossRef]
  59. McFarlin, D.E.; McFarland, H.F. Multiple Sclerosis (second of two parts). N. Engl. J. Med. 1982, 307, 1246–1251. [Google Scholar] [CrossRef] [PubMed]
  60. Lebrun, C.; Blanc, F.; Brassat, D.; Zephir, H.; de Seze, J.; Cfsep. Cognitive function in radiologically isolated syndrome. Mult. Scler. 2010, 16, 919–925. [Google Scholar] [CrossRef]
  61. Pratap, A.; Grant, D.; Vegesna, A.; Tummalacherla, M.; Cohan, S.; Deshpande, C.; Mangravite, L.; Omberg, L. Evaluating the Utility of Smartphone-Based Sensor Assessments in Persons With Multiple Sclerosis in the Real-World Using an App (elevateMS): Observational, Prospective Pilot Digital Health Study. JMIR Mhealth Uhealth 2020, 8, e22108. [Google Scholar] [CrossRef] [PubMed]
  62. Porter, R.J.; Robinson, L.J.; Malhi, G.S.; Gallagher, P. The neurocognitive profile of mood disorders—A review of the evidence and methodological issues. Bipolar Disord. 2015, 17, 21–40. [Google Scholar] [CrossRef]
  63. Rock, P.L.; Roiser, J.P.; Riedel, W.J.; Blackwell, A.D. Cognitive impairment in depression: A systematic review and meta-analysis. Psychol. Med. 2014, 44, 2029–2040. [Google Scholar] [CrossRef] [Green Version]
  64. Malhi, G.S.; Ivanovski, B.; Hadzi-Pavlovic, D.; Mitchell, P.B.; Vieta, E.; Sachdev, P. Neuropsychological deficits and functional impairment in bipolar depression, hypomania and euthymia. Bipolar Disord. 2007, 9, 114–125. [Google Scholar] [CrossRef]
  65. Bourne, C.; Aydemir, O.; Balanza-Martinez, V.; Bora, E.; Brissos, S.; Cavanagh, J.T.; Clark, L.; Cubukcuoglu, Z.; Dias, V.V.; Dittmann, S.; et al. Neuropsychological testing of cognitive impairment in euthymic bipolar disorder: An individual patient data meta-analysis. Acta Psychiatr. Scand. 2013, 128, 149–162. [Google Scholar] [CrossRef]
  66. Kurtz, M.M.; Gerraty, R.T. A meta-analytic investigation of neurocognitive deficits in bipolar illness: Profile and effects of clinical state. Neuropsychology 2009, 23, 551–562. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  67. Robinson, L.J.; Thompson, J.M.; Gallagher, P.; Goswami, U.; Young, A.H.; Ferrier, I.N.; Moore, P.B. A meta-analysis of cognitive deficits in euthymic patients with bipolar disorder. J. Affect. Disord. 2006, 93, 105–115. [Google Scholar] [CrossRef]
  68. McDermott, L.M.; Ebmeier, K.P. A meta-analysis of depression severity and cognitive function. J. Affect. Disord. 2009, 119, 1–8. [Google Scholar] [CrossRef] [PubMed]
  69. Austin, M.P.; Mitchell, P.; Goodwin, G.M. Cognitive deficits in depression: Possible implications for functional neuropathology. Br. J. Psychiatry 2001, 178, 200–206. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  70. McTeague, L.M.; Huemer, J.; Carreon, D.M.; Jiang, Y.; Eickhoff, S.B.; Etkin, A. Identification of Common Neural Circuit Disruptions in Cognitive Control Across Psychiatric Disorders. Am. J. Psychiatry 2017, 174, 676–685. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  71. Fales, C.L.; Barch, D.M.; Rundle, M.M.; Mintun, M.A.; Snyder, A.Z.; Cohen, J.D.; Mathews, J.; Sheline, Y.I. Altered Emotional Interference Processing in Affective and Cognitive-Control Brain Circuitry in Major Depression. Biol. Psychiatry 2008, 63, 377–384. [Google Scholar] [CrossRef] [Green Version]
  72. Crowe, S.F. The differential contribution of mental tracking, cognitive flexibility, visual search, and motor speed to performance on parts A and B of the Trail Making Test. J. Clin. Psychol. 1998, 54, 585–591. [Google Scholar] [CrossRef]
  73. Reitan, R.M. Validity of the Trail Making Test as an indicator of organic brain damage. Percept. Mot. Ski. 1958, 8, 271–276. [Google Scholar] [CrossRef]
74. Arbuthnott, K.; Frank, J. Trail Making Test, Part B as a Measure of Executive Control: Validation Using a Set-Switching Paradigm. J. Clin. Exp. Neuropsychol. 2000, 22, 518–528.
75. Brouillette, R.M.; Foil, H.; Fontenot, S.; Correro, A.; Allen, R.; Martin, C.K.; Bruce-Keller, A.J.; Keller, J.N. Feasibility, reliability, and validity of a smartphone based application for the assessment of cognitive function in the elderly. PLoS ONE 2013, 8, e65925.
76. Faurholt-Jepsen, M.; Vinberg, M.; Frost, M.; Debel, S.; Margrethe Christensen, E.; Bardram, J.E.; Kessing, L.V. Behavioral activities collected through smartphones and the association with illness activity in bipolar disorder. Int. J. Methods Psychiatr. Res. 2016, 25, 309–323.
77. Chang, J.M.; Fang, C.C.; Ho, K.H.; Kelly, N.; Wu, P.Y.; Ding, Y.; Chu, C.; Gilbert, S.; Kamal, A.E.; Kung, S.Y. Capturing Cognitive Fingerprints from Keystroke Dynamics. IT Prof. 2013, 15, 24–28.
78. Opoku Asare, K.; Moshe, I.; Terhorst, Y.; Vega, J.; Hosio, S.; Baumeister, H.; Pulkki-Råback, L.; Ferreira, D. Mood ratings and digital biomarkers from smartphone and wearable data differentiates and predicts depression status: A longitudinal data analysis. Pervasive Mob. Comput. 2022, 83, 101621.
79. Saeb, S.; Zhang, M.; Karr, C.J.; Schueller, S.M.; Corden, M.E.; Kording, K.P.; Mohr, D.C. Mobile Phone Sensor Correlates of Depressive Symptom Severity in Daily-Life Behavior: An Exploratory Study. J. Med. Internet Res. 2015, 17, e175.
80. Moshe, I.; Terhorst, Y.; Opoku Asare, K.; Sander, L.B.; Ferreira, D.; Baumeister, H.; Mohr, D.C.; Pulkki-Raback, L. Predicting Symptoms of Depression and Anxiety Using Smartphone and Wearable Data. Front. Psychiatry 2021, 12, 625247.
81. Voigt, P.; Von dem Bussche, A. The EU General Data Protection Regulation (GDPR): A Practical Guide, 1st ed.; Springer International Publishing: Cham, Switzerland, 2017; Volume 10, pp. 9–249.
82. Throuvala, M.A.; Pontes, H.M.; Tsaousis, I.; Griffiths, M.D.; Rennoldson, M.; Kuss, D.J. Exploring the Dimensions of Smartphone Distraction: Development, Validation, Measurement Invariance, and Latent Mean Differences of the Smartphone Distraction Scale (SDS). Front. Psychiatry 2021, 12, 642634.
83. Hadar, A.; Hadas, I.; Lazarovits, A.; Alyagon, U.; Eliraz, D.; Zangen, A. Answering the missed call: Initial exploration of cognitive and electrophysiological changes associated with smartphone use and abuse. PLoS ONE 2017, 12, e0180094.
84. Demirci, K.; Akgonul, M.; Akpinar, A. Relationship of smartphone use severity with sleep quality, depression, and anxiety in university students. J. Behav. Addict. 2015, 4, 85–92.
85. Choi, J.S.; Park, S.M.; Roh, M.S.; Lee, J.Y.; Park, C.B.; Hwang, J.Y.; Gwak, A.R.; Jung, H.Y. Dysfunctional inhibitory control and impulsivity in Internet addiction. Psychiatry Res. 2014, 215, 424–428.
86. Felisoni, D.D.; Godoi, A.S. Cell phone usage and academic performance: An experiment. Comput. Educ. 2018, 117, 175–187.
87. Wade, N.E.; Ortigara, J.M.; Sullivan, R.M.; Tomko, R.L.; Breslin, F.J.; Baker, F.C.; Fuemmeler, B.F.; Delrahim Howlett, K.; Lisdahl, K.M.; Marshall, A.T.; et al. Passive Sensing of Preteens’ Smartphone Use: An Adolescent Brain Cognitive Development (ABCD) Cohort Substudy. JMIR Ment. Health 2021, 8, e29426.
88. Lin, Y.H.; Lin, Y.C.; Lee, Y.H.; Lin, P.H.; Lin, S.H.; Chang, L.R.; Tseng, H.W.; Yen, L.Y.; Yang, C.C.; Kuo, T.B. Time distortion associated with smartphone addiction: Identifying smartphone addiction via a mobile application (App). J. Psychiatr. Res. 2015, 65, 139–145.
89. Oulasvirta, A.; Rattenbury, T.; Ma, L.; Raita, E. Habits make smartphone use more pervasive. Pers. Ubiquitous Comput. 2011, 16, 105–114.
90. Heitmayer, M.; Lahlou, S. Why are smartphones disruptive? An empirical study of smartphone use in real-life contexts. Comput. Hum. Behav. 2021, 116, 106637.
91. Lee, M.; Han, M.; Pak, J. Analysis of Behavioral Characteristics of Smartphone Addiction Using Data Mining. Appl. Sci. 2018, 8, 1191.
92. Stothart, C.; Mitchum, A.; Yehnert, C. The attentional cost of receiving a cell phone notification. J. Exp. Psychol. Hum. Percept. Perform. 2015, 41, 893–897.
93. Ralph, B.C.; Thomson, D.R.; Cheyne, J.A.; Smilek, D. Media multitasking and failures of attention in everyday life. Psychol. Res. 2014, 78, 661–669.
94. Toh, W.X.; Ng, W.Q.; Yang, H.; Yang, S. Disentangling the effects of smartphone screen time, checking frequency, and problematic use on executive function: A structural equation modelling analysis. Curr. Psychol. 2021, 42, 4225–4242.
95. Hartanto, A.; Lee, K.Y.X.; Chua, Y.J.; Quek, F.Y.X.; Majeed, N.M. Smartphone use and daily cognitive failures: A critical examination using a daily diary approach with objective smartphone measures. Br. J. Psychol. 2023, 114, 70–85.
96. Chen, Q.; Yan, Z. Does multitasking with mobile phones affect learning? A review. Comput. Hum. Behav. 2016, 54, 34–42.
97. van der Schuur, W.A.; Baumgartner, S.E.; Sumter, S.R.; Valkenburg, P.M. The consequences of media multitasking for youth: A review. Comput. Hum. Behav. 2015, 53, 204–215.
98. Sharifian, N.; Zahodne, L.B. Social Media Bytes: Daily Associations Between Social Media Use and Everyday Memory Failures Across the Adult Life Span. J. Gerontol. B Psychol. Sci. Soc. Sci. 2020, 75, 540–548.
99. Odgers, C.L.; Jensen, M.R. Annual Research Review: Adolescent mental health in the digital age: Facts, fears, and future directions. J. Child Psychol. Psychiatry 2020, 61, 336–348.
100. Ivie, E.J.; Pettitt, A.; Moses, L.J.; Allen, N.B. A meta-analysis of the association between adolescent social media use and depressive symptoms. J. Affect. Disord. 2020, 275, 165–174.
101. Ward, A.F.; Duke, K.; Gneezy, A.; Bos, M.W. Brain Drain: The Mere Presence of One’s Own Smartphone Reduces Available Cognitive Capacity. J. Assoc. Consum. Res. 2017, 2, 140–154.
102. Kushlev, K.; Proulx, J.; Dunn, E.W. Silence Your Phones. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; pp. 1011–1020.
103. Frost, P.; Donahue, P.; Goeben, K.; Connor, M.; Cheong, H.S.; Schroeder, A. An examination of the potential lingering effects of smartphone use on cognition. Appl. Cogn. Psychol. 2019, 33, 1055–1067.
104. Hartmann, M.; Martarelli, C.S.; Reber, T.P.; Rothen, N. Does a smartphone on the desk drain our brain? No evidence of cognitive costs due to smartphone presence in a short-term and prospective memory task. Conscious. Cogn. 2020, 86, 103033.
105. Dupont, D.; Zhu, Q.; Gilbert, S.J. Value-based routing of delayed intentions into brain-based versus external memory stores. J. Exp. Psychol. Gen. 2023, 152, 175–187.
106. Klimova, B.; Valis, M. Smartphone Applications Can Serve as Effective Cognitive Training Tools in Healthy Aging. Front. Aging Neurosci. 2017, 9, 436.
Figure 1. Smartphone schematic depicting examples of passive data collection.
Figure 2. PRISMA flow diagram of search and selection process.
Table 1. Definitions of passive and active data collection along with examples.
Type | Definition | Examples | Implementable Digital Devices
Active Data Collection | Data acquisition from participants requiring active participation, allowing for subjective data measurements | Ecological momentary assessments, mood/cognitive/motor self-reported assessments | Smartphones, tablets, smartwatches
Passive Data Collection | Unobtrusive data acquisition from participants from digital technology, where participants are unaware of collection, allowing for objective data measurements | Keystroke dynamics, accelerometer, GPS, screen time, temperature, phone-checking frequency, physical activity, number of text messages and emails, duration and frequency of phone calls made | Smartphones, tablets, smartwatches, sleep monitors, fitness trackers
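To make the distinction in Table 1 concrete, the following is a minimal Python sketch of how a passive keystroke logger might retain only content-free metadata (keypress category and timing) rather than the text itself. The names used here (KeystrokeEvent, categorize, log_keypress) are illustrative assumptions and do not describe the implementation of any particular study application.

```python
from dataclasses import dataclass
from time import time

@dataclass
class KeystrokeEvent:
    category: str        # e.g., "alphanumeric", "backspace", "space", "punctuation"
    press_time: float    # timestamp of key press (seconds since epoch)
    release_time: float  # timestamp of key release

def categorize(key: str) -> str:
    """Map a raw key to a content-free category."""
    if key.isalnum():
        return "alphanumeric"
    if key == "\b":
        return "backspace"
    if key == " ":
        return "space"
    return "punctuation"

def log_keypress(key: str, press_time: float, release_time: float, buffer: list) -> None:
    """Append only the category and timing of a keypress; the key identity is discarded."""
    buffer.append(KeystrokeEvent(categorize(key), press_time, release_time))

# Example: three simulated keypresses yield metadata from which hold times and
# inter-key delays can be computed without exposing what was typed.
events: list = []
t = time()
for i, key in enumerate("hi "):
    log_keypress(key, t + 0.3 * i, t + 0.3 * i + 0.08, events)
print([(e.category, round(e.release_time - e.press_time, 2)) for e in events])
```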
Table 2. Summary of study characteristics and findings.
Each entry below lists the study, number of participants, digital technologies implemented, data collected (passive and active), findings, statistical analysis, and machine learning models and validation metrics.

Study: Chen, R. et al., 2019 [46]
Participants: 24 people with mild cognitive impairment (MCI), 7 people with mild AD dementia, 84 healthy controls
Digital technologies: iPhone 7 Plus, Apple Watch Series 2, 10.5” iPad Pro with a smart keyboard, Beddit sleep monitoring device
Data collected (passive): Number of text messages sent and received, time duration to send a text message, typing speeds, accelerometer, gyroscope, stairs climbed, stand hours, workout sessions, heart rate, sleep sensors, application usage time, phone unlocks, breathe sessions
Data collected (active): Daily energy surveys, tapping task, dragging task, typed narrative task, verbal narrative task, video and audio bi-weekly
Findings: Symptomatic participants (with MCI or with mild AD dementia) typed slower, had a less regular routine (measured via first and last phone acceleration), took their first steps of the day later (measured via the phone’s pedometer), sent and received fewer text messages, relied more on applications suggested by Siri, and had worse survey compliance than healthy controls
Statistical analysis: Not applicable (N/A)
Machine learning models and validation metrics: Model: extreme gradient boosting. Validation: 70/30 training/testing split. Results: demographics AUROC: 0.757; device-derived features AUROC: 0.771 (±0.016, 95% CI); demographics + device-derived features AUROC: 0.804 (±0.015, 95% CI); age-matched demographics AUROC: 0.519 (±0.018, 95% CI); age-matched device features AUROC: 0.726 (±0.021, 95% CI); age-matched demographics + device features AUROC: 0.725 (±0.022, 95% CI)
Study: Ntracha, A. et al., 2020 [47]
Participants: 11 people with MCI, 12 healthy controls
Digital technologies: Android smartphones
Data collected (passive): Keystroke dynamics (timestamps of keypresses and releases, backspace, pauses, number of characters typed, typing session duration)
Data collected (active): PHQ-9 questionnaire, written assignments for natural language processing (typing up to four paragraphs on a given topic)
Findings: Participants with MCI could be distinguished from healthy controls using passive and active data in natural language processing models, keystroke models, and fused models. Bradykinesia and rigidity were detectable from the keystroke dynamics of participants with MCI when compared to healthy controls
Statistical analysis: N/A
Machine learning models and validation metrics: Models: k-nearest neighbors (k-NN), logistic regression (LR), random forest, ensemble method. Validation: leave-one-subject-out training and testing (a schematic sketch of this validation scheme follows Table 2). Results: keystroke features with k-NN classifier: AUC: 0.78 (0.68–0.88, 95% CI), specificity/sensitivity: 0.64/0.92; natural language processing (NLP) features with LR classifier: AUC: 0.76 (0.65–0.85, 95% CI), specificity/sensitivity: 0.80/0.71; ensemble model fusing keystroke and NLP features: AUC: 0.75 (0.63–0.86, 95% CI), specificity/sensitivity: 0.90/0.60
Study: Chen, M. et al., 2022 [48]
Participants: 16 people with multiple sclerosis (MS), 10 healthy controls
Digital technologies: iOS and Android smartphones
Data collected (passive): Keystroke dynamics (keypress type, timestamp, relative distance between consecutive keystrokes, distance between the keystroke and the center of the keyboard), accelerometer
Data collected (active): Digital neuropsychological tests (symbol digit modalities test; digit span; trail-making test; Delis–Kaplan executive function system (D-KEFS) color-word interference test; controlled oral word association test or D-KEFS verbal fluency test; California verbal learning test, Rey auditory verbal learning test, or Hopkins verbal learning test-revised), symptom rating scales (modified fatigue impact scale, Chicago multiscale depression inventory, state-trait anxiety inventory)
Findings: Participants with MS who had less severe symptoms used the backspace key more and typed faster. Faster typing speed was associated with better performance on measures of processing speed, attention, and executive functioning, as well as with less impact from fatigue and less severe anxiety symptoms
Statistical analysis: Method: multilevel models (level 1: keystroke dynamics within typing session; level 2: subjects). Results (features evaluated using Welch’s t-test): number of days of data collection (mean number): −1.86, p = 0.076; proportion of time spent using one hand to type (%): 542.70, p < 0.001; number of characters per typing session (mean): 0.01, p = 0.107; median inter-key delay (typing speed) per session (seconds): −1.45, p < 0.001; inter-key delay median absolute deviation per session (seconds): 0.11, p = 0.032
Machine learning models and validation metrics: N/A
Study: Lam, K.H. et al., 2020 [49]
Participants: 102 people with MS, 24 healthy controls
Digital technologies: iOS and Android smartphones
Data collected (passive): Keypress dynamics (type of keypress (alphanumeric, backspace, space key, punctuation), time and date of keypresses, successive keypress latencies and releases)
Data collected (active): Assessments, including the expanded disability status scale, nine-hole peg test, and symbol digit modalities test (SDMT)
Findings: Participants with MS had higher keypress latencies, release latencies, flight time, post-punctuation pause, and pre-correction and post-correction slowing compared to healthy controls
Statistical analysis: Method: Pearson’s correlation coefficient. Significant results (correlations with SDMT): press-press latency: −0.525, p < 0.01; release-release latency: −0.553, p < 0.01; hold time: −0.286, p < 0.01; flight time: −0.525, p < 0.01; pre-correction slowing: −0.300, p < 0.01; post-correction slowing: −0.444, p < 0.01; correction duration: −0.162, p < 0.05; after-punctuation pause: −0.317, p < 0.01
Machine learning models and validation metrics: N/A
Study: Lam, K.H. et al., 2022 [50]
Participants: 102 people with MS
Digital technologies: iOS and Android smartphones
Data collected (passive): Keypress dynamics (type of keypress (alphanumeric, backspace, space key, punctuation), time and date of keypresses, successive keypress latencies and releases)
Data collected (active): Assessments, including the expanded disability status scale, nine-hole peg test, and symbol digit modalities test (SDMT)
Findings: Participants with MS with worse arm function had higher latency between keypresses, and participants with worse processing speed had higher latency around punctuation and backspace keys
Statistical analysis: Method: linear mixed models. Significant results: cognitive score cluster associated with SDMT: −8.57 (−12.02 to −5.12, 95% CI), p < 0.001, random effect variance: 82.7%, explained variance: 25.4%; cognitive score cluster with covariates (age, sex, level of education): −5.02 (−9.02 to −1.02), p = 0.02, random effect variance: 77.1%, explained variance: 30.4%; hybrid model (including covariates): between subjects: −11.25 (−17.28 to −5.21), p < 0.001; within subjects: −0.35 (−5.60 to 4.89), p = 0.9
Machine learning models and validation metrics: N/A
Study: Hoeijmakers, A. et al., 2023 [51]
Participants: 102 people with MS, 24 healthy controls
Digital technologies: iOS and Android smartphones
Data collected (passive): Keypress dynamics (type of keypress (alphanumeric, backspace, space key, punctuation), time and date of keypresses, successive keypress latencies and releases)
Data collected (active): Assessments, including the expanded disability status scale, nine-hole peg test, and symbol digit modalities test
Findings: Participants with MS could be discerned from healthy controls by using clinical outcome measures as targets for machine learning (ML) techniques, with the ML techniques able to estimate level of disease severity, manual dexterity, and cognitive capabilities
Statistical analysis: N/A
Machine learning models and validation metrics: Models (binary classification): random forest, logistic regression, k-nearest neighbors, support vector machine, Gaussian naive Bayes. Validation: 80/20 training/testing split. Results: cross-validation AUC = 0.762 (0.677–0.828, 95% CI); AUC-ROC = 0.726, sensitivity/specificity/accuracy: 0.750/0.429/0.48; estimating level of fine motor skills AUROC: 0.753
Study: Ning, E. et al., 2023 [52]
Participants: 64 participants with mood disorders (major depressive disorder, bipolar I/II, persistent depressive disorder, or cyclothymia), 26 healthy controls
Digital technologies: iOS and Android smartphones
Data collected (passive): Keystroke dynamics (category of keypress (i.e., alphanumeric, backspace, punctuation), associated timestamps, autocorrection events), accelerometer, gyroscope
Data collected (active): Digital trail-making test part B
Findings: Participants with mood disorders showed lower cognitive performance on the trail-making test. There were also diurnal pattern differences between participants with mood disorders and healthy controls, with individuals with higher cognitive performance showing faster keypresses and less sensitivity to the time of day
Statistical analysis: Method: longitudinal mixed-effects models. Significant results: aging effect: typing slowed by ~20 ms per 7 years; sessions with lower accuracy had shorter inter-key delays (IKDs) by ~10 ms, b = −0.89, p < 0.001; more variable IKD within a session was associated with slower typing in that session, b = 434.57, p < 0.001; more typing was associated with faster typing, b = −4.35, p < 0.001
Machine learning models and validation metrics: N/A
Study: Ross, M. et al., 2021 [53]
Participants: 11 people with bipolar disorder (BP), 8 healthy controls
Digital technologies: Samsung Galaxy Note 4 Android smartphone
Data collected (passive): Keystroke dynamics (category of keypress (i.e., alphanumeric, backspace, punctuation), associated timestamps, autocorrection events)
Data collected (active): Digital trail-making test part B (dTMT-B)
Findings: Keystroke dynamics of participants with BP differed significantly from those of healthy controls and, along with depression ratings, were associated with trail-making test performance
Statistical analysis: Method: longitudinal mixed-effects models. Significant results: subject-centered HDRS-17 score predicting dTMT-B: b = 0.038, p = 0.004; subject-centered typing speed predicting dTMT-B: b = 0.032, p = 0.004; faster grand-mean-centered typing speed suggesting faster dTMT-B completion time: b = 0.189, p < 0.001
Machine learning models and validation metrics: N/A
Study: Zulueta, J. et al., 2018 [54]
Participants: 9 people with bipolar disorder (5 with BP I, 4 with BP II)
Digital technologies: Samsung Galaxy Note 4 Android smartphones
Data collected (passive): Keystroke dynamics (keystroke entry date and time, duration of keypress, latency between keypresses, distance from the last key along two axes, and autocorrection, backspace, space, switching-keyboard, and other behaviors), accelerometer
Data collected (active): Hamilton depression rating scale (HDRS), Young mania rating scale (YMRS)
Findings: Participants with bipolar disorder in a potentially more manic state (higher mania symptoms) used the backspace key less, whereas those in a potentially more depressive state had higher autocorrection rates
Statistical analysis: Method: mixed-effects regression (a schematic mixed-effects sketch follows Table 2). Significant results: average accelerometer displacement with HDRS: 3.20 (1.20 to 5.21, 95% CI), p = 0.0017; average accelerometer displacement with YMRS: 0.39 (0.15 to 0.64, 95% CI), p = 0.003; autocorrect rate with HDRS: 2.67 (0.87 to 4.47, 95% CI), p = 0.0036; backspace ratio with YMRS: −0.30 (−0.53 to −0.070, 95% CI), p = 0.014
Machine learning models and validation metrics: N/A
Study: Mastoras, R.E. et al., 2019 [55]
Participants: 11 people with depressive tendencies, 14 healthy controls
Digital technologies: Android smartphones
Data collected (passive): Keystroke dynamics (timestamps of keypresses and releases, delete rate, number of characters typed, and typing session duration)
Data collected (active): Patient Health Questionnaire-9
Findings: Participants with depressive tendencies held down keypresses for longer and had longer pauses between keypresses compared to healthy controls
Statistical analysis: N/A
Machine learning models and validation metrics: Models: random forest, gradient boosting classifier, support vector machine classifier. Validation: leave-one-subject-out training and testing. Results: random forest (best-performing pipeline): AUC = 0.89 (0.72–1.00, 95% CI), sensitivity/specificity: 0.82/0.86
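Other studies in Table 2 instead fit longitudinal mixed-effects models that relate keystroke features to symptom ratings while accounting for repeated measurements within subjects. The sketch below shows the general form of such a model with statsmodels, using a random intercept per subject and simulated data; the column names (median_ikd, hdrs, subject) and the simulated effect sizes are assumptions for illustration, not values from the original studies.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Synthetic longitudinal data: 15 subjects, 40 days each. Each row holds one
# day's median inter-key delay (median_ikd, seconds) and a depression rating (hdrs).
rows = []
for subject in range(15):
    subject_intercept = rng.normal(10, 3)          # between-subject variation
    for day in range(40):
        median_ikd = rng.normal(0.25, 0.05)
        hdrs = subject_intercept + 20 * median_ikd + rng.normal(0, 2)
        rows.append({"subject": subject, "median_ikd": median_ikd, "hdrs": hdrs})
df = pd.DataFrame(rows)

# Mixed-effects regression with a random intercept per subject, mirroring the
# general form of the longitudinal mixed-effects analyses summarized in Table 2.
model = smf.mixedlm("hdrs ~ median_ikd", data=df, groups=df["subject"]).fit()
print(model.summary())
```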
