Review

Evaluation of Available Cognitive Tools Used to Measure Mild Cognitive Decline: A Scoping Review

by Chian Thong Chun 1, Kirsty Seward 1,2, Amanda Patterson 1,2, Alice Melton 1,2 and Lesley MacDonald-Wicks 1,2,*
1 School of Health Sciences, Faculty of Health and Medicine, University of Newcastle, University Drive, Callaghan, NSW 2308, Australia
2 Priority Research Centre for Physical Activity and Nutrition, University of Newcastle, Callaghan, NSW 2308, Australia
* Author to whom correspondence should be addressed.
Nutrients 2021, 13(11), 3974; https://doi.org/10.3390/nu13113974
Submission received: 15 October 2021 / Revised: 28 October 2021 / Accepted: 29 October 2021 / Published: 8 November 2021
(This article belongs to the Section Lipids)

Abstract

Cognitive decline is a broad syndrome ranging from non-pathological/age-associated cognitive decline to pathological dementia. Mild cognitive impairment (MCI) is defined as the stage of cognition that falls between normal ageing and dementia. Studies have found that early lifestyle interventions for MCI may delay its pathological progression. Hence, this review aims to determine the most efficient cognitive tools to discriminate mild cognitive decline in its early stages. After a systematic search of five online databases, a total of 52 different cognitive tools were identified. The performance of each tool was assessed by its psychometric properties, administration time and delivery method. The Montreal Cognitive Assessment (MoCA, n = 15), the Mini-Mental State Examination (MMSE, n = 14) and the Clock Drawing Test (CDT, n = 4) were the most frequently cited in the literature. The tools with the best all-round performance are the Six-item Cognitive Impairment Test (6CIT), the MoCA (with cut-offs of ≤24/22/19/15.5), the MMSE (with a cut-off of ≤26) and the Hong Kong Brief Cognitive Test (HKBC). In addition, the SAGE is recommended for self-completed survey settings, whilst a 4-point CDT is quick and easy to add to other cognitive assessments. However, most tools were affected by age and education level. Furthermore, optimal cut-off points need to be chosen cautiously when screening for MCI in different populations.

1. Introduction

Dementia is currently recognised as a global health priority, and is one of the major causes of disability amongst older adults [1,2]. Globally, there are 50 million people diagnosed with dementia, with a disease burden of AUD 1.4 trillion annually [1,2]. As the population continues to age, the worldwide prevalence of dementia is predicted to triple to 152 million people within the next three decades [3]. This will result in further costs for governments, communities, families and individuals. In addition, the medical, psychological and emotional impact on those with dementia and their caregivers/families is significant and detrimentally affects their quality of life [1].
Cognitive decline is a broad syndrome ranging from non-pathological/age-associated cognitive decline to pathological mild cognitive impairment, with further progression to dementia [4]. Mild cognitive impairment (MCI) is a term used to identify the stage of cognition that falls between normal ageing and dementia, defined as slight but measurable cognitive decline without the loss of functional ability [5,6,7]. Cognitive decline is therefore recognised to progress from a mild and subtle presentation to a more comprehensive one, and these changes form a continuum [4]. In contrast to dementia, people with MCI can perform daily living activities independently with minimal aids or assistance [5]. Its onset can be evident from middle age (ages 45 to 49), but the failure to detect subtle cognitive changes has resulted in delayed care for 27–81% of affected patients [8,9,10]. Detection can be unpredictable because individuals experience different rates of decline [4]. In addition, research indicates that MCI is associated with a heightened risk of progression to dementia compared with normal cognition [11].
Given the poor prognostic implications, early detection of subtle cognitive changes helps practitioners identify possible treatable causes or provide appropriate interventions. Currently, the clinical diagnosis of MCI is mainly determined by a physician’s best judgement [12,13]. Clinical characterisation methods, including the Clinical Dementia Rating (CDR) scale, Petersen’s criteria and the National Institute on Ageing-Alzheimer’s Association (NIA-AA) criteria, are frequently used in combination with laboratory and neurological tests to diagnose MCI [7]. These tests need to be administered by trained physicians and require extensive amounts of time. Hence, various brief cognitive tools have been introduced to detect cognitive decline as first-line screening methods [14]. A structured screening tool should be brief, easy to administer and generalisable to elderly populations, have good psychometric properties, and preferably be able to be self-administered or conducted by non-health care professionals [14]. Many studies have evaluated and validated dementia screening tests; however, there is limited research on MCI screening tools specifically. The most recent systematic review suggested that the Montreal Cognitive Assessment (MoCA) is the preferred tool for screening for MCI in the primary care setting [14]. However, only a limited number of studies (14 articles) were included in that review [14]. There is also a lack of knowledge regarding the generalisability and usability of the tools in other settings and/or populations [14].
Disease-modifying therapy (DMT) for cognitive decline is currently a prioritised global research area to manage the rising prevalence of cognitive decline and the associated costs to society [15]. Clinical trials have made clear that pharmacological agents able to treat the underlying cause(s) or slow the rate of cognitive decline are lacking [5]. Primarily, these agents can only manage symptoms by temporarily ameliorating memory and cognitive problems [5]. Hence, the emphasis of research has shifted to lifestyle modification as a prevention or early treatment approach. Several studies have shown a relationship between the development of cognitive decline and lifestyle-related risk factors [16]. Therefore, World Health Organisation guidelines recommend that stakeholders target modifiable lifestyle factors, including improved nutrition and diet, to diminish the risk [3,16]. This is supported by a recent systematic review which demonstrated that improving diet quality is a promising, albeit long-term (more than 6 months), preventive measure to limit the progression of cognitive decline [17]. Even so, the lack of knowledge regarding the type and properties of cognitive tools remains one of the biggest barriers in this research area, because the large range of tools used across studies makes comparisons between studies difficult [17]. Improved knowledge of the properties of cognitive assessments would help to elucidate the effectiveness of diet and nutrition in cognitive decline [17].
Therefore, the demand for easily administered, sensitive, specific and reliable cognitive tools to identify the early stages of subtle cognitive decline is high for several reasons. Firstly, identifying these tools can assist future researchers with selecting appropriate tools for the study design, and strengthen the ability to assess the effectiveness of interventions (both lifestyle and pharmacological) on the progression of cognitive impairment [18]. Secondly, health care practitioners can select these tools to assess an individual’s cognition and detect abnormal cognitive changes earlier, thus resulting in earlier intervention and improved patient outcomes [18].
In this study, we aimed to catalogue and assess the tools used to evaluate mild cognitive impairment and decline among healthy elderly populations. To achieve this, we considered multiple factors of the cognitive tools, including their psychometric performance and generalisability in different settings and/or populations. A scoping review, rather than a systematic review, was chosen in order to include all relevant information and tools cited in the literature and to identify any gaps for future studies.

2. Materials and Methods

2.1. Protocol and Registration

This protocol was developed using the methodological framework for scoping reviews proposed by Arksey and O’Malley (2005) [19] and further refined by using the Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) Checklist [20]. The protocol for this review was registered with the Open Science Framework: https://osf.io/tb3gc/ (accessed on 1 June 2020).

2.2. Eligibility Criteria

To be included in this review, papers needed to focus on evaluating the screening and/or diagnostic performance of cognitive tools used to measure mild cognitive decline. Peer-reviewed journal papers were included if they were written in English, assessed generally healthy adults (>45 years, without any diagnosed health conditions or diseases) and evaluated the psychometric performance (i.e., specificity, sensitivity, validity, reliability) of cognitive tools. All quantitative study designs were eligible for inclusion; however, reviews and grey literature were excluded. Papers were excluded if they did not meet the above criteria or if they focused on interventions rather than the performance of cognitive tools. Tools that are not easily administered or that are invasive (such as imaging tools or biomarkers) were also excluded. Moreover, papers published before 2015 were excluded to provide an up-to-date review of the current literature. All papers had to be readily available to the research team at the time of the study, as time was limited due to the nature of the embedded honours program of the principal researcher.

2.3. Information Sources and Search

Comprehensive literature searches for potentially relevant articles published up until April 2020 were conducted in the following online databases: CINAHL (Ebsco), MEDLINE (Ovid), EMBASE (Ovid), PsycINFO (Ovid) and Cochrane. The search strategies were developed with the assistance of an experienced research librarian. The search strategy contained population, intervention and outcome terms. Searches were limited to adults aged 45 years and above, as this is the age range in which mild cognitive decline presents [9]. Articles with publication dates before 2015 were excluded to provide an up-to-date review. The final search strategy for MEDLINE can be found in Supplementary Table S1. Similar search strategies were used when searching the other identified databases. The final search results were exported into the EndNote X9 [21] referencing software. After removing duplicates, the results were uploaded onto the online systematic review management system Covidence [22] for article screening purposes.

2.4. Selection of Sources of Evidence

After removing duplicates in EndNote X9 [21] and Covidence [22], 32,681 publications were available for screening (Figure 1). Prior to screening, three reviewers (CTC, KS and AM) conducted screening trials and discussions on two occasions to increase consistency among reviewers. During the screening trials, CTC, KS and AM double screened 10 articles independently before discussion. After mutual agreement on the screening trial results, the abstracts and titles of potentially relevant articles were single screened by CTC, KS or AM in Covidence [22]. Full-text screening and discussions, as above, were conducted again prior to data extraction. Relevant full-text articles (n = 444) were single screened by CTC, KS or AM against the inclusion criteria, with the reason for exclusion recorded. All included full-text papers (n = 49) underwent data extraction.

2.5. Data Charting Process and Data Items

CTC designed a standardised data-charting form (a customised spreadsheet) under supervision to chart data from eligible studies and to determine the appropriate variables to extract. The included variables in the spreadsheet were study characteristics (author, year, country of origin), characteristics of tools (name of the tool, the version of tool, range of the scores/points, cut-off point to detect mild cognitive decline, administration method and the duration of administration), study design, study population (age, %female, education level), settings, the psychometric performance of tools (including sensitivity, specificity, reliability and validity in detecting mild cognitive decline), factors that may affect the performance of the cognitive tool and the comparison standard(s) in the validation studies.
CTC charted the data in the data-charting form under supervision. LMW checked the extracted data. AM hand-searched for information where data were missing in the spreadsheet. KS double-checked 10% of the extracted data. Reviewers iteratively updated the data-charting form before synthesising the results.
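To illustrate the structure of the data-charting form described above, a minimal sketch is given below. All field names and values are hypothetical and chosen purely for illustration; they paraphrase the listed variables rather than reproducing the authors’ actual spreadsheet headings or extracted data.

```python
# A minimal, hypothetical sketch of one row of the standardised data-charting
# form; values are invented for illustration and are not extracted data.
charting_row = {
    "study": "Example et al., 2018, Country",      # author, year, country of origin
    "tool_name": "Example Cognitive Test",
    "tool_version": "translated/short form",
    "score_range": "0-30",
    "mci_cutoff": "<=26",                           # cut-off used to flag mild cognitive decline
    "administration_method": "self-administered",
    "administration_minutes": 10,
    "study_design": "cross-sectional",
    "population": {"age_mean": 70, "percent_female": 60, "education_years": 9},
    "setting": "primary care",
    "psychometrics": {"sensitivity_pct": 85, "specificity_pct": 80,
                      "auc_pct": 88, "reliability": "test-retest r = 0.85"},
    "performance_modifiers": ["education", "age"],
    "comparison_standard": "CDR",
}
```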

2.6. Synthesis of Results

Using the standardised data-charting form, all results were summarised and synthesised after discussion among all reviewers. Using the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) flowchart, reviewers documented the screening process and recorded the number of included and excluded studies in this review (Figure 1). Additionally, using a coding system, reviewers counted the frequency with which each tool was cited in the included papers to catalogue which tools had the most research on their performance.
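As a minimal sketch of this frequency count, assuming a small hypothetical list of tools per included paper (not the actual extracted data), the tally could be produced as follows:

```python
# Hypothetical example of tallying how often each tool was cited across papers.
from collections import Counter

tools_per_paper = [          # one entry per (invented) included paper
    ["MoCA", "MMSE"],
    ["CDT"],
    ["MoCA", "MMSE", "HKBC"],
    ["6CIT"],
]

citation_counts = Counter(tool for paper in tools_per_paper for tool in paper)
for tool, n in citation_counts.most_common():
    print(f"{tool}: cited in {n} paper(s)")
```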
Regarding the psychometric properties, validity was charted as Sensitivity (Sn), Specificity (Sp), Area Under the Curve (AUC), Positive Predictive Value (PPV) and Negative Predictive Value (NPV). Sn is the ability of a tool to correctly classify an individual as having ‘mild cognitive decline’, whereas Sp is the ability of a tool to correctly classify an individual as ‘without mild cognitive decline’ [23]. AUC is an overall measure of the validity of a screening/diagnostic test [13]. PPV is the percentage of patients with a positive test who actually have ‘mild cognitive decline’, whereas NPV is the percentage of patients with a negative test who actually do not have ‘mild cognitive decline’ [23]. All of the above properties were charted as percentages, with values closer to 100% indicating higher validity. The reliability of a tool was identified based on its performance on all reliability tests used in the included studies. Interpretation of the above properties is presented in Table 1. Referencing other validity studies, reviewers interpreted the psychometric properties based on criteria developed by researchers’ consensus [13,24]. To be classed as good, a cognitive tool had to achieve good to excellent validity, good reliability and a short administration time of ≤15 min, whilst being able to be self-administered or conducted by non-health care professionals [14]. Reviewers assessed the performance of the cognitive tools using this appraisal format.
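The sketch below uses hypothetical counts from a 2 × 2 screening table to show how Sn, Sp, PPV and NPV are computed and then graded against the Table 1 bands; it illustrates the definitions above and is not the authors’ analysis code.

```python
# Hypothetical 2x2 screening table: tool result vs. reference standard.
tp, fp, fn, tn = 42, 9, 8, 141   # invented counts for illustration

sensitivity = 100 * tp / (tp + fn)   # correctly flags 'mild cognitive decline'
specificity = 100 * tn / (tn + fp)   # correctly flags 'without mild cognitive decline'
ppv = 100 * tp / (tp + fp)           # positive predictive value
npv = 100 * tn / (tn + fn)           # negative predictive value

def grade(value_pct, bands):
    """Map a metric (in %) to its Table 1 interpretation."""
    for label, lower in bands:
        if value_pct >= lower:
            return label
    return "Poor"

SN_SP_PPV_NPV_BANDS = [("Excellent", 91), ("Good", 76), ("Fair", 50)]  # Table 1

for name, value in [("Sn", sensitivity), ("Sp", specificity),
                    ("PPV", ppv), ("NPV", npv)]:
    print(f"{name}: {value:.1f}% -> {grade(value, SN_SP_PPV_NPV_BANDS)}")
```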
Lastly, a narrative synthesis of results was developed to assess and evaluate the characteristics and psychometric properties of each of the identified cognitive tools based on the data charting form and the criteria (Table 1).

3. Results

3.1. Study Selection

In total, 46,015 articles published between 2015 and April 2020 were retrieved. After removing duplicate articles, 32,681 articles were screened in Covidence [22]. Of the 444 full-text articles assessed, 395 were excluded due to inappropriate outcomes (n = 137), inappropriate study purpose (n = 104), inappropriate population (n = 84), papers which were unable to be retrieved (n = 25), not tools of interest (n = 23), inappropriate study design (n = 17) and duplicated articles (n = 5). After evaluating the full texts, 49 articles met the inclusion criteria and were included in this review.

3.2. Study Characteristics

Key characteristics of the 49 included articles can be found in Table 2. Considerable variations were found between studies for country, participants’ characteristics, studied cognitive tools and their comparison standard(s). The majority of studies were conducted in Asian countries (n = 17) [25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41], followed by European countries (n = 13) [42,43,44,45,46,47,48,49,50,51,52,53,54] and the United States (n = 7) [55,56,57,58,59,60,61]. The remaining studies came from Brazil (n = 3) [62,63,64], Australia (n = 2) [65,66], Greece (n = 2) [67,68], Argentina (n = 1) [69], unclear origin (n = 2) [70,71], Cuba (n = 1) [72] and Turkey (n = 1) [73]. In terms of study design, most included articles were cross-sectional (n = 33) [25,26,27,28,29,31,32,34,35,37,38,39,40,41,42,43,44,45,46,47,49,53,54,63,65,66,67,68,69,70,71,72,73] and cohort studies (n = 14) [30,33,36,48,50,51,52,55,56,57,58,61,62,64]. The characteristics of participants in each study were similar, with ages ranging from 50 to 95 years and the proportion of females ranging from 33 to 87%. Participants with low, average and high levels of education were included. To evaluate the psychometric performance of the tools, studies used various validated comparison standards, including the Clinical Dementia Rating (CDR) [26,32,56,65], the Mini-Mental State Examination (MMSE) [25,42,45,46,49], Petersen’s criteria [29,36,53,57,64,71,73], National Institute on Ageing-Alzheimer’s Association (NIA-AA) criteria [40,44,47,50,70], brief cognitive tests [59,67], clinical consensus by health professionals [61], Magnetic Resonance Imaging (MRI) scans, Diagnostic and Statistical Manual of Mental Disorders (DSM) criteria [27], other methods [51,60,63,68,72], or a combination of the above standards [28,30,31,33,34,35,37,38,39,41,43,48,52,55,56,62,66,69,72] to classify participants as having ‘mild cognitive decline’ or being ‘without mild cognitive decline’.

3.3. Cognitive Tools for Mild Cognitive Decline

A total of 52 different cognitive tools used to detect cognitive decline were catalogued and assessed in this review (Table 3). The Montreal Cognitive Assessment (MoCA) (n = 15) [26,27,28,29,31,32,34,36,44,49,56,61,63,65,73] and MMSE (n = 14) [26,27,28,29,32,34,36,40,50,53,57,66,72,73] followed by the Clock Drawing Test (CDT) (n = 4) [47,50,51,54] were most frequently cited in the literature. The other 49 tools were only studied in a limited number of articles (1 to 2 studies each). All of the tools were studied in clinical context and were applied in primary care and/or community settings. Most of the tools need to be administered by health care professionals (n = 14) [28,32,35,36,38,46,47,49,54,58,59,62,63,64,65,67,72,73] or trained personnel (n = 12) [26,31,33,39,40,41,44,53,65,68,70,72]. The remaining tools can be conducted by untrained examiners (n = 6) [27,29,42,45,51] or self-administered (n = 6) [30,43,53,58,60,62]. Among the self-administered tools, the Hong Kong–Vigilance and Memory Test (HK-VMT) [30] and the Self-Administered Gerocognitive Examination (SAGE) [60] can be administered via electronic devices.

3.4. Psychometric Performance of Included Cognitive Tools

Table 4 collates the available version(s), cut-off point(s), psychometric performance (validity and reliability), factors affecting performance and the administration time of the cognitive tools. Table 5 summarises the performance of the cognitive tools against the pre-identified criteria for overall performance. Based on the researchers’ appraisal, several cognitive tools achieved the status of a good cognitive tool, including the Six-item Cognitive Impairment Test (6CIT), the MoCA (with cut-offs of ≤24/22/19/15.5), the MMSE (with a cut-off of ≤26) and the Hong Kong Brief Cognitive Test (HKBC).
These tools provided good to excellent validity and reliability in detecting people with mild cognitive decline within an administration time of 15 min. In addition, they do not require health care professionals to administer them. However, education level, age, gender and emotional status can affect the performance of these cognitive tools. For instance, the performance of 11 tools was found to be associated with education [27,28,29,30,31,36,42,50,53,67,68,72], while the results of 10 tools were associated with age [28,36,39,41,45,50,67,68,72]. In addition, a briefer, revised or translated version that better accommodates specific populations and settings was available for most of the tools [25,26,27,29,31,32,38,40,41,42,44,45,46,48,49,52,53,55,58,59,60,62,63,64,66,68,69,72,73].
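As a minimal illustration of how such cut-offs operate in screening (a sketch under assumed scores, not a clinical algorithm), a score at or below the chosen threshold would flag possible mild cognitive decline for fuller assessment:

```python
def flag_possible_mci(score: int, cutoff: int) -> bool:
    """Flag possible mild cognitive decline when the score is at or below the cut-off."""
    return score <= cutoff

# Hypothetical scores, using the MMSE cut-off of <=26 reported above.
print(flag_possible_mci(score=25, cutoff=26))  # True  -> refer for further assessment
print(flag_possible_mci(score=28, cutoff=26))  # False -> screens negative
```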

4. Discussion

This scoping review collates a comprehensive list of brief cognitive tools used to measure mild cognitive decline in healthy elderly populations. To achieve effective screening outcomes, brief cognitive tools are required to have good to excellent psychometric properties and a short administration time, and to be self-administered or administered by non-health care professionals [14,24].
Consistent with recent systematic reviews, the MoCA, MMSE and CDT are the most commonly used cognitive assessment tools for screening mild cognitive decline [14,74]. Based on our critical evaluation (Table 5), the ideal screening tools with versatile performance are the 6CIT [42], the MoCA (with cut-offs of ≤24/22/19/15.5) [26,27,28,31,32,44,49,56], the MMSE (with a cut-off of ≤26) [26,27,28,50,72] and the HKBC [27]. The remaining 48 tools had suboptimal performance or insufficient information on one or more of the criteria: psychometric properties, administration time or administration method. All of these tools are suitable for use in community or primary care settings.
Among these ideal screening tools, the HKBC has the highest validity and reliability in identifying the earliest stages of subtle cognitive decline [27]. However, it has only been validated in Hong Kong in a limited number of studies and might not be generalisable to other populations.
The MMSE is the most recognised brief cognitive tool and is frequently used to measure cognitive impairment in clinical, research and community settings [75]. However, as supported by multiple systematic reviews and meta-analyses, the MoCA detects subtle changes in cognitive capacity better than the MMSE [14,75,76]. Studies have proposed several features of the MoCA’s design that can potentially explain its superior sensitivity in MCI detection [77]. Compared with the MMSE, the MoCA’s assessment tasks include more words, fewer learning trials, and a longer delay before the memory recall test [77]. MCI participants can be mildly impaired in their executive functions, complex visuospatial processing and higher-level language abilities [77]. Thus, the MoCA, with its more diverse and demanding tasks, can distinguish changes in these components better than the MMSE [77].
Even so, both the MoCA and MMSE are recommended as widely generalisable cognitive tools with all-round performance. They have been adapted and validated in different versions to minimise the effect of language and culture on their psychometric performance. Both tools can be administered by trained or untrained personnel in multiple health care settings such as hospitals, primary care and the community. However, not all cut-off points provide high psychometric performance in screening for mild cognitive decline. Different cut-off scores have also been published when the tests are modified to suit the local culture [74]. Hence, optimal cut-off points need to be chosen carefully when interpreting these results. Nonetheless, the presence of educational bias remains a concern when administering the MoCA and MMSE, as supported by the systematic review by Rosli et al. [74]. The impact of education may result in inappropriate referrals due to overestimation of the prevalence of mild cognitive decline [74]. To address this issue, the MoCA-B is a modified version of the MoCA designed to be less dependent on literacy levels [32]. Additional studies in this area may be beneficial for the future use and development of the tools. Alternatively, the Visual Cognitive Assessment Test (VCAT) is not affected by language or cultural background, overcoming a common barrier for most cognitive tools, including the MoCA and MMSE [33,35]. It is designed as a visual-based cognitive tool to reduce language demands [35]; only the instructions, not the test components, require translation [35]. Based on our appraisal, the only criterion resulting in its exclusion from the ‘good cognitive tool’ category was its slightly lengthy administration time (15 to 20 min) for a brief cognitive tool [33].
To detect mild cognitive decline in surveys, self-completed tools such as the Dementia Screening Interview (AD8), SAGE, the Everyday Memory Questionnaire (EMQ), the Cognitive Change Questionnaire (CCQ), the HK-VMT and Test Your Memory (TYM) can be suitable. Among these self-administered tools, the SAGE has the best validity and reliability and has also been validated for administration via electronic devices [60]. Our review also identified some very brief cognitive tools that require less than 5 min to deliver. The 6CIT is the preferable very brief cognitive tool, with versatile properties [42]; however, it has only been validated against the MMSE, which is not a true gold standard for diagnosing MCI [42]. A 4-point CDT requires less than 2 min to conduct [50]; its only limitation is its fair to good validity in screening for MCI. Thus, the CDT may be beneficial in combination with other screening tools, without adding a significant amount of administration time. In addition, a short-form Brief Cognitive Assessment Tool (BCAT) is also valid and reliable when conducted by professional personnel within 3 to 4 min [58].
Interestingly, the level of psychometric performance can differ when screening for different types of MCI. There are generally two subtypes of MCI: amnestic MCI (a-MCI) and non-amnestic MCI (na-MCI) [78,79]. Research has shown that there are structural differences in brain tissue between MCI subtypes, and these pathological changes affect different cognitive components [80]. Thus, people with a-MCI have impaired memory, whereas na-MCI affects thinking skills other than memory [78,79]. Hence, cognitive tests that assess different domains may perform differently in identifying each MCI subtype. For instance, the Short Test of Mental Status (STMS) has higher validity in discriminating na-MCI than a-MCI, potentially because its assessment places greater weight on memory than on other cognitive skills [61,81]. Therefore, future studies are recommended to further validate the performance of MCI screening tools in discriminating the different subtypes of MCI. Additional studies are also required to further validate the cut-off points and psychometric performance of the brief cognitive tools included in this review.
The limited number of available studies and data among the included articles remains the biggest limitation of our review. The exclusion of studies published before 2015, grey literature and non-English studies may have omitted some information relevant to this review. To keep the review feasible within the constraints of the honours program, the optional critical appraisal of study quality was not conducted. Despite these limitations, this is a thorough scoping review that collates a large number of studies from the previous 5 years. Studies from various countries were included, which allowed us to catalogue the brief cognitive tools used in populations worldwide and across a variety of settings. Substantial work was undertaken to evaluate each of the tools used to measure mild cognitive decline.

5. Conclusions

Based on our review, 52 different tools are available to discriminate mild cognitive decline among healthy elderly populations. The 6CIT [42], the MoCA (with cut-offs of ≤24/22/19/15.5) [28,32,34,35,44,46,49,60], the MMSE (with a cut-off of ≤26) [26,27,28,50,72] and the HKBC [27] are good at discriminating the subtle cognitive changes that result from MCI. They have versatile performance in terms of their psychometric properties, administration time and delivery methods. In addition, the MoCA and MMSE have been modified into various versions that are generalisable to multiple populations. To detect subtle cognitive changes in surveys, the SAGE is recommended, and it can also be administered digitally. A 4-point CDT is quick and easy to add to other cognitive screening tests when assessing MCI. However, suitable cut-off points need to be studied further to validate their performance in screening for mild cognitive decline.
The lack of thorough evaluation of cognitive tools for identifying MCI remains a challenge in clinical and research settings. The aim of this review was to catalogue and assess the tools used to evaluate mild cognitive decline among healthy elderly populations, and to identify gaps in the literature that might guide future research in this area. This review highlights that additional research is needed to recommend the best MCI screening tools for different populations and environments.

Supplementary Materials

The following are available online at https://www.mdpi.com/article/10.3390/nu13113974/s1, Table S1: the final search strategy for MEDLINE.

Author Contributions

Conceptualisation, K.S., A.P., A.M. and L.M.-W.; methodology, C.T.C.; validation, K.S., A.P., A.M. and L.M.-W.; formal analysis, C.T.C.; writing—original draft preparation, C.T.C.; writing—review and editing, L.M.-W., K.S., A.M., A.P. and C.T.C.; supervision, K.S., A.P., A.M. and L.M.-W. All authors read and approved the final manuscript.

Funding

The research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Acknowledgments

The authors thank Debbie Booth for her kind assistance in developing the search strategies.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. World Health Organization. Dementia: A Public Health Priority; Alzheimer’s Disease International: London, UK, 2012. [Google Scholar]
  2. Alzheimer’s Disease International. World Alzheimer Report 2019: Attitudes to dementia; WHO: London, UK, 2019. [Google Scholar]
  3. World Health Organization. Risk Reduction of Cognitive Decline and Dementia: WHO Guidelines; WHO: Geneva, Switzerland, 2019. [Google Scholar]
  4. Deary, I.J.; Corley, J.; Gow, A.; Harris, S.E.; Houlihan, L.M.; Marioni, R.; Penke, L.; Rafnsson, S.B.; Starr, J.M. Age-associated cognitive decline. Br. Med. Bull. 2009, 92, 135–152. [Google Scholar] [CrossRef]
  5. Mild Cognitive Impairment. Available online: https://www.dementia.org.au/about-dementia-and-memory-loss/about-dementia/memory-loss/mild-cognitive-impairment (accessed on 8 October 2020).
  6. Knopman, D.; Petersen, R. Mild Cognitive Impairment and Mild Dementia: A Clinical Perspective. Mayo Clin. Proc. 2014, 89, 1452–1459. [Google Scholar] [CrossRef] [Green Version]
  7. Mild Cognitive Impairment. Available online: https://memory.ucsf.edu/dementia/mild-cognitive-impairment (accessed on 2 November 2021).
  8. Cognitive Assessment Toolkit. Available online: https://www.alz.org/media/documents/cognitive-assessment-toolkit.pdf (accessed on 8 October 2020).
  9. Singh-Manoux, A.; Kivimaki, M.; Glymour, M.M.; Elbaz, A.; Berr, C.; Ebmeier, K.; E Ferrie, J.; Dugravot, A. Timing of onset of cognitive decline: Results from Whitehall II prospective cohort study. BMJ 2012, 344, d7622. [Google Scholar] [CrossRef] [Green Version]
  10. Cornelis, M.C.; Wang, Y.; Holland, T.; Agarwal, P.; Weintraub, S.; Morris, M.C. Age and cognitive decline in the UK Biobank. PLoS ONE 2019, 14, e0213948. [Google Scholar] [CrossRef] [Green Version]
  11. Roberts, R.O.; Knopman, D.S.; Mielke, M.M.; Cha, R.H.; Pankratz, V.S.; Christianson, T.J.; Geda, Y.E.; Boeve, B.F.; Ivnik, R.J.; Tangalos, E.G.; et al. Higher risk of progression to dementia in mild cognitive impairment cases who revert to normal. Neurology 2013, 82, 317–325. [Google Scholar] [CrossRef] [Green Version]
  12. Mild Cognitive Impairment. Available online: https://www.dementia.org.au/sites/default/files/helpsheets/Helpsheet-OtherInformation01-MildCognitiveImpairment_english.pdf (accessed on 27 August 2020).
  13. Safari, S.; Baratloo, A.; Alfil, M.; Negida, A. Evidence Based Emergency Medicine; Part 5 Receiver Operating Curve and Area under the Curve. Emergency 2016, 4, 111–113. [Google Scholar]
  14. Razak, M.A.A.; Ahmad, N.A.; Chan, Y.Y.; Kasim, N.M.; Yusof, M.; Ghani, M.K.A.A.; Omar, M.; Aziz, F.A.A.; Jamaluddin, R. Validity of screening tools for dementia and mild cognitive impairment among the elderly in primary health care: A systematic review. Public Health 2019, 169, 84–92. [Google Scholar] [CrossRef]
  15. Cummings, J.; Aisen, P.S.; Dubois, B.; Frölich, L.; Jr, C.R.J.; Jones, R.W.; Morris, J.C.; Raskin, J.; Dowsett, S.A.; Scheltens, P. Drug development in Alzheimer’s disease: The path to 2025. Alzheimer’s Res. Ther. 2016, 8, 1–12. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  16. Mental Health: WHO Guidelines on Risk Reduction of Cognitive Decline and Dementia. Available online: https://www.who.int/mental_health/neurology/dementia/risk_reduction_gdg_meeting/en/ (accessed on 27 August 2020).
  17. Alice, M.; Lesley, M.W.; Amanda, P. Is Diet Quality a predictor of cognitive decline in older adults? A Systematic Review. Nutrients. (manuscript in preparation).
  18. Snyder, P.J.; Jackson, C.E.; Petersen, R.C.; Khachaturian, A.S.; Kaye, J.; Albert, M.S.; Weintraub, S. Assessment of cognition in mild cognitive impairment: A comparative study. Alzheimers Dement. 2011, 7, 338–355. [Google Scholar] [CrossRef] [Green Version]
  19. Arksey, H.; O’Malley, L. Scoping studies: Towards a methodological framework. Int. J. Soc. Res. Methodol. 2005, 8, 19–32. [Google Scholar] [CrossRef] [Green Version]
  20. Tricco, A.C.; Lillie, E.; Zarin, W.; O’Brien, K.K.; Colquhoun, H.; Levac, D.; Moher, D.; Peters, M.D.J.; Horsley, T.; Weeks, L.; et al. PRISMA extension for scoping reviews (PRISMA-ScR): Checklist and explanation. Ann. Intern. Med. 2018, 169, 467–473. [Google Scholar] [CrossRef] [Green Version]
  21. EndNote X9. Available online: https://clarivate.libguides.com/endnote_training/endnote_online (accessed on 6 February 2020).
  22. Covidence. Systematic Review Software; Veritas Health Innovation: Melbourne, Australia, 2019. [Google Scholar]
  23. Parikh, R.; Mathai, A.; Parikh, S.; Sekhar, G.C.; Thomas, R. Understanding and using sensitivity, specificity and predictive values. Indian J. Ophthalmol. 2008, 56, 45. [Google Scholar] [CrossRef]
  24. Greiner, M.; Pfeiffer, D.; Smith, R. Principles and practical application of the receiver-operating characteristic analysis for diagnostic tests. Prev. Vet. Med. 2000, 45, 23–41. [Google Scholar] [CrossRef]
  25. Charernboon, T. Diagnostic accuracy of the Thai version of the Mini-Addenbrooke’s Cognitive Examination as a mild cognitive impairment and dementia screening test. Psychogeriatrics 2019, 19, 340–344. [Google Scholar] [CrossRef]
  26. Chen, K.-L.; Xu, Y.; Chu, A.-Q.; Ding, D.; Liang, X.N.; Nasreddine, Z.S.; Dong, Q.; Hong, Z.; Zhao, Q.-H.; Guo, Q.-H. Validation of the Chinese version of Montreal Cognitive Assessment basic for screening mild cognitive impairment. J. Am. Geriatr. Soc. 2016, 64, 285–290. [Google Scholar] [CrossRef]
  27. Chiu, H.F.; Zhong, B.L.; Leung, T.; Li, S.W.; Chow, P.; Tsoh, J.; Yan, C.; Xiang, Y.T.; Wong, M. Development and validation of a new cognitive screening test: The Hong Kong Brief Cognitive Test (HKBC). Int. J. Geriatr. Psychiatry 2018, 33, 994–999. [Google Scholar] [CrossRef]
  28. Chiu, P.; Tang, H.; Wei, C.; Zhang, C.; Hung, G.U.; Zhou, W. NMD-12: A new machine-learning derived screening instrument to detect mild cognitive impairment and dementia. PLoS ONE 2019, 14, e0213430. [Google Scholar] [CrossRef] [Green Version]
  29. Chu, L.; Ng, K.; Law, A.; Lee, A.; Kwan, F. Validity of the Cantonese Chinese Montreal Cognitive Assessment in Southern Chinese. Geriatr. Gerontol. Int. 2014, 15, 96–103. [Google Scholar] [CrossRef]
  30. Fung, A.W.-T.; Lam, L.C.W. Validation of a computerized Hong Kong—Vigilance and memory test (HK-VMT) to detect early cognitive impairment in healthy older adults. Aging. Ment. Health 2018, 24, 186–192. [Google Scholar] [CrossRef]
  31. Huang, L.; Chen, K.; Lin, B.; Tang, L.; Zhao, Q.H.; Li, F.; Guo, Q.H. An abbreviated version of Silhouettes test: A brief validated mild cognitive impairment screening tool. Int. Psychogeriatr. 2018, 31, 849–856. [Google Scholar] [CrossRef] [PubMed]
  32. Julayanont, P.; Tangwongchai, S.; Hemrungrojn, S.; Tunvirachaisakul, C.; Phanthumchinda, K.; Hongsawat, J.; Suwichanarakul, P.; Thanasirorat, S.; Nasreddine, Z.S. The Montreal Cognitive Assessment-Basic: A Screening Tool for Mild Cognitive Impairment in Illiterate and Low-Educated Elderly Adults. J. Am. Geriatr. Soc. 2015, 63, 2550–2554. [Google Scholar] [CrossRef]
  33. Kandiah, N.; Zhang, A.; Bautista, D.; Silva, E.; Ting, S.K.S.; Ng, A.; Assam, P. Early detection of dementia in multilingual populations: Visual Cognitive Assessment Test (VCAT). J. Neurol. Neurosurg. Psychiatry 2015, 87, 156–160. [Google Scholar] [CrossRef]
  34. Phua, A.; Hiu, S.; Goh, W.; Ikram, M.K.; Venketasubramanian, N.; Tan, B.Y.; Chen, C.L.-H.; Xu, X. Low Accuracy of Brief Cognitive Tests in Tracking Longitudinal Cognitive Decline in an Asian Elderly Cohort. J. Alzheimers. Dis. 2018, 62, 409–416. [Google Scholar] [CrossRef]
  35. Low, A.; Lim, L.; Lim, L.; Wong, B.; Silva, E.; Ng, K.P.; Kandiah, N. Construct validity of the Visual Cognitive Assessment Test (VCAT)—A cross-cultural language-neutral cognitive screening tool. Int. Psychogeriatr. 2019, 32, 141–149. [Google Scholar] [CrossRef] [Green Version]
  36. Mellor, D.; Lewis, M.; McCabe, M.; Byrne, L.; Wang, T.; Wang, J.; Zhu, M.; Cheng, Y.; Yang, C.; Dong, S.; et al. Determining appropriate screening tools and cut-points for cognitive impairment in an elderly Chinese sample. Psychol. Assess. 2016, 28, 1345–1353. [Google Scholar] [CrossRef]
  37. Ni, J.; Shi, J.; Wei, M.; Tian, J.; Jian, W.; Liu, J.; Liu, T.; Liu, B. Screening mild cognitive impairment by delayed story recall and instrumental activities of daily living. Int. J. Geriatr. Psychiatry 2015, 30, 888–890. [Google Scholar] [CrossRef]
  38. Park, J.; Jung, M.; Kim, J.; Park, H.; Kim, J.R.; Park, J.H. Validity of a novel computerized screening test system for mild cognitive impairment. Int. Psychogeriatr. 2018, 30, 1455–1463. [Google Scholar] [CrossRef]
  39. Feng, X.; Zhou, A.; Liu, Z.; Li, F.; Wei, C.; Zhang, G.; Jia, J. Validation of the Delayed Matching-to-Sample Task 48 (DMS48) in Elderly Chinese. J. Alzheimers Dis. 2018, 61, 1611–1618. [Google Scholar] [CrossRef]
  40. Xu, F.; Ma, J.; Sun, F.; Lee, J.; Coon, D.W.; Xiao, Q.; Huang, Y.; Zhang, L.; Liang, Z.H. The Efficacy of General Practitioner Assessment of Cognition in Chinese Elders Aged 80 and Older. Am. J. Alzheimer’s Dis. 2019, 34, 523–529. [Google Scholar] [CrossRef] [Green Version]
  41. Zainal, N.; Silva, E.; Lim, L.; Kandiah, N. Psychometric Properties of Alzheimer’s Disease Assessment Scale-Cognitive Subscale for Mild Cognitive Impairment and Mild Alzheimer’s Disease Patients in an Asian Context. Ann. Acad. Med. Singap. 2016, 45, 273–283. [Google Scholar]
  42. Apostolo, J.L.A.; Paiva, D.d.S.; da Silva, R.C.G.; dos Santos, E.J.F.; Schultz, T.J. Adaptation and validation into Portuguese language of the six-item cognitive impairment test (6CIT). Aging. Ment. Health 2018, 22, 1184–1189. [Google Scholar] [CrossRef] [PubMed]
  43. Avila-Villanueva, M.; Rebollo-Vazquez, A.; de Leon, J.M.R.-S.; Valenti, M.; Medina, M.; Fernández-Blázquez, M.A. Clinical relevance of specific cognitive complaints in determining mild cognitive impairment from cognitively normal states in a study of healthy elderly controls. Front Aging Neurosci. 2016, 8, 233. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  44. Bartos, A.; Fayette, D. Validation of the Czech Montreal Cognitive Assessment for Mild Cognitive Impairment due to Alzheimer Disease and Czech Norms in 1,552 Elderly Persons. Dement. Geriatr. Cogn. Disord. 2018, 1, 335–345. [Google Scholar] [CrossRef]
  45. Robin, H.; James, H.; Munro, C.; Lisa, D.; Barry, E.; Amy, F.; Laura, L.; Karin, M.; Dustin, W. Brief Cognitive Status Exam (BCSE) in older adults with MCI or dementia. Int. Psychogeriatr. 2015, 27, 221–229. [Google Scholar]
  46. Chipi, E.; Frattini, G.; Eusebi, P.; Mollica, A.; D’Andrea, K.; Russo, M.; Bernardelli, A.; Montanucci, C.; Luchetti, E.; Calabresi, P.; et al. The Italian version of cognitive function instrument (CFI): Reliability and validity in a cohort of healthy elderly. Neurol. Sci. 2018, 39, 111–118. [Google Scholar] [CrossRef]
  47. Duro, D.; Freitas, S.; Tábuas-Pereira, M.; Santiago, B.; Botelho, M.A.; Santana, I. Discriminative capacity and construct validity of the Clock Drawing Test in Mild Cognitive Impairment and Alzheimer’s disease. Clin. Neuropsychol. 2018, 33, 1159–1174. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  48. Lemos, R.; Marôco, J.; Simões, M.; Santana, I. Construct and diagnostic validities of the Free and Cued Selective Reminding Test in the Alzheimer’s disease spectrum. J. Clin. Exp. 2016, 38, 913–924. [Google Scholar] [CrossRef]
  49. Pirrotta, F.; Timpano, F.; Bonanno, L.; Nunnari, D.; Marino, S.; Bramanti, P.; Lanzafame, P. Italian Validation of Montreal Cognitive Assessment. Eur. J. Psychol. Assess. 2015, 31, 131–137. [Google Scholar] [CrossRef]
  50. Rakusa, M.; Jensterle, J.; Mlakar, J. Clock Drawing Test: A Simple Scoring System for the Accurate Screening of Cognitive Impairment in Patients with Mild Cognitive Impairment and Dementia. Dement. and Geriatr. Cogn. Disord. 2018, 45, 326–334. [Google Scholar] [CrossRef]
  51. Ricci, M.; Pigliautile, M.; D’Ambrosio, V.; Ercolani, S.; Bianchini, C.; Ruggiero, C.; Vanacore, N.; Mecocci, P. The clock drawing test as a screening tool in mild cognitive impairment and very mild dementia: A new brief method of scoring and normative data in the elderly. J. Neurol. Sci. 2016, 37, 867–873. [Google Scholar] [CrossRef]
  52. Serna, A.; Contador, I.; Bermejo-Pareja, F.; Mitchell, A.; Fernandez-Calvo, B.; Ramos, F.; Villarejo, A.; Benito-Leon, J. Accuracy of a Brief Neuropsychological Battery for the Diagnosis of Dementia and Mild Cognitive Impairment: An Analysis of the NEDICES Cohort. J. Alzheimers Dis. 2015, 48, 163–173. [Google Scholar] [CrossRef]
  53. Van, E.; Van, J.; Jansen, I.; Van, M.; Floor, Z.; Bimmel, D.; Tjalling, R.; Andringa, G. The Test Your Memory (TYM) Test Outperforms the MMSE in the Detection of MCI and Dementia. Curr. Alzheimer Res. 2017, 14, 598–607. [Google Scholar]
  54. Vyhnálek, M.; Rubínová, E.; Marková, H.; Nikolai, T.; Laczó, J.; Andel, R.; Hort, J. Clock drawing test in screening for Alzheimer’s dementia and mild cognitive impairment in clinical practice. Int. J. Geriatr. 2016, 32, 933–939. [Google Scholar] [CrossRef]
  55. Baerresen, K.M.; Miller, K.J.; Hanson, E.R.; Miller, J.S.; Dye, R.V.; E Hartman, R.; Vermeersch, D.; Small, G.W. Neuropsychological tests for predicting cognitive decline in older adults. Neurodegener. Dis. Manag. 2015, 5, 191–201. [Google Scholar] [CrossRef] [Green Version]
  56. Krishnan, K.; Rossetti, H.; Hynan, L.; Carter, K.; Falkowski, J.; Lacritz, L.; Munro, C.; Weiner, M. Changes in Montreal Cognitive Assessment Scores Over Time. Assessment 2016, 24, 772–777. [Google Scholar]
  57. Malek-Ahmadi, M.; Chen, K.; Davis, K.; Belden, C.; Powell, J.; Jacobson, S.A.; Sabbagh, M.N. Sensitivity to change and prediction of global change for the Alzheimer’s Questionnaire. Alzheimer’s Res. Ther. 2015, 7, 1. [Google Scholar] [CrossRef] [Green Version]
  58. Mansbach, W.; Mace, R. A comparison of the diagnostic accuracy of the AD8 and BCAT-SF in identifying dementia and mild cognitive impairment in long-term care residents. Aging Neuropsychol. Cogn. 2016, 23, 609–624. [Google Scholar] [CrossRef]
  59. Mitchell, J.; Dick, M.; Wood, A.; Tapp, A.; Ziegler, R. The Utility of the Dementia Severity Rating Scale in Differentiating Mild Cognitive Impairment and Alzheimer Disease from Controls. Alzheimer Dis. Assoc. Disord. 2015, 29, 222–228. [Google Scholar] [CrossRef] [Green Version]
  60. Scharre, D.; Chang, S.; Nagaraja, H.; Vrettos, N.; Bornstein, R.A. Digitally translated Self-Administered Gerocognitive Examination (eSAGE): Relationship with its validated paper version, neuropsychological evaluations, and clinical assessments. Alzheimer’s Res. Ther. 2017, 9, 44. [Google Scholar] [CrossRef] [Green Version]
  61. Townley, R.; Syrjanen, J.; Botha, H.; Kremers, W.; Aakre, J.A.; Fields, J.A.; MacHulda, M.M.; Graff-Radford, J.; Savica, R.; Jones, D.T.; et al. Comparison of the Short Test of Mental Status and the Montreal Cognitive Assessment Across the Cognitive Spectrum. Mayo. Clin. Proc. 2019, 94, 1516–1523. [Google Scholar] [CrossRef]
  62. Damin, A.; Nitrini, R.; Brucki, S. Cognitive Change Questionnaire as a method for cognitive impairment screening. Dement. Neuropsychol. 2015, 9, 237–244. [Google Scholar] [CrossRef] [Green Version]
  63. Pinto, T.; Machado, L.; Costa, M.; Santos, M.; Bulgacov, T.M.; Rolim, A.P.P.; Silva, G.A.; Rodrigues-Júnior, A.L.; Sougey, E.B.; Ximenes, R.C. Accuracy and Psychometric Properties of the Brazilian Version of the Montreal Cognitive Assessment as a Brief Screening Tool for Mild Cognitive Impairment and Alzheimer’s Disease in the Initial Stages in the Elderly. Dement. Geriatr. Cogn. Disord. 2019, 47, 366–374. [Google Scholar] [CrossRef]
  64. Radanovic, M.; Facco, G.; Forlenza, O. Sensitivity and specificity of a briefer version of the Cambridge Cognitive Examination (CAMCog-Short) in the detection of cognitive decline in the elderly: An exploratory study. Int. J. Geriatr. Psychiatry 2018, 33, 769–778. [Google Scholar] [CrossRef]
  65. Clarnette, R.; O’Caoimh, R.; Antony, D.; Svendrovski, A.; Molloy, D.W. Comparison of the Quick Mild Cognitive Impairment (Qmci) screen to the Montreal Cognitive Assessment (MoCA) in an Australian geriatrics clinic. Int. J. Geriatr. Psychiatry 2016, 32, 643–649. [Google Scholar] [CrossRef] [Green Version]
  66. Lee, S.; Ong, B.; Pike, K.; Mullaly, E.; Rand, E.; Storey, E.; Ames, D.; Saling, M.; Clare, L.; Kinsella, G.J. The Contribution of Prospective Memory Performance to the Neuropsychological Assessment of Mild Cognitive Impairment. Clin. Neuropsychol. 2016, 30, 131–149. [Google Scholar] [CrossRef]
  67. Georgakis, M.K.; Papadopoulos, F.C.; Beratis, I.; Michelakos, T.; Kanavidis, P.; Dafermos, V.; Tousoulis, D.; Papageorgiou, S.; Petridou, E.T. Validation of TICS for detection of dementia and mild cognitive impairment among individuals characterized by low levels of education or illiteracy: A population-based study in rural Greece. Clin. Neuropsychol. 2017, 31, 61–71. [Google Scholar] [CrossRef]
  68. Iatraki, E.; Simos, P.; Bertsias, A.; Duijker, G.; Zaganas, I.; Tziraki, C.; Vgontzas, A.; Lionis, C. Cognitive screening tools for primary care settings: Examining the ‘Test Your Memory’ and ‘General Practitioner assessment of Cognition’ tools in a rural aging population in Greece. Euro. J. Gen. 2017, 23, 172–179. [Google Scholar] [CrossRef] [Green Version]
  69. Roman, F.; Iturry, M.; Rojas, G.; Barceló, E.; Buschke, H.; Allegri, R.F. Validation of the Argentine version of the Memory Binding Test (MBT) for Early Detection of Mild Cognitive Impairment. Dement. Neuropsychol. 2016, 10, 217–226. [Google Scholar] [CrossRef] [Green Version]
  70. Freedman, M.; Leach, L.; Carmela, M.; Stokes, K.A.; Goldberg, Y.; Spring, R.; Nourhaghighi, N.; Gee, T.; Strother, S.C.; Alhaj, M.O.; et al. The Toronto Cognitive Assessment (TorCA): Normative data and validation to detect amnestic mild cognitive impairment. Alzheimers. Res. Ther. 2018, 10, 65. [Google Scholar] [CrossRef]
  71. Heyanka, D.; Scott, J.; Adams, R. Improving the Diagnostic Accuracy of the RBANS in Mild Cognitive Impairment with Construct-Consistent Measures. Appl. Neuropsychol. Adult 2013, 22, 32–41. [Google Scholar] [CrossRef]
  72. Broche-Perez, Y.; Lopez-Pujol, H.A. Validation of the Cuban Version of Addenbrooke’s Cognitive Examination-Revised for Screening Mild Cognitive Impairment. Dement. Geriatr. Cogn. Disord. 2017, 44, 320–327. [Google Scholar] [CrossRef]
  73. Yavuz, B.; Varan, H.; O’Caoimh, R.; Kizilarslanoglu, M.; Kilic, M.; Molloy, D.; Dogrul, R.T.; Karabulut, E.; Svendrovski, A.; Sagir, A.; et al. Validation of the Turkish Version of the Quick Mild Cognitive Impairment Screen. Am. J. Alzheimer’s Dis. 2017, 32, 145–156. [Google Scholar] [CrossRef]
  74. Rosli, R.; Tan, M.; Gray, W.; Subramanian, P.; Chin, A.V. Cognitive assessment tools in Asia: A systematic review. Int. Psychoger. 2015, 28, 189–210. [Google Scholar] [CrossRef]
  75. Arevalo-Rodriguez, I.; Smailagic, N.; Roqué, M.; Ciapponi, A.; Sanchez-Perez, E.; Giannakou, A.; Pedraza, O.L.; Cosp, X.B.; Cullum, S. Mini-Mental State Examination (MMSE) for the detection of Alzheimer’s disease and other dementias in people with mild cognitive impairment (MCI). Cochrane Database Syst. Rev. 2015. [Google Scholar] [CrossRef]
  76. Ciesielska, N.; Sokołowski, R.; Mazur, E.; Podhorecka, M.; Polak-Szabela, A.; Kędziora-Kornatowska, K. Is the Montreal Cognitive Assessment (MoCA) test better suited than the Mini-Mental State Examination (MMSE) in mild cognitive impairment (MCI) detection among people aged over 60? Meta-analysis. Psychiatr. Pol. 2016, 50, 1039–1052. [Google Scholar] [CrossRef]
  77. Nasreddine, Z.S.; Phillips, N.A.; Bédirian, V.; Charbonneau, S.; Whitehead, V.; Collin, I.; Cumming, J.L.; Chertkow, H. The Montreal Cognitive Assessment, MoCA: A brief screening tool for mild cognitive impairment. J. Am. Geriatr. Soc. 2005, 53, 695–699. [Google Scholar] [CrossRef]
  78. Alzheimer’s Disease. National Institute on Aging. Available online: https://www.nia.nih.gov/health/alzheimers-disease-fact-sheet (accessed on 27 October 2020).
  79. What Is Mild Cognitive Impairment? Available online: https://www.nia.nih.gov/health/what-mild-cognitive-impairment (accessed on 8 October 2020).
  80. Csukly, G.; Sirály, E.; Fodor, Z.; Horváth, A.; Salacz, P.; Hidasi, Z.; Csibri, É.; Rudas, G.; Szabó, Á. The Differentiation of Amnestic Type MCI from the Non-Amnestic Types by Structural MRI. Front Aging Neurosci. 2016, 8, 1–52. [Google Scholar] [CrossRef] [Green Version]
  81. Ghose, S.; Das, S.; Poria, S.; Das, T. Short test of mental status in the detection of mild cognitive impairment in India. Indian J. Psychiatry 2019, 61, 184–191. [Google Scholar]
Figure 1. PRISMA flow chart for study selection process.
Table 1. Validity criteria for cognitive tools.
Criteria * | Interpretation | Range (%)
Sn and Sp | Excellent | 91–100
Sn and Sp | Good | 76–90
Sn and Sp | Fair | 50–75
Sn and Sp | Poor | <50
AUC | Excellent | 91–100
AUC | Good | 81–90
AUC | Fair | 71–80
AUC | Poor | <70
PPV and NPV | Excellent | 91–100
PPV and NPV | Good | 76–90
PPV and NPV | Fair | 50–75
PPV and NPV | Poor | <50
* The criteria for Sn, Sp, PPV and NPV were decided based on researchers’ consensus. The criterion for AUC was adapted from Safari S et al. [13].
Table 2. Included studies.
No. | Authors, Year, Country | Study Design | Participants’ Characteristics: Age (Mean ± SD or Range); % Female; Education Years (Mean ± SD or Range) | Cognitive Tool | Comparison Standard
1Apostolo JLA et al., 2018, Portugal [42]Cross-sectional 67.7 ± 9.770.430.7% 0–2 years, 43.3% 3–6 years, 26% 7–18 years6CITMMSE
2Avila-Villanueva M et al., 2016, Spain [43]Cross-sectional 74.07 ± 3.86311.15 ± 6.69EMQCDR, NIA-AA criteria
3Baerresen KM et al., 2015, US [55]Cohort60.84 ± 10.766016.67 ± 2.94 BSRT, RCFT, TMTRigorous diagnostic methods: MRI scan, clinical consensus of neurology, geriatric psychiatry, neuropsychology and radiology staff
4Bartos A et at., 2018, Czech Republic [44]Cross-sectional70 ± 85912–17MoCANIA-AA criteria
5Bouman Z et al., 2015 Netherlands [45]Cross-sectional76.6 ± 5.9~46~66% low level, 19% average level, 16% high levelBCSEMMSE
6Broche-Perez Y et al., 2018, Cuba [72]Cross-sectional73.28 ± 7.16~679.82 ± 4.23 ACE, MMSEPetersen’s criteria, CDR
7Charernboon T, 2019, Thailand [25]Cross-sectional64.9 ± 6.576.710.2 ± 4.9ACEThai version of MMSE
8Chen K-L et al., 2016, China [26]Cross-sectional68.2 ± 9.1~664.8 ± 1.7MMSE, MoCACDR
9Chipi E et al., 2017, Italy [46]Cross-sectional70.9 ± 5.161.211.5 ± 4.5CFIMMSE
10Chiu HF et al., 2017, Hong Kong [27]Cross-sectional75.4 ± 6.656.66.5 ± 3.8 HKBC, MoCA, MMSEDSM-5
11Chiu P et al., 2019, Taiwan [28]Cross-sectional67.8 ± 10.747.26.9 ± 5.1MMSE, NMD-12, MoCA, IADL, AD8, CASI, NPINIA-AA criteria, CDR
12Chu L et al., 2015, Hong Kong [29]Cross-sectional72.2 ± 6.1876.97 ± 4.69MMSEPetersen’s criteria
13Clarnette R et al., 2016, Australia [65]Cross-sectional50–95524–21Qmci, MoCACDR
14Damin A et al., 2015 Brazil [62]Cohort68.27 ± 7.34N/A7.48 ± 4.48CCQMMSE, CAMCog, CDR and the brief cognitive screening battery
15Duro D et al., 2018, Portugal [47]Cross-sectional69.47 ± 8.8963.56.69 ± 4.14CDTNIA-AA criteria
16Freedman M et al., 2018 [70]Cross-sectional75.3 ± 7.9~6715.02 ± 3.2TorCANIA-AA criteria
17Fung AW-T et al., 2018, Hong Kong [30]Cohort68.8 ± 6.358.49.8 ± 4.8HK-VMTCombined clinical and cognitive criteria suitable for local older population, CDR
18Georgakis MK et al., 2017, Greece [67]Cross-sectional74.3 ± 6.651.64.5 ± 2.6TICS5-objects test
19Heyanka D et al., 2015 [71]Cross-sectional71.5 ± 7.5~4314.8 ± 3.2RBANSPetersen’s criteria
20Huang L et al., 2018, China [31]Cross-sectional65.71 ± 8.10~5612.78 ± 2.74RCFT, MoCA, VOSP, BNT, STT, JLO, STPetersen’s criteria, CDR
21Iatraki E et al., 2017, Greece [68]Cross-sectional71.0 ± 6.964.66.4 ± 3.1TYM, GPCogUnclear
22Julayanont P et al., 2015, Thailand [32]Cross-sectional66.6 ± 6.7843.6 ± 1.1MoCA, MMSECDR global
23Khandiah N et al., 2015, Singapore [33]Cohort67.8 ± 8.8646.110.5 ± 6.0VCATPetersen’s criteria, CDR, NIA-AA criteria
24Phua A et al., 2017, Singapore [34]Cross-sectional66.8 ± 5.5629.3 ± 4.9MoCA, MMSEDSM-IV, CDR global
25Krishnan K et al., 2016, US [56]Cohort58–776415.2 ± 2.7MoCAHistory, clinical examination, CDR, and a comprehensive neuropsychological battery based on published criteria
26Lee S et al., 2016, Australia [66]Cross-sectionalMedian 7353Median 14CVLT, The Envolope Task, PRMQ, Single-item Memory Scale, MMSEHVLT-R, Logical Memory, Wechsler Memory Scale Third Edition, Verbal Paired Associates, Wechsler Memory Scale Fourth Edition, RCFT, CDR, ADFACS, NINCDS-ADRDA criteria, MMSE
27Lemos R et al., 2016, Portugal [48]Cohort70.22 ± 7.6552.57.7 ± 5.01FCSRTMMSE, CDR
28Low A et al., 2019, Singapore [35]Cross-sectional61.47 ± 7.197012.36 ± 3.76VCATNIA-AA criteria, CDR, MRI scan
29Malek-Ahmadi M et al., 2015, US [57]Longitudinal Cohort81.70 ± 7.25~4814.74 ± 2.54MMSE, AQ, FAQPetersen’s criteria
30Mansbach W et al., 2016, US [58]Cohort82.33 ± 9.156484% at least 12 years educationBCAT, AD8Unclear, diagnosed by licensed psychologist’s evaluations
31Mellor D et al., 2016, China [36]Cohort72.54 ± 8.4057.99.12 ± 4.36MoCA, MMSEPetersen’s criteria
32Mitchell J et al., 2015, US [59]Case–control75.9 ± 8.550.915.2 ± 2.9FAQ, DSRS, CWLT, BADLSWMS-III Logical Memory test or the CERAD Word List
33Ni J et al., 2015, China [37]Cross-sectional62.57 ± 8.61~5912.04 ± 3.34DSRHistory and physical exams, MMSE, story recall (immediate and 30 min delayed), CDR, ADL
34Park J et al., 2018, South Korea [38]Cross-sectional74.93 ± 6.9656.35.83 ± 4.52mSTS-MCIMoCA-K, MMSE-K, neuropsychological battery (Rey Auditory Verbal Learning Test and Delayed Visual Reproduction and Logical Memory, two subtests of the Wechsler Memory Scale)
35Pinto T et al., 2019, Brazil [63]Cross-sectional73.9 ± 6.276.410.9 ± 4.4MoCAStatistically compared
36Pirrotta F et al., 2014, Italy [49]Cross-sectional70.5 ± 11.558.28.1 ± 4.6MoCAMMSE
37Radanovic M et al., 2017, Brazil [64]Cohort~68.7 ± 5.85~79~10.35 ± 2.45CAMCogPetersen’s criteria
38Rakusa M et al., 2018, Slovenia [50]CohortMedian 74N/A65% Secondary school, 23% University, 12% Primary SchoolMMSE, CDTNIA-AA criteria
39Ricci M et al., 2016, Italy [51]Cohort73.3 ± 6.9N/A7.2 ± 4.2CDTNINCDS- ADRDA criteria
40Roman F et al., 2016, Argentina [69]Cross-sectional67.5 ± 8.3N/A11.5 ± 4.1MBTSpanish Version of MMSE, CDT, Signoret Verbal Memory Battery, TMT, VF, Spanish Version of BNT, and the Digit Span forward and backward
41Scharre D et al., 2017, US [60]Investigational75.2 ± 7.36715.1 ± 2.7SAGEUnclear
42Serna A et al., 2015, Spain [52]Cohort78.10 ± 5.0459.364.2% illiteracy/read and write, 35.8% primary/secondary or higherSemantic Fluency/VF, Logical MemoryInternational Work Group criteria, MMSE
43Townley R et al., 2019 US [61]Cohort~72.4 ± 8.9547–51~ 15.05 ± 2.65STMS, MoCAClinical consensus
44Van de Zande E et al., 2017, Netherlands [53]Cross-sectional73.05 ± 8.62~5210.34 ± 3.66 MMSE, TYMPetersen’s criteria
45Vyhnálek M et al., 2016, Czech Republic [54]Cross-sectional71.20 ± 6.77~6415.30 ± 2.95CDTCDR
46Feng X et al., 2017, China [39]Cross-sectional65.99 ± 10.4562.592.88% 0 years, 7.19% 1–6 years, 51.08% 7–12 years, 38.85% ≥12 yearsDMS48Chinese Version of MMSE, MoCA, CDR, NIA-AA criteria
47Xu F et al., 2019, China [40]Cross-sectional82.87 ± 3.13433.462.8% having bachelor’s degreesMMSE, GPCogNIA-AA criteria
48Yavuz B et al., 2017 Turkey [73]Cross-sectional75.4 ± 6.9650–21 (Median 5)MMSE, QmciPetersen’s criteria
49Zainal N et al., 2016, Singapore [41]Cross-sectional61.81 ± 6.9668.811.70 ± 3.13ADAS-CogPetersen’s criteria, CDR
6CIT: Six-item Cognitive Impairment Test; MMSE: Mini-Mental State Examination; EMQ: Everyday Memory Questionnaire; CDR: Clinical Dementia Rating; NIA-AA: National Institute on Aging-Alzheimer’s Association; BSRT: Buschke Selective Reminding Test; RCFT: Rey–Osterrieth Complex Figure Test; TMT: Trail Making Test; MRI: Magnetic Resonance Imaging; MoCA: Montreal Cognitive Assessment; BCSE: Brief Cognitive Status Exam; ACE: Addenbrooke’s Cognitive Examination. Abbreviations list for Table 2: CFI: Cognitive Function Instrument; HKBC: Hong Kong Brief Cognitive Test; DSM-5: Diagnostic and Statistical Manual of Mental Disorders, 5th Edition; NMD-12: Normal-MCI-Dementia 12 Questionnaire; IADL: Instrumental Activities of Daily Living; AD8: Dementia Screening Interview; CASI: Cognitive Abilities Screening Instrument; NPI: Neuropsychiatric Inventory; Qmci: Quick Mild Cognitive Impairment screen; CCQ: Cognitive Change Questionnaire; CAMCog: Cambridge Cognitive Examination; TorCA: Toronto Cognitive Assessment; HK-VMT: Hong Kong—Vigilance and Memory Test; TICS: Telephone Interview for Cognitive Status; RBANS: Repeatable Battery for the Assessment of Neuropsychological Status; VOSP: Visual Object and Space Perception; BNT: Boston Naming Test; STT: Shape Trail Test; JLO: Judgment of Line Orientation; ST: Similarity Test; TYM: Test Your Memory; GPCog: General Practitioner assessment of Cognition; DSM-IV: Diagnostic and Statistical Manual of Mental Disorders, 4th Edition; CVLT: California Verbal Learning Test; PRMQ: Prospective and Retrospective Memory Questionnaire; HVLT-R: Hopkins Verbal Learning Test—Revised; ADFACS: Alzheimer’s Disease Functional Assessment and Change Scale; NINCDS-ADRDA: National Institute of Neurological and Communicative Disorders and Stroke and the Alzheimer’s Disease and Related Disorders Association; FCSRT: Free and Cued Selective Reminding Test; VCAT: Visual Cognitive Assessment Test; AQ: Alzheimer’s Questionnaire; FAQ: Functional Activities Questionnaire; BCAT: Brief Cognitive Assessment Tool; BLAT: Blind Learning Aptitude Test; DSRS: Dementia Severity Rating Scale; CWLT: CERAD Word List Memory Test; BADLS: Bristol Activities of Daily Living Scale; DSR: Delayed Story Recall; WMS-III: Wechsler Memory Scale-3rd Edition; CERAD: Consortium to Establish a Registry for Alzheimer’s Disease; ADL: Activities of Daily Living; mSTS-MCI: Mobile Screening Test System for screening Mild Cognitive Impairment; MoCA-K: Korean version of MoCA; MMSE-K: Korean version of MMSE; CDT: Clock Drawing Test; MBT: Memory Binding Test; VF: Verbal Fluency; SAGE: Self-Administered Gerocognitive Examination; STMS: Short Test of Mental Status; DMS48: Delayed Matching-to-Sample Task 48; ADAS-Cog: Alzheimer’s Disease Assessment Scale-Cognitive subscale.
Table 3. Included Tools and Their Study Characteristics.
No. | Cognitive Tool | Article No. | Authors, Year, Country | Settings | Administration Method
16CIT1Apostolo JLA et al., 2018, Portugal [42]Community, Primary health care unitsBy untrained examiner (post-graduate student)
2EMQ1Avila-Villanueva M et al., 2016, Spain [43]Community Self-administered
3BSRT2Baerresen KM et al., 2015, US [55]CommunityNR
4RCFT2Baerresen KM et al., 2015, US [55]CommunityNR
2Huang L et al., 2018, China [31]Memory ClinicBy trained examiner
5TMT4Baerresen KM et al., 2015, US [55]CommunityNR
6MoCA8Bartos A et al., 2018, Czech Republic [44]CommunityBy trained examiner
10Chen K-L et al., 2016, China [26]HospitalBy trained examiner
12Chiu HF et al., 2017, Hong Kong [27]CommunityBy untrained examiner (research assistant)
13Chu L et al., 2015, Hong Kong [29]Memory Clinic, CommunityBy examiner
6MoCA13Clarnette R et al., 2016, Australia [65]Geriatrics ClinicBy trained professionals (geriatrician)
22Julayanont P et al., 2015, Thailand [32]Community HospitalBy trained professionals (nurse with expertise in cognitive assessment)
24Phua A et al., 2017, Singapore [34]Memory ClinicNR
25Krishnan K et al., 2016, US [56]Community, Clinical CareNR
31Mellor D et al., 2016, China [36]CommunityBy trained professionals (psychologist or attending level psychiatrist)
35Pinto T et al., 2019, Brazil [63]Health Care CentresBy trained professionals (neurologist researcher)
36Pirrotta F et al., 2014, Italy [49]Clinical, ResearchBy trained professionals (psychologist)
43Townley R et al., 2019 US [61]CommunityNR
48Yavuz B et al., 2017, Turkey [73]Geriatrics ClinicBy trained professionals (psychologist)
11Chiu P et al., 2019, Taiwan [28]Health Care CentresBy professionals (neuropsychologist)
20Huang L et al., 2018, China [31]Memory ClinicBy trained examiner
7BCSE5Bouman Z et al., 2015 Netherlands [45]Memory ClinicBy untrained examiner
8ACE6Broche-Perez Y et al., 2018, Cuba [72]Primary Care Community Centre: nursing homes (permanent residences for the elderly) and day care centresBy trained professionals (neurologist and geriatrician)
7Charernboon T, 2019, Thailand [25]Memory ClinicNR
9MMSE6Broche-Perez Y et al., 2018, Cuba [72]Primary Care Community Centre: nursing homes (permanent residences for the elderly) and day care centresBy professionals (neurologist and geriatrician)
8Chen K-L et al., 2016, China [26]HospitalBy trained examiner
10Chiu HF et al., 2017, Hong Kong [27]CommunityBy untrained examiner (research assistant)
12Chu L et al., 2015, Hong Kong [29]Memory Clinic, CommunityBy examiner
22Julayanont P et al., 2015, Thailand [32]Community HospitalBy trained professionals (nurse with expertise in cognitive assessment)
24Phua A et al., 2017, Singapore [34]Memory ClinicNR
26Lee S et al., 2016, Australia [66]Community, Memory ClinicUnclear
31Mellor D et al., 2016, China [36]CommunityBy trained professionals (psychologist or psychiatrist)
38Rakusa M et al., 2018, Slovenia [50]CommunityNR
44Van de Zande E et al., 2017, Netherlands [53]Memory ClinicBy trained examiner
47Xu F et al., 2019, China [40]CommunityBy trained examiner
48Yavuz B et al., 2017 Turkey [73]Geriatrics ClinicBy trained examiner
11Chiu P et al., 2019, Taiwan [28]Health Care CentresBy professionals (neuropsychologist)
29Malek-Ahmadi M et al., 2015, US [57]CommunityNR
10CFI9Chipi E et al., 2017, Italy [46]Memory ClinicBy professionals (neuropsychologist)
11RBANS19Heyanka D et al., 2015 [71]Medical CentreNR
12HKBC10Chiu HF et al., 2017, Hong Kong [27]CommunityBy untrained examiner (research assistant)
13NMD-1211Chiu P et al., 2019, Taiwan [28]Health Care CentresBy professionals (neuropsychologist)
14Qmci13Clarnette R et al., 2016, Australia [65]Geriatrics ClinicBy trained professionals (geriatrician)
48Yavuz B et al., 2017 Turkey [73]Geriatrics ClinicBy trained examiner
15CCQ14Damin A et al., 2015, Brazil [62]ClinicalBy professionals (physician) or self-administered
16CDT15Duro D et al., 2018, Portugal [47]Tertiary CentreBy professionals (neuropsychologist)
38Rakusa M et al., 2018, Slovenia [50]CommunityNR
39Ricci M et al., 2016, Italy [51]Memory Clinic, CommunityBy untrained examiner
45Vyhnálek M et al., 2016, Czech Republic [54]Memory ClinicBy professionals (neuropsychologist, neurologist, resident)
17HK-VMT17Fung AW-T et al., 2018, Hong Kong [30]CommunitySelf-administered (touch-screen laptop)
18TorCA16Freedman M et al., 2018 [70]Suitable for use in any medical settingBy trained examiner or professionals (health care professionals)
19TICS18Georgakis MK et al., 2017, Greece [67]Community, Health CentreBy professionals (health care professionals)
20VOSP20Huang L et al., 2018, China [31]Memory ClinicBy trained examiner
21TYM21Iatraki E et al., 2017, Greece [68]Rural Primary CareBy trained examiner
44Van de Zande E et al., 2017, Netherlands [53]Memory Clinic, Primary Clinical Setting (GP practice, home care) Self-administered (under supervision)
22GPCog21Iatraki E et al., 2017, Greece [68]Rural Primary CareBy trained examiner
47Xu F et al., 2019, China [40]Outpatient Clinical, Primary CareBy trained examiner
23CVLT26Lee S et al., 2016, Australia [66]Community, Memory ClinicNR
24The Envelope Task26Lee S et al., 2016, Australia [66]Community, Memory ClinicNR
25PRMQ26Lee S et al., 2016, Australia [66]Community, Memory ClinicNR
26Single-item Memory Scale26Lee S et al., 2016, Australia [66]Community, Memory ClinicNR
27FCSRT27Lemos R et al., 2016, Portugal [48]Community, HospitalNR
28AQ29Malek-Ahmadi M et al., 2015, US [57]Designed for ease of use in primary care settingNR
29FAQ29Malek-Ahmadi M et al., 2015, US [57]CommunityNR
32Mitchell J et al., 2015, US [59]CommunityBy professionals (clinician)
30BCAT30Mansbach W et al., 2016, US [58]Long-Term CareBy professionals
31AD811Chiu P et al., 2019, Taiwan [28]Health Care CentresBy professionals (neuropsychologist)
30Mansbach W et al., 2016, US [58]Long-Term CareSelf-administered
32DSRS32Mitchell J et al., 2015, US [59]CommunityBy professionals (clinician)
33CWLT32Mitchell J et al., 2015, US [59]CommunityBy professionals (clinician)
32 + 33CWLT-5 + DSRS32Mitchell J et al., 2015, US [59]CommunityBy professionals (clinician)
34BADLS32Mitchell J et al., 2015, US [59]CommunityBy professionals (clinician)
35DSR33Ni J et al., 2015, China [37]Memory ClinicNR
36mSTS-MCI34Park J et al., 2018, South Korea [38]Clinical settings, Primary care, Geriatrics Outpatient ClinicsBy professionals (occupational therapist), using mobile application
37CAMCog37Radanovic M et al., 2017, Brazil [64]ClinicalBy professionals (physician)
38MBT40Roman F et al., 2016, Argentina [69]ClinicalNR
39SAGE41Scharre D et al., 2017, US [60]Community, Clinic, ResearchSelf-administered (paper-based or on tablet)
40Semantic Fluency/VF42Serna A et al., 2015, Spain [52]CommunityNR
41Logical Memory42Serna A et al., 2015, Spain [52]CommunityNR
42STMS43Townley R et al., 2019 US [61]Community, Primary CareNR
43DMS4846Feng X et al., 2017, China [39]Memory ClinicBy trained examiner
44ADAS-Cog49Zainal N et al., 2016, Singapore [41]Clinical Trials, ClinicBy trained examiner
45IADL11Chiu P et al., 2019, Taiwan [28]Health Care CentresBy professionals (neuropsychologist)
46CASI11Chiu P et al., 2019, Taiwan [28]Health Care CentresBy professionals (neuropsychologist)
47NPI11Chiu P et al., 2019, Taiwan [28]Health Care CentresBy professionals (neuropsychologist)
48BNT20Huang L et al., 2018, China [31]Memory ClinicBy trained examiner
49STT20Huang L et al., 2018, China [31]Memory ClinicBy trained examiner
50JLO20Huang L et al., 2018, China [31]Memory ClinicBy trained examiner
51ST20Huang L et al., 2018, China [31]Memory ClinicBy trained examiner
52VCAT23Khandiah N et al., 2015, Singapore [33]Community, ClinicalBy trained examiner
28Low A et al., 2019, Singapore [35]Community, Memory ClinicBy professionals (psychologist)
Note: ‘Article No.’ extracted from Table 2. Abbreviation list for Table 3: NR: not reported.
Table 4. Psychometric Properties of Cognitive Tools to Detect Mild Cognitive Decline.
No. | Cognitive Tool | Version of Tools | Author, Year, Country | Range of Total Score | Cut-Off Point * | Sn/Sp (%) | Validity [AUC (%); PPV/NPV (%)] | Reliability | Affecting Factors | Duration (mins)
16CITPortuguese VersionApostolo JLA et al., 2018, Portugal [42]8–11≤10 (all literacy level)82.78/84.849184.3/83.3 High test–retest reliability, Strong internal consistency Literacy Level2 to 3
Portuguese VersionApostolo JLA et al., 2018, Portugal [42]4–15≤12 (education 0–2 years)93.44/68.099488.4/80High test–retest reliability, Strong internal consistency Literacy Level2 to 3
Portuguese VersionApostolo JLA et al., 2018, Portugal [42]9–12.03≤10 (education 3–6 years)88/86.239582.2/90.8High test–retest reliability, Strong internal consistency Literacy Level2 to 3
2EMQ-Avila-Villanueva M et al., 2016, Spain [43]NRNRNRNRNRNRNRNR
3BSRT-Baerresen KM et al., 2015, US [55]NRNRPredicted conversion to MCI and the conversion to ADNRNR
4RCFT-Baerresen KM et al., 2015, US [55]0–36NRPredicted conversion from normal aging to MCINRNR
Rey Complex Figure Test Copy (CFT-C)Huang L et al., 2018, China [31]0–36≤3246.9/76.981.6NRNRNRNR
5TMTTest B (TMT-B)Baerresen KM et al., 2015, US [55]NRNRPredicted conversion to MCI and the conversion to ADNRNR
6MoCACzech Version (MoCA-CZ)Bartos A et al., 2018, Czech Republic [44]0–30≤25 94/6289NRNRNR12 ± 3
Czech Version (MoCA-CZ)Bartos A et al., 2018, Czech Republic [44]0–30≤2487/7289NRNRNR12 ± 3
Chinese Version (MoCA-BC)Chen K-L et al., 2016, China [26]0–30≤19 (education ≤6 years)87.9/8189.6NRNRNRNR
Chinese Version (MoCA-BC)Chen K-L et al., 2016, China [26]0–30≤22 (education 7–12 years)92.9/91.294.9NRNRNRNR
Chinese Version (MoCA-BC)Chen K-L et al., 2016, China [26]0–30≤24 (education >12 years)89.9/81.591.6NRNRNRNR
Cantonese VersionChiu HF et al., 2017, Hong Kong [27]0–30≤19/2080/8691.394/98NREducationNR
6MoCACantonese Chinese VersionChu L et al., 2015, Hong Kong [29]0–3022/2378/7395NRHigh test–retest reliability, High internal consistency, High inter-rater reliabilityEducation (sex and age not associated)≤10
-Clarnette R et al., 2016, Australia [65]0–30≤2387/8084–9295/58NRNRNR
Basic Version (MoCA-B)Julayanont P et al., 2015, Thailand [32]0–3024/2586/86NR85/82Good internal consistencyDesigned to be less dependent upon education and literacy15 to 21
-Phua A et al., 2017, Singapore [34]0–30NR63/77NR70/65NRNRNR
-Krishnan K et al., 2016, US [56]0–30≤2651/96NRNRGood test–retest reliabilityNR10
6MoCA-Mellor D et al., 2016, China [36]0–30≤22.587/738954.5/93.6NRAge, Gender, EducationNR
Brazilian Version (MoCA-BR)Pinto T et al., 2019, Brazil [63]0–30NRNRNRNRGood internal consistency, Good test–retest reliability, Excellent inter-examiner reliabilityNR13.1 ± 2.7
Italian versionPirrotta F et al., 2014, Italy [49]0–30≤15.583/9796NRHigh intra-rater reliability, High test–retest agreement, Excellent inter-rater reliabilityNR10
-Townley R et al., 2019 US [61]0–30≤2689/47Incident MCI: 70, a-MCI: 90, na- MCI: 84NRNRNRNR
6MoCA-Yavuz B et al., 2017, Turkey [73]0–30<2659/726972/71NRNR10
-Chiu P et al., 2019, Taiwan [28]0–3019/2068/6567NRNRAge, EducationNR
-Huang L et al., 2018, China [31]0–30≤2481.5/65.181.8NRNRNRNR
7BCSEDutch VersionBouman Z et al., 2015 Netherlands [45]0–58≤4681/80NR61/92Excellent inter-rater reliability, High internal consistencyAge5 to 15
Dutch VersionBouman Z et al., 2015 Netherlands [45]0–58≤2784/76NR57/92Excellent inter-rater reliability, High internal consistencyAge5 to 15
8ACECuban Revised Version (ACE-R)Broche-Perez Y et al., 2018, Cuba [72]0–100≤8489/7293NRGood internal consistency reliabilityAge, Years of SchoolingA few mins more than MMSE
Thai Mini VersionCharernboon T, 2019, Thailand [25]0–10021/2295/859080.9/96.2High internal consistencyNR8 to 13
9MMSE-Broche-Perez Y et al., 2018, Cuba [72]1–3025/2656/8363NRNRNRNR
-Chen K-L et al., 2016, China [26]1–30≤2686.2/60.379.7NRNRNRNR
-Chen K-L et al., 2016, China [26]1–30≤2778.6/52.273.6NRNRNRNR
-Chen K-L et al., 2016, China [26]1–30≤2876.4/53.472.1NRNRNRNR
Cantonese VersionChiu HF et al., 2017, Hong Kong [27]1–3025/2683/8490.493/98NRNRNR
9MMSEChinese VersionChu L et al., 2015, Hong Kong [29]1–3027/2867/8378NRNREducationNR
Thai VersionJulayanont P et al., 2015, Thailand [32]1–30NR33/8870.2NRNRNRNR
-Phua A et al., 2017, Singapore [34]1–30NR70/59NR64/66NRNRNR
-Lee S et al., 2016, Australia [66]1–30<2975.7/68.977NRNREmotional status indices (anxiety and depression)NR
-Mellor D et al., 2016, China [36]1–30<25.568/838560.5/87.4NRAge, Gender, Educational LevelNR
9MMSE-Rakusa M et al., 2018, Slovenia [50]1–3025/2620/9363NRNRNRNR
-Van de Zande E et al., 2017, Netherlands [53]1–30≤2357/9868.596/69.5NREducation5 to 10
-Xu F et al., 2019, China [40]1–3027 ≤ score ≤ 2959/58.2NRNRNRNR5 to 10
Standardised Mini Version (SMMSE)Yavuz B et al., 2017 Turkey [73]1–30≤2336/947187/56NRNRNR
-Chiu P et al., 2019, Taiwan [28]1–3026/2764/7066NRNRAge, EducationNR
-Malek-Ahmadi M et al., 2015, US [57]1–30NRSmall sensitivity to change (helpful in detecting change over time)56% ReliabilityNRNR
10CFIItalian VersionChipi E et al., 2017, Italy [46]0–14NRNRAccurateReliableNRNR
11RBANS-Heyanka D et al., 2015 [71]0–100NR52–93/35–93 (based on different subtests)NR16–91/72–94 (based on different subtests)NRNRNR
12HKBC-Chiu HF et al., 2017, Hong Kong [27]0–3021/2290/8695.594/99Good test–retest reliability, Excellent interrater reliability, Satisfactory internal consistencyNR7
13NMD-12-Chiu P et al., 2019, Taiwan [28]NR1/287/9394NRNRNRNR
14Qmci-Clarnette R et al., 2016, Australia [65]0–100≤6093/8091–9795/73NRNR4.2
14QmciTurkish Version (Qmci-TR)Yavuz B et al., 2017 Turkey [73]0–100<6267/818080/68Strong inter-rater reliability, Strong test–retest reliability NR3 to 5
15CCQ8-item CCQ (CCQ8)Damin A et al., 2015 Brazil [62]NR>197.6/66.7High Accuracy78.4/95.6NRNRNR
8-item CCQ (CCQ8)Damin A et al., 2015 Brazil [62]NR≥278/93.9High Accuracy94.1/77.5NRNRNR
16CDT-Duro D et al., 2018, Portugal [47] 0–18 (Babins System)≤1560/6263.861/61High inter-rater reliabilityNRNR
-Duro D et al., 2018, Portugal [47] 0–10 (Rouleau System)≤964/5863.560/62High inter-rater reliabilityNRNR
-Rakusa M et al., 2018, Slovenia [50]0–4≤369/9181NRNRAge, Education<2
16CDT-Ricci M et al., 2016, Italy [51]0–5≤1.3076/84Good Diagnostic AccuracyExcellent inter-rater reliabilityNRVery short and easy
-Vyhnálek M et al., 2016, Czech Republic [54]NRNR62–84/47–63NRNRNRNRNR
17TorCA-Freedman M et al., 2018 [70]0–295≤27580/7979% AccuracyGood test–retest reliability, Adequate internal consistencyNRMedian 34
18HK-VMT-Fung AW-T et al., 2018, Hong Kong [30]0–4021/2286.1/75.379.3NRGood test–retest reliabilityEducation15
-Fung AW-T et al., 2018, Hong Kong [30]0–40<22 (education <6 years)71.1/87.379.3NRGood test–retest reliabilityEducation15
18HK-VMT-Fung AW-T et al., 2018, Hong Kong [30]0–40<25 (education >6 years)71.4/76.579.3NRGood test–retest reliabilityEducation15
19TICS-Georgakis MK et al., 2017, Greece [67]0–4126/2745.8/73.756.930.6/84.3Adequate internal consistency, Very high test–retest reliabilityAge, EducationNR
20VOSPAbbreviated version of the Silhouettes subtest (Silhouettes-A)Huang L et al., 2018, China [31]0–15≤1079.6/65.181.6NRHigh internal consistency/inter-rater reliability, Excellent test–retest reliabilityGender, Education (Unaffected by age)3 to 5
21TYMGreek VersionIatraki E et al., 2017, Greece [68]0–5035/3680/77NR47/93Good internal consistencyAge, Education5 to 10
Dutch VersionVan de Zande E et al., 2017, Netherlands [53]0–50≤3874/9179.587.9/79.2Good inter-rater reliabilityEducation10 to 15
22GPCogGreek Version of GPCog-PatientIatraki E et al., 2017, Greece [68]0–97/889/61High discrimination accuracy for high education level population; Moderate accuracy for low education level population38/95Good internal consistencyAge, Education<5
Chinese Version of 2-stage method (GPCOG-C) Xu F et al., 2019, China [40]GPCOG-patient: 0–9; Informant Interview: 0–9GPCOG-patient: 5–8; Informant Interview: >462.3/84.6NRNRNRUnaffected by education, gender and age4 to 6
23CVLTSecond Edition (CVLT-II)Lee S et al., 2016, Australia [66]0–16<882.9/93.294NRNREmotional status indices (anxiety and depression)NR
24The Envelope Task-Lee S et al., 2016, Australia [66]0–4<364.3/91.983NRNREmotional status indices (anxiety and depression)NR
25PRMQ-Lee S et al., 2016, Australia [66]0–80<4650/75.766NRNREmotional status indices (anxiety and depression)NR
26Single-item Memory Scale-Lee S et al., 2016, Australia [66]0–5<355.7/89.276NRNREmotional status indices (anxiety and depression)NR
27FCSRTPortuguese VersionLemos R et al., 2016, Portugal [48]ITR: 0–48≤3572/8381.881/75NRUnaffected by literacy level~2
Portuguese VersionLemos R et al., 2016, Portugal [48]DTR: 0–16≤1276/8082.479/77NRUnaffected by literacy level~30
28AQ-Malek-Ahmadi M et al., 2015, US [57]0–27NRSmall sensitivity to change (helpful in detecting change over time)65% ReliabilityNRNR
29FAQ-Malek-Ahmadi M et al., 2015, US [57]0–30NRSmall sensitivity to change (helpful in detecting change over time)63% ReliabilityNRNR
-Mitchell J et al., 2015, US [59]0–30NR47/82NRNRNRNRNR
30BCAT Short Form (BCAT-SF)Mansbach W et al., 2016, US [58]0–21≤1982/808693/57Good internal consistency, ReliableNR3 to 4
31AD8-Chiu P et al., 2019, Taiwan [28]0–81/278/9392NRNRUnaffected by age, educationNR
-Mansbach W et al., 2016, US [58]0–8≥178/305978/29Acceptable internal consistencyNRNR
-Mansbach W et al., 2016, US [58]0–8≥268/635983/34Acceptable internal consistencyNRNR
31AD8-Mansbach W et al., 2016, US [58]0–8≥347/635981/27Acceptable internal consistencyNRNR
32DSRS-Mitchell J et al., 2015, US [59]0–51NR60/81NRNRGood construct reliability NR5
33CWLTCERAD Word List 5-minute recall testMitchell J et al., 2015, US [59]NRNR62/96NRNRNRNRNR
CWLT-3rd TrialMitchell J et al., 2015, US [59]NRNR41/90NRNRNRNRNR
CWLT-Trials 1-3Mitchell J et al., 2015, US [59]NRNR57/94NRNRNRNRNR
CWLT-CompositeMitchell J et al., 2015, US [59]NRNR66/95NRNRNRNRNR
32 and 33CWLT-5 + DSRS-Mitchell J et al., 2015, US [59]NRNR76/98NRNRNRNRNR
34BADLS-Mitchell J et al., 2015, US [59]NRNR36/86NRNRGood construct reliability NRNR
35DSR-Ni J et al., 2015, China [37]NR≤15100/95.999.8Good diagnostic accuracyExcellent internal consistencyNRNR
36mSTS-MCImSTS-MCI ScoresPark J et al., 2018, South Korea [38]0–1818/1999/93High Concurrent ValidityHigh internal consistency, High test–retest reliabilityNR15
mSTS-MCI Reaction TimePark J et al., 2018, South Korea [38]0–1013.22/13.32100/97High Concurrent ValidityHigh internal consistency, High test–retest reliabilityNR15
37CAMCogBriefer Version (CAMCog-Short)Radanovic M et al., 2017, Brazil [64]0–6351/52 (education >9 years)65.2/78.879.7NRNRNRNR
Briefer Version (CAMCog-Short)Radanovic M et al., 2017, Brazil [64]0–6359/60 (education ≤8)70/75.577.3NRNRNRNR
38MBTArgentine VersionRoman F et al., 2016, Argentina [69]0–32NR69/888893/55NRNR6
39SAGE-Scharre D et al., 2017, US [60]6–22<1571/9088NRNRNRMedian 17.5
Digitally Translated (eSAGE)Scharre D et al., 2017, US [60]10–22<1669/8683NRNRNRMedian 16
40Semantic Fluency/VF-Serna A et al., 2015, Spain [52]0–17≤10.553/677252/75NRNR1
-Serna A et al., 2015, Spain [52]0–17≤11.562/677252/75NRNR1
-Serna A et al., 2015, Spain [52]0–17≤12.570/567248/76NRNR1
41Logical Memory20-min Delayed Recall (DR)Serna A et al., 2015, Spain [52]0–6≤2.543/857163/72NRNR20
20-min Delayed Recall (DR)Serna A et al., 2015, Spain [52]0–6≤3.557/717154/74NRNR20
41Logical Memory20-min Delayed Recall (DR)Serna A et al., 2015, Spain [52]0–6≤4.578/427144/77NRNR20
42STMS-Townley R et al., 2019 US [61]N/A<3572/74Incident MCI: 71, a-MCI: 85, na-MCI: 91NRNRNRNR
43DMS48-Feng X et al., 2017, China [39]0–4842/4386.6/94.296.6NRNRAge (Unaffected by education)Short administration time
44ADAS-CogADAS-Cog 11-itemZainal N et al., 2016, Singapore [41]0–70≥473/697890/40Excellent internal consistencyAge30 to 45
ADAS-Cog 12-itemZainal N et al., 2016, Singapore [41]0–80≥590/537988/58Excellent internal consistencyNR30 to 45
ADAS-Cog Episodic Memory Composite ScaleZainal N et al., 2016, Singapore [41]0–32≥661/737386/41Excellent internal consistencyNR30 to 45
45IADL-Chiu P et al., 2019, Taiwan [28]NR7/898/2763NRNRNRNR
46CASI-Chiu P et al., 2019, Taiwan [28]NR82/8368/6872NRNRAge, EducationNR
47NPI-Chiu P et al., 2019, Taiwan [28]NR3/463/6263NRNRNRNR
48BNT-Huang L et al., 2018, China [31]NR2470.6/55.267.3NRNRNRNR
49STTTest B (STT-B)Huang L et al., 2018, China [31]NR16950.7/8068.3NRNRNRNR
50JLO-Huang L et al., 2018, China [31]NR2759.7/53.262NRNRNRNR
51ST-Huang L et al., 2018, China [31]NR1464/62.666.4NRNRNRNR
52VCAT-Kandiah N et al., 2015, Singapore [33]0–3018–2285.6/81.193.389/75.9NRUnaffected by language15.7 ± 7.3
-Low A et al., 2019, Singapore [35]0–3020–2475.4/71.1Good construct validity74.4/72.3Good internal consistencyUnaffected by language and cultural backgroundNR
Abbreviations list for Table 4: AD: Alzheimer’s Disease; Sn/Sp: Sensitivity/Specificity; AUC: Area Under Curve; PPV/NPV: Positive Predictive Value/Negative Predictive Value.
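To make the psychometric columns in Table 4 easier to interpret, the short Python sketch below shows how sensitivity, specificity, PPV and NPV follow from applying a single cut-off point to screening scores and comparing the result with a reference-standard diagnosis. The scores, diagnoses and function names are hypothetical and are ours for illustration only; they do not come from any of the included studies.

```python
# Illustrative only: hypothetical scores and diagnoses, not data from any
# included study. Shows how the Table 4 metrics (Sn/Sp and PPV/NPV) follow
# from applying a screening cut-off where lower scores indicate impairment.

def screening_metrics(scores, has_mci, cutoff):
    """Classify each score as screen-positive (<= cutoff) and compare it
    with the reference-standard diagnosis."""
    tp = sum(1 for s, d in zip(scores, has_mci) if s <= cutoff and d)
    fp = sum(1 for s, d in zip(scores, has_mci) if s <= cutoff and not d)
    fn = sum(1 for s, d in zip(scores, has_mci) if s > cutoff and d)
    tn = sum(1 for s, d in zip(scores, has_mci) if s > cutoff and not d)
    return {
        "sensitivity": tp / (tp + fn),  # MCI cases correctly screened positive
        "specificity": tn / (tn + fp),  # non-cases correctly screened negative
        "ppv": tp / (tp + fp),          # probability of MCI given a positive screen
        "npv": tn / (tn + fn),          # probability of no MCI given a negative screen
    }

# Hypothetical MoCA-style scores (0-30) evaluated against a cut-off of <=24.
scores  = [29, 27, 26, 24, 23, 21, 28, 25, 22, 20]
has_mci = [False, False, False, True, False, True, False, True, True, True]
print(screening_metrics(scores, has_mci, cutoff=24))
# -> sensitivity 0.8, specificity 0.8, PPV 0.8, NPV 0.8 for this toy sample
```

Unlike sensitivity and specificity, PPV and NPV depend on how common MCI is in the sample being screened, which is one reason cut-off points validated in memory clinics may perform differently in community settings.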
Table 5. Summary of the Cognitive Tools’ Performance.
Tool | Cut-Off Point | Different Versions Included | Validity | Good Reliability | Affecting Factors | Administration Time ≤15 mins | Can Be Self-Administered or Conducted by Non-Professional
6 CIT≤4/10/12Good/ExcellentEducation
EMQLimited results
BSRTLimited results
RCFT≤32 Fair---x
TMTLimited results
MoCA≤26 Fair/GoodEducation (may be affected by gender and age)
≤25, ≤22.5 Good
≤24, ≤22, ≤19, ≤15.5Good/Excellent
≤20Fair
BCSE≤27, ≤46Fair/GoodAge
ACE≤84, ≤22Good/ExcellentAge, Educationx
MMSE≤29, ≤27FairAge, Education, Emotional status, Gender
≤28, ≤25.5, ≤23Fair/Good
≤26Good
CFI-Good--x
RBANS--Fair----
HKBC≤22-Excellent-
NMD-12≤2-Excellent---x
Qmci<62/≤60Good/Excellent-x
CCQ>1, ≥2Good/Excellent---
CDT≤15, ≤9, ≤3, ≤1.3-Fair/GoodAge, Education
TorCA≤275-Good--xx
HK-VMT<22, ≤25-FairEducation
TICS≤27-Poor/FairAge, Education-x
VOSP≤10-GoodGender, Educationx
TYM≤38, ≤36Fair/GoodAge, Education
GPCog≥4, ≥8Fair/GoodInconsistent resultsx
CVLT<8Good/Excellent-Emotional Status--
The Envelope Task<3-Good-Emotional Status--
PRMQ<46-Fair-Emotional Status--
Single-item Memory Scale<3-Fair/Good-Emotional Status--
FCSRT≤35, ≤12Good--x-
AQLimited results
FAQ--Poor/Good----
BCAT≤19-Good-x
AD8≥1, ≥2, ≥3-Poor/Fair--
DSRS--Fair/Good-x
CWLT-Fair---x
CWLT + DSRS--Good/Excellent---x
BADLS--Poor--x
DSR≤15 Excellent---
mSTS-MCI≤19, ≤13.32Excellent-x
CAMCog≤52, ≤60Fair/Good---x
MBT-Good---
SAGE<15, <16-Good--x
Semantic Fluency/VF≤10.5, ≤11.5, ≤12.5-Fair---
Logical Memory≤2.5, ≤3.5, ≤4.5Poor/Fair--x-
STMS<35-Good----
DMS48≤43-Good/Excellent-Age-x
ADAS-Cog≥4, ≥5, ≥6Good/Excellent-xx
IADL≤8-Poor/Fair---x
CASI≤83-Fair-Age, Education-x
NPI≤4-Fair---x
BNT≤24-Fair---
STT≤169-Fair---
JLO≤27-Fair---
ST≤14-Fair---
VCAT18–22, 20–24-Good/Excellentxx
Extracted and evaluated from Table 3 and Table 4. ‘✓’ represents yes; ‘x’ represents no; ‘-’ represents unavailable data. Multiple ratings were recorded where included articles reported different results.
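Because several of the better-performing tools in Tables 4 and 5 use education-adjusted cut-off points, a brief illustration of how such thresholds are applied may help. The sketch below uses the education-stratified MoCA-BC cut-offs reported by Chen K-L et al. [26] in Table 4 (≤19 for ≤6 years of education, ≤22 for 7–12 years, ≤24 for >12 years); the helper function is purely illustrative and is not itself a validated screening procedure.

```python
# Illustrative sketch of education-stratified cut-off selection, using the
# MoCA-BC thresholds reported by Chen K-L et al. [26] (Table 4):
# <=19 for 0-6 years of education, <=22 for 7-12 years, <=24 for >12 years.
# This is not a diagnostic tool; it only shows how such cut-offs are applied.

def moca_bc_positive_screen(score: int, education_years: int) -> bool:
    """Return True if the MoCA-BC score is at or below the
    education-adjusted cut-off (i.e. a positive screen)."""
    if education_years <= 6:
        cutoff = 19
    elif education_years <= 12:
        cutoff = 22
    else:
        cutoff = 24
    return score <= cutoff

# The same score of 23 screens positive with >12 years of education
# but negative with 7-12 years of education.
print(moca_bc_positive_screen(23, education_years=16))  # True
print(moca_bc_positive_screen(23, education_years=10))  # False
```

As Tables 4 and 5 indicate, education (along with age) affects most of the included tools, so applying a single unadjusted cut-off across populations with different levels of schooling risks over-identifying impairment in less-educated groups and missing it in highly educated ones.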
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Back to TopTop