Systematic Review

Digital Tools in Behavior Change Support Education in Health and Other Students: A Systematic Review

1 Faculty of Health Sciences, University of Maribor, 2000 Maribor, Slovenia
2 Faculty of Electrical Engineering and Computer Science, University of Maribor, 2000 Maribor, Slovenia
3 Usher Institute, University of Edinburgh, Edinburgh EH8 9YL, UK
4 Nursing Research, Innovation and Development Centre of Lisbon, Nursing School of Lisbon, 1600-190 Lisbon, Portugal
5 Faculty of Healthcare, Sports and Welfare, Inholland University of Applied Sciences, 3521 Haarlem, The Netherlands
* Author to whom correspondence should be addressed.
Healthcare 2022, 10(1), 1; https://doi.org/10.3390/healthcare10010001
Submission received: 6 November 2021 / Revised: 13 December 2021 / Accepted: 16 December 2021 / Published: 21 December 2021

Abstract

Due to the increased prevalence of chronic diseases, behavior changes are integral to self-management. Healthcare and other professionals are expected to support these behavior changes, and undergraduate students should therefore receive up-to-date, evidence-based training in this respect. Our work aims to review the outcomes of digital tools in behavior change support education. A secondary aim was to examine existing instruments for assessing the effectiveness of these tools. A PIO (population/problem, intervention, outcome) research question guided our literature search. The population was limited to students in nursing, sports sciences, and pharmacy; the interventions were limited to digital teaching tools; and the outcomes consisted of knowledge, motivation, and competencies. A systematic literature review was performed in the PubMed, CINAHL, MEDLINE, Web of Science, SAGE, Scopus, and Cochrane Library databases and by backward citation searching. We used the PRISMA 2020 guidelines to depict the search process for relevant literature. Two authors independently evaluated the included studies using the Mixed Methods Appraisal Tool (MMAT). Applying inclusion and exclusion criteria, we included 15 studies in the final analysis: six quantitative descriptive studies, two randomized studies, six mixed methods studies, and one qualitative study. According to the MMAT, all studies were of sufficient quality for further analysis. The studies used various digital tools to improve students’ knowledge of behavior change techniques in individuals with chronic disease, leading to greater self-confidence, better cooperation, and more practical experience and skills. The most commonly perceived limitations of these tools were time and space constraints.

1. Introduction

Due to the growing burden of chronic diseases, such as obesity, diabetes, and cardiovascular disease [1,2], the need for individual support in self-management is increasing [3,4,5]. Changing behaviors for effective self-management can improve health outcomes and the quality of life of people with chronic diseases [1]. Furthermore, it can improve life expectancy and reduce health costs [6]. New best practice and evidence-based healthcare education are needed to prepare future healthcare and other professionals to support people with chronic disease, which reflects a significant challenge for educators [7].
Healthcare and other professionals have a key role in promoting healthy behavior and motivating individuals with chronic diseases to live healthier lives [8]. They can signal problems, provide tailored information, enable persons with chronic disease to participate in lifestyle supporting programs, and help these persons maintain healthy behaviors [9]. Public health has advanced in recent years in solving complex systemic problems. Healthcare and other professionals must be equipped with a range of skills such as reducing complications, problem-solving and evidence-based practice, and decision-making [10,11].
The quality of education provided to patients with chronic disease is significantly influenced by the quality of the academic education students receive, i.e., education about professional roles, supervision received, self-preparation for training, mutual peer support, and teaching instruments [12]. Therefore, the learning of health and other students is extremely important. Clinical learning is one of the essential factors that helps students understand practice in the clinical health environment and influences their professional development [13]. Nieman (2007), for example, argued that educational modules on chronic diseases should offer appropriate training for students [14]. Students who also have an education in the behavioral or social sciences will find it easier to identify risky behaviors of persons with chronic disease and appropriately encourage behavior changes [15].
Despite the importance of health education, the literature suggests the existence of insufficient competencies of healthcare and other professionals in this field [16,17,18]. As part of the Train4Health project (https://www.train4health.eu/, accessed on 1 November 2021), we want to improve students’ education in supporting behavior changes to promote self-care effectively in people with chronic diseases. The project target groups are nursing, pharmacy, and sport sciences students.
There is a need for digital teaching tools, such as simulation software, e-learning, and digitally guided courses, such as massive open online courses (MOOCs), to provide health education and behavior change support [17,18]. E-learning is an educational intervention delivered electronically over the Internet, requiring various technological and communication systems. Among healthcare professionals, its use has increased significantly in recent years [19,20]. A MOOC is an approach that uses the Internet to extend a course’s educational reach to a wider range of individuals [21,22,23]. The use of technology in healthcare education makes it easier for students to acquire basic knowledge and psychomotor skills, improving their decision-making competencies and practice in various situations [24]. Combining traditional teaching methods with e-learning can be an effective complement for improving students’ clinical skills [25].
The main objectives of this systematic review are to assess the outcomes of the use of digital teaching tools and review the assessment instruments to evaluate the research outcomes (e.g., education skills and learning experience) in health and other students following the introduction of digital teaching tools.

2. Materials and Methods

This systematic review was registered with PROSPERO (CRD42021233690). The review was performed following five steps: (1) formulating a review question, (2) identifying relevant work, (3) evaluating the quality of the study, (4) summarizing the evidence, and (5) interpreting the findings [26].
The first research question, based on the PIO approach [27], was: “What is the outcome (O) of the use of digital teaching tools to support behavioral change (I) in healthcare and other professionals (P)?” The second research question was: “What assessment instruments are used (I) to assess research outcomes (e.g., education skills, learning experience, etc.) (O) in healthcare and other students?”
The selection process for the relevant studies consisted of five steps: (1) databases search and backward citation search, (2) removal of duplicates, (3) screening of records based on the title and abstract, (4) overview of the results based on full text, and (5) analysis of studies involved in the synthesis.
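Step (2), removing duplicates pooled from several databases, is typically automated. The following is a minimal illustrative sketch, not the tooling actually used in this review; the record fields `doi` and `title` are hypothetical stand-ins for the actual database export format:

```python
# Sketch: removing duplicate records pooled from several databases.
# The record fields (doi, title) are illustrative, not the actual export format.
def normalise(title):
    """Lowercase and collapse whitespace so near-identical titles match."""
    return " ".join(title.lower().split())

def deduplicate(records):
    """Keep the first occurrence of each record, matching by DOI when
    available and otherwise by normalised title."""
    seen, unique = set(), []
    for rec in records:
        key = rec.get("doi") or normalise(rec["title"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

records = [
    {"doi": "10.1000/x1", "title": "Simulation in nursing education"},
    {"doi": "10.1000/x1", "title": "Simulation in Nursing Education"},  # duplicate by DOI
    {"doi": None, "title": "MOOCs for pharmacy students"},
    {"doi": None, "title": "  MOOCs for  Pharmacy Students "},          # duplicate by title
]
print(len(deduplicate(records)))  # 2
```

In practice, reference managers perform this matching, but the first-occurrence-wins logic is the same.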
A systematic search of the relevant literature took place in seven international databases: PubMed, CINAHL, MEDLINE, Web of Science, SAGE, Scopus, and Cochrane Library and with backward citation by manually searching the reference lists of all the articles included [28]. If individual records were not fully available or additional information about the results was required, we contacted the authors of the articles. We also searched for unpublished works in the application databases of various review protocols (PROSPERO).
The search in databases was performed using the following search string: (“nurs* student*” OR “healthcare student*” OR “pharmacy student*” OR “sport student*”) AND (“pedagogical method” OR “e-learning cours*” OR “online cours*” OR “MOOC” OR “case stud*” OR “simulation*” OR “virtual patient*”) AND (“knowledge*” OR “motivation*” OR “engagement*” OR “skill*” OR “competence*” OR “self-care” OR “self-management” OR “change the behavior” OR “change attitudes” OR “behaviour change” OR “behavior change” OR “behaviour change techniques” OR “behavior change techniques” OR “health behavior” OR “health behavior”) AND (“non-communicable disease” OR “chronic disease*” OR “chronic illness” OR “coronary disease” OR “coronary artery disease” OR “heart disease” OR “heart failure” OR “cardiovascular disease” OR “high blood pressure” OR “hypertension” OR “diabetes mellitus type 2” OR “ischemic heart disease” OR “type 2 diabetes” OR “non-insulin-dependent diabetes” OR “adult-onset diabetes” OR “NIDDM” OR “T2D” OR “obesity”). Search strategies for the individual databases are presented in Table S1. Database searches are presented in Table S2.
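The four parenthesized concept blocks above (population, intervention, outcome, condition), each an OR-list of synonyms joined by AND, can be assembled programmatically. A minimal sketch with abbreviated keyword lists (the full lists appear above and in Table S1):

```python
# Sketch: assembling a Boolean search string from concept blocks.
# The keyword lists below are abbreviated for illustration; the full
# lists are given in the text and in Table S1.
population = ['"nurs* student*"', '"pharmacy student*"', '"sport student*"']
intervention = ['"e-learning cours*"', '"MOOC"', '"simulation*"', '"virtual patient*"']
outcome = ['"knowledge*"', '"skill*"', '"behaviour change"', '"behavior change"']
condition = ['"chronic disease*"', '"type 2 diabetes"', '"obesity"']

def or_block(terms):
    """Join synonyms with OR inside one parenthesised concept block."""
    return "(" + " OR ".join(terms) + ")"

# Concept blocks are intersected with AND, as in the search string above.
search_string = " AND ".join(
    or_block(block) for block in (population, intervention, outcome, condition)
)
print(search_string)
```

Keeping the blocks as separate lists makes it easy to adapt the string to the field tags and truncation syntax of each individual database.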
Based on the inclusion and exclusion criteria (Table 1), two authors independently screened the records. Articles reviewed in full text and excluded based on the exclusion criteria are presented in Table S3.
Two authors extracted the data from the relevant studies into a prepared table with recoverable identification data (Table S4). To assess the quality of the studies, we used the Mixed Methods Appraisal Tool (MMAT) [29,30]. The MMAT is intended for the critical appraisal of studies included in systematic mixed methods reviews (qualitative, quantitative, and mixed studies) and enables methodological quality assessment. For mixed methods studies, the qualitative and quantitative components are also evaluated individually, and the overall quality rating should not exceed that of the weakest component. We reported our MMAT results as percentage scores, rather than descriptively as the MMAT instructions recommend, because we considered percentages more informative for readers (Table S5) [29,30].
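As an illustration of how such percentage scores can be derived (our interpretation of the MMAT scoring, with hypothetical ratings): each study design is rated on five criteria, and a mixed methods study is capped at its weakest component:

```python
# Sketch: turning MMAT ratings into percentage scores (hypothetical ratings).
# Each study design is rated on five criteria; here we count only "yes"
# answers. A mixed methods study's overall score must not exceed that of
# its weakest component.
def mmat_score(answers):
    """Percentage of the MMAT criteria answered 'yes'."""
    return 100 * sum(a == "yes" for a in answers) / len(answers)

def mixed_methods_score(qual, quant, mixed):
    """Overall mixed methods score, capped by the weakest component."""
    return min(mmat_score(qual), mmat_score(quant), mmat_score(mixed))

qual  = ["yes", "yes", "yes", "yes", "yes"]     # 100%
quant = ["yes", "yes", "yes", "no", "yes"]      # 80%
mixed = ["yes", "yes", "no", "no", "yes"]       # 60%
print(mixed_methods_score(qual, quant, mixed))  # 60.0
```

This weakest-component cap is why a mixed methods study with a strong qualitative arm can still receive a low overall score.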
The findings were synthesized using a thematic method. The obtained results were classified into codes, subtopics, and main topics [31]. We also performed a content analysis of the relevant records [32].

3. Results

3.1. Results of Literature Review

The search process for relevant results is shown in Figure 1 with a PRISMA flow diagram [33,34]. Fifteen studies were included in the final analysis based on the inclusion and exclusion criteria.
Table 2 presents the characteristics of the included studies, the study types, and the MMAT scores. All studies that received an MMAT score of 50% or higher were included in the further analysis. A breakdown of the MMAT scores is presented in Table S6.
The final analysis included six quantitative descriptive studies, one single-blind randomized controlled trial, one single-center cluster randomized controlled trial, one qualitative study, and six mixed methods studies. The MMAT evaluations of the studies ranged from 50% to 90%.
Using different techniques (e.g., MOOCs, patient simulation, standardized patients), participants learned to support behavior change and treatment across various disease states: diabetes (n = 3) [35,41,47], heart failure (n = 4) [37,43,49], COPD (n = 3) [38,40,45], and stroke [38], asthma [49], prostate cancer [44], breast cancer [38], hypertension [43,46], mental health [48], and dementia [38] (each n = 1). The studies also addressed interventions related to behavior change, such as the transition of care [43], cardiac life support, insulin injection technique [39], use of inhalers [36], and care in an ambulance [42] (each n = 1).
In the analyzed studies, the authors used simulation (n = 12), a virtual case study (n = 1), and MOOCs (n = 2) as digital teaching tools for students and for healthcare and other professionals.

3.2. Assessment Instruments to Evaluate Research Outcomes

Table 3 includes information on the instruments used to assess the outcomes in students taught with a variety of digital behavior change teaching tools (the basic data of the included studies and the main findings are presented in Table S4). All 15 studies used different instruments to evaluate the research outcomes; descriptions of all instruments are provided in Table 3.

3.3. Assessment of the Digital Teaching Tools Outcomes

The thematic analysis of the articles is presented below (Table 4).
The main themes derived from the thematic analysis are positive outcomes of using digital teaching tools and barriers to using digital teaching tools. Positive outcomes of using digital teaching tools comprise four subthemes: knowledge, confidence, practical experience, and collaboration. The findings in the articles showed that the use of digital technologies promotes users’ active learning [38], skill development [36,40], and critical thinking [38]. It also significantly increases knowledge and knowledge retention [39,49]. In this way, it also helps build and improve self-confidence [38,48,49] and increases skills [41]. Thus, students are also better prepared for clinical and professional practice [35,44] and improve their professional network and cooperation with other professionals [35,37]. However, there are many restrictions on the use of digital tools, such as time constraints [43], financial barriers [43], and resource constraints such as space [43] or materials [37]. The authors of one study recommended providing faculty with more time to develop such activities [46].
Table 4. Thematic analysis.
Main theme: Positive outcomes of using digital teaching tools
- Knowledge: knowledge retention [39,49]; increase in knowledge [39]; active learning [38]; developing/improving skills [36,40]; critical thinking [38]; significantly higher counseling [39]
- Confidence: builds confidence [38]; felt more confident [48,49]; skills increased [41]; diabetes education skills assessed [41]; trust [45]
- Practical experience: more prepared for interprofessional education [37]; improve the professional practice [35]; effect on their clinical/professional practice [44]; expressed satisfaction with experiencing such a practice [36]
- Collaboration: increase their professional network [35]; think more positively about other professionals [37]

Main theme: Barriers to the use of digital teaching tools
- Restrictions: using only one patient simulator [37]; time in students’ schedules [43]; financial resources [43]; space [43]; lagging feedback [46]; technology issues [46]
- Suggestions for improvement: faculty time to develop activities [46]

4. Discussion

We included 15 articles in the final analysis. Of these, six were quantitative descriptive studies [35,37,41,45,47,49], two were randomized studies [36,39], six were mixed methods studies [40,42,43,44,46,48], and one was a qualitative study [38]. Different populations were included in the studies, such as nursing, sports science, and pharmacy students.
Simulations are among the most common digital teaching tools. Simulations in the undergraduate nursing curriculum are increasingly popular and are becoming the foundation of many nursing programs [57]. MOOCs allow lecturers to reach a large, diverse audience. In a study using a MOOC to teach health safety science, users reported a significant increase in competency; however, they pointed out that MOOCs are difficult to incorporate into all curricula [58].
Researchers also advise including digital badges and gamification among digital teaching tools [59].

4.1. Assessment Tools

Based on the analyzed studies, we found no single instrument that allows insight into and monitoring of the effectiveness of different pedagogical approaches on students’ knowledge and their effectiveness in supporting behavior change in persons with chronic disease. In individual studies, researchers used instruments that monitor only particular aspects or are helpful only for particular diseases. For example, the DAS-3 [53] is intended to assess self-confidence in diabetes education skills. Self-confidence can also be measured with the SSSC questionnaire [50,51]. Other questionnaires are intended only to assess the specific tool used, such as the SDS [50,51], which is used in simulation learning. However, most researchers still use questionnaires compiled individually on the basis of literature reviews. Since different learning tools are used and different topics are addressed, it is not easy to choose a single assessment instrument for evaluating the effectiveness of educational digital teaching tools. Similarly, Alturkistani et al. (2020) noted that, due to the diversity of topics addressed by MOOCs, it is not possible to propose a single evaluation tool for all of them [60].
Such differences in the use of assessment tools arise mainly because assessments must be carried out in accordance with the expected learning outcomes, meaning that the whole assessment process is adapted to them [61]. The authors of various studies have also recommended different assessment methods [62]. It is also important to evaluate whether the chosen material measures what we intend to measure [63]. Assessments are therefore closely linked to the learning outcomes that students are expected to achieve [64].

4.2. Implications for Practice and Policy

Learning outcomes of students and other participants are a central part of the learning process [61]. Expected learning outcomes in students relate primarily to their knowledge, skills, and behaviors that should be achieved at the end of the educational program and measured [62].
The main positive outcomes of students’ digital behavior change support education in our study are knowledge, confidence, practical experience, and collaboration (Table 4). Active learning helps students incorporate meaningful understanding [65]. This requires students to start thinking at a higher level [66].
Increased knowledge is associated with increased self-confidence and a sense of security. Health knowledge is a key element in ensuring good health [67]. In a study by Albrechtsen et al. (2017) [35], 89% of health professionals reported improved knowledge after the introduction of an intervention. In addition to increased knowledge, it is also important that students’ critical thinking improves after using digital teaching tools [47]. Students also use different approaches to improve their self-confidence in clinical skills [49].
In addition to knowledge, students also gain practical experience and preparedness for real-world situations. Additionally, simulating chronic illnesses has improved students’ perceptions of their ability to empathize and counsel persons with chronic diseases [42]. The active use of various tools has also contributed to better cooperation and interaction between students, staff, and persons with chronic diseases. Of the participants, 48% in the study by Albrechtsen et al. (2017) [35] reported increasing their professional network and collaboration during their education.

4.3. Restrictions on the Use of Digital Teaching Tools

Faculty members’ limited familiarity with computer-based approaches remains a concern [68]. Time and material barriers were among the most common constraints detected in the analyzed studies. Bolesta et al. (2014) [37] highlighted logistical difficulties, as students in their study could work with only one patient simulator. Organizing the timing of this education within the curriculum was also a challenge. The problem of timing in student schedules was also highlighted by other studies [43,44].
In recent years, the flipped classroom approach has been increasingly used in undergraduate medical education [69]. The rapid development of information technology and changes in the philosophy of education have encouraged the development of the flipped classroom concept [70]. Higher education institutions are permeated by the technological advances brought about by the industrial revolution [71], which requires fundamental changes in traditional teaching and learning activities [72]. Medical education is also changing rapidly [24]. As technology is an integral part of health professionals’ work, it must be included in the student curriculum [73].
This systematic literature review has benefits for students, researchers, educators, and administrators. There are a few barriers to the use of digital teaching tools: some relate to institutional and educator constraints, others to students’ time. Institutions should provide more simulators for students to use, and students’ academic schedules should be adjusted to give them more time for simulations; students would thereby gain more knowledge, skills, and confidence in performing simulations. Moreover, educators should receive more training in the use of simulations in teaching. It is important for educators to recognize the benefits of digital learning tools and to follow trends for the sustainable development of education. Future research should focus on positive outcomes and students’ experiences with using simulations in education.

4.4. Limitations

Different study typologies (qualitative, quantitative, and mixed studies) with heterogeneous results were included in this systematic review, so a meta-analysis could not be performed. There were also differences in the studies’ designs and methods of implementation. Different rating scales were used to assess the success of the interventions. Different populations (nursing, sports science, and pharmacy students) were included in the analyzed studies, so the results cannot be generalized to an individual population. The choice of these three target groups, which is intrinsic to the project, limited the search string; potential studies of digital behavior change support education in other areas represent an untapped resource meriting exploration in future work. In assessing the quality of the articles, despite using the MMAT rating scale, there is the possibility of subjectivity; we tried to minimize this by involving two evaluators.

5. Conclusions

Digital teaching tools such as MOOCs and simulations can help motivate students and, thus, increase their knowledge, confidence, skills, and experience. All the studies analyzed reported only positive effects of digital learning tools on students’ effectiveness and skills. Nevertheless, some limitations in implementing these tools in the learning process were perceived, relating mainly to resources. The studies included in the review used a very heterogeneous set of assessment instruments. In the future, a tool should be developed to monitor the knowledge of students and health professionals in supporting behavior change in persons with chronic diseases.

Supplementary Materials

The following are available online at https://www.mdpi.com/article/10.3390/healthcare10010001/s1, Table S1. Search strategy in the databases. Table S2. Number of records in the databases. Table S3. List of excluded studies. Table S4. Basic data of the included studies. Table S5. PRISMA 2020 checklist. Table S6. MMAT scores.

Author Contributions

Conceptualization: L.G., M.L., G.Š. and K.B.; Data curation, L.G., N.F., L.C.B., M.L. and G.Š.; Formal analysis, L.G., M.P.G., K.B., I.B.F. and M.L.; Methodology, L.G., M.L. and G.Š.; Supervision, L.G., M.P.G. and M.L.; Writing—original draft, L.G. and M.L.; and Writing—review and editing, L.G., M.L., G.Š., L.C.B., N.F., M.P.G., I.B.F. and K.B. All authors have read and agreed to the published version of the manuscript.

Funding

This project received funding from the Erasmus+ Programme of the European Union under grant agreement no. 2019-1-PT01-KA203-061389. This study was also supported by the “knowledge through creative pathways 2016–2020” scheme co-funded by the European Union from the European Social Fund and the Republic of Slovenia and the Slovenian Research Agency (grant numbers N2-0101 and P2-0057).

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

COPD—chronic obstructive pulmonary disease; DAS-3—Diabetes Attitude Scale; MMAT—Mixed Methods Appraisal Tool; MMS—mixed method study; MOOC—Massive Open Online Course; RCS—randomized control study; RIPLS—Readiness for Interprofessional Learning Scale; RS—randomized study; SDS—Simulation Design Scale; SGID—small group instructional diagnosis; SOAP—Subjective, Objective, Assessment, Plan; SSSC—Student Satisfaction and Self-Confidence in Learning Scale; QUAL—qualitative; QUAN—quantitative.

References

  1. Araújo-Soares, V.; Hankonen, N.; Presseau, J.; Rodrigues, A.; Sniehotta, F. Developing Behavior Change Interventions for Self-Management in Chronic Illness. Eur. Psychol. 2019, 24, 7–25. [Google Scholar] [CrossRef]
  2. Calma, K.R.; Halcomb, E.; Stephens, M. The impact of curriculum on nursing students’ attitudes, perceptions and preparedness to work in primary health care: An integrative review. Nurse Educ. Pract. 2019, 39, 1–10. [Google Scholar] [CrossRef] [PubMed]
  3. Miller, W.R.; Lasiter, S.; Ellis, R.B.; Buelow, J.M. Chronic disease self-management: A hybrid concept analysis. Nurs. Outlook 2015, 63, 154–161. [Google Scholar] [CrossRef] [Green Version]
  4. O’Connell, S.; Mc Carthy, V.J.C.; Savage, E. Frameworks for self-management support for chronic disease: A cross-country comparative document analysis. BMC Health Serv. Res. 2018, 18, 583. [Google Scholar] [CrossRef]
  5. Wagner, E.H.; Austin, B.T.; Davis, C.; Hindmarsh, M.; Schaefer, J.; Bonomi, A. Improving Chronic Illness Care: Translating Evidence into Action. Health Aff. 2001, 20, 64–78. [Google Scholar] [CrossRef] [Green Version]
  6. Newson, J.T.; Huguet, N.; Ramage-Morin, P.L.; McCarthy, M.J.; Bernier, J.; Kaplan, M.S.; McFarland, B.; Newsom, J.T. Health behaviour changes after diagnosis of chronic illness among Canadians aged 50 or older. Public Health Rep. 2012, 23, 49–53. [Google Scholar]
  7. Derryberry, M. Today’s Health Problems and Health Education. Am. J. Public Health 2004, 94, 368–371. [Google Scholar] [CrossRef]
  8. Kivelä, K.; Elo, S.; Kyngäs, H.; Kääriäinen, M. The effects of health coaching on adult patients with chronic diseases: A systematic review. Patient Educ. Couns. 2014, 97, 147–157. [Google Scholar] [CrossRef] [PubMed]
  9. Sohl, S.J.; Birdee, G.; Elam, R. Complementary Tools to Empower and Sustain Behavior Change: Motivational interviewing and mindfulness. Am. J. Lifestyle Med. 2016, 10, 429–436. [Google Scholar] [CrossRef]
  10. Yousefi, H.; Ziaee, E.S.; Golshiri, P. Nurses’ consultative role to health promotion in patients with chronic diseases. J. Educ. Health Promot. 2019, 8, 178. [Google Scholar] [PubMed]
  11. Levy, M.; Gentry, D.; Klesges, L. Innovations in public health education: Promoting professional development and a culture of health. Am. J. Public Health 2015, 105, S44–S45. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  12. Vijn, T.W.; Fluit, C.R.M.G.; Kremer, J.A.M.; Beune, T.; Faber, M.J.; Wollersheim, H. Involving Medical Students in Providing Patient Education for Real Patients: A Scoping Review. J. Gen. Intern. Med. 2017, 32, 1031–1043. [Google Scholar] [CrossRef] [PubMed]
  13. Dadgaran, S.A.; Parvizy, S.; Peyrovi, H. Passing through a rocky way to reach the pick of clinical competency: A grounded theory study on nursing students’ clinical learning. Iran. J. Nurs. Midwifery Res. 2012, 17, 330–337. [Google Scholar]
  14. Nieman, L.Z. A preclinical training model for chronic care education. Med. Teach. 2007, 29, 391–393. [Google Scholar] [CrossRef]
  15. Stuhlmiller, C.M.; Tolchard, B. Developing a student-led health and wellbeing clinic in an underserved community: Collaborative learning, health outcomes and cost savings. BMC Nurs. 2015, 14, 32. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  16. Shemtob, L. Should motivational interviewing training be mandatory for medical students? Med. Educ. Online 2016, 21, 31272. [Google Scholar] [CrossRef] [Green Version]
  17. Sadeghi, R.; Heshmati, H. Innovative methods in teaching college health education course: A systematic review. J. Educ. Health Promot. 2019, 8, 103. [Google Scholar] [CrossRef]
  18. Shojaeezadeh, D.; Heshmati, H. Integration of Health Education and Promotion Models for Designing Health Education Course for Promotion of Student’s Capabilities in Related to Health Education. Iran. J. Public Health 2018, 47, 1432–1433. [Google Scholar]
  19. Masic, I. E-Learning as New Method of Medical Education. Acta Inform. Medica 2008, 16, 102–117. [Google Scholar] [CrossRef] [Green Version]
  20. Vaona, A.; Banzi, R.; Kwag, K.H.; Rigon, G.; Cereda, D.; Pecoraro, V.; Tramacere, I.; Moja, L. E-learning for health professionals. Cochrane Database Syst. Rev. 2018, 2018, CD011736. [Google Scholar] [CrossRef]
  21. Manallack, D.T.; Yuriev, E. Ten simple rules for developing a MOOC. PLoS Comput. Biol. 2016, 12, e1005061. [Google Scholar] [CrossRef]
  22. Gyles, C. Is there a MOOC in your future? Can. Vet. J. 2013, 54, 721–724. [Google Scholar]
  23. Foley, K.; Alturkistani, A.; Carter, A.; Stenfors, T.; Blum, E.; Car, J.; Majeed, A.; Brindley, D.; Meinert, E. Massive Open Online Courses (MOOC) Evaluation Methods: Protocol for a Systematic Review. JMIR Res. Protoc. 2019, 8, e12087.
  24. Guze, P.A. Using Technology to Meet the Challenges of Medical Education. Trans. Am. Clin. Clim. Assoc. 2015, 126, 260–270.
  25. Ashouri, E.; Sheikhaboumasoudi, R.; Bagheri, M.; Hosseini, S.A.; Elahi, N. Improving nursing students’ learning outcomes in fundamentals of nursing course through combination of traditional and e-learning methods. Iran. J. Nurs. Midwifery Res. 2018, 23, 217–221.
  26. Khan, K.S.; Kunz, R.; Kleijnen, J.; Antes, G. Five Steps to Conducting a Systematic Review. J. R. Soc. Med. 2003, 96, 118–121.
  27. Riva, J.J.; Malik, K.M.; Burnie, S.J.; Endicott, A.R.; Busse, J. What is your research question? An introduction to the PICOT format for clinicians. J. Can. Chiropr. Assoc. 2012, 56, 167–171.
  28. Guerreiro, M.P.; Angelini, L.; Henriques, H.R.; El Kamali, M.; Baixinho, C.; Balsa, J.; Félix, I.B.; Khaled, O.A.; Carmo, M.B.; Cláudio, A.P.; et al. Conversational Agents for Health and Well-being Across the Life Course: Protocol for an Evidence Map. JMIR Res. Protoc. 2021, 10, e26680.
  29. Hong, Q.; Pluye, P.; Fàbregues, S.; Bartlett, G.; Boardman, F.; Cargo, M.; Dagenais, P.; Gagnon, M.P.; Griffiths, F.; Nicolau, B.; et al. Mixed Methods Appraisal Tool (MMAT), Version 2018. Available online: http://mixedmethodsappraisaltoolpublic.pbworks.com/w/file/fetch/127916259/ (accessed on 19 May 2021).
  30. Hong, Q.N.; Fàbregues, S.; Bartlett, G.; Boardman, F.; Cargo, M.; Dagenais, P.; Gagnon, M.-P.; Griffiths, F.; Nicolau, B.; O’Cathain, A.; et al. The Mixed Methods Appraisal Tool (MMAT) version 2018 for information professionals and researchers. Educ. Inf. 2018, 34, 285–291.
  31. Polit, D.F.; Beck, C.T. Nursing Research: Generating and Assessing Evidence for Nursing Practice, 10th ed.; Wolters Kluwer: Philadelphia, PA, USA, 2017.
  32. Erlingsson, C.; Brysiewicz, P. A hands-on guide to doing content analysis. Afr. J. Emerg. Med. 2017, 7, 93–99.
  33. Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G.; The PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA Statement. PLoS Med. 2009, 6, e1000097.
  34. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. PLoS Med. 2021, 18, e1003583.
  35. Albrechtsen, N.J.W.; Poulsen, K.W.; Svensson, L.Ø.; Jensen, L.; Holst, J.J.; Torekov, S.S. Health care professionals from developing countries report educational benefits after an online diabetes course. BMC Med. Educ. 2017, 17, 97.
  36. Basak, T.; Demirtas, A.; Iyigun, E. The effect of simulation based education on patient teaching skills of nursing students: A randomized controlled study. J. Prof. Nurs. 2019, 35, 417–424.
  37. Bolesta, S.; Chmil, J.V. Interprofessional Education among Student Health Professionals Using Human Patient Simulation. Am. J. Pharm. Educ. 2014, 78, 94.
  38. Bonito, S.R. The usefulness of case studies in a Virtual Clinical Environment (VCE) multimedia courseware in nursing. J. Med. Investig. 2019, 66, 38–41.
  39. Bowers, R.; Tunney, R.; Kelly, K.; Mills, B.; Trotta, K.; Wheeless, C.N.; Drew, R. Impact of Standardized Simulated Patients on First-Year Pharmacy Students’ Knowledge Retention of Insulin Injection Technique and Counseling Skills. Am. J. Pharm. Educ. 2017, 81, 113.
  40. Coleman, D.; McLaughlin, D. Using simulated patients as a learning strategy to support undergraduate nurses to develop patient-teaching skills. Br. J. Nurs. 2019, 28, 1300–1306.
  41. DeLea, D.; Shrader, S.; Phillips, C. A Week-Long Diabetes Simulation for Pharmacy Students. Am. J. Pharm. Educ. 2010, 74, 130.
  42. Isaacs, D.; Roberson, C.L.A.; Prasad-Reddy, L. A Chronic Disease State Simulation in an Ambulatory Care Elective Course. Am. J. Pharm. Educ. 2015, 79, 133.
  43. Kolanczyk, D.M.; Borchert, J.S.; Lempicki, K.A. Focus group describing simulation-based learning for cardiovascular topics in US colleges and schools of pharmacy. Curr. Pharm. Teach. Learn. 2019, 11, 1144–1151.
  44. Moule, P.; Pollard, K.; Armoogum, J.; Messer, S. Virtual patients: Development in cancer nursing education. Nurse Educ. Today 2015, 35, 875–880.
  45. Padilha, J.M.; Machado, P.P.; Ribeiro, A.L.; Ribeiro, R.; Vieira, F.; Costa, P. Easiness, usefulness and intention to use a MOOC in nursing. Nurse Educ. Today 2020, 97, 104705.
  46. Cowart, K.; Updike, W.H. Pharmacy student perception of a remote hypertension and drug information simulation-based learning experience in response to the SARS-CoV-2 pandemic. J. Am. Coll. Clin. Pharm. 2021, 4, 53–59.
  47. Schultze, S.R.; Mujica, F.C.; Kleinheksel, A. Demographic and spatial trends in diabetes-related virtual nursing examinations. Soc. Sci. Med. 2019, 222, 225–230.
  48. Sweigart, L.; Burden, M.; Carlton, K.H.; Fillwalk, J. Virtual Simulations across Curriculum Prepare Nursing Students for Patient Interviews. Clin. Simul. Nurs. 2014, 10, e139–e145.
  49. Vyas, D.; Wombwell, E.; Russell, E.; Caligiuri, F. High-Fidelity Patient Simulation Series to Supplement Introductory Pharmacy Practice Experiences. Am. J. Pharm. Educ. 2010, 74, 169.
  50. Jeffries, P.R.; Rizzolo, M.A. Designing and Implementing Models for the Innovative Use of Simulation to Teach Nursing Care of Ill Adults and Children: A National, Multi-Site, Multi-Method Study; National League for Nursing: New York, NY, USA, 2006.
  51. Unver, V.; Basak, T.; Watts, P.; Gaioso, V.; Moss, J.; Tastan, S.; Iyigun, E.; Tosun, N. The reliability and validity of three questionnaires: The Student Satisfaction and Self-Confidence in Learning Scale, Simulation Design Scale, and Educational Practices Questionnaire. Contemp. Nurse 2017, 53, 60–74.
  52. Parsell, G.; Bligh, J. The development of a questionnaire to assess the readiness of health care students for interprofessional learning (RIPLS). Med. Educ. 1999, 33, 95–100.
  53. Anderson, R.M.; Fitzgerald, J.T.; Funnell, M.M.; Gruppen, L.D. The Third Version of the Diabetes Attitude Scale. Diabetes Care 1998, 21, 1403–1407.
  54. Davis, F.D. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 1989, 13, 319–340.
  55. Venkatesh, V.; Davis, F.D. A model of the antecedents of perceived ease of use: Development and test. Decis. Sci. 1996, 27, 451–481.
  56. Venkatesh, V. Determinants of Perceived Ease of Use: Integrating Control, Intrinsic Motivation, and Emotion into the Technology Acceptance Model. Inf. Syst. Res. 2000, 11, 342–365.
  57. Aebersold, M.; Tschannen, D.; Bathish, M. Innovative Simulation Strategies in Education. Nurs. Res. Pract. 2012, 2012, 765212.
  58. Gleason, K.T.; Commodore-Mensah, Y.; Wu, A.W.; Kearns, R.; Pronovost, P.; Aboumatar, H.; Himmelfarb, C.R.D. Massive open online course (MOOC) learning builds capacity and improves competence for patient safety among global learners: A prospective cohort study. Nurse Educ. Today 2021, 104, 104984.
  59. White, M.; Shellenbarger, T. Gamification of Nursing Education with Digital Badges. Nurse Educ. 2018, 43, 78–82.
  60. Alturkistani, A.; Lam, C.; Foley, K.; Stenfors, T.; Blum, E.R.; Van Velthoven, M.H.; Meinert, E. Massive Open Online Course Evaluation Methods: Systematic Review. J. Med. Internet Res. 2020, 22, e13851.
  61. Crespo, R.M.; Najjar, J.; Derntl, M.; Leony, D.; Neumann, S.; Oberhuemer, P.; Totschnig, M.; Simon, B.; Gutierrez, I.; Kloos, C.D. Aligning assessment with learning outcomes in outcome-based education. In Proceedings of the IEEE EDUCON 2010 Conference, Madrid, Spain, 14–16 April 2010; pp. 1239–1246.
  62. The University of North Carolina at Chapel Hill. Introduction to Student Learning Outcomes Assessment for Continuing Program Improvement; Office of Institutional Research and Assessment: Chapel Hill, NC, USA, 2017.
  63. Siddiqui, Z.S. Framework for an effective assessment: From rocky roads to silk route. Pak. J. Med. Sci. 2017, 33, 505–509.
  64. Shumway, J.; Harden, R. AMEE Guide No. 25: The assessment of learning outcomes for the competent and reflective physician. Med. Teach. 2003, 25, 569–584.
  65. Johnson, H.A.; Barrett, L.C. Your teaching strategy matters: How engagement impacts application in health information literacy instruction. J. Med. Libr. Assoc. 2017, 105, 44–48.
  66. Dolan, E.L.; Collins, J.P. We must teach more effectively: Here are four ways to get started. Mol. Biol. Cell 2015, 26, 2151–2155.
  67. He, Z.; Cheng, Z.; Shao, T.; Liu, C.; Shao, P.; Bishwajit, G.; Feng, D.; Feng, Z. Factors Influencing Health Knowledge and Behaviors among the Elderly in Rural China. Int. J. Environ. Res. Public Health 2016, 13, 975.
  68. Axley, L. The integration of technology into nursing curricula: Supporting faculty via the technology fellowship program. Online J. Issues Nurs. 2008, 13, 1–11.
  69. Ramnanan, C.J.; Pound, L.D. Advances in medical education and practice: Student perceptions of the flipped classroom. Adv. Med. Educ. Pract. 2017, 8, 63–73.
  70. Chiou, S.-F.; Su, H.-C.; Liu, K.-F.; Hwang, H.-F. Flipped Classroom: A New Teaching Strategy for Integrating Information Technology into Nursing Education. Hu Li Za Zhi J. Nurs. 2015, 62, 5.
  71. Benavides, L.M.C.; Arias, J.A.T.; Serna, M.D.A.; Bedoya, J.W.B.; Burgos, D. Digital Transformation in Higher Education Institutions: A Systematic Literature Review. Sensors 2020, 20, 3291.
  72. Genlott, A.A.; Grönlund, Å.; Viberg, O. Disseminating digital innovation in school—leading second-order educational change. Educ. Inf. Technol. 2019, 24, 3021–3039.
  73. Williamson, K.M.; Muckle, J. Students’ Perception of Technology Use in Nursing Education. CIN Comput. Inform. Nurs. 2018, 36, 70–76.
Figure 1. Flow diagram [34].
Table 1. Inclusion and exclusion criteria.
Inclusion criteria
Population: Students (nursing, sports science, and pharmacy)
Intervention: RQ 1: MOOC, e-learning, simulation in the field of chronic diseases; RQ 2: Assessment instruments
Outcomes: Outcomes of behavior change support education (knowledge, motivation, engagement, skills, learning outcomes, etc.)
Study design: Quantitative (e.g., case studies, randomized controlled trials, and controlled trials); qualitative (e.g., interview, questionnaire, and focus groups); and mixed methods studies
Language: English
Time frame: 2000–2021
Access: /
Exclusion criteria
Substantive inadequacy; records involving students from other professional fields; records in other languages; and reviews, comments, and protocols
Table 2. Study characteristics and quality assessment of the included studies.
No. | Author, Year | Type of Study | MMAT Score (%)
1 | Albrechtsen et al., 2017 [35] | QUAN descriptive study | 80%
2 | Basak et al., 2019 [36] | QUAN single-blinded RCT | 90%
3 | Bolesta et al., 2014 [37] | QUAN descriptive study | 80%
4 | Bonito 2019 [38] | QUAL study | 80%
5 | Bowers et al., 2017 [39] | QUAN single-blinded, single-center, cluster RS | 90%
6 | Coleman & McLaughlin 2019 [40] | MMS | 60%
7 | DeLea et al., 2010 [41] | QUAN descriptive study | 70%
8 | Isaacs et al., 2015 [42] | MMS | 90%
9 | Kolanczyk et al., 2019 [43] | MMS | 80%
10 | Moule et al., 2015 [44] | MMS | 70%
11 | Padilha et al., 2021 [45] | QUAN descriptive study | 80%
12 | Cowart & Updike 2021 [46] | MMS | 80%
13 | Schultze et al., 2019 [47] | QUAN descriptive study | 80%
14 | Sweigart et al., 2014 [48] | MMS | 50%
15 | Vyas et al., 2010 [49] | QUAN descriptive study | 70%
Legend: MMAT = Mixed Methods Appraisal Tool; MMS = mixed methods study; No. = number; RCT = randomized controlled trials; RS = randomized study; QUAL = qualitative; QUAN = quantitative.
Table 3. Instruments used for evaluating the outcomes of the research.
No. | Assessment Instruments and Short Description
1 | The post-course questionnaire included nine questions. The first eight were demographic. Question 9 consisted of 15 statements that collected data on the participant’s professional benefits from the course.
2 | The SSSC [50,51] includes 13 items, reduced to 12 in the Turkish adaptation; items were rated on a 5-point scale. The SDS [50,51] ordered 20 items in five subcategories. A 15-item performance assessment checklist of teaching skills was prepared on the basis of the literature. The feedback form contained five questions.
3 | A pre-laboratory and post-laboratory survey instrument was created as a modification of the RIPLS [52]; its 19 items used a 5-point Likert scale to assess students’ readiness for interprofessional learning.
4 | A self-administered questionnaire with open-ended questions.
5 | A 15-point checklist was used to assess each component of appropriate insulin pen counseling and injection technique. All elements were evaluated as yes/no.
6 | A short, anonymous five-item pro forma consisted of four open-ended questions and one closed-ended question. The closed-ended question, rated by participants on a five-point scale, evaluated the learning experience. The open-ended questions sought students’ perceptions of what was helpful about the simulation, how the experience could be improved, and whether any other topics would be beneficial to include in the simulated curriculum.
7 | The DAS-3 [53] included 33 questions; a further seven questions addressed confidence in diabetes education skills. Students answered the questions using a 5-point Likert scale.
8 | Data Collection Sheet Follow-Up Visit; Chronic Disease State Reflection Questions; reflections and SOAP notes. The questionnaire included 11 targeted questions on simulating chronic disease status and used a 5-point Likert scale for assessment.
9 | Focus groups and surveys. The survey questionnaire included eight questions about the simulation methods used for cardiac simulations.
10 | Questionnaire, review about a virtual patient, and comments.
11 | The questionnaire was based on the Davis Technology Acceptance Model [54,55] and on the perception of ease of use [56].
12 | Pre- and post-survey questionnaires with quantitative and qualitative questions.
13 | Entry data included demographic data and four specific factors for determining nursing students’ perception of diabetes: the number of clinical findings identified by students during the examination of the virtual patient, the total number of empathic statements shared with the virtual patient, the total number of patient education statements given to the patient, and the overall outcome of the clinical inference.
14 | Computerized evaluation of each of the virtual experiences.
15 | Pre-simulation and post-simulation quizzes with 5–15 questions specific to each simulation scenario were used to assess whether students’ knowledge increased through participation in the simulation.
Legend: DAS-3 = Diabetes Attitude Scale; No. = number; RIPLS = Readiness for Interprofessional Learning Scale; SDS = Simulation Design Scale; SGID = Small group instructional diagnosis; SOAP = Subjective, Objective, Assessment, Plan; SSSC = Student Satisfaction and Self-Confidence in Learning Scale.
Share and Cite

MDPI and ACS Style

Gosak, L.; Štiglic, G.; Budler, L.C.; Félix, I.B.; Braam, K.; Fijačko, N.; Guerreiro, M.P.; Lorber, M. Digital Tools in Behavior Change Support Education in Health and Other Students: A Systematic Review. Healthcare 2022, 10, 1. https://doi.org/10.3390/healthcare10010001

