Digital Assessment in Technology-Enriched Education: Thematic Review
Abstract
1. Introduction
2. Method
3. Results of Research Strategy
4. Findings and Discussion
4.1. Conditions and Suggestions
4.2. Opportunities
4.3. Challenges
4.4. Future Studies and Significant Notes
5. Conclusions and Limitations
- The development of digital skills (for the educator and learner);
- The meaningful selection of appropriate technologies with clear assessment criteria;
- The guidance of student learning and formative training before summative assessment (applying self-assessment);
- The assessment of knowledge and skills at different levels;
- Useful and timely feedback for educators and learners;
- Availability and individualization;
- Ensuring academic integrity.
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Daniela, L. (Ed.) Why smart pedagogy? In Didactics of Smart Pedagogy: Smart Pedagogy for Technology Enhanced Learning; Springer: Cham, Switzerland, 2019; pp. 7–12. [Google Scholar]
- Amante, L.; Oliveira, I.R.; Gomes, M.J. E-Assessment in Portuguese higher education: Framework and perceptions of teachers and students. In Handbook of Research on E-Assessment in Higher Education; Azevedo, A., Azevedo, J., Eds.; IGI Global: Hershey, PA, USA, 2019; pp. 312–333. [Google Scholar] [CrossRef]
- Babo, R.; Babo, L.; Suhonen, J.; Tukiainen, M. E-assessment with multiple-choice questions: A 5 year study of students’ opinions and experience. J. Inf. Technol. Educ. Innov. Pract. 2020, 19, 1–29. [Google Scholar] [CrossRef]
- Chen, C.-H.; Koong, C.-S.; Liao, C. Influences of integrating dynamic assessment into a speech recognition learning design to support students’ English speaking skills, learning anxiety and cognitive load. Educ. Technol. Soc. 2022, 25, 1–14. [Google Scholar]
- Alruwais, N.; Wills, G.; Wald, M. Advantages and challenges of using e-assessment. Int. J. Inf. Educ. Technol. 2018, 8, 34–37. [Google Scholar] [CrossRef]
- Timmis, S.; Broadfoot, P.; Sutherland, R.; Oldfield, A. Rethinking assessment in a digital age: Opportunities, challenges and risks. Br. Educ. Res. J. 2016, 42, 454–476. [Google Scholar] [CrossRef]
- Raaheim, A.; Mathiassen, K.; Moen, V.; Lona, I.; Gynnild, V.; Bunæs, B.R.; Hasle, E.T. Digital assessment—How does it challenge local practices and national law? A Norwegian case study. Eur. J. High. Educ. 2019, 9, 219–231. [Google Scholar] [CrossRef]
- Appiah, M.; van Tonder, F. E-Assessment in higher education: A review. Int. J. Bus. Manag. Econ. Res. 2018, 9, 1454–1460. [Google Scholar]
- Al-Fraihat, D.; Joy, M.; Masa’deh, R.; Sinclair, D. Evaluating E-learning systems success: An empirical study. Comput. Hum. Behav. 2020, 102, 67–86. [Google Scholar] [CrossRef]
- Lin, C.-H.; Wu, W.-H.; Lee, T.-N. Using an online learning platform to show students’ achievements and attention in the video lecture and online practice learning environments. Educ. Technol. Soc. 2022, 25, 155–165. [Google Scholar]
- Lavidas, K.; Komis, V.; Achriani, A. Explaining faculty members’ behavioral intention to use learning management systems. J. Comput. Educ. 2022, 9, 707–725. [Google Scholar] [CrossRef]
- Huang, R.; Spector, J.M.; Yang, J. Educational Technology: A Primer for the 21st Century; Springer: Singapore, 2019. [Google Scholar] [CrossRef]
- Milakovich, M.E.; Wise, J.M. Digital Learning: The Challenges of Borderless Education; Edward Elgar Publishing: Cheltenham, UK, 2019. [Google Scholar]
- Marek, M.W.; Wu, P.N. Digital learning curriculum design: Outcomes and affordances. In Pedagogies of Digital Learning in Higher Education; Daniela, L., Ed.; Routledge: Abingdon-on-Thames, UK, 2020; pp. 163–182. [Google Scholar] [CrossRef]
- Carless, D. Feedback for student learning in higher education. In International Encyclopedia of Education, 4th ed.; Tierney, R., Rizvi, F., Ercikan, K., Smith, G., Eds.; Elsevier: Amsterdam, The Netherlands, 2022; pp. 623–629. [Google Scholar]
- Kazmi, B.A.; Riaz, U. Technology-enhanced learning activities and student participation. In Learning and Teaching in Higher Education: Perspectives from a Business School; Daniels, K., Elliott, C., Finley, S., Chapmen, C., Eds.; Edward Elgar Publishing: Cheltenham, UK, 2020; pp. 177–183. [Google Scholar]
- Prensky, M. Teaching Digital Natives: Partnering for Real Learning; SAGE: Richmond, BC, Canada, 2010. [Google Scholar]
- Schofield, K. Theorising about learning and knowing. In Learning and Teaching in Higher Education: Perspectives from a Business School; Daniels, K., Elliott, C., Finley, S., Chapmen, C., Eds.; Edward Elgar Publishing: Cheltenham, UK, 2020; pp. 2–12. [Google Scholar]
- Alemdag, E.; Yildirim, Z. Design and development of an online formative peer assessment environment with instructional scaffolds. Educ. Technol. Res. Dev. 2022, 70, 1359–1389. [Google Scholar] [CrossRef]
- Guardia, L.; Maina, M.; Mancini, F.; Naaman, H. EPICA—Articulating skills for the workplace. In The Envisioning Report for Empowering Universities, 4th ed.; Ubach, G., Ed.; European Association of Distance Teaching Universities: Maastricht, The Netherlands, 2020; pp. 26–29. [Google Scholar]
- Podsiad, M.; Havard, B. Faculty acceptance of the peer assessment collaboration evaluation tool: A quantitative study. Educ. Technol. Res. Dev. 2020, 68, 1381–1407. [Google Scholar] [CrossRef]
- Ibarra-Sáiz, M.S.; Rodríguez-Gómez, G.; Boud, D.; Rotsaert, T.; Brown, S.; Salinas-Salazar, M.L.; Rodríguez-Gómez, H.M. The future of assessment in higher education. RELIEVE 2020, 26, art. M1. [Google Scholar] [CrossRef]
- Loureiro, P.; Gomes, M.J. Online Peer Assessment for Learning: Findings from Higher Education Students. Educ. Sci. 2023, 13, 253. [Google Scholar] [CrossRef]
- Lin, C.-J. An online peer assessment approach to supporting mind-mapping flipped learning activities for college English writing courses. J. Comput. Educ. 2019, 6, 385–415. [Google Scholar] [CrossRef]
- Robertson, S.N.; Humphrey, S.M.; Steele, J.P. Using technology tools for formative assessments. J. Educ. Online 2019, 16, 1–10. [Google Scholar] [CrossRef]
- Kempe, A.L.; Grönlund, Å. Collaborative digital textbooks—A comparison of five different designs shaping teaching and learning. Educ. Inf. Technol. 2019, 24, 2909–2941. [Google Scholar] [CrossRef]
- Chin, H.; Chew, C.M.; Lim, H.L. Incorporating feedback in online cognitive diagnostic assessment for enhancing grade five students’ achievement in ‘time’. J. Comput. Educ. 2021, 8, 183–212. [Google Scholar] [CrossRef]
- Jensen, L.X.; Bearman, M.; Boud, D. Understanding feedback in online learning—A critical review and metaphor analysis. Comput. Educ. 2021, 173, 104271. [Google Scholar] [CrossRef]
- Gikandi, J.W.; Morrow, D.; Davis, N.E. Online formative assessment in higher education: A review of the literature. Comput. Educ. 2011, 57, 2333–2351. [Google Scholar] [CrossRef]
- Alshaikh, A.A. The degree of utilizing e-assessment techniques at Prince Sattam Bin Abdulaziz University: Faculty perspectives. J. Educ. Soc. Res. 2020, 10, 238–249. [Google Scholar] [CrossRef]
- Butler-Henderson, K.; Crawford, J. A systematic review of online examinations: A pedagogical innovation for scalable authentication and integrity. Comput. Educ. 2020, 159, 104024. [Google Scholar] [CrossRef] [PubMed]
- Gallatly, R.; Carciofo, R. Using an online discussion forum in a summative coursework assignment. J. Educ. Online 2020, 17, 1–12. [Google Scholar]
- Finley, S. Writing effective multiple choice questions. In Learning and Teaching in Higher Education: Perspectives from a Business School; Daniels, K., Elliott, C., Finley, S., Chapmen, C., Eds.; Edward Elgar Publishing: Cheltenham, UK, 2020; pp. 295–303. [Google Scholar]
- Ibbet, N.L.; Wheldon, B.J. The incidence of clueing in multiple choice testbank questions in accounting: Some evidence from Australia. E-J. Bus. Educ. Scholarsh. Teach. 2016, 10, 20–35. [Google Scholar]
- Eaton, S.E. Academic integrity during COVID-19: Reflections from the University of Calgary. ISEA 2020, 48, 80–85. [Google Scholar]
- Dawson, P. Defending Assessment Security in a Digital World: Preventing E-Cheating and Supporting Academic Integrity in Higher Education; Routledge: London, UK, 2021. [Google Scholar]
- Stödberg, U. A research review of e-assessment. Assess. Eval. High. Educ. 2012, 37, 591–604. [Google Scholar] [CrossRef]
- Ridgway, J.; McCusker, S.; Pead, D. Literature Review of E-Assessment; Project Report; Futurelab: Bristol, UK, 2004. [Google Scholar]
- Booth, A.; Noyes, J.; Flemming, K.; Gerhardus, A.; Wahlster, P.; Van Der Wilt, G.J.; Mozygemba, K.; Refolo, P.; Sacchini, D.; Tummers, M.; et al. Guidance on Choosing Qualitative Evidence Synthesis Methods for use in Health Technology Assessments of Complex Interventions. 2016. Available online: http://www.integrate-hta.eu/downloads/ (accessed on 17 January 2022).
- Flemming, K.; Noyes, J. Qualitative evidence synthesis: Where are we at? Int. J. Qual. Methods 2021, 20, 1–13. [Google Scholar] [CrossRef]
- Zawacki-Richter, O.; Kerres, M.; Bedenlier, S.; Bond, M.; Buntins, K. (Eds.) Systematic Reviews in Educational Research: Methodology, Perspectives and Application; Springer: Wiesbaden, Germany, 2020. [Google Scholar]
- The PRISMA 2020 Statement: An Updated Guideline for Reporting Systematic Reviews. Available online: https://www.bmj.com/content/372/bmj.n71 (accessed on 2 February 2022).
- Rof, A.; Bikfalvi, A.; Marques, P. Pandemic-accelerated digital transformation of a born digital higher education institution: Towards a customized multimode learning strategy. Educ. Technol. Soc. 2022, 25, 124–141. [Google Scholar]
- Hong, J.-C.; Liu, X.; Cao, W.; Tai, K.-H.; Zhao, L. Effects of self-efficacy and online learning Mind States on learning ineffectiveness during the COVID-19 Lockdown. Educ. Technol. Soc. 2022, 25, 142–154. [Google Scholar]
- Bozkurt, A.; Sharma, R.C. Emergency remote teaching in a time of global crisis due to CoronaVirus pandemic. Asian J. Distance Educ. 2020, 15, 1–6. [Google Scholar] [CrossRef]
- Chiou, P.Z. Learning cytology in times of pandemic: An educational institutional experience with remote teaching. J. Am. Soc. Cytopathol. 2020, 9, 579–585. [Google Scholar] [CrossRef]
- Hodges, C.; Moore, S.; Lockee, B.; Trust, T.; Bond, A. The Difference Between Emergency Remote Teaching and Online Learning. Available online: https://er.educause.edu/articles/2020/3/the-difference-between-emergency-remote-teaching-and-online-learning#fn17 (accessed on 14 January 2022).
- Newman, M.; Gough, D. Systematic reviews in educational research: Methodology, perspectives and application. In Systematic Reviews in Educational Research: Methodology, Perspectives and Application; Zawacki-Richter, O., Kerres, M., Bedenlier, S., Bond, M., Buntins, K., Eds.; Springer: Wiesbaden, Germany, 2019; pp. 3–22. [Google Scholar] [CrossRef]
- Ghouali, K.; Cecilia, R.R. Towards a Moodle-based assessment of Algerian EFL students’ writing performance. Porta Linguarum 2021, 36, 231–248. [Google Scholar] [CrossRef]
- Poth, C. The contributions of mixed insights to advancing technology-enhanced formative assessments within higher education learning environments: An illustrative example. Int. J. Educ. Technol. High. Educ. 2018, 15, 9. [Google Scholar] [CrossRef]
- Barana, A.; Marchisio, M. An interactive learning environment to empower engagement in Mathematics. Interact. Des. Archit. J. 2020, 45, 302–321. [Google Scholar] [CrossRef]
- Lajane, H.; Gouifrane, R.; Qaisar, R.; Noudmi, F.Z.; Lotfi, S.; Chemsi, G.; Radid, M. Formative e-assessment for Moroccan Polyvalent nurses training: Effects and challenges. Int. J. Emerg. Technol. Learn. 2020, 15, 236–251. [Google Scholar] [CrossRef]
- Babo, R.; Rocha, J.; Fitas, R.; Suhonen, J.; Tukiainen, M. Self and peer e-assessment: A study on software usability. Int. J. Inf. Commun. Technol. Educ. 2021, 17, 68–85. [Google Scholar] [CrossRef]
- Guerrero-Roldán, A.E.; Rodriguez-Gonzalez, M.E.; Karadeniz, A.; Kocdar, S.; Aleksieva, L.; Peytcheva-Forsyth, R. Students’ experiences on using an authentication and authorship checking system in e-assessment. Hacet. Univ. J. Educ. 2019, 35, 6–24. [Google Scholar] [CrossRef]
- Bahati, B.; Fors, U.; Hansen, P.; Nouri, J.; Mukama, E. Measuring learner satisfaction with formative e-assessment strategies. Int. J. Emerg. Technol. Learn. 2019, 14, 61–79. [Google Scholar] [CrossRef]
- Mimirinis, M. Qualitative differences in academics’ conceptions of e-assessment. Assess. Eval. High. Educ. 2018, 44, 233–248. [Google Scholar] [CrossRef]
- Mahmood, H.K.; Hussain, F.; Mahmood, M.; Kumail, R.; Abbas, J. Impact of e-assessment at middle school students’ learning—An empirical study at USA middle school students. Int. J. Sci. Eng. Res. 2020, 11, 1722–1736. [Google Scholar]
- Deeley, S.J. Using technology to facilitate effective assessment for learning and feedback in higher education. Assess. Eval. High. Educ. 2018, 43, 439–448. [Google Scholar] [CrossRef]
- Swart, O.; Shuttleworth, C.C. The new face of alternative assessment in accounting sciences—Technology as an anthropomorphic stakeholder. S. Afr. J. High. Educ. 2021, 35, 200–219. [Google Scholar] [CrossRef]
- Kocdar, S.; Karadeniz, A.; Peytcheva-Forsyth, R.; Stoeva, V. Cheating and plagiarism in e-assessment: Students’ perspectives. Open Prax. 2018, 10, 221–235. [Google Scholar] [CrossRef]
- Moccozet, L.; Benkacem, O.; Berisha, E.; Trindade, R.T.; Bürgi, P.-Y. A versatile and flexible framework for summative e-assessment in higher education. Int. J. Contin. Eng. Educ. Life-Long Learn. 2019, 29, 1–18. [Google Scholar]
- Ghilay, Y. Computer assisted assessment (CAA) in higher education: Multi-text and quantitative courses. J. Online High. Educ. 2019, 3, 13–34. [Google Scholar]
- Garcez, A.; Silva, R.; Franco, M. Digital transformation shaping structural pillars for academic entrepreneurship: A framework proposal and research agenda. Educ. Inf. Technol. 2022, 27, 1159–1182. [Google Scholar] [CrossRef] [PubMed]
- Bamoallem, B.; Altarteer, S. Remote emergency learning during COVID-19 and its impact on university students perception of blended learning in KSA. Educ. Inf. Technol. 2022, 27, 157–179. [Google Scholar] [CrossRef]
- Al-Azawei, A.; Baiee, W.R.; Mohammed, M.A. Learners’ experience towards e-assessment tools: A comparative study on virtual reality and Moodle quiz. Int. J. Emerg. Technol. Learn. 2019, 14, 34–50. [Google Scholar] [CrossRef]
- Martinez, V.; Mon, M.A.; Alvarez, M.; Fueyo, E.; Dobarro, A. E-Self-Assessment as a strategy to improve the learning process at university. Educ. Res. Int. 2020, 2020, 3454783. [Google Scholar] [CrossRef]
- Wu, W.; Berestova, A.; Lobuteva, A.; Stroiteleva, N. An intelligent computer system for assessing student performance. Int. J. Emerg. Technol. Learn. 2021, 16, 31–45. [Google Scholar] [CrossRef]
- McCallum, S.; Milner, M.M. The effectiveness of formative assessment: Student views and staff reflections. Assess. Eval. High. Educ. 2020, 46, 1–16. [Google Scholar] [CrossRef]
- Wong, S.F.; Mahmud, M.M.; Wong, S.S. Effectiveness of formative e-assessment procedure: Learning calculus in blended learning environment. In Proceedings of the 2020 8th International Conference on Communications and Broadband Networking, Auckland, New Zealand, 15–18 April 2020. [Google Scholar] [CrossRef]
- Weir, I.; Gwynllyw, R.; Henderson, K. A case study in the e-assessment of statistics for non-specialists. J. Univ. Teach. Learn. Pract. 2021, 18, 3–20. [Google Scholar] [CrossRef]
- Cramp, J.; Medlin, J.F.; Lake, P.; Sharp, C. Lessons learned from implementing remotely invigilated online exams. J. Univ. Teach. Learn. Pract. 2019, 16, 1–20. [Google Scholar] [CrossRef]
- Makokotlela, M.V. An E-Portfolio as an assessment strategy in an open distance learning context. Int. J. Inf. Commun. Technol. Educ. 2020, 16, 122–134. [Google Scholar] [CrossRef]
- Peytcheva-Forsyth, R.; Aleksieva, L.; Yovkova, B. The impact of prior experience of e-learning and e-assessment on students’ and teachers’ approaches to the use of a student authentication and authorship checking system. In Proceedings of the 10th International Conference on Education and New Learning Technologies, Palma, Spain, 2–4 July 2018; pp. 2311–2321. [Google Scholar]
- Danniels, E.; Pyle, A.; DeLuca, C. The role of technology in supporting classroom assessment in playbased kindergarten. Teach. Teach. Educ. 2020, 88, 102966. [Google Scholar] [CrossRef]
- Babo, R.; Suhonen, J. E-assessment with multiple choice questions: A qualitative study of teachers’ opinions and experience regarding the new assessment strategy. Int. J. Learn. Technol. 2018, 13, 220–248. [Google Scholar] [CrossRef]
First Author | Year | Country | Method | Technology | Research units | Theme |
---|---|---|---|---|---|---|
Al-Azawei | 2019 | Iraq | Survey; pre-test; post-test | Moodle, Samsung GEAR VR | 32 students | E-assessment tools: virtual reality and Moodle quiz |
Babo | 2018 | Portugal | Focus groups | Moodle | 4 lecturers | E-assessment with multiple choice questions |
Babo | 2021 | Portugal, Finland | Description | WebPA, TeamMates, InteDashboard, Workshop, PeerMark, iPeer | 5 researchers | Self and peer e-assessment: software usability |
Babo | 2020 | Portugal, Finland | Survey; interviews | Moodle | 815 students | E-assessment with multiple-choice questions |
Bahati | 2019 | Sweden | Survey | Moodle | 108 students | Formative e-assessment strategies |
Barana | 2020 | Italy | Survey | Interactive White Board | 278 students | Automatic formative assessment |
Cramp | 2019 | Australia | Reflections | UniSA Online | 9 study courses | Implementation of remotely invigilated online exams |
Danniels | 2020 | Canada | Observations; interviews | NA | 20 kindergarten classes | Classroom assessment in play-based kindergarten |
Deeley | 2018 | UK | Focus groups; interviews | Moodle, Camtasia, Echo360, Google Glass | 20 students | Technology to facilitate effective assessment |
Ghilay | 2019 | Israel | Survey; document analysis | Moodle | >76 students | Effectiveness of computer-assisted assessment |
Ghouali | 2021 | Algeria, Spain | Pre-test; post-test | Moodle | 42 students | Moodle-based assessment |
Guerrero-Roldán | 2018 | Spain | Reflection; description | NA | 1 study course | Aligning e-assessment to competences and learning activities |
Guerrero-Roldán | 2019 | Spain, Bulgaria, Turkey | Survey | TeSLA | 735 students | Authentication and authorship checking system |
Kocdar | 2018 | Turkey, Bulgaria | Survey | TeSLA | 952 students | Cheating and plagiarism in e-assessment |
Lajane | 2020 | Morocco | Interviews | Chamilo | 10 teachers, 58 students | Formative e-assessment |
Mahmood | 2020 | USA | Survey | NA | 130 students | Impact of e-assessment on learning |
Makokotlela | 2020 | South Africa | Interviews; document analysis | NA | 9 students; 13 e-portfolios | Use of an e-portfolio |
Martinez | 2020 | Spain | Survey | Moodle | 316 students | E-self-assessment as improver of performance |
McCallum | 2020 | UK | Survey | Moodle | 245 students | Effectiveness of formative e-assessment |
Mimirinis | 2018 | UK | Interviews | Electronic management of assessment | 21 academics | Academics’ conceptions of e-assessment |
Moccozet | 2019 | Switzerland | Mixed | Bring your own device approach | NA | Framework for summative e-assessment |
Peytcheva-Forsyth | 2018 | Bulgaria | Survey; interviews | TeSLA | 285 students, 100 staff | Prevention of cheating and plagiarism |
Poth | 2018 | Canada | Survey; observation; document analysis | Moodle | 274 students, 175 observations, 26 summaries | Two technology-enhanced formative assessment classroom strategies |
Swart | 2021 | South Africa | Interviews | ODeL | NA | Alternative assessment and technology |
Weir | 2021 | UK | Case study; reflection | Dewis | 123 students | E-assessment of statistics |
Wong | 2020 | Malaysia | Data from e-learning system | Elearn | 41 students | Effectiveness of the formative e-assessment |
Wu | 2021 | Russia, China | Data from system | ASP.NET | 50 students | An intelligent computer system for assessment |
Categories | Thematic Groups | Codes |
---|---|---|
Conditions and suggestions | Application of assessment | to guide student learning; self-assessment |
 | Assessment features | authentic; consistent; transparent; practicable; combination of different methods; focus on higher-order skills; to ensure academic integrity; algorithm-based; open answers; immediate feedback; interactive feedback; contextualized; fit for purposes; student-centered pedagogy; available for a long time; feedback-rich; repeated |
 | Operating conditions | training and resources; avoidance of resistance to change; commitment of all involved; adequate guidance (learnability); availability; constant practice and use; student-generated questions |
 | Role of educator | support for achievement; detect plagiarism; to ensure fairness; quality of questions; to explain assessment; to be flexible and open; to ensure usefulness of feedback |
 | Technologies | administrative transactions; to ensure authorship and authentication; collecting results; safe user data; to provide examination; secure web browser; clear navigation; error recognition; user satisfaction; hybrid technologies |
Opportunities | General characteristics | innovative and powerful; flexible, efficient and convenient; accessible and flexible |
 | Specific benefits | training before summative; increasing feedback; increasing objectivity, consistency and fairness; temporal and spatial flexibility; meta-cognitive processes; responsibility for own learning; to develop critical thinking and the capacity for analysis; immediate feedback (timely); flexibility in time and place; academic integrity; students’ learning benefits; makes writing easier; access for students with disabilities; encouraging high-order thinking; improving student performance; individualization; student self-assessment and evaluation; more engagement; collaborative learning from peer assessment; skills that enrich learning; detailed feedback |
 | Specific benefits for educators | development of technological competencies; personal development and growth; faster process; storage and sharing of assessment data; processing and application of assessment data; decreasing cost; increasing efficiency; facilitating design, submission and correction of tests; functionality of the tests; managing and streamlining the assessment process |
Challenges | For educators | to ensure digital pedagogy; implementation for high-stake examination; more time for creating tasks; to detect impersonation, contract cheating and plagiarism; to detect the possibility of ‘hidden’ usage of additional materials; students’ resistance to using technology; privacy and data security; the time required to prepare the exam; avoidance of memorization; to test conceptual understanding; to assess complex, open-ended work; limitations of automated marking; limited ability to recognize the correct input |
 | For institution | e-assessments for high-stake examinations; costs (software, hardware); lack of an institutional platform; Wi-Fi network; training in innovative approaches integrating ICT; the diversity of tasks required from teachers; the poor ICT infrastructure |
 | For students | test completion time |
 | For students and educators | lack of computer skills; connectivity and support; lack of technical support; inadequate training; technology experience; lack of comfort |
© 2023 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Jurāne-Brēmane, A. Digital Assessment in Technology-Enriched Education: Thematic Review. Educ. Sci. 2023, 13, 522. https://doi.org/10.3390/educsci13050522