Challenges and Possibilities of ICT-Mediated Assessment in Virtual Teaching and Learning Processes
Abstract
1. Introduction
- What are the meanings of ICT-mediated assessment?
- What kind of knowledge is susceptible to ICT-mediated assessment?
- What are the assessment possibilities offered by the current ICT platforms?
2. Methodology
3. Results
3.1. Systematic Review of ICT-Mediated Assessment
3.1.1. What Are the Meanings of ICT-Mediated Assessment?
3.1.2. What Kind of Knowledge Is Susceptible to ICT-Mediated Assessment?
3.1.3. What Are the Assessment Possibilities Offered by the Current ICT Platforms?
3.2. LMS Evaluation Tools
4. Discussion
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Richmond, G.; Salazar, M.; Jones, N. Assessment and the Future of Teacher Education. J. Teach. Educ. 2019, 70, 86–89. [Google Scholar] [CrossRef] [Green Version]
- Sadler, D. Interpretations of criteria-based assessment and grading in higher education. Assess. Eval. High. Educ. 2005, 30, 175–194. [Google Scholar] [CrossRef]
- Noaman, A.; Ragab, A.; Madbouly, A.; Khedra, A.; Fayoumi, A. Higher education quality assessment model: Towards achieving educational quality standard. Stud. High. Educ. 2017, 42, 23–46. [Google Scholar] [CrossRef]
- Braun, H.; Singer, J. Assessment for monitoring of education systems: International comparisons. Ann. Am. Acad. Pol. Soc. Sci. 2019, 683, 75–92. [Google Scholar] [CrossRef]
- Huang, X.; Hu, Z. On the Validity of Educational Evaluation and Its Construction. High. Educ. Stud. 2015, 5, 99–105. [Google Scholar] [CrossRef] [Green Version]
- Penuel, W. A dialogical epistemology for educational evaluation. NSSE Yearb. 2010, 109, 128–143. [Google Scholar]
- Kim, K.; Seo, E. The relationship between teacher efficacy and students’ academic achievement: A meta-analysis. Soc. Behav. Personal. 2018, 46, 529–540. [Google Scholar] [CrossRef]
- Liu, Q.; Geertshuis, S.; Grainger, R. Understanding academics’ adoption of learning technologies: A systematic review. Comput. Educ. 2020, 151, 103857. [Google Scholar] [CrossRef] [Green Version]
- Bouzguenda, I.; Alalouch, C.; Fava, N. Towards smart sustainable cities: A review of the role digital citizen participation could play in advancing social sustainability. Sustain. Cities Soc. 2019, 50, 101627. [Google Scholar] [CrossRef]
- Hernández, R.; Cáceres, I.; Zarate, J.; Coronado, D.; Loli, T.; Arévalo, G. Information and Communication Technology (ICT) and Its Practice in Educational Evaluation. J. Educ. Psychol. 2019, 7, 6–10. [Google Scholar] [CrossRef]
- Mioduser, D.; Nachmias, R.; Forkosh-Baruch, A. New Literacies for the Knowledge Society. In Second Handbook of Information Technology in Primary and Secondary Education; Voogt, J., Knezek, G., Christensen, R., Lai, K.W., Eds.; Springer: Berlin/Heidelberg, Germany, 2008; pp. 23–42. ISBN 978-0-387-73315-9. [Google Scholar]
- Charteris, J.; Quinn, F.; Parkes, M.; Fletcher, P.; Reyes, V. e-Assessment for learning and performativity in higher education: A case for existential learning. Australas. J. Educ. Technol. 2016, 32, 112–122. [Google Scholar] [CrossRef] [Green Version]
- Spector, J.M.; Ifenthaler, D.; Sampson, D.; Yang, J.L.; Mukama, E.; Warusavitarana, A.; Lokuge, K.; Eichhorn, K.; Fluck, A.; Huang, R.; et al. Technology enhanced formative assessment for 21st century learning. J. Educ. Technol. Soc. 2016, 19, 58–71. [Google Scholar]
- Xiong, Y.; Suen, H.K. Assessment approaches in massive open online courses: Possibilities, challenges and future directions. Int. Rev. Educ. 2018, 64, 241–263. [Google Scholar] [CrossRef]
- Nikou, S.A.; Economides, A.A. Mobile-based assessment: A literature review of publications in major referred journals from 2009 to 2018. Comput. Educ. 2018, 125, 101–119. [Google Scholar] [CrossRef]
- Muñoz, J.; González, C. Evaluación en Sistemas de Aprendizaje Móvil: Una revisión de la literatura. Rev. Ibérica Sist. Tecnol. Inf. 2019, E22, 187–199. [Google Scholar]
- Mousavinasab, E.; Zarifsanaiey, N.R.; Niakan, S.; Rakhshan, M.; Keikha, L.; Ghazi, M. Intelligent tutoring systems: A systematic review of characteristics, applications, and evaluation methods. Interact. Learn. Env. 2018. [Google Scholar] [CrossRef]
- Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G. Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement. PLOS Med. 2009, 6. [Google Scholar] [CrossRef] [Green Version]
- Kasim, N.; Khalid, F. Choosing the Right Learning Management System (LMS) for the Higher Education Institution Context: A Systematic Review. Int. J. Emerg. Technol. Learn. 2016, 11, 55–61. [Google Scholar] [CrossRef] [Green Version]
- Makokotlela, M.V. An E-Portfolio as an Assessment Strategy in an Open Distance Learning Context. Int. J. Inf. Commun. Technol. Educ. 2020, 16, 122–134. [Google Scholar] [CrossRef]
- Chen, S.Y.; Tseng, Y.F. The impacts of scaffolding e-assessment English learning: A cognitive style perspective. Comput. Assist. Lang. Learn. 2019. [Google Scholar] [CrossRef]
- Formanek, M.; Wenger, M.C.; Buxner, S.R.; Impey, C.D.; Sonam, T. Insights about large-scale online peer assessment from an analysis of an astronomy MOOC. Comput. Educ. 2017, 113, 243–262. [Google Scholar] [CrossRef]
- Wimmer, H.; Powell, L.; Kilgus, L.; Force, C. Improving Course Assessment via Web-based Homework. Int. J. Online Pedagog. Course Des. 2017, 7, 1–19. [Google Scholar] [CrossRef]
- Steif, P.S.; Fu, L.; Kara, L.B. Providing formative assessment to students solving multipath engineering problems with complex arrangements of interacting parts: An intelligent tutor approach. Interact. Learn. Environ. 2016, 24, 1864–1880. [Google Scholar] [CrossRef]
- Martin, F.; Ritzhaupt, A.; Kumar, S.; Budhrani, K. Award-winning faculty online teaching practices: Course design, assessment and evaluation, and facilitation. Internet High. Educ. 2019, 42, 34–43. [Google Scholar] [CrossRef]
- Lee, Y.; Rofe, J.S. Paragogy and flipped assessment: Experience of designing and running a MOOC on research methods. Open Learn. 2016, 31, 116–129. [Google Scholar] [CrossRef] [Green Version]
- García-Peñalvo, F.J.; Corell, A.; Abella-García, V.; Grande, M. Online assessment in higher education in the time of COVID-19. Educ. Knowl. Soc. 2020. [Google Scholar] [CrossRef]
- Thoma, B.; Turnquist, A.; Zaver, F.; Hall, A.K.; Chan, T.M. Communication, learning and assessment: Exploring the dimensions of the digital learning environment. Med. Teach. 2019, 41, 385–390. [Google Scholar] [CrossRef]
- Hills, L.; Clarke, A.; Hughes, J.; Butcher, J.; Shelton, I.; McPherson, E. Chinese whispers? Investigating the consistency of the language of assessment between a distance education institution, its tutors and students. Open Learn. 2018, 33, 238–249. [Google Scholar] [CrossRef]
- Leppisaari, I.; Peltoniemi, J.; Hohenthal, T.; Im, Y. Searching for effective peer assessment models for improving online learning in HE–Do-It-Yourself (DIY) case. J. Interact. Learn. Res. 2018, 29, 507–528. [Google Scholar]
- Alizadeh, T.; Tomerini, D.; Colbran, S. Teaching planning studios: An online assessment task to enhance the first year experience. J. Plan. Educ. Res. 2017, 37, 234–245. [Google Scholar] [CrossRef]
- Fenton-O’Creevy, M.; van Mourik, C. ‘I understood the words but I didn’t know what they meant’: Japanese online MBA students’ experiences of British assessment practices. Open Learn. 2019, 31, 130–140. [Google Scholar] [CrossRef]
- Hills, L.; Hughes, J. Assessment worlds colliding? Negotiating between discourses of assessment on an online open course. Open Learn. 2016, 31, 108–115. [Google Scholar] [CrossRef] [Green Version]
- Garcia-Loro, F.; Martin, S.; Ruiperez-Valiente, J.A.; San Cristobal, E.; Castro, M. Reviewing and analyzing peer review Inter-Rater Reliability in a MOOC platform. Comput. Educ. 2020, 154, 103894. [Google Scholar] [CrossRef]
- Li, X. Self-assessment as ‘assessment as learning’in translator and interpreter education: Validity and washback. Interpret. Transl. Train. 2018, 12, 48–67. [Google Scholar] [CrossRef]
- Chrysafiadi, K.; Troussas, C.; Virvou, M. Combination of fuzzy and cognitive theories for adaptive e-assessment. Expert Syst. Appl. 2020, 161, 113614. [Google Scholar] [CrossRef]
- Chiu, P.S.; Pu, Y.H.; Kao, C.C.; Wu, T.T.; Huang, Y.M. An authentic learning based evaluation method for mobile learning in Higher Education. Innov. Educ. Teach. Int. 2018, 55, 336–347. [Google Scholar] [CrossRef]
- Tsai, S.C. Effectiveness of ESL students’ performance by computational assessment and role of reading strategies in courseware-implemented business translation tasks. Comput. Assist. Lang. Learn. 2017, 30, 474–487. [Google Scholar] [CrossRef]
- Cohen, D.; Sasson, I. Online quizzes in a virtual learning environment as a tool for formative assessment. J. Technol. Sci. Educ. 2016, 6, 188–208. [Google Scholar]
- Wang, Y.; Liang, Y.; Liu, L.; Liu, Y. A multi-peer assessment platform for programming language learning: Considering group non-consensus and personal radicalness. Interact. Learn. Environ. 2016, 24, 2011–2031. [Google Scholar] [CrossRef]
- Lajane, H.; Gouifrane, R.; Qaisar, R.; Noudmi, F.; Lotfi, S.; Chemsi, G.; Radid, M. Formative e-Assessment for Moroccan Polyvalent Nurses Training: Effects and Challenges. Int. J. Emerg. Technol. Learn. 2020, 15, 236–251. [Google Scholar] [CrossRef]
- Astalini, A.; Darmaji, D.; Kurniawan, W.; Anwar, K.; Kurniawan, D. Effectiveness of Using E-Module and E-Assessment. Int. J. Inf. Manag. 2019, 13, 21–38. [Google Scholar]
- Bhagat, K.K.; Liou, W.K.; Michael Spector, J.; Chang, C.Y. To use augmented reality or not in formative assessment: A comparative study. Interact. Learn. Environ. 2019, 27, 830–840. [Google Scholar] [CrossRef]
- Robertson, S.N.; Humphrey, S.M.; Steele, J.P. Using Technology Tools for Formative Assessments. J. Educ. Online 2019, 16, 2. [Google Scholar] [CrossRef]
- Amasha, M.A.; Abougalala, R.A.; Reeves, A.J.; Alkhalaf, S. Combining Online Learning & Assessment in synchronization form. Educ. Inf. Technol. 2018, 23, 2517–2529. [Google Scholar] [CrossRef]
- Lin, C.Y.; Wang, T.H. Implementation of personalized e-Assessment for remedial teaching in an e-Learning environment. Eurasia J. Math. Sci. Technol. Educ. 2017, 13, 1045–1058. [Google Scholar] [CrossRef]
- Petrović, J.; Pale, P.; Jeren, B. Online formative assessments in a digital signal processing course: Effects of feedback type and content difficulty on students learning achievements. Educ. Inf. Technol. 2017, 22, 3047–3061. [Google Scholar] [CrossRef]
- Vakili, S.; Ebadi, S. Exploring EFL learners developmental errors in academic writing through face-to-Face and Computer-Mediated dynamic assessment. Comput. Assist. Lang. Learn. 2019, 1–36. [Google Scholar] [CrossRef]
- Usher, M.; Barak, M. Peer assessment in a project-based engineering course: Comparing between on-campus and online learning environments. Assess. Eval. High. Educ. 2018, 43, 745–759. [Google Scholar] [CrossRef]
- Mumtaz, K.; Iqbal, M.M.; Khalid, S.; Rafiq, T.; Owais, S.M.; Al Achhab, M. An E-assessment framework for blended learning with augmented reality to enhance the student learning. Eurasia J. Math. Sci. Technol. Educ. 2017, 13, 4419–4436. [Google Scholar] [CrossRef]
- Bahar, M.; Asil, M. Attitude towards e-assessment: Influence of gender, computer usage and level of education. Open Learn. 2018, 33, 221–237. [Google Scholar] [CrossRef]
- Al-Emran, M.; Salloum, S.A. Students’ attitudes towards the use of mobile technologies in e-Evaluation. Int. J. Interact. Mob. Technol. 2017, 11, 195–202. [Google Scholar] [CrossRef]
- Cakiroglu, U.; Erdogdu, F.; Kokoc, M.; Atabay, M. Students’ Preferences in Online Assessment Process: Influences on Academic Performances. Turk. Online J. Distance Educ. 2017, 18, 132–142. [Google Scholar] [CrossRef]
- McCarthy, J. Enhancing feedback in higher education: Students’ attitudes towards online and in-class formative assessment feedback models. Act. Learn. High. Educ. 2017, 18, 127–141. [Google Scholar] [CrossRef]
- Becerra-Alonso, D.; Lopez-Cobo, I.; Gómez-Rey, P.; Fernández-Navarro, F.; Barbera, E. EduZinc: A tool for the creation and assessment of student learning activities in complex open, online, and flexible learning environments. Distance Educ. 2020, 41, 86–105. [Google Scholar] [CrossRef]
- Gañán, D.; Caballé, S.; Clarisó, R.; Conesa, J.; Bañeres, D. ICT-FLAG: A web-based e-assessment platform featuring learning analytics and gamification. Int. J. Web Inf. Syst. 2017, 13, 25–54. [Google Scholar] [CrossRef]
- Farias, G.; Muñoz de la Peña, D.; Gómez-Estern, F.; De la Torre, L.; Sánchez, C.; Dormido, S. Adding automatic evaluation to interactive virtual labs. Interact. Learn. Environ. 2016, 24, 1456–1476. [Google Scholar] [CrossRef]
- Mora, N.; Caballe, S.; Daradoumis, T. Providing a multi-fold assessment framework to virtualized collaborative learning in support for engineering education. Int. J. Emerg. Technol. Learn. 2016, 11, 41–51. [Google Scholar] [CrossRef] [Green Version]
- Barra, E.; López-Pernas, S.; Alonso, Á.; Sánchez-Rada, J.F.; Gordillo, A.; Quemada, J. Automated Assessment in Programming Courses: A Case Study during the COVID-19 Era. Sustainability 2020, 12, 7451. [Google Scholar] [CrossRef]
- Sekendiz, B. Utilisation of formative peer-assessment in distance online education: A case study of a multi-model sport management unit. Interact. Learn. Env. 2018, 26, 682–694. [Google Scholar] [CrossRef]
- Watson, C.; Wilson, A.; Drew, V.; Thompson, T.L. Small data, online learning and assessment practices in higher education: A case study of failure? Assess. Eval. High. Educ. 2017, 42, 1030–1045. [Google Scholar] [CrossRef] [Green Version]
- Albazar, H. A New Automated Forms Generation Algorithm for Online Assessment. J. Inf. Knowl. Manag. 2020, 19, 2040008. [Google Scholar] [CrossRef]
- Tan, P.J.; Hsu, M.H. Designing a system for English evaluation and teaching devices: A PZB and TAM model analysis. Eurasia J. Math. Sci. Technol. Educ. 2018, 14, 2107–2119. [Google Scholar] [CrossRef]
- Saha, S.K.; Rao, D. Development of a practical system for computerized evaluation of descriptive answers of middle school level students. Interact. Learn. Env. 2019. [Google Scholar] [CrossRef]
- Jayashankar, S.; Sridaran, R. Superlative model using word cloud for short answers evaluation in eLearning. Educ. Inf. Technol. 2017, 22, 2383–2402. [Google Scholar] [CrossRef]
- Mimirinis, M. Qualitative differences in academics’ conceptions of e-assessment. Assess. Eval. High. Educ. 2019, 44, 233–248. [Google Scholar] [CrossRef]
- Babo, R.; Suhonen, J. E-assessment with multiple choice questions: A qualitative study of teachers’ opinions and experience regarding the new assessment strategy. Int. J. Learn. Technol. 2018, 13, 220–248. [Google Scholar] [CrossRef]
- Zhan, Y.; So, W.W. Views and practices from the chalkface: Development of a formative assessment multimedia learning environment. Technol. Pedagog. Inf. 2017, 26, 501–515. [Google Scholar] [CrossRef]
- Su, H. Educational Assessment of the Post-Pandemic Age: Chinese Experiences and Trends Based on Large-Scale Online Learning. Educ. Meas. 2020, 39, 37–40. [Google Scholar] [CrossRef]
- Tenorio, T.; Bittencourt, I.I.; Isotani, S.; Pedro, A.; Ospina, P. A gamified peer assessment model for online learning environments in a competitive context. Comput. Hum. Behav. 2016, 64, 247–263. [Google Scholar] [CrossRef]
- Ramírez-Donoso, L.; Pérez-Sanagustín, M.; Neyem, A. MyMOOCSpace: Mobile cloud-based system tool to improve collaboration and preparation of group assessments in traditional engineering courses in higher education. Comput. Appl. Eng. Educ. 2018, 26, 1507–1518. [Google Scholar] [CrossRef]
- Mihret, D.G.; Abayadeera, N.; Watty, K.; McKay, J. Teaching auditing using cases in an online learning environment: The role of ePortfolio assessment. ACC Educ. 2017, 26, 335–357. [Google Scholar] [CrossRef]
- Purkayastha, S.; Surapaneni, A.K.; Maity, P.; Rajapuri, A.S.; Gichoya, J.W. Critical Components of Formative Assessment in Process-Oriented Guided Inquiry Learning for Online Labs. Electron. J. E-Learn. 2019, 17, 79–92. [Google Scholar] [CrossRef] [Green Version]
- Link, S.; Mehrzad, M.; Rahimi, M. Impact of automated writing evaluation on teacher feedback, student revision, and writing improvement. Comput. Assist. Lang. Learn. 2020, 1–30. [Google Scholar] [CrossRef]
- McVey, M. Preservice Teachers’ Perception of Assessment Strategies in Online Teaching. J. Digit. Learn. Teach. Educ. 2016, 32, 119–127. [Google Scholar] [CrossRef]
- Ebadi, S.; Rahimi, M. Mediating EFL learners’ academic writing skills in online dynamic assessment using Google Docs. Comput. Assist. Lang. Learn. 2019, 32, 527–555. [Google Scholar] [CrossRef]
- Hardianti, R.D.; Taufiq, M.; Pamelasari, S.D. The development of alternative assessment instrument in web-based scientific communication skill in science education seminar course. J. Pendidik. IPA Indones. 2017, 6. [Google Scholar] [CrossRef] [Green Version]
- Johansson, E. The Assessment of Higher-order Thinking Skills in Online EFL Courses: A Quantitative Content Analysis. Engl. Stud. 2020, 19, 224–256. [Google Scholar]
- Magal-Royo, T.; Garcia Laborda, J.; Price, S. A New m-Learning Scenario for a Listening Comprehension Assessment Test in Second Language Acquisition [SLA]. J. Univers. Comput. Sci. 2017, 23, 1200–1214. [Google Scholar]
- Maguire, P.; Maguire, R.; Kelly, R. Using automatic machine assessment to teach computer programming. Comput. Sci. Educ. 2017, 27, 197–214. [Google Scholar] [CrossRef] [Green Version]
- Mustakerov, I.; Borissova, D. A Framework for Development of e-learning System for computer programming: Application in the C programming Language. J. E-Learn. Knowl. Soc. 2017, 13, 2. [Google Scholar]
- Hu, Y.; Wu, B.; Gu, X. Learning analysis of K-12 students’ online problem solving: A three-stage assessment approach. Interact. Learn. Environ. 2017, 25, 262–279. [Google Scholar] [CrossRef]
- Mayeshiba, M.; Jansen, K.R.; Mihlbauer, L. An Evaluation of Critical Thinking in Competency-Based and Traditional Online Learning Environments. Online Learn. 2018, 22, 77–89. [Google Scholar] [CrossRef] [Green Version]
- Thompson, M.M.; Braude, E.J. Evaluation of Knowla: An online assessment and learning tool. J. Educ. Comput. Res. 2016, 54, 483–512. [Google Scholar] [CrossRef]
- Roberts, A.M.; LoCasale-Crouch, J.; Hamre, B.K.; Buckrop, J.M. Adapting for Scalability: Automating the Video Assessment of Instructional Learning. Online Learn. 2017, 21, 257–272. [Google Scholar] [CrossRef] [Green Version]
- Neumann, M.M.; Worrall, S.; Neumann, D.L. Validation of an expressive and receptive tablet assessment of early literacy. J. Res. Technol. Educ. 2019, 51, 326–341. [Google Scholar] [CrossRef]
- Wilson, M.; Scalise, K.; Gochyyev, P. Domain modelling for advanced learning environments: The BEAR Assessment System Software. Educ. Psychol. 2019, 39, 1199–1217. [Google Scholar] [CrossRef]
- Birjali, M.; Beni-Hssane, A.; Erritali, M. A novel adaptive e-learning model based on Big Data by using competence-based knowledge and social learner activities. Appl. Soft Comput. 2018, 69, 14–32. [Google Scholar] [CrossRef]
- Nikou, S.A.; Economides, A.A. An outdoor mobile-based assessment activity: Measuring students’ motivation and acceptance. Int. J. Interact. Mob. Technol. 2016, 10, 11–17. [Google Scholar] [CrossRef] [Green Version]
- Wang, Y.; Fang, H.; Jin, Q.; Ma, J. SSPA: An effective semi-supervised peer assessment method for large scale MOOCs. Interact. Learn. Environ. 2019. [Google Scholar] [CrossRef]
- García, A.C.; Gil-Mediavilla, M.; Álvarez, I.; Casares, M.D. Evaluación entre iguales en entornos de educación superior online mediante el taller de Moodle. Estudio de caso. Form. Univ. 2020, 13, 119–126. [Google Scholar] [CrossRef]
- Holmes, N. Engaging with assessment: Increasing student engagement through continuous assessment. Act. Learn. High. Educ. 2018, 19, 23–34. [Google Scholar] [CrossRef] [Green Version]
- Romero, L.; Gutierrez, M.; Caliusco, M.L. Semantic modeling of portfolio assessment in e-learning environment. Adv. Sci. Technol. Eng. Syst. J. 2017, 2, 149–156. [Google Scholar] [CrossRef] [Green Version]
- Nikou, S.A.; Economides, A.A. Mobile-based assessment: Investigating the factors that influence behavioral intention to use. Comput. Educ. 2017, 109, 56–73. [Google Scholar] [CrossRef]
- Karay, Y.; Reiss, B.; Schauber, S.K. Progress testing anytime and anywhere–Does a mobile-learning approach enhance the utility of a large-scale formative assessment tool? Med. Teach. 2020, 42, 1154–1162. [Google Scholar] [CrossRef]
- Nikou, S.A.; Economides, A.A. Mobile-Based Assessment: Integrating acceptance and motivational factors into a combined model of Self-Determination Theory and Technology Acceptance. Comput. Hum. Behav. 2017, 68, 83–95. [Google Scholar] [CrossRef]
- Nikou, S.A.; Economides, A.A. The impact of paper-based, computer-based and mobile-based self-assessment on students’ science motivation and achievement. Comput. Hum. Behav. 2016, 55, 1241–1248. [Google Scholar] [CrossRef]
- Altınay, Z. Evaluating peer learning and assessment in online collaborative learning environments. Behav. Inf. Technol. 2017, 36, 312–320. [Google Scholar] [CrossRef]
- Chee, K.; Yahaya, N.; Ibrahim, N.; Hasan, M.N. Review of mobile learning trends 2010–2015: A meta-analysis. Educ. Technol. Soc. 2017, 20, 113–126. [Google Scholar]
- Jo, J.; Jun, H.; Lim, H. A comparative study on gamification of the flipped classroom in engineering education to enhance the effects of learning. Comput. Appl. Eng. Educ. 2018, 26, 1626–1640. [Google Scholar] [CrossRef]
- Arana, A.I.; Gironés, M.V.; Olagaray, M.L. Mejora de los procesos de evaluación mediante analítica visual del aprendizaje. Educ. Knowl. Soc. 2020, 21, 9. [Google Scholar]
- Deena, G.; Raja, K.; PK, N.B.; Kannan, K. Developing the Assessment Questions Automatically to Determine the Cognitive Level of the E-Learner Using NLP Techniques. Int. J. Serv. Sci. Manag. Eng. Technol. 2020, 11, 95–110. [Google Scholar] [CrossRef]
- Aljohany, D.A.; Salama, R.M.; Saleh, M. ASSA: Adaptive E-Learning Smart Students Assessment Model. Int. J. Adv. Comput. Sci. Appl. 2018, 9, 128–136. [Google Scholar] [CrossRef]
- Paiva, R.C.; Ferreira, M.S.; Frade, M.M. Intelligent tutorial system based on personalized system of instruction to teach or remind mathematical concepts. J. Comput. Assist. Learn. 2017, 33, 370–381. [Google Scholar] [CrossRef]
- Bendaly Hlaoui, Y.; Hajjej, F.; Jemni Ben Ayed, L. Learning analytics for the development of adapted e-assessment workflow system. Comput. Appl. Eng. Educ. 2016, 24, 951–966. [Google Scholar] [CrossRef]
- Almuayqil, S.; Abd El-Ghany, S.A.; Shehab, A. Towards an Ontology-Based Fully Integrated System for Student E-Assessment. J. Appl. Inf. Technol. 2020, 98, 21. [Google Scholar]
- Khdour, T. A semantic assessment framework for e-learning systems. Int. J. Knowl. Learn. 2020, 13, 110–122. [Google Scholar] [CrossRef]
- Vachharajani, V.; Pareek, J. Effective Structure Matching Algorithm for Automatic Assessment of Use-Case Diagram. Int. J. Distance Educ. Technol. 2020, 18, 31–50. [Google Scholar] [CrossRef]
- Daradoumis, T.; Puig, J.M.; Arguedas, M.; Liñan, L.C. Analyzing students’ perceptions to improve the design of an automated assessment tool in online distributed programming. Comput. Educ. 2019, 128, 159–170. [Google Scholar] [CrossRef]
- Santhanavijayan, A.; Balasundaram, S.R.; Narayanan, S.H.; Kumar, S.V.; Prasad, V.V. Automatic generation of multiple choice questions for e-assessment. Int. J. Signal. Imaging Syst. Eng. 2017, 10, 54–62. [Google Scholar] [CrossRef]
- Striewe, M. An architecture for modular grading and feedback generation for complex exercises. Sci. Comput. Program. 2016, 129, 35–47. [Google Scholar] [CrossRef]
- Khlaisang, J.; Koraneekij, P. Open online assessment management system platform and instrument to enhance the information, media, and ICT literacy skills of 21st century learners. Int. J. Emerg. Technol. Learn. 2019, 14, 111–127. [Google Scholar] [CrossRef]
- Nissen, J.M.; Jariwala, M.; Close, E.W.; Van Dusen, B. Participation and performance on paper-and computer-based low-stakes assessments. Int. J. Stem Educ. 2018, 5, 21. [Google Scholar] [CrossRef] [PubMed]
- Kortemeyer, G. Scalable continual quality control of formative assessment items in an educational digital library: An empirical study. Int. J. Digit. Libr. 2016, 17, 143–155. [Google Scholar] [CrossRef]
- Kranenburg, L.; Reerds, S.T.; Cools, M.; Alderson, J.; Muscarella, M.; Grijpink, K.; Quigley, C.; Drop, S.L. Global application of assessment of competencies of Paediatric endocrinology fellows in the Management of Differences of sex development (DSD) using the ESPE e-learning. org portal. Med. Sci. Educ. 2016, 26, 679–689. [Google Scholar] [CrossRef] [Green Version]
- Tsai, S.C. Implementing interactive courseware into EFL business writing: Computational assessment and learning satisfaction. Interact. Learn. Environ. 2019, 27, 46–61. [Google Scholar] [CrossRef]
- Lowe, T.W.; Mestel, B.D. Using STACK to support student learning at masters level: A case study. Teach. Math. Its Appl. 2020, 39, 61–70. [Google Scholar] [CrossRef]
- Massing, T.; Schwinning, N.; Striewe, M.; Hanck, C.; Goedicke, M. E-assessment using variable-content exercises in mathematical statistics. J. Stat. Educ. 2018, 26, 174–189. [Google Scholar] [CrossRef] [Green Version]
- Ilahi-Amri, M.; Cheniti-Belcadhi, L.; Braham, R. A Framework for Competence based e-Assessment. IxDA 2017, 32, 189–204. [Google Scholar]
- Misut, M.; Misutova, M. Software Solution Improving Productivity and Quality for Big Volume Students’ Group Assessment Process. Int. J. Emerg. Technol. Learn. 2017, 12, 175–190. [Google Scholar] [CrossRef] [Green Version]
- Gwynllyw, D.R.; Weir, I.S.; Henderson, K.L. Using DEWIS and R for multi-staged statistics e-Assessments. Teach. Math. Its Appl. 2016, 35, 14–26. [Google Scholar] [CrossRef] [Green Version]
- Khalaf, B. Traditional and Inquiry-Based Learning Pedagogy: A Systematic Critical Review. Int. J. Instr. 2018, 11, 545–564. [Google Scholar] [CrossRef]
- Magolda, M. Creating Contexts for Learning and Self-Authorship: Constructive-Developmental Pedagogy; Vanderbilt University Press: Nashville, TN, USA, 1999. [Google Scholar]
- Breunig, M. Turning experiential education and critical pedagogy theory into praxis. J. Exp. Educ. 2005, 28, 106–122. [Google Scholar] [CrossRef]
- Shcherbinin, M.; Vasilievich Kruchinin, S.; Gennadievich Ivanov, A. MOOC and MOOC degrees: New learning paradigm and its specifics. Manag. Appl. Sci. Tech. 2019, 10, 1–14. [Google Scholar]
| Criteria | Description |
|---|---|
| 1. Database | SCOPUS, SAGE Journals, Taylor & Francis |
| 2. Type of publication | Research journal |
| 3. Year of publication | Between 2016 and 2020 |
| 4. Inclusion criteria | ICT-mediated assessment and evaluation; ICT-mediated assessment approaches; perception studies in online settings; development and description of tools and platforms for online assessment; experimental research comparing online and face-to-face assessment |
| 5. Exclusion criteria | Evaluation of online platforms, digital contents, and online learning methodologies not focused on assessment processes; design of courses, platforms, or digital contents; assessment approaches not related to online settings; assessment instruments not related to online settings; no access to the full text |
| Research Question | Category | Elements | Type of Analysis |
|---|---|---|---|
| All questions | Year of publication | ~ | Quantitative synthesis |
| | Country | ~ | Quantitative synthesis |
| | Type of study | Perception; experimental; descriptive; case study; technological development | Quantitative synthesis |
| (1) What are the meanings of ICT-mediated assessment? | Actors | Students; teachers; students and teachers | Quantitative synthesis |
| | What is evaluated? | Contents; skills; outcomes | Quantitative synthesis and meta-analysis |
| | Purpose of the evaluation | ~ | Meta-analysis |
| (2) What kind of knowledge is susceptible to ICT-mediated assessment? | Fields of knowledge | Foreign language; social sciences; sciences; engineering; art and humanities; economic and administrative sciences; education; health | Quantitative synthesis |
| | Levels | K-12; higher education | Quantitative synthesis |
| (3) What are the assessment possibilities offered by the current ICT platforms? | Digital tools, strategies, and platforms | ~ | Meta-analysis |
| | Pedagogical approach | Traditional; nontraditional; critical | Quantitative synthesis and meta-analysis |
| Strategy | Sample References | Description | Challenges |
|---|---|---|---|
| Self-assessment | [35,97] | | |
| Peer-assessment | [22,30,34,40,49,60,70,90,98] | | |
| Mobile assessment | [15,37,52,71,89,94,95,96,97,99] | | |
| Gamification | [56,70,100] | | |
| Augmented reality | [43,50] | | |
| Learning analytics and adaptive assessments | [36,55,87,88,101,102,103,104,105] | | |
| Automated assessment | [56,57,59,64,71,106,107,108,109,110,111] | | |
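Several of these strategies boil down to simple score-handling logic. As a minimal, illustrative sketch (not taken from any of the cited platforms; the function name and threshold are assumptions), peer-assessment typically pools several peer ratings per submission and flags high-disagreement cases for instructor review, one practical response to the inter-rater reliability challenge discussed in the MOOC peer-review literature:

```python
from statistics import median, pstdev

def aggregate_peer_scores(ratings, disagreement_threshold=1.5):
    """Aggregate the peer ratings given to one submission.

    Returns the median score together with a flag marking submissions
    whose raters disagree strongly (candidates for instructor review).
    """
    score = median(ratings)      # robust to a single outlier rater
    spread = pstdev(ratings)     # population std. dev. of the ratings
    needs_review = spread > disagreement_threshold
    return score, needs_review

# Three peers broadly agree: the median is accepted automatically.
score, flag = aggregate_peer_scores([7, 8, 7])

# Strong disagreement: the median is still returned, but flagged.
score2, flag2 = aggregate_peer_scores([2, 9, 10])
```

Using the median rather than the mean keeps one extreme rater from shifting the grade, while the spread-based flag routes only contested submissions to the instructor.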
| Software and Platforms | Reference | Description |
|---|---|---|
| SOCRATIVE | [44] | Web-based platform for quizzes |
| EduZinc | [55] | Application for creating customized assessment activities according to the needs and skills of students |
| ICT-FLAG | [56] | Formative assessment tool including learning analytics and gamification services |
| CC-LR prototype | [58] | Collaborative complex learning resource with a personalized awareness and feedback system, using learning analytics |
| FAMLE | [68] | Formative assessment multimedia learning environment based on assessment tasks that measure performance, learning, and knowledge, displaying learning data to students and teachers |
| MeuTutor | [70] | Intelligent tutoring system for monitoring the formative process |
| MyMOOCSpace | [71] | Cloud-based mobile system for collaborative learning processes |
| KNOWLA | [84] | Knowledge-assembly web-based interface for creating and grading assessments in which students assemble a set of scrambled fragments into a logical order |
| BASS 1.0 | [87] | Web-based system for the design, development, and delivery of assessments and feedback |
| TCU-AMS | [112] | Open online assessment management system compatible with the Open edX platform, supporting traditional self- and peer-assessment |
| DSLab | [109] | Web-based system with automatic assessment, feedback, and interactive comparison between the student's solution and the correct solution |
| COBLE | [101] | Competence-based learning environment that gives students visual information about assessment |
| LASSO | [113] | Learning About STEM Student Outcomes: web-based platform with assessment instruments in several disciplines |
| LON-CAPA | [114] | Open-source platform for creating and developing assessments |
| ESPE-eLearning | [115] | European Society for Paediatric Endocrinology e-learning portal for medical training |
| English for International Trade and Business | [116] | Multimedia platform for Chinese EFL students, including a self-checking and feedback system |
| Adobe Connect | [98] | Software for online training and web conferencing |
| STACK | [117] | Open-source system for randomized questions, integrated into Moodle, with automatic feedback; oriented to mathematical and algebraic questions |
| JACK | [118] | Computer-assisted assessment platform, originally created for programming courses; currently supports several fields |
| SWAP-COMP | [119] | Platform to support competence-based learning |
| SANCHO | [120] | Client-server application to support automatic evaluation of text |
| Cloud-AWAS | [105] | Cloud Adapted Workflow e-Assessment System; supports learning analytics and can be integrated into any LMS |
| DEWIS | [121] | e-Assessment system integrating embedded R code; supports statistical analysis assessment |
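The randomized, automatically graded exercises offered by systems such as STACK and DEWIS follow a common pattern: a seeded generator produces a per-student variant of the question, and a shared marking rule grades the numeric answer within a tolerance. The sketch below illustrates that pattern only; the function names and question template are our own, not taken from either system:

```python
import random

def make_exercise(seed):
    """Generate one variant of a parameterised exercise plus its grader.

    Each student (seed) gets different coefficients, but the marking
    rule -- a numeric answer within a tolerance -- is shared, so grading
    and feedback are fully automatic.
    """
    rng = random.Random(seed)            # reproducible per-student variant
    a, b = rng.randint(2, 9), rng.randint(1, 9)
    question = f"Solve for x: {a}x + {b} = 0"
    expected = -b / a

    def grade(answer, tol=1e-3):
        """Return (score, feedback) for a submitted numeric answer."""
        if abs(answer - expected) <= tol:
            return 1.0, "Correct."
        return 0.0, f"Not quite: isolate x in {a}x + {b} = 0 and try again."

    return question, grade

# The same seed always reproduces the same variant, so a student can be
# re-shown exactly the question on which they were graded.
question, grade = make_exercise(seed=20)
score, feedback = grade(0.0)
```

Because the variant is a pure function of the seed, the platform need not store every generated question, only the seed per student, which is how such systems scale to large cohorts.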
| Pedagogical Approaches | Emphasis | Knowledge Structure-Curriculum | Teacher-Student Relationship | Evaluation and Assessment |
|---|---|---|---|---|
| Traditional [122] | | | | |
| Nontraditional: experiential, New School, cognitivist, developmental, constructivist [123] | | | | |
| Critical [124] | | | | |
| LMS Platform | Type of Questions | Configurations |
|---|---|---|
| Moodle 1 | | |
| DOKEOS 2 | | |
| Claroline 3 | | |
| SAKAI 4 | | |
| Microsoft Teams 5 | | |
| Google Classroom 6 | | |
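Most of these LMSs exchange question banks through structured import formats; Moodle's quiz module, for instance, accepts its own Moodle XML format. A minimal multiple-choice question might look like the following sketch (the question content is illustrative, not drawn from the reviewed studies):

```xml
<quiz>
  <question type="multichoice">
    <name><text>Formative assessment example</text></name>
    <questiontext format="html">
      <text><![CDATA[<p>Which practice is primarily formative?</p>]]></text>
    </questiontext>
    <!-- fraction is the percentage of the grade awarded for this answer -->
    <answer fraction="100"><text>An online quiz with immediate feedback</text></answer>
    <answer fraction="0"><text>A final closed-book summative exam</text></answer>
    <shuffleanswers>true</shuffleanswers>
    <single>true</single>
  </question>
</quiz>
```

Configurations such as answer shuffling, partial credit via `fraction`, and per-answer feedback are what distinguish the question engines compared in the table above.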
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Torres-Madroñero, E.M.; Torres-Madroñero, M.C.; Ruiz Botero, L.D. Challenges and Possibilities of ICT-Mediated Assessment in Virtual Teaching and Learning Processes. Future Internet 2020, 12, 232. https://doi.org/10.3390/fi12120232