Article

Evaluating AI Courses: A Valid and Reliable Instrument for Assessing Artificial-Intelligence Learning through Comparative Self-Assessment

by
Matthias Carl Laupichler
1,*,
Alexandra Aster
1,
Jan-Ole Perschewski
2 and
Johannes Schleiss
2
1
Institute of Medical Education, University Hospital Bonn, 53127 Bonn, Germany
2
Artificial Intelligence Lab, Otto von Guericke University Magdeburg, 39106 Magdeburg, Germany
*
Author to whom correspondence should be addressed.
Educ. Sci. 2023, 13(10), 978; https://doi.org/10.3390/educsci13100978
Submission received: 28 July 2023 / Revised: 20 September 2023 / Accepted: 21 September 2023 / Published: 26 September 2023
(This article belongs to the Collection Artificial Intelligence (AI) and Education)

Abstract

A growing number of courses seek to increase the basic artificial-intelligence skills ("AI literacy") of their participants. At this time, there is no valid and reliable measurement tool for assessing AI-learning gains, yet such a tool is important for quality assurance and for comparability across courses. In this study, a validated AI-literacy-assessment instrument, the "scale for the assessment of non-experts' AI literacy" (SNAIL), was adapted and used to evaluate an undergraduate AI course. We investigated whether the scale can be used to reliably evaluate AI courses and whether mediator variables, such as attitudes toward AI or participation in other AI courses, influenced learning gains. In addition to traditional mean comparisons (i.e., t-tests), the comparative self-assessment (CSA) gain was calculated, which allowed for a more meaningful assessment of the increase in AI literacy. We found preliminary evidence that the adapted SNAIL questionnaire enables a valid evaluation of AI-learning gains. In particular, distinctions among different subconstructs, as well as differentiation from related constructs such as attitudes toward AI, seem to be possible with the help of the SNAIL questionnaire.
Keywords: AI literacy; AI-literacy scale; artificial intelligence education; assessment; course evaluation; comparative self-assessment
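The CSA gain mentioned in the abstract can be illustrated with a minimal sketch. It assumes the formula commonly attributed to Raupach et al., gain (%) = 100 × (pre − post) / (pre − best), on a Likert scale where the value `best` marks the highest self-rated competence; the function name and scale orientation are illustrative, not taken from the article itself.

```python
def csa_gain(pre_mean: float, post_mean: float, best: float = 1.0) -> float:
    """Comparative self-assessment (CSA) gain in percent.

    pre_mean:  mean retrospective pre-course self-rating
    post_mean: mean post-course self-rating
    best:      scale value representing the highest competence
               (assumed to be 1 on a typical Likert scale here)
    """
    if pre_mean == best:
        # No room for improvement: ratings already at the scale optimum.
        return 0.0
    return 100.0 * (pre_mean - post_mean) / (pre_mean - best)


# Example: ratings improve from a mean of 4.0 to 2.5 on a 1-6 scale
# (1 = highest competence): half of the possible gain is realized.
print(csa_gain(4.0, 2.5))  # → 50.0
```

Unlike a raw pre/post difference, this ratio expresses the achieved gain relative to the gain that was still possible, which is why it supports more meaningful comparisons between groups starting at different levels.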

Share and Cite

MDPI and ACS Style

Laupichler, M.C.; Aster, A.; Perschewski, J.-O.; Schleiss, J. Evaluating AI Courses: A Valid and Reliable Instrument for Assessing Artificial-Intelligence Learning through Comparative Self-Assessment. Educ. Sci. 2023, 13, 978. https://doi.org/10.3390/educsci13100978
