Open Access Article
Educ. Sci. 2018, 8(1), 36; https://doi.org/10.3390/educsci8010036

Optimal Weighting for Exam Composition

1 Ganzfried Research, Miami Beach, FL 33139, USA
2 School of Computing and Information Sciences, Florida International University, Miami, FL 33139, USA
* Author to whom correspondence should be addressed.
Received: 4 January 2018 / Revised: 10 February 2018 / Accepted: 1 March 2018 / Published: 9 March 2018
(This article belongs to the Special Issue Artificial Intelligence and Education)

Abstract

A problem faced by many instructors is that of designing exams that accurately assess the abilities of the students. Typically, these exams are prepared several days in advance, and generic question scores are used based on a rough approximation of the question difficulty and length. For example, for a recent class taught by the author, there were 30 multiple-choice questions worth 3 points, 15 true/false-with-explanation questions worth 4 points, and 5 analytical exercises worth 10 points. We describe a novel framework in which machine-learning algorithms are used to modify the exam question weights in order to optimize the exam scores, using the overall final course score as a proxy for a student's true ability. We show that a significant error reduction can be obtained by our approach over standard weighting schemes: for the final and midterm exams, the mean absolute error of prediction decreases by 90.58% and 97.70%, respectively, for the linear regression approach, resulting in better estimation. We make several new observations regarding the properties of the "good" and "bad" exam questions that can have an impact on the design of improved future evaluation methods.
Keywords: intelligent tutoring systems; collaborative learning; student modelling; supervised learning
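As a rough illustration of the approach described in the abstract (not the authors' implementation), the sketch below fits per-question weights by ordinary linear regression, using the overall course score as the prediction target and comparing against a fixed points-per-question scheme. The synthetic data, the 30/15/5 question split, and the use of scikit-learn are assumptions made for illustration only.

```python
# Minimal sketch (not the paper's code): learning exam question weights by
# linear regression, with the overall course score used as a proxy for a
# student's true ability. All data below is synthetic; the 30/15/5 split of
# 3-, 4-, and 10-point questions mirrors the example given in the abstract.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

n_students = 120
n_questions = 30 + 15 + 5  # multiple choice, true/false, analytical
standard_weights = np.array([3] * 30 + [4] * 15 + [10] * 5, dtype=float)

# Synthetic per-question scores in [0, 1] (fraction of credit earned),
# driven by a latent "ability" plus noise.
ability = rng.uniform(0.4, 1.0, size=n_students)
scores = np.clip(ability[:, None] + rng.normal(0, 0.15, (n_students, n_questions)), 0, 1)

# Proxy for true ability: the overall final course score (simulated here).
final_score = 100 * ability + rng.normal(0, 2, n_students)

# Standard weighting: fixed points per question type, rescaled to 0-100.
standard_exam = 100 * (scores @ standard_weights) / standard_weights.sum()

# Learned weighting: regress the proxy score on per-question performance.
model = LinearRegression().fit(scores, final_score)
learned_exam = model.predict(scores)

print("MAE, standard weights:", mean_absolute_error(final_score, standard_exam))
print("MAE, learned weights: ", mean_absolute_error(final_score, learned_exam))
```

On synthetic data of this kind, the regression-based weighting typically tracks the proxy score much more closely than the fixed 3/4/10-point scheme, which is the type of error reduction the article reports for its real exam data.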

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Cite This Article

MDPI and ACS Style

Ganzfried, S.; Yusuf, F. Optimal Weighting for Exam Composition. Educ. Sci. 2018, 8, 36.
