Open Access | Feature Paper | Article
Future Internet 2015, 7(4), 484-499; doi:10.3390/fi7040484

Improving Teacher Effectiveness: Designing Better Assessment Tools in Learning Management Systems

1 Department of Electrical and Computer Engineering, Stevens Institute of Technology, Hoboken, NJ 07030, USA
2 Instructional Technology, Stevens Institute of Technology, Hoboken, NJ 07030, USA
* Author to whom correspondence should be addressed.
Academic Editor: Liz Bacon
Received: 1 September 2015 / Revised: 24 November 2015 / Accepted: 1 December 2015 / Published: 18 December 2015
(This article belongs to the Special Issue eLearning)
Abstract

Current-generation assessment tools used in K-12 and post-secondary education are limited in the types of questions they support; this limitation makes it difficult for instructors to navigate their assessment engines. Furthermore, the question types tend to score low on Bloom’s Taxonomy. Dedicated learning management systems (LMS) such as Blackboard, Moodle and Canvas are somewhat better than informal tools, as they offer more question types and some randomization. Still, question types in all the major LMS assessment engines are limited. Additionally, LMSs place a heavy burden on teachers to generate online assessments. In this study, we analyzed the top three LMS providers to identify inefficiencies. These inefficiencies in LMS design point us to ways to ask better questions. Our findings show that teachers have not adopted current tools because they do not offer definitive improvements in productivity. Therefore, we developed LiquiZ, a design for a next-generation assessment engine that reduces user effort and provides more advanced question types, allowing teachers to ask questions that can currently only be asked in one-on-one demonstration. The initial LiquiZ project is targeted toward STEM subjects, so the question types are particularly advantageous in math and science.
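
The abstract highlights two design goals for such an engine: per-student randomization and richer, STEM-oriented question types with automatic grading. The sketch below is purely illustrative and not drawn from the paper; the names ParameterizedNumericQuestion, instantiate, render, and grade are hypothetical assumptions, not LiquiZ's actual API. It shows one way an assessment engine could model a numeric question whose values are randomized for each student and graded against a formula with a tolerance.

```typescript
// Hypothetical sketch (not LiquiZ's actual API): a parameterized numeric
// question type where values are drawn at random per student and the
// response is checked against a computed answer with a tolerance.

interface ParameterSpec {
  name: string;
  min: number;
  max: number;
}

interface ParameterizedNumericQuestion {
  prompt: string;                                   // may reference parameters as {name}
  parameters: ParameterSpec[];
  answer: (values: Record<string, number>) => number;
  tolerance: number;                                // accepted absolute error
}

// Draw a random value for each parameter.
function instantiate(q: ParameterizedNumericQuestion): Record<string, number> {
  const values: Record<string, number> = {};
  for (const p of q.parameters) {
    values[p.name] = p.min + Math.random() * (p.max - p.min);
  }
  return values;
}

// Render the prompt with the drawn values substituted in.
function render(q: ParameterizedNumericQuestion, values: Record<string, number>): string {
  return q.prompt.replace(/\{(\w+)\}/g, (_match, name) => values[name].toFixed(1));
}

// Grade a student's response automatically.
function grade(q: ParameterizedNumericQuestion, values: Record<string, number>, response: number): boolean {
  return Math.abs(response - q.answer(values)) <= q.tolerance;
}

// Example: an Ohm's-law question that each student sees with different numbers.
const ohmsLaw: ParameterizedNumericQuestion = {
  prompt: "A resistor of {r} ohms carries {i} amperes. What is the voltage across it?",
  parameters: [
    { name: "r", min: 10, max: 100 },
    { name: "i", min: 0.5, max: 2.0 },
  ],
  answer: (v) => v.r * v.i,
  tolerance: 0.1,
};

const values = instantiate(ohmsLaw);
console.log(render(ohmsLaw, values));
console.log(grade(ohmsLaw, values, values.r * values.i)); // true
```

Tolerance-based grading of a computed answer is one way an engine of this kind could auto-grade math and science responses without exact string matching; it is offered here only as a sketch under the assumptions stated above.
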
Keywords: Learning Management Systems; web-based assessment; LMS; CMS; HOTS; Bloom’s taxonomy; STEM
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Kruger, D.; Inman, S.; Ding, Z.; Kang, Y.; Kuna, P.; Liu, Y.; Lu, X.; Oro, S.; Wang, Y. Improving Teacher Effectiveness: Designing Better Assessment Tools in Learning Management Systems. Future Internet 2015, 7, 484-499.
