Special Issue "eLearning"

A special issue of Future Internet (ISSN 1999-5903).

Deadline for manuscript submissions: closed (31 August 2015)

Special Issue Editors

Guest Editor
Prof. Yianna Vovides

Communication, Culture, and Technology (CCT), Georgetown University, Washington, D.C., USA
Guest Editor
Prof. Liz Bacon

Deputy Pro Vice-Chancellor (Technology Enhanced Learning), Faculty of Architecture, Computing and Humanities, University of Greenwich, Old Royal Naval College, London SE10 9LS, United Kingdom

Special Issue Information

Dear Colleagues,

Tremendous growth in eLearning over the last few years, especially through the popularization of Massive Open Online Courses (MOOCs) and microlearning, prompts us to ask: "What learning is occurring within the online space, and what are the outcomes?" Understanding what type of learning results from these efforts is still a major challenge. Many of the eLearning systems in use have limited dashboard capabilities, often going no further than basic course management components. Given the scale of recent eLearning efforts, however, dashboards, defined here as sophisticated monitoring systems, are a necessity for all participants (students, instructors, and administrators) because they have the potential to help us make sense of the learning occurring within the virtual space. What, then, are the issues surrounding the development of dashboards to visualize learning processes and outcomes? We posit that the major obstacles are no longer technical: the expertise to develop eLearning dashboards is relatively easy to identify and procure. Rather, the challenges lie in conceptualizing and designing dashboards that provide meaningful analytics and engage participants, including instructors and administrators, in making sense of what they are learning.

Within this context, this Future Internet Special Issue on eLearning aims to explore the topic of dashboards. Therefore, we invite researchers to submit manuscripts that focus on eLearning Dashboards.

Some may approach it from a learning dashboard perspective, targeting learners and focusing on questions such as "What should be on a learning dashboard that is meaningful to learners and would encourage repeat engagement and reflection on what is being learned?"

Some may approach it from a teaching dashboard perspective asking questions about how to support learners who are at risk, or those who may need more challenging learning opportunities. For example, “what information do we want to know about learners to help us predict dropout?” “When would such information be most useful?”

Others may choose to approach it from an institutional perspective with a focus on trying to capture learning experiences over time and asking questions such as “What type of programs would best engage our learners further?” or “what information do we need to capture to generate meaningful information about how our students learn?” and “how do we analyze it intelligently using techniques from, e.g., AI and Big Data?” and “What are the overall curricular outcomes and competencies?”

We also welcome manuscripts that address the broader social and ethical dimensions of what data are captured and how they are used to populate a dashboard.

Prof. Yianna Vovides
Prof. Liz Bacon
Guest Editors

Submission

Manuscripts should be submitted online at www.mdpi.com after registering and logging in to the website. Once registered, submissions can be made via the online submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Future Internet is an international peer-reviewed Open Access quarterly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 500 CHF (Swiss Francs). English correction and/or formatting fees of 250 CHF (Swiss Francs) will be charged in certain cases for those articles accepted for publication that require extensive additional formatting and/or English corrections.

Keywords

  • design
  • dashboards
  • elearning
  • analytics
  • online learning
  • eportfolios
  • engagement
  • deep learning
  • monitoring
  • big data
  • artificial intelligence

Published Papers (5 papers)


Research

Jump to: Other

Open Access Feature Paper Article: Elusive Learning—Using Learning Analytics to Support Reflective Sensemaking of Ill-Structured Ethical Problems: A Learner-Managed Dashboard Solution
Future Internet 2016, 8(2), 26; doi:10.3390/fi8020026
Received: 17 December 2015 / Revised: 12 May 2016 / Accepted: 12 May 2016 / Published: 11 June 2016
Abstract
Since the turn of the 21st century, we have seen a surge of studies on the state of U.S. education addressing issues such as cost, graduation rates, retention, achievement, engagement, and curricular outcomes. There is an expectation that graduates should be able to enter the workplace equipped to take on complex and “messy” or ill-structured problems as part of their professional and everyday life. In the context of online learning, we have identified two key issues that are elusive (hard to capture and make visible): learning with ill-structured problems and the interaction of social and individual learning. We believe that the intersection between learning and analytics has the potential, in the long-term, to minimize the elusiveness of deep learning. A proposed analytics model is described in this article that is meant to capture and also support further development of a learner’s reflective sensemaking.
(This article belongs to the Special Issue eLearning)

Open Access Feature Paper Article: Improving Teacher Effectiveness: Designing Better Assessment Tools in Learning Management Systems
Future Internet 2015, 7(4), 484-499; doi:10.3390/fi7040484
Received: 1 September 2015 / Revised: 24 November 2015 / Accepted: 1 December 2015 / Published: 18 December 2015
Abstract
Current-generation assessment tools used in K-12 and post-secondary education are limited in the type of questions they support; this limitation makes it difficult for instructors to navigate their assessment engines. Furthermore, the question types tend to score low on Bloom’s Taxonomy. Dedicated learning management systems (LMS) such as Blackboard, Moodle and Canvas are somewhat better than informal tools as they offer more question types and some randomization. Still, question types in all the major LMS assessment engines are limited. Additionally, LMSs place a heavy burden on teachers to generate online assessments. In this study we analyzed the top three LMS providers to identify inefficiencies. These inefficiencies in LMS design point us to ways to ask better questions. Our findings show that teachers have not adopted current tools because they do not offer definitive improvements in productivity. Therefore, we developed LiquiZ, a design for a next-generation assessment engine that reduces user effort and provides more advanced question types that allow teachers to ask questions that can currently only be asked in one-on-one demonstration. The initial LiquiZ project is targeted toward STEM subjects, so the question types are particularly advantageous in math or science subjects.
(This article belongs to the Special Issue eLearning)
Open Access Article: Output from Statistical Predictive Models as Input to eLearning Dashboards
Future Internet 2015, 7(2), 170-183; doi:10.3390/fi7020170
Received: 1 March 2015 / Accepted: 11 May 2015 / Published: 2 June 2015
Abstract
We describe how statistical predictive models might play an expanded role in educational analytics by giving students automated, real-time information about what their current performance means for eventual success in eLearning environments. We discuss how an online messaging system might tailor information to individual students using predictive analytics. The proposed system would be data-driven and quantitative; e.g., a message might furnish the probability that a student will successfully complete the certificate requirements of a massive open online course. Repeated messages would prod underperforming students and alert instructors to those in need of intervention. Administrators responsible for accreditation or outcomes assessment would have ready documentation of learning outcomes and actions taken to address unsatisfactory student performance. The article’s brief introduction to statistical predictive models sets the stage for a description of the messaging system. Resources and methods needed to develop and implement the system are discussed.
(This article belongs to the Special Issue eLearning)
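The messaging system described in the abstract above could be driven by any probabilistic classifier. As a minimal illustrative sketch (not the authors' model), the following uses a logistic function with hand-set, hypothetical coefficients and hypothetical input signals to turn two engagement measures into a completion probability and a tailored message:

```python
import math

# Illustrative sketch only: a logistic function with hand-set, hypothetical
# coefficients maps two eLearning signals, the fraction of assignments
# submitted and the average quiz score (both in [0, 1]), to an estimated
# probability of completing a MOOC certificate. A real system would fit
# these coefficients to historical course data.
def completion_probability(submitted_frac, avg_quiz, weights=(-3.0, 3.5, 2.5)):
    bias, w_submit, w_quiz = weights
    z = bias + w_submit * submitted_frac + w_quiz * avg_quiz
    return 1.0 / (1.0 + math.exp(-z))

# A message of the kind the abstract describes: quantitative, data-driven,
# and flagging underperforming students for possible intervention.
def progress_message(student_id, submitted_frac, avg_quiz, threshold=0.5):
    p = completion_probability(submitted_frac, avg_quiz)
    status = "on track" if p >= threshold else "at risk"
    return f"Student {student_id}: estimated completion probability {p:.0%} ({status})"
```

For example, a student with high submission and quiz rates would receive an "on track" message with a high probability, while a disengaged student would be flagged "at risk", which could also trigger an alert to the instructor.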
Open Access Article: Quantitative Analysis of the Usage of a Pedagogical Tool Combining Questions Listed as Learning Objectives and Answers Provided as Online Videos
Future Internet 2015, 7(2), 140-151; doi:10.3390/fi7020140
Received: 27 February 2015 / Revised: 29 April 2015 / Accepted: 8 May 2015 / Published: 15 May 2015
Abstract
To improve the learning of basic concepts in molecular biology of an undergraduate science class, a pedagogical tool was developed, consisting of learning objectives listed at the end of each lecture and answers to those objectives made available as videos online. The aim of this study was to determine if the pedagogical tool was used by students as instructed, and to explore students’ perception of its usefulness. A combination of quantitative survey data and measures of online viewing was used to evaluate the usage of the pedagogical practice. A total of 77 short videos linked to 11 lectures were made available to 71 students, and 64 completed the survey. Using online tracking tools, a total of 7046 views were recorded. Survey data indicated that most students (73.4%) accessed all videos, and the majority (98.4%) found the videos to be useful in assisting their learning. Interestingly, approximately half of the students (53.1%) always or most of the time used the pedagogical tool as recommended, and consistently answered the learning objectives before watching the videos. While the proposed pedagogical tool was used by the majority of students outside the classroom, only half used it as recommended, limiting the impact on students’ involvement in the learning of the material presented in class.
(This article belongs to the Special Issue eLearning)

Other

Jump to: Research

Open Access Project Report: Utilizing the ECHO Model in the Veterans Health Affairs System: Guidelines for Setup, Operations and Preliminary Findings
Future Internet 2015, 7(2), 184-195; doi:10.3390/fi7020184
Received: 21 January 2015 / Revised: 24 April 2015 / Accepted: 21 May 2015 / Published: 8 June 2015
Abstract
Background: In 2011, the Veterans Health Administration (VHA) consulted with the Project ECHO (Extension for Community Healthcare Outcomes) team at the University of New Mexico, Albuquerque, to reproduce their successful model within the VHA. Methods: The VHA launched SCAN-ECHO (Specialty Care Access Network-Extension for Community Healthcare Outcomes), a multisite videoconferencing system to conduct live clinical consultations between specialists at a VHA Medical Center (hospital) and primary care providers stationed at satellite VHA CBOCs (Community-Based Outpatient Clinic). Results: Analysis of the first three years rendered a mean attendee satisfaction of 89.53% and a consultation satisfaction score of 88.10%. About half of the SCAN-ECHO consultations resulted in patients receiving their treatment from their local primary care providers; the remaining half were referred to the VHA Medical Center when the treatment involved equipment or services not available at the CBOCs (e.g., MRI, surgery). Conclusion: This paper details the setup, operation logistics and preliminary findings, suggesting that SCAN-ECHO is a viable model for providing quality specialty clinical consultation service, prompter access to care, reduced commutes and continuing education. Additionally, the use of a secured Internet-based videoconferencing system that supports connectivity to multiple (mobile) devices could expand the utilization of this service.
(This article belongs to the Special Issue eLearning)

Journal Contact

MDPI AG
Future Internet Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
futureinternet@mdpi.com
Tel. +41 61 683 77 34
Fax: +41 61 302 89 18