Article

TEADASH: Implementing and Evaluating a Teacher-Facing Dashboard Using Design Science Research

by Ngoc Buu Cat Nguyen 1,*, Marcus Lithander 2, Christian Master Östlund 1, Thashmee Karunaratne 2,* and William Jobe 3
1 Department of Media and Design, School of Business, Economics and IT, University West, 461 32 Trollhättan, Sweden
2 Department of Learning in Engineering Sciences, Royal Institute of Technology (KTH), 114 28 Stockholm, Sweden
3 Informatics Department, School of Business, Economics and IT, University West, 461 32 Trollhättan, Sweden
* Authors to whom correspondence should be addressed.
Informatics 2024, 11(3), 61; https://doi.org/10.3390/informatics11030061
Submission received: 18 June 2024 / Revised: 15 August 2024 / Accepted: 21 August 2024 / Published: 26 August 2024

Abstract

The benefits of teacher-facing dashboards are incontestable, yet evidence of their long-term use, meaningful usability, and maturity remains limited. This paper therefore uses design science research and critical theory to design and develop TEADASH, a dashboard that supports teachers in making decisions on teaching and learning. Three design science research cycles and multiple smaller loops were carried out to develop the dashboard, which was then deployed and evaluated in real time in authentic courses. Five courses from two Swedish universities were included in this study. Co-design with teachers proved crucial to the applicability of the dashboard, while letting teachers use the tool during their courses was even more important for helping them to recognize the features they actually use and the tool’s usefulness for their teaching practices. TEADASH addresses the issues identified in prior work, aligns with the learning design, and meets teachers’ needs. The technical and co-design aspects, as well as the advantages and challenges of applying TEADASH in practice, are also discussed.

1. Introduction

Interactive dashboards are crucial artifacts that help specialists and the general public understand complex datasets by using visualizations, tables, or maps and by presenting multiple widgets together on a single display [1]. The adoption of dashboards has gradually increased over time, but challenges in developing and using them remain, such as a lack of user involvement, the design and constraints of layouts, and visual literacy gaps [1]. This paper delves into the development and evaluation of a teacher-facing dashboard in the context of higher education, in which we address these recurring challenges of developing and using dashboards.
Owing to the intense workload and the time-consuming nature of examining teaching activities, monitoring and assessing student data is a demanding experience for many teachers, particularly when learning occurs in online or technology-mediated environments [2]. A part of the teaching analytics (TA) research effort has been devoted to the development of dashboards that provide teachers with constant and swift feedback about student progress [3,4]. Properly designed and useful dashboards can substantially support teachers in their daily teaching practices through evidence-based interventions and learning design (LD) revision [5]. Visualizations and visual analytics techniques are common in TA and encourage teacher inquiry and reflection [6].
However, the actual use of teacher-facing dashboards (TFDs) is limited for several reasons. Most TFDs may be useful for raising awareness, but they have not provided actionable insights for intervention or for analyzing the impacts on teaching and learning [7]. The impact of these dashboards on learning and teaching outcomes was not reported. Although some dashboards produced a positive impact, they were based on single studies or small samples and therefore lacked generalizability [7]. Kaliisa et al. [7] also revealed the dearth of standardized assessment tools and evaluation criteria relating to learning and teaching outcomes such as attitudes, motivation, and engagement in the context of TFDs. Moreover, one reported reason is that teachers use dashboards only occasionally or access them only briefly, which can be partially explained by the failure to engage teachers in the design and development process of the dashboards. Other reasons include the following: most dashboards were prescriptive [8], offered little customization [9], and lacked explicit theoretical grounding; dashboards were insufficiently useful for actual users [4]; there was a shortage of qualified data for visualizations [10], or dashboards were not integrated into learning management systems (LMS), which resulted in manual data extraction and import [4]; and the deployment of dashboards was predominantly at the prototype and pilot stages, with evaluation focusing on self-reports and user reactions. Additionally, Ndukwe and Daniel [6] carried out a systematic literature review exposing the need for more research in TA to explore how teachers can engage with data associated with teaching, encourage reflection, and improve teaching quality.
The shortage of evaluation or empirical validation of TFDs limits the refinement of proposed solutions [3,11] and their applicability [12]. It is quite common that TFDs do not offer information in an explainable and actionable way [13]. This design limitation becomes a barrier to teachers’ use. Therefore, dashboards need to be improved toward “educational data storytelling”, and upskilling teachers’ ability to understand the meaning of data visualizations should be considered [13]. The approach of educational data storytelling aggregates information visualization (InfoVis) and data storytelling (DS) principles, connecting them to the theory-grounded LD and learning context [13].
The literature has revealed that, although several TFDs have been developed, their promising potential has not been realized, thereby constraining their application in practice. The base of this study is grounded in the previous literature on TFDs and the actual demands of teachers from two Swedish universities. The proposed solution, TEADASH, addresses the bottlenecks identified in practice within higher education and in the above-mentioned literature. Specifically, the research gaps that TEADASH intends to fill include the following: connecting the TFD with the context embodied in theory-grounded LD; engaging teachers in the design and development process; conducting an empirical evaluation of the TFD in real-world courses together with the teachers in charge of those courses; increasing visualization quality through multiple analytical aspects; integrating descriptive, predictive, and prescriptive analytics in the TFD; generalizing the TFD based on the studied courses; enabling an automatic process of extracting, analyzing, and visualizing real-time data; integrating the TFD into the LMS; and providing evidence of an explainable and actionable TFD that leads to the interventions in this study.
For the development of TEADASH, we used design science research (DSR) in conjunction with the critical approach, a combination that has produced promising results in previous studies [14,15]. DSR is the rigorous application of theoretical knowledge and engineering design principles to address complex, practical challenges, while the critical approach serves as a design guideline for the design process. This combination helps to ensure the connection and meaning between the data, the context, and the teachers’ needs in TEADASH.
Accordingly, the research questions are as follows:
RQ1:
What are the requirements for a TFD that is useful in assisting teachers in making decisions on teaching and learning?
RQ2:
How can TEADASH be designed, developed, and evaluated to assist teachers in making decisions on teaching and learning?
This paper makes two significant contributions. Firstly, we contribute the steps and methods by which we executed the design and development process of TEADASH, in which we engaged teachers (users) together with the researcher (developer) to ensure that what teachers look for can be found in the dashboard. Secondly, we contribute lessons learned through the evaluation process before the deployment of the dashboard, during its use, and after the dashboard was used by the teachers in their real-time courses. The lessons reflect the notable points in the entire process of designing, developing, and evaluating a dashboard, which researchers in similar fields and dashboard developers should take into account.
The paper is structured as follows: First, we present the background and other studies relating to our study. In this section, we collect existing solutions and limitations of dashboards supporting teachers, thereby identifying what we need to do with TEADASH to close the prior gaps. We also review the existing literature on DSR as a method to develop dashboards for teachers and students. We delve into the critical approach that is used as theoretical scaffolding for designing the visualizations and the implications of their use. Second, based on existing related work and the critical approach, we present the method, including the study context, the iterations of the DSR that we performed, and the design of the evaluation process. Third, we present the design requirements (DRs) stemming from teachers’ needs and LDs, generalize the DRs into design principles (DPs), and demonstrate the respective features of TEADASH following the DPs. The evaluation results are also presented for the phases before deployment, during the running courses, and after them. Fourth, we discuss the lessons learned from the entire process of developing and evaluating the dashboard. Last, we conclude our study with implications, limitations, and suggestions for moving forward.

2. Related Research

2.1. Existing Solutions and Limitations

A few dashboards, models, or prototypes have been developed that encompass the topic of applying TA to support teachers in examining teaching practices and monitoring their students’ progress [3]. One of the issues raised by the prior literature is the misalignment of pedagogical design and visualizations. To address this issue, Echeverria et al. [13] developed a conceptual model of suggestions on what to consider when developing a DS dashboard contextualized to the learning context and user needs. The suggestions include potential elements that help teachers to explore visualizations with less effort, elements guiding teachers to gain insights from visualizations without technicians’ help, text labels and key data points that can catch attention, and graphical factors that may help users to understand the visualizations better. Corrin et al. [16] developed the Loop tool to connect pedagogical design to the dashboard to inform learning activities. However, the aspects of DS and the effectiveness of this tool for intervention were not reported in the Loop design process. CADA is a dashboard that was developed to address this issue by engaging teachers in the design process and responding to the problems of turning patterns into possibilities for action and the lack of integration with the LMS [4]. Nevertheless, real-time feedback and interaction between graphs for exploration were not described in CADA. Additionally, DaVID [17] is a model concentrating on improving the instructional design at the runtime of courses by, for example, highlighting colors, providing collective or individualized insights into the courses, or producing warnings and corrections during the courses. Another teacher-facing dashboard was developed by Dourado et al. [3] for process-oriented feedback in online learning through two iterations together with teachers and visualization experts. The authors thoroughly considered developing this dashboard to solve the problems identified by teachers and to meet their needs. Using data extracted from the Moodle LMS, this dashboard was implemented as a web-based application and evaluated by experts through questionnaires. Consequently, this dashboard has not been tested by the end users, and its impacts on teachers’ practices have not been recorded. It is also notable that, although teachers’ needs were included in the design process, the LD was not fully addressed in this paper. This may exacerbate the misalignment of the LD and visualizations, as mentioned in the previous literature [13,18]. We also found one dashboard for teachers that was developed and tested in practice within the context of primary schools [19]; however, whether it applies to higher education needs to be examined.
Based on the limitations and suggestions of the existing models and dashboards, TEADASH has been developed to close the above gaps and apply the suggested models. Namely, TEADASH encompasses the following factors: (1) it connects LD, context, and user needs to the visualizations; (2) it includes DS via interaction between graphs for users’ exploration, together with graphical factors, text labels, and key data points that catch attention, produce warnings, and facilitate users’ understanding, and it allows integration into the LMS; (3) it automatically extracts real-time data; (4) it engages teachers in the design and development process; (5) it uses theoretically grounded LDs; (6) it evaluates the TFD’s usefulness in real-world courses; (7) it improves the visualization design through multiple analytical aspects to enable actionable intervention; and (8) it generalizes the TFD based on five studied courses. Therefore, this dashboard provides evidence of an evaluated TFD that enables interventions and actionable insights for teachers.

2.2. Using Design Science Research in Developing Teacher-Facing Dashboards

It is not uncommon to use DSR or design-based research (DBR) to develop TFDs, owing to the systematic, step-by-step process of these methods. Schmitz et al. [20] used four iterations of DSR cycles to develop FoLA—a tool to co-create an LD of a learning activity—including technology enhancements and measurement indicators. Kaliisa and Dolonen [4] followed five iterative design stages of DBR to develop CADA—a Canvas analytics dashboard—together with teachers. The demo version of Observata was developed according to DBR with co-design with end users (project managers) to explore the effects of combining learning analytics and classroom observation [21]. The LA4LD tool was also developed based on DSR with the co-creation of students and teachers to provide them with on-demand feedback on learning activities during the run-time of a course [22]. Jayashanka et al. [15] used DSR to design and create the TELA dashboard to improve students’ motivation, engagement, and grades in online and blended learning environments. Nguyen et al. [14] presented a study that focused on developing and evaluating a set of design principles using the DSR methodology for learning analytics information systems in higher education.

2.3. Critical Theory in Information Visualization

To better comprehend the influence, manipulation, and empowerment that interactive visualizations can produce, a critical approach was developed in human–computer interaction. This theory seeks to investigate the motivations behind visualizations and the possible implications of their use [23]. The purpose of visualization does not focus heavily on “analytics insight,” but rather on obtaining a wide recognition of an issue, on awareness about an online community’s shared resources, or on reflection about oneself (i.e., LD).
Following Dörk et al. [23], the critical approach adopts DS through visualizations to harmonize between designers and viewers. Consequently, the range of interpretations of visualization depends on both the designer and the viewer, becoming, to some extent, subjective and interpretive. No “one” visualization can capture all aspects of a particular dataset from all angles. The authors outlined four principles for this critical approach, including disclosure, plurality, contingency, and empowerment.
Disclosure refers to the exchanges and reflection of designers and viewers in creating visualizations from an issue, data, representation, and interaction. This is embodied in co-design between the researcher and teachers in the implementation of this dashboard to align it with teachers’ specific goals. This helps to set clear boundaries between what needs to be communicated through the dashboard and what does not [24]. In this context, teachers were asked to play the role of a designer and a user at the same time.
Plurality alludes to exposing multiple facets and enabling various interpretations rather than circumscribed views and singular readings. Various kinds of data were selected to provide multiple insights into students’ learning and were visualized from detailed to general levels, which adapts to the different needs of teachers.
Contingency provides the flexibility and dynamics in visualizations that users freely choose to engage with for a given issue and relate to their practices. Accordingly, the interaction between graphs allows teachers to relate to the dilemmas that they want to explore, while highlighted colors or narratives emphasize the key aspects that drive teachers’ focus of attention [24].
Empowerment allows users to utilize their own stories to shift from awareness to action through interactivity. Empowering visualizations enable interactive dashboards, linkages across heterogeneous graphs, and supportive graphs to help teachers to see additional perspectives beyond the goal of a specific visualization. Consequently, captured information drives teachers toward proper interventions.
These principles allow us to examine the relationship between data and visualization, context and activities, and designers and viewers to lead to actionable insights. Additionally, DS refers to choosing an appropriate visual, since certain visualization techniques fit certain purposes (e.g., line charts effectively show changes over time) [24]. The other design principles mentioned in DS include decluttering, which elicits removing elements (i.e., unnecessary headers or features) to reduce the complexity of graphs; title, which drives the intent of the visualization; and captions explaining the important features of the data [24].
The factors in the critical theory and additional design principles of DS are utilized in TEADASH and analyzed in the upcoming sections.

3. Method

3.1. Research Context

To elicit the requirements and develop the artifact, the DSR methodology process model described by Peffers et al. [25] was followed. The DSR and the disclosure aspect of the critical theory accentuated the engagement and collaboration of developers and users throughout the process. Namely, we formed a multidisciplinary design team including researchers and teachers who systematically identified problems and demands, as well as iteratively designed, implemented, and tested a design solution from local to generalized contexts [22]. As mentioned in the introduction, the TA solution is a dashboard grounded on the theoretical foundation of LD and teachers’ needs. This study used three DSR cycles, as shown in Figure 1, and twelve semi-structured interviews and discussions were conducted throughout the three cycles (Table A1 in Appendix A).
Five courses from two Swedish universities were included in this study. The common features of these courses were as follows: the courses were given in a blended or online mode using the Canvas LMS, the participants were adult learners, and the courses were short and intensive. All of these courses were part of this study at different times but complemented each other and individually contributed to this research. Particularly, the first, second, and fourth courses engaged in the DSR, while the third, fourth, and fifth courses were involved in the deployment process. The timeline is exhibited in Figure 2. The first, second, third, and fifth courses have similar LDs and student groups. The LD of the fourth course is different from the other courses, but there is one common learning activity—asynchronous discussion forums. Two of the authors are the main teachers of the second, third, and fourth courses. Based on the needs of these courses, we first generated a set of DRs. These DRs were then abstracted and transformed into DPs [26]. The DRs and DPs were formulated as the standards for evaluating and assessing constructs to improve the reliability and validity of the findings and enable insightful comparisons and generalizability. This responds to the challenge posed in [7].
The theory-grounded LDs of the first, second, third, and fifth courses: The first, second, and fifth courses belonged to the engineering department, while the third course belonged to the social science field of a Swedish university. These courses used synchronous discussions as the primary learning activity. Employed adult learners who are experts in particular subjects were the key targets of these courses, which led to small-scale courses (roughly 20 students). The learning perspective that formed the basis for the courses was that of socio-cultural learning, as it emphasizes learning as a social process [27]. This is relevant for employed adult learners, as they are focused on acquiring skills and knowledge that relate to the tasks and challenges in their practice.
The theory-grounded LD of the fourth course: This course, in psychology and behavior, was offered by another Swedish university and contained the following key learning activities: quizzes, assignments, and asynchronous discussions. This course also had a small number of students most of the time (about 50 students), although it sometimes admitted more students than usual (about 100 students). The course was designed based on fundamental principles of learning from cognitive psychology to promote student learning and followed three main ideas. Firstly, it applies cognitive load theory, emphasizing stepwise instructions and explicit scaffolding [28]. Recognizing that overwhelming information can hinder learning, the course ensures a structured progression, enabling the students to grasp complex concepts incrementally. Secondly, the course strategically incorporates quizzes throughout. Research on retrieval practice indicates that regular assessments enhance long-term retention, fostering a deeper understanding of the material [29,30]. These quizzes serve as checkpoints, allowing students to reinforce their knowledge and providing educators with valuable feedback on individual progress [31,32]. Lastly, the course employs multimedia elements, particularly videos, guided by research-based guidelines for effective learning [33,34]. By leveraging the principles of multimedia learning, such as coherence and contiguity, the videos aim to enhance comprehension and retention [35].

3.2. Design and Development Process of TEADASH

In this section, we present three iterations of the DSR in the design and development process of TEADASH.
First cycle: The LD of the first course had been created by the teacher before the study, so the log data of this course were not informative enough to draw insights using TA due to the misalignment between TA and the LD. We started with this course to gradually integrate TA into the LD. As part of the requirement elicitation, interviews were conducted with the teachers to understand the LD; the challenges that the teachers faced and their expectations for the LD; students’ circumstances, as well as their ways of learning, goals, and interests; and to exchange ideas about this study (Interviews 1 and 2—Appendix A.1). Because TA was a new concept to them, the teachers expressed uncertainty about how TA works. Despite the limitations of the log data, we demonstrated feasible preliminary visualizations of the students’ learning based on the described LD to help the teacher become familiar with TA and engage in the study (Discussion 3—see Appendix A.3). This piqued the teacher’s interest and curiosity about TA. The first prototype of TEADASH was subsequently built using C# to extract log data and Tableau to create visualizations (see more details in [36]).
Similar procedures were performed in the second course. However, preliminary visualizations could not be made for the second course because it had fewer learning activities than the first course; instead, a visualization demo of the first course was shown to the teachers of the second course to raise awareness of the TA concept (Interview 5—see Appendix A.1 and Appendix A.3).
At the end of the first cycle, based on the findings revealed by the first prototype, supplementing the current LD of the second course with asynchronous forums was suggested (Discussion 5). This change helped to address an existing pitfall in the LD, namely the students’ low engagement in the course, while improving alignment with students’ learning preferences and teachers’ expectations and opening more space for TA [37].
Second cycle: To add asynchronous discussion forums to the current LD, the teachers in the second course substantially revised the structure of their course page and their teaching strategies to motivate the students to participate in the forums. This intervention was implemented during Spring 2023. After this intervention, the students’ engagement increased in both the asynchronous and synchronous discussions, leading to wider and deeper discussions on specific topics. The second prototype was developed as a web application with the same static datasets used in the first prototype (see Figure A1, Figure A2 and Figure A3). This prototype tended to show an overview of a course rather than individual students.
The fourth course was included in this study in the middle of the second cycle. The teacher of this course had previous experience with TA, so we decided to make the second prototype available to the teacher. A process similar to that used in the other two courses was followed with the fourth course to capture the LD and the context (Interview 8—see Appendix A.1).
The second prototype was upgraded with the following features: using real-time data, adding the network graph, adding the prediction feature, supplementing the network graph with visualizations of the sequential activities (assignment submissions and participation in quizzes), and identifying “at-risk” students. This upgraded version was designed to provide more information on individual students while still keeping a course overview, following the teachers’ feedback. This prototype was coded in Python using the following libraries: Dash and Flask for creating the web application, Plotly for creating interactive graphs, Pandas and NumPy for data preprocessing, scikit-learn for the prediction algorithm, OAuth2Session, requests, and json for calling APIs, and dash_auth for building the login function. The data fields extracted from Canvas and used for the visualizations of the dashboard were selected following Table A2 (Appendix A). The evaluation in this second cycle occurred similarly to that in the previous cycle. Accordingly, adjustments in the LD and planned interventions were proposed for the upcoming courses based on the findings from the dashboard (Interviews 6, 7, 9, and 10—see Appendix A.2 and Appendix A.3). Additionally, the necessary improvements for the dashboard were proposed, including adding the individual access visualization together with the prediction table to provide a more comprehensive picture of an individual student, adding a button to retrieve new data when needed, and adding the due dates of assignments to the access graph.
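To make this stack concrete, the listing below is a minimal sketch, not the authors’ production code, of how such a Dash and Plotly application can pull course activity from the Canvas REST API with requests and pandas and plot it. The base URL, access token, course ID, and the analytics endpoint path are placeholders assumed for illustration only.

import pandas as pd
import plotly.express as px
import requests
from dash import Dash, dcc, html

CANVAS_BASE = "https://canvas.example.edu"   # placeholder institution URL
API_TOKEN = "YOUR_CANVAS_API_TOKEN"          # placeholder access token
COURSE_ID = 12345                            # placeholder course ID
HEADERS = {"Authorization": f"Bearer {API_TOKEN}"}


def fetch_course_activity(course_id: int) -> pd.DataFrame:
    """Fetch daily page views/participations for one course from a Canvas analytics endpoint."""
    url = f"{CANVAS_BASE}/api/v1/courses/{course_id}/analytics/activity"
    response = requests.get(url, headers=HEADERS, timeout=30)
    response.raise_for_status()
    activity = pd.DataFrame(response.json())  # expected columns: date, views, participations
    activity["date"] = pd.to_datetime(activity["date"])
    return activity


activity = fetch_course_activity(COURSE_ID)

app = Dash(__name__)
app.layout = html.Div([
    html.H3("TEADASH sketch: course page access over time"),
    dcc.Graph(figure=px.line(activity, x="date", y="views",
                             title="Daily course page views")),
])

if __name__ == "__main__":
    app.run(debug=True)

In the actual dashboard, equivalent calls would cover the data fields listed in Table A2 and feed the interactive graphs described in Section 4.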
Third cycle: The proposed improvements for the dashboard were implemented and shown to the teachers. The evaluations of this TFD before the deployment stage were conducted with the teachers of the second and fourth courses, where we went through the functions of the dashboard, reflected on how every visualization could help the teachers to make decisions to enhance learning and teaching, made minor adjustments to finalize the release version, and noted new ideas for further development (Discussions 11 and 12). The evaluations before the deployment were performed through formal and informal interviews with the teachers based on the following key questions: (1) “When you know this information from this visualization, what can you do to improve the LD or support students?” and (2) “Does the dashboard have potential for you as a teacher to enhance the LD and support students?” These questions were guided by the empowerment aspect of the theory in order to examine how the dashboard led the teachers to proper interventions. Before the dashboard was deployed, we demonstrated its features and instructed the teachers on how to use it. The teachers also had a chance to try the dashboard.
According to the deployment plan, the management boards, the e-learning teams, and the IT departments of two universities were engaged in this stage to support the deployment process in terms of techniques, policies, and maintenance for long-term use.
Apart from these three cycles, many loops of design–implementation–evaluation were carried out between the cycles. In addition, there were informal catch-up meetings with the teachers to continuously co-design and evaluate the dashboard. As a result, we were able to swiftly improve the TFD to meet the teachers’ demands and align with the LDs.

3.3. Real-Time Evaluation of TEADASH

This dashboard is viewed as an IT artifact developed through DSR, which necessitates its evaluation in terms of functionality, completeness, consistency, accuracy, performance, reliability, usability, and fit with the LD, teachers’ needs, and the log data that Canvas supports [38], not only during the design and development process but also in real-world settings. TEADASH came into use and was deployed in the third, fourth, and fifth courses offered in Spring 2024. The third and fifth courses were engaged in this real-time evaluation for the following three reasons: (1) the teachers in the second and third courses were the same, and the teacher in the fifth course was willing to join this study and redesign the course to be able to keep track of the students’ progress using data; (2) the LD and the student group were similar to those in the second course; and (3) the courses started at the same time that the dashboard was deployed. The evaluation of TEADASH was divided into the following three parts: before the deployment (the evaluations conducted after every cycle of the DSR described in Section 3.2), during, and after the real-time courses (see Section 4.4). The interviews during and after the running courses were recorded and noted, and oral permission was obtained from the teachers. In total, 12 formal interviews were conducted for the evaluations during and after the real-time courses, including 3 interviews with the teachers in the third course, 5 interviews with the teacher in the fourth course, and 4 interviews with the teacher in the fifth course. The average time for each interview was about one hour. Additionally, there were follow-up informal meetings and emails between the researcher and the teachers during the running courses. A thematic analysis was applied to analyze these interview data.
We elected to undertake qualitative evaluations of this dashboard through interviews with the teachers from the studied courses rather than quantitative evaluations with students for three reasons. Firstly, the goal of this study is to create a dashboard to support teachers in enhancing their teaching practices; therefore, student voices and student learning are not included. Secondly, to determine whether the dashboard works in practice, the teachers who co-designed the tool were also the ones who evaluated it. Thirdly, the dashboard was deployed in real-time courses to examine how it supports teachers in improving LD and assisting students. Thus, obtaining enough teachers and courses for a meaningful quantitative tool evaluation takes a significant amount of time, which is not feasible within the duration of this study.
During the courses, the teachers were asked to use the tool to follow their students’ behaviors, make timely interventions to support the students falling behind, and adjust the LDs. The “during” evaluations were performed before every deadline (for the fourth course) or before the Zoom lectures (for the third and fifth courses) through interviews or emails to catch up on changes in the teachers’ thoughts and make essential interventions. Following the critical theory, this activity enhances the collaboration of the researcher and the teachers, provides the teachers with the necessary support for analytical thinking, and explores how the dashboard satisfies the teachers’ needs in practice. The interview questions aimed to explore the usability, usefulness, informativeness, time-saving aspects, and overall impacts of the dashboard on enabling the teachers to act, and were guided by the plurality, contingency, and empowerment aspects of the critical theory. The key questions were “What can we do for different types of students?”, “Is the dashboard useful to you and in which ways?”, “Does the dashboard provide useful and sufficient information that you are looking for?”, “Do you want to improve the dashboard and in which ways?”, “What do you think about the user-friendliness and ease of use of the dashboard?”, and “Does the dashboard save you time in your teaching practices?”. Over time, the researcher also recognized changes in the teachers’ use and preferences, so some questions were added, such as “Is there any difficulty when using the dashboard?” and “How can we facilitate and motivate your use of this tool?”
After the courses, the final evaluations were executed to reflect on the teachers’ use of the tool throughout each of the courses. We evaluated the usefulness of the TFD based on the DRs created by the teachers in the design process. Following this were the reflection questions guided by the theory and DS regarding usefulness, informativeness, time saving, ease of use, ease of understanding, user interfaces, authorization, and tool performance. The questions aimed to explore the overarching experience of the teachers after using the dashboard in real time.

3.4. Ethical Consideration in TEADASH

This study was approved by the management boards of the two universities. Ethical applications were also submitted to the local ethical committees of both universities; because no personal information is involved, the study does not require formal ethical approval. The local devices used for data storage were encrypted to ensure that no data would be revealed in case of theft. In the dashboard, student names are displayed only to their in-charge teachers, which makes it feasible for the teachers to offer respective support to individual students. Since this dashboard will be integrated into Canvas as a redirecting tool, teachers can only see visualizations from their own courses’ log data with the provided authorization. This prevents teachers from seeing courses they are not in charge of and ensures that no one other than the teachers with the provided authorization can access the tool. For data privacy, student names are hidden in the figures presented in this paper. Furthermore, the code of this dashboard and the extracted Canvas data were pushed to servers protected by the involved universities.
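As an illustration of this access restriction, the following is a minimal sketch, under assumed credentials and an assumed teacher-to-course mapping, of how the dash_auth login mentioned in Section 3.2 can gate the dashboard and limit a logged-in teacher to their own courses; the actual deployment relies on the universities’ Canvas integration rather than hard-coded accounts like these.

import dash_auth
from dash import Dash, html
from flask import request

# Hypothetical teacher accounts and the course IDs each teacher is allowed to see.
VALID_CREDENTIALS = {"teacher_c3": "secret3", "teacher_c4": "secret4"}
COURSES_PER_TEACHER = {"teacher_c3": [301], "teacher_c4": [402]}

app = Dash(__name__)
dash_auth.BasicAuth(app, VALID_CREDENTIALS)  # HTTP Basic login in front of the dashboard


def allowed_courses():
    """Return only the course IDs of the currently logged-in teacher (used inside callbacks)."""
    user = request.authorization.username if request.authorization else None
    return COURSES_PER_TEACHER.get(user, [])


app.layout = html.Div([
    html.H3("TEADASH"),
    html.P("Visualizations are shown only for the logged-in teacher's own courses."),
])

if __name__ == "__main__":
    app.run(debug=True)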

4. Results

4.1. From LD and Teachers’ Needs to Design Requirements

First and second courses: These courses had a limited number of scheduled lectures, which were followed by synchronous discussions. However, the teachers were not able to listen to all of the students and keep track of what they discussed, as the students were in different groups. This further emphasized the importance of asynchronous forums where the students could voice their opinions while allowing the teachers to follow what the students discuss. Thus, the dashboard should show the situation in the forums to help teachers to quickly capture what is going on there (DR1). Teachers also wanted to examine if and when the students viewed the uploaded learning materials and watched the videos (DR2). Based on the available data, how frequently students accessed the Canvas course pages was displayed. From this information, teachers wanted to see the access information together with the deadlines of assignments (DR3). Additionally, there were several mandatory assignments, creating the need to see the submission status of every assignment (DR4).
Fourth course: This course encompasses the following learning activities: quizzes and cases, in the aggregated form of submissions and discussion forums. The teacher expected prediction to identify the students who needed support (DR5). Examining how often students access the course page also interested the teacher (DR6).
The teacher wanted to investigate if students who were more active during the first part of the class had a higher likelihood of passing the course. To achieve this, data regarding the frequency of student participation, the number of attempts in quizzes, and the timing of accessing the course page were analyzed to identify potential correlations with overall course success (DR7).
To investigate if drop-out rates were higher for specific assignments, student engagement and completion rates for different tasks were analyzed to pinpoint any patterns or challenges associated with particular course components, guiding potential improvements in the content or delivery (DR8).
Lastly, the teacher sought to explore whether the students who interacted with their peers in discussion forums were more likely to pass subsequent assignments. Tracking the correlation between engagement in discussion forums and future academic performance would provide insights into the role of collaborative learning and social interaction in supporting student success. This aspect aimed to inform strategies for fostering a more interactive and collaborative online learning environment (DR9).

4.2. Design Principles for the Development of Teaching Analytics Dashboard

We grouped similar DRs and generalized them into DPs. The elements of passing the courses and the number of attempts cannot currently be considered due to unavailable data but will be considered for further development. Instead, course completion is addressed and scrutinized based on the completion of the individual learning activities. As a result, the DPs are as follows:
DP1: This dashboard should help to examine interactions between students in forums (DRs 1, 8, and 9).
DP2: This dashboard should be designed with information on student access and the assignment deadlines during the courses (DRs 3, 6, and 7) to observe student learning behaviors compared to the deadlines.
DP3: The submission situation should be displayed in the dashboard overall and individually (DRs 4, 7, and 8).
DP4: This dashboard should help teachers to identify the students who need support to catch up with the courses (DRs 5, 7, 8, and 9).
DP5: This dashboard should show which learning materials students have viewed (DR2).
Table 1 presents a link between learning theories, LD, teachers’ needs, and the DPs of TEADASH.
Figure 3 supports DP1 and presents the interactions of students in the forums in the form of a social network. Teachers can select the nodes to see the detailed submissions of these students. This design supplements the information on these students’ submissions together with their interactions in the forums, thereby empowering teachers’ understanding of student behaviors. This design shows the activity sequence, which is recommended in [39]. The table in Figure 3 shows specific information about the network graph in case teachers want to know how many posts and replies every student has. This table is also updated following the selected nodes. For the students who have no replies, the sections are colored to catch the teachers’ attention immediately. A rudimentary recommendation is written in the dashboard following the teachers’ feedback during the co-design process.
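The listing below is a minimal sketch, with hypothetical student names and reply data rather than the dashboard’s real Canvas fields, of how such a forum-interaction network can be assembled with networkx and rendered with Plotly, coloring students without any posts or replies so that they stand out.

import networkx as nx
import pandas as pd
import plotly.graph_objects as go

# Hypothetical reply data: each row means `author` replied to `recipient` in a forum.
replies = pd.DataFrame({
    "author":    ["Anna", "Bo", "Cara", "Anna"],
    "recipient": ["Bo", "Anna", "Anna", "Dan"],
})
students = ["Anna", "Bo", "Cara", "Dan", "Elin"]   # Elin has no posts or replies

G = nx.DiGraph()
G.add_nodes_from(students)
G.add_edges_from(replies.itertuples(index=False, name=None))
pos = nx.spring_layout(G, seed=42)                  # reproducible 2-D layout

edge_x, edge_y = [], []
for u, v in G.edges():
    edge_x += [pos[u][0], pos[v][0], None]
    edge_y += [pos[u][1], pos[v][1], None]

node_x = [pos[s][0] for s in students]
node_y = [pos[s][1] for s in students]
colors = ["crimson" if G.degree(s) == 0 else "steelblue" for s in students]  # isolated students highlighted

fig = go.Figure([
    go.Scatter(x=edge_x, y=edge_y, mode="lines", line=dict(width=1), hoverinfo="none"),
    go.Scatter(x=node_x, y=node_y, mode="markers+text", text=students,
               textposition="top center", marker=dict(size=18, color=colors)),
])
fig.update_layout(title="Forum interactions", showlegend=False)
fig.show()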
DP2 is embodied in Figure 4, showing the number of students accessing the course page and the total views along the timeline. The due dates of the assignments are represented as vertical dashed lines.
Figure 5 shows the submission situation of the whole course, as well as individual students, which addresses DP3.
To address DP4, two functions were designed in this dashboard. One function allows teachers to export the list of students who submitted one assignment but did not submit another; the dashboard allows users to download this list as a CSV file. The other function, prediction, is displayed in Figure 6 and allows teachers to keep track of the at-risk students for the final submission. When teachers click on one row, the login situation of the selected student is displayed in the graph below, providing teachers with extra information to consider a proper intervention for the student. This design is another activity sequence in this dashboard.
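As a rough illustration of these two DP4 functions, the sketch below, using assumed column names and toy data rather than the dashboard’s real Canvas fields, filters the students who submitted one assignment but not another and exports them to a CSV file, and fits a simple scikit-learn model to flag students at risk of missing the final submission.

import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical per-student features derived from course log data.
data = pd.DataFrame({
    "student":         ["Anna", "Bo", "Cara", "Dan"],
    "submitted_a1":    [1, 1, 0, 1],
    "submitted_a2":    [1, 0, 0, 1],
    "page_views":      [120, 35, 10, 80],
    "final_submitted": [1, 0, 0, 1],   # known outcome (e.g., from a past course) used for training
})

# Function 1: students who submitted assignment 1 but not assignment 2, exported for follow-up.
missing_a2 = data[(data["submitted_a1"] == 1) & (data["submitted_a2"] == 0)]
missing_a2[["student"]].to_csv("submitted_a1_not_a2.csv", index=False)

# Function 2: fit a model on known outcomes, then score students to flag those at risk
# of missing the final submission (higher value = higher risk).
features = ["submitted_a1", "submitted_a2", "page_views"]
model = LogisticRegression().fit(data[features], data["final_submitted"])
data["risk_of_no_final"] = 1 - model.predict_proba(data[features])[:, 1]
print(data[["student", "risk_of_no_final"]].sort_values("risk_of_no_final", ascending=False))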
DP5 is implemented in the parallel plot, which is similar to Figure 5. The visualizations can be downloaded as image files to support teachers in further analysis.

4.3. Selection for Design of Visualizations in TEADASH

Following the critical approach and DS, five concerns were considered when designing visualizations in the dashboard. Firstly, proper charts should be selected to present the datasets. For example, for the overview of the completion rate in each assignment, a stacked bar chart demonstrates this aspect well and saves space for the display in the interface. To view the details of each category, a parallel plot can clearly show both the details and the trends of the datasets. The teachers also want to have tables to systematically see the details of students. Secondly, plurality in the theory enables various analytical visualizations of the same dataset. For instance, with access data, an overview of student access is shown in the line chart. Simultaneously, teachers can also see the access data of each student in the scatter plot linked with the prediction table. Thirdly, the contingency aspect drives us to design interactive graphs to allow teachers to discover a specific category in the relevant charts. For example, in the network graph of student interactions, when teachers select a specific student or a group of students, the parallel plot will show the respective submissions of assignments that the selected students have carried out and the corresponding number of posts that they have made in the forums. With a double-click, teachers can reset the graphs to their initial status. Decluttering in DS is also embodied in the interactive graphs. Initially, the parallel plot shows all of the students. When teachers select a specific group of students in the bar charts, such as those who submitted on time or those who submitted late, the parallel plot will be updated accordingly and remove the other groups of students. Fourthly, the empowerment aspect and the teachers’ needs guide us to add additional graphs to complement the main graphs. For example, the table of prediction is followed by a graph of individual access. If the teachers see at-risk students in the prediction table, they will look further into the graph of individual access to better understand how often the at-risk students access the course page, thereby facilitating the teachers’ decision making. Fifthly, titles, captions, and instructions guided by DS are also added to the dashboard to help teachers to understand the graphs more easily and know how to use the graphs.
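The following is a minimal sketch, with illustrative data and hypothetical component IDs, of the kind of cross-filtering interaction described above: clicking a bar in the submission chart triggers a Dash callback that filters the parallel-coordinates plot to the selected group, while the full plot is shown when nothing has been selected yet.

import pandas as pd
import plotly.express as px
from dash import Dash, dcc, html, Input, Output

# Hypothetical per-student activity data.
students = pd.DataFrame({
    "student": ["Anna", "Bo", "Cara", "Dan"],
    "status":  ["on time", "late", "missing", "on time"],
    "posts":   [5, 2, 0, 7],
    "quizzes": [3, 1, 0, 3],
})

app = Dash(__name__)
app.layout = html.Div([
    dcc.Graph(id="submission-bar",
              figure=px.bar(students.groupby("status", as_index=False).size(),
                            x="status", y="size", title="Assignment 1 submissions")),
    dcc.Graph(id="parallel-plot"),
])


@app.callback(Output("parallel-plot", "figure"), Input("submission-bar", "clickData"))
def filter_parallel(click_data):
    # Decluttering: show only the clicked submission group; show everyone before any click.
    subset = students
    if click_data:
        status = click_data["points"][0]["x"]
        subset = students[students["status"] == status]
    return px.parallel_coordinates(subset, dimensions=["posts", "quizzes"],
                                   title="Activity of selected students")


if __name__ == "__main__":
    app.run(debug=True)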

4.4. Evaluation Results

The evaluations before the deployment of TEADASH: For the first question (“When you know this information from this visualization, what can you do to improve the LD or support students?”), the teachers were able to propose solutions corresponding to the information revealed by the dashboard by reflecting on the current LD and discussing what they thought of the students’ learning and how they could support them. The proposed changes and actions for the upcoming courses included more explicit course syllabi or instructions, more teacher engagement to facilitate the forums, and sending reminders to the students who were falling behind. For the second question (“Does the dashboard have potential for you as a teacher to enhance the LD and support students?”), the teachers agreed on the dashboard’s potential and benefits in enabling necessary actions and were eager to use the tool in their upcoming courses.
The evaluations during the courses: The teachers were informed about their students’ engagement in the learning activities and conducted two interventions during the third and fourth courses. Firstly, the teachers in both courses chose to send reminders as a simple and quick way to inform the students who fell behind. Secondly, after observing the low interaction of students in the first and second forums and recognizing that unclear or missing information had been provided to the students, the teachers in both courses decided to clarify the instructions of the upcoming forums. Namely, the teachers in the third course chose to send an additional reminder encouraging the students to comment on each other in Discussion 3 instead of updating the instructions of the discussion, while the teacher of the fourth course explicitly added the instruction “commenting on those whom you have not commented before” to Case 4 to motivate student interaction.
Additionally, the dashboard helped the teachers to stay informed of the students’ progress. Specifically, in the third course, the students procrastinated in posting to the forums, which did not facilitate peer comments well. The teachers in the third and fourth courses were aware of several isolated students in the forums and of the access pattern throughout the courses. These observations led the teachers to propose several changes for the upcoming courses, including a change in assignment deadlines, a change in assignment content, setting the start-up quiz as mandatory, following up with students at the early stages of the courses, and increasing teacher engagement in the forums. Meanwhile, the teacher in the fifth course felt positive when the students behaved as expected and actively engaged in the learning activities. Thus, no intervention was proposed for the fifth course.
The evaluations after the courses: The DRs were examined to evaluate if the dashboard satisfied them (see Table 2). The features of the dashboard provided the teachers with insights into their students’ learning throughout the courses. The involved teachers recognized the usefulness of the dashboard in supporting them in improving the LD and keeping them informed of their students’ learning in real time. Namely, the teachers in the third and fourth courses commented as follows: “Yes I like the visualizations”, “Each case and quiz, I think it does a good job with that. And visualizing, it’s very easy to digest”, “by using this, these ideas have come for sure. I wouldn’t have thought of those without using it.” The dashboard gave the teacher in the fifth course more control, especially in the online setting; the teacher stated the following: “I mean that’s the role of any information visualization. So, I get a quick check that okay, things are going. They are active. You know, I could, very quickly through these visualizations, see if something is wrong with the course. If someone is left isolated or if no one is accessing the site, then I would know that”.
Most of the features in the TFD were used and helped the teachers to generate necessary interventions to support the students or improve the LDs. However, there were features that the teachers only looked at but did not know how to act on, since the information related to individual students, which the teachers regarded as sensitive; this applied especially to the network graph showing the isolated students in the forums and the prediction table showing the at-risk students. Regarding this, the teachers in the third course said the following: “I think it’s a tricky question. I identified isolated students. […] And then I just sent out a general message to them ‘be more active’. Because I would feel a bit weird to single out *student name* saying ‘hey, be more active’” and “but again, I would feel I don’t know how I would approach them”, while the teacher in the fourth course thought the following: “I understand but I don’t know how to act with it”. The teacher in the fifth course had an opposite opinion, as follows: “this network graph visualization is very nice […] So, I think this is a tool that makes it quicker to get an overview of how things are going, in particular for very busy teachers, this is, I think most teachers are busy.” To solve the issue of what to do when perceiving the information from the network graph and the prediction table, a feature of automated feedback was proposed. While the teachers in the third course wanted the dashboard itself to send automated feedback to students and inform them with the summary of the action, stating the following: “if it would be automated, I think it would be more neutral and less workload on the teacher”, the teacher in the fourth course would flexibly act depending on different automated feedback, stating the following: “When it’s about more strategic decision […] about course design. So, if it’s about like, oh, maybe you should include more case assignments, you need to do something with your discussion assignments […]. Then I think I would like to get informed to make a decision and to look at the data” and “if it’s like reminders, then I think it could be automated”. Similarly, after using the tool in the real-time courses, the teachers in the third and fourth courses proposed the use of the dashboard as an overview of the courses instead of using it at an individual level, stating the following: “I think I look at that less with where I can see the individual student. But the aggregated data, those kinds of graphs are the ones that I use” and “it’s more about that we don’t need to trace the individual students so much as to trace just the general activity.” Despite this, these teachers still had some desire to know individuals’ progress if needed. The teacher in the fifth course contended the following: “for me as a teacher, I don’t think it’s sensitive information. For me to teach well, I need to know who is doing what” and “I would lose quality if you would anonymize this in some way. From my perspective, I think it’s fully okay also for teachers to see the names of the students.”
When asked about the aspect of time saving, the teachers had similar opinions that the dashboard helped them to quickly capture what was going on in their courses and that it made it possible to pedagogically improve the LDs, stating the following: “the dashboard makes it possible for me to pedagogically develop the course, change the course design.” However, the teacher in the fourth course spent more time sending out reminders, stating the following: “for sending out reminders, I think that would be a great like automated feature that could be included”. Meanwhile, the teachers in the third course needed more time to make changes on the fly, making it impossible to do real-time interventions, stating the following: “for the sake of changing the course design for the better, the dashboard saves time, but to do the design changes on the fly, it doesn’t really save me enough time to use it. It should be more automated based on AI”. As a result, the feature of sending out reminders or emails to students was suggested for the TFD. The teacher in the fifth course stated the following: “I’m not sure if it saves me time, but I think it’s very quick to use, so I don’t lose much time with it, and I think it gives me more control, more confidence, makes me calmer. Some of this information is the stuff I cannot see otherwise […] when you teach students physically, you will get a feeling for how things are going by meeting the students physically in the classroom. But when you do online courses, you lose that feel, and you get some of that back with the help of this kind of tool”.
Regarding the aspect of authorization, the teachers felt safe when the dashboard was integrated with Canvas and had login authorization, stating the following: “as soon as I’m in Canvas and I click on links there, I feel safe”.
Regarding tool performance, the dashboard sometimes lagged and was slow when retrieving new data or when the Internet connection was slow, with users stating the following: “sometimes it’s lagging”.
In regard to the user interface, the teachers expected more investment in the frontend of the dashboard, since the current version focused more on the backend side, stating the following: “in general it’s a little bit, it’s more back-end” and “I kind of needed some explanations to understand it the first time and if it was possible to go in and zoom in […] I mean, it’s a good illustration if you just want to see when the students have been active during the course”. Moreover, the teachers desired to have more options to choose which information to show in the dashboard, stating the following: “you want the latest data every time you go in […] And you need to just make sure that it’s clear in the interface” and “that could be the first function you can filter out both quiz and case”. There were differences in visualization preferences among the teachers. While the teacher in the fifth course preferred visualization charts, stating the following: “It’s (the network graph) much nicer than some tables or something like that,” the teachers in the third course were more interested in summaries and texts, stating the following: “click on the student and see kind of their overall activity or something […] who they replied on and how many replies” and “actually that network also has the meaning for each note, and you see also the color. We also can read that, but maybe it’s a little bit difficult. So, I just think like how to make it easier for a user to understand it […] you don’t really need the colors to see that”.
In the aspect of ease of use and ease of understanding, the teachers in the fourth and fifth courses had worked with data analytics, whereas the teachers in the third course were quite new to data analytics, which produced differences in their opinions. Consequently, the teachers in the fourth and fifth courses had no difficulty in using and understanding the graphs, stating the following: “I teach information visualization […] They look like the standard approach. They are appropriate for the data that you have to visualize […] They are good kinds of visualizations for this kind of data. So, I think it’s straightforward”. The teachers in the third course had opposite opinions regarding this, as one teacher said: “I know what the dots are and the peaks,” whereas the other said: “I have a hard time understanding what the dots are”. However, the parallel plot needs to be reconsidered, since the teachers in the third and fourth courses could not digest it quickly, stating the following: “it’s a bit messy, but it helps when you put the cursor over and you can see” and “this graph (parallel plot) it takes time for me to understand”. Meanwhile, the parallel plots were not a problem for the teacher in the fifth course. To solve this issue, the dashboard was demonstrated not only before the deployment, but also during the running courses for the teachers. However, the demonstration of the dashboard to teachers should happen more often to help them to remember how to use it. Simultaneously, more instructions can be added to the dashboard.
The frequency of the teachers’ use of the dashboard also helps the teachers to remember how to use and interpret the graphs. Nevertheless, during the courses, it was observed that the teachers’ use of the dashboard was not regular. When asked about this dilemma, the teachers said that their schedules were busy, making them forget to use the dashboard, as follows: “I usually don’t find the time to evaluate and make changes to the design of a course during the course unless students react strongly somehow” and “I think if I can see that it would be useful and maybe we should have used it more. It’s just, it’s always busy”. Another reason for this issue was as follows: “if I get this data, what can I do with it”. Moreover, using this dashboard in real-time courses is a new experience for the involved teachers; therefore, it requires some time to become familiar with this new tool and process, with them stating the following: “it’s always like a process of starting to use a new platform. And to be able to see what actions I can take based on the data that I get. But now that we have started to send out reminders and so on. I think that’s something that will make me use the platform more when I can see like, oh yes, I can use it in this way.” Other factors motivating the teachers’ use were suggested, including automatic notifications from TEADASH to remind teachers to use it, as follows: “maybe get reminders that so far no one has looked at the course. And maybe an email that also has a link to the Canvas page. So, I can just get that email and click on it, and boom, I’m on the page,” suggestions for actions, and more attractive user interfaces. These evaluation aspects are summarized in Table 3.

5. Discussion

Guided by the critical approach and DSR, this study emphasizes engagement and collaboration between the researcher and the teachers in designing and implementing the dashboard, in order to resolve the inconsistency between designers and users reported in the prior literature. This collaboration ensures meaningful insights and connections between the teachers’ understanding of the dashboard and their LDs, thereby enabling the necessary interventions and changes. Frequent communication helps to avoid misunderstandings between researchers and teachers throughout the design process. When teachers act as designers in the process, engagement and adoption increase and good ideas for new designs emerge: the teachers are the key actors in designing their courses and interact directly with students, so they know best how problems should be identified and solved.
The studied courses are generally small-scale, leading to small datasets. We therefore analyzed the same data from multiple perspectives, providing teachers with numerous deep insights. This is also a way to work with small datasets while still gaining insightful perspectives [40].
Schmitz et al. [41] summarized typical lessons learned: (1) the LD should include elements that can be measured, (2) measurements of the effectiveness and efficiency of learning should be considered when connecting learning activities and achievement indicators, and (3) learning materials should be stored in the LMS in a way that can be measured. These points refer to the preparation of qualified data, which contributes to the meaningfulness of the visualizations. Teachers may need to keep the possibilities of meaningful data collection in mind from the very beginning of designing learning activities [42].
The design and usability of TFDs play a primary role in their adoption by users. Current reviews on TFDs [4,9,13] found poor evidence of grounding in learning sciences and theories, unfriendly user interfaces, a lack of usability tests, and an absence of justification for design choices. TEADASH was therefore developed based on theory-grounded LDs and contains dropdown lists so that teachers can select the options they wish. The related graphs are linked to each other, which allows teachers to easily explore every case in detail. For example, in Figure 5, when the teachers click on a part of the bars in the bar chart, the parallel plot next to it is updated accordingly. This helps teachers explore information beyond what is inherent in the bar chart, following the contingency aspect of the theory. Moreover, notes are clearly indicated in the graphs to help teachers capture the information quickly. When moving the cursor over the graphs, detailed hover information is displayed. These features were designed following the DSR and the critical theory to facilitate the teachers’ use of the dashboard.
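The cross-filtering behavior described above (clicking a bar updates the adjacent parallel plot) can be realized with a callback mechanism in a web dashboard framework. The paper does not specify the implementation stack, so the following is only a minimal sketch assuming a Plotly Dash application; the component ids, column names, and example data are illustrative assumptions rather than the actual TEADASH code.

```python
# Minimal sketch of linked charts: clicking a bar in the submission bar chart
# filters the parallel plot next to it. Illustrative data and ids only.
import pandas as pd
import plotly.express as px
from dash import Dash, dcc, html, Input, Output

# Illustrative records following the fields in Table A2 (not real course data).
submissions = pd.DataFrame({
    "assignment_title": ["Case 1", "Case 1", "Case 2", "Case 2"],
    "student_name":     ["Ann", "Ben", "Ann", "Ben"],
    "status":           ["on-time", "late", "missing", "on-time"],
})

app = Dash(__name__)
app.layout = html.Div([
    dcc.Graph(id="submission-bar",
              figure=px.histogram(submissions, x="assignment_title", color="status")),
    dcc.Graph(id="parallel-plot"),
])

@app.callback(Output("parallel-plot", "figure"),
              Input("submission-bar", "clickData"))
def update_parallel_plot(click_data):
    """Redraw the parallel plot for the assignment bar that was clicked."""
    subset = submissions
    if click_data:  # a bar segment was clicked: keep only that assignment
        assignment = click_data["points"][0]["x"]
        subset = submissions[submissions["assignment_title"] == assignment]
    return px.parallel_categories(subset, dimensions=["student_name", "status"])

if __name__ == "__main__":
    app.run(debug=True)
```

In this pattern, each linked widget only needs one additional callback that reads the clickData of the driving chart, which keeps the interaction logic easy to extend.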
Directing TFDs toward an explanatory purpose, prioritizing the provision of information to users, is noteworthy [13]. Accordingly, instructions, explanations, and recommendations were added to the graphs to guide users on how to interact with them and to make the dashboard explanatory, instead of leaving teachers to interpret the graphs by themselves. In the prediction section, instructions, information on prediction accuracy, and an elaboration of the table information are displayed. Despite this, the teachers still expected more instruction to be added to the dashboard. However, a balance between text and graphs should be maintained, so separate user guidance or documentation may be considered.
To facilitate ease of use for teachers, especially non-technical users, the prior literature suggests integrating TFDs into the LMS, automating the entire process from data extraction to visualization [4], and providing real-time analysis [43]. Requiring manual adjustments by users can cause technical issues and hinder TFDs. Real-time feedback enables teachers to make quick decisions and access critical information [43]. We implemented these features in TEADASH to fill these gaps. The data are extracted in real time; however, extracting data too frequently can slow down the tool and degrade the user experience. Thus, a “retrieve new data” button was designed for when teachers need to access updated data (e.g., after the due date of an assignment). Nevertheless, the teachers also proposed that TEADASH should retrieve new data automatically when they open the dashboard. Additionally, from the Canvas course page, teachers can easily access this tool as a web server. The teachers expected the dashboard to be integrated into Canvas as a plugin so that the visualizations are shown directly in Canvas. These suggestions will be considered in the next version of TEADASH.
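A simple way to realize the trade-off described above, serving cached data by default and querying the LMS only on demand, is to wrap the extraction step in a small cache tied to the refresh button. The sketch below is an illustrative assumption of how such a wrapper could look, not the authors’ implementation; fetch_course_data and the button wiring are placeholders.

```python
# Minimal sketch of "retrieve new data" behaviour: serve a cached dataset by default
# and call the expensive LMS extraction only when the teacher asks for a refresh.
from dataclasses import dataclass
from datetime import datetime
from typing import Callable, Optional
import pandas as pd

@dataclass
class OnDemandCache:
    fetch: Callable[[], pd.DataFrame]      # the expensive API extraction step
    data: Optional[pd.DataFrame] = None    # last retrieved dataset
    fetched_at: Optional[datetime] = None  # when it was retrieved

    def get(self, force_refresh: bool = False) -> pd.DataFrame:
        """Return cached data unless the teacher explicitly asks for new data."""
        if force_refresh or self.data is None:
            self.data = self.fetch()
            self.fetched_at = datetime.now()
        return self.data

# Usage (placeholders): wire the cache to a "Retrieve new data" button callback.
# cache = OnDemandCache(fetch=fetch_course_data)
# df = cache.get(force_refresh=button_was_clicked)
```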
To be useful in practice, TFDs, like other IT artifacts, require institutional policy backing and long-term maintenance plans.

5.1. TEADASH for Teachers’ Decision Making

TEADASH provides evidence of enabling teachers to derive actionable insights. The access graph helped the teachers change assignment deadlines, which increased students’ regular access to course pages and responded well to the pedagogical design. The feature for filtering low-performance students enabled reminders to be sent to encourage those students. The bar charts showing the percentages of students who turned in an assignment on time, turned it in late, or missed it helped the teachers to efficiently identify the students participating in each assignment and determine the attrition rates. Consequently, the information from the bar charts drove the teachers to revise the last two cases to increase students’ submissions in upcoming courses. This real-time data-driven approach facilitates tailored interventions for individual students and course adjustments as needed [44], ensuring a more informed and responsive teaching strategy.
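As a concrete illustration of the two analyses mentioned above, the short pandas sketch below computes per-assignment percentages of on-time, late, and missing submissions (the bar-chart input) and filters students with a high share of missed submissions (candidates for reminders). The column names follow the data fields in Table A2, while the example records and the 50% threshold are illustrative assumptions rather than the authors’ code.

```python
# Sketch: submission-status percentages per assignment, and a filter for students
# who missed many assignments. Illustrative data and threshold only.
import pandas as pd

submissions = pd.DataFrame({
    "student_name":     ["Ann", "Ben", "Ann", "Ben", "Eva", "Eva"],
    "assignment_title": ["Case 1", "Case 1", "Case 2", "Case 2", "Case 1", "Case 2"],
    "status":           ["on-time", "late", "missing", "on-time", "on-time", "missing"],
})

# Percentage of on-time / late / missing submissions per assignment (bar chart input).
status_pct = (
    submissions.groupby(["assignment_title", "status"]).size()
    .groupby(level=0).transform(lambda s: 100 * s / s.sum())
    .rename("percent").reset_index()
)

# Students who missed at least half of the assignments (candidates for a reminder).
missed = submissions.assign(missed=submissions["status"].eq("missing"))
missed_rate = missed.groupby("student_name")["missed"].mean()
low_performance = missed_rate[missed_rate >= 0.5].index.tolist()

print(status_pct)
print("Students to remind:", low_performance)
```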
The social network graph and the prediction table are more abstract than the other graphs, which makes it harder to derive interventions from them, although these visualizations still provide insightful information to the teachers. Training or workshops can help to enhance teachers’ data literacy so that they better understand abstract visualizations. Additionally, elaboration text can help teachers interpret abstract visualizations.
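For readers interested in how such a discussion network can be derived from the forum fields in Table A2, the sketch below builds a weighted reply graph with networkx and prints a plain-text summary per student, closer to the textual overview the third-course teachers asked for. The example reply pairs and the summary wording are illustrative assumptions, not the authors’ implementation.

```python
# Sketch: build a weighted reply network from discussion entries and replies,
# then print a per-student textual summary. Illustrative data only.
import networkx as nx

# (entry author, reply author) pairs extracted from the discussion forum data.
replies = [
    ("Ann", "Ben"), ("Ann", "Eva"), ("Ben", "Ann"), ("Eva", "Ann"), ("Eva", "Ben"),
]

graph = nx.DiGraph()
for entry_author, reply_author in replies:
    # Edge direction: the reply author points to the student being replied to.
    if graph.has_edge(reply_author, entry_author):
        graph[reply_author][entry_author]["weight"] += 1
    else:
        graph.add_edge(reply_author, entry_author, weight=1)

# A plain-text summary per student ("who they replied on and how many replies"),
# as an alternative to the abstract network view.
for student in graph.nodes:
    replied_to = [f"{v} ({d['weight']}x)" for _, v, d in graph.out_edges(student, data=True)]
    received = graph.in_degree(student, weight="weight")
    print(f"{student}: replied to {', '.join(replied_to) or 'no one'}; received {received} replies")
```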
The number of students varies every time the second and third courses are offered. Even though there are not many students, it is still helpful for teachers to be able to quickly monitor and record each student’s circumstances, as well as the situation of the entire course, using this TFD.
In this study, the researcher played an analyst role during the running courses, asking questions about the information captured from the dashboard to stimulate the teachers’ critical thinking. Thanks to this activity, the teachers thoroughly reflected on the information and suggested necessary changes to the LDs. It was observed that the teachers themselves noticed the information that the researcher asked about, but a small push was needed to encourage their reflection. Therefore, to help teachers use this dashboard independently in the future, management can consider providing teachers with a supportive team that stimulates their critical thinking, as the researcher did in this study, or a professional development program to enhance the teachers’ data literacy so that they can pose analytical questions themselves.

5.2. Technical Aspects

To facilitate ease of use for teachers, this TFD automates the analytical process. Nevertheless, when implementing this feature, inconsistencies between the data and the APIs were found, which made it challenging to identify and combine the correct APIs to ensure data integrity and to obtain useful data that guarantees meaningful visualizations for teachers. Accordingly, a framework to guide the process of designing courses, creating components in Canvas, extracting data, analyzing, and visualizing is highly necessary. Such a framework supports the technical, non-technical, and pedagogical aspects of this process. Firstly, the framework increases the alignment of LD and TA and offers checklists for teachers to consult when designing a specific online learning activity. Secondly, it helps ensure that the generated data are located in the right APIs, which eases programming and coding. Thirdly, it makes the entire process consistent for the relevant stakeholders to follow and avoids misunderstandings.
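To make the API-alignment challenge more tangible, the sketch below combines two Canvas REST endpoints (assignment submissions and per-student analytics summaries) into one dataset keyed on the student identifier, similar in spirit to the fields in Table A2. The endpoints shown exist in the Canvas API, but the base URL, token handling, selected fields, and the absence of pagination handling are simplifying assumptions rather than the authors’ exact pipeline.

```python
# Sketch: pull submission status and page-view summaries from two Canvas endpoints
# and merge them on the shared student id. Placeholders and simplifications only.
import requests
import pandas as pd

BASE = "https://canvas.example.edu/api/v1"        # placeholder Canvas instance
HEADERS = {"Authorization": "Bearer REPLACE_ME"}  # placeholder API token

def get_json(path: str, **params) -> list:
    resp = requests.get(f"{BASE}{path}", headers=HEADERS, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()

def build_course_dataset(course_id: int, assignment_id: int) -> pd.DataFrame:
    # 1. Submission status per student for one assignment.
    subs = pd.DataFrame(
        get_json(f"/courses/{course_id}/assignments/{assignment_id}/submissions", per_page=100)
    )[["user_id", "submitted_at", "late", "missing"]]

    # 2. Per-student page views and participations from the course analytics endpoint.
    activity = pd.DataFrame(
        get_json(f"/courses/{course_id}/analytics/student_summaries", per_page=100)
    )[["id", "page_views", "participations"]].rename(columns={"id": "user_id"})

    # 3. Merge on the shared student identifier so submissions and access patterns
    #    can be analyzed and visualized together.
    return subs.merge(activity, on="user_id", how="left")
```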

5.3. Teacher Role

The previous literature revealed challenges of TFDs regarding teachers, such as low adoption of teacher-facing dashboards due to concerns around ethics and privacy [45], difficulties in understanding and interpreting dashboards [46], limited involvement of teachers in the design and development process of TFDs [47], and resistance to new technology (e.g., teachers did not have much time to check analytics [48] or did not take the opportunity to analyze the data during the authentic courses [49]). Therefore, co-design with teachers plays a key role in solving most of these dilemmas and ensures the integration and alignment of LD and TA, consistency between designers and users, and the usefulness and potential of TEADASH for teachers’ decision making [13]. Teachers’ engagement in the design and implementation of dashboards could close the gap between what TA metrics present and what teachers actually need [50] and augment teachers’ understanding of dashboards [6]. Accordingly, teachers must shift from their conventional role as knowledge providers to designers and facilitators of learning [51]. It is also vital to establish a participatory design culture and a habit among teachers of viewing LD as an inquiry process and TA as part of the teaching culture [52]. In this study, the involved teachers were designers, researchers, and users at the same time. This strengthens the robustness of the dashboard, following the disclosure aspect, since the teachers know what they need to keep track of their students’ learning and measure the LDs, leading to the design of features that respond to their needs. The teachers should be the ones to re-evaluate and reflect on their own designs. However, the teachers did not engage in coding the dashboard, so teachers who were unfamiliar with data analytics or who were non-technical still faced challenges in using and understanding the different visualizations in TEADASH, despite the co-design process.
The differences in the teachers’ roles in this process were analyzed and understood. Teachers who had no experience with data analytics asked pedagogical questions that indirectly motivated the appropriate functions of the dashboard. Teachers with some experience in data analytics directly offered various ideas and expressed expectations. It is essential to let teachers use and experience the tool themselves to recognize the pros and cons of this dashboard.
When the dashboard was deployed and used in the real-time courses, a mismatch between the expected features and the features that the teachers actually needed was revealed. In this study, although the prediction feature and the network graph showing students’ interactions were suggested during the design process, the teachers did not use them in practice. Instead, new features were proposed to fulfill their actual needs. Therefore, teacher engagement in the design and development process, as proposed in the prior literature [13], is not enough to fulfill teachers’ needs. Letting the teachers use the tool in practice helps to close the gap between the features teachers expect and the features they really use. The teachers’ use of dashboards in real-world cases also allows them to come up with new designs or features based on their teaching practices. Another observation is that, during the design process, the teachers were more interested in following students’ progress at both the individual and overview levels; however, this interest changed over time toward a summary of an individual or an overview of the course instead. This may be explained by the teachers’ limited time, as well as the privacy and data protection rules, which shifted the teachers’ preference from monitoring individuals toward capturing an overview of the courses.
Designing and developing a dashboard is costly, yet teachers’ actual use does not match their stated demands for the dashboard. The promising benefits of TA and dashboards are known, but when a dashboard is deployed, teachers do not use it in their daily routines [48]. The reasons for this issue relate to technology, humans, and the workplace. Regarding technology, some features need to be strengthened to attract teachers more. Regarding humans, teachers should be willing to adopt and use the dashboard more often in order to enhance their teaching practices. Regarding the workplace, the management team should have policies to support teachers in adopting new technology (i.e., provide more time or adjust policy to integrate new technology into teachers’ daily practices). Moreover, using this dashboard to support teaching practices was a new experience for the teachers in this study, so slow adoption was foreseeable. The teachers need time to get used to the tool and to add this activity to their daily routines.

6. Conclusions

Analytics about teachers, their methods, and their actions have not received much attention [6]. Several studies have moved in this direction by investigating how to make TA effective for teachers, especially through dashboards. Existing dashboards for teachers have not realized their full potential to create an impact on teachers and teaching. Many factors affect the usefulness of dashboards, including, for example, the LD (i.e., learning goals, learning activities, context, and demands), the quality and quantity of data, the users and stakeholders (i.e., ability, attitude), the graphs, and the techniques. The purpose of this study was to present the design, development, and evaluation of TEADASH using DSR and the critical theory and to show the deployment of this tool in real-time courses in order to examine how it supports teachers in making decisions on teaching and learning. To address RQ1, the DRs and DPs were identified using the studied courses from two Swedish universities. The dashboard development process involved three cycles with multiple small loops to design the dashboard, fulfill the DPs, and address the earlier challenges.
To answer RQ2, although the dashboard meets teachers’ needs and supports their decision making well thanks to the co-design with teachers, letting teachers use the tool in real-time courses revealed a mismatch between the features that teachers expect from the dashboard and the features that they actually use in their daily teaching practices. Thus, co-designing with teachers is not enough; teachers must also be allowed to use the tool in practice.
TEADASH also has limitations regarding features that are considered for further development, such as automatically sending reminders, automated feedback, integration into Canvas as a plugin instead of a web application, more customization, a more attractive user interface, and more instruction. Additionally, data privacy policies limited this study to some extent, as the involved teachers were uncertain about following the data of individual students. Thus, clear guidance and policy for this kind of tool should be established. Thanks to the variety of the studied courses, the dashboard can be applied to blended or online higher education courses that contain assignment submissions, quizzes, viewing of learning materials, and discussion forums. The dashboard is currently designed for courses with small numbers of students. To apply this dashboard to large-scale courses, the visualization designs and algorithm choices must be reconsidered. Currently, this dashboard extracts data from the Canvas LMS; therefore, it is not compatible with other LMSs.

Author Contributions

Conceptualization, N.B.C.N. and T.K.; methodology, N.B.C.N., M.L., and C.M.Ö.; software, N.B.C.N.; validation, N.B.C.N.; formal analysis, N.B.C.N.; investigation, N.B.C.N.; data curation, N.B.C.N.; writing—original draft preparation, N.B.C.N., M.L., C.M.Ö., and W.J.; writing—review and editing, N.B.C.N., T.K., and W.J.; visualization, N.B.C.N.; supervision, T.K. and W.J. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

This study was approved by the management boards of the two universities involved. Ethical applications were also submitted to the local ethical committees of both universities. Since no personal information was involved, this study did not require ethical approval.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing does not apply to this article.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Table A1. Details of the discussions.

No | Teachers | Duration | Cycle
1 | Main teacher–first course | 80 min | 1st
2 | Main teacher–first course | 34 min | 1st
3 | Main teacher–first course | 66 min | 1st
4 | Main teacher–second course | 60 min | 1st
5 | Two teachers–second course | 60 min | 1st
6 | Three teachers–second course | 75 min | 2nd
7 | Three teachers–second course | 65 min | 2nd
8 | Main teacher–fourth course | 60 min | 2nd
9 | Main teacher–fourth course | 60 min | 2nd
10 | Main teacher–fourth course | 45 min | 2nd
11 | Three teachers–second course | 30 min | 3rd
12 | Main teacher–fourth course | 60 min | 3rd

Appendix A.1. Initial Questions

Did you design the pedagogical elements in the course to support the context of the combination between the learners’ workplaces and the educational offerings from the university?
How were the pedagogical elements for employed adult learners shown in the course (for example, through which educational activities, and learning time)? Could the pedagogical elements for employed adult learners be seen in Canvas?
What did you expect from the students concerning learning behaviors when you designed the elements for employed adult learners?

Appendix A.2. Question List

What are pedagogical designs for employed adult learners? Can the designs be seen on Canvas?
How do the designs help students to achieve the learning outcomes?
Regarding discussions:
Theme 2: teacher initiated -> students posted and commented on each other.
Theme 3: students initiated and developed the discussion (no teacher engagement), more friendly with joy, questions, and comments.
Theme 4: students initiated and teachers answered (there were other kinds of questions not relating to the topic, such as paper not found).
+ What do you think about this progress? What is the role of teachers in discussions?
+ How do the papers, reflection questions, and requirements motivate student engagement?
+ In the synchronous meetings, did you go through/reflect/answer what students have written in the discussions?
+ Did the shortage of time still happen?
+ Here is some feedback from students, and what do you think about these?
“Difficult to know if we should write in the forums or discuss in the lectures”
“More feedback”
“Preferably directly in the forum and not in the following lecture”
Are there problems that you still face?
How do you think to make the designs better on Canvas?
With the discussions, could you assess every theme? Can you do continuous assessment throughout the course or is there still difficulty with assessment?
What do you think about students’ technology knowledge? Do they know how to use Canvas or educational technology?
Do you have prerequisite skills about technology knowledge or similar things for students before they register for the course? And skills that are good to have and skills obtained after the course?

Appendix A.3. Ordered Key Activities in the Meetings for the Evaluations of the Design and Development Process

Demonstration of the TAD, interpreting the results in relation to the LDs and the teachers’ wishes and analyzing students’ learning patterns, resulting in uncovering issues in either the LDs or students’ learning.
Teachers’ reflections on the findings and evaluation of the TAD.
Discussion between the researcher and the teachers to explore the pedagogical solutions and improvements for the TAD regarding data and functions.
Table A2. Data Fields for TEADASH.

No | Data | Description
1 | Course Id | The Id of a course
2 | Course Name | The name of the course
3 | Student Id | The Id of a student
4 | Student Name | The name of a student
5 | Assignment Id | The Id of an assignment
6 | Assignment Title | The title of an assignment
7 | Due At | The deadline of an assignment
8 | Submitted At | The date a student submits an assignment
9 | Status | The submission status of a student for an assignment set by Canvas (floating, missing, late, on-time)
10 | Accessed Date | The date a student accesses the Canvas course page
11 | View Count | The number of times a student looks at the Canvas course page in a day
12 | Topic Id | The Id of a discussion forum
13 | Topic Title | The title of a discussion forum
14 | Entry Id | The Id of an entry
15 | Entry User Id | The Id of a user who creates the entry
16 | Entry Username | The name of a user who creates the entry
17 | Reply Id | The Id of a reply for an entry
18 | Reply User Id | The Id of a user who creates a reply
19 | Reply Username | The name of a user who creates a reply
Figure A1. The first tab—overview of one course.
Figure A2. The second tab—comparison between courses (submission pattern).
Figure A3. The second tab—comparison between courses (access pattern).

References

1. Alhamadi, M.; Alghamdi, O.; Clinch, S.; Vigo, M. Data Quality, Mismatched Expectations, and Moving Requirements: The Challenges of User-Centred Dashboard Design. In Proceedings of the Nordic Human-Computer Interaction Conference, Aarhus, Denmark, 8–12 October 2022; pp. 1–14.
2. Gress, C.L.; Fior, M.; Hadwin, A.F.; Winne, P.H. Measurement and assessment in computer-supported collaborative learning. Comput. Hum. Behav. 2010, 26, 806–814.
3. Dourado, R.A.; Rodrigues, R.L.; Ferreira, N.; Mello, R.F.; Gomes, A.S.; Verbert, K. A teacher-facing learning analytics dashboard for process-oriented feedback in online learning. In Proceedings of the LAK21: 11th International Learning Analytics and Knowledge Conference, Irvine, CA, USA, 12–16 April 2021; pp. 482–489.
4. Kaliisa, R.; Dolonen, J.A. CADA: A teacher-facing learning analytics dashboard to foster teachers’ awareness of students’ participation and discourse patterns in online discussions. Technol. Knowl. Learn. 2023, 28, 937–958.
5. Wise, A.F.; Jung, Y. Teaching with analytics: Towards a situated model of instructional decision-making. J. Learn. Anal. 2019, 6, 53–69.
6. Ndukwe, I.G.; Daniel, B.K. Teaching analytics, value and tools for teacher data literacy: A systematic and tripartite approach. Int. J. Educ. Technol. High. Educ. 2020, 17, 22.
7. Kaliisa, R.; Misiejuk, K.; López-Pernas, S.; Khalil, M.; Saqr, M. Have Learning Analytics Dashboards Lived Up to the Hype? A Systematic Review of Impact on Students’ Achievement, Motivation, Participation and Attitude. In Proceedings of the 14th Learning Analytics and Knowledge Conference, Kyoto, Japan, 18–22 March 2024; pp. 295–304.
8. Susnjak, T.; Ramaswami, G.S.; Mathrani, A. Learning analytics dashboard: A tool for providing actionable insights to learners. Int. J. Educ. Technol. High. Educ. 2022, 19, 12.
9. Macfadyen, L.P.; Myers, A. The “IKEA Model” for Pragmatic Development of a Custom Learning Analytics Dashboard; ASCILITE Publications: Christchurch, New Zealand, 2023; pp. 482–486.
10. Keim, D.; Andrienko, G.; Fekete, J.-D.; Görg, C.; Kohlhammer, J.; Melançon, G. Visual Analytics: Definition, Process, and Challenges; Springer: Berlin/Heidelberg, Germany, 2008.
11. Khine, M.S. Learning Analytics and Adaptive Instructional Design Strategies. Int. J. Learn. 2020, 6, 225–229.
12. Edson, A.J.; Phillips, E.D. Connecting a teacher dashboard to a student digital collaborative environment: Supporting teacher enactment of problem-based mathematics curriculum. ZDM-Math. Educ. 2021, 53, 1285–1298.
13. Echeverria, V.; Martinez-Maldonado, R.; Shum, S.B.; Chiluiza, K.; Granda, R.; Conati, C. Exploratory versus explanatory visual learning analytics: Driving teachers’ attention through educational data storytelling. J. Learn. Anal. 2018, 5, 73–97.
14. Nguyen, A.; Tuunanen, T.; Gardner, L.; Sheridan, D. Design principles for learning analytics information systems in higher education. Eur. J. Inf. Syst. 2021, 30, 541–568.
15. Jayashanka, R.; Hettiarachchi, E.; Hewagamage, K. Technology Enhanced Learning Analytics Dashboard in Higher Education. Electron. J. e-Learn. 2022, 20, 151–170.
16. Corrin, L.; Kennedy, G.; De Barba, P.; Bakharia, A.; Lockyer, L.; Gašević, D.; Williams, D.; Dawson, S.; Copeland, S. Loop: A learning analytics tool to provide teachers with useful data visualisation. In Proceedings of the Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education, Perth, Australia, 30 November–3 December 2015; pp. 409–413.
17. de Menezes, D.A.T.; Florêncio, D.L.; Silva, R.E.D.; Nunes, I.D.; Schiel, U.; de Aquino, M.S. David—A model of data visualization for the instructional design. In Proceedings of the 2017 IEEE 17th International Conference on Advanced Learning Technologies (ICALT), Timisoara, Romania, 3–7 July 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 281–285.
18. Nguyen, Q.; Huptych, M.; Rienties, B. Linking students’ timing of engagement to learning design and academic performance. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge, Sydney, NSW, Australia, 7–9 March 2018; pp. 141–150.
19. Molenaar, I.; Knoop-van Campen, C. Teacher dashboards in practice: Usage and impact. In Data Driven Approaches in Digital Education: Proceedings of the 12th European Conference on Technology Enhanced Learning, EC-TEL 2017, Tallinn, Estonia, 12–15 September 2017; Proceedings 12; Springer: Berlin/Heidelberg, Germany, 2017; pp. 125–138.
20. Schmitz, M.; Scheffel, M.; Bemelmans, R.; Drachsler, H. FoLA 2—A Method for Co-Creating Learning Analytics-Supported Learning Design. J. Learn. Anal. 2022, 9, 265–281.
21. Eradze, M.; Rodríguez-Triana, M.; Milikic, N.; Laanpere, M.; Tammets, K. Contextualising learning analytics with classroom observations: A case study. IDA Interact. Des. Archit. (S) 2020, 44, 71–95.
22. Schmitz, M.; Scheffel, M.; van Limbeek, E.; Bemelmans, R.; Drachsler, H. “Make It Personal!”—Gathering Input from Stakeholders for a Learning Analytics-Supported Learning DesignTool. In Lifelong Technology-Enhanced Learning: Proceedings of the 13th European Conference on Technology Enhanced Learning, EC-TEL 2018, Leeds, UK, 3–5 September 2018; Proceedings 13; Springer: Berlin/Heidelberg, Germany, 2018; pp. 297–310.
23. Dörk, M.; Feng, P.; Collins, C.; Carpendale, S. Critical InfoVis: Exploring the politics of visualization. In Proceedings of the CHI’13 Extended Abstracts on Human Factors in Computing Systems, Paris, France, 27 April–2 May 2013; pp. 2189–2198.
24. Echeverria, V.; Martinez-Maldonado, R.; Granda, R.; Chiluiza, K.; Conati, C.; Buckingham Shum, S. Driving data storytelling from learning design. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge, Sydney, NSW, Australia, 7–9 March 2018; pp. 131–140.
25. Peffers, K.; Tuunanen, T.; Rothenberger, M.A.; Chatterjee, S. A design science research methodology for information systems research. J. Manag. Inf. Syst. 2007, 24, 45–77.
26. Gregor, S.; Chandra Kruse, L.; Seidel, S. Research perspectives: The anatomy of a design principle. J. Assoc. Inf. Syst. 2020, 21, 2.
27. Semyonovich, V.L. Mind in society. In The Development of Higher Psychological Processes; Harvard University Press: Cambridge/London, UK, 1978.
28. Paas, F.; Renkl, A.; Sweller, J. Cognitive load theory and instructional design: Recent developments. Educ. Psychol. 2003, 38, 1–4.
29. Agarwal, P.K.; Bain, P.M.; Chamberlain, R.W. The value of applied research: Retrieval practice improves classroom learning and recommendations from a teacher, a principal, and a scientist. Educ. Psychol. Rev. 2012, 24, 437–448.
30. Roediger, H.L.; Butler, A.C. The critical role of retrieval practice in long-term retention. Trends Cogn. Sci. 2011, 15, 20–27.
31. Deslauriers, L.; McCarty, L.S.; Miller, K.; Callaghan, K.; Kestin, G. Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proc. Natl. Acad. Sci. USA 2019, 116, 19251–19257.
32. Butler, A.C.; Marsh, E.J.; Slavinsky, J.; Baraniuk, R.G. Integrating cognitive science and technology improves learning in a STEM classroom. Educ. Psychol. Rev. 2014, 26, 331–340.
33. Oakley, B.A.; Sejnowski, T.J. What we learned from creating one of the world’s most popular MOOCs. NPJ Sci. Learn. 2019, 4, 7.
34. Mayer, R.E. Thirty years of research on online learning. Appl. Cogn. Psychol. 2019, 33, 152–159.
35. Castro-Alonso, J.C.; de Koning, B.B.; Fiorella, L.; Paas, F. Five strategies for optimizing instructional materials: Instructor-and learner-managed cognitive load. Educ. Psychol. Rev. 2021, 33, 1379–1407.
36. Nguyen, N.B.C.; Karunaratne, T. Learning Analytics with Small Datasets—State of the Art and Beyond. Educ. Sci. 2024, 14, 608.
37. Nguyen, N.B.C. Improving Online Learning Design for Employed Adult Learners. In Proceedings of the European Conference on e-Learning, Brighton, UK, 27–28 October 2022; pp. 302–309.
38. Hevner, A.R.; March, S.T.; Park, J.; Ram, S. Design science in information systems research. MIS Q. 2004, 2004, 75–105.
39. Khosravi, H.; Shabaninejad, S.; Bakharia, A.; Sadiq, S.; Indulska, M.; Gasevic, D. Intelligent Learning Analytics Dashboards: Automated Drill-Down Recommendations to Support Teacher Data Exploration. J. Learn. Anal. 2021, 8, 133–154.
40. Gibson, A.; Kitto, K. Analysing reflective text for learning analytics: An approach using anomaly recontextualisation. In Proceedings of the Fifth International Conference on Learning Analytics and Knowledge, Poughkeepsie, NY, USA, 16–20 March 2015; pp. 275–279.
41. Schmitz, M.; Scheffel, M.; van Limbeek, E.; van Halem, N.; Cornelisz, I.; van Klaveren, C.; Bemelmans, R.; Drachsler, H. Investigating the relationships between online activity, learning strategies and grades to create learning analytics-supported learning designs. In Lifelong Technology-Enhanced Learning: Proceedings of the 13th European Conference on Technology Enhanced Learning, EC-TEL 2018, Leeds, UK, 3–5 September 2018; Proceedings 13; Springer: Berlin/Heidelberg, Germany, 2018; pp. 311–325.
42. Harindranathan, P.; Folkestad, J. Learning Analytics to Inform the Learning Design: Supporting Instructors’ Inquiry into Student Learning in Unsupervised Technology-Enhanced Platforms. Online Learn. 2019, 23, 34–55.
43. Cerro Martinez, J.P.; Guitert Catasus, M.; Romeu Fontanillas, T. Impact of using learning analytics in asynchronous online discussions in higher education. Int. J. Educ. Technol. High. Educ. 2020, 17, 39.
44. Nguyen, Q.; Rienties, B.; Toetenel, L. Unravelling the dynamics of instructional practice: A longitudinal study on learning design and VLE activities. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference, Vancouver, BC, Canada, 13–17 March 2017; Association for Computing Machinery: New York, NY, USA, 2017; pp. 168–177.
45. Aslan, S.; Alyuz, N.; Tanriover, C.; Mete, S.E.; Okur, E.; D’Mello, S.K.; Arslan Esme, A. Investigating the impact of a real-time, multimodal student engagement analytics technology in authentic classrooms. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–12.
46. Garnett, T.; Button, D. A case study exploring video access by students: Wrangling and visualising data for measuring digital behaviour. In Proceedings of the 33rd International Conference of Innovation, Practice and Research in the Use of Educational Technologies in Tertiary Education, Adelaide, Australia, 27–30 November 2016; p. 225.
47. Kaliisa, R.; Jivet, I.; Prinsloo, P. A checklist to guide the planning, designing, implementation, and evaluation of learning analytics dashboards. Int. J. Educ. Technol. High. Educ. 2023, 20, 28.
48. Wen, Y.; Song, Y. Learning analytics for collaborative language learning in classrooms. Educ. Technol. Soc. 2021, 24, 1–15.
49. Lockyer, L.; Dawson, S. Learning designs and learning analytics. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge, Banff, AB, Canada, 27 February–1 March 2011; pp. 153–156.
50. Mangaroska, K.; Giannakos, M. Learning analytics for learning design: A systematic literature review of analytics-driven design to enhance learning. IEEE Trans. Learn. Technol. 2018, 12, 516–534.
51. Laurillard, D. Teaching as a Design Science: Building Pedagogical Patterns for Learning and Technology; Routledge: London, UK, 2013.
52. Mangaroska, K.; Giannakos, M. Learning analytics for learning design: Towards evidence-driven decisions to enhance learning. In Data Driven Approaches in Digital Education: Proceedings of the 12th European Conference on Technology Enhanced Learning, EC-TEL 2017, Tallinn, Estonia, 12–15 September 2017; Proceedings 12; Springer: Berlin/Heidelberg, Germany, 2017; pp. 428–433.
Figure 1. The DSR cycles of this study (adapted from [25]).
Figure 2. The timeline of this study.
Figure 3. Interactions in forums and the details (DP1).
Figure 4. Student access (DP2).
Figure 5. Submission situation (DP3).
Figure 6. Prediction of at-risk students (DP4).
Table 1. Link between learning theories, LD, teachers’ needs, and the DPs of TEADASH.

Course | Learning Theories (Details in Section 3.1) | LD (Details in Section 3.1) | Teachers’ Needs (DRs) | DPs (Features) in TEADASH
First and second courses | Socio-cultural learning | Asynchronous discussion forums | Follow up the situation in forums (DR1) | DP1
 | | Learning materials | Examine if and when the students viewed the uploaded learning materials and watched the videos (DR2) | DP5
 | | Assignments (applicable in the first course) | Follow up access information with the deadlines of assignments (DR3) | DP2
 | | | Follow up submissions of every assignment (DR4) | DP3
Fourth course | Fundamental principles of learning from cognitive psychology | Quizzes; cases (designed as assignment submission and asynchronous discussion forum) | Examine the frequency of student participation, the number of attempts in quizzes, and the timing of accessing the course page (DR7) | DPs 2, 3, 4
 | | | Examine student engagement and completion rates for different tasks (DR8) | DPs 1, 3, 4
 | | | Explore whether students interacted with their peers in discussion forums (DR9) | DPs 1, 4
 | | | Predict students who need support (DR5) | DP4
 | | | Examine how often students access the course page (DR6) | DP2
Table 2. Evaluation results following DRs.

Course | DRs | Satisfied?
Third and fifth courses | Follow up the situation in forums (DR1) | Yes, used in the fifth course but unused in the third course
 | Examine if and when the students viewed the uploaded learning materials and watched the videos (DR2) | Yes, but needs improvement (watching videos is invalid for the third course)
 | Follow up access information with the deadlines of assignments (DR3) | Yes
 | Follow up submissions of every assignment (DR4) | Not tested due to no assignments except the final examination
Fourth course | Predict students who need support (DR5) | Yes, but unused
 | Examine how often students access the course page (DR6) | Yes
 | Examine the frequency of student participation, the number of attempts in quizzes, and the timing of accessing the course page (DR7) | Yes (except the number of attempts)
 | Examine student engagement and completion rates for different tasks (DR8) | Yes, but can be improved
 | Explore whether students interacted with their peers in discussion forums (DR9) | Yes, but unused, can be improved
Table 3. Teachers’ overall experience of TEADASH.

Aspects | Satisfied?
Usefulness | Yes
Informativeness | Yes, but more information can be added
Time saving | Yes and no
Ease of use | Yes
Ease of understanding | Depends on teachers’ experience
User interface | Needs improvement
Authorization | Yes
Tool performance | Needs improvement
