TEADASH: Implementing and Evaluating a Teacher-Facing Dashboard Using Design Science Research
Abstract
1. Introduction
- RQ1: What are the requirements for a TFD that are useful in assisting teachers to make decisions on teaching and learning?
- RQ2: How can TEADASH be designed, developed, and evaluated to assist teachers in making decisions on teaching and learning?
2. Related Research
2.1. Existing Solutions and Limitations
2.2. Using Design Science Research in Developing Teacher-Facing Dashboards
2.3. Critical Theory in Information Visualization
3. Method
3.1. Research Context
3.2. Design and Development Process of TEADASH
3.3. Real-Time Evaluation of TEADASH
3.4. Ethical Consideration in TEADASH
4. Results
4.1. From LD and Teachers’ Needs to Design Requirements
4.2. Design Principles for the Development of Teaching Analytics Dashboard
4.3. Selection for Design of Visualizations in TEADASH
4.4. Evaluation Results
5. Discussion
5.1. TEADASH for Teachers’ Decision Making
5.2. Technical Aspects
5.3. Teacher Role
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A
No | Teachers | Duration | Cycle |
---|---|---|---|
1 | Main teacher–first course | 80 min | 1st |
2 | Main teacher–first course | 34 min | |
3 | Main teacher–first course | 66 min | |
4 | Main teacher–second course | 60 min | |
5 | Two teachers–second course | 60 min | |
6 | Three teachers–second course | 75 min | 2nd |
7 | Three teachers–second course | 65 min | |
8 | Main teacher–fourth course | 60 min | |
9 | Main teacher–fourth course | 60 min | |
10 | Main teacher–fourth course | 45 min | |
11 | Three teachers–second course | 30 min | 3rd |
12 | Main teacher–fourth course | 60 min | |
Appendix A.1. Initial Questions
Appendix A.2. Question List
Appendix A.3. Ordered Key Activities in the Meetings for the Evaluations of the Design and Development Process
No | Data | Description |
---|---|---|
1 | Course Id | The Id of a course |
2 | Course Name | The name of the course |
3 | Student Id | The Id of a student |
4 | Student Name | The name of a student |
5 | Assignment Id | The Id of an assignment |
6 | Assignment Title | The title of an assignment |
7 | Due At | The deadline of an assignment |
8 | Submitted At | The date a student submits an assignment |
9 | Status | The submission status of a student for an assignment set by Canvas (floating, missing, late, on-time) |
10 | Accessed Date | The date a student accesses the Canvas course page |
11 | View Count | The number of times a student looks at the Canvas course page in a day |
12 | Topic Id | The Id of a discussion forum |
13 | Topic Title | The title of a discussion forum |
14 | Entry Id | The Id of an entry |
15 | Entry User Id | The Id of a user who creates the entry |
16 | Entry Username | The name of a user who creates the entry |
17 | Reply Id | The Id of a reply for an entry |
18 | Reply User Id | The Id of a user who creates a reply |
19 | Reply Username | The name of a user who creates a reply |
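The data fields above can be grouped into a few record types. As an illustrative sketch only (the class and function names below are our own, not Canvas API identifiers), the assignment-related fields might be modeled as a record, with the Canvas submission status (floating, missing, late, on-time) approximated from the two timestamps:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class SubmissionRecord:
    """One assignment-related row of the data listed above (illustrative field names)."""
    course_id: int
    course_name: str
    student_id: int
    student_name: str
    assignment_id: int
    assignment_title: str
    due_at: Optional[datetime]       # deadline of the assignment
    submitted_at: Optional[datetime] # when the student submitted, if at all
    status: str                      # Canvas-reported: floating, missing, late, or on-time

def derive_status(due_at: Optional[datetime],
                  submitted_at: Optional[datetime]) -> str:
    """Approximate the Canvas status labels from the two dates.

    This mirrors, but does not claim to replicate, Canvas's own logic:
    no deadline -> floating; no submission -> missing; otherwise compare dates.
    """
    if due_at is None:
        return "floating"
    if submitted_at is None:
        return "missing"
    return "late" if submitted_at > due_at else "on-time"
```

For example, `derive_status(datetime(2024, 1, 10), datetime(2024, 1, 11))` would yield `"late"` under these assumptions.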
Course | Learning Theories (Details in Section 3.1) | LD (Details in Section 3.1) | Teachers’ Needs (DRs) | DPs (Features) in TEADASH |
---|---|---|---|---|
First and second courses | Socio-cultural learning | Asynchronous discussion forums | Follow up the situation in forums (DR1) | DP1
 | | Learning materials | Examine if and when the students viewed the uploaded learning materials and watched the videos (DR2) | DP5
 | | Assignments (applicable in the first course) | Follow up access information with the deadlines of assignments (DR3) | DP2
 | | | Follow up submissions of every assignment (DR4) | DP3
Fourth course | Fundamental principles of learning from cognitive psychology | Quizzes; cases (designed as assignment submission and asynchronous discussion forum) | Examine the frequency of student participation, the number of attempts in quizzes, and the timing of accessing the course page (DR7) | DPs 2, 3, 4
 | | | Examine student engagement and completion rates for different tasks (DR8) | DPs 1, 3, 4
 | | | Explore whether students interacted with their peers in discussion forums (DR9) | DPs 1, 4
 | | | Predict students who need support (DR5) | DP4
 | | | Examine how often students access the course page (DR6) | DP2
Course | DRs | Satisfied? |
---|---|---|
Third and fifth courses | Follow up the situation in forums (DR1) | Yes; used in the fifth course but unused in the third course
 | Examine if and when the students viewed the uploaded learning materials and watched the videos (DR2) | Yes, but needs improvement (watching videos is not applicable in the third course)
 | Follow up access information with the deadlines of assignments (DR3) | Yes
 | Follow up submissions of every assignment (DR4) | Not tested, since there were no assignments other than the final examination
Fourth course | Predict students who need support (DR5) | Yes, but unused
 | Examine how often students access the course page (DR6) | Yes
 | Examine the frequency of student participation, the number of attempts in quizzes, and the timing of accessing the course page (DR7) | Yes (except the number of attempts)
 | Examine student engagement and completion rates for different tasks (DR8) | Yes, but can be improved
 | Explore whether students interacted with their peers in discussion forums (DR9) | Yes, but unused and can be improved
Aspects | Satisfied? |
---|---|
Usefulness | Yes |
Informativeness | Yes, but more information can be added |
Time saving | Yes and no |
Ease of use | Yes |
Ease of understanding | Depends on teachers’ experience |
User interface | Needs improvement |
Authorization | Yes |
Tool performance | Needs improvement |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Nguyen, N.B.C.; Lithander, M.; Östlund, C.M.; Karunaratne, T.; Jobe, W. TEADASH: Implementing and Evaluating a Teacher-Facing Dashboard Using Design Science Research. Informatics 2024, 11, 61. https://doi.org/10.3390/informatics11030061