Article

Integrating Digital Personalised Learning into Early-Grade Classroom Practice: A Teacher–Researcher Design-Based Research Partnership in Kenya

1 EdTech Hub and Jigsaw, London NW10 4LL, UK
2 Manchester Institute of Education, University of Manchester, Manchester M13 9PL, UK
3 School of Education, Kenyatta University, Nairobi P.O. Box 43844-00100, Kenya
4 Women Educational Researchers of Kenya, Nairobi 00100, Kenya
5 Kindergarten Experts International, P.O. Box 648, Nairobi 00300, Kenya
* Author to whom correspondence should be addressed.
Educ. Sci. 2025, 15(6), 698; https://doi.org/10.3390/educsci15060698
Submission received: 1 March 2025 / Revised: 5 May 2025 / Accepted: 25 May 2025 / Published: 4 June 2025
(This article belongs to the Special Issue Embedding Mobile Technologies in the Classroom)

Abstract: Although growing evidence suggests that digital personalised learning (DPL) has the potential to enhance learning outcomes, there is little research about the effective implementation and integration of DPL into the classroom. The aim of this study is to investigate the pedagogical implications of integrating a DPL tool into Kenyan early-grade classrooms to bridge the gap between theory and practice. This paper reports on systematic, design-based research conducted over three years, featuring five phases, each testing iterations to specific aspects of DPL implementation. The findings demonstrate that the pedagogic dimensions of classroom-integrated DPL are pivotal to its effective uptake and implementation. A key research contribution is the identification of a distinct gap between theoretical and practical conceptualisations of DPL, with teachers focused primarily on curriculum alignment and classroom management. The analysis also identified teachers’ central role in the process of personalisation, nuancing existing DPL frameworks by exploring interactions between the digital and classroom environments, as well as highlighting important considerations around access and equality. Recommendations include the co-design of DPL with teachers, drawing on their pedagogical perspectives to enhance integrative approaches.

1. Introduction

There is a growing body of evidence that digital personalised learning (DPL, see Section 2.1) has the potential to enhance learning outcomes (Alrawashdeh et al., 2024; Lin et al., 2024; Major et al., 2021; Tlili et al., 2024; Zheng et al., 2022). However, the multifaceted design of DPL and many variations in its implementation demand greater critical examination of which specific aspects of DPL tools work and why. This is compounded by notable evidence gaps identified by recent reviews of the literature. First, systematic reviews on DPL indicate that the majority of existing research has taken place in high-income countries, largely neglecting the educational contexts of low- and middle-income countries (LMICs; Alrawashdeh et al., 2024; Van Schoors et al., 2021; Zheng et al., 2022). Second, there is a lack of research on the integration of DPL into existing classroom practice, with most of the literature focusing on “supplementary” approaches that are implemented outside of regular classroom instruction (see Section 2.2; Major et al., 2021; Van Schoors et al., 2025).
This paper makes an important contribution toward addressing this evidence gap by investigating the integration of a DPL tool into early-grade (pre-primary and lower primary) classrooms in Kenya, a lower-middle-income country in East Africa. Over three years of incremental design-based research (DBR, see Section 3.2), we examined different aspects of implementation, from stakeholders’ experiences of using the tool to digital strategies to support its uptake. The five phases of DBR reported in this paper tested design iterations to address identified implementation challenges, allowing the research team both to analyse specific dimensions of the DPL tool in context and to feed the findings incrementally into the design and development of the intervention. Contributing to a Special Issue on Embedding Mobile Technologies in the Classroom, this paper not only offers insights gained over a sustained period of iterative implementation but also attends to the multidimensional nature of DPL.
Throughout the study, specific priority was given to facilitating teachers’ voices as co-contributors to the DBR process, recognising that research on DPL has tended to focus on technological implementation, overlooking pedagogical perspectives (Vanbecelaere & Benton, 2021). Furthermore, evidence suggests the potential benefits of combining the insights of teachers with DPL systems to maximise the affordances of personalisation (Sun et al., 2024a; Molenaar, 2022). As such, a key contribution of this study is a bridging of the gap between educational theory and practice—namely, between academic theories of DPL and teachers’ experiences of integrating a DPL tool into their classrooms—in order to ensure that DPL research is grounded in pedagogical realities.
Complementary to this paper is a published chapter (Major et al., 2024) that critically reflects on the theoretical underpinnings of the design-based research approach. While the present paper builds on that chapter, it offers two distinct contributions. First, we present and discuss the research findings, as opposed to reflecting solely on the methodological framework. Second, we report on subsequent phases of the DBR, extending the methodological framework across three years of research in multiple education contexts.
While the paper reports on five phases of DBR, each of which had distinct design conjectures and data collection methods, the research was informed by one overarching research question:
What are the pedagogical implications of integrating a DPL tool into Kenyan early-grade classrooms?
In this paper, our conceptualisation of “pedagogy” draws on Beetham and Sharpe’s (2013) understanding of the term, in its original sense, as meaning “guidance-to-learn: learning in the context of teaching, and teaching that has learning as its goal” (p. 43). They assert that, at the heart of “pedagogy”, is an essential dialogue between teaching and learning, which is further framed by how these activities are planned and structured. Understood as such, the integration of technology into a pedagogical setting (i.e., a classroom) is necessarily going to change not only the ways in which teaching and learning take place but also what constitutes effective pedagogy.
This idea of pedagogical change underpins our overarching research question. It also informs the study’s two sub-research questions, which focus on two specific aspects of pedagogical design:
  • What is the pedagogical role of the DPL tool in supporting learning?
  • What is the pedagogical role of teachers in relation to the DPL tool?
The reported design-based research, while involving a variety of methods and sample groups, retained a consistent focus on these two sub-research questions. This paper begins by outlining the context of the research, including our conceptualisation of DPL, a comparison of different implementation methods, and a description of the DPL model investigated in Kenya. A summary of the overarching methodology follows, including the adopted theoretical framework of intermediate theory building and DBR phases. These phases are then presented sequentially, each providing an overview of design conjectures and iterations, methods, and key findings. Finally, the discussion aims to bridge the theory–practice gap, reflecting on the implications of the findings for the design and implementation of classroom-integrated DPL in a way that is grounded in contextualised pedagogical practice.

2. Context

2.1. Defining Digital Personalised Learning

Research on the role of technology in personalised learning dates back to the early 20th century, with the introduction of “teaching machines”, and has experienced repeated waves of interest and development in the decades since (Holmes et al., 2018; Watters, 2021). More recently, increased access to digitised tools has been argued to enhance the potential for responsive and adaptive systems to offer personalised learning opportunities within a digital environment, albeit with varying success and shortcomings in practice (FitzGerald et al., 2018). The design of such tools typically draws on a paradigm of education that proposes to shift away from a one-size-fits-all approach toward one that is tailored to individual needs (Aleven et al., 2017; Gro, 2017). This seeks not only to acknowledge the variation amongst learners’ cognitive and non-cognitive characteristics (Sampson & Karagiannidis, 2002) but also to optimise individual outcomes in these areas (Bernacki et al., 2021). Notwithstanding important critiques—including concerns about over-reliance on statistical models and databases for educational decision making and exaggerated claims about educational effect (Pelletier, 2024; Selwyn, 2020; Shiohira & Holmes, 2023; Watters, 2021)—DPL has gained significant attention in education research, policy, and practice.
While developments in technology have expanded the potential of DPL tools, this has resulted in a broad conceptualisation of what “digital personalised learning” comprises, with tools differing considerably and a confusing lack of common definitions or terminology (Van Schoors et al., 2021). A range of frameworks have been offered to differentiate amongst the various dimensions of personalisation. For instance, Holmes et al. (2018) offer a six-dimensional model of personalisation focused on aspects of the learning process that can be personalised, while FitzGerald et al. (2018) propose a similar framework focused on personalisation within technology-enhanced learning. Vandewaetere and Clarebout (2014) focus further on the adaptivity design within DPL tools, offering a four-dimensional framework of the technical process of adaptation alongside three context-related elements related to the hardware, time, and place of use.
In this paper, we follow Van Schoors et al.’s (2021) definition of DPL, which was developed following a systematic review of the diverse conceptual and methodological trends in the field. In this definition, DPL is understood as the use of a digital learning environment that adapts to the individual learner, with the goal of optimising individual and/or collaborative learning processes. This personalisation can (1) seek to enhance cognitive, affective, motivational, and/or metacognitive learner characteristics; (2) relate to different aspects of the learning environment, including the nature, number, and sequence of learning tasks, as well as the content, instruction, and support provided; (3) incorporate information from the teacher, learner, or digital tool itself; and (4) be enhanced by teachers through effective use of data generated by DPL tools. This definition is a refinement of Vandewaetere and Clarebout’s (2014) framework, recognising that their framework and others tend to focus on the adaptivity of DPL tools but neglect implementation as a critical factor. Van Schoors et al.’s (2021) definition, therefore, balances dimensions of technical adaptation with contextual factors—including the different ways that DPL can be implemented and the critical role of teachers in this.

2.2. Comparing DPL Implementation Models

Further frameworks have been offered to differentiate amongst the various ways in which DPL tools are implemented to support learning. Van Schoors et al. (2021) make a binary comparison between “integrated” and “standalone” implementation, distinguishing between approaches in which DPL is used as part of a larger learning environment where teachers take additional actions and approaches in which these additional activities are absent. Major and Francis (2020), later echoed by Tailor (2022), propose three implementation models, identified during a rapid review of DPL evidence from LMICs: substitute, supplementary, and integrative approaches (summarised in Table 1). The first of these involves DPL tools being used in lieu of (i.e., in substitution of) teaching, with instruction delivered solely through technology. Second, supplementary approaches are implemented outside of regular classroom instruction and can take place independent of or with teacher guidance. Conversely, integrative approaches involve the coexistence of teacher and technology, where teacher, student, and classroom interactions are taken into account and technology is used to facilitate teaching and learning. Prior research on the integration of technology in education also highlights curriculum alignment as central to this process (Liu et al., 2017; Baylor & Ritchie, 2002; Dias, 1999): “technology is integrated when it is used in a seamless manner to support and extend curriculum objectives and to engage students in meaningful learning. It is not something one does separately; it is part of the daily activities taking place in the classroom” (Dias, 1999, p. 11).
As Van Schoors et al. (2025) acknowledge, such categorisation of DPL implementation approaches risks oversimplification, particularly if variations in the type of DPL tool or the learning environment into which it is being implemented are not taken into account. However, this classification has revealed a critical evidence gap, as Major and Francis’s (2020) rapid evidence review highlighted a skew in the literature, with a focus on research concerning supplementary approaches (comprising 14 of the 24 studies identified). This observation is also supported by a landscape review of trends in DPL in LMICs: 35 of the 40 personalised learning products included in the review were classified as implemented for supplemental learning (UNICEF, 2022).
There is, therefore, an evident need for research on integrative DPL—tools which are designed for and implemented in classrooms, alongside regular instruction. A recent study by Van Schoors et al. (2025) in Flanders, Belgium, begins to address this gap, examining teachers’ perceptions of DPL and its implementation in their lessons. Further evidence is needed, however, from LMIC contexts, which face persistent challenges, including the cost of acquiring and maintaining hardware and a lack of teacher professional development opportunities (Angrist et al., 2023; Hennessy et al., 2022). Moreover, placing teachers at the centre of this research is all the more critical in LMIC contexts, considering the inherent bias that is often embedded in the design and delivery of education technologies, typically developed in high-income countries (Zubairi et al., 2021; Daltry et al., in press).

2.3. Investigating a Classroom-Integrated DPL Model in Kenya

The reported research focuses on a DPL tool developed and implemented by EIDU, which is being scaled across Kenya. Designed to be a cost-sensitive programme for an LMIC context and to function offline to mitigate connectivity challenges, the tool comprises an application (app) installed on low-cost Android smartphones (Friedberg, 2023), with EIDU also providing training and support to teachers on using the tool. The app features two interfaces. The learner interface is accessed through individual learner profiles and contains personalised content for literacy and numeracy learning targeting Kenyan pre- and lower-primary school learners (4–8 years old). The digital content units, each comprising three to five exercises, are structured into strands and substrands, mapped to the Kenyan competency-based national curriculum (CBC) and approved by the Kenya Institute of Curriculum Development.
The teacher interface provides access to two main features: digitised lesson plans and data dashboards. The former derive from two structured pedagogy programmes: at pre-primary, the Tayari programme, developed by RTI and the Kenyan government for numeracy and literacy lessons; at primary, the Early Grade Mathematics (EGM) programme, for numeracy lessons only. Similarly mapped to the CBC, these plans align with the DPL units, allowing the tool to suggest learning content most relevant to that day’s lesson delivery. The second feature involves dashboards that indicate learners’ weekly usage time and digital curriculum progress.
The personalisation design of the EIDU DPL tool involves adapting the sequence of the digital learning units based on individual learners’ learning history, including performance, progress, and engagement data. A long short-term memory (LSTM) neural network is used to predict and select learning units most likely to maximise each learner’s engagement with new content (i.e., to maintain learner interest through ongoing participation; Sun et al., 2024b). The app can also run locally, ensuring offline compatibility.
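For illustration, the sketch below shows how an LSTM-based sequencer of this kind might be structured: a learner’s interaction history is encoded as a sequence of unit IDs, and the network produces an engagement score for each candidate next unit. EIDU’s actual architecture, input features, and training objective are not detailed here, so all names and dimensions are assumptions.

```python
# Illustrative sketch only: EIDU's actual model, features, and training
# objective are not described in this paper; names and dimensions are assumed.
import torch
import torch.nn as nn

class EngagementSequencer(nn.Module):
    """Scores candidate learning units given a learner's interaction history."""
    def __init__(self, n_units: int, embed_dim: int = 32, hidden_dim: int = 64):
        super().__init__()
        self.unit_embedding = nn.Embedding(n_units, embed_dim)
        # The LSTM summarises the sequence of units the learner has completed
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # The head predicts an engagement score for every candidate next unit
        self.scorer = nn.Linear(hidden_dim, n_units)

    def forward(self, history: torch.Tensor) -> torch.Tensor:
        # history: (batch, seq_len) of unit IDs already completed
        embedded = self.unit_embedding(history)
        _, (h_n, _) = self.lstm(embedded)
        return self.scorer(h_n[-1])  # (batch, n_units) engagement logits

model = EngagementSequencer(n_units=500)
history = torch.tensor([[3, 17, 42, 8]])        # one learner's recent units
scores = model(history)
next_unit = scores.argmax(dim=-1).item()        # unit predicted to maximise engagement
print(f"Suggested next unit: {next_unit}")
```

In practice, a model like this would be trained on logged engagement outcomes (for example, whether a learner continued after a given unit), with the highest-scoring unit served next; the description of on-device operation suggests inference runs locally on the smartphone.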
Central to this paper is EIDU’s classroom-integrated implementation model. In addition to the tool’s content being curriculum-aligned to support progress toward national education objectives, devices are used in the classroom during the school day. While the lesson plans are delivered during timetabled slots, the DPL content can be accessed throughout the day, as determined by the teacher. Users are selected to access the DPL tool through a randomly ordered list of learner profiles, unless teachers choose to override this. The integrated nature of the tool therefore depends not only on its alignment with existing curriculum and classroom structures but also on the teachers’ own pedagogical judgement—the implications of which are the primary focus of this paper.
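As a minimal sketch of the selection behaviour described above (and of the automatic handover discussed in Section 4.1.3), the following hypothetical rotation queue shuffles learner profiles into a random order and advances to the next profile unless the teacher overrides the choice; the class and method names are illustrative, not EIDU’s implementation.

```python
# Hypothetical sketch of the described rotation: a randomly ordered profile
# queue with an optional teacher override. Not EIDU's actual code.
import random
from collections import deque

class ProfileRotation:
    def __init__(self, learner_ids: list[str], seed: int | None = None):
        order = learner_ids[:]
        random.Random(seed).shuffle(order)   # random ordering of profiles
        self.queue = deque(order)

    def next_learner(self, override: str | None = None) -> str:
        if override is not None:             # teacher selects a specific learner
            self.queue.remove(override)
            self.queue.append(override)
            return override
        learner = self.queue.popleft()       # otherwise follow the random order
        self.queue.append(learner)           # rotate to the back of the queue
        return learner

rotation = ProfileRotation(["amina", "brian", "chloe"], seed=1)
print(rotation.next_learner())               # next profile shown after handover
print(rotation.next_learner(override="chloe"))
```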

3. Overarching Methodology

This paper reports on five phases of DBR conducted between May 2022 and October 2024, involving a team of 12 researchers based in the UK, Kenya, the US, and Egypt (six of whom co-authored this paper). While each of these phases had distinct research methods, they were informed by the overarching research and sub-research questions and a methodology focused on teacher–researcher collaboration and iterative evaluation in a real-world context. Detailed reflections on this approach are outlined in a recent chapter (Major et al., 2024) and expanded on below.

3.1. Theoretical Framework: Intermediate Theory Building

The methodological design of the study was informed by intermediate theory building: a participatory approach in which teachers and researchers, as “co-inquirers”, bridge the gap between education theory and a real-world educational setting (Hennessy & Deaney, 2009; Hennessy, 2014). This departs from traditional practitioner-led action research or academic approaches that prioritise data gathering, instead offering a reconceptualisation of the roles of researcher and practitioner in which collaboration is the focus (Hennessy & Deaney, 2009).
In this study, intermediate theory building was used as a theoretical framework that underpinned the research questions and analytical strategy but not as a data collection approach. The principle of bridging a gap between research and practice was foundational to the study design: as outlined in the introduction, the (sub-)research questions sought to understand teachers’ own theories of DPL and its integration into existing classroom practice (sub-research question 1) and the impact on their own pedagogical practice (sub-research question 2). More “traditional” methods of data collection were used for each DBR phase, with teachers not involved in collecting data of their own. Nonetheless, collaboration between teachers and researchers was facilitated at intervals throughout the DBR to build a bridge between research and practice, including teachers participating in the validation of existing results or suggesting iterations to the implementation approach (see Section 3.2 and Section 3.5).

3.2. Methodology: Design-Based Research

Design-based research is a systematic approach which can inform the development of educational “products” by combining theory, research findings, and iterative practice (Smørdal et al., 2021). The intertwining of design and research—design being research-based and research being design-based (Bakker, 2019)—makes it a valuable approach for developing practical design principles, patterns, and/or grounded theory that reflect the conditions in which they operate (Anderson & Shattuck, 2012). Limitations, including the inherent “messiness” of DBR and the challenge of sustaining implementation after research completion, are recognised (Buhl et al., 2022; McKenney & Reeves, 2018). However, its focus on collaboration with key stakeholders and on generating practical knowledge relevant to real-world contexts makes it an effective framework for investigating the pedagogical design of education technology (Wang & Hannafin, 2005).
In this paper, we use a process model proposed by Hoadley and Campos (2022) to structure the DBR methodology. This model comprises four stages. First, “grounding” involves the identification of a theoretical gap or practical problem that requires investigation in a real-world setting—in this case, the lack of evidence about classroom-integrated DPL in LMICs and the need to apply a critical, pedagogical lens to identify which aspects of DPL contribute toward education outcomes. “Conjecturing” and “iterating” follow, where researchers set out a theory of action for testing a design and then collect data in the implementation context to refine initial conjectures and produce new designs. This provided a critical opportunity for collaboration, with teachers’ suggestions for iteration being fed directly into the DBR design. It is acknowledged that the reported research adopted a broad approach to iteration: rather than multiple, repetitive cycles of iterating the same aspect of the DPL tool, each phase of the DBR focused on a different dimension of implementation. This is not to say the five phases of DBR were unconnected: they built incrementally, with the results of the previous phase informing the design of the next and feeding directly into implementation (see Section 3.5). However, each phase did focus on a different aspect of the DPL tool’s integration into classrooms in order to test a range of potential solutions to existing challenges, rather than repetitively refining one dimension. Finally, “reflecting” takes place through the analysis of all data collected (with teachers in this study again able to contribute collaboratively to this stage by validating the analysed data in workshops), leading to a variety of “end points”. These outputs, Anderson and Shattuck (2012) contend, do not constitute grand theories or generalisable principles; rather, the development of design principles or grounded theory aims to help researchers and practitioners understand and adjust the intervention within its context. In the case of the present paper, design principles take the form of pedagogical implications, identified during the integration of a DPL tool into early-grade classroom practice in Kenya.

3.3. Summary of Samples and Research Context

While the sampling strategy for each phase is outlined within the relevant methods sections below, Table 2 provides an overview of participants and research contexts in which each phase took place. The DBR involved schools from seven Kenyan counties, balancing small samples to maximise teacher–researcher collaboration (e.g., Phase 2, involving two weeks of intensive, collaborative research with six teachers), with larger samples to assess the impact of new software and hardware features at scale (e.g., Phase 3, involving 4151 schools in an A/B/C test).
There are recognised strengths and limitations to this approach. The educational and geographic differences amongst samples limited our ability to control for specific contextual factors that may influence implementation; however, this was mitigated by the nature of DBR, which is necessarily grounded in the reality in which it is being conducted. Furthermore, the range of sample sizes and educational contexts allowed us to select the most appropriate data collection method for each phase that best suited the DBR iteration being investigated.
The adopted approach also reflected the reality of conducting independent research alongside a scaling education programme. At the point of research inception, EIDU were operational in pre-primary government schools in one Kenyan county (Mombasa). However, over the course of the three-year research partnership with EdTech Hub (data collection taking place between May 2022 and October 2024), EIDU’s DPL programme began a phased rollout to numerous new counties, under agreements signed with 46 of the 47 counties in Kenya (which have jurisdiction over the pre-primary education system). A large-scale pilot involving around 150 schools was also started in Nairobi as a first exploratory step into lower-primary schools. As such, the DBR samples reflect a range of contexts, allowing for design conjectures and iterations to the implementation model to be tested in new educational settings.

3.4. Informed Consent and Ethical Considerations

The study was conducted in accordance with the ethical guidelines set by the British Educational Research Association (2018) and approved by the Scientific and Ethics Review Unit of the Kenya Medical Research Institute (KEMRI) and by the National Commission for Science, Technology and Innovation (NACOSTI).
Following the approved protocols, informed consent for participation was obtained from all adult participants in the study. “Gatekeeper” consent was obtained from teachers for all learners taking part in classroom observations during the study, while headteachers were provided with detailed information sheets via which parents and caregivers were informed of the study and provided with opt-out details. Additionally, local authority approval and teachers’ gatekeeper consent was obtained by EIDU to collect digital data as part of the provision of the DPL tool in classrooms. These data were then anonymised locally on the digital device, uploaded in anonymous form to EIDU’s servers, and shared with researchers for secondary analysis under a comprehensive data sharing agreement.

3.5. Overview of the DBR Phases

Table 3 provides an overview of the five DBR phases, while the methods for each phase are presented in full, in sequential format, in Section 4. Table 3 extends the overview of the methodology outlined in a previous publication (Major et al., 2024, Figure 16.1), with two key additions. First, instead of providing a general overview of the purpose and outcomes of each DBR phase, Table 3 details the specific conjectures and iterations that underpin each one. Second, three additional phases of DBR, which occurred after the methodology chapter was disseminated, are incorporated, providing a comprehensive research account. Additional, non-reported DBR activities are also presented in italics (see Major et al., 2024 for further details).

3.6. Presentation of the DBR Methods and Results

In the following five sub-sections of Section 4, an abridged summary of the methods and results of each DBR phase is presented. These phases are presented sequentially, rather than grouping all methods and results together in separate sections, since the results of each phase directly informed the methods chosen for the subsequent phase. As such, this sequential structure reflects the iterative nature of the DBR itself.
Data analysis involved a two-stage, systematic approach. First, data from each DBR phase were rigorously analysed (as summarised in each methods section below in Section 4), prior to the next phase commencing. Second, after all phases of data collection and analysis were completed, an additional round of thematic analysis was conducted by two UK-based researchers. Using the two sub-research questions as a framework, the most prominent results from each phase were thematically grouped based on the pedagogical roles of (1) the DPL tool in supporting learning and (2) the teacher in relation to the DPL tool. The resulting “overarching” themes (summarised in Table 4) are presented in the five sections below, offering a high-level overview of the results from each phase while retaining a focus on the study’s research questions. This provides the basis for the subsequent discussion.

4. Five Phases of Design-Based Research: Methods and Results

4.1. Phase 1: Foundational Phase of Integrated, Multi-Disciplinary Methods

4.1.1. Design Conjecture and Iteration

The first phase of the DBR was informed by existing scholarship that indicates that the integration of any new technology into the classroom can be unpredictable and complex (Agyei, 2021; Reich, 2020; Outhwaite et al., 2023; Van Schoors et al., 2025). The first phase was, therefore, foundational, seeking to understand stakeholders’ perspectives of implementing the DPL tool and to observe classroom practice before co-designing any iterations to the current implementation model—an overview of which is presented in Table 5.

4.1.2. Methods: Sample, Data Collection, and Analysis

This phase combined “traditional” mixed-methods research with teacher workshops (Major et al., 2024). Data collection took place in May–June 2022 in Mombasa county—the first county in which EIDU’s DPL tool had been provided in all government pre-primary schools. Purposive sampling selected one school in each of the six subcounties, to provide a representative mix of urban, peri-urban, and rural geographies, excluding schools with fewer than 15 learners per grade.
Research protocols were developed for the three strands of data collection and analysis. First, key informant interviews (KIIs) and focus group discussions (FGDs) were conducted with teachers (7 FGDs), headteachers (6 KIIs), early childhood development (ECD) officers (6 KIIs), and EIDU staff (6 KIIs) to gather perspectives on the implementation of the DPL tool in classrooms thus far. Second, direct observations (13 full school days) and indirect observations (91 videos of DPL use, totalling 5 h 7 min in PP1 and 3 h 27 min in PP2) enabled the research team to conduct observational analyses of the DPL tool in practice, including when during the school day the tool was being used and how it was being distributed amongst learners. Third, two innovation workshops with 22 teachers, involving a “user journey mapping” strategy, facilitated the co-design of possible strategies to mitigate identified implementation challenges. Following data collection, qualitative data were transcribed, independently double-coded in Google Docs by pairs of UK and Kenyan researchers, and then imported into NVivo to calculate intercoder reliability (yielding an overall average Cohen’s kappa of 0.75, indicating strong inter-rater agreement) and to conduct thematic analysis (with 414 unique codes across 44 themes identified). Quantitative data were sorted, cleaned, and then analysed in Excel using descriptive statistics and frequency heatmaps to investigate the frequency of different aspects of implementation. Finally, all results and recommendations were triangulated using a “strength of evidence” framework (Building Evidence in Education, 2015) in order to systematically identify patterns amongst the results, ensuring robustness and reliability through cross-verification from multiple sources and perspectives.
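To illustrate the intercoder reliability step, the sketch below computes Cohen’s kappa for one hypothetical coding pair using scikit-learn and then averages across pairs; the code labels and values are invented for illustration.

```python
# Sketch of the intercoder reliability check, assuming each coder's code
# assignments are recorded per transcript excerpt; the data are invented.
from sklearn.metrics import cohen_kappa_score

# One entry per excerpt: the code applied by each member of a coding pair
coder_uk    = ["curriculum", "access", "motivation", "access", "motivation"]
coder_kenya = ["curriculum", "access", "motivation", "curriculum", "motivation"]

kappa = cohen_kappa_score(coder_uk, coder_kenya)
print(f"Cohen's kappa for this pair: {kappa:.2f}")

# Averaging across double-coded transcripts gives the overall figure
pair_kappas = [0.78, 0.72, 0.75]  # one kappa per coding pair (illustrative)
print(f"Average kappa: {sum(pair_kappas) / len(pair_kappas):.2f}")
# Values above roughly 0.6 are conventionally read as substantial agreement.
```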

4.1.3. Results

Four overarching themes were identified when assessing the complete set of findings from Phase 1 alongside the DBR study’s two sub-research questions: three relating to the pedagogical role of the DPL tool in supporting learning (1a. Variation in when the DPL tool is used during the school day; 1b. Teachers’ conceptualisation of the purpose of the DPL tool; and 1c. Suggested improvements to the DPL tool) and one relating to the pedagogical role of teachers in relation to the DPL tool (1d. Distribution of the DPL tool amongst learners).
The first of these overarching themes (1a) related to the time(s) during the school day at which learners were observed using the DPL tool. Direct observation data, capturing the activity during which the DPL tool was used and the length of time for which it was used, revealed the lack of a common approach across classrooms. As presented in Table 6, the DPL tool was observed being used by learners most frequently during break/lunchtime, free time (i.e., unstructured classroom activity), and independent classwork (i.e., activities following whole-class lesson delivery). However, usage of the DPL tool during these activities was not consistent, with usage observed in only 46%, 80%, and 44% of all occurrences of these activities (26, 10, and 16 occurrences, respectively). This indicated that, although the DPL tool was being used, teachers had developed different strategies for embedding it within existing classroom practice.
A second overarching theme (1b) related to teachers’ conceptualisation of the DPL tool. Teachers and other key education stakeholders (headteachers, ECD officers, and EIDU staff) reported perceived benefits of the DPL tool during FGDs and KIIs, the most common of which were the reported improvement in soft skills (learners’ mood and motivation to learn in 19 of 25 FGDs/KIIs), reduction in learner absenteeism from school (16 of 25 FGDs/KIIs), and reported improvements in foundational teaching and learning (16 of 25 FGDs/KIIs). When asked to elaborate on the perceived purpose of the DPL tool, two main pedagogical concepts emerged: first, the development of digital literacy (16 of 25 FGDs/KIIs); and, second, the reinforcement of key concepts through the alignment of the curriculum with the DPL content (12 of 25 FGDs/KIIs). For example, one teacher [School 6, Teacher R1] explained: “it is reinforcing what you have had on the board, you have had in the classroom. Now they see it with their own—they are having it now in their hands”.
Implementation challenges and suggestions for improvement were also identified by all participating stakeholder groups (overarching theme 1c). Foremost amongst these was the request to increase the number of devices available in the classroom (21 of 25 FGDs/KIIs). While the reason for this suggestion was not always provided, two sub-themes suggested that it was predominantly to mitigate the challenge of sharing the device between the teacher and learners (13 of 25 FGDs/KIIs) and the risk that not all learners are able to access the DPL content each day (12 of 25 FGDs/KIIs).
This linked to the final overarching theme (1d): the teacher’s role in distributing the DPL tool amongst learners. Indirect observations via videos of the DPL tool being used were analysed to quantify the way in which the DPL tool was distributed amongst learners in the classroom. Although the app was designed to automatically display the next randomly sequenced learner’s profile picture after 5 min of use, so that learners can independently pass the device between each other, only 46% of the handovers captured on video (44 of 96 instances) were learner to learner. Instead, 48% (46 of 96) of observed handovers involved the teacher facilitating the handover. The data also suggested that reliance on the teacher’s support may lessen as learners get older: learners in PP2 were more likely to initiate handover independently of the teacher (55% of 44 instances), compared to PP1 (37% of 54 observed handovers). However, the observed reliance on the teacher to facilitate the distribution of the device has implications for a classroom-integrated DPL model, due to the additional demand on teachers’ time during other classroom activities.

4.2. Phase 2: Iterative, Co-Learning Phase of Lesson Study

4.2.1. Design Conjecture and Iteration

Through developing a strength of evidence framework based on the results of the first DBR phase and discussing this in a workshop with the implementing partner (Major et al., 2024), three areas of the implementation model were identified as requiring further iteration: the time at which the device is used during the school day (theme 1a), the number of devices available in the classroom (theme 1c), and the learners to whom the DPL tool is distributed (theme 1d). From the results of Phase 1, we conjectured that these three dimensions were central to the integration of the DPL tool into the classroom and affected its potential effectiveness for teaching and learning. The following iterations to the implementation model were therefore proposed (as presented in Table 7): first, for teachers to identify specific groups of learners who would receive priority to use the DPL tool on specific days; and second, to introduce an additional device into the classroom.

4.2.2. Methods: Sample, Data Collection and Analysis

This phase involved an adapted form of lesson study: a method originating from Japanese teacher professional development (Fernandez & Yoshida, 2004; Warwick et al., 2016). Similar to action research methodologies, lesson study has a cyclical approach to research that places emphasis on the development of instruction through reflection (Hanfstingl et al., 2019). It typically comprises cycles of instructional improvement, using four collaborative steps: goal setting or investigation; planning; implementation and research lesson; and debriefing or reflection (Lewis et al., 2006).
The lesson study took place over two weeks in October and November 2022, involving six pre-primary teachers from three of the Phase 1 sample schools in Mombasa county (Case Schools 1, 2, and 3; see Table 8). Three researchers (two from Kenya and one from the UK), after jointly finalising the data collection protocols and standardisation approaches, each spent the entire data collection period embedded within one of the three case schools, working in close partnership with its two teachers. This quasi-ethnographic approach (quasi due to the limited timescale) enabled the researchers to observe the entire lesson study process in the same educational setting, attending to the unique contextual factors affecting the implementation of the DPL tool and building a strong collaborative relationship with the two pre-primary teachers.
The lesson study itself was structured according to four stages, with associated data collection tools developed for each one, as visualised in Figure 1. Goal setting took place through a half-day planning workshop in Mombasa, facilitated by the three researchers and involving all six participating teachers, during which results from the foundational phase of research were discussed. In particular, teachers validated the categorisation of three learner groups: “all learners”, to whom teachers direct whole class lesson delivery; “fast learners”, whom teachers had identified during Phase 1 as the learners who complete independent work the quickest; and “time-takers”, whom teachers identified as often needing additional support. Teachers then completed a “planning matrix”, which mapped typical activities during the school day against the three learner groups, indicating whether they would give the DPL tool to each learner group during each specified activity and why.
The implementation and reflection phases followed, conducted over two weeks. Implementation involved normal classroom practice, with the only change each day being a different learner group receiving priority to use the DPL tool, and a second device being introduced in week 2. Researchers conducted observations, capturing descriptive and reflective data at 10 min intervals on the way in which the DPL tool was or was not being used. At the end of each day, teachers reviewed the planning matrix, reflecting with the researchers through semi-structured interviews on whether they had followed their initial plan or deviated from it, as well as the reasons for that. Finally, a reflection workshop was held at the end of implementation, facilitating discussion about observed changes to planned practice and implementation of recommendations.
A three-stage approach was adopted to facilitate the systematic analysis of the data, drawing on the work of Saldaña (2021). First, the entire data corpus was sorted, inductively coded and re-coded, and categorised in Google Sheets (by two UK researchers). Second, codes and categories were integrated at the school level, thematically analysing codes within the context in which the data were collected. Finally, a cross-case analysis was conducted, comparing the themes which emerged at each school, as well as considering the extent to which themes were replicated across cases. The results were then reported back to the participating teachers during an analysis workshop, during which they could validate or clarify each finding.

4.2.3. Results

Four overarching themes emerged when the full set of Phase 2 results were thematically analysed according to the two sub-research questions: one theme relating to the pedagogical role of the DPL tool in supporting learning (2a. Teachers’ reasoning for distributing the DPL tool at certain times during the school day) and three relating to the pedagogical role of teachers in relation to the DPL tool (2b. Teachers providing personal support to learners; 2c. Teachers personalising which learners receive the DPL tool; and 2d. Impact of an additional device on class management).
A key benefit of the lesson study approach was that it enabled teachers to reflect repeatedly on the ways in which they were integrating the DPL tool into their lessons and their reasons for this. An overarching theme that emerged from this (2a) was the way in which teachers explained their reasons for distributing the DPL tool at certain times during the school day. This provided implicit insight into teachers’ perceptions of the pedagogical purpose of the DPL tool. On multiple occasions while completing their planning and reflection matrices, teachers explained their reasoning for distributing the DPL tool as a way to keep learners engaged. At Case School 2, both teachers stated they would distribute the DPL tool at times in the school day when learners are “free” (i.e., when no other activities are taking place). This practice was consistent for each variable group within the case school (grade, number of devices, and whether the learner was considered a “fast learner” or “time-taker”). The PP2 teacher at Case School 1 gave a similar reason, explaining she would distribute the tool after lessons and during free time to keep learners “busy” (CS1-PP2 Planning Matrix, days 1, 2, 4, and 5; CS1-PP2 Reflection Matrix, days 2 and 3).
Conversely, teachers also presented pedagogical reasons for not handing the DPL tool to learners at certain times during the school day. In Case School 2, both teachers stated that they would not distribute the DPL tool to learners during whole class lesson delivery—neither for numeracy nor literacy lessons, for which digitised lesson plans were provided, nor for other subject lessons—although the teachers did not elaborate beyond stating that it was not appropriate to give the DPL tool to learners while they were teaching (CS2 Planning Matrix & Reflection Matrix, all days). Teachers at Case Schools 1 and 3 more consistently focused on reasons for not distributing the DPL to learners during break and lunchtime, describing these as “time for relaxation”, playing and eating (CS1 Planning Matrix, all days; CS1 Reflection Matrix, days 1, 2, 3, 5, and 6; CS3 Planning Matrix, days 1, 2, 3, 5, and 6; CS3 Researcher Observations, day 1). There were exceptions to this finding. The researcher at Case School 1 did note that some PP1 children “do not mind missing play” to use the tool (CS1-PP1 Researcher Observations, day 4). Case School 2 also differed significantly in this respect, as both teachers would generally distribute the tool to learners in the classroom during break and lunchtime while others were playing (CS2 Planning Matrix, all days; CS2 Researcher Observations, days 1, 2, 3, 4, and 5; CS2 Reflection Matrix, all days). However, the results within this overarching theme (2a) indicate that teachers positioned the DPL tool as something to be used at times that would not distract from other pre-defined activities.
The second overarching theme (2b) relates to how teachers described their own role in relation to the DPL tool, with teachers at all three schools positioning themselves as a source of personal support to the learners. Across all schools, the data suggest that teachers were personalising their own practices to meet the individual needs of the learner—both when using the DPL tool and for other classroom activities. For example, the PP1 teacher at Case School 1 reflected that the “teacher must help the time takers when using the devices”, while the PP2 teacher at the same school commented that “fast learners” “know what they are doing” (CS1 Researcher Observations, days 2, 3, and 4). In addition, “time-takers” at Case School 2 were identified by the teacher as requiring support during regular classroom activities, with the PP1 teacher noting that “no devices are handed over” to “time-takers” because they are being assisted with regular classroom tasks (CS2-PP1 Planning Matrix, day 6).
Another major theme that emerged (2c) was the way in which teachers play a role in personalising which learners receive the DPL tool. All teachers managed the DPL tool by handing it to the learners who had finished their class activities first. Teachers in both PP1 and PP2 at Case School 3 handed the tools to “fast learners”, as they tended to be “the first learner to finish” (CS3 Planning Matrix, day 2; CS3 Researcher Observations, days 2 and 4). The PP1 teacher at Case School 2 displayed similar practices but conceptualised them in terms of “time-takers”, as they said in their lesson study reflections that “time-takers had not completed the individual activities in time to allow hand over” (CS2-PP1 Reflections Matrix, day 3). In contrast, the PP2 teacher at Case School 1 stated that the tool was given to those who did not access it the previous day, and reflected to the researcher that “children who do not get the device at one point know very well that their turn will [come]”, suggesting they were developing strategies to ensure equal use of the DPL tool (CS1-PP2 Planning Matrix, days 1, 2, and 3; CS1-PP2 Researcher Observations, day 1).
A key design iteration of Phase 2 was to introduce a second device into classrooms. Interestingly, the results suggest that few pedagogical changes took place as a result of this addition. However, an overarching theme emerged (2d) related to the impact on teachers’ management of the classroom. Generally, teachers provided positive feedback related to device management; at Case School 1, both teachers shared with the researchers that two devices allowed more learners to use the DPL tool and made it easier to meet the target time set (CS1 Researcher Observations, days 4, 5, and 6). However, the results suggest that the addition of a second DPL tool did have implications for how teachers chose to manage classroom seating arrangements. The PP1 teacher at Case School 3 reflected that they tended to position learners at either side of the teacher’s desk when two devices were in use, commenting that “I have two wings, I just fly very smoothly!” (CS3-PP1 Reflection Matrix, day 4). However, at the cross-school discussions, one teacher expressed a preference for separating the learners to stop them communicating with each other, as it “helps ease management” (Reflection Workshop).

4.3. Phase 3: A/B/C Software Testing at Scale

4.3.1. Design Conjecture and Iteration

Following the Phase 2 analysis workshop with teachers, as well as a second iteration workshop to discuss the results with the implementing partner (Major et al., 2024), it was decided that the priority focus for Phase 3 should be enhancing equal use of the DPL tool amongst learners. This built directly on the results from Phase 2, which indicated that, although some teachers were developing strategies to manage their distribution of the device, there was an overwhelming trend of selecting “fast learners” as the primary users of the DPL tool (theme 2c).
Phase 3, therefore, focused on designing and testing an iteration of the DPL tool that could support teachers’ equal distribution of the tool amongst learners. A review of the literature suggested that designing a learning analytics dashboard could be an effective approach. Evidence suggests that, through presenting data collected by the DPL tool to teachers, such dashboards can potentially support pedagogical decision making (Verbert et al., 2020), prompt differentiated instructional approaches (Hase & Kuhl, 2024), and adjust feedback to suit individual needs (Knoop-van Campen et al., 2023)—although pedagogical action is not always achieved (Molenaar & Knoop-van Campen, 2019). Nonetheless, this literature supported the conjecture that teacher-facing dashboards displaying learners’ usage of the DPL tool might influence pedagogical practice and, thus, be a digital means of supporting enhanced equitable use of the DPL tool.
A software iteration of the DPL implementation model was, therefore, designed (as presented in Table 9), comprising a usage data dashboard that teachers could access on the DPL tool via the teacher interface. The design of the dashboard involved two preliminary stages of iteration conducted by the DPL provider: alpha testing (involving teacher interviews and observations) and beta testing (involving teacher feedback and an analysis of app data). This took place prior to the software iteration being rolled out to the full Phase 3 sample to mitigate any technical issues.
The dashboard itself comprised three pages, visualised in Figure 2: first, the “progress” page, displaying total weekly class usage of the DPL tool; second, the “learner usage” page, displaying an individual breakdown of learner usage per week, in descending order; and third, the “learner profile” page, dividing individual learner usage per week by language and numeracy and providing a shortcut button, taking teachers directly to personalised learning content for that individual to practise. Additionally, “onboarding” messages were designed (displayed in the blue “learner details” box at the top of each screen), providing explanatory text to help teachers make sense of the presented information.
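The aggregations behind these three pages can be illustrated with a short pandas sketch; the event-log structure and column names are assumptions, as EIDU’s internal data model is not described here.

```python
# Sketch of the aggregations behind the three dashboard pages, using pandas.
# The event-log shape, column names, and values are illustrative assumptions.
import pandas as pd

events = pd.DataFrame({
    "learner": ["amina", "amina", "brian", "chloe", "brian"],
    "subject": ["numeracy", "language", "numeracy", "language", "language"],
    "minutes": [12, 8, 15, 5, 10],
})

# "Progress" page: total weekly class usage
total_usage = events["minutes"].sum()

# "Learner usage" page: per-learner weekly usage, in descending order
per_learner = events.groupby("learner")["minutes"].sum().sort_values(ascending=False)

# "Learner profile" page: one learner's weekly usage split by subject
profile = events[events["learner"] == "brian"].groupby("subject")["minutes"].sum()

print(total_usage)
print(per_learner)
print(profile)
```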

4.3.2. Methods: Sample, Data Collection, and Analysis

While Phases 1 and 2 employed mixed methods, conducted in schools in close collaboration with teachers, Phase 3 adopted a quantitative approach of A/B/n testing. This method is used for large-scale software evaluation, providing high ecological validity by conducting experiments to evaluate digital learning tools within authentic instructional contexts (Friedberg, 2023; Ritter et al., 2022; Savi et al., 2018). It was, therefore, deemed an appropriate method for Phase 3, considering that the iteration of the DPL implementation model was primarily a change to software.
An A/B/C test was conducted in Term 1 2024 (3 months of implementation between January and March) to assess the impact of the data dashboard on the equitable use of the DPL tool. The sample for this test comprised 366,906 pre-primary learners from 4151 schools across five Kenyan counties. Each school was randomly assigned to one of the following three partitions: “No Dashboard” control partition (i.e., no iteration of the existing implementation model); “Dashboard” partition, where access to the dashboard was provided via the teacher interface on the DPL tool; and “Onboarding Dashboard” partition, for which access was the same as the “Dashboard” partition, with the addition of the onboarding messages.
To measure the comparative impact of the three partitions, anonymous learner usage data were collected through the tool before and during the A/B/C test. Two datasets were defined: post-test data, including learner usage data collected during the three months of the A/B/C test; and pre-test data, comprising six months of usage data from learners in the same sample group who had also used the DPL tool before the A/B/C test, in Term 3, 2023 (a total of 253,184 learners from 3857 schools). The data were first cleaned, removing schools that logged zero hours of usage. The data were then aggregated at the school level in two ways: first, the sum of monthly learner usage was calculated per school (in hours) for both the pre- and post-test datasets to determine the effect of dashboards on school-level usage; second, learners with the top 10% of total usage per school across the A/B/C test were identified in the post-test data, and the sum of the top 10% usage per school was calculated to investigate equality of usage (i.e., whether the top 10% of users skewed the average use). Analysis of the aggregated data was conducted using Python in Jupyter Notebook. First, mixed-effects models were fitted to determine whether any differences in monthly school usage were detected amongst schools in each partition, both pre-test and post-test. Second, ANOVA was used to determine whether any differences in the top 10% total usage were detected amongst schools in each post-test partition. While the test did not measure observed changes in teachers’ pedagogical practice, any statistically significant difference in post-test usage that was not detected in pre-test usage was taken to infer a change in the way that teachers were choosing to distribute the DPL tool to learners as a result of viewing the dashboards.
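A sketch of this analytical pipeline, assuming the aggregated data are held in data frames with columns for school, partition, and usage hours, might look as follows; statsmodels and scipy are used here as one plausible Python implementation of the described analysis, and the file and column names are hypothetical.

```python
# Sketch of the Phase 3 analysis as described in the text. File names,
# column names, and data frame shapes are illustrative assumptions.
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import f_oneway

# One row per school per month, with the school's assigned partition
df = pd.read_csv("school_monthly_usage.csv")   # hypothetical file

# Cleaning: remove schools that logged zero hours of usage overall
school_totals = df.groupby("school_id")["hours"].transform("sum")
df = df[school_totals > 0]

# Mixed-effects model: partition as fixed effect, school as random intercept
model = smf.mixedlm("hours ~ partition", data=df, groups=df["school_id"])
result = model.fit()
print(result.summary())

# Equality proxy: per-school sum of the top 10% of learner usage, by partition
top10 = pd.read_csv("school_top10_usage.csv")  # hypothetical file
groups = [g["top10_hours"].values for _, g in top10.groupby("partition")]
f_stat, p_value = f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")
```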

4.3.3. Results

The analytical strategy in Phase 3 naturally grouped the results into two thematic areas: the impact of the usage data dashboard on overall DPL usage (theme 3a); and the impact of the usage data dashboard on equality of use of the DPL tool (theme 3b).
The results indicate that the Onboarding Dashboard had a statistically significant impact on learner usage (theme 3a). As presented in Table 10, mean monthly school usage decreased for every partition from pre-test to post-test. However, mixed-effects models (Table 11) revealed that, while there was no significant difference amongst the partitions pre-test, the Onboarding Dashboard partition had significantly higher total monthly usage post-test. This implied that a change to pedagogical practice had taken place: the fact that a significant difference was detected only post-test indicates that teachers in the Onboarding Dashboard partition had changed their practices in some way that affected the levels of usage of the DPL tool in their schools. The absence of a significant difference between the Dashboard and No Dashboard partitions further indicates that the onboarding messages were a key factor in facilitating uptake and behavioural change as a result of viewing the usage data dashboard.
Although the A/B/C test indicates that teachers in the Onboarding Dashboard partition had altered their distribution of the DPL tool in a way that increased usage in their school compared to the other partitions, the mixed-effects models comparing partition differences could not indicate whether this increase in usage was achieved equally across learners. Since the results from previous DBR phases suggest that teachers were favouring “fast learners” when distributing the device, it was important to investigate whether the Onboarding Dashboard was exacerbating unequal usage of the tool or reducing it. A proxy measure was, therefore, used to explore this: comparing the sum of total usage from the top 10% of users within each school (aggregated by partition) to determine whether there was a significant difference amongst partitions (theme 3b). Table 12 presents the mean and standard deviation of the sum of the top 10% total post-test usage per school for each partition. An ANOVA test revealed no significant differences amongst the partitions (F = 1.57, p = 0.208). While this was a proxy measure, and over-interpretation should therefore be avoided, it could suggest that the overall increase in post-test usage observed in the Onboarding Dashboard partition compared with the other post-test partitions was not a result of the top 10% receiving the tool significantly more often. Further observational research is required to investigate this in more detail and to determine whether the pedagogical change inferred as a result of the Onboarding Dashboard serves to enhance equal use of the DPL tool amongst learners.

4.4. Phase 4: Mixed-Methods Hardware Pilot

4.4.1. Design Conjecture and Iteration

While Phase 3 tested a software-based iteration of the implementation model, a second potential avenue for improving equal access to the DPL tool was a hardware-related iteration: increasing the number of devices in the classroom. This change to the implementation model was suggested frequently by teachers and stakeholders during Phase 1 (theme 1c), and the addition of a second device was also positively received during the Phase 2 lesson study (theme 2d). However, evidence from other education technology initiatives suggests exercising caution when increasing the amount of hardware in the classroom. For instance, the 2023 Global Education Monitoring Report highlighted that inappropriate or excessive use of technology can have a detrimental impact on learning (UNESCO, 2023), while research on a DPL tool used in European schools has highlighted that the logistics of managing DPL in the classroom (including charging devices and supporting logins) can increase teacher workload (Outhwaite et al., 2023).
Therefore, our conjecture for Phase 4 of the DBR was that increasing the device-to-learner ratio may improve equitable use amongst learners (due to increased access to the DPL tool), but that this change could have knock-on effects (positive, negative, or a combination of both) on classroom management. The proposed iteration of the implementation model (presented in Table 13) was to introduce a 10:1 learner-to-device ratio per classroom, accounting for variation in class size. Anticipating that the increased number of devices might increase noise and distraction in the classroom, headphones were also introduced to support learners’ concentration during DPL tool use and to reduce disruption to other classroom learning.
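In practical terms, the 10:1 learner-to-device ratio translates into a simple provisioning rule; the helper below is our own illustration (not part of the intervention) of how device counts could be derived while accounting for class-size variation.

```python
# Illustrative provisioning rule for a 10:1 learner-to-device ratio.
import math

def devices_needed(class_size: int, learners_per_device: int = 10) -> int:
    """Round up so that every classroom receives at least one device."""
    return max(1, math.ceil(class_size / learners_per_device))

for size in (17, 33, 48):
    print(f"{size} learners -> {devices_needed(size)} devices")  # 2, 4, 5
```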

4.4.2. Methods: Sample, Data Collection, and Analysis

A mixed-methods approach was designed to investigate the implications of increasing the quantity of technology hardware in classrooms. Data collection took place in Nairobi in 2024 as part of the pilot implementation of the DPL tool in low-cost private primary schools (LCPPSs)—the first time this had been researched in lower-primary classrooms, as opposed to pre-primary (see Section 3.3).
Two samples were involved. First, ten pilot schools were purposively sampled from a longlist of schools that had signed up to receive the DPL tool, using the following inclusion criteria: schools with separate Grade 1 and Grade 2 classes (excluding mixed-grade classes) and with a minimum of 17 learners per grade. One teacher per grade per school was involved in the research in March 2024, two months after the schools had first received the DPL tool. Four research protocols were developed, building on data collection approaches from Phases 1 and 2: a questionnaire to understand when during the day, and to which learners, the teachers delivered the tool; a workshop and four focus group discussions (1a, 1b, 2a, and 2b), facilitated by two British and two Kenyan researchers, to understand teachers’ opinions on the tool and how they used it; and classroom observations (by the same four researchers) to understand how the tool was being integrated into classrooms. The qualitative data were cleaned, deductively and inductively coded in MAXQDA and Google Sheets, and thematically analysed. Quantitative data were cleaned, and descriptive statistics were generated using Google Sheets.
The second sample comprised 304 Grade 1 and Grade 2 teachers at 139 low-cost private primary schools in Nairobi. These schools were randomly selected from the full list of schools that had signed up to receive the DPL tool. Implementation began in May 2024, following the initial pilot in 10 schools. Data collection—which took place five months later, in October 2024—comprised a multiple-choice survey, including questions (developed based on the results from the 10 pilot schools) about the technology hardware and teachers’ perceptions of the DPL tool.

4.4.3. Results

A thematic analysis of the full set of results from Phase 4, alongside the DBR study’s two sub-research questions, identified three overarching themes: one relating to the pedagogical role of the DPL tool in supporting learning (4a. Teachers’ conceptualisations of the purpose of the DPL tool) and two relating to the pedagogical role of teachers in relation to the DPL tool (4b. Impact of the increased device-to-learner ratio on teachers’ distribution of the DPL tool and 4c. Impact of introducing headphones on teachers).
Teachers’ conceptualisation of the DPL tool (theme 4a) echoed many of the results from previous DBR phases, with a few additions. During the workshop, the teachers collaboratively identified four key purposes for the tool: a means for learners to acquire knowledge and skills; an aid for classroom management; a motivation for learners to complete other activities; and to support independent learning. While the first two were also results that emerged during Phases 1 and 2 (themes 1b and 2a), the concepts of motivation and independent learning appeared more prominently in this dataset. Teachers in all FGDs also agreed that the DPL tool was well aligned with the CBC.
Strikingly, the increased device-to-learner ratio did not appear to change teachers’ approach to determining which learners would use the tool and at what time during the school day (theme 4b). As in Phase 2 (theme 2c), the results from Phase 4 revealed teachers’ tendency to select “fast learners” as the primary users of the DPL tool. The vast majority of teachers (16 of 19) indicated in the questionnaire that “fast learners” have more access to the DPL tool than other learners in their classes. In contrast, only one teacher (Grade 2) said that “time-takers” used the tool more. Further questionnaire results indicate that this is due to the time at which these two learner groups tend to receive the DPL tool: teachers frequently referenced “fast learners” in relation to distributing the DPL tool during lessons, on the grounds that these learners had finished their tasks first; conversely, in all but two references to “time-takers”, teachers stated that they would share the tool with this group during breaktimes and lunch, explaining either that this gave these learners a chance to access the tool or that it would help them to learn. The teachers in FGDs 1a and 2a corroborated this by repeating that they specifically give the tool to “fast learners” once they have completed their work. Considering the increased number of devices in the classroom, and thereby the increased opportunities for learners to use the DPL tool, it was notable that teachers had adopted a distribution approach similar to that observed in earlier phases.
The results related to the introduction of headphones were mixed. A notable theme that emerged (4c) related to how this affected teachers’ own role in the classroom. Teachers in FGDs 1b and 2a indicated that, although they could not hear the instructions provided by the DPL tool due to learners’ use of the headphones, this did not affect their ability to support learners in their use of the DPL tool. The one exception was a teacher in FGD 2a, who remarked that they found it more difficult to support learners with special educational needs and disabilities as a result of the headphones. Nonetheless, challenges in implementing the headphones were reported. A recurrent theme in FGDs 1a, 2a, and 2b was that the headphones were easily broken, and, in FGD 1a, a teacher elaborated that learners were finding it difficult to manipulate the headphone jack. Quantitative data partially corroborated this: during classroom observations, researchers recorded 44 instances in which learners exhibited difficulty using the headphones, compared with 52 in which learners used the headphones with ease. Similarly, in the survey conducted with 304 teachers, 49% (149) reported that at least one set of headphones in their classroom was broken after five months of implementation, compared with 51% (155) who responded that all headphones were still working. These results, therefore, suggest that teachers supported the purpose of the headphones but that problems existed with the durability of the hardware itself.
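As an aside on reporting such simple proportions, a confidence interval conveys the uncertainty around the 49% breakage figure; the sketch below is our own illustration using statsmodels, not an analysis performed in the study.

```python
# Proportion of teachers reporting at least one broken headphone set
# (149 of 304), with a 95% Wilson confidence interval.
from statsmodels.stats.proportion import proportion_confint

broken, total = 149, 304
low, high = proportion_confint(broken, total, alpha=0.05, method="wilson")
print(f"{broken / total:.1%} (95% CI: {low:.1%} to {high:.1%})")  # c. 49% (c. 43% to 55%)
```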

4.5. Phase 5: Comparative Structured Pedagogy Pilot

4.5.1. Design Conjecture and Iteration

The final phase of the DBR concerned a dimension of EIDU’s classroom-integration model that had not previously been investigated: the pedagogical implications of aligning the DPL tool with digitised lesson plans. As outlined above (see Section 2.3), the lesson plans are mapped to the CBC in the same way as the DPL units, enabling the tool to suggest the DPL content most relevant to that day’s lesson delivery. An A/B test conducted between January and April 2023 found that allowing teachers to override the system-recommended content and select a different substrand had a statistically significant positive impact on learner device usage and digital unit scores, suggesting the potential of giving teachers agency to feed into the personalisation process (Sun et al., 2024a). However, this A/B test did not investigate the factors that influence teachers’ decision making about substrand selection—why they select specific substrands for learners to engage with, especially when this deviates from the system-recommended content.
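The alignment-and-override mechanism described here can be summarised in pseudocode-like Python; all names below are illustrative assumptions, not EIDU’s implementation.

```python
# Hedged sketch of substrand alignment with teacher override: lesson plans and
# DPL units share a CBC substrand mapping, so the tool can suggest the
# substrand matching the day's lesson while allowing the teacher to override.
from dataclasses import dataclass
from typing import Optional

@dataclass
class LessonPlan:
    substrand: str    # CBC substrand the lesson is mapped to
    completed: bool   # whether the teacher marked the lesson as delivered

def select_substrand(plan: Optional[LessonPlan],
                     teacher_choice: Optional[str] = None) -> str:
    """Return the substrand the DPL tool should serve next."""
    if teacher_choice is not None:
        return teacher_choice        # teacher agency: an override always wins
    if plan is not None and plan.completed:
        return plan.substrand        # default: align with the completed lesson
    return "adaptive_default"        # otherwise fall back to adaptive sequencing

print(select_substrand(LessonPlan("Numbers 1-10", completed=True)))  # Numbers 1-10
print(select_substrand(None, teacher_choice="Measurement"))          # Measurement
```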
This phase of the DBR was, therefore, designed on the conjecture that teachers’ agency to decide which substrand is selected on the DPL tool determines the way in which the tool is aligned with classroom practice. The results from Phase 1 indicate that teachers viewed the alignment of the tool with the CBC as a key purpose of the tool (theme 1b). The aim of Phase 5 was to understand how this is operationalised in practice, investigating teachers’ pedagogical reasoning behind substrand choice, both when they were provided with digitised lesson plans and when these plans were not available. This formed the basis for the Phase 5 iteration of the implementation model (Table 14): one group received digitised lesson plans aligned to the DPL content, while the other received only the DPL functionality.

4.5.2. Methods: Sample, Data Collection, and Analysis

Two sample groups were selected for this phase of the DBR. The first comprised 20 teachers from the same pilot schools involved in Phase 4, who had received only the DPL functionality during the pilot in low-cost private primary schools in Nairobi in Term 1, 2024 (Pilot Group 1). The second comprised 25 teachers from a further 10 schools, randomly selected from the same sign-up list, for which implementation began in May 2024 (Pilot Group 2). These schools received access to EGM lesson plans as part of the DPL tool, as well as training and support on lesson delivery from EIDU officers as part of the structured pedagogy approach.
FGDs and interviews were conducted with the two sample groups at different points: with Pilot Group 1 in March 2024, at the end of a workshop with teachers (after 3 months of implementation), and with Pilot Group 2 in September 2024, at the end of a day of classroom observations (after 4–5 months of implementation). During both sets of semi-structured data collection, teachers were asked to explain their reasoning for how they selected the DPL substrand. The qualitative data were cleaned and analysed thematically using MAXQDA and Google Sheets.

4.5.3. Results

The Phase 5 results were all grouped under one overarching theme (5a): teachers’ reasoning for substrand selection on the DPL tool. The results from the two sample groups were considerably different. In Pilot Group 1 (which received only the DPL tool, with no lesson plans), there was little uniformity in the data. Teachers in FGDs 1a and 2a stated that they chose the substrand based on the topic of that day’s lesson. Linked to this reasoning, one teacher in FGD 1a reported that they would not select a substrand that had not yet been covered in class. Teachers in FGDs 1a and 2b also listed learners’ ability as a reason, selecting areas where learners had shown weaknesses. In this latter example, the selection of an appropriate substrand related more to personalising the tool to learners’ needs than to aligning it with lesson delivery.
In contrast, teachers in Pilot Group 2, who received the EGM lesson plans as part of the DPL tool, were far more consistent in their decision making. All but 3 of the 20 teachers interviewed reported choosing the system-recommended DPL content after completing that day’s lesson plan. The exceptions were one teacher who chose not to distribute the tool on that specific day; another who selected a similar topic because the lesson they had taught was not available; and a third who had failed to mark the lesson plan as complete, meaning that they had to override the system-recommended substrand and select the correctly aligned one. This research amongst pilot schools therefore suggested that teachers’ pedagogical reasoning for selecting substrands on the DPL tool may be influenced by the provision of digital lesson plans, resulting in closer alignment of the DPL content with whole-class instruction. However, further research is required to determine whether such alignment with whole-class instruction is the optimal approach to support learning outcomes or whether a personalised approach to content selection is more effective.

5. Discussion

The aim of this discussion is twofold: to reflect on the comparative limitations and advantages of the reported DBR approach; and to consider the results from all five phases, drawing out key emergent pedagogical implications for integrating such a DPL tool into early-grade classrooms. It is underpinned by the study’s theoretical framework of intermediate theory building, the aim of which is to bridge the gap between practice and theory. As such, the discussion is framed in a way that considers the alignment (or lack thereof) of pedagogical theories and practices observed within Kenyan classrooms with the existing literature on DPL.

5.1. Limitations of the DBR Approach

This study adopted a broad approach to DBR, in which each phase of the DBR focused on a different dimension of implementation, as opposed to iterating on the same design feature. The limitations of this adopted approach are recognised—in particular, it did not allow for the repeated refinement of a single design feature to optimise a specific educational outcome, as is more common in standard DBR approaches (Anderson & Shattuck, 2012; Bakker, 2019; Smørdal et al., 2021). The opportunities afforded by the adopted approach are instead reflected on below (Section 5.2).
The varied sample, comprising a different number of teachers in each phase and not involving the same teachers throughout, also limited the potential outcomes of the DBR, as the design iterations could not be tested in a consistent educational context. It also limited the extent to which specific teachers could act as “co-inquirers” throughout the research process, though steps were taken to facilitate teachers’ review of the analysed data.
A third limitation of the research relates to the context in which it was conducted. Since the DBR was focused on an existing DPL intervention, the researchers were iterating a tool that was already embedded in schools. This form of “real-world” implementation research is inherently messier than alternative approaches in which researchers themselves implement and control the conditions of an intervention. The lack of controlled conditions may, therefore, have led to variation across the sample in teachers’ and learners’ levels of engagement with or utilisation of the DPL tool. Furthermore, resource constraints, including challenges with stable electricity for charging the devices, may have hindered some schools’ capacity to engage fully with the intervention. Support provided by the implementation partner, EIDU, may have helped to mitigate some of the more pressing challenges, through the provision of initial training to teachers and ongoing technology support. However, variation in the ways teachers were able to integrate the DPL tool into their classrooms was not considered a limitation of the study but rather the central focus of the DBR—the benefits of which are now presented.

5.2. Opportunities from the DBR Approach

The first opportunity afforded by the DBR was that it was grounded in the operational reality of implementation, rather than producing initial conjectures built purely from theory or from empirical results obtained in a different educational context. Phase 1 was intentionally designed as a “foundational” phase in which no changes to the existing implementation (see Table 5) were made. Instead, a focus was placed on understanding teachers’ experiences and perspectives on integrating the DPL tool into their classroom practice—including any variation in implementation and the factors that influenced this—which, in turn, provided a basis for subsequent DBR phases. As such, the design conjectures and iterations that followed built directly on existing results: Phase 2 on themes 1a, 1c, and 1d; Phase 3 on theme 2c; Phase 4 on themes 1c and 2d; and Phase 5 on theme 1b. While the DBR design was, therefore, not iterative in a traditional sense, it did incrementally build on prior results—further supported by workshops with the implementation partner held between phases, during which the analysed results were discussed and fed into the ongoing implementation model.
The second benefit of this broad approach to DBR was that, by focusing on a different aspect of the DPL tool in each phase, it took into account the multidimensionality of classroom-integrated DPL. Phases 2–5 each investigated a different dimension of implementation: classroom management (Phase 2, focusing on when, by whom, and with how many devices the DPL tool is used in the classroom); software (Phase 3, designing usage data dashboards); hardware (Phase 4, increasing the number of devices and introducing headphones); and pedagogy (Phase 5, exploring the effect of aligning DPL with digitised lesson plans on teachers’ pedagogical decision making). This approach was, therefore, appropriate for a study of DPL, considering the complex and multilayered nature of DPL and the need to consider how different dimensions of the learning environment can support individual learning needs (Schmid et al., 2022; Bernacki et al., 2021; Zhang et al., 2022). While further research is required to investigate the interdependent relationships amongst these components and thereby optimise their impact on learning outcomes, this study provided a valuable opportunity to investigate the implementation of integrated DPL in a way that attended to its multidimensional nature.

5.3. The Pedagogical Role of the DPL Tool in Supporting Learning: A Chasm Between DPL Theory and Pedagogical Practice

Although an underpinning aim of this study was to build a bridge between educational theory and practice, the results instead highlight a chasm between the prevailing literature about DPL and the teachers’ own conceptualisation of the purpose of the tool they were integrating into their classrooms. Theories about DPL build on a history of educational approaches in which teachers adapt instruction to provide differentiated and individualised instruction, addressing a traditional but oft-criticised “one-size-fits-all” approach (Bernacki et al., 2021; Bhutoria, 2022). As such, conceptual frameworks of DPL focus primarily on its adaptive dimensions. Van Schoors et al.’s (2021) systematic review of conceptual and methodological trends in the literature underscored this, identifying the six most common elements in research descriptions of DPL, all relating to the technical, adaptive nature of such tools (technology; personalisation; personalisation target; personalisation sources; personalisation method; and personalisation aim).
In contrast, across all three years of the DBR, participating teachers’ conceptualisations of DPL repeatedly focused on its “classroom-integrated” nature rather than its potential to facilitate differentiated learning opportunities. When describing the purpose of the DPL tool, teachers identified the benefits of its alignment with the CBC (themes 1b and 4a) and its use as an aid to classroom management (themes 2a, 2c, and 4a)—dimensions of DPL related more to the classroom environment in which it is implemented than to the digital learning environment.
This gap between researchers’ and teachers’ conceptualisations of DPL is understandable. Teachers do not see the digital personalisation that takes place within the DPL software, and therefore this aspect of the tool is unlikely to be foremost in their conceptualisation of it. This is not to imply that personalisation is not a critical component of the impact of DPL tools on learning outcomes; an A/B/C test conducted in parallel to this study revealed that the personalisation of DPL content sequencing via an LSTM-based algorithm can have a statistically significant positive impact on certain formative assessment outcomes compared to an expert-curated sequencing approach (Sun et al., 2024c). However, this chasm between academic (i.e., theoretical) and pedagogic (i.e., practical) perspectives has implications for the way in which the design and research of DPL tools are conducted—especially if such tools are to be classroom-integrated.
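For readers unfamiliar with such sequencing models, the sketch below illustrates, in heavily simplified form, what an LSTM-based next-unit recommender can look like; it is our own PyTorch illustration with hypothetical dimensions, not the algorithm evaluated by Sun et al. (2024c).

```python
# Simplified illustration of LSTM-based content sequencing: the model embeds a
# learner's history of unit IDs and scores candidate next units.
import torch
import torch.nn as nn

class NextUnitRecommender(nn.Module):
    def __init__(self, n_units: int, emb_dim: int = 32, hidden: int = 64):
        super().__init__()
        self.embed = nn.Embedding(n_units, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_units)   # one score per candidate unit

    def forward(self, unit_history: torch.Tensor) -> torch.Tensor:
        x = self.embed(unit_history)              # (batch, seq_len, emb_dim)
        _, (h, _) = self.lstm(x)                  # final hidden state
        return self.head(h[-1])                   # (batch, n_units) logits

model = NextUnitRecommender(n_units=500)
history = torch.randint(0, 500, (1, 12))          # one learner, 12 past units
print(int(model(history).argmax(dim=-1)))         # index of the recommended unit
```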
Evidence has suggested that teachers’ openness to change and motivation are critical prerequisites for the successful integration of technology into classrooms (Baylor & Ritchie, 2002; Backfisch et al., 2021; Tondeur et al., 2017). This is further supported by the framework of technology integration offered by Brickner (1995, as cited in Ertmer et al., 1999), which distinguishes between two types of barriers: first-order barriers, which are extrinsic to teachers, such as a lack of access to technology or inadequate support; and second-order barriers, which are intrinsic, including teachers’ beliefs about teaching, technology, and established classroom practices. This framework emphasises that, as well as tackling prevalent first-order barriers to technology integration in LMICs, such as the cost of acquiring and maintaining hardware in low-connectivity settings (Angrist et al., 2023), it is critical to consider second-order barriers related to teachers’ own beliefs. If classroom-integrated DPL tools are not designed in a way that maximises their perceived usefulness and ease of use for teachers, then this may have a negative effect on DPL implementation.

5.4. The Pedagogical Role of Teachers in Relation to the DPL Tool: Altering Teaching Practices to Personalise DPL Implementation

The centrality of the teacher to integrating the DPL tool into the classroom emerged consistently across the DBR results. The overarching themes indicated that teachers can not only provide tailored support in parallel to the DPL tool (themes 2b and 4c) but are also pivotal to how the DPL tool is distributed (themes 1b, 2c, 2d, 3a, 3b, and 4b) and aligned to classroom practice (theme 5a). This aligns with definitions of “integrated” DPL, which emphasise the coexistence of teacher and technology (Major & Francis, 2020) and the additional actions taken by teachers before, during, and after the learning activity in the DPL tool (Van Schoors et al., 2021).
Strikingly, the results indicate that the teacher’s role in implementing an integrated DPL model contributes to the process of personalisation itself. Three examples of this were observed: (a) teachers personalising which learners had access to the DPL tool, overriding the randomly ordered list of profiles (themes 1d, 2c, and 4b); (b) teachers personalising the selection of DPL content by confirming or overriding the system-suggested substrand (theme 5a); and (c) teachers adapting their own pedagogical approach in light of feedback from the usage data dashboard (themes 3a and 3b). These results have important implications for the way in which DPL frameworks are theorised. Our DBR has indicated that, in the case of classroom-integrated DPL, there can be multiple personalisation models that interact: both personalisation within the digital DPL environment (in this case, adapting content sequencing via an LSTM neural network) and personalisation of how the DPL tool is implemented within the classroom environment (see also Daltry et al., in press). As such, it is important that frameworks of personalisation incorporate not only the software’s adaptive design but also the dimensions of personalisation that take place through the DPL tool’s implementation.
The teacher’s role in personalising DPL implementation also links to important considerations around education access and equality. Across all phases of the DBR, equal distribution of the DPL tool emerged as one of the most persistent implementation challenges (themes 2c, 3b, and 4b). Regardless of the sample group or education level, teachers repeatedly identified their tendency to distribute the DPL tool to “fast learners” first and, often, more frequently than to other learners—especially those termed “time-takers”. Although this study designed and tested mitigation strategies to address the risk of unequal use (Phases 3 and 4), further research is required to optimise these approaches. This is especially important in LMIC contexts where low device-to-learner ratios are employed to reduce the financial burden of implementing DPL technology, as it adds a further dimension of equitable access for teachers to consider.
The results underline the critical importance of considering pedagogy when designing and implementing classroom-integrated DPL. Introducing technology into classrooms necessarily requires teachers to alter their teaching practices (Baylor & Ritchie, 2002) and to be equipped with the knowledge, skills, and dispositions required for successful implementation (Kelly, 2008). As such, supporting teachers’ pedagogical decision making is an important consideration for classroom-integrated DPL approaches. The literature on ongoing teacher professional development (TPD) suggests not only that it can be an effective approach to support teachers in developing technology implementation strategies (Bai, 2019; Outhwaite et al., 2023) but that a lack of appropriate, sustained support is a major obstacle to effective technology use in the classroom (Hennessy et al., 2022; De Melo et al., 2014). Recent research also suggests that promoting familiarity and frequency of use of technology in classrooms through training may support teachers to change their methods and support more complex, student-centred integration of technology (Pérez Echeverría et al., 2025). This is of particular importance in LMIC contexts such as Kenya, where infrastructure and technical challenges present barriers to delivering learner-centred digital education (Kerkhoff & Makubuya, 2022). Approaches to scaffolding and supporting teachers’ implementation of technology in their classrooms should, therefore, be central considerations in integrated DPL models, providing teachers with the foundations from which to make contextually appropriate pedagogical decisions.

6. Conclusions

This paper reports on three years of DBR in Kenyan classrooms—a rare opportunity to present and reflect on multiple phases of research over a sustained period of time. It addresses an important research gap by providing evidence on the implementation of a classroom-integrated DPL tool in an LMIC context, giving specific priority to understanding teachers’ perspectives in order to bridge the gap between theory and practice and investigate DPL through a pedagogical lens.
The breadth of this paper, reporting on five phases of DBR, is both a limitation and an asset. The diverse range of methods and quantity of data collected necessarily required the results of each research phase to be presented as a high-level summary, leaving limited space to elaborate on some of the nuances and contextual insights. However, presenting the overarching themes from five phases of DBR allowed for a richer discussion of the multifaceted nature of integrated DPL and the implications of this for theory and practice.
Overall, this study demonstrates that the pedagogical dimensions of integrated DPL in an LMIC setting cannot be overlooked. These may have implications not only for the effective uptake and implementation of DPL by teachers but also for the process of personalisation itself, both within the digital and classroom environments. Prioritising the co-design of DPL with teachers and drawing on their pedagogical perspectives to enhance integrative approaches are, therefore, strongly recommended for future research and practice.

Author Contributions

Conceptualisation, R.D. and L.M.; Methodology, R.D., J.H., C.S., L.M., M.O. and K.O.; Formal Analysis, R.D., J.H., C.S., M.O. and K.O.; Investigation, R.D., J.H., M.O. and K.O.; Data Curation, R.D., J.H. and C.S.; Writing—Original Draft Preparation, R.D., J.H. and C.S.; Writing—Review and Editing, L.M., M.O. and K.O.; Visualisation, R.D. and C.S.; Supervision, R.D. and L.M.; Project Administration, R.D.; Funding Acquisition, L.M. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by three donors: the UK Foreign, Commonwealth & Development Office (FCDO); the Gates Foundation; and Founders Pledge. Funding from the latter two donors was received through the technology partner; however, the EdTech Hub research team retained full independence, management, and ownership.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Scientific and Ethics Review Unit of the Kenya Medical Research Institute (KEMRI: Non-KEMRI Protocol No. 4444 on 27 May 2022 and Non-KEMRI Protocol No. 4931 on 7 March 2024) and by the National Commission for Science, Technology and Innovation (NACOSTI: No. NACOSTI/P/22/17399 on 29 May 2022 and No. NACOSTI/P/24/34300 on 4 April 2024).

Informed Consent Statement

Informed consent for participation was obtained from all adult participants in the study. “Gatekeeper” consent was obtained from teachers for all learners taking part in classroom observations during the study, while headteachers were provided with detailed information sheets via which parents and caregivers were informed of the study and provided with opt-out details. Additionally, local authority approval and teachers’ gatekeeper consent was obtained by EIDU to collect digital data as part of the provision of the DPL tool in classrooms. These data were then anonymised locally on the digital device, uploaded in anonymous form to EIDU’s servers, and then shared with researchers for secondary analysis under a comprehensive data sharing agreement.

Data Availability Statement

The anonymised data supporting the conclusions of this article can be made available by the authors upon reasonable request.

Acknowledgments

The authors are immensely grateful to participating teachers, learners, research partners, and other education stakeholders for their invaluable contributions to this research. We acknowledge the input of many colleagues at EdTech Hub and Women Educational Researchers of Kenya, particularly Tom Kaye, Katy Jordan, Ciku Mbugua, Rabia Tanweer, Daniel Plaut, Asad Rahman, and Nariman Moustafa during earlier stages of the project. Further thanks go to David Hollow and Jan Sequeira for their advisory support. The research would not have been possible without the partnership with colleagues at EIDU, with special acknowledgement given to Aidan Friedberg, Nina Bolte, and Joseph Gatonye.

Conflicts of Interest

The authors declare no conflicts of interest. While the technology partner was consulted throughout, given the nature of the research, the research team retained final decision-making authority over the study design; the collection, analysis, and interpretation of data; the writing of the manuscript; and the decision to publish the results.

Abbreviations

The following abbreviations are used in this manuscript:
CBC: Competency-based curriculum
DBR: Design-based research
DPL: Digital personalised learning
ECD: Early childhood development
EGM: Early Grade Mathematics
FGD: Focus group discussion
KII: Key informant interview
LCPPS: Low-cost private primary school
LMIC: Low- and middle-income country
LSTM: Long Short-Term Memory

References

  1. Agyei, D. D. (2021). Integrating ICT into schools in Sub-Saharan Africa: From teachers’ capacity building to classroom implementation. Education and Information Technologies, 26(1), 125–144. [Google Scholar] [CrossRef]
  2. Aleven, V., McLaughlin, E. A., Glenn, R. A., & Koedinger, K. R. (2017). Instruction based on adaptive learning technologies. In R. E. Mayer, & P. Alexander (Eds.), Handbook of research on learning and instruction (2nd ed., pp. 522–560). Routledge. Available online: https://www.cs.cmu.edu/afs/.cs.cmu.edu/Web/People/aleven/Papers/2016/Aleven_etal_Handbook2016_AdaptiveLearningTechnologies_Prepub.pdf (accessed on 5 March 2025).
  3. Alrawashdeh, G. S., Fyffe, S., Azevedo, R. F. L., & Castillo, N. M. (2024). Exploring the impact of personalized and adaptive learning technologies on reading literacy: A global meta-analysis. Educational Research Review, 42, 100587. [Google Scholar] [CrossRef]
  4. Anderson, T., & Shattuck, J. (2012). Design-based research: A decade of progress in education research? Educational Researcher, 41(1), 16–25. [Google Scholar] [CrossRef]
  5. Angrist, N., Aurino, E., Patrinos, H. A., Psacharopoulos, G., Vegas, E., Nordjo, R., & Wong, B. (2023). Improving learning in low- and lower-middle-income countries. Journal of Benefit-Cost Analysis, 14(S1), 55–80. [Google Scholar] [CrossRef]
  6. Backfisch, I., Lachner, A., Stürmer, K., & Scheiter, K. (2021). Variability of teachers’ technology integration in the classroom: A matter of utility! Computers & Education, 166, 104159. [Google Scholar] [CrossRef]
  7. Bai, H. (2019). Preparing teacher education students to integrate mobile learning into elementary education. TechTrends, 63(6), 723–733. [Google Scholar] [CrossRef]
  8. Bakker, A. (2019). Design research in education: A practical guide for early career researchers. Routledge. [Google Scholar]
  9. Baylor, A. L., & Ritchie, D. (2002). What factors facilitate teacher skill, teacher morale, and perceived student learning in technology-using classrooms? Computers & Education, 39(4), 395–414. [Google Scholar] [CrossRef]
  10. Beetham, H., & Sharpe, R. (Eds.). (2013). Rethinking pedagogy for a digital age: Designing for 21st century learning. Routledge. [Google Scholar] [CrossRef]
  11. Bernacki, M. L., Greene, M. J., & Lobczowski, N. G. (2021). A systematic review of research on personalized learning: Personalized by whom, to what, how, and for what purpose(s)? Educational Psychology Review, 33(4), 1675–1715. [Google Scholar] [CrossRef]
  12. Bhutoria, A. (2022). Personalized education and artificial intelligence in the United States, China, and India: A systematic review using a human-in-the-loop model. Computers and Education: Artificial Intelligence, 3, 100068. [Google Scholar] [CrossRef]
  13. British Educational Research Association. (2018). Ethical guidelines for educational research (4th ed.). Available online: https://www.bera.ac.uk/researchers-resources/publications/ethical-guidelines-for-educational-research-2018 (accessed on 1 June 2024).
  14. Buhl, M., Hanghøj, T., & Henriksen, T. D. (2022). Reconceptualising design-based research: Between research ideals and practical implications. Nordic Journal of Digital Literacy, 17(4), 205–210. [Google Scholar] [CrossRef]
  15. Building Evidence in Education. (2015). Assessing the strength of evidence in the education sector. Available online: https://reliefweb.int/report/world/assessing-strength-evidence-education-sector (accessed on 1 June 2024).
  16. Daltry, R., Major, L., & Mbugua, C. (in press). Design and implementation factors for digital personalized learning in low- and middle-income countries. In M. Bernacki, C. Walkington, A. Emery, & L. Zhang (Eds.), Handbook of personalized learning. Routledge. [Google Scholar]
  17. De Melo, G., Machado, A., & Miranda, A. (2014). The impact of a one laptop per child program on learning: Evidence from Uruguay. IZA Discussion Paper No. 8489. Available online: https://docs.iza.org/dp8489.pdf (accessed on 7 February 2025).
  18. Dias, L. B. (1999). Integrating technology. Learning and Leading with Technology, 27(3), 10–21. Available online: https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&doi=a3bd6816200436aed9eca8b63d0891250bec1a23 (accessed on 23 March 2025).
  19. Ertmer, P. A., Paul, A., Molly, L., Eva, R., & Denise, W. (1999). Examining teachers’ beliefs about the role of technology in the elementary classroom. Journal of Research on Computing in Education, 32(1), 54–72. [Google Scholar] [CrossRef]
  20. Fernandez, C., & Yoshida, M. (2004). Lesson study: A Japanese approach to improving mathematics teaching and learning. Routledge. [Google Scholar] [CrossRef]
  21. FitzGerald, E., Kucirkova, N., Jones, A., Cross, S., Ferguson, R., Herodotou, C., Hillaire, G., & Scanlon, E. (2018). Dimensions of personalisation in technology-enhanced learning: A framework and implications for design. British Journal of Educational Technology, 49(1), 165–181. [Google Scholar] [CrossRef]
  22. Friedberg, A. (2023). Can A/B testing at scale accelerate learning outcomes in low- and middle-income environments? In N. Wang, G. Rebolledo-Mendez, V. Dimitrova, N. Matsuda, & O. C. Santos (Eds.), Artificial intelligence in education. Posters and late breaking results, workshops and tutorials, industry and innovation tracks, practitioners, doctoral consortium and blue sky (Vol. 1831, pp. 780–787). Springer. [Google Scholar] [CrossRef]
  23. Groff, J. S. (2017). Personalized learning: The state of the field & future direction. Center for Curriculum Redesign. Available online: https://www.media.mit.edu/publications/personalized-learning/ (accessed on 30 January 2025).
  24. Hanfstingl, B., Rauch, F., & Zehetmeier, S. (2019). Lesson study, learning study and action research: Are there more differences than a discussion about terms and schools? Educational Action Research, 27(4), 455–459. [Google Scholar] [CrossRef]
  25. Hase, A., & Kuhl, P. (2024). Teachers’ use of data from digital learning platforms for instructional design: A systematic review. Educational Technology Research and Development, 72(4), 1925–1945. [Google Scholar] [CrossRef]
  26. Hennessy, S. (2014). Bridging between research and practice: Supporting professional development through collaborative studies of classroom teaching with technology. Brill. Available online: http://www.jstor.org/stable/10.1163/j.ctv29sfq0t (accessed on 2 March 2025).
  27. Hennessy, S., D’Angelo, S., McIntyre, N., Koomar, S., Kreimeia, A., Cao, L., Brugha, M., & Zubairi, A. (2022). Technology use for teacher professional development in low- and middle-income countries: A systematic review. Computers and Education Open, 3, 100080. [Google Scholar] [CrossRef]
  28. Hennessy, S., & Deaney, R. (2009). The impact of collaborative video analysis by practitioners and researchers upon pedagogical thinking and practice: A follow-up study. Teachers and Teaching, 15(5), 617–638. [Google Scholar] [CrossRef]
  29. Hoadley, C., & Campos, F. C. (2022). Design-based research: What it is and why it matters to studying online learning. Educational Psychologist, 57(3), 207–220. [Google Scholar] [CrossRef]
  30. Holmes, W., Anastopoulou, S., Schaumburg, H., & Mavrikis, M. (2018). Technology-enhanced personalised learning: Untangling the evidence. Robert Bosch Stiftung GmbH. Available online: http://www.studie-personalisiertes-lernen.de/en/ (accessed on 18 February 2025).
  31. Kelly, M. A. (2008). Bridging digital and cultural divides: TPCK for equity of access to technology. In P. Mishra, & M. J. Koehler (Eds.), Handbook of technological pedagogical content knowledge (TPCK) for educators. Routledge. [Google Scholar] [CrossRef]
  32. Kerkhoff, S. N., & Makubuya, T. (2022). Professional development on digital literacy and transformative teaching in a low-income country: A case study of rural Kenya. Reading Research Quarterly, 57(1), 287–305. [Google Scholar] [CrossRef]
  33. Knoop-van Campen, C. A. N., Wise, A., & Molenaar, I. (2023). The equalizing effect of teacher dashboards on feedback in K-12 classrooms. Interactive Learning Environments, 31(6), 3447–3463. [Google Scholar] [CrossRef]
  34. Lewis, C., Perry, R., & Murata, A. (2006). How should research contribute to instructional improvement? The case of lesson study. Educational Researcher, 35(3), 3–14. [Google Scholar] [CrossRef]
  35. Lin, L., Lin, X., Zhang, X., & Ginns, P. (2024). The personalized learning by interest effect on interest, cognitive load, retention, and transfer: A meta-analysis. Educational Psychology Review, 36, 88. [Google Scholar] [CrossRef]
  36. Liu, F., Ritzhaupt, A. D., Dawson, K., & Barron, A. E. (2017). Explaining technology integration in K-12 classrooms: A multilevel path analysis model. Educational Technology Research and Development, 65, 795–813. [Google Scholar] [CrossRef]
  37. Major, L., Daltry, R., Rahman, A., Plaut, D., Otieno, M., & Otieno, K. (2024). A dialogic design-based research partnership approach: Developing close-to-practice educational technology theory in Kenya. In A. Chigona, H. Crompton, & N. Tunjera (Eds.), Global perspectives on teaching with technology: Theories, case studies, and integration strategies (pp. 246–264). Routledge. [Google Scholar] [CrossRef]
  38. Major, L., & Francis, G. A. (2020). Technology-supported personalised learning: Rapid evidence review. EdTech Hub Rapid Evidence Review. [Google Scholar] [CrossRef]
  39. Major, L., Francis, G. A., & Tsapali, M. (2021). The effectiveness of technology-supported personalised learning in low- and middle-income countries: A meta-analysis. British Journal of Educational Technology, 52(5), 1935–1964. [Google Scholar] [CrossRef]
  40. McKenney, S., & Reeves, T. (2018). Conducting educational design research. Routledge. [Google Scholar] [CrossRef]
  41. Molenaar, I. (2022). Towards hybrid human-AI learning technologies. European Journal of Education, 57, 632–645. [Google Scholar] [CrossRef]
  42. Molenaar, I., & Knoop-van Campen, C. A. (2019). How teachers make dashboard information actionable. IEEE Transactions on Learning Technologies, 12(3), 347–355. [Google Scholar] [CrossRef]
  43. Outhwaite, L., Ang, L., Herbert, E., Summer, E., & Van Herwegen, J. (2023). Technology and learning for early childhood and primary education. UNESCO. [Google Scholar] [CrossRef]
  44. Pelletier, C. (2024). Against personalised learning. International Journal of Artificial Intelligence in Education, 34(1), 111–115. [Google Scholar] [CrossRef]
  45. Pérez Echeverría, M., Cabellos, B., & Pozo, J.-I. (2025). The use of ICT in classrooms: The effect of the pandemic. Education and Information Technologies. [Google Scholar] [CrossRef]
  46. Reich, J. (2020). Failure to disrupt: Why technology alone can’t transform education. Harvard University Press. [Google Scholar] [CrossRef]
  47. Ritter, S., Murphy, A., & Fancsali, S. (2022, June 1). Curriculum-embedded experimentation. Third Workshop on A/B Testing and Platform-Enabled Research (Learning @ Scale 2022), New York, NY, USA. Available online: https://www.upgradeplatform.org/wp-content/uploads/2022/09/ABTestPlatLearn2022_Ritter_etal_DRAFT.pdf (accessed on 25 January 2025).
  48. Saldaña, J. (2021). The coding manual for qualitative researchers (4th ed.). SAGE. [Google Scholar]
  49. Sampson, D., & Karagiannidis, C. (2002). Personalised learning: Educational, technological and standardisation perspective. Interactive Educational Multimedia, 4, 24–39. Available online: https://www.researchgate.net/publication/228822599_Personalised_learning_Educational_technological_and_standardisation_perspective (accessed on 7 March 2025).
  50. Savi, A. O., Ruijs, N. M., Maris, G. K. J., & van der Maas, H. L. J. (2018). Delaying access to a problem-skipping option increases effortful practice: Application of an A/B test in large-scale online learning. Computers & Education, 119, 84–94. [Google Scholar] [CrossRef]
  51. Schmid, R., Pauli, C., Stebler, R., Reusser, K., & Petko, D. (2022). Implementation of technology-supported personalized learning—Its impact on instructional quality. The Journal of Educational Research, 115(3), 187–198. [Google Scholar] [CrossRef]
  52. Selwyn, N. (2020). ‘Just playing around with Excel and pivot tables’—The realities of data-driven schooling. Research Papers in Education, 37(1), 95–114. [Google Scholar] [CrossRef]
  53. Shiohira, K., & Holmes, W. (2023). Proceed with caution: The pitfalls and potential of AI and education. In D. Araya, & P. Marber (Eds.), Augmented education in the global age: Artificial intelligence and the future of learning and work (1st ed.). Routledge. Available online: https://www.taylorfrancis.com/chapters/oa-edit/10.4324/9781003230762-11/proceed-caution-kelly-shiohira-wayne-holmes (accessed on 15 March 2025).
  54. Smørdal, O., Rasmussen, I., & Major, L. (2021). Supporting classroom dialogue through developing the Talkwall microblogging tool: Considering emerging concepts that bridge theory, practice, and design. Nordic Journal of Digital Literacy, 16(2), 50–64. [Google Scholar] [CrossRef]
  55. Sun, C., Major, L., Daltry, R., Moustafa, N., & Friedberg, A. (2024a). Teacher-AI collaboration in content recommendation for digital personalised learning among pre-primary learners in Kenya. In L@S ’24: Proceedings of the Eleventh ACM Conference on Learning @ Scale (pp. 346–350). Association for Computing Machinery. [Google Scholar] [CrossRef]
  56. Sun, C., Major, L., Moustafa, N., Daltry, R., & Friedberg, A. (2024b). Learner agency in personalised content recommendation: Investigating its impact in Kenyan pre-primary education. In Artificial intelligence in education. Posters and late breaking results, workshops and tutorials, industry and innovation tracks, practitioners, doctoral consortium and blue sky. AIED 2024. Springer. [Google Scholar] [CrossRef]
  57. Sun, C., Major, L., Moustafa, N., Daltry, R., Lazar, O., & Friedberg, A. (2024c). The impact of different personalisation algorithms on literacy and numeracy in Kenyan pre-primary education: A comparative study of summative and formative assessments results. In Companion proceedings 14th international conference on learning analytics & knowledge (LAK24) (pp. 109–111). Available online: https://www.solaresearch.org/wp-content/uploads/2024/03/LAK24_CompanionProceedings.pdf (accessed on 24 May 2025).
  58. Tailor, K. (2022). Evaluating the impact of technology-supported personalised learning interventions on the mathematics achievements of elementary students in India. Cambridge Educational Research e-Journal, 9, 198–209. [Google Scholar] [CrossRef]
  59. Tlili, A., Salha, S., Wang, H., Huang, R., Rudolph, J., & Weidong, R. (2024, July 1–4). Does personalization really help in improving learning achievement? A meta-analysis. 2024 IEEE International Conference on Advanced Learning Technologies (ICALT), Nicosia, Cyprus. [Google Scholar] [CrossRef]
  60. Tondeur, J., van Braak, J., Ertmer, P. A., & Ottenbreit-Leftwich, A. (2017). Understanding the relationship between teachers’ pedagogical beliefs and technology use in education: A systematic review of qualitative evidence. Educational Technology Research and Development, 65, 555–575. [Google Scholar] [CrossRef]
  61. UNESCO. (2023). Global education monitoring report, 2023: Technology in education: A tool on whose terms? UNESCO. [Google Scholar] [CrossRef]
  62. UNICEF. (2022). Trends in digital personalized learning in low- and middle-income countries. Available online: https://www.unicef.org/innocenti/reports/trends-digital-personalized-learning (accessed on 16 January 2025).
  63. Van Schoors, R., Elen, J., Raes, A., & Depaepe, F. (2021). An overview of 25 years of research on digital personalised learning in primary and secondary education: A systematic review of conceptual and methodological trends. British Journal of Educational Technology, 52, 1798–1822. [Google Scholar] [CrossRef]
  64. Van Schoors, R., Elen, J., Raes, A., Vanbecelaere, S., Rajagopal, K., & Depaepe, F. (2025). Teachers’ perceptions concerning digital personalized learning: Theory meet practice. Technology, Knowledge and Learning, 30, 833–859. [Google Scholar] [CrossRef]
  65. Vanbecelaere, S., & Benton, L. (2021). Technology-mediated personalised learning for younger learners: Concepts, design, methods and practice. British Journal of Educational Technology, 52, 1793–1797. [Google Scholar] [CrossRef]
  66. Vandewaetere, M., & Clarebout, G. (2014). Advanced technologies for personalized learning, instruction, and performance. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational communications and technology (pp. 425–437). Springer. [Google Scholar] [CrossRef]
  67. Verbert, K., Ochoa, X., De Croon, R., Dourado, R. A., & De Laet, T. (2020, March 23–27). Learning analytics dashboards: The past, the present and the future. Proceedings of the Tenth International Conference on Learning Analytics & Knowledge, Frankfurt, Germany. [Google Scholar] [CrossRef]
  68. Wang, F., & Hannafin, M. J. (2005). Design-based research and technology-enhanced learning environments. Educational Technology Research and Development, 53(4), 5–23. [Google Scholar] [CrossRef]
  69. Warwick, P., Vrikki, M., Vermunt, J. D., Mercer, N., & van Halem, N. (2016). Connecting observations of student and teacher learning: An examination of dialogic processes in lesson study discussions in mathematics. ZDM—Mathematics Education, 48, 555–569. [Google Scholar] [CrossRef]
  70. Watters, A. (2021). Teaching machines: The history of personalized learning. MIT Press. [Google Scholar]
  71. Zhang, L., Basham, J. D., & Carter, R. A., Jr. (2022). Measuring personalized learning through the Lens of UDL: Development and content validation of a student self-report instrument. Studies in Educational Evaluation, 72, 101121. [Google Scholar] [CrossRef]
  72. Zheng, L., Long, M., Zhong, L., & Fosua Gyasi, J. (2022). The effectiveness of technology-facilitated personalized learning on learning achievements and learning perceptions: A meta-analysis. Education and Information Technologies, 27, 11807–11830. [Google Scholar] [CrossRef]
  73. Zubairi, A., Kreimeia, A., Jefferies, K., & Nicolai, S. (2021). EdTech to reach the most marginalised: A call to action [FP-ETH Position paper]. EdTech Hub. [Google Scholar] [CrossRef]
Figure 1. Four stages and data collection methods of our adapted lesson study approach.
Figure 2. Usage data dashboard (including onboarding messages), comprising the progress (left), learner usage (centre), and learner profile (right) pages.
Table 1. Overview of DPL implementation models, building on the categorisation offered by Major and Francis (2020).
Substitute: Instruction delivered solely through DPL technology, in lieu of teaching.
Supplementary: DPL technology implemented outside of regular classroom instruction, with or without teacher guidance.
Integrative: Coexistence of teachers and DPL technology, with the technology designed to facilitate teaching and learning and be curriculum-aligned.
Table 2. Overview of sample sizes and research contexts for the five DBR phases.
DBR Phase | Sample Size | Research Context
Phase 1 | 22 teachers and their classes (c. 550 learners) from 6 schools; 6 headteachers; 6 early childhood development officers; and 6 EIDU staff | Pre-primary government schools in Mombasa county
Phase 2 | 6 teachers and their classes (c. 430 learners) from 3 schools | Pre-primary government schools in Mombasa county
Phase 3 | 366,906 learners from 4151 schools (post-test) and 253,184 learners from 3857 schools (pre-test) | Pre-primary government schools in 5 counties (Embu, Machakos, Makueni, Murang’a, Nakuru)
Phase 4 | 324 teachers from 149 schools; 10 classes (c. 560 learners) | Low-cost private primary schools in Nairobi
Phase 5 | 45 teachers from 20 schools | Low-cost private primary schools in Nairobi
Table 3. Overview of the DBR phases (adapted from Major et al., 2024).
DBR Phase | Conjecture | Iteration
Scoping and initiating dialogue (May 2022)
1. Foundational phase of integrated multi-disciplinary methods (June–July 2022) | Integrating new technology into classrooms can have an unpredictable and complex impact on existing pedagogy. | No iteration, in order to investigate perspectives and practices in relation to the existing implementation of the DPL tool.
Implementation iteration workshop #1 (August 2022)
2. Iterative, co-learning phase of lesson study (October–November 2022) | Implementation of the DPL tool into classrooms is influenced by factors including when it is used, by which learners, and how many devices are available. | Iteration (by teachers) of which learners are prioritised to receive the DPL tool and the introduction of a second device into classrooms.
Implementation iteration workshop #2 (November 2022)
Innovation sandbox (April–November 2023)
Evaluating practical and theoretical contributions (November 2023–February 2024)
3. A/B/C software testing at scale (January–March 2024) | Providing data from the DPL tool to teachers can influence pedagogical practice in a way which impacts device usage. | Iteration of the DPL tool’s teacher interface, with two groups receiving usage data via a dashboard, one of which has onboarding messages, and a further group not provided with the dashboard.
4. Mixed-methods hardware pilot (March–October 2024) | Increasing the device-to-learner ratio will impact classroom management. | Iteration of hardware, with a 10:1 learner-to-device ratio implemented and headphones introduced as a mitigation strategy.
5. Comparative structured pedagogy pilot (March–September 2024) | Teachers’ agency to decide which substrand is selected on the DPL tool determines the way in which the tool is aligned with classroom practice. | Iteration of the intervention, with one group of schools provided with digitised lesson plans aligned with the DPL content, and one group provided with the DPL functionality only.
Table 4. The overarching themes from each set of the DBR phase results, categorised according to the study’s two sub-research questions.
DBR Phase | Sub-RQ 1: What is the pedagogical role of the DPL tool in supporting learning? | Sub-RQ 2: What is the pedagogical role of teachers in relation to the DPL tool?
Phase 1 | (1a) Variation in when the DPL tool is used during the school day; (1b) Teachers’ conceptualisations of the purpose of the DPL tool; (1c) Suggested improvements to the DPL tool | (1d) Distribution of the DPL tool
Phase 2 | (2a) Teachers’ reasoning for distributing the DPL tool at certain times during the school day | (2b) Teachers providing personal support to learners; (2c) Teachers personalising which learners receive the DPL tool; (2d) Impact of an additional device on class management
Phase 3 | (none) | (3a) Impact of the usage data dashboard on overall DPL tool usage; (3b) Impact of the usage data dashboard on equality of use of the DPL tool
Phase 4 | (4a) Teachers’ conceptualisations of the purpose of the DPL tool | (4b) Impact of the increased device-to-learner ratio on teachers’ distribution of the DPL tool; (4c) Impact of introducing headphones on teachers
Phase 5 | (none) | (5a) Teachers’ reasoning for substrand selection on the DPL tool
Table 5. DPL implementation model for Phase 1 (no iterations to existing practice).

Curriculum alignment:
  • Learning content: mapped to CBC
  • Lesson plans: mapped to CBC
Device distribution:
  • Number of devices: 1 per classroom
  • User selection: via a randomly ordered list of profiles (teachers can override; see the sketch following this table)
DPL use:
  • Time of use: throughout the school day (teacher’s choice)
  • Duration of use: default of 5 min per profile
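The user-selection and duration defaults above amount to a simple rotation policy: shuffle the class roster and give each profile a short session unless the teacher intervenes. The sketch below is a minimal, hypothetical reconstruction of that logic; the function, profile names, and session constant are all assumptions rather than the tool’s actual implementation.

```python
import random

SESSION_MINUTES = 5  # default duration per profile (assumed configurable)

def build_rotation(profiles, teacher_override=None):
    """Return the order in which learner profiles receive the shared device.

    Profiles are shuffled into a random order, mirroring the Phase 1
    "randomly ordered list of profiles"; a teacher-supplied ordering,
    if given, takes precedence (teachers can override).
    """
    if teacher_override is not None:
        return list(teacher_override)
    order = list(profiles)
    random.shuffle(order)
    return order

# Hypothetical roster: the single classroom device rotates through it,
# each learner receiving the default 5-minute session.
roster = ["learner_01", "learner_02", "learner_03"]
for profile in build_rotation(roster):
    print(f"{profile}: {SESSION_MINUTES} min session")
```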
Table 6. Top three classroom activities during which the DPL tool was used by learners, and the proportion of all observed occurrences of each activity during which the DPL tool was used.

  Break/lunchtime: DPL tool used during 46% of observed occurrences
  Free time: DPL tool used during 80% of observed occurrences
  Independent classwork: DPL tool used during 44% of observed occurrences
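Each percentage in Table 6 is a conditional proportion: of all observed occurrences of a given activity, the share during which the DPL tool was in use. A minimal pandas sketch of that calculation follows; the observation records and column names are illustrative assumptions, not the study’s data.

```python
import pandas as pd

# Hypothetical observation log: one row per observed activity occurrence.
obs = pd.DataFrame({
    "activity": ["Free time", "Free time", "Break/lunchtime",
                 "Independent classwork", "Break/lunchtime",
                 "Free time", "Independent classwork"],
    "dpl_in_use": [True, True, True, False, False, True, True],
})

# For each activity, the proportion of its occurrences with the DPL tool in use.
proportions = obs.groupby("activity")["dpl_in_use"].mean().mul(100).round(1)
print(proportions)  # percentage per activity, analogous to Table 6
```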
Table 7. DPL implementation model for Phase 2 (iterations to the Phase 1 model marked “(iterated)”).

Curriculum alignment:
  • Learning content: mapped to CBC
  • Lesson plans: mapped to CBC
Device distribution:
  • Number of devices: 1 vs. 2 per classroom (iterated)
  • User selection: teachers prioritise different learner groups (iterated)
DPL use:
  • Time of use: throughout the school day (teacher’s choice)
  • Duration of use: default of 5 min per profile
Table 8. Characteristics of the three case schools in Phase 2, including geographic location within Mombasa county, number of learners, and number and gender of teachers per grade.

Case School 1 (urban; c. 200 pre-primary learners)
  PP1 class: 1 teacher (female) and c. 100 learners
  PP2 class: 1 teacher (female) and c. 80 learners
Case School 2 (peri-urban; c. 150 pre-primary learners)
  PP1 class: 1 teacher (female), 1 teaching assistant (female), and c. 70 learners
  PP2 class: 1 teacher (male) and c. 33 learners
Case School 3 (CBD; c. 80 pre-primary learners)
  PP1 class: 1 teacher (female), 1 teaching assistant (female), and c. 40 learners
  PP2 class: 1 teacher (female), 1 teaching assistant (female), and c. 25 learners
Table 9. DPL implementation model for Phase 3 (iterations to the Phase 2 model marked “(iterated)”).

Curriculum alignment:
  • Learning content: mapped to CBC
  • Lesson plans: mapped to CBC
Device distribution:
  • Number of devices: 1–2 per classroom
  • User selection: introduction of a usage data dashboard to inform distribution (iterated)
DPL use:
  • Time of use: throughout the school day (teacher’s choice)
  • Duration of use: default of 5 min per profile
Table 10. Means and standard deviations of monthly school usage (hours) by partition.

  No Dashboard: pre-test M = 101.40, SD = 58.30; post-test M = 79.70, SD = 45.72
  Onboarding Dashboard: pre-test M = 103.01, SD = 59.21; post-test M = 83.50, SD = 49.69
  Dashboard: pre-test M = 101.49, SD = 58.57; post-test M = 79.59, SD = 45.13
Table 11. Multi-level modelling of partition differences, pre-test and post-test (reference = No Dashboard partition).

Pre-test usage:
  Intercept: coefficient = 95.61, z = 72.85, p < 0.001
  Partition (Onboarding Dashboard): coefficient = 1.41, z = 0.76, p = 0.449
  Partition (Dashboard): coefficient = −0.17, z = −0.09, p = 0.928
Post-test usage:
  Intercept: coefficient = 79.66, z = 70.33, p < 0.001
  Partition (Onboarding Dashboard): coefficient = 3.81, z = 2.37, p = 0.018
  Partition (Dashboard): coefficient = −0.07, z = −0.05, p = 0.964
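Table 11’s estimates come from a multi-level model with schools as the clustering unit and the No Dashboard partition as the reference category. The sketch below shows how such a model might be fitted in Python with statsmodels; it is a minimal illustration under stated assumptions — the file name, column names, and random-intercept specification are guesses, not the authors’ actual analysis code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: monthly usage hours per school, with partition labels.
df = pd.read_csv("usage.csv")  # assumed columns: school_id, partition, hours

# Order the categories so "No Dashboard" is the reference level, as in Table 11.
df["partition"] = pd.Categorical(
    df["partition"],
    categories=["No Dashboard", "Onboarding Dashboard", "Dashboard"],
)

# Random intercept per school; fixed effects for partition membership.
model = smf.mixedlm("hours ~ partition", data=df, groups=df["school_id"])
result = model.fit()
print(result.summary())  # coefficients, z-values, and p-values, as reported in Table 11
```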
Table 12. Means and standard deviations of the top 10% total usage (post-test) per school (hours) by partition.

  No Dashboard: M = 57.70, SD = 35.86
  Onboarding Dashboard: M = 59.33, SD = 39.45
  Dashboard: M = 56.85, SD = 36.32
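One plausible reading of Table 12 is that, within each school, the profiles in the top 10% of post-test usage are identified, their hours totalled per school, and the school-level totals then averaged by partition. A pandas sketch under that assumption follows; all file and column names are hypothetical.

```python
import pandas as pd

# Hypothetical post-test usage data, one row per learner profile.
df = pd.read_csv("post_test_usage.csv")  # assumed columns: school_id, partition, profile_id, hours

# Within each school, keep profiles at or above the 90th percentile of usage.
threshold = df.groupby("school_id")["hours"].transform(lambda s: s.quantile(0.9))
top_decile = df[df["hours"] >= threshold]

# Total top-10% hours per school, then mean and SD by partition (cf. Table 12).
per_school = top_decile.groupby(["partition", "school_id"])["hours"].sum().reset_index()
print(per_school.groupby("partition")["hours"].agg(["mean", "std"]).round(2))
```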
Table 13. DPL implementation model for Phase 4 (iterations to the Phase 3 model marked “(iterated)”).

Curriculum alignment:
  • Learning content: mapped to CBC
Device distribution:
  • Number of devices: 10:1 learner-to-device ratio per classroom, with headphones introduced (iterated)
  • User selection: via a randomly ordered list of profiles (teachers can override, informed by the usage data dashboard)
DPL use:
  • Time of use: throughout the school day (teacher’s choice)
  • Duration of use: default of 5 min per profile
Table 14. DPL implementation model for Phase 5 (iterations to the Phase 4 model marked “(iterated)”).

Curriculum alignment:
  • Learning content: mapped to CBC
  • Lesson plans: digitised lesson plans (mapped to CBC) vs. no lesson plans (iterated)
Device distribution:
  • Number of devices: 10:1 learner-to-device ratio per classroom, with headphones
  • User selection: via a randomly ordered list of profiles (teachers can override, informed by the usage data dashboard)
DPL use:
  • Time of use: throughout the school day (teacher’s choice)
  • Duration of use: default of 5 min per profile