Article

Beyond the Numbers: K-12 Teachers’ Experiences, Beliefs, and Challenges in Developing Data Literacy

National Institute of Education, Nanyang Technological University, Singapore 637616, Singapore
* Author to whom correspondence should be addressed.
Educ. Sci. 2025, 15(11), 1527; https://doi.org/10.3390/educsci15111527
Submission received: 3 September 2025 / Revised: 28 October 2025 / Accepted: 31 October 2025 / Published: 12 November 2025

Abstract

As schools increasingly adopt data-informed practices, the importance of teacher data literacy has grown significantly. However, little is known about how teachers in Singapore understand and experience data use in their daily work. This qualitative study explores the beliefs, challenges, and professional learning experiences of Ministry of Education teachers in Singapore. Semi-structured interviews were conducted with in-service teachers from diverse school settings, and data were analyzed using Braun and Clarke’s thematic analysis approach to ensure methodological rigor. Four key themes emerged: (1) teachers value data but differ in how they define and apply it; (2) professional learning opportunities are often fragmented or insufficiently targeted; (3) emotional and time-related tensions hinder sustained data use; and (4) teachers desire clearer structures and dedicated support roles for data work. The findings underscore the need for ongoing, contextualized support and highlight the importance of addressing both the cognitive and affective dimensions of teacher data literacy. The study offers implications for contextualized professional development and system-level reforms to better support data-informed teaching.

1. Introduction

As education systems around the world increasingly rely on data to inform policy, instruction, and intervention, the role of teacher data literacy (DL) has become more critical than ever. When equipped with strong DL skills, teachers are better able to plan lessons, monitor student progress, and design targeted interventions that enhance learning outcomes (Kim & Yu, 2023; Lee et al., 2024). International research emphasizes that teachers’ dispositions, such as their beliefs, confidence, and perceived value, strongly influence how data is used in practice. While global discourse on DL continues to grow, much of it centers on broad policy or technical frameworks, often overlooking the lived realities of educators on the ground. In contrast, Singapore presents a unique context: it is a high-performing, centrally governed education system where teachers are expected to align with top-down policies, including the increasing use of student data. The Ministry of Education (MOE), Singapore, has introduced initiatives such as the EdTech Masterplan 2030 and platforms such as the Student Learning Space (SLS) to promote data-informed practices. However, little is known about how teachers engage with these initiatives, or how they perceive data’s relevance and feasibility in everyday teaching. This study seeks to address that gap by adopting a qualitative lens to uncover the lived experiences, beliefs, and challenges faced by MOE teachers in developing DL. Teachers’ self-efficacy (SE) and perceived value of data use (PVOD) are widely recognized as key affective constructs that shape how DL is enacted in practice. Higher self-efficacy has been shown to support teachers’ confidence in analyzing and applying data, while stronger perceived value increases their willingness to integrate data into instructional decisions (Hamilton et al., 2022; Filderman et al., 2021; Lee et al., 2024; Öngören, 2021).
Using Braun and Clarke’s (2006) thematic analysis approach, this study contributes nuanced, ground-level insights to complement policy-driven narratives and inform more context-sensitive strategies for professional development.
The research is guided by the following questions:
  • How do MOE teachers perceive data literacy and its relevance to their professional roles?
  • What types of professional learning experiences have shaped teachers’ understanding of data use?
  • What challenges do teachers face in developing and applying data literacy?
  • What support structures or systemic changes do teachers believe are necessary to sustain growth in this area?

2. Literature Review

2.1. What Is DL?

The increasing integration of technology in education has made DL a vital skill for teachers. It enables teachers to implement differentiated instruction, offer targeted feedback, and reflect on their teaching practices to foster continuous improvement (Kim & Yu, 2023; Lee et al., 2024). Teachers who use data effectively can make evidence-informed decisions rather than relying on intuition, thus improving teaching quality and student outcomes (Ifenthaler & Yau, 2020). However, the term “data literacy” lacks a universally agreed-upon definition. Different scholars use varied terms such as “data-wise,” “data inquiry,” and “evidence literacy,” each reflecting distinct understandings of what data literacy entails (Henderson & Corry, 2020; Mandinach & Gummer, 2013). Without a shared definition, designing training programs becomes problematic (Wolff et al., 2016). A clearer conceptual framework is thus necessary to guide both research and professional learning efforts in this area. In this study, we have adopted Mandinach and Gummer’s (2016) definition, describing DL as the ability to transform information into actionable instructional practices by collecting, analyzing, and interpreting diverse forms of data. Their framework was selected because it is one of the most widely cited in international research on teacher DL, providing a common conceptual basis that enables meaningful comparison with prior studies (e.g., Lee et al., 2024; Filderman et al., 2021). In addition, the framework explicitly links the technical skills of using and analyzing data with instructional application, making it highly relevant to teachers’ daily practice.

2.2. Affective Dimensions: Self-Efficacy (SE), Perceived Value of Data Use (PVOD)

Teachers’ engagement with data is shaped not only by their technical knowledge but also by psychological dimensions such as their SE and PVOD. These internal beliefs significantly influence both the extent and effectiveness of data-use in classrooms.

2.2.1. SE of Data Literacy in Teachers

SE in data literacy refers to a teacher’s belief in their ability to effectively access, interpret, and apply data to guide instructional decisions. This belief plays a critical role in shaping how confidently educators engage in data-informed practices (Hamilton et al., 2022; Keuning et al., 2017). Research has shown that teachers with higher SE are more likely to utilize data meaningfully in the classroom, while those with lower confidence may experience data anxiety, which deters them from interpreting or acting on data insights (Reeves & Chiang, 2019; Lee et al., 2024). However, high SE alone may not guarantee effective data use. External factors such as school organizational characteristics (leadership, training, etc.) can affect how teachers integrate data into their practice (Schildkamp et al., 2015b). This highlights the need to move beyond an individual focus and consider the broader ecosystem in which teachers operate. While boosting SE remains essential, professional learning initiatives must also address these contextual constraints to ensure sustainable and meaningful data use in schools (Lee et al., 2024).

2.2.2. Teachers’ PVOD

PVOD captures the extent to which teachers view data use as meaningful and beneficial to their instructional goals. Studies consistently show that high PVOD correlates with greater data use (Lee et al., 2024; Lim, 2023); however, such perceptions have proven difficult to change. For instance, Filderman et al. (2021) found that altering PVOD is far more challenging than enhancing SE. This distinction points to a deeper issue: beliefs about data use are often entrenched and influenced by broader school cultures, prior experiences, and the perceived relevance of data tools. There is also a notable gap in the literature concerning how PVOD interacts with external mandates, raising the question: do teachers value data more when it aligns with their instructional priorities, or when it is mandated by policy? Such tensions underscore the importance of designing professional development (PD) not only to build skills, but also to reshape deeply held beliefs through authentic, practice-based engagement. The limited understanding of how these beliefs shift over time suggests the need for further qualitative research to unpack the affective dimensions of teacher mindsets around data.

2.3. Barriers to Implementation

Despite increasing efforts to build data literacy among teachers, several persistent challenges still hinder its integration into classroom practice. A recurring issue across multiple studies is the practical difficulty teachers face in accessing and making sense of data. For example, studies highlighted that even when data is available, many teachers struggle to interpret it effectively or use it to inform instructional decisions (Dunlap & Piro, 2016; Miaomiao et al., 2024). However, not all studies agree on the root cause of this difficulty. Some argue that it stems from insufficient training in data analysis and application (Van Den Bosch et al., 2017; Espin et al., 2017), while others point to broader systemic issues such as a lack of leadership support and organizational attitudes towards data (Schildkamp et al., 2015b). This tension suggests that barriers to data use are both technical and contextual in nature. Furthermore, even when teachers possess technical know-how, affective barriers such as low SE and a weak PVOD can further limit uptake (Lee et al., 2024). As a result, one-off PD programs that focus solely on skill-building often fall short. To address these complex, layered barriers, researchers increasingly advocate for more holistic interventions that integrate both psychological support and contextual enablers.

2.4. Global Policy/Practice Perspectives

Across education systems worldwide, there is increasing recognition that DL is essential for improving teaching and learning. Globally, national education policies and reforms have actively sought to embed data-use into daily teaching practices (Mandinach & Gummer, 2016). For instance, in the U.S., federal initiatives such as Race to the Top have underscored the need for educators to make instructional decisions based on student data (U.S. Department of Education, 2009). Similarly, in the Netherlands and the U.K., teacher PD frameworks now frequently incorporate data analysis skills, reflecting a broader push toward data-informed practice (Keuning et al., 2017). These global developments indicate a growing consensus around the importance of DL, while also underscoring the need to adapt such practices to local educational realities.

2.5. Need for Qualitative Inquiry in Singapore

In recent years, Singapore’s MOE has placed increasing emphasis on data use to support school effectiveness and differentiated teaching. System-level platforms such as SLS and the School Cockpit have enabled schools and teachers to access dashboards and analytics on student performance and attendance. In line with this direction, national strategies such as the EdTech Masterplan 2030 have underscored the importance of leveraging technology and data to enhance student outcomes. A key component of this plan is to equip teachers with the skills to interpret and use data meaningfully, enabling them to better understand students’ needs and provide more targeted support. However, while resources are increasingly available, qualitative insights into how teachers perceive and engage with data remain limited, especially in Singapore. Most international studies on data literacy have relied on large-scale surveys or quantitative indicators in decentralised education systems (Mandinach & Gummer, 2016; Hamilton et al., 2022; Keuning et al., 2017). These studies provide valuable evidence of general trends but offer less insight into how affective beliefs and systemic conditions combine to shape teachers’ data practices. In contrast, Singapore represents a policy-intensive system where reforms are centrally directed and teachers work within strong accountability structures (Lee et al., 2024; Lim, 2023). The present study addresses this gap by capturing teachers’ lived experiences and beliefs surrounding data literacy in Singapore. Using a qualitative lens allows us to integrate affective constructs such as self-efficacy and perceived value of data use with systemic features of a high-performance, top-down policy environment.
In doing so, the study contributes conceptual insight into why teachers may engage cautiously with data despite seeing its potential, complementing existing international literature and informing more sustainable, context-sensitive approaches to professional learning.

3. Research Methods

3.1. Participants

After completing the preliminary survey, participants attended a PD session on data literacy before being invited for an interview. The PD lasted approximately six hours and was jointly facilitated by the research team and MOE practitioners. The PD was designed to introduce teachers to key DL concepts while providing opportunities for practice and reflection. Content included: (a) an overview of data literacy and its role in Singapore’s education system, (b) guided demonstrations of national platforms such as the Student Learning Space (SLS) and School Cockpit dashboards, (c) small-group exercises analysing anonymised student data to identify learning needs, and (d) structured peer discussions linking data interpretation to classroom practice. Teachers were also provided with short pre-learning materials on SLS to familiarise themselves with basic concepts prior to the session. Five teachers (3 females, 2 males) were purposefully selected from MOE teachers who had participated in the professional learning. Sampling aimed to ensure diversity in teaching experience (ranging from 3 to 33 years), age (21 to 60 years), and subject specialization, including Science, Humanities, Language, and Mathematics. Interview participants were selected from the broader survey cohort using maximum variation sampling, prioritizing teachers who had completed the PD session, ensuring that interviewees could reflect on both survey items and their PD experience. While this variation enhanced the transferability of the findings, the small sample size (N = 5) remains a key limitation. Such a small group allows for in-depth exploration of individual perspectives but cannot represent the full diversity of teachers’ experiences within the education system. As such, this study should be understood as exploratory, offering illustrative insights rather than general conclusions.

3.2. Study Design

This study adopted a qualitative research design to explore MOE teachers’ experiences, beliefs, and challenges in developing and applying DL in their teaching. Participants had previously completed a survey measuring teachers’ data knowledge, PVOD and SE. After attending a PD session on DL, participants were invited for an interview. The interview phase served as a follow-up, allowing researchers to probe deeper into the patterns observed in the quantitative data and to surface rich contextual insights not captured in the survey. For example, the survey results indicated lower self-efficacy scores among some teachers in relation to data use. Therefore, the interview protocol included specific probes to explore the underlying reasons for this, such as teachers’ confidence in interpreting student performance dashboards and their comfort level with statistical concepts.
In this way, the qualitative phase was conceptually and analytically informed by the quantitative findings. Integration occurred at both the design and interpretation stages: survey constructs guided the focus and phrasing of interview questions, while qualitative narratives were later compared against statistical patterns to explain and contextualize them. The qualitative data thus functioned not merely to confirm but to elaborate and extend the quantitative trends, offering insight into why certain patterns, such as lower SE, emerged and how they manifested in classroom practice. This design supported a mixed-methods integration, linking broad survey trends with nuanced individual perspectives. Although the quantitative survey findings informed the design of the qualitative phase, the quantitative results are not presented in this manuscript, as they are reported in a separate publication (Khor & Chee, 2025). In this study, the quantitative phase served primarily to guide the qualitative exploration by identifying key constructs and general patterns of interest rather than as a source of results for discussion. Ethical approval for this study was obtained from the authors’ Institutional Review Board and all participants provided informed consent prior to data collection. A sample audit trail of the study is provided in Table 1.

3.3. Instruments & Measures

A semi-structured interview protocol was developed by the research team to investigate teachers’ perceptions of DL, their professional learning experiences, challenges encountered in using data, and the kinds of support they deemed necessary. Each interview lasted approximately 40–60 minutes and was conducted via online call at a time convenient for each teacher. Interviews were recorded with consent and transcribed verbatim. The final sample was determined by data adequacy, with interviews continuing until a sufficiently rich and diverse set of perspectives was obtained. The protocol was informed by themes from the quantitative findings, relevant literature, and the research questions. A sample of the interview protocol is shown in Table 2.

3.4. Data Analysis

Interview transcripts were analyzed thematically using Braun and Clarke’s (2006) six-phase framework: (1) familiarization with the data, (2) generation of initial codes, (3) searching for themes, (4) reviewing themes, (5) defining and naming themes, and (6) producing the report. An inductive approach guided the identification of recurring patterns and themes, which were iteratively refined through coding and memo writing. The first author led the coding process; initial codes were generated inductively from the transcripts, and emerging codes and themes were iteratively reviewed and refined in consultation with the research team. Table 3 shows sample coding of transcripts. These discussions provided opportunities for peer debriefing and validation, ensuring that the analysis was not shaped solely by the first author’s perspective. NVivo 12 was used to organize data, support thematic mapping, and facilitate cross-case comparison. This process enhanced the transparency and credibility of the analysis while keeping participants’ voices central to theme development.
Trustworthiness in qualitative research involves four essential aspects: confirmability, credibility, dependability, and transferability (Ahmed, 2024). Confirmability is achieved in this study through reflexive journaling of researchers’ observations, thoughts, and refinements made during the data collection and analysis process. The audit trail shown in Table 1 reinforces the dependability and confirmability of findings because it provides detailed documentation of the coherence and internal consistency maintained throughout the study across research questions, methods, and findings. Credibility is established through persistent observation and understanding of participants at different junctures of the study, such as during the PD, using the survey to inform the interview protocol, and conducting in-depth interviews with participants. Transferability of findings is established through purposeful sampling of participants. Inter-coder reliability was not sought because the focus of researcher reflexivity is to uncover nuances and enable rich descriptions through discussions arising from different interpretations. Alternatives to inter-rater reliability were established to ensure the trustworthiness and rigor of findings through triangulation with survey data, audit trails, and consensus coding (McDonald et al., 2019).

4. Results

The paper focuses primarily on qualitative analysis, while briefly reporting the quantitative results. For the survey, the self-efficacy (SE) scale was adapted from the Data-Driven Decision-Making Efficacy and Anxiety Inventory (3D-MEA; Dunn et al., 2013), which demonstrated reliabilities of 0.81–0.92 and convergent validity via CFA. The perceived value of data use (PVOD) scale was adapted from the Teacher Data Use Survey (TDUS; Wayman et al., 2016), with all Cronbach’s α estimates > 0.80 (mostly > 0.90). The data knowledge questionnaire (DKQ) aligned with the Locate-Comprehend-Interpret framework (Means et al., 2011), as well as our PD content which drew upon Filderman et al.’s (2021) data literacy training model. At baseline (N = 144), teachers reported moderately high SE (M = 45.12, SD = 5.55, range: 25–60), high PVOD (M = 61.70, SD = 7.27, range: 43–80), and moderate Data Knowledge (M = 13.41, SD = 6.77). The 12-item SE scale (5-point Likert; range 12–60) showed excellent internal consistency at baseline (N = 144; α = 0.90, ω = 0.90) and post-test (n = 21; α = 0.93, ω = 0.95). The correlation matrix was factorable (KMO = 0.88; Bartlett’s χ2(66) = 892.08, p < 0.001), with parallel analysis supporting unidimensionality and communalities of 0.40–0.62.
The 16-item PVOD scale (range 16–80) demonstrated excellent baseline reliability (α = 0.92, ω = 0.92) and acceptable post-test reliability (n = 21; α = 0.84, ω = 0.80). The matrix was suitable for factoring (KMO = 0.89; Bartlett’s χ2(120) = 1657.92, p < 0.001). Parallel analysis indicated two factors: attitudes/value beliefs and perceived support/conditions, with communalities of 0.64–0.75. The 23-item DKQ comprised 18 MCQ and 5 open-ended questions, scored via a standardised rubric with dual-rater marking. The matrix was factorable (KMO = 0.63; Bartlett’s χ2(153) = 362.34, p < 0.001), with parallel analysis suggesting three factors representing distinct knowledge domains. MCQ item difficulty ranged from p = 0.17–0.96 (five items showed ceiling effects, p > 0.90; one was overly difficult, p < 0.20). Discrimination indices ranged from r = 0.19–0.49; two items showed weak discrimination (r < 0.15). Inter-rater agreement for open-ended items ranged from ICC(2,1) = 0.30–0.55 (M = 0.43), indicating fair reliability.
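To illustrate the internal-consistency coefficient reported above, the following minimal sketch computes Cronbach’s α from an item-response matrix. The data here are synthetic and for illustration only; the study’s actual analyses would typically use a dedicated statistics package.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Synthetic 5-point Likert responses: rows = teachers, columns = items.
# Items share a common "true" attitude plus small independent noise,
# so the scale should show high internal consistency.
rng = np.random.default_rng(0)
true_score = rng.integers(1, 6, size=(30, 1))
noise = rng.integers(-1, 2, size=(30, 6))
responses = np.clip(true_score + noise, 1, 5)

alpha = cronbach_alpha(responses)
print(round(alpha, 2))
```

Because the simulated items are driven by a shared latent score, α lands well above the conventional 0.70 threshold, mirroring the "excellent" reliabilities reported for the SE and PVOD scales.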
As this study was conceptually grounded in teachers’ SE and PVOD, these constructs guided both the coding and interpretation of qualitative data. Instances where teachers discussed their confidence or skills in handling data were interpreted as reflections of SE, while those expressing beliefs about the usefulness, relevance, or meaning of data to their teaching were categorized as reflections of PVOD. Across the four research questions, themes related to low SE emerged through teachers’ descriptions of limited data skills, uncertainty, or dependence on external support (e.g., limited skills and confidence, uneven data competencies). Themes reflecting PVOD surfaced where teachers emphasized the benefits, relevance, or emotional meaning of data (e.g., positive beliefs about data, time and workload tensions, role-specific relevance). These conceptual anchors allowed the analysis to link individual perceptions to broader affective dimensions of data literacy. The results are organized according to the four research questions guiding this study. Selected quotations from participants are included to support each theme and illustrate teachers’ voices.

4.1. Perceptions of Data Literacy and Its Relevance to Teaching

This section reports how teachers perceive data literacy and its relevance to their professional roles. Three themes were identified from the interviews:
Theme 1: Positive Beliefs About the Usefulness of Data for Teacher Reflection and Decision-Making to Meet Students’ Needs
Through the interviews, teachers consistently expressed strong, positive beliefs about the value of data in supporting effective teaching and learning. Data was viewed as a powerful tool for informing instruction, identifying learning gaps, and tailoring support for students with diverse needs. Participant W explained: “It’s useful for the teachers because we get to have insights and we can make meaningful conclusions on the status or the condition of the class…”
Participant Y also emphasized how data enabled more targeted and purposeful interventions: “We are able to find the correct intervention for the target group of students, whether it’s high ability students, how to stretch them, low ability students, how to help them with their misconceptions, and specific areas through the data that we find, what they don’t understand, and how we could intervene to help them.” This aligns with literature suggesting that data use can drive instructional precision and improve learning outcomes (Mandinach & Gummer, 2016).
In addition to guiding student support, data was seen as instrumental in promoting teacher reflection. As Participant Z shared “We will need to have some analysis on how we are teaching, how are the students receiving the instructional teachings to the syllabus.” These perspectives collectively underscore a central belief among participants that when data is effectively used, it enhances both teacher decision-making and student outcomes by enabling more informed, responsive, and reflective teaching.
Theme 2: Differentiated Data Roles in Schools Informed by Teachers’ Strengths and the Collective Responsibility to Use Data Productively
Some participants questioned whether all teachers should be equally involved in data analysis, raising important considerations about role clarity and institutional capacity. While there was consensus on the value of data-informed instruction, several teachers emphasized that not every educator should be expected to engage in data work. As Participant Z remarked, “I would think that data literacy doesn’t have to be something for every single educator.” Similarly, Participant Y expressed a desire for specialized support, stating, “I would like to have a data manager in the school… I tell you this, this, this. You just do that for me.” These reflections point to a preference for a more distributed model of data use, in which responsibilities are aligned with individual strengths and workloads are more equitably managed.
This notion challenges the assumption that all teachers should become data analysts. Instead, it foregrounds the need for role differentiation, especially in environments where infrastructural and human resource limitations are significant. As Kashaka (2024) observed, “Several schools do not have access to sophisticated IT infrastructures, and many educators lack training and knowledge about how to use analytics tools, interpret the findings, and implement them effectively.” These external constraints compound internal pressures, leading to tensions between policy expectations and what is realistically manageable on the ground.
The findings suggest that equitable and effective data use may require specialized roles, such as in-school data coaches or dedicated support staff, to help translate data into actionable insights. When such structures are absent, the burden often falls on teachers who are already stretched thin. In this context, expectations around DL risk becoming performative rather than productive.
In sum, this shows that while data use is promoted as a collective responsibility, its successful enactment depends on differentiated support, clearly defined roles, and the infrastructure to sustain them. Without such scaffolding, the promise of data-informed teaching may inadvertently place undue strain on educators and hinder rather than enhance teaching and learning.
Theme 3: Uneven Data Competencies as a Barrier to Equitable Practice and Teachers’ Decision-making
Participants consistently expressed that many teachers lacked the foundational knowledge and confidence to engage with data meaningfully. For example, Participant X shared, “We are very familiar with average and all that, but from this course I also learned about what’s the median and how to use the standard deviation. These are things that I think maybe not every one of us is on the same page about.” Another added, “I would say about one in four teachers understand what data literacy is and how they can apply it in their education career.” These accounts suggest that while DL is widely promoted as essential, the actual skillsets required for interpretation and application remain unevenly developed across the teaching workforce. This gap in competency not only limits the potential for consistent data-informed instruction but also raises questions about how equitably data use can be implemented in practice.
These reflections echo concerns raised in existing research, which suggests that data-use is often hindered by uneven levels of teacher competence (Schildkamp & Poortman, 2015a). Without a strong foundation in key data skills, teachers may struggle to translate data into effective instructional decisions, even when they recognize its importance.
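The foundational statistics Participant X refers to can be illustrated with a few lines of Python using hypothetical class marks (the scores below are invented for illustration); the example also shows why the median can tell a different story from the "average" most teachers already know.

```python
import statistics

# Hypothetical class test scores (synthetic, for illustration only).
# One high outlier (95) pulls the mean above the median.
scores = [48, 55, 61, 62, 64, 66, 70, 73, 78, 95]

mean = statistics.mean(scores)      # the "average" teachers are familiar with
median = statistics.median(scores)  # middle value; robust to outliers
stdev = statistics.stdev(scores)    # spread of marks around the mean

print(mean, median, stdev)  # mean > median because of the outlier
```

Here the mean (67.2) sits above the median (65.0), and the standard deviation (about 13) quantifies how widely marks are spread, which is exactly the kind of interpretive nuance the PD session aimed to build.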

4.2. Professional Learning Experiences That Shaped Data Use

This section describes the types of professional learning experiences that have contributed to the development of teachers’ data use. Two themes were identified: interactive and collaborative learning, and self-directed pre-learning online.
Theme 1: Interactive and Collaborative Learning with Real-Life Examples Promoted Practical Understanding
Participants highlighted the value of interactive and peer-driven PD in strengthening their understanding of data use. They described learning experiences that involved hands-on tasks, real-life case studies, scenario-based exercises, and peer discussions as especially meaningful and effective. The opportunity to engage with actual classroom data and reflect collaboratively with colleagues helped build confidence and promote a deeper, more practical grasp of DL concepts.
As Participant V noted, “I think it was quite useful to have that group discussion to look at the data and talk about it. Because otherwise, I’ll be looking at it on my own and I won’t know what I’m supposed to look out for.” Similarly, Participant W shared, “It made it very clear what to look out for in the heatmap, and in a real-world scenario. We were trying to make sense of something that we can relate to.”
These reflections underscore the importance of contextually relevant examples and collaborative dialogue in making data use both accessible and applicable. Participants expressed a clear preference for active, peer-engaged learning over passive or purely theoretical instruction. This aligns with Mandinach and Gummer (2016), who argued that teacher preparation programs should move away from isolated assignments focused on data instruction and instead adopt a collaborative data inquiry approach. Such an approach fosters teachers’ ability to analyze and interpret data within authentic educational contexts, while strengthening their confidence through shared reflections.
Theme 2: Self-Directed Online Pre-Learning Helps Orient Teachers to New Concepts
Some teachers shared that the pre-learning component on SLS provided useful exposure to data concepts prior to the in-person session. While some admitted to skimming through the materials, others found that it laid a foundation and prepared them well for the PD. Participant W noted, “We actually had that online learning, right? That one also comes in useful. So, this gives us the flexibility to learn things which we cannot on our own self-directed learning beforehand”. These findings indicate that pre-session materials, even when not deeply engaged with, helped orient teachers to new concepts and prepared them for a professional learning experience on data literacy.

4.3. Challenges in Developing and Applying Data Literacy

This section describes the types of challenges teachers face in developing and applying data literacy. Four themes were identified: tensions between data use and time-intensive teaching realities; limited skills and confidence; concerns about data accuracy and use; and emotional and professional pressures.
Theme 1: Tensions Between Teaching Realities and Time Constraints, Lack of Support, and the Complexity of Using Data for Decision-Making
Despite recognizing its benefits, teachers often cited time as a significant barrier to using data in practice. The process of collecting, analyzing, and interpreting data was perceived as labor-intensive, especially amidst existing teaching responsibilities. As a result, data work was often deprioritized in favor of more immediate teaching tasks such as lesson preparation and marking. Participant W stressed, “It is actually difficult to really spend some time to sit down and do all this unless, of course, it’s for special projects. Otherwise, on a day-to-day basis, we may not have the bandwidth to do this now because we have all this other additional work like community work and so on”. Another participant echoed this concern, noting that “keying in all their marks… these were time consuming for teachers.”
This perspective reflects a broader concern in the literature that although data use is strongly encouraged at the system level, time constraints continue to limit its practical implementation in schools (Schildkamp & Kuiper, 2010). This tension highlights the practical challenges of embedding data into everyday teaching, particularly when teachers are expected to manage it independently, without additional support or allocated time. Moreover, perceived complexity and time demands may heighten teacher resistance or surface as a latent stressor that undermines engagement with data initiatives.
Theme 2: Limited Skills and Confidence, and a Lack of Sustained Support to Scaffold Application in Practice
A recurring challenge identified by participants was a lack of skill in interpreting complex data sets. Teachers often found dashboards and numerical information intimidating, especially when they had not received sufficient training in data interpretation. For instance, Participant X shared, “All this seems to just stop at the collection point… I feel that I have not moved on to the next level where I’m actually making use of this data to help my students learn better because… I don’t know all this data, are they important data… are they like different touch points for me?” This sense of uncertainty extended to the use of data tools as well. Participant Y noted, “I don’t have enough knowledge of the tools that I need,” expressing that this lack of familiarity limited her ability to implement data-informed practices effectively. Similarly, Participant Z highlighted a more fundamental concern: “Perhaps the biggest problem… will be what kind of data we need to collect and how do we collect [it].”
These reflections highlight a broader issue: while many teachers are provided with tools and access to data systems, they often lack the confidence and understanding needed to use them effectively. Access alone is insufficient; teachers require sustained, practice-based support to build their self-efficacy (SE) in data use. These findings underscore the importance of comprehensive PD that not only introduces data tools but also scaffolds their application in real instructional contexts (Mandinach & Gummer, 2016).
Theme 3: Concerns about Data Accuracy, Purposeful Inquiry and Meaningful Use of Data for Reflection
Concerns surrounding the reliability and meaningfulness of data emerged prominently in participants’ reflections. One key issue raised was the questionable quality of student-generated data, particularly on online platforms. Participant X observed, “Some students just want to complete it, they do not want to get into trouble,” illustrating a worry that student responses may be perfunctory rather than authentic. Such practices undermine the integrity of the data that teachers rely on to inform instruction, raising important questions about the validity of digital assessments and the extent to which student data genuinely reflects learning.
Compounding this issue is a perceived lack of intentionality in how some teachers engage with data. As the same participant remarked, “We also need to be mindful and not just like mindlessly crunching data, you know, just for the sake of it.” This comment signals a deeper concern that without adequate guidance or purpose, data engagement risks becoming a routine, compliance-driven task rather than a reflective, pedagogically grounded practice.
These insights point to two critical tensions: (1) the need to ensure that student data is both authentic and valid, and (2) the importance of cultivating DL that goes beyond analysis to foster purposeful, inquiry-driven use. Addressing these challenges calls for strategies that promote intrinsic motivation among students and reflective data cultures among educators.
Theme 4: Emotional and Professional Pressures Lead to Compliance Instead of Meaningful Data Use for Learning
Participants also drew attention to the emotional strain associated with data use. One teacher described how the availability of student data created an implicit expectation to revise lessons on the fly: “Now that I have the data, I’m expected to change my lesson, which I think can be a little bit challenging, especially if I didn’t expect certain kinds of responses to come out”, alluding to a perceived rise in professional demands tied to data engagement.
These reflections resonate with Lee et al. (2024), who observed that the pressure to quickly master new data management systems and formulate data-informed strategies can elicit anxiety among teachers, particularly when they lack the necessary DL skills. Such emotional pressures are compounded by broader accountability demands, where schools are expected to use student assessment data to demonstrate performance outcomes. As Lee et al. (2024) further note, this can shift educators’ focus toward compliance, often at the expense of authentic, learning-oriented data use.
Taken together, these findings suggest that developing teachers’ DL cannot be limited to technical training alone. Schools must also attend to the emotional and psychological demands of data work, ensuring that data initiatives are supportive. Without such support, emotional fatigue may undermine teachers’ willingness and ability to engage meaningfully with data.

4.4. Desired Support Structures for Sustaining Data Literacy

This section presents the types of support structures and systemic changes that teachers believe are necessary to sustain long-term growth in data literacy. Based on the interview data, three main themes emerged: reducing cognitive load through intelligent data systems; dedicated data personnel and mentorship; and pre-service and ongoing professional learning.
Theme 1: Reducing Cognitive Load Through Intelligent Data Systems, Coupled with Efforts to Prevent Over-Reliance
Several participants highlighted the potential of technology, particularly intelligent systems and Artificial Intelligence (AI), to automate or streamline data collection and interpretation. These tools were seen as a means of reducing workload and making data use more accessible and sustainable. As participant V noted, “My hope is as my students answer on iPad, the system is already collecting some data like SLS, and we can compare to what their outcomes are.” Participant W similarly suggested, “One area this course can look at will be to introduce AI into this and how to make use of AI to manage the data that you get.”
While there is clear enthusiasm for using technology to automate data processes, this raises the question of whether teachers may become too dependent on AI for data work. If educators become overly reliant on AI-driven tools, they may be less equipped to analyze data independently or to critically evaluate the outputs provided. This tension highlights the dual role of technology as both an enabler and a potential crutch. As such, while the integration of AI offers promise in reducing cognitive load, it must be paired with efforts to build teachers’ data interpretation skills so that they remain active users of data.
Theme 2: Dedicated Data Personnel and Mentorship for Shared Responsibility in DL
As mentioned above, participants envisioned the value of having designated individuals in schools, such as data support staff or teacher-mentors, who could assist with data interpretation and report preparation. This support was seen as essential in reducing the cognitive demands placed on classroom teachers. Participant Z explained, “We did tell the leaders before, school leaders, why not you have a teacher or a few teachers or we hire someone that is just here to collect data while the other teachers teach.” Participant Y expressed a similar desire, stating, “I would like to have a data manager in the school.”
These reflections reveal an underlying recognition that effective data use requires both time and expertise, resources not all teachers have at their disposal. Rather than expecting every teacher to master all aspects of DL independently, participants advocated for a more distributed model of expertise in schools. This perspective aligns with Mandinach and Gummer (2016), who emphasize that fostering a data-informed school culture involves building collaborative structures, such as Professional Learning Communities (PLCs) or data teams, that allow for shared responsibility in data interpretation and use.
The call for dedicated data personnel also points to a broader systemic issue: the assumption that data work is an add-on to teaching rather than a distinct professional function that requires targeted support. By embedding expert roles within schools, teachers can focus more deeply on instructional responses to data rather than struggling through its technical management. This structural shift may be key to sustaining meaningful engagement with data over time.
Theme 3: Developing DL Through Continuous Effort from Pre-Service Training to Ongoing Professional Learning
There was a strong and consistent call for both pre-service training in DL and opportunities for sustained, hands-on PD throughout teachers’ careers. Participant V proposed, “Perhaps there could be some kind of a data literacy module so that every teacher that graduates at least comes in with a baseline understanding of how to use data or even some basic Excel.” Participant V further shared, “I try to go and look for courses like under SkillsFuture to go and find out like how to really optimize your Excel document to do some of this but didn’t get to find that kind of SkillsFuture course.”
These reflections highlight a deeper issue: DL is a long-term professional competency that requires ongoing reinforcement. Participants recognized that data-informed decision making must be part of pre-service teacher training, not an add-on responsibility. This aligns with Mandinach and Gummer’s (2016) assertion that DL must begin early in a teacher’s career through dedicated coursework and practical experiences, and be sustained through embedded structures such as data coaching, common planning time, and a shared school-wide vision for data use.
Furthermore, these findings reflect the broader educational challenge of bridging the gap between data knowledge and data action. Without ongoing opportunities to apply and refine their data skills in real contexts, teachers may struggle to translate abstract knowledge into instructional improvement. Therefore, system-wide strategies that integrate both pre-service and in-service support are crucial, not just to upskill teachers, but to normalize data-informed practices within the professional culture of teaching.

5. Discussion

5.1. Teachers’ Beliefs: Willing but Wary

Across participants, there was a consistent recognition that data holds value for informing teaching. Teachers described how data could help identify student learning gaps, adjust instruction, and support pedagogical decision-making, reflecting a strong conceptual understanding of data’s potential. This marks a positive step towards a data-driven education system in Singapore, as valuing data is key to fostering more sustained engagement (Öngören, 2021).
However, this belief and enthusiasm were tempered by a lack of confidence in using data tools and in interpreting and analyzing data independently. Teachers were wary of applying data in their practice because of their low SE in data use. This corresponds with Filderman et al. (2021), who identified SE as a foundational component of data use in education: even when teachers acknowledge the importance of data, low confidence can hinder deeper engagement.
Rather than reflecting resistance to data use, this tension between valuing data and feeling unable to use it effectively underscores the need for guided, supportive conditions that build teachers’ SE in data use. It points to a critical gap: teachers are not rejecting data but are requesting support to use it meaningfully. Ultimately, perceptions of DL must be understood not only in terms of attitudes but also in terms of readiness and support to translate those beliefs into practice.
This “willing but wary” stance can also be understood in relation to Singapore’s policy-driven schooling context, where teachers are expected to enact centrally designed reforms such as the EdTech Masterplan. In such systems, teachers’ self-efficacy functions as a capability condition (whether they feel able to use data), while the perceived value of data use (PVOD) reflects a motivational condition (whether they see data as meaningful to their teaching). As seen in our findings, low SE magnified the strain of workload demands, while uneven PVOD led some teachers to view data as more compliance than pedagogy. These dynamics illustrate how affective beliefs interact with systemic features of Singapore’s education environment, shaping whether data initiatives are taken up as opportunities for inquiry or treated more cautiously in daily practice (Lee et al., 2024; Filderman et al., 2021; Öngören, 2021).

5.2. Professional Learning and Data Literacy Development

The findings highlight the importance of well-designed PD in building teachers’ DL. Participants consistently valued learning formats that were interactive, collaborative, and grounded in real-life teaching contexts. Activities such as analyzing real classroom data, working through scenario-based exercises, and engaging in peer discussions helped teachers make sense of data in a way that felt relevant and accessible. Our findings align with research showing that collaborative inquiry within data teams can deepen teachers’ understanding of data use, support professional growth, and improve education (Keuning et al., 2017; Schildkamp et al., 2015b). The preference for interactive, collaborative PD aligns with situated learning theory (Lave & Wenger, 1991), which holds that learning is most effective when knowledge is developed in real-life contexts and through social interaction, not just through abstract instruction. This suggests that DL initiatives should be designed not as top-down training courses but as embedded, community-oriented practices.
Participants also appreciated access to pre-learning materials on the SLS, which helped them familiarise themselves with basic concepts before in-person sessions. Although engagement varied, some found that this self-paced preparation supported their learning later in the PD. These findings resonate with Stavermann (2024), who emphasized the value of combining asynchronous and synchronous learning, and of balancing individual and group activities to meet diverse teacher needs.
These findings suggest that impactful PD for DL includes a combination of individual preparation, collaborative engagement, and contextually relevant tasks. It also underscores the importance of scaffolding teachers to build knowledge individually before exploring application collectively. Designing learning environments that allow teachers to reflect together while also building foundational knowledge independently may be key to strengthening both their confidence and competence in using data effectively.

5.3. Systemic Barriers to Teachers’ Data Use in Singapore’s Policy-Driven Context

Within Singapore’s policy-driven environment, teachers highlighted challenges that complicated the translation of data initiatives into everyday practice. Time constraints were the most frequently cited barrier, with teachers describing data tasks as burdensome amid competing professional responsibilities. This concern is not just an individual issue but reflects broader workload norms in Singapore’s education system, where teachers juggle large class sizes, co-curricular commitments, and additional administrative duties alongside lesson preparation. In such contexts, data work is easily deprioritised, even when teachers value its potential. This echoes concerns raised by Schildkamp and Kuiper (2010), who observed that systemic workload pressures often limit opportunities for sustained data engagement.
In addition to time pressures, many teachers reported uncertainty around using data tools, analysing results, and interpreting findings. This is particularly salient in Singapore, where national platforms such as the Student Learning Space (SLS) and School Cockpit generate large volumes of student data. Teachers in our study described difficulties in making sense of these dashboards without sufficient training, underscoring the gap between policy-driven availability of data and teachers’ self-efficacy to use it effectively (Filderman et al., 2021; Lee et al., 2024).
Concerns about data accuracy also reflected system-level issues. Teachers questioned whether online student responses collected via SLS truly reflected learning, with some noting that students often “click through” tasks to avoid penalties. These concerns point to a tension between policy expectations for digital data collection and the ground-level reality of ensuring data quality. Without guidance on how to judge the trustworthiness of such data, teachers risk engaging in compliance-oriented “data crunching” rather than meaningful pedagogical use.
Finally, emotional pressures were closely tied to Singapore’s accountability culture. Teachers described implicit expectations to adapt lessons immediately when new data surfaced, creating stress and anxiety. This reflects how high-stakes, performance-oriented policy environments can intensify the affective burden of data use (Lee et al., 2024). Rather than resisting data, teachers in this study revealed how the emotional and professional demands of working under centralised expectations limited their ability to act confidently, especially when self-efficacy was low.
Taken together, these findings suggest that the barriers teachers face are not generic challenges, but ones deeply embedded in Singapore’s policy-driven context. The intersection of high workload, national digital platforms, centralised accountability, and strong performance pressures explains why teachers often appear “willing but wary” of data use.

5.4. Support Structures and Systemic Changes Needed

The findings highlight that teachers view DL not as an isolated skill but as one that requires sustained, contextual support. Across interviews, they expressed a need for systemic enablers such as technology, personnel, and ongoing training to make data use more practical and less cognitively demanding. Participants were particularly optimistic about the role of technology, especially intelligent systems and AI, in streamlining data collection and analysis. Their suggestions reflected a pragmatic hope that future tools could reduce manual workload. However, this enthusiasm also raises concerns about teachers becoming overly reliant on automated systems to process and interpret data. Such dependence may limit the development of teachers’ critical data reasoning and reduce opportunities for professional reflection. This concern is echoed in the broader literature: Ifenthaler and Yau (2020) emphasize that analytics systems enhance learning only when teachers critically engage with their outputs; Kim and Yu (2023) caution that reliance on technological tools without pedagogical grounding risks narrowing teacher agency; and Lee et al. (2024) stress that teacher dispositions and judgment must remain central for sustainable data use. Taken together, these insights suggest that while AI and automated systems can reduce workload, they should serve as supports that enhance, rather than replace, teachers’ professional decision-making.
The call for designated data personnel or mentors reflected recognition of the cognitive and emotional demands of data work. Rather than expecting every teacher to be a data expert, participants advocated role differentiation, with some staff supporting the translation of data into instructional decisions. This aligns with Mandinach and Gummer (2016), who emphasise the need for shared leadership and distributed expertise. In practice, schools could appoint in-school data mentors or coordinators who model effective data use, facilitate peer discussions, and provide just-in-time coaching, ensuring teachers are not left to navigate data alone. However, while this approach could ease teacher workload, it also carries potential risks. Delegating data responsibilities to specialized personnel may inadvertently limit teachers’ opportunities to develop their own DL skills or create a disconnect between data interpretation and pedagogical application. To avoid this, dedicated data staff should be seen not as replacements for teacher engagement with data, but as facilitators who build capacity and model effective practices within schools.
International evidence also points to integrated support models that combine AI tools, designated personnel, and teacher-led collaboration without reducing teacher agency. For example, data teams and PLCs show that inquiry cycles are most effective when teachers jointly analyse evidence and co-construct next steps (Keuning et al., 2017; Schildkamp et al., 2015b). Building on these findings, schools could institutionalize weekly or fortnightly PLC data inquiry sessions where teachers collaboratively examine classroom evidence, supported by data mentors and guided reflection protocols. In these contexts, AI systems can automate repetitive tasks such as collation and visualization, while data coaches or designated staff scaffold teachers’ analysis and build confidence (Mandinach & Gummer, 2016). Crucially, teachers remain central to professional judgment: AI reduces routine workload, personnel provide facilitation and expertise, and collaborative structures preserve agency by embedding data use within peer dialogue. Applied in Singapore, such an integrated model would allow teachers to engage in meaningful, inquiry-oriented data work without being overwhelmed by technical or administrative burdens.
Participants also stressed the importance of foundational training at the pre-service level, alongside long-term, job-embedded professional learning. This supports findings by Wayman and Jimerson (2013), who noted limited opportunities for hands-on data training in many schools. Participants described gaps in in-service learning, particularly in accessing relevant support post-graduation. To address this, teacher education programmes could introduce a dedicated pre-service data literacy module that equips beginning teachers with the conceptual and practical skills needed for data use. At the in-service level, school leaders might allocate protected time within teachers’ schedules for continued data inquiry and reflection. These concerns point to the need for a coordinated professional learning system that supports teachers across career stages and includes protected time during the workday for data engagement.
Taken together, the findings reinforce that cultivating data-literate educators requires more than individual motivation. It requires systemic infrastructure that positions data use as a shared and embedded part of teaching. By establishing structured PLCs, designating data mentors, introducing pre-service DL training, and integrating AI tools thoughtfully, education systems can move toward a coherent, sustainable model of data-informed teaching. As Lee et al. (2024) argue, meaningful engagement with data depends not only on teacher attitudes but also on supportive leadership, collaboration, and time. Without these conditions, efforts to promote DL are unlikely to take hold in everyday practice.

5.5. Implications for Practice and Policy

5.5.1. Strengthen Professional Development for Teachers

Professional learning must move beyond episodic workshops to become a sustained, iterative process embedded within teachers’ work routines. Schools and policymakers can consider establishing structured time for collaborative inquiry, where teachers analyze classroom data together and co-construct action steps. Schools can leverage PLCs or cross-departmental data teams to enable these processes as part of regular work routines. This aligns directly with the Masterplan’s goal of developing collaborative teachers who embrace teamwork, adapt best practices, and co-construct meaning from learning data. In doing so, data use becomes a shared pedagogical activity rather than an isolated task. There is also a need to reconceptualize “data” in a way that is inclusive of both quantitative and qualitative sources. This includes not only teaching teachers how to use digital tools but also how to interpret learning analytics to better understand student needs. This can help increase their SE in using data to inform instructional decision-making.

5.5.2. Empower School Leaders to Model Inquiry and Support

School leaders play a pivotal role in shaping a culture of data use. Leaders who model inquiry, provide support, and foster trust can empower teachers to see data not as a burden but as a bridge to better teaching and learning. Leadership development programs should therefore include training on how to facilitate open, non-punitive data conversations and build school-wide data fluency. These practices reflect the EdTech Masterplan’s vision of schools that promote professional collaboration and innovation. Leaders who model curiosity, experimentation, and team-based problem-solving help reinforce data use as part of an inquiry-oriented school culture. By cultivating trust and modelling continuous learning, leaders empower teachers to use data more confidently and creatively.

5.5.3. Align National Policies with a Learning-Driven Data Culture

At the policy level, supporting data literacy requires aligning national initiatives, school resources, and professional expectations. This includes embedding data-use competencies into pre-service teacher education, improving access to reliable school-based tools and platforms, and creating protected time within the school day for teachers to work with data meaningfully. The EdTech Masterplan 2030 provides a timely opportunity to further embed these supports into system-wide strategies. However, for its goals to translate into classroom change, policy must move beyond digital enablement to address the human and structural conditions that make data use sustainable. This includes reviewing accountability practices to ensure that they promote professional inquiry rather than fear or compliance.

6. Conclusions

This study explored the perceptions, professional learning experiences, challenges, and perceived needs related to DL among teachers in Singapore. While teachers broadly valued the role of data in informing instruction and improving student outcomes, they also highlighted persistent tensions between data availability and usability, between performance and learning, and between policy expectations and classroom realities. Rather than portraying teachers as resistant, the findings suggest that many are willing but constrained by limited time, insufficient training, and institutional pressures surrounding data work. Fostering a data-informed teaching culture will require more than building technical proficiency. It demands attention to the relational and emotional dimensions of data use, including the need for psychological safety, professional trust, and time for collective reflection. Supporting teacher agency and providing sustained, context-sensitive professional learning are essential to this effort. Ultimately, data literacy is not only a matter of skills but of empowerment. When teachers are given the opportunity to engage with data as reflective practitioners, data becomes more than a tool for monitoring: it becomes a foundation for inquiry, dialogue, and continuous improvement. The future of data literacy lies in systems that respond to the realities of teaching and learning, and in a profession that is trusted to lead with both evidence and care.
This study is not without its limitations. The study focused solely on teachers’ perspectives and did not triangulate findings with other stakeholders such as school leaders or data specialists. Including these voices could offer a more holistic understanding of the systemic enablers and constraints related to data literacy in schools. In addition, the qualitative phase involved a small, purposive sample, which limits the extent to which the findings can be generalized. The insights presented here should therefore be interpreted as context-specific and illustrative rather than representative. The intent of this study was to achieve analytic transferability by offering rich, detailed accounts that readers may consider in relation to their own professional contexts, rather than statistical generalization.
Future research should include larger, more diverse samples, but even at this stage, our findings point to several implications for policy and practice. Professional learning should prioritize collaborative formats that allow teachers to analyze real classroom data together, rather than relying on decontextualized training. Embedding DL learning within existing school communities or professional learning teams may increase sustainability and uptake. At the system level, policymakers could consider providing time allowances and structural support that make such collaborative inquiry feasible within teachers’ workload constraints.
Beyond these practical implications, this study makes a conceptual contribution by linking teachers’ affective beliefs, namely SE and PVOD, with the systemic conditions of Singapore’s policy-driven education system. Our analysis shows how confidence and perceived value shape the way teachers navigate workload pressures, accountability demands, and support structures, producing the “willing but wary” stance observed across participants. By integrating affective and systemic dimensions, this study extends understanding of why teachers value data yet hesitate to use it fully, offering a framework that can inform both international scholarship and local practice in data literacy.

Author Contributions

Conceptualization, E.T.K.; Methodology, E.T.K.; Software, A.C.; Validation, E.T.K.; Formal analysis, E.T.K. and A.C.; Resources, A.C.; Data curation, A.C. and S.S.L.; Writing—original draft, E.T.K. and A.C.; Writing—review & editing, E.T.K. and S.S.L.; Project administration, E.T.K.; Funding acquisition, E.T.K. All authors have read and agreed to the published version of the manuscript.

Funding

This study was funded by the Singapore Ministry of Education (MOE) under the Education Research Funding Programme (OER 19/22 KET) and administered by the National Institute of Education (NIE), Nanyang Technological University, Singapore. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the Singapore MOE or NIE, NTU, Singapore.

Institutional Review Board Statement

The study was approved by the Nanyang Technological University Institutional Review Board (IRB-2023-362, approved 15 June 2023).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data are not publicly available due to privacy or ethical restrictions.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ahmed, S. K. (2024). The pillars of trustworthiness in qualitative research. Journal of Medicine, Surgery, and Public Health, 2, 100051. [Google Scholar] [CrossRef]
  2. Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. [Google Scholar] [CrossRef]
  3. Dunlap, K., & Piro, J. S. (2016). Diving into data: Developing the capacity for data literacy in teacher education. Cogent Education, 3(1), 1132526. [Google Scholar] [CrossRef]
  4. Dunn, K. E., Airola, D. T., Lo, W.-J., & Garrison, M. (2013). Becoming data driven: The influence of teachers’ sense of efficacy on concerns related to data-driven decision making. The Journal of Experimental Education, 81(2), 222–241. [Google Scholar] [CrossRef]
  5. Espin, C. A., Wayman, M. M., Deno, S. L., McMaster, K. L., & De Rooij, M. (2017). Data-based decision-making: Developing a method for capturing teachers’ understanding of CBM graphs. Learning Disabilities Research and Practice, 32(1), 8–21. [Google Scholar] [CrossRef]
  6. Filderman, M. J., Toste, J. R., Didion, L., & Peng, P. (2021). Data literacy training for K–12 teachers: A meta-analysis of the effects on teacher outcomes. Remedial and Special Education, 43(5), 328–343. [Google Scholar] [CrossRef]
  7. Hamilton, V., Onder, Y., Andzik, N. R., & Reeves, T. D. (2022). Do data-driven decision-making efficacy and anxiety inventory scores mean the same thing for Pre-Service and In-Service teachers? Journal of Psychoeducational Assessment, 40(4), 482–498. [Google Scholar] [CrossRef]
  8. Henderson, J., & Corry, M. (2020). Data literacy training and use for educational professionals. Journal of Research in Innovative Teaching & Learning, 14(2), 232–244. [Google Scholar] [CrossRef]
  9. Ifenthaler, D., & Yau, J. Y. (2020). Utilising learning analytics to support study success in higher education: A systematic review. Educational Technology Research and Development, 68(4), 1961–1990. [Google Scholar] [CrossRef]
  10. Kashaka, N. D. (2024). The role of data analytics in school management. Eurasian Experiment Journal of Humanities and Social Sciences (EEJHSS), 5, 76–77. Available online: https://www.eejournals.org/ (accessed on 3 September 2025).
  11. Keuning, T., Van Geel, M., & Visscher, A. (2017). Why a data-based decision-making intervention works in some schools and not in others. Learning Disabilities Research and Practice, 32(1), 32–45. [Google Scholar] [CrossRef]
  12. Khor, E. T., & Chee, A. (2025, September). Exploring K–12 Teachers’ Data Literacy in Singapore: Knowledge, Self-Efficacy, and Perceived Value of Data. Proceedings of the 1st International Conference on Learning Evidence and Analytics, Fukuoka, Japan. [Google Scholar]
  13. Kim, M. S., & Yu, F. (2023). ‘Teacher data literacies practice’ meets ‘pedagogical documentation’: A scoping review. Review of Education, 11(2), e3414. [Google Scholar] [CrossRef]
  14. Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge University Press. [Google Scholar] [CrossRef]
  15. Lee, J., Alonzo, D., Beswick, K., Abril, J. M. V., Chew, A. W., & Oo, C. Z. (2024). Dimensions of teachers’ data literacy: A systematic review of literature from 1990 to 2021. Educational Assessment Evaluation and Accountability, 36(2), 145–200. [Google Scholar] [CrossRef]
  16. Lim, E. M. (2023). The effects of pre-service early childhood teachers’ digital literacy and self-efficacy on their perception of AI education for young children. Education and Information Technologies, 28(10), 12969–12995. [Google Scholar] [CrossRef]
  17. Mandinach, E. B., & Gummer, E. S. (2013). A systemic view of implementing data literacy in educator preparation. Educational Researcher, 42(1), 30–37. [Google Scholar] [CrossRef]
  18. Mandinach, E. B., & Gummer, E. S. (2016). Every teacher should succeed with data literacy. Phi Delta Kappan, 97(8), 43–46. [Google Scholar] [CrossRef]
  19. McDonald, N., Schoenebeck, S., & Forte, A. (2019). Reliability and inter-rater reliability in qualitative research: Norms and guidelines for CSCW and HCI practice. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), 1–23. [Google Scholar] [CrossRef]
  20. Means, B., Chen, E., DeBarger, A., & Padilla, C. (2011). Teachers’ ability to use data to inform instruction: Challenges and supports. U.S. Department of Education. [Google Scholar]
  21. Miaomiao, Z., Lu, X., & Xun, M. (2024). Self-efficacy of data literacy among primary and secondary school teachers in Guangdong Province. Journal of Electrical Systems, 20(2), 2113–2125. [Google Scholar] [CrossRef]
  22. Öngören, S. (2021). Investigation of prospective preschool teachers’ digital literacy and teacher readiness levels. International Journal of Modern Education Studies, 5(1), 181–204. [Google Scholar] [CrossRef]
  23. Reeves, T. D., & Chiang, J. L. (2019). Effects of an asynchronous online data literacy intervention on pre-service and in-service educators’ beliefs, self-efficacy, and practices. Computers & Education, 136, 13–33. [Google Scholar] [CrossRef]
  24. Schildkamp, K., & Kuiper, W. (2010). Data-informed curriculum reform: Which data, what purposes, and promoting and hindering factors. Teaching and Teacher Education, 26(3), 482–496. [Google Scholar] [CrossRef]
  25. Schildkamp, K., & Poortman, C. (2015a). Factors influencing the functioning of data teams. Teachers College Record: The Voice of Scholarship in Education, 117(4), 1–42. [Google Scholar] [CrossRef]
  26. Schildkamp, K., Poortman, C. L., & Handelzalts, A. (2015b). Data teams for school improvement. School Effectiveness and School Improvement, 27(2), 228–254. [Google Scholar] [CrossRef]
  27. Stavermann, K. (2024). Online teacher professional development: A research synthesis on effectiveness and evaluation. Technology, Knowledge and Learning, 30(1), 203–240. [Google Scholar] [CrossRef]
  28. U.S. Department of Education. (2009). Race to the top executive summary. In Race to the TOP program. U.S. Department of Education. Available online: https://files.eric.ed.gov/fulltext/ED557422.pdf (accessed on 3 September 2025).
  29. Van Den Bosch, R. M., Espin, C. A., Chung, S., & Saab, N. (2017). Data-based decision-making: Teachers’ comprehension of curriculum-based measurement progress-monitoring graphs. Learning Disabilities Research and Practice, 32(1), 46–60. [Google Scholar] [CrossRef]
  30. Wayman, J. C., & Jimerson, J. B. (2013). Teacher needs for data-related professional learning. Studies in Educational Evaluation, 42, 25–34. [Google Scholar] [CrossRef]
  31. Wayman, J. C., Snodgrass Rangel, V. W., Jimerson, J. B., & Cho, V. (2016). Organizational behaviors that support data use. University of Texas at Austin. [Google Scholar]
  32. Wolff, A., Gooch, D., Montaner, J. J. C., Rashid, U., & Kortuem, G. (2016). Creating an understanding of data literacy for a data-driven society. The Journal of Community Informatics, 12(3), 9–26. [Google Scholar] [CrossRef]
Table 1. Sample Audit Trail.
Study Title: Beyond the Numbers: K-12 Teachers’ Experiences, Beliefs, and Challenges in Developing Data Literacy
Research Method: Reflective Thematic Analysis

Stage of Research | Rationale and Decisions

Research design | Research questions:
  • How do MOE teachers perceive data literacy and its relevance to their professional roles?
  • What types of professional learning experiences have shaped teachers’ understanding of data use?
  • What challenges do teachers face in developing and applying data literacy?
  • What support structures or systemic changes do teachers believe are necessary to sustain growth in this area?
A qualitative research design was used to explore MOE teachers’ experiences, beliefs, and challenges in developing and applying DL in their teaching. After attending a PD session on DL, participants were invited for an interview. The interview phase served as a follow-up, allowing researchers to probe deeper into the patterns observed in the quantitative data and to surface rich contextual insights not captured in the survey.

Sampling strategy | Purposive sampling of MOE teachers who participated in PD to understand teachers’ lived experiences and beliefs about DL.

Sampling criteria | Sampling ensured diversity in teaching experience (ranging from 3 to 33 years), age (21 to 60 years), and subject specialization, including Science, Humanities, Language, and Mathematics.

Data Collection
Recruitment | Interview participants were selected from the broader survey cohort using maximum variation sampling, prioritizing teachers who had completed the PD session, ensuring that interviewees could reflect on both survey items and their PD experience.
Data collection process | A semi-structured interview protocol was developed by the research team to investigate teachers’ perceptions of DL, their professional learning experiences, challenges encountered in using data, and the kinds of support they deemed necessary. Each online interview lasted about 40–60 min.
Reflective notes | Focused on data adequacy to ensure rich and diverse perspectives were obtained.

Data Analysis
Transcription | Interviews were recorded with consent and transcribed verbatim.
Coding process | NVivo was used to support the organization and development of codes. Initial codes were generated inductively from the transcripts, and emerging codes and themes were iteratively reviewed and refined.
Theme development | Braun and Clarke’s (2006) six-phase framework was followed. An inductive approach guided the identification of recurring patterns and themes, which were iteratively refined through coding and memo writing.
Verification | NVivo 12 was used to organize data, support thematic mapping, and facilitate cross-case comparison. Themes were iteratively reviewed and refined in consultation with the research team. These discussions provided opportunities for peer debriefing and validation, ensuring that the analysis was not shaped solely by the first author’s perspective.
Table 2. Sample Interview Protocol.
Research Questions | Sample Interview Questions

RQ1. How do MOE teachers perceive data literacy and its relevance to their professional roles?
- What are your current perceptions of data literacy in education?
- In your opinion, how important is data literacy for teachers? Why or why not?
- Have you received any professional learning experiences related to data literacy in the past?
- If yes, what aspects were useful or not useful, and why?

RQ2. What types of professional learning experiences have shaped teachers’ understanding of data use?
- What types of professional learning activities or learning formats do you find most engaging and effective (e.g., workshops, hands-on activities, case studies, online courses, peer collaborations)? Why?
- How does professional learning experience affect your knowledge and beliefs on data literacy?
- How do you determine the effectiveness of the professional learning activities in enhancing your data literacy skills and knowledge?

RQ3. What challenges do teachers face in developing and applying data literacy?
- What are the barriers or challenges that you anticipate in implementing data literacy skills in your teaching practice?
- What support or resources would help overcome these barriers?

RQ4. What support structures or systemic changes do teachers believe are necessary to sustain growth in this area?
- How would you like these learning experiences to be conducted?
- Reflecting on your teaching practice, can you provide examples of a time when data literacy played a significant role in your decision-making or instructional planning? What are some key enablers and challenges?
Table 3. Sample Data Coding for Thematic Analysis.

Data | Codes | Theme (all excerpts below were coded under the theme “Interactive and Collaborative Learning”)

- “I think it was quite useful to have that group discussion to look at the data and talk about it. Because otherwise, I’ll be looking at it on my own and I won’t know what I’m supposed to look out for.” | Code: Group discussion
- “Hands-on works very well. Lecture-based, a bit less. Let’s say giving scenarios and asking teachers what’s the best way of analyzing the data. That was pretty good.” | Code: Hands-on tasks
- “Yes. Yes. Yeah. The teachers were engaged, right? We have to bring it down for the teachers to understand what is happening. I guess it has to be relatable. So case studies based on the great profile of students giving hypothetical scenarios, they work well, but I’m not sure what longer lasting impacts of this professional development, probably there will need to be a follow-up.” | Codes: Real-life case studies; Scenario-based exercises
- “So you tell us how to do this and then maybe give us some hands-on activity to try to use those tools. Like what you did during the session, you gave us that set of data, but we wish we could have it. Then we don’t have to type it into the computer and do our own analysis. So we need that hands-on thing. We need to do that practice and then figure out on our own, then find out from you, the experts, whether our interpretation is correct or wrong. What we missed out, that kind of thing would be helpful” | Codes: Hands-on activity; Reflecting collaboratively
- “okay actually the think discussion kind of thing like discussing with the group and listening to their perspective I think it is it is valuable to me yeah and then after that I think the professor actually got us to share our thoughts I think that is also good because then the sharing becomes rich and we learn more and what it’s just that the content downloading of the content was a bit too much for that day” | Codes: Peer discussion; Reflecting collaboratively
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Khor, E.T.; Chee, A.; Lee, S.S. Beyond the Numbers: K-12 Teachers’ Experiences, Beliefs, and Challenges in Developing Data Literacy. Educ. Sci. 2025, 15, 1527. https://doi.org/10.3390/educsci15111527
