Systematic Review

Technologies for Reflective Assessment in Knowledge Building Communities: A Systematic Review

by Paula Rodríguez-Chirino 1,*, Calixto Gutiérrez-Braojos 1, Mario Martínez-Gámez 1 and Carlos Rodríguez-Domínguez 2

1 Research Methods in Education, Faculty of Education, Campus Universitario de la Cartuja, University of Granada, Calle Prof. Vicente Callao, Beiro, 18011 Granada, Spain
2 Department of Computer Languages and Systems, Faculty of Education, Economics and Technology of Ceuta, University of Granada, 18011 Granada, Spain
* Author to whom correspondence should be addressed.
Information 2025, 16(9), 762; https://doi.org/10.3390/info16090762
Submission received: 28 July 2025 / Revised: 24 August 2025 / Accepted: 29 August 2025 / Published: 3 September 2025

Abstract

Reflective assessment is central to knowledge building, as it enables learners to evaluate and improve their collective ideas. In recent years, a wide range of analytic technologies has been developed to support this process, yet there is still a limited understanding of how these tools are designed, implemented, and connected to knowledge building principles. This study addresses this gap through a systematic review of the literature on analytic technologies that foster reflective assessment in knowledge building environments. Following a rigorous PRISMA methodology, a search of the Web of Science database was conducted. Studies that did not meet the inclusion criteria were excluded, resulting in a final selection of 31 empirical studies. The analysis shows that most analytic tools (e.g., KBDeX, KCA, KBAT) are applied to data generated in Knowledge Forum, supporting students in visualizing, analyzing, and reflecting on the collective knowledge advancement process. The review highlights a growing diversity of tools designed to enhance processes such as idea improvement and epistemic agency. By mapping these contributions, the study provides a clearer understanding of how analytic technologies can be used to strengthen collaborative practices and reflective assessment in knowledge building contexts.


1. Introduction

In recent years, academic interest in the design, development, and implementation of analytics technologies in educational contexts has grown considerably (Zheng et al., 2022) [1]. Several studies have examined the role of these technologies in classroom settings, particularly their potential to enhance knowledge building processes (Oshima & Shaffer, 2021) [2]. Such tools enable the analysis, visualization, and synthesis of data generated by students, thereby facilitating personalized feedback and aiding educators in identifying issues related to student engagement (Ong et al., 2023) [3]. Within the framework of knowledge building, analytics technologies offer real-time insights that support the monitoring of learning progress and the refinement of ideas, thus fostering collaborative knowledge advancement (Gutiérrez-Braojos et al., 2023) [4].
Learning analytics is meaningful insofar as it is embedded within a clearly articulated pedagogical design (Stoyanov & Kirschner, 2023) [5]. Various proposed frameworks underscore this as essential for understanding learning patterns; however, their effective implementation in educational practice remains a significant challenge (Mangaroska et al., 2020) [6]. The knowledge building framework, as described by Stahl (2000) [7], was conceived to facilitate the integration and pedagogical use of technological tools that promote peer collaboration. Knowledge building is an educational model that conceptualizes learning as a collective process of knowledge creation, wherein students assume an active role in the continual refinement and advancement of ideas (Gutiérrez-Braojos et al., 2024; Scardamalia, 2002) [8,9]. In this approach, the Knowledge Forum software serves as a key tool for enacting these principles: it was specifically designed to facilitate the exchange of ideas among students, structure the critical evaluation and progressive refinement of those ideas, and support the development of higher-order syntheses (Cress et al., 2015) [10]. Knowledge Forum exemplifies how technological tools can expand opportunities for interaction and collaborative knowledge construction in online educational environments (Scardamalia & Bereiter, 2006) [11]. One of the fundamental principles of knowledge building is the continuous evaluation of ideas, particularly through reflective assessment processes. Research emphasizes the significance of assessment, especially when mediated by technology, as a critical mechanism for fostering the advancement of knowledge within learning communities (Scardamalia, 2004) [12]. In knowledge building contexts, the implementation of analytics technologies is often intrinsically linked to the use of Knowledge Forum, as the platform not only enables collaborative discourse but also systematically generates rich data on student interactions. These data are essential for applying learning analytics aimed at evaluating the quality of discourse and supporting reflective assessment processes.
To date, only a few systematic reviews have explored the knowledge building approach and the use of learning analytics. However, very few have focused on technologies specifically designed to support reflective assessment within knowledge building. Notable examples of systematic reviews addressing learning analytics in this context include Zhu and Kim (2017) [13] and Apiola et al. (2022) [14]. A systematic review is a type of scientific study that examines previous research on the same topic to answer a clearly defined question in a rigorous manner (Sánchez-Serrano et al., 2022) [15]. This study aims to identify learning analytics developed within the knowledge building framework and their use in reflective assessment. It also seeks to analyze which principles of the approach these technologies aim to support, and under what conditions their implementation has proven effective.

1.1. Knowledge Building

The knowledge building educational model introduces students to a culture focused on the construction of knowledge, with the goal of advancing ideas. This process is supported by the Knowledge Forum platform, which facilitates dialogue and progressive discourse within learning communities (Scardamalia & Bereiter, 2021) [16]. In classroom settings, knowledge building follows a principle-based approach (Scardamalia & Bereiter, 2014) [17]. In this environment, students address authentic problems through collaboration and idea development (Scardamalia & Bereiter, 1994) [18]. This approach promotes a strong commitment to the knowledge community. Students take collective responsibility to improve knowledge, seek accurate information from reliable sources, and aim to refine their ideas (Gutiérrez-Braojos et al., 2019) [19]. According to Zhang et al. (2009) [20], students show greater social and cognitive responsibility when they understand that ideas can be improved and strive for deeper explanations. This environment encourages the exchange of ideas, questioning, discussion of different perspectives, and the acquisition of new skills. All of these contribute to a better understanding of real-world problems (Ma & Scardamalia, 2022) [21]. Some authors highlight the importance of identifying promising ideas as a key step in creative problem-solving (Chen, 2017) [22]. The principles of knowledge building provide both the theoretical and practical foundation for this educational approach (Cacciamani et al., 2021) [23].
In relation to studies on knowledge building, Chen and Hong (2016) [24] conducted a systematic review of thirty years of research on this pedagogy. Their study outlines its conceptual framework, guiding principles, and main features. It also highlights the challenges of sustaining and expanding knowledge building practices over time. The review illustrates how schools can function as knowledge building communities and offers valuable insights for future research and educational improvement. Similarly, Gutiérrez-Braojos et al. (2020) [25] carried out a systematic review of the knowledge building approach. This study provides a comprehensive overview of the model’s development, emphasizing core principles such as the continual improvement of ideas and epistemic agency. It also explores the transformative potential of this pedagogy within educational contexts.
Within the knowledge building framework, assessment plays a key role in its implementation. Scardamalia (2002) [9] emphasizes that assessment is not a separate activity but an integral part of the knowledge advancement process and daily educational practices. A common goal shared by multiple studies is to design innovative and reflective assessments that enhance explanation-oriented discourse supported by technology (e.g., Yang et al., 2020) [26]. Engaging students in assessment activities that are closely aligned with their learning objectives and the collaborative process of knowledge building is crucial for fostering meaningful participation (Cacciamani et al., 2021) [23]. Moreover, in this type of learning environment, both self-assessment and peer assessment are promoted, as they encourage deep reflection and strengthen collaborative work (Lei & Chan, 2018) [27].
Within knowledge building environments, technologies are used to support reflective assessment practices. Knowledge Forum is a central tool for collecting and analyzing data generated during the knowledge building process. Developed in alignment with the principles of knowledge building, Knowledge Forum enables the exchange of ideas among students, fosters critical evaluation, and supports the improvement of ideas (Cress et al., 2015) [10]. Moreover, in most of the reviewed studies, the data used to apply learning analytics come directly from the interactions on this platform. This makes Knowledge Forum a key source for conducting discourse analysis within knowledge building environments.

1.2. Reflective Assessment

Recent research on assessment within the knowledge building approach highlights the central role of reflective assessment in advancing and improving ideas (Tan et al., 2021) [28]. Yang et al. (2020) [26] define reflective assessment as a process that supports students in developing key skills to think critically about their own knowledge building practices. This enables them to make more conscious and autonomous decisions about their learning. According to Van Aalst and Chan (2007) [29], reflective assessment positively influences the development of students’ scientific understanding. It also promotes collaboration and strengthens their capacity to actively construct knowledge. These metacognitive processes include goal setting, planning, monitoring, and regulating the knowledge building and inquiry process—both individually and collectively (Yang et al., 2024) [30]. Reflective assessment involves students taking an active role in monitoring both their individual and group progress. It requires recognizing which aspects of knowledge need improvement and adjusting their information-seeking strategies accordingly (Yang et al., 2022) [31]. This type of assessment enables students to build a deeper understanding of scientific concepts (Chan & Van Aalst, 2004) [32]. In doing so, it continuously guides collective efforts to better address the needs of the learning community. Gutiérrez-Braojos et al. (2023) [4] emphasize that assessment sessions play a crucial role in identifying themes the group needs to explore further. They also strengthen students’ sense of belonging and help detect challenges or barriers to progress, such as low participation or the repetition of ideas.
The systematic review conducted by Yan et al. (2023) [33] represents one of the most relevant precedents for the present study. This review focused on understanding how students perceive reflective assessment. The authors found that students generally view feedback in a positive light. They also identified several factors that influence student engagement in reflective assessment, including perceived self-efficacy, classroom climate, and teacher support. The study highlights that technological tools should be designed to help students engage with assessment in an active and reflective manner.

1.3. Knowledge Building Analytics

Learning analytics have been gaining increasing relevance in educational settings (Zheng et al., 2022) [1]. Their use provides meaningful feedback in collaborative knowledge building processes, fostering deeper interaction and more informed metacognitive regulation (Bereiter & Scardamalia, 2014) [34]. While systematic reviews on learning analytics have primarily focused on higher education, student performance, and teacher development, few have specifically explored their role within knowledge building environments.
To conduct this study, we reviewed published systematic reviews on this topic. Avella et al. (2016) [35] examined the benefits, challenges, and methodological features of 20 empirical studies on learning analytics in online learning environments. Among the most frequently cited benefits were the promotion of knowledge acquisition, progress monitoring, and improved academic outcomes. However, they also identified key challenges, such as the need for technical training for teachers and the lack of conceptual models to guide the design of educational analytics tools. Similarly, Sergis and Sampson (2016) [36] conducted a review focused on analytics to support teacher inquiry. Their work highlights the importance of providing tools that not only collect student data but also generate meaningful indicators for pedagogical reflection and the improvement of teaching practices. Based on 34 selected studies, they proposed a conceptual framework that links data analysis to pedagogical decision-making processes while also noting the limited attention given to collaborative reflective assessment in these contexts. Along similar lines, Oliva-Córdova et al. (2021) [37] reviewed 37 studies that applied analytics to support the development of teaching competencies. Their results revealed a predominance of visual tools, network analysis, and semantic approaches. However, they also noted a limited number of studies explicitly linking these technologies with pedagogical practices that promote metacognition or the collective construction of knowledge. Similarly, Sonderlund et al. (2018) [38] focused their review on evaluating the effectiveness of interventions that integrate analytics in higher education. Based on 36 studies, they concluded that, although many interventions are presented as promising, few provide empirical evidence of their impact. They also highlighted the lack of research exploring how analytics tools influence complex cognitive processes such as critical thinking, collaborative inquiry, or epistemic agency.
Regarding systematic reviews on learning analytics integrated into the knowledge building educational model, two key studies stand out. First, Zhu and Kim (2017) [13] reviewed assessment tools used in knowledge building environments, with particular attention to the meaningful integration of analytics in educational settings. The authors argue that these tools should be embedded within the knowledge construction process itself and possess a transformative character. Second, Apiola et al. (2022) [14] concluded that analytics applied to collaborative pedagogies such as knowledge building hold significant potential to support 21st century skills, including epistemic agency and design thinking.
Several studies have shown that analytics can enhance knowledge building when they are coherently integrated with two key areas of the model, organized according to knowledge building principles as follows:
  • Collaborative efforts. This category underscores knowledge construction as an inherently social and collaborative process. Learning analytics plays a crucial role by making group interactions visible, which fosters equitable participation, coordinated efforts, and collective responsibility for advancing shared understanding (Gutiérrez-Braojos et al., 2023) [4].
  • Idea improvement. This dimension highlights the continuous improvement of ideas and epistemic agency. It involves the ability to identify promising contributions, explore deeper explanations, and synthesize diverse perspectives—processes that enhance the epistemic quality of the community’s knowledge (Hong et al., 2015) [39].
Although Scardamalia and Bereiter (2006) [11] describe a wider set of knowledge building principles, this study focused on collaborative efforts and idea improvement. These two principles were chosen because they are closely connected to the research goals and to the role of learning analytics in supporting participation and the improvement of ideas.

1.4. The Current Review

This study provides an updated overview of how learning analytics are used in knowledge building contexts. It identifies which tools are employed and what types of analyses they enable. It also describes the conditions in which these tools are implemented and how they support reflective assessment. Finally, it examines how learning analytics help fulfil knowledge building principles, achieve learning objectives, and improve student performance. Specifically, this study aims to answer the following research questions:
O1: What learning analytics are used to conduct reflective assessments in knowledge building contexts?
O2: What types of analyses do these technologies enable to support the understanding and exploration of knowledge building principles?
O3: What conditions influence the effectiveness of these learning analytics?
O4: To what extent do the reviewed studies fulfil knowledge building principles, achieve their objectives, and improve student performance?

2. Method

To ensure transparency, rigor, and reproducibility, this study followed the PRISMA 2020 protocol (Preferred Reporting Items for Systematic Reviews and Meta-Analyses). This protocol provides a structured guideline for conducting systematic reviews, consisting of a 27-item checklist organized into sections ranging from the title to the discussion and supplementary materials (Sánchez-Serrano et al., 2022) [15]. In this study, the application of PRISMA made it possible to clearly define inclusion and exclusion criteria, carry out a systematic search in the Web of Science database, extract data in an organized manner, and evaluate the selected studies using consistent criteria. This helped minimize bias and ensure the validity of the results. The review process was conducted and supervised by two researchers. The inclusion criteria considered in this review are as follows:
  • Type of study. Empirical studies using quantitative, qualitative, or mixed methods approaches.
  • Pedagogical context. Implementation within knowledge building environments, with or without the use of Knowledge Forum.
  • Analytical technology. Use of analytics tools or technologies specifically linked to reflective assessments during knowledge building implementation.
  • Study objective. Focus on reflective assessment as an integral part of the knowledge building process.
  • Participants. Students at any educational level (primary, secondary, or higher education).
According to the exclusion criteria, studies were discarded if they were duplicates, theoretical works without direct empirical intervention, or if they used technologies without applying analytics techniques or technologies aimed at reflective assessment. In addition, studies that focused exclusively on grading or final outcomes, without any connection to knowledge building principles, were also excluded (see Figure 1).

2.1. Search Strategy and Inclusion Criteria

The literature search was conducted in the Web of Science database up to May 2025. Two separate searches were carried out, one month apart. To identify relevant studies, the following search terms were combined using Boolean operators: “knowledge building,” “learning analytic,” “analytic,” and “knowledge building analytic.” Additional terms such as “self-assessment,” “assessment,” and “reflective assessment” were also considered in the manual search. These keywords helped narrow the search to studies focused on knowledge building that included elements of reflective assessment and the use of analytics. No filters were applied regarding document type, language, or year of publication, to avoid excluding potentially relevant studies. Nevertheless, all retrieved documents were written in English. To ensure methodological consistency, only empirical studies that addressed the implementation of analytics to support reflective assessments in knowledge building educational contexts were included.
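As an illustration, the keyword combination could be expressed as a Web of Science topic query along the following lines; the field tag, grouping, and wildcard truncation are assumptions, since the paper reports only the keywords themselves:

```python
# Illustrative reconstruction of the Web of Science search; the TS= field tag,
# grouping, and truncation are assumptions based on the keywords listed above,
# not the authors' exact query string.
WOS_QUERY = (
    'TS=("knowledge building" AND '
    '("learning analytic*" OR analytic* OR "knowledge building analytic*"))'
)

# Terms additionally considered in the complementary manual search:
MANUAL_TERMS = ["self-assessment", "assessment", "reflective assessment"]
```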
According to the exclusion criteria, all duplicate studies were removed. In addition, studies that did not focus on the implementation of knowledge building environments based on the principles described by Scardamalia and Bereiter (2006) [11] were excluded. Studies that did not incorporate reflective assessment, understood as students engaging in reflection on their own learning, were also discarded. Another exclusion criterion was the absence of analytic tools for conducting reflective assessments, regardless of whether these tools were used by teachers, students, or both. Finally, non-empirical studies were excluded.

2.2. Data Extraction

Following the initial search in the Web of Science database, a total of 71 publications were retrieved. Duplicate records were removed. The study selection process was carried out in two successive phases. First, titles and abstracts were screened to identify studies that reported the implementation of a specific analytics technology and involved an empirical intervention. Studies meeting these criteria were selected for inclusion. In cases of uncertainty, the full text was reviewed to confirm whether the study met the established inclusion criteria. The selection process was conducted independently by two researchers. Any discrepancies were resolved through joint discussion until a consensus was reached. In the second phase, the full texts of the remaining studies were thoroughly reviewed to ensure that they aligned with the objectives of the review—specifically, that they presented empirical data on the implementation of analytics technologies applied to reflective assessment and/or knowledge building processes. As a result, 31 studies were ultimately selected that met all the inclusion criteria.
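The two-phase selection logic can be summarized as a simple filter pipeline. The sketch below is a minimal illustration with hypothetical record fields and predicates; the review itself reports only the resulting counts (71 records retrieved, 31 studies included):

```python
# Minimal sketch of the two-phase PRISMA screening described above; record
# fields and predicates are hypothetical, not the authors' actual codebook.

def passes_title_abstract_screen(record: dict) -> bool:
    """Phase 1: reports a specific analytics technology and an empirical intervention."""
    return record["uses_analytics_technology"] and record["empirical_intervention"]

def passes_full_text_screen(record: dict) -> bool:
    """Phase 2: presents empirical data on analytics applied to reflective
    assessment and/or knowledge building processes."""
    return record["reflective_assessment"] or record["knowledge_building_focus"]

def run_screening(records: list[dict]) -> list[dict]:
    unique = list({r["doi"]: r for r in records}.values())  # remove duplicates
    phase1 = [r for r in unique if passes_title_abstract_screen(r)]
    return [r for r in phase1 if passes_full_text_screen(r)]
```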
An Excel template specifically designed for this review was used to extract the data. Coding procedures were jointly established by two researchers to ensure consistency in data interpretation. The variables recorded for each study included author, year of publication, document type, analytics technologies used, type of analysis performed, knowledge building principles addressed, educational context, participant characteristics, group configuration, implementation time, timing of assessment, person responsible for assessment, achievement of the stated objectives, and evidence of improved academic performance. The extracted dataset was aligned with all the objectives defined for this study.
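As a sketch, the coding template can be represented as one typed record per included study; the field names mirror the variables listed above, while the value sets in the comments are assumptions about how the categories were coded:

```python
from dataclasses import dataclass

@dataclass
class StudyRecord:
    # One row of the (hypothetical) Excel coding template per included study.
    author: str
    year: int
    document_type: str                  # "Article", "Book Chapter", ...
    analytics_technologies: list[str]   # e.g., ["KBDeX", "KBAT"]
    analysis_types: list[str]
    kb_principles: list[str]            # "collaborative efforts", "idea improvement"
    educational_context: str            # "primary", "secondary", "higher"
    participants: int
    group_configuration: str            # "whole class", "small groups", "individual"
    implementation_weeks: float
    assessment_moment: str              # "formative", "summative", "both"
    assessment_agent: str               # "teacher", "students", "both"
    objectives_achieved: bool
    performance_improvement: bool
```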

3. Results

The results are organized according to the four specific objectives defined in this systematic review. First, the general characteristics of the included studies are described, with special attention to the learning analytics identified and their presence in the reviewed literature (O1). Next, the types of analyses enabled by these technologies are examined, as well as how they relate to understanding one or more knowledge building principles according to the two spheres (collaborative efforts and idea improvement) (O2). The third objective (O3) explores the pedagogical and contextual conditions in which these technologies are most effective, considering factors such as educational level, group dynamics, and assessment timing. Finally, the results show whether the studies report achieving their objectives and improving student performance (O4).
(O1) What learning analytics are used to conduct reflective assessments in knowledge building contexts?
The selected studies were analyzed using a bibliometric approach to explore the technologies employed, their evolution over time, and their academic impact. The following table presents the studies organized by year of publication, authorship, title, document type, intervention design, and number of citations across all indexed databases (see Table 1).
In line with the first specific objective of this review, thirteen learning analytics were identified in studies focused on the knowledge building approach. Several of these tools were specifically developed to support reflection, evaluation, and the continuous improvement of ideas within this framework. These technologies support the analysis of discourse, community engagement, and the visualization of conceptual progress, among other aspects. Their application contributes to the monitoring, assessment, and enhancement of collaborative knowledge building processes. The main tools identified are summarized in Table 2, along with a brief description of their functions and purposes in educational contexts.
The review identified the tools used and the frequency with which each technology appeared across the selected studies. According to the results, KBDeX stands out as the most widely used technology for analyzing discursive interactions and visualizing knowledge construction (see Figure 2). The Social Network Tool and Idea Thread Mapper (ITM) are also notable for their effectiveness in exploring collaborative networks among participants. Other tools—such as KCA, CiA, KBAT, and the ENA Web Tool—reflect an emerging interest in diversifying the ways in which reflection can be supported, although their use remains less frequent. Technologies like the Vocabulary Analyzer, Classroom Discourse Analyzer (CDA), and Semantic Overlap Tool appear only occasionally. This distribution reflects a clear preference for tools that provide discourse visualization, facilitating actionable feedback for both teachers and students. The diversity of tools highlights ongoing efforts to create reflective learning environments. In some cases, studies used multiple technologies in combination to support different aspects of the knowledge building process.
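The frequency distribution behind Figure 2 amounts to a simple tally over the coded studies. A minimal sketch with placeholder data rather than the review’s actual dataset:

```python
from collections import Counter

# Placeholder per-study tool lists; the real input would come from the coded
# study records, one list per included study.
tools_per_study = [
    ["KBDeX"], ["Social Network Tool", "Idea Thread Mapper"], ["KBDeX", "KCA"],
    ["KBAT"], ["ENA Web Tool"],
]

tool_counts = Counter(tool for study in tools_per_study for tool in study)
for tool, n in tool_counts.most_common():
    print(f"{tool}: used in {n} studies")
```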
The results also reveal a progressive evolution in the use of analytics technologies applied to the study of knowledge building between 2010 and 2025. In the early years, tools such as the Analytic Toolkit were predominant. However, beginning in 2011, the use of KBDeX increased significantly, establishing it as one of the most influential tools in the field, particularly after the publication of Oshima et al. (2012) [50], which has received 111 citations. Over time, the range of tools diversified, with the emergence of technologies such as KCA, ITM, ENA, and KBAT. Regarding the type of publication, peer-reviewed journal articles are the most common, although conference papers and book chapters are also represented. This suggests a broad and cross-disciplinary interest in the topic. In addition, many of the studies reviewed employed mixed methods designs, combining quantitative and qualitative approaches to evaluate the implementation and outcomes of analytics technologies. Finally, the impact analysis shows that some studies have achieved high visibility within the academic community, indicating that certain tools have not only been adopted but have also contributed significantly to research on reflective assessment and knowledge construction.
(O2) What types of analyses do these technologies enable to support the understanding and exploration of knowledge building principles?
The main learning analytics identified in this systematic review, along with the description of the tools and the types of analyses they enable, are summarized in Table 2. This classification highlights the methodological diversity and the analytical possibilities available to support reflective assessment in knowledge building contexts.
As shown in Table 2, most technologies allow researchers to analyze discourse interactions, map social and semantic structures, and track idea development. Tools like KBDeX, Social Network Tool, and ITM focus on visualizing participation and idea flow within knowledge communities. Others, such as ENA Web Tool and KCA, support more detailed semantic and conceptual mapping. KBAT stands out as a tool specifically designed to classify discourse moves and analyze participation patterns in knowledge building discussions. Additional tools, including Vocabulary Analyzer and the CDA, enable the examination of lexical features and critical discourse aspects. Overall, the variety of tools reflects a growing interest in combining structural, semantic, and reflective analyses to better understand and support knowledge building processes.
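To make the kind of analysis concrete: KBDeX builds networks linking discourse units to the vocabulary they share. KBDeX itself is a standalone application, so the following is only an illustrative sketch of that underlying idea, using toy notes rather than real Knowledge Forum content:

```python
import networkx as nx
from networkx.algorithms import bipartite

# Toy discourse data: each note is reduced to the content words it contains.
notes = {
    "note1": {"energy", "light", "plants"},
    "note2": {"light", "photosynthesis"},
    "note3": {"energy", "photosynthesis", "glucose"},
}

# Build the bipartite note-word network.
G = nx.Graph()
for note_id, words in notes.items():
    G.add_node(note_id, kind="note")
    for w in words:
        G.add_node(w, kind="word")
        G.add_edge(note_id, w)  # a note is linked to each word it contains

# Project onto the word layer: two words connect when they co-occur in a note.
# Centrality over this projection highlights pivotal ideas in the discourse.
word_nodes = [n for n, d in G.nodes(data=True) if d["kind"] == "word"]
word_net = bipartite.weighted_projected_graph(G, word_nodes)
print(sorted(nx.degree_centrality(word_net).items(), key=lambda kv: -kv[1]))
```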
On the other hand, Table 3 presents a checklist of the empirical studies included in this systematic review. For each publication, the table indicates the analytical technology used and marks whether it addresses any of the two main knowledge building categories. The categories defined previously and coherent with knowledge building principles are collaborative efforts and idea improvement.
Table 3 contributes to the analysis by synthesizing how different learning analytics tools align with the following knowledge building principles: collaborative efforts and idea improvement. This categorization highlights the extent to which the reviewed technologies support collective knowledge advancement, either by fostering participation and coordination among learners, by promoting continuous idea improvement, or by combining both dimensions. The table makes it easier to identify patterns across studies and provides a clearer basis for discussing the contribution of learning analytics to reflective assessment in knowledge building contexts. The results show that most studies employ technologies capable of supporting multiple principles simultaneously. Tools like KBDeX and KBAT are commonly used to examine both collaborative interactions and the progressive improvement of ideas. Some technologies, such as ENA Web Tool, are more focused on discourse structure and semantic mapping, and, therefore, may not directly address collaborative efforts in some cases. Overall, the results highlight the predominance of multi-dimensional analytical approaches in recent empirical studies.
(O3) What conditions influence the effectiveness of these learning analytics?
This objective aims to explore the conditions under which learning analytics are most effective in supporting reflective learning and assessment. The conditions identified in the selected empirical studies are outlined below as follows:
  • Academic course. The educational level or stage at which the analytics technology is implemented.
  • Participants. The total number of participants involved in the study or educational experience.
  • Participant setup. The organization of participants during the intervention, including large groups (whole class), small groups (subdivided), or individual work.
  • Platform used. The platforms used to record community contributions, from which the data for analytics are extracted.
  • Implementation time. The duration of the implementation during the educational intervention.
  • Assessment moment. The phase of the educational process when assessment occurs (initial, formative, or summative).
  • Learning analytic assessment agent. The agent responsible for conducting the assessment using analytics technologies, whether students, teachers, or both.
A summary table is presented below (see Table 4), showing the results of the coding applied to the studies included in the review.
The reviewed studies show that learning analytics are implemented at different educational levels. Their use is most common in primary education (twelve studies), followed by higher education (eleven studies) and, to a lesser extent, secondary education (eight studies). Regarding participants, the number varies considerably, with an average of approximately 59 participants per study (a range of 6 to 353). There are experiences involving large groups, small groups, and even individual work. Some studies (15 studies) combine more than one type of grouping within the same intervention. In relation to the platforms used, Knowledge Forum is the most common platform (27 studies), combined with analytical tools, as it provides the data needed to carry out reflective assessment processes. Only one other platform was recorded (WeChat), and three studies did not specify the platform used. The duration of the implementation also shows wide variation. While most studies last between 8 and 16 weeks, the actual duration ranges from 2 weeks to 1 year, with a mean duration of around 15 weeks. There are both short-term experiences and long-term studies covering almost an entire academic year. Regarding the assessment moment, most studies apply learning analytics for formative purposes (eight studies) or combine formative and summative assessment (nineteen studies). Only a small number (four studies) use them exclusively for summative evaluation. Finally, concerning who conducts reflective assessments with these tools, most studies involve both teachers and students (25 studies). However, only a few studies (six studies) report teachers as the main users of the analytic technology. Additionally, some studies include student self-assessment as part of the reflective assessment process.
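The descriptive summaries above reduce to simple statistics over the coded records. A minimal sketch with placeholder values in place of the actual per-study data:

```python
from statistics import mean

# Placeholder per-study values; the real input would be read from the coding
# template (see the StudyRecord sketch in Section 2.2).
participants = [6, 24, 59, 120, 353]   # sample size per study
weeks = [2, 8, 12, 16, 52]             # implementation duration per study

print(f"participants: mean = {mean(participants):.0f}, "
      f"range = {min(participants)}-{max(participants)}")
print(f"duration: mean = {mean(weeks):.0f} weeks, "
      f"range = {min(weeks)}-{max(weeks)} weeks")
```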
The learning sequences in which these technologies are integrated usually include stages such as posing initial questions, producing notes, collective review and improvement, using rubrics or reflective templates, and a final reflection. In some cases, they also involve final products such as knowledge syntheses, concept maps, or shared artifacts.
(O4) To what extent do the reviewed studies fulfil knowledge building principles, achieve their objectives, and improve student performance?
According to the reviewed studies, all interventions (100%) explicitly indicate that their objectives were successfully met, showing that the use of learning analytics supports planned learning goals. However, only about 39% of the studies (roughly 12 of the 31) explicitly report measurable improvements in student performance. Evidence of improved performance was recorded only when academic achievement was explicitly measured with an evaluation instrument or when the study reported an increase in final course grades. This does not mean that students in the remaining studies failed to improve; given that all studies reported achieving their objectives, overall performance can be presumed positive even where it was not formally measured. In short, while these technologies facilitate reflective assessment and goal achievement, direct evidence of performance gains is documented far less frequently.

4. Discussion

In relation to O1 (What learning analytics are used in reflective assessments?), this systematic review demonstrates the growing interest in using learning analytics within knowledge building contexts. These tools have evolved beyond simple monitoring to serve as resources for reflective assessment, fostering greater collective awareness and self-regulation during the knowledge building process. Across the studies analyzed, a wide variety of tools is evident, with established technologies such as Social Network Tool, KBDeX, and KCA remaining popular, alongside newer developments, like KBAT and ENA Web Tool, reflecting technical and pedagogical progress in their implementation. In line with Zhu and Kim (2017) [13], the findings confirm that many of these tools help summarize information, reveal participant relationships, and analyze the content of discourse within platforms such as Knowledge Forum. For example, KBDeX is widely used for discourse analysis and visualization of semantic networks (Oshima et al., 2012) [50]. In comparison, newer tools like KBAT focus on assessing the epistemic quality of contributions through dashboards that provide real-time feedback aligned with knowledge building principles, supporting both self-assessment and peer assessment by students.
Regarding O3 (What conditions influence the effectiveness of these learning analytics?), the effectiveness of these technologies also depends on thoughtful pedagogical design and appropriate learning conditions. The studies cover a wide range of educational contexts and levels, demonstrating their flexibility but also highlighting the need to adapt tool complexity to the learners’ stage. For younger or less experienced students, more accessible and visual tools help foster autonomy and reflection. As highlighted by Park and Zhang (2022) [51], learning analytics tools for knowledge building should help users understand how ideas evolve, how they connect, and what new questions arise—fostering deeper engagement. Small-group configurations are particularly beneficial for closer teacher support and more equitable discourse monitoring. Knowledge Forum remains the most widely used environment due to its compatibility with analytical tools that support reflective assessment.
With respect to O2 (What types of analyses do these technologies enable to support knowledge building principles?), at a pedagogical level, the analytical tools reviewed contribute directly to the two main knowledge building dimensions identified in this study: collaborative efforts and idea improvement. Regarding collaborative efforts, tools such as Idea Thread Mapper help students organize and trace the development of ideas, encouraging perspective integration and richer community discourse (Zhang & Chen, 2019) [42]. Participation visualizations, like those provided by KBAT, also promote equal involvement by making disparities visible and encouraging more balanced interaction among group members (Oshima et al., 2012) [50]. Interpreting analytics results allows students to monitor their own participation and adjust their contributions autonomously, reinforcing self-regulation and shared ownership (Yang et al., 2024) [52]. At the group level, collective contribution indicators strengthen the sense of responsibility for community progress and reduce over-reliance on the teacher (Gutiérrez-Braojos et al., 2024) [8]. In terms of idea improvement, KBDeX enables the identification and refinement of preliminary contributions (Ong et al., 2023) [3] and supports students in synthesizing ideas into more complex theories, thus fostering deeper understanding and idea advancement (Zhang et al., 2018) [53]. Other tools, such as ENA Web Tool or KBAT, provide feedback on the epistemic quality of contributions, which guides students to continuously improve their ideas (Yang et al., 2024) [30]. Additionally, some analytical tools also enable students to self-assess by making it clear how their grades are derived, which enhances transparency and empowers them to take an active role in their own evaluation process (Gutiérrez-Braojos et al., in press) [54]. Interactive dashboards also support ongoing reflection and peer coordination, encouraging leadership roles to emerge naturally within the community (Yuan et al., 2022) [55].
Finally, in relation to O4 (To what extent do the reviewed studies fulfil knowledge building principles and improve student performance?), the review shows that most studies align with the principles of knowledge building and report positive outcomes in participation, collaboration, and epistemic quality, accomplishing their study objectives. However, only a minority of studies provide clear evidence of measurable performance improvement, which highlights an open challenge and a direction for future research.

5. Conclusions

In conclusion, learning analytics represents a valuable tool for enhancing reflective assessment in knowledge building, offering promising pathways to support students’ deeper engagement with their own learning processes. Nevertheless, this study has certain limitations, most notably the restriction to empirical studies published in Web of Science that explicitly addressed both reflective assessment and learning analytics. Future research could benefit from expanding the scope to include additional databases, which may reveal a broader range of approaches and evidence. Finally, the findings underline the need for further studies on measurable learning gains and open new directions for innovation, such as the integration of artificial intelligence, to enable more adaptive and personalized forms of reflective assessment (Oshima & Shaffer, 2021) [2].
A limitation of this review is that the search was conducted exclusively in Web of Science. Although this database offers wide coverage of high-impact journals in education and technology, relying on a single source may introduce bias and increase the risk of omitting relevant studies. Previous research has shown that systematic reviews achieve greater recall and reliability when multiple databases are combined (Bramer et al., 2017; Harari et al., 2020) [56,57]. Another limitation concerns the lack of measurable results on academic performance, as only a small number of studies reported explicit improvements in student achievement following the interventions. This indicates the need for more empirical studies that systematically examine performance outcomes alongside reflective assessment processes. Finally, further research not covered in this review but emerging as promising future directions includes the analysis of tool usability across different educational levels and the integration of learning analytics with emerging AI-based reflective assessment.

Author Contributions

Conceptualization, P.R.-C., C.G.-B., M.M.-G. and C.R.-D.; methodology, P.R.-C. and C.G.-B.; software, P.R.-C., C.G.-B., M.M.-G. and C.R.-D.; validation, C.G.-B. and C.R.-D.; formal analysis, P.R.-C. and C.G.-B.; investigation, P.R.-C. and C.G.-B.; resources, P.R.-C. and C.G.-B.; data curation, P.R.-C. and C.G.-B.; writing—original draft preparation, P.R.-C., C.G.-B., M.M.-G. and C.R.-D.; writing—review and editing, P.R.-C., C.G.-B., M.M.-G. and C.R.-D.; visualization, P.R.-C. and C.G.-B.; supervision, P.R.-C. and C.G.-B.; project administration, C.G.-B. and C.R.-D.; funding acquisition, C.G.-B. and C.R.-D. All authors have read and agreed to the published version of the manuscript.

Funding

This publication is part of a project (PID 2020-116872-RA-I00) that is financed/supported by MCIN/AEI/10.13039/501100011033/. This study (project) involving human participants was reviewed and approved by the Research Ethics Committee of the University of Granada.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zheng, L.; Niu, J.; Zhong, L. Effects of a Learning Analytics-Based Real-Time Feedback Approach on Knowledge Elaboration, Knowledge Convergence, Interactive Relationships and Group Performance in CSCL. Br. J. Educ. Technol. 2022, 53, 130–149. [Google Scholar] [CrossRef]
  2. Oshima, J.; Shaffer, D.W. Learning Analytics for a New Epistemological Perspective of Learning. Inf. Technol. Educ. Learn. 2021, 1, Inv-p003. [Google Scholar] [CrossRef]
  3. Ong, E.T.; Lim, L.H.; Chan, C.K.K. Collective Reflective Assessment for Knowledge Building. Comput. Educ. 2023, 190, 104639. [Google Scholar]
  4. Gutiérrez-Braojos, C.; Rodríguez-Domínguez, C.; Daniela, L.; Carranza-García, F. An Analytical Dashboard of Collaborative Activities for the Knowledge Building. Technol. Knowl. Learn. 2023, 1, 1–27. [Google Scholar] [CrossRef]
  5. Stoyanov, S.; Kirschner, P.A. Text Analytics for Uncovering Untapped Ideas at the Intersection of Learning Design and Learning Analytics: Critical Interpretative Synthesis. J. Comput. Assist. Learn. 2023, 39, 899–920. [Google Scholar] [CrossRef]
  6. Mangaroska, K.; Sharma, K.; Gaševic, D.; Giannakos, M. Multimodal Learning Analytics to Inform Learning Design: Lessons Learned from Computing Education. J. Learn. Anal. 2020, 7, 79–97. [Google Scholar] [CrossRef]
  7. Stahl, G. A Model of Collaborative Knowledge-Building. In Proceedings of the Fourth International Conference of the Learning Sciences; Fishman, B., O’Connor-Divelbiss, S., Eds.; Erlbaum: Mahwah, NJ, USA, 2000; pp. 70–77. [Google Scholar]
  8. Gutiérrez-Braojos, C.; Rodríguez-Chirino, P.; Vico, B.P.; Fernández, S.R. Teacher Scaffolding for Knowledge Building in the Educational Research Classroom. RIED-Rev. Iberoam. Educ. Distancia 2024, 27, 1–25. [Google Scholar] [CrossRef]
  9. Scardamalia, M. Collective Cognitive Responsibility for the Advancement of Knowledge. In Liberal Education in a Knowledge Society; Smith, B., Ed.; Open Court: Chicago, IL, USA, 2002; pp. 67–98. [Google Scholar]
  10. Cress, U.; Stahl, G.; Ludvigsen, S.; Law, N. The Core Features of CSCL: Social Situation, Collaborative Knowledge Processes and Their Design. Int. J. Comput. Support. Collab. Learn. 2015, 10, 109–116. [Google Scholar] [CrossRef]
  11. Scardamalia, M.; Bereiter, C. Knowledge Building. In The Cambridge Handbook of the Learning Sciences; Sawyer, R.K., Ed.; Cambridge University Press: Cambridge, UK, 2006; pp. 97–118. [Google Scholar]
  12. Scardamalia, M. CSILE/Knowledge Forum®. In Encyclopedia of Education and Technology; Guthrie, J.W., Ed.; ABC-CLIO: Santa Barbara, CA, USA, 2004; pp. 183–192. [Google Scholar]
  13. Zhu, G.; Kim, M.S. A Review of Assessment Tools of Knowledge Building towards the Norm of Embedded and Transformative Assessment. In Proceedings of the Knowledge Building Summer Institute, Philadelphia, PA, USA, 18–22 June 2017. [Google Scholar]
  14. Apiola, M.V.; Lipponen, S.; Seitamaa, A.; Korhonen, T.; Hakkarainen, K. Learning Analytics for Knowledge Creation and Inventing in K-12: A Systematic Review. In Proceedings of the Science and Information Conference, Cham, Switzerland, 14–16 July 2022; Springer: Cham, Switzerland, 2022; pp. 238–257. [Google Scholar]
  15. Sánchez-Serrano, S.; Navarro, I.P.; González, M.D. ¿Cómo Hacer una Revisión Sistemática Siguiendo el Protocolo PRISMA?: Usos y Estrategias Fundamentales para su Aplicación en el Ámbito Educativo a Través de un Caso Práctico. Bordón Rev. Pedagog. 2022, 74, 51–66. [Google Scholar] [CrossRef]
  16. Scardamalia, M.; Bereiter, C. Knowledge Building: Advancing the State of Community Knowledge. In International Handbook of Computer-Supported Collaborative Learning; Springer: Cham, Switzerland, 2021; pp. 261–279. [Google Scholar]
  17. Scardamalia, M.; Bereiter, C. Smart Technology for Self-Organizing Processes. Smart Learn. Environ. 2014, 1, 1. [Google Scholar] [CrossRef]
  18. Scardamalia, M.; Bereiter, C. Computer Support for Knowledge-Building Communities. J. Learn. Sci. 1994, 3, 265–283. [Google Scholar] [CrossRef]
  19. Gutiérrez-Braojos, C.; Montejo-Gámez, J.; Ma, L.; Chen, B.; Muñoz de Escalona-Fernández, M.; Scardamalia, M.; Bereiter, C. Exploring Collective Cognitive Responsibility through the Emergence and Flow of Forms of Engagement in a Knowledge Building Community. In Didactics of Smart Pedagogy; Springer: Cham, Switzerland, 2019; pp. 213–232. [Google Scholar]
  20. Zhang, J.; Scardamalia, M.; Reeve, R.; Messina, R. Designs for Collective Cognitive Responsibility in Knowledge-Building Communities. J. Learn. Sci. 2009, 18, 7–44. [Google Scholar] [CrossRef]
  21. Ma, L.; Scardamalia, M. Teachers as Designers in Knowledge Building Innovation Networks. In The Learning Sciences in Conversation; Shanahan, M.C., Kim, B., Takeuchi, M.A., Koh, K., Preciado-Babb, A.P., Sengupta, P., Eds.; Routledge: New York, NY, USA, 2022; pp. 107–120. [Google Scholar]
  22. Chen, B. Fostering Scientific Understanding and Epistemic Beliefs through Judgments of Promisingness. Educ. Technol. Res. Dev. 2017, 65, 255–277. [Google Scholar] [CrossRef]
  23. Cacciamani, S.; Perrucci, V.; Fujita, N. Promoting Students’ Collective Cognitive Responsibility through Concurrent, Embedded and Transformative Assessment in Blended Higher Education Courses. Technol. Knowl. Learn. 2021, 26, 1169–1194. [Google Scholar] [CrossRef]
  24. Chen, B.; Hong, H.Y. Schools as Knowledge-Building Organizations: Thirty Years of Design Research. Educ. Psychol. 2016, 51, 266–288. [Google Scholar] [CrossRef]
  25. Gutiérrez-Braojos, C.; Montejo-Gámez, J.; Poza Vilches, F.; Marín-Jiménez, A. Evaluation of Research on the Knowledge Building Pedagogy: A Mixed Methodological Approach. RELIEVE Rev. Electrón. Investig. Eval. Educ. 2020, 26, 1–22. [Google Scholar]
  26. Yang, Y.; Van Aalst, J.; Chan, C.K.K. Collective Reflective Assessment for Shared Epistemic Agency by Undergraduates in Knowledge Building. Br. J. Educ. Technol. 2020, 51, 423–437. [Google Scholar] [CrossRef]
  27. Lei, C.; Chan, C.K. Developing Metadiscourse through Reflective Assessment in Knowledge Building Environments. Comput. Educ. 2018, 126, 153–169. [Google Scholar] [CrossRef]
  28. Tan, S.C.; Chan, C.; Bielaczyc, K.; Ma, L.; Scardamalia, M.; Bereiter, C. Knowledge Building: Aligning Education with Needs for Knowledge Creation in the Digital Age. Educ. Technol. Res. Dev. 2021, 69, 2243–2266. [Google Scholar] [CrossRef]
  29. Van Aalst, J.; Chan, C.K. Student-Directed Assessment of Knowledge Building Using Electronic Portfolios. J. Learn. Sci. 2007, 16, 175–220. [Google Scholar] [CrossRef]
  30. Yang, Y.; Chan, C.K.; Zhu, G.; Tong, Y.; Sun, D. Reflective Assessment Using Analytics and Artifacts for Scaffolding Knowledge Building Competencies among Undergraduate Students. Int. J. Comput. Support. Collab. Learn. 2024, 19, 231–272. [Google Scholar] [CrossRef]
  31. Yang, Y.; Yuan, K.C.; Zhang, J. Analytics-Supported Reflective Assessment for Fostering Collective Knowledge Advancement. Int. J. Comput. Support. Collab. Learn. 2022, 17, 375–400. [Google Scholar]
  32. Chan, C.K.; Van Aalst, J. Learning, Assessment and Collaboration in Computer-Supported Environments. In What We Know About CSCL: And Implementing It in Higher Education; Strijbos, J.W., Kirschner, P.A., Martens, R.L., Eds.; Springer: Dordrecht, The Netherlands, 2004; pp. 87–112. [Google Scholar]
  33. Yan, Z.; Zhang, K.C.; Brown, G.T.L.; Panadero, E. Students’ Perceptions of Self-Assessment: A Systematic Review of Empirical Studies. Educ. Psychol. Rev. 2023, 36, 35–81. [Google Scholar]
  34. Bereiter, C.; Scardamalia, M. Knowledge Building and Knowledge Creation: One Concept, Two Hills to Climb. In Knowledge Creation in Education; Tan, S.C., So, H.J., Yeo, J., Eds.; Springer: Singapore, 2014; pp. 35–52. [Google Scholar]
  35. Avella, J.T.; Kebritchi, M.; Nunn, S.G.; Kanai, T. Learning Analytics Methods, Benefits, and Challenges in Higher Education: A Systematic Literature Review. J. Asynch. Learn. Netw. 2016, 20, 13–35. [Google Scholar]
  36. Sergis, S.; Sampson, D.G. Teaching and Learning Analytics to Support Teacher Inquiry: A Systematic Literature Review. Learn. Res. Pract. 2016, 2, 42–64. [Google Scholar]
  37. Oliva-Córdova, V.; Martínez-Abad, F.; Rodríguez-Conde, M.J. Learning Analytics to Support Teaching Skills: A Systematic Literature Review. Int. J. Educ. Technol. High. Educ. 2021, 18, 58351–58363. [Google Scholar] [CrossRef]
  38. Sonderlund, A.L.; Hughes, E.; Smith, J. The Efficacy of Learning Analytics Interventions in Higher Education: A Systematic Review. Br. J. Educ. Technol. 2018, 50, 2594–2618. [Google Scholar] [CrossRef]
  39. Hong, H.Y.; Scardamalia, M.; Messina, R.; Teo, C.L. Fostering Sustained Idea Improvement with Principle-Based Knowledge Building Analytic Tools. Comput. Educ. 2015, 87, 227–240. [Google Scholar] [CrossRef]
  40. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. Declaración PRISMA 2020: Una Guía Actualizada para la Publicación de Revisiones Sistemáticas. Rev. Esp. Cardiol. 2021, 74, 790–799. [Google Scholar] [CrossRef]
  41. Matsuzawa, Y.; Oshima, J.; Oshima, R.; Niihara, Y.; Sakai, S. KBDeX: A Platform for Exploring Discourse in Collaborative Learning. Procedia-Soc. Behav. Sci. 2011, 26, 198–207. [Google Scholar] [CrossRef]
  42. Zhang, J.; Chen, M.H. Idea Thread Mapper: Designs for Sustaining Student-Driven Knowledge Building across Classrooms. In Proceedings of the 13th International Conference on Computer Supported Collaborative Learning (CSCL 2019), École Normale Supérieure de Lyon, France, 17–21 June 2019; Lund, K., Niccolai, G., Lavoué, E., Hmelo-Silver, C., Gweon, G., Baker, M., Eds.; International Society of the Learning Sciences: Lyon, France, 2019; pp. 144–151. [Google Scholar]
  43. Shaffer, D.W.; Collier, W.; Ruis, A.R. A Tutorial on Epistemic Network Analysis: Analyzing the Structure of Connections in Cognitive, Social, and Interaction Data. J. Learn. Anal. 2016, 3, 9–45. [Google Scholar] [CrossRef]
  44. Van Aalst, J.; Mu, J.; Yang, Y. Formative Assessment of Computer-Supported Collaborative Learning and Knowledge Building. In Measuring and Visualizing Learning in the Information-Rich Classroom; Shum, S.B., Ferguson, R., Martínez-Maldonado, R., Eds.; Routledge: London, UK, 2015; pp. 154–166. [Google Scholar]
  45. Gutiérrez-Braojos, C.; Rodríguez-Domínguez, C.; Daniela, L.; Rodríguez-Chirino, P. Evaluating a New Knowledge Building Analytics Tool (KBAT). In Proceedings of the 17th International Conference on Computer-Supported Collaborative Learning (CSCL 2024), Buffalo, NY, USA, 10–14 June 2024; International Society of the Learning Sciences: Bodo, Norway, 2024; pp. 413–414. [Google Scholar]
  46. Tan, S.C.; Lee, A.V.Y.; Lee, M. Ideations for AI-Supported Knowledge Building. In Proceedings of the Knowledge Building Summer Institute, Wageningen, The Netherlands, 15–19 August 2022. [Google Scholar]
  47. Chen, B.; Scardamalia, M.; Bereiter, C. Advancing Knowledge-Building Discourse through Judgments of Promising Ideas. Int. J. Comput. Support. Collab. Learn. 2015, 10, 345–366. [Google Scholar] [CrossRef]
  48. Gutiérrez-Braojos, C.; Daniela, L.; Montejo-Gámez, J.; Aliaga, F. Developing and Comparing Indices to Evaluate Community Knowledge Building in an Educational Research Course. Sustainability 2022, 14, 10603. [Google Scholar] [CrossRef]
  49. Burtis, J. Analytic Toolkit for Knowledge Forum; Centre for Applied Cognitive Science, Ontario Institute for Studies in Education, University of Toronto: Toronto, ON, Canada, 1998. [Google Scholar]
  50. Oshima, J.; Oshima, R.; Matsuzawa, Y. Knowledge Building Discourse Explorer: A Social Network Analysis Application for Knowledge Building Discourse. Educ. Technol. Res. Dev. 2012, 60, 903–921. [Google Scholar] [CrossRef]
  51. Park, H.; Zhang, J. Learning Analytics for Teacher Noticing and Scaffolding: Facilitating Knowledge Building Progress in Science. In Proceedings of the 15th International Conference on Computer-Supported Collaborative Learning (CSCL 2022), Hiroshima, Japan, 6–10 June 2022; International Society of the Learning Sciences: Hiroshima, Japan, 2022; pp. 147–154. [Google Scholar]
  52. Yang, Y.; Feng, X.; Zhu, G.; Xie, K. Effects and Mechanisms of Analytics-Assisted Reflective Assessment in Fostering Undergraduates’ Collective Epistemic Agency in Computer-Supported Collaborative Inquiry. J. Comput. Assist. Learn. 2024, 40, 1098–1122. [Google Scholar] [CrossRef]
  53. Zhang, J.W.; Hong, H.Y.; Scardamalia, M.; Teo, C.; Morley, E. Sustaining Knowledge Building as a Principle-Based Innovation at an Elementary School. Int. J. Comput. Support. Collab. Learn. 2018, 13, 361–384. [Google Scholar] [CrossRef]
  54. Gutiérrez-Braojos, C.; Daniela, L.; Rodríguez, C.; Berrocal, E. Resignifying Assessment through Knowledge Building: A Grounded Theory on Students’ Perceived Value of Reflective Evaluation with KBAT and AI. In press.
  55. Yuan, G.; Zhang, J.; Chen, M.H. Cross-Community Knowledge Building with Idea Thread Mapper. Int. J. Comput. Support. Collab. Learn. 2022, 17, 293–326. [Google Scholar] [CrossRef]
  56. Bramer, W.M.; Rethlefsen, M.L.; Kleijnen, J.; Franco, O.H. Optimal database combinations for literature searches in systematic reviews: A prospective exploratory study. Syst. Rev. 2017, 6, 245. [Google Scholar] [CrossRef]
  57. Harari, M.B.; Parola, H.R.; Hartwell, C.J.; Riegelman, A. Literature searches in systematic reviews and meta-analyses: A review, evaluation, and recommendations. J. Vocat. Behav. 2020, 118, 103377. [Google Scholar] [CrossRef]
Figure 1. PRISMA 2020 flow diagram of the study selection process. Prepared by the authors based on the flow diagram from Page et al. (2021) [40].
Figure 2. Frequency analysis of learning analytics used in knowledge building studies (prepared by the authors).
Table 1. Publications and citations on knowledge building and learning analytics studies.
No. | Year | Authors | Title | Study Design | Type of Document | Times Cited (All Databases)
1 | 2010 | Erkunt, H | Emergence of Epistemic Agency in College Level Educational Technology Course for Pre-Service Teachers Engaged in CSCL | Mixed | Article | 10
2 | 2011 | Chan, CKK; Chan, YY | Students’ views of collaboration and online participation in Knowledge Forum | Mixed | Article | 87
3 | 2012 | Oshima, Jun; Oshima, Ritsuko; Matsuzawa, Yoshiaki | Knowledge Building Discourse Explorer: a social network analysis application for knowledge building discourse | Mixed | Article | 111
4 | 2015 | Chen, B; Scardamalia, Marlene; Bereiter, C | Advancing knowledge building discourse through judgments of promising ideas | Mixed | Article | 77
5 | 2015 | Hong, HY; Scardamalia, M; Messina, R; Teo, CL | Fostering sustained idea improvement with principle-based knowledge building analytic tools | Mixed | Article | 66
6 | 2016 | Yang, Yuqin; van Aalst, Jan; Chan, Carol K. K.; Tian, Wen | Reflective assessment in knowledge building by students with low academic achievement | Mixed | Article | 45
7 | 2017 | Chen, Bodong | Fostering scientific understanding and epistemic beliefs through judgments of promisingness | Mixed | Article | 29
8 | 2018 | Zhang, JW; Tao, D; Chen, MH; Sun, YQ; Judson, D; Naqvi, S | Co-Organizing the Collective Journey of Inquiry with Idea Thread Mapper | Mixed | Article | 74
9 | 2019 | Yang, Yuqin | Reflective assessment for epistemic agency of academically low-achieving students | Mixed | Article | 6
10 | 2019 | Khanlari, Ahmad; Zhu, Gaoxia; Scardamalia, Marlene | Knowledge Building Analytics to Explore Crossing Disciplinary and Grade-Level Boundaries | Mixed | Article | 4
11 | 2020 | Yang, YQ; Chen, QQ; Yu, YW; Feng, XQ; van Aalst, J | Collective reflective assessment for shared epistemic agency by undergraduates in knowledge building | Mixed | Article | 33
12 | 2020 | Hod, Y; Katz, S; Eagan, B | Refining qualitative ethnographies using Epistemic Network Analysis: A study of socioemotional learning dimensions in a Humanistic Knowledge Building Community | Mixed | Article | 24
13 | 2021 | Yang, Yuqin; van Aalst, Jan; Chan, Carol | Examining Online Discourse Using the Knowledge Connection Analyzer Framework and Collaborative Tools in Knowledge Building | Mixed | Article | 7
14 | 2021 | Ong, Aloysius; Teo, Chew Lee; Tan, Samuel; Kim, Mi Song | A knowledge building approach to primary science collaborative inquiry supported by learning analytics | Qualitative | Article | 32
15 | 2021 | Tao, D; Zhang, JW | Agency to transform: how did a grade 5 community co-configure dynamic knowledge building practices in a yearlong science inquiry? | Qualitative | Article | 20
16 | 2021 | Gutiérrez-Braojos, Calixto; Rodriguez-Dominguez, Carlos; Carranza-Garcia, Francisco; Navarro-Garulo, Gabriel | Computer-supported knowledge building community: A new learning analytics tool | Mixed | Book Chapter | 0
17 | 2022 | Yang, Yuqin; Zhu, Gaoxia; Sun, Daner; Chan, Carol K. K. | Collaborative analytics-supported reflective Assessment for Scaffolding Pre-service Teachers’ collaborative Inquiry and Knowledge Building | Mixed | Article | 0
18 | 2022 | Yuan, GJ; Zhang, JW; Chen, MH | Cross-community knowledge building with idea thread mapper | Mixed | Article | 12
19 | 2022 | Yang, YQ; Yuan, KC; Feng, XQ; Li, XH; van Aalst, J | Fostering low-achieving students’ productive disciplinary engagement through knowledge building inquiry and reflective assessment | Mixed | Article | 14
20 | 2023 | Ong, Aloysius; Teo, Chew Lee; Lee, Alwyn Vwen Yen; Yuan, Guangji | Epistemic Network Analysis to assess collaborative engagement in Knowledge Building discourse | Mixed | Proceedings Paper | 8
21 | 2023 | Gutiérrez-Braojos, C.; Rodriguez-Dominguez, C.; Daniela, L.; Carranza-Garcia, F. | An Analytical Dashboard of Collaborative Activities for the Knowledge Building | Mixed | Article | 5
22 | 2023 | Yang, Yuqin; Zheng, Zhizi; Zhu, Gaoxia; Salas-Pilco, Sdenka Zobeida | Analytics-supported reflective assessment for 6th graders’ knowledge building and data science practices: An exploratory study | Mixed | Article | 3
23 | 2023 | Jiang, JP; Xie, WL; Wang, SY; Zhang, YB; Gao, J | Assessing team creativity with multi-attribute group decision-making in a knowledge building community: A design-based research | Mixed | Article | 1
24 | 2023 | Chai, SM; Oon, EPT; Chai, Y; Li, ZK | Examining the role of metadiscourse in collaborative knowledge building community | Mixed | Article; Early Access | 2
25 | 2024 | Yang, Yuqin; Chen, Yewen; Feng, Xueqi; Sun, Daner; Pang, Shiyan | Investigating the mechanisms of analytics-supported reflective assessment for fostering collective knowledge | Mixed | Article | 2
26 | 2024 | Yang, Yuqin; Chan, Carol K. K.; Zhu, Gaoxia; Tong, Yuyao; Sun, Daner | Reflective assessment using analytics and artifacts for scaffolding knowledge building competencies among undergraduate students | Mixed | Article | 3
27 | 2024 | Yang, Yuqin; Feng, Xueqi; Zhu, Gaoxia; Xie, Kui | Effects and mechanisms of analytics-assisted reflective assessment in fostering undergraduates’ collective epistemic agency in computer-supported collaborative inquiry | Mixed | Article | 1
28 | 2024 | Yu, Yawen; Tao, Yang; Chen, Gaowei; Sun, Can | Using learning analytics to enhance college students’ shared epistemic agency in mobile instant messaging: A new way to support deep discussion | Mixed | Article | 2
29 | 2024 | Gutiérrez-Braojos, Calixto; Rodríguez-Chirino, Paula; Vico, Beatriz Pedrosa; Fernández, Sonia Rodriguez | Teacher scaffolding for knowledge building in the educational research classroom | Mixed | Article | 2
30 | 2024 | Gutiérrez-Braojos, C; Rodríguez-Domínguez, C; Daniela, L; Rodríguez-Chirino, P | Evaluating a New Knowledge Building Analytics Tool (KBAT) | Mixed | Proceedings Paper | 0
31 | 2025 | Tong, YY; Chen, GW; Jong, MSY | Video-based analytics-supported formative feedback for enhancing low-achieving students’ conception of collaboration and classroom discourse engagement | Mixed | Article | 0
Source: prepared by the authors based on data collected in WoS.
Table 2. Types of analyses used in knowledge building studies that incorporate learning analytics for reflective assessments.
Analytical Technology | Description | Types of Analysis Enabled
KBDeX | KBDeX is a discourse analysis tool designed to explore collaborative learning processes by visualizing network structures that link participants, discourse units, and key terms. Based on a bipartite graph model, it allows researchers to investigate how ideas and interactions develop within a community of learners (Matsuzawa et al., 2011) [41]. | Discourse visualization, network structure analysis, idea flow tracking.
Social Network Tool | A tool used to analyze group dynamics and community members’ interactivity. It helps identify who is collaborating with whom based on responses, links, references, or annotations and highlights members working in isolation due to a lack of interaction. It also shows how many notes each participant builds on, connects to, references, or otherwise interacts with, providing a clear overview of collaboration patterns within the community (Hong et al., 2015) [39]. | Social network analysis, participation mapping, community structure.
ITM | Idea Thread Mapper (ITM) is a research-based digital platform that enables students to collaboratively engage in sustained, self-directed knowledge building. It supports learners in visualizing and organizing idea threads over time, fostering metacognitive reflection and collective inquiry within and across classrooms. The platform also includes shared community spaces, such as “Super Talk,” designed to promote cross-group interaction and comparison of idea development trajectories (Zhang & Chen, 2019) [42]. | Content sequence analysis, idea development, cross-thread connections.
ENA Web Tool | The ENA Web Tool is an online platform designed to support researchers in modeling and analyzing discourse data through the principles of Epistemic Network Analysis. The tool allows users to upload coded datasets, construct networks of co-occurring elements (such as skills or concepts), and visualize patterns of connections across individuals or groups. It is positioned as a key resource for exploring how knowledge is built through interactions (Shaffer et al., 2016) [43]. | Epistemic Network Analysis, co-occurrence patterns, connections between discourse elements, epistemic frame mapping.
KCA | The Knowledge Connections Analyzer (KCA) is a web-based analytical tool designed to support student reflection and metacognitive awareness within knowledge building environments. It is integrated into a broader framework that includes four guiding inquiry questions, and it extracts discourse data from the Knowledge Forum platform to generate visualizations and indicators. These outputs are intended to facilitate collective evaluation and refinement of online collaborative work (van Aalst et al., 2015) [44]. | Concept frequency, semantic overlap, principle-specific mapping.
KBAT | The Knowledge Building Analytics Tool (KBAT) enables the construction of customized analytical dashboards within the Knowledge Forum environment. It includes visual and interactive components that allow users to analyze key dimensions of participation, such as discourse patterns, conceptual progress, equity of engagement, and leadership dynamics. These dashboards support metacognitive self-assessment and collaborative reflection aligned with knowledge building principles (Gutiérrez-Braojos et al., 2024) [45]. | Discourse moves classification, contribution patterns.
CiA | Curriculum-ideas Analytics (CiA) is an educational analytics tool developed to support teachers and students in identifying and reflecting on “big ideas” that span multiple subjects and grade levels. It facilitates the analysis of discourse and curriculum content by mapping conceptual trajectories across science and humanities domains, promoting a deeper understanding of interdisciplinary knowledge development (Tan et al., 2022) [46]. | Interaction types, contribution quality, reflection indicators.
Promising Ideas Tool | The Promising Ideas Tool was developed as a component of the Knowledge Forum platform to enhance student engagement in knowledge building discourse. It enables participants to identify, highlight, and track promising contributions within online discussions. The tool includes features for tagging key ideas, aggregating and ranking them based on peer recognition, and exporting selected ideas to new collaborative spaces for further development (Chen et al., 2015) [47]. | Idea tagging and ranking, promising idea identification, idea aggregation and exportation for further development.
Vocabulary Analyzer | A tool developed to track the growth of a user’s vocabulary over time and to assess their vocabulary level against a pre-defined dictionary. This dictionary typically contains key terms relevant to a specific domain or important concepts identified in curriculum guidelines, helping monitor students’ conceptual development (Hong et al., 2015) [39]. | Lexical diversity, term frequency, vocabulary growth tracking, conceptual development monitoring, domain-specific term analysis.
CDA | The Classroom Discourse Analyzer (CDA) is a practical tool designed to help educators overcome common challenges in analyzing classroom discourse. By facilitating the coding and examination of student–teacher interactions, the CDA enables teachers to reflect on and improve their instructional practices using data derived from authentic classroom environments (Chen et al., 2015) [47]. | Critical discourse features, power relations in discourse.
Semantic Overlap Tool | The Semantic Overlap Tool is an analytic feature that compares notes or sets of notes to detect shared terms and phrases, thereby identifying conceptual similarities across student discourse. It is particularly useful for analyzing the degree to which students’ contributions align with curriculum goals by highlighting overlapping key vocabulary (Hong et al., 2015) [39]. | Semantic similarity, conceptual alignment.
KFCE | KFCE is an analytical tool integrated into Knowledge Forum that extracts and organizes key information about a learning community’s activity. It supports the evaluation of indicators such as individual participation, connections between contributions, and collective cognitive responsibility. These data provide a basis for monitoring progress, identifying interaction patterns, and supporting reflective assessment processes in knowledge building contexts (Gutiérrez-Braojos et al., 2022) [48]. | Data extraction, community participation metrics, content export.
Analytic Toolkit | The Analytic Toolkit, developed for the Knowledge Forum environment, was designed to help users examine their participation in collaborative discourse. It offered metrics such as the number and type of contributions, response patterns, and lexical overlap, allowing students and educators to reflect on idea development and engagement patterns within the online learning community (Burtis, 1998) [49]. | Contribution frequency analysis, response pattern analysis, lexical overlap analysis, participation monitoring. Illustrative sketches of three of these analysis types follow the table.
Source: prepared by the authors.
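
To make the bipartite graph model behind tools such as KBDeX concrete, the fragment below builds a two-mode network linking discourse units (notes) to the key terms they contain and computes a simple degree centrality of the kind such tools visualize. This is a minimal sketch using the networkx library, not KBDeX’s actual implementation; the note texts and keyword list are hypothetical.

```python
# Minimal sketch of a KBDeX-style bipartite network: discourse units
# (e.g., Knowledge Forum notes) linked to the key terms they contain.
# Illustrative only -- the notes and keyword list below are hypothetical.
import networkx as nx

notes = {
    "note1": "gravity pulls objects toward the earth",
    "note2": "the moon stays in orbit because of gravity",
    "note3": "orbit shape depends on speed and gravity",
}
keywords = ["gravity", "orbit", "moon", "speed"]

G = nx.Graph()
G.add_nodes_from(notes, bipartite="note")
G.add_nodes_from(keywords, bipartite="term")

# Connect each note to every keyword it mentions.
for note_id, text in notes.items():
    for term in keywords:
        if term in text.lower():
            G.add_edge(note_id, term)

# Degree centrality of terms approximates which ideas anchor the discourse.
centrality = nx.degree_centrality(G)
for term in keywords:
    print(term, round(centrality[term], 2))
```

In KBDeX itself the network evolves contribution by contribution, so such centrality values are recomputed as the discourse grows; this sketch shows only the static case.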
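Epistemic Network Analysis, as supported by the ENA Web Tool, starts from counts of coded elements that co-occur within segments of discourse. The fragment below sketches only that first counting step, under the assumption that each segment has already been coded; the codes and segments are hypothetical, and the full ENA method additionally normalizes these counts and projects them into a reduced space, which this sketch does not attempt.

```python
# Minimal sketch of the co-occurrence counting step in ENA-style analysis:
# within each coded discourse segment, count which elements appear together.
# The codes and segments below are hypothetical examples.
from collections import Counter
from itertools import combinations

segments = [
    {"questioning", "theorizing"},
    {"theorizing", "evidence"},
    {"questioning", "evidence", "theorizing"},
]

co_occurrence = Counter()
for segment in segments:
    for pair in combinations(sorted(segment), 2):
        co_occurrence[pair] += 1

# Each pair's count becomes an edge weight in the epistemic network.
for pair, count in co_occurrence.items():
    print(pair, count)
```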
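Likewise, the semantic overlap and vocabulary-growth analyses attributed to Hong et al. (2015) [39] reduce, at their core, to comparing sets of terms. The fragment below is a minimal sketch of that idea, assuming notes are plain strings and the curriculum dictionary is a simple word list; the actual tools operate on Knowledge Forum data with richer term models.

```python
# Minimal sketch of semantic-overlap and vocabulary-coverage measures.
# The notes and curriculum dictionary are hypothetical examples.

def terms(text: str) -> set[str]:
    """Lowercase a note and return its set of word tokens."""
    return set(text.lower().split())

def semantic_overlap(note_a: str, note_b: str) -> float:
    """Jaccard similarity between the term sets of two notes."""
    a, b = terms(note_a), terms(note_b)
    return len(a & b) / len(a | b) if a | b else 0.0

def vocabulary_coverage(note: str, dictionary: set[str]) -> float:
    """Share of curriculum terms that a note already uses."""
    return len(terms(note) & dictionary) / len(dictionary)

curriculum = {"gravity", "orbit", "mass", "force"}
note_1 = "gravity is a force that pulls objects"
note_2 = "the force of gravity keeps the moon in orbit"

print(semantic_overlap(note_1, note_2))        # shared-term similarity
print(vocabulary_coverage(note_2, curriculum)) # 3 of 4 curriculum terms
```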
Table 3. Exploring the relationship between learning analytics, collaborative efforts, and idea improvement.
Year | Title | Learning Analytics
Collaborative Efforts
2011 | Students’ views of collaboration and online participation in Knowledge Forum | Analytic Toolkit
2020 | Refining qualitative ethnographies using Epistemic Network Analysis: A study of socioemotional learning dimensions in a Humanistic Knowledge Building Community | ENA Web Tool
Idea Improvement
2015 | Advancing knowledge building discourse through judgments of promising ideas | Promising Ideas Tool
2020 | Collective reflective assessment for shared epistemic agency by undergraduates in knowledge building | KBDeX/Promising Ideas Tool
2022 | Fostering low-achieving students’ productive disciplinary engagement through knowledge building inquiry and reflective assessment | ENA Web Tool
2023 | Epistemic Network Analysis to assess collaborative engagement in Knowledge Building discourse | ENA Web Tool
2023 | Analytics-supported reflective assessment for 6th graders’ knowledge building and data science practices: An exploratory study | KBDeX
2024 | Reflective assessment using analytics and artifacts for scaffolding knowledge building competencies among undergraduate students | KBDeX
2024 | Effects and mechanisms of analytics-assisted reflective assessment in fostering undergraduates’ collective epistemic agency in computer-supported collaborative inquiry | KBDeX
Collaborative Efforts and Idea Improvement
2010 | Emergence of Epistemic Agency in College Level Educational Technology Course for Pre-Service Teachers Engaged in CSCL | Social Network Tool
2012 | Knowledge Building Discourse Explorer: a social network analysis application for knowledge building discourse | KBDeX/Social Network Tool
2015 | Fostering sustained idea improvement with principle-based knowledge building analytic tools | Social Network Tool, Vocabulary Analyzer, Semantic Overlap Tool
2016 | Reflective assessment in knowledge building by students with low academic achievement | KCA
2017 | Fostering scientific understanding and epistemic beliefs through judgments of promisingness | ITM
2018 | Co-Organizing the Collective Journey of Inquiry with Idea Thread Mapper | ITM
2019 | Reflective assessment for epistemic agency of academically low-achieving students | KCA
2019 | Knowledge Building Analytics to Explore Crossing Disciplinary and Grade-Level Boundaries | KBDeX
2021 | Examining Online Discourse Using the Knowledge Connection Analyzer Framework and Collaborative Tools in Knowledge Building | KBDeX/Promising Ideas Tool
2021 | A knowledge building approach to primary science collaborative inquiry supported by learning analytics | CiA
2021 | Agency to transform: how did a grade 5 community co-configure dynamic knowledge building practices in a yearlong science inquiry? | ITM
2021 | Computer-supported knowledge building community: A new learning analytics tool | KFCE
2022 | Collaborative analytics-supported reflective Assessment for Scaffolding Pre-service Teachers’ collaborative Inquiry and Knowledge Building | KBDeX
2022 | Cross-community knowledge building with idea thread mapper | ITM
2023 | An Analytical Dashboard of Collaborative Activities for the Knowledge Building | KBAT
2023 | Assessing team creativity with multi-attribute group decision-making in a knowledge building community: A design-based research | Social Network Tool
2023 | Examining the role of metadiscourse in collaborative knowledge building community | KBDeX
2024 | Investigating the mechanisms of analytics-supported reflective assessment for fostering collective knowledge | KBDeX
2024 | Using learning analytics to enhance college students’ shared epistemic agency in mobile instant messaging: A new way to support deep discussion | KBDeX
2024 | Teacher scaffolding for knowledge building in the educational research classroom | KBAT
2024 | Evaluating a New Knowledge Building Analytics Tool (KBAT) | KBAT
2025 | Video-based analytics-supported formative feedback for enhancing low-achieving students’ conception of collaboration and classroom discourse engagement | CDA
Source: prepared by the authors.
Table 4. Conditions of knowledge building studies that incorporate learning analytics.
Year | Title | Participants | Participant Setup | Platform Used | Period | Assessment Moment | Assessment Agent
Primary Education
2015 | Fostering sustained idea improvement with principle-based knowledge building analytic tools | 22 | Not specified | KF | 17 weeks | Formative and summative | Student and teacher
2015 | Advancing knowledge-building discourse through judgments of promising ideas | 40 | Small group | KF | 8 weeks | Formative and summative | Student and teacher
2017 | Fostering scientific understanding and epistemic beliefs through judgments of promisingness | 26 | Big group, small group, and individual | KF | 10 weeks | Formative and summative | Student and teacher
2018 | Co-Organizing the Collective Journey of Inquiry with Idea Thread Mapper | 47 | Big group and small group | KF | 12 weeks | Formative and summative | Student and teacher
2019 | Reflective assessment for epistemic agency of academically low-achieving students | 33 | Big group and small group | KF | 21 weeks | Formative and summative | Student and teacher
2019 | Knowledge Building Analytics to Explore Crossing Disciplinary and Grade-Level Boundaries | 40 | Big group and individual | KF | 13 weeks | Formative and summative | Teacher
2021 | A knowledge building approach to primary science collaborative inquiry supported by learning analytics | 25 | Big group and individual | KF | Not specified | Formative and summative | Student and teacher
2021 | Agency to transform: how did a grade 5 community co-configure dynamic knowledge building practices in a yearlong science inquiry? | 24 | Big group, small group, and individual | KF | 1 year | Formative and summative | Student and teacher
2022 | Cross-community knowledge building with idea thread mapper | 76 | Small group | KF | 26 weeks | Formative and summative | Student and teacher
2023 | Epistemic Network Analysis to assess collaborative engagement in Knowledge Building discourse | 6 | Big group | KF | 2 weeks | Summative | Student and teacher
2023 | Analytics-supported reflective assessment for 6th graders’ knowledge building and data science practices: An exploratory study | 56 | Big group and individual | KF | 8 weeks | Formative and summative | Teacher
2023 | Assessing team creativity with multi-attribute group decision-making in a knowledge building community: A design-based research | 37 | Big group and individual | KF | 39 weeks | Formative and summative | Student and teacher
2024 | Investigating the mechanisms of analytics-supported reflective assessment for fostering collective knowledge | 93 | Small group | KF | 18 weeks | Formative and summative | Student and teacher
Secondary Education
2011 | Students’ views of collaboration and online participation in Knowledge Forum | 23 | Small group | KF | 8 weeks | Summative | Teacher
2016 | Reflective assessment in knowledge building by students with low academic achievement | 20 | Small group and individual | KF | 2 weeks | Formative and summative | Student and teacher
2021 | Examining Online Discourse Using the Knowledge Connection Analyzer Framework and Collaborative Tools in Knowledge Building | 353 | Big group and individual | KF | 21 weeks | Formative and summative | Student and teacher
2022 | Fostering low-achieving students’ productive disciplinary engagement through knowledge building inquiry and reflective assessment | 34 | Big group, small group, and individual | KF | 17 weeks | Summative | Student and teacher
2025 | Video-based analytics-supported formative feedback for enhancing low-achieving students’ conception of collaboration and classroom discourse engagement | 98 | Small group | Not specified | 7 weeks | Formative | Student and teacher
Higher Education
2010 | Emergence of Epistemic Agency in College Level Educational Technology Course for Pre-Service Teachers Engaged in CSCL | 44 | Big group | KF | 6 weeks | Formative and summative | Teacher
2012 | Knowledge Building Discourse Explorer: a social network analysis application for knowledge building discourse | 6 | Small group | Not specified | Not specified | Formative | Teacher
2020 | Collective reflective assessment for shared epistemic agency by undergraduates in knowledge building | 73 | Big group, small group, and individual | KF | 17 weeks | Formative and summative | Student and teacher
2020 | Refining qualitative ethnographies using Epistemic Network Analysis: A study of socioemotional learning dimensions in a Humanistic Knowledge Building Community | 18 | Big group, small group, and individual | KF | 13 weeks | Summative | Teacher
2022 | Computer-supported knowledge building community: A new learning analytics tool | 59 | Small group | KF | 16 weeks | Formative | Student and teacher
2022 | Collaborative analytics-supported reflective Assessment for Scaffolding Pre-service Teachers’ collaborative Inquiry and Knowledge Building | 68 | Big group and small group | KF | 18 weeks | Formative and summative | Student and teacher
2023 | An Analytical Dashboard of Collaborative Activities for the Knowledge Building | 126 | Big group and individual | KF | 16 weeks | Formative and summative | Student and teacher
2023 | Examining the role of metadiscourse in collaborative knowledge building community | 35 | Small group | Not specified | 12 weeks | Formative | Student and teacher
2024 | Reflective assessment using analytics and artifacts for scaffolding knowledge building competencies among undergraduate students | 41 | Big group and individual | KF | 2 weeks | Formative | Student and teacher
2024 | Effects and mechanisms of analytics-assisted reflective assessment in fostering undergraduates’ collective epistemic agency in computer-supported collaborative inquiry | 81 | Big group and individual | KF | 16 weeks | Formative and summative | Student and teacher
2024 | Using learning analytics to enhance college students’ shared epistemic agency in mobile instant messaging: A new way to support deep discussion | 40 | Small group | WeChat | 14 weeks | Formative | Student and teacher
2024 | Teacher scaffolding for knowledge building in the educational research classroom | 59 | Small group | KF | 16 weeks | Formative | Student and teacher
2024 | Evaluating a New Knowledge Building Analytics Tool (KBAT) | 122 | Small group | KF | 16 weeks | Formative | Student and teacher
Source: prepared by the authors.