Article

Dataset on Programming Competencies Development Using Scratch and a Recommender System in a Non-WEIRD Primary School Context

by Jesennia Cárdenas-Cobo 1, Cristian Vidal-Silva 2,* and Nicolás Máquez 3,*
1 Vicerrectora Académica de Formación de Grado, Universidad Estatal de Milagro, Milagro 091050, Ecuador
2 Facultad de Ingeniería y Negocios, Universidad de Las Américas, Manuel Montt 948, Providencia, Santiago 7500975, Chile
3 Escuela de Ingeniería Comercial, Facultad de Economía y Negocios, Universidad Santo Tomás, Talca 3460000, Chile
* Authors to whom correspondence should be addressed.
Data 2025, 10(6), 86; https://doi.org/10.3390/data10060086
Submission received: 30 April 2025 / Revised: 24 May 2025 / Accepted: 28 May 2025 / Published: 3 June 2025

Abstract

The ability to program has become an essential competence for individuals in an increasingly digital world. However, access to programming education remains unequal, particularly in non-WEIRD (Western, Educated, Industrialized, Rich, and Democratic) contexts. This study presents a dataset resulting from an educational intervention designed to foster programming competencies and computational thinking skills among primary school students aged 8 to 12 years in Milagro, Ecuador. The intervention integrated Scratch, a block-based programming environment that simplifies coding by eliminating syntactic barriers, and the CARAMBA recommendation system, which provided personalized learning paths based on students’ progression and preferences. A structured educational process was implemented, including an initial diagnostic test to assess logical reasoning, guided activities in Scratch to build foundational skills, a phase of personalized practice with CARAMBA, and a final computational thinking evaluation using a validated assessment instrument. The resulting dataset encompasses diverse information: demographic data, logical reasoning test scores, computational thinking test results pre- and post-intervention, activity logs from Scratch, recommendation histories from CARAMBA, and qualitative feedback from university student tutors who supported the intervention. The dataset is anonymized, ethically collected, and made available under a CC-BY 4.0 license to encourage reuse. This resource is particularly valuable for researchers and practitioners interested in computational thinking development, educational data mining, personalized learning systems, and digital equity initiatives. It supports comparative studies between WEIRD and non-WEIRD populations, validation of adaptive learning models, and the design of inclusive programming curricula. Furthermore, the dataset enables the application of machine learning techniques to predict educational outcomes and optimize personalized educational strategies. By offering this dataset openly, the study contributes to filling critical gaps in educational research, promoting inclusive access to programming education, and fostering a more comprehensive understanding of how computational competencies can be developed across diverse socioeconomic and cultural contexts.
Dataset: The full dataset generated and analyzed during the study, along with the paper draft, is available at (1) the institutional repository for the CARAMBA system: https://github.com/nvalerod/carambaNew (accessed on 20 May 2025) and (2) the GitHub repository for the study data: https://github.com/cvidalmsu/UNEMI_1 (accessed on 10 May 2025). The dataset is openly accessible under the Creative Commons Attribution (CC BY 4.0) license.
Dataset License: The dataset associated with this article is distributed under the terms and conditions of the Creative Commons Attribution (CC BY 4.0) license.

1. Introduction

Computational thinking and programming competencies are increasingly recognized as foundational skills for participation in modern digital societies [1,2,3]. Learning programming fosters problem-solving abilities, logical reasoning, creativity, and digital literacy, making it an essential competence for education in the 21st century [4,5]. However, access to structured programming education remains unequal, particularly across non-WEIRD (Western, Educated, Industrialized, Rich, and Democratic) contexts [6,7]. Several studies have indicated that cultural, socioeconomic, and technological differences significantly affect how students engage with and develop computational skills [8,9,10].
Numerous initiatives have incorporated computational thinking into K-12 education systems within WEIRD nations, aiming to mainstream these skills into early education [11,12]. Nevertheless, interventions designed and validated in non-WEIRD environments are limited and critically needed to ensure global equity in digital education [13].
Scratch, a block-based programming environment, has demonstrated effectiveness in supporting the acquisition of computational thinking by enabling learners to focus on logical structures and problem solving without the burden of syntactic complexity [14,15,16]. Through its intuitive design and visual metaphors, Scratch encourages creativity, collaboration, and algorithmic reasoning among young learners. Beyond learning environments, education recommendation systems, such as CARAMBA, play a significant role in personalizing the learning experience by adapting activities based on the profiles and preferences of the learners [17,18]. Personalized learning trajectories foster participation, improve motivation, and support learners in achieving better outcomes [19].
Despite these advances, most of the empirical evidence on programming education still stems from WEIRD populations. The present study addresses this gap by implementing a structured educational intervention in a public primary school in Milagro, Ecuador. The intervention combined Scratch and the CARAMBA recommendation system within the school’s computing curriculum and involved 428 students aged 8–12 years.
The dataset generated through this intervention captures demographic data, logical reasoning assessments, computational thinking tests administered before and after the learning phases, detailed logs of Scratch activities, recommendation histories from CARAMBA, and qualitative feedback from tutors. By making this dataset openly available, the study contributes to enhancing research in computational thinking education across culturally and socioeconomically diverse contexts.
The present study follows a structured analytical process that integrates data compilation, dimensionality reduction, and validation steps to uncover patterns of student progression and intervention effectiveness. A summary of this methodological framework is presented in Figure 1.
The remainder of this paper is structured as follows: Section 2 describes the dataset and participants. Section 3 details the educational intervention and data collection methods. Section 4 discusses user notes and potential applications. Section 5 concludes with key findings and future research directions.

2. Data Description

2.1. Participant Overview

The educational intervention was conducted at a public primary school located in Milagro, Ecuador. A total of 428 students between the ages of 8 and 12 participated in the study, as described in detail in previous work [20]; none had prior formal exposure to programming environments before the intervention. The participant pool reflected a diverse demographic background, drawing students from urban, peri-urban, and rural areas and encompassing a wide range of socioeconomic conditions. The school itself is located in a peri-urban region of Milagro and serves a predominantly low-income population, with an average student–teacher ratio of 35:1 and limited access to high-speed internet and individual computing devices.
Approximately 7.5% of the students reported physical, visual, auditory, or cognitive disabilities. This inclusivity aspect makes the dataset particularly valuable for studies aiming to enhance accessibility in programming education. Table 1 summarizes the demographic distribution of the participants.

2.2. Dataset Content

The dataset consists of multiple components that capture various aspects of the intervention and student progress:
  • Logical Reasoning Test Results: Baseline logical reasoning skills assessed through a custom-designed multiple choice test.
  • Computational Thinking Test (CTT V2) Scores: Pre- and post-intervention assessments evaluating students’ computational thinking abilities using a validated instrument [21].
  • Scratch Activity Logs: Sequential records of programming exercises completed by students using Scratch.
  • CARAMBA Recommendation Histories: Data on personalized exercise suggestions generated through collaborative filtering algorithms implemented in CARAMBA [20].
  • Tutor Survey Responses: Reflections and qualitative feedback from the university Systems Engineering students who served as tutors during the intervention. For example, one tutor noted: “Students who struggled initially became more confident when tasks were personalized.” Another reported: “CARAMBA helped identify the ideal starting point for each student, which improved motivation and autonomy.”
Records with incomplete test responses were excluded from paired comparisons using listwise deletion, which accounted for less than 3% of total data points. Table 2 details the available files; a minimal loading sketch is shown below.
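For reuse, the following Python sketch loads the files listed in Table 2. It is illustrative only: the file names follow Table 2, but the column names used here (pre_score, post_score) are assumptions, since the authoritative variable definitions are provided in metadata.pdf.

```python
# Minimal sketch for loading the dataset files listed in Table 2.
# File names follow Table 2; the column names are assumptions (see metadata.pdf).
import json

import pandas as pd

# Tabular components (CSV)
diagnostic = pd.read_csv("diagnostic_test.csv")    # logical reasoning diagnostic
ctt = pd.read_csv("ctt_pre_post.csv")              # pre- and post-intervention CTT V2 scores
tutor_survey = pd.read_csv("tutor_survey.csv")     # qualitative tutor reflections

# Recommendation histories and activity tracking (JSON)
with open("caramba_logs.json", encoding="utf-8") as f:
    caramba_logs = json.load(f)

# Example: keep paired records only (listwise deletion of incomplete tests),
# assuming hypothetical column names "pre_score" and "post_score".
paired = ctt.dropna(subset=["pre_score", "post_score"])
print(f"{len(paired)} students with complete pre/post CTT scores")
```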

2.3. Availability of Data and Tools

The preliminary results, logical reasoning tests, computational thinking assessments, and the CARAMBA recommendation system (including its source code and documentation) are openly available in the two repositories listed in the Dataset statement above: the CARAMBA system repository (https://github.com/nvalerod/carambaNew) and the study data repository (https://github.com/cvidalmsu/UNEMI_1).
Both resources are provided under a Creative Commons Attribution (CC BY 4.0) license to facilitate accessibility, reuse, and replication of this educational intervention.

2.4. Ethical Considerations

All data collected were anonymized following established ethical guidelines. The identities of the students were replaced with random unique identifiers and no personally identifiable information (PII) was collected or stored. The study received ethical approval from the research committee of the Universidad Estatal de Milagro.
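As an illustration of this kind of anonymization step (not the project’s actual procedure or script), a minimal sketch might map raw identities to random unique identifiers before publication; the input file and column names below are hypothetical.

```python
# Illustrative anonymization sketch: replace student identities with random IDs.
# "raw_records.csv" and its "name" column are hypothetical examples only.
import uuid

import pandas as pd

raw = pd.read_csv("raw_records.csv")

# One random, non-reversible identifier per distinct student
id_map = {name: uuid.uuid4().hex for name in raw["name"].unique()}
raw["student_id"] = raw["name"].map(id_map)

# Drop the personally identifiable column before sharing
raw.drop(columns=["name"]).to_csv("anonymized_records.csv", index=False)
```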

3. Methods

3.1. Educational Intervention Design

The educational intervention was divided into two main phases following the conceptual framework proposed by Cárdenas-Cobo [22] and further detailed in related work [20]:
  • Phase I: Introduction to Scratch and Basic Computing Concepts
    Students participated in instructor-led sessions focused on basic computing skills, file management, and fundamental programming concepts using the Scratch platform [14]. Core computational concepts such as sequencing, conditionals, loops, and event handling were progressively introduced.
  • Phase II: Personalized Learning with CARAMBA
    After completing foundational training, students engaged with the CARAMBA recommendation system, which provided personalized exercise pathways based on collaborative filtering techniques [20]. This adaptive phase aimed to foster autonomy and tailor difficulty progression to individual student performance.
Phase I (Scratch instruction) lasted for 3 weeks, with two 90 min sessions per week. Phase II (CARAMBA-based personalized learning) lasted 2 weeks, also with two 90 min sessions per week. The collaborative filtering algorithm in CARAMBA used k = 10 nearest neighbors and Pearson correlation as the similarity metric. University Systems Engineering students served as tutors during the intervention, supporting the primary instruction and personalized learning activities. Tutors received specific training sessions to ensure consistency in pedagogical practices and technological usage.
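The CARAMBA implementation itself is available in its repository; purely as an illustration of the reported configuration (user-based collaborative filtering with Pearson similarity and k = 10 neighbors), a minimal sketch could look as follows. The ratings matrix and exercise identifiers are hypothetical and do not come from the dataset.

```python
# Illustrative user-based collaborative filtering with Pearson similarity and
# k nearest neighbors, matching the parameters reported for CARAMBA.
# This is NOT the CARAMBA implementation; the ratings matrix is hypothetical.
import numpy as np
import pandas as pd


def recommend(ratings: pd.DataFrame, student: str, k: int = 10, n_items: int = 5):
    """Recommend exercises for `student` from a student x exercise ratings matrix."""
    # Pearson correlation between the target student and all other students
    sims = ratings.T.corr(method="pearson")[student].drop(student).dropna()
    neighbors = sims.nlargest(k)

    # Score each exercise the student has not attempted as a similarity-weighted
    # average of the neighbors' ratings
    unseen = ratings.columns[ratings.loc[student].isna()]
    scores = {}
    for item in unseen:
        neighbor_ratings = ratings.loc[neighbors.index, item].dropna()
        weights = neighbors[neighbor_ratings.index]
        if weights.abs().sum() > 0:
            scores[item] = float((neighbor_ratings * weights).sum() / weights.abs().sum())
    return sorted(scores, key=scores.get, reverse=True)[:n_items]


# Hypothetical usage with a small ratings matrix (NaN = exercise not attempted)
ratings = pd.DataFrame(
    {
        "ex1": [5, 4, 1, np.nan],
        "ex2": [3, 2, 5, 4],
        "ex3": [4, 5, 2, 1],
        "ex4": [np.nan, 4, 1, 2],
    },
    index=["s1", "s2", "s3", "s4"],
)
print(recommend(ratings, "s1", k=2))
```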

3.2. Data Collection Procedures

The data collection process was structured around several key instruments and stages:
  • Logical Reasoning Diagnostic Test: At the beginning of the study, the students completed a custom-designed multiple choice test assessing the skills of logical, mathematical, and abstract reasoning.
  • Pre-Intervention Computational Thinking Test (CTT V2): To establish baseline computational thinking skills, students were administered the validated CTT V2 instrument [21].
  • Activity Logs: Throughout the instructional and personalized phases, activity logs were recorded from students’ interactions with Scratch and CARAMBA.
  • Post-Intervention Computational Thinking Test (CTT V2): After the intervention, the CTT V2 was reapplied to measure learning gains.
  • Tutor Reflection Surveys: Systems Engineering tutors provided qualitative feedback through surveys assessing instructional dynamics and technological support effectiveness.
It is important to note that although the CTT V2 instrument has been validated in Spanish, no independent psychometric validation was performed in the Ecuadorian context; this limitation should be considered when interpreting the results. All collected data were anonymized in accordance with ethical standards.

3.3. Statistical Validation

The data collected were analyzed using robust statistical methods:
  • The pre-test and post-test CTT V2 scores were analyzed for normality using the Shapiro–Wilk test.
  • Paired t-tests were used for normally distributed data and Wilcoxon signed-rank tests for nonparametric distributions.
  • The mean pre-test score was 45.2, and the mean post-test score was 67.8. There was a statistically significant difference (p < 0.001) between the pre-test and post-test scores.
  • The effect size, calculated using Cohen’s d, was 1.38, indicating a large practical effect.
  • Rank-biserial correlation was also computed where applicable.
The methodological choices were aligned with standard practices in computational thinking education research [1]. An illustrative reproduction of this validation pipeline is sketched below.
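The following SciPy-based sketch is a non-authoritative illustration of the steps listed above, not the original analysis script; the column names follow the assumptions of the earlier loading sketch, and the rank-biserial correlation uses the simple matched-pairs formula, which may differ from the estimator used in the original analysis.

```python
# Illustrative reproduction of the validation pipeline: Shapiro-Wilk normality
# check, paired t-test / Wilcoxon signed-rank test, Cohen's d, rank-biserial.
# Column names "pre_score" and "post_score" are assumptions, not the codebook.
import numpy as np
import pandas as pd
from scipy import stats

ctt = pd.read_csv("ctt_pre_post.csv").dropna(subset=["pre_score", "post_score"])
pre = ctt["pre_score"].to_numpy()
post = ctt["post_score"].to_numpy()
diff = post - pre

# 1. Normality of the paired differences
_, p_norm = stats.shapiro(diff)

# 2. Paired t-test if differences look normal, otherwise Wilcoxon signed-rank test
if p_norm > 0.05:
    stat, p_value = stats.ttest_rel(post, pre)
else:
    stat, p_value = stats.wilcoxon(post, pre)

# 3. Effect sizes: Cohen's d for paired samples and a simple matched-pairs
#    rank-biserial correlation (share of improvements minus share of declines)
cohens_d = diff.mean() / diff.std(ddof=1)
n_pos, n_neg = np.sum(diff > 0), np.sum(diff < 0)
rank_biserial = (n_pos - n_neg) / (n_pos + n_neg)

print(f"mean pre = {pre.mean():.1f}, mean post = {post.mean():.1f}")
print(f"p = {p_value:.4f}, Cohen's d = {cohens_d:.2f}, rank-biserial = {rank_biserial:.2f}")
```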

3.4. Educational and Data Collection Workflow

Figure 2 illustrates the overall process flow implemented during the educational intervention, detailing the sequence of instructional activities, assessments, and data collection stages.

3.5. CTT Score Distribution Analysis

To evaluate the effectiveness of the educational intervention, a comparative analysis was performed on the students’ Computational Thinking Test (CTT) scores before and after the instructional phases. This analysis quantified learning gains and explored distributional performance changes across the cohort. In particular, we sought to assess changes in average achievement, consistency improvements, and reductions in underperformance.
Figure 3 displays the distribution of the CTT scores grouped into defined score ranges for both the pre-test and the post-test. A prominent rightward shift is observable in the post-test results, indicating that a large proportion of students attained higher scores after the intervention. This upward trend suggests that the instructional model, which combined Scratch-based learning with personalized support from the CARAMBA system, effectively promoted computational thinking development. In particular, the post-test data show an increased concentration of students in the upper performance brackets (61–80 and 81–100), while the number of students scoring in the lower ranges (0–20 and 21–40) was substantially reduced. This pattern provides compelling evidence of the positive impact of the intervention on overall learning outcomes.
Figure 4 provides a comparative summary of the CTT score distributions before and after the intervention using box plots. The median score increased markedly from 45 (pre-test) to 68 (post-test), and the interquartile range shifted from [30, 60] to [55, 80], indicating that the central 50% of students performed at a higher level after the intervention. In addition, the minimum score rose from 10 to 30, a narrowing of the lower tail that suggests a reduction in extreme underperformance. These results collectively point to an improvement in the average performance, consistency, and equity of learning outcomes across the cohort of students.
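Researchers reusing the dataset can regenerate figures in the style of Figure 3 and Figure 4 from the shared CSV. The sketch below assumes the same hypothetical column names as the previous sketches and is intended only as a starting point.

```python
# Illustrative regeneration of the score-range histogram (Figure 3 style) and
# the pre/post box plots (Figure 4 style). Column names are assumptions.
import matplotlib.pyplot as plt
import pandas as pd

ctt = pd.read_csv("ctt_pre_post.csv").dropna(subset=["pre_score", "post_score"])
bins = [0, 20, 40, 60, 80, 100]
labels = ["0-20", "21-40", "41-60", "61-80", "81-100"]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Grouped bar chart of score ranges before and after the intervention
counts = pd.DataFrame({
    "Pre-test": pd.cut(ctt["pre_score"], bins=bins, labels=labels,
                       include_lowest=True).value_counts().sort_index(),
    "Post-test": pd.cut(ctt["post_score"], bins=bins, labels=labels,
                        include_lowest=True).value_counts().sort_index(),
})
counts.plot(kind="bar", ax=ax1, rot=0)
ax1.set_xlabel("CTT score range")
ax1.set_ylabel("Number of students")

# Box plots of pre- vs post-test scores
ax2.boxplot([ctt["pre_score"], ctt["post_score"]])
ax2.set_xticks([1, 2], labels=["Pre-test", "Post-test"])
ax2.set_ylabel("CTT score")

fig.tight_layout()
fig.savefig("ctt_distributions.png", dpi=150)
```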

4. User Notes

The dataset provided through this study offers a valuable resource for researchers, educators, and policymakers interested in advancing computational thinking education, adaptive learning technologies, and educational equity.

4.1. Potential Applications of the Dataset

Given its multidimensional structure, including diagnostic assessments, computational thinking tests (CTT V2), personalized recommendation histories, and qualitative tutor feedback, the dataset can support a variety of future research directions, including:
  • Computational Thinking Research: The pre- and post-intervention CTT V2 results allow researchers to model the development of computational thinking skills among primary school students in a non-WEIRD context [20,21].
  • Educational Data Mining and Learning Analytics: Activity logs from Scratch and CARAMBA provide rich sequential data ideal for mining patterns, predicting learning trajectories, and evaluating the impact of personalized recommendations [17].
  • Development of Adaptive Learning Systems: The recommendation interaction histories can be used to train and validate new models for adaptive learning environments, particularly in early programming education settings [22].
  • Comparative Cross-Cultural Studies: Researchers can compare the outcomes observed in this non-WEIRD setting against results from WEIRD populations to explore the influence of socioeconomic and cultural variables on computational thinking development [6,13].
  • Design of Inclusive Educational Curricula: Insights from the intervention can inform the creation of more inclusive curricula that incorporate personalization strategies and accessible technologies like Scratch [14].

4.2. Reuse Guidelines

Researchers intending to reuse the dataset should observe the following guidelines:
  • Proper Citation: Any publication or derived work should acknowledge this study as the original source of the dataset.
  • Ethical Use: Although the dataset is anonymized, users must avoid any attempt to re-identify participants and must ensure compliance with ethical standards.
  • Contextualization: Any interpretation of the results should take into account the educational, socioeconomic, and technological characteristics of the Ecuadorian primary school setting from which the data originate.
The dataset is made available under a Creative Commons Attribution (CC BY 4.0) license, maximizing its potential for reuse in research, educational innovation, and policy development.

5. Conclusions

This study presents the results of a structured educational intervention designed to foster programming competencies and computational thinking skills among primary school students in a non-WEIRD educational context. The pedagogical model combined Scratch-based learning with the personalized recommendations offered by the CARAMBA system, aiming to adapt the pace and complexity of activities to individual learner profiles.
The intervention involved 428 students aged 8 to 12 years in Milagro, Ecuador, and used validated instruments such as CTT V2 to assess the development of computational thinking. Statistical analyses confirmed significant improvements in students’ abilities, both in terms of mean scores and score distribution, as visualized through histograms and boxplots. The results highlight the effectiveness of combining block-based programming environments and recommendation systems to personalize and enhance programming education for young learners.
A key contribution of this work is the dataset generated during the intervention, which includes demographic information, logical reasoning assessments, CTT V2 scores before and after the intervention, Scratch activity logs, CARAMBA recommendation histories, and qualitative tutor feedback. The dataset is anonymized and openly available under a Creative Commons Attribution (CC BY 4.0) license, encouraging broad reuse. The absence of a traditional-instruction control group limits our ability to isolate the specific contribution of the CARAMBA system. Additionally, due to logistical constraints, no longitudinal tracking was conducted to evaluate the sustainability of learning gains over time.
Future research could extend this work by:
  • Conducting longitudinal studies to assess the sustainability of computational thinking skills over time.
  • Exploring enhancements to the recommendation system, such as the integration of reinforcement learning approaches.
  • Applying the intervention model to different socioeconomic contexts, educational levels, or countries for comparative analysis.
  • Investigating the impact of personalization strategies on students with disabilities or diverse learning needs.
By providing this dataset and documenting the intervention process, this work aims to contribute to the global efforts to democratize programming education, foster computational literacy across diverse contexts, and support the development of more inclusive and adaptive educational technologies.

Author Contributions

Conceptualization, J.C.-C.; methodology, J.C.-C.; software, J.C.-C.; validation, J.C.-C., C.V.-S. and N.M.; formal analysis, J.C.-C.; investigation, J.C.-C.; resources, J.C.-C.; data curation, J.C.-C.; writing—original draft preparation, J.C.-C.; writing—review and editing, J.C.-C., C.V.-S. and N.M.; visualization, J.C.-C.; supervision, C.V.-S.; project administration, C.V.-S.; funding acquisition, C.V.-S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was carried out according to the ethical guidelines of the Universidad Estatal de Milagro (UNEMI), Ecuador. Ethical approval was obtained from the institutional review board prior to the implementation of the intervention.

Informed Consent Statement

Informed consent was obtained from the legal guardians of all student participants involved in the study.

Data Availability Statement

The data supporting the findings of this study are openly available at https://github.com/cvidalmsu/UNEMI_1 and https://github.com/nvalerod/carambaNew.

Acknowledgments

The authors express their gratitude to the faculty, students, and administrative staff of the participating educational institution for their collaboration and support throughout the study. Special thanks are extended to the Systems Engineering students of Universidad Estatal de Milagro who served as tutors during the intervention.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Wing, J.M. Computational thinking: What and why. Link 2010, 6, 33–35. [Google Scholar]
  2. Selby, C.; Woollard, J. Computational thinking: The developing definition. ITiCSE Work. Group Rep. 2013, 2013, 1–6. Available online: https://eprints.soton.ac.uk/356481/ (accessed on 27 May 2025).
  3. Bocconi, S.; Chioccariello, A.; Earp, J. Developing computational thinking in compulsory education—Implications for policy and practice. JRC Sci. Policy Rep. 2016. [Google Scholar] [CrossRef]
  4. Grover, S.; Pea, R. Computational thinking in K–12: A review of the state of the field. Educ. Res. 2013, 42, 38–43. [Google Scholar] [CrossRef]
  5. Wang, C.; Shen, J.; Chao, J. Integrating Computational Thinking in STEM Education: A Literature Review. Int. J. Sci. Math. Educ. 2022, 20, 1949–1972. [Google Scholar] [CrossRef]
  6. Henrich, J.; Heine, S.J.; Norenzayan, A. Most people are not WEIRD. Nature 2010, 466, 29. [Google Scholar] [CrossRef] [PubMed]
  7. Tucker, B. Computational thinking: Teacher enactment and student learning. In Proceedings of the Annual Meeting of the American Educational Research Association (AERA), Philadelphia, PA, USA, 3–7 April 2014. [Google Scholar]
  8. Jiang, B.; Li, Z. Effect of Scratch on Computational Thinking Skills of Chinese Primary School Students. J. Comput. Educ. 2021, 8, 505–525. [Google Scholar] [CrossRef]
  9. Belmar, H. Review on the Teaching of Programming and Computational Thinking in the World. Front. Comput. Sci. 2022, 4, 997222. [Google Scholar] [CrossRef]
  10. Holanda, M.; Silva, D.D. Latin American Women and Computer Science: A Systematic Literature Mapping. IEEE Trans. Educ. 2022, 65, 356–372. [Google Scholar] [CrossRef]
  11. Yadav, A.; Hong, H.Y.; Stephenson, C. The role of computational thinking in STEM learning: A review. In Proceedings of the Annual Meeting of the American Educational Research Association (AERA), San Antonio, TX, USA, 27 April–1 May 2017. [Google Scholar]
  12. Zhang, S.; Wong, G.K.W. Development and Validation of a Computational Thinking Test for Lower Primary School Students. Educ. Technol. Res. Dev. 2023, 71, 1595–1630. [Google Scholar] [CrossRef]
  13. Roberts, S.O.; Bareket-Shavit, C.; Dollins, F.A.; Goldie, P.D.; Mortenson, E. Racial inequality in psychological research: Trends of the past and recommendations for the future. Perspect. Psychol. Sci. 2020, 15, 1295–1309. [Google Scholar] [CrossRef] [PubMed]
  14. Resnick, M.; Maloney, J.; Monroy-Hernández, A.; Rusk, N.; Eastmond, E.; Brennan, K.; Millner, A.; Rosenbaum, E.; Silver, J.; Silverman, B.; et al. Scratch: Programming for all. Commun. ACM 2009, 52, 60–67. [Google Scholar] [CrossRef]
  15. Montiel, H.; Gomez-Zermeño, M.G. Educational Challenges for Computational Thinking in K–12 Education: A Systematic Literature Review of “Scratch” as an Innovative Programming Tool. Computers 2021, 10, 69. [Google Scholar] [CrossRef]
  16. Sun, L.; Guo, Z.; Zhou, D. Developing K–12 Students’ Programming Ability: A Systematic Literature Review. Educ. Inf. Technol. 2022, 27, 7059–7097. [Google Scholar] [CrossRef]
  17. Cárdenas-Cobo, J.; Puris, A.; Novoa-Hernández, P.; Galindo, J.A.; Benavides, D. Recommender systems and Scratch: An integrated approach for enhancing computer programming learning. IEEE Trans. Learn. Technol. 2019, 14, 243–256. [Google Scholar] [CrossRef]
  18. Ricci, F.; Rokach, L.; Shapira, B.; Kantor, P.B. Introduction to Recommender Systems Handbook; Springer: Berlin/Heidelberg, Germany, 2011. [Google Scholar] [CrossRef]
  19. Leitner, P.; Khalil, M.; Ebner, M. Learning Analytics in Higher Education—A Literature Review. In Learning Analytics: Fundaments, Applications, and Trends: A View of the Current State of the Art to Enhance e-Learning; Peña-Ayala, A., Ed.; Springer International Publishing: Cham, Switzerland, 2017; pp. 1–23. [Google Scholar] [CrossRef]
  20. Cárdenas-Cobo, J.; Vidal-Silva, C.; Arévalo, L.; Torres, M. Applying recommendation system for developing programming competencies in children from a non-WEIRD context. Educ. Inf. Technol. 2023, 29, 9355–9386. [Google Scholar] [CrossRef]
  21. Román-González, M. Codigoalfabetización y Pensamiento Computacional en Educación Primaria y Secundaria: Validación de un Instrumento y Evaluación de Programas. Ph.D. Thesis, Universidad Nacional de Educación a Distancia, Madrid, Spain, 2016. [Google Scholar]
  22. Cardenas-Cobo, J. Enhancing the Learning of Programming Using Scratch: A Recommender-Systems-Based Approach in Non WEIRD Communities. Ph.D. Thesis, University of Seville, Seville, Spain, 2020. [Google Scholar]
Figure 1. Methodological flowchart of the multivariate STATIS analysis process.
Figure 2. Educational intervention and data collection workflow.
Figure 3. Distribution of CTT scores before and after the intervention.
Figure 4. Comparison of pre- and post-intervention CTT scores using boxplots.
Table 1. Demographic characteristics of participants.
Category                      Number of Students    Percentage (%)
Males                         236                   55.1
Females                       192                   44.9
Students with disabilities    32                    7.5
Ecuadorian nationality        419                   97.9
Foreign nationality           9                     2.1
Table 2. Structure of the dataset files.
File Name             Format    Description
diagnostic_test.csv   CSV       Results from the logical reasoning diagnostic test
ctt_pre_post.csv      CSV       Pre- and post-intervention CTT V2 computational thinking assessments
tutor_survey.csv      CSV       Reflections from university student tutors
caramba_logs.json     JSON      Personalized recommendation history and activity tracking
metadata.pdf          PDF       Complete description of variables and codebook
