Bebras-Based Assessment for Computational Thinking: Performance and Gender Analysis
Abstract
1. Introduction
- To create an instrument to assess the CT components of decomposition, pattern recognition, abstraction, modeling, algorithms, and debugging in 7th-grade students.
- To analyze students’ performance, considering item-level performance and group differences, with special attention to gender differences.
- Is it possible to identify a preliminary factorial structure in the responses to the Bebras-based CT instrument?
- Are there any significant differences in student performance with respect to gender?
2. Theoretical Framework
2.1. Computational Thinking
2.2. CT Assessment
2.3. CT Assessment Through Bebras
3. Method
3.1. Participants
3.2. Instrument Design
3.3. Instrument Description
3.4. Process and Analysis
4. Results
4.1. Performance of the Total Sample in Each Item from BBACT
4.2. Performance in Each Item from BBACT According to Gender
4.3. Exploratory Factor Analysis
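As an illustration of what this kind of analysis involves, the sketch below shows how an exploratory factor analysis of 17 dichotomous BBACT-style items could be run in Python. The factor_analyzer package, the three-factor oblimin solution, and the simulated 0/1 response matrix are assumptions for illustration only, not the authors' procedure; dimensionality analyses of binary items are often based on tetrachoric rather than Pearson correlations.

```python
# Illustrative EFA sketch only -- not the authors' analysis pipeline.
# Assumes the `factor_analyzer` package; a simulated 0/1 response matrix
# stands in for the real BBACT data.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity,
    calculate_kmo,
)

rng = np.random.default_rng(0)
n_students, n_items = 1508, 17
responses = pd.DataFrame(
    rng.integers(0, 2, size=(n_students, n_items)),
    columns=[f"item_{i}" for i in range(1, n_items + 1)],
)

# Sampling adequacy and sphericity checks usually precede an EFA.
chi2, p = calculate_bartlett_sphericity(responses)
_, kmo_total = calculate_kmo(responses)
print(f"Bartlett chi2 = {chi2:.1f} (p = {p:.3f}), overall KMO = {kmo_total:.2f}")

# Fit an EFA; the number of factors and the oblique rotation are placeholders.
efa = FactorAnalyzer(n_factors=3, rotation="oblimin", method="minres")
efa.fit(responses)
eigenvalues, _ = efa.get_eigenvalues()
print("Eigenvalues:", np.round(eigenvalues[:5], 2))
print("Loadings:\n", np.round(efa.loadings_, 2))
```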
5. Discussion
- Difficult items: 3 items with accuracy rates below 30.37%.
- Average difficulty items: 11 items with accuracy rates between 30.37% and 75.01%.
- Easy items: 3 items with accuracy rates above 75.01%, including 2 very easy items with accuracy rates exceeding 90% (a short classification sketch follows this list).
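This banding reduces to a simple threshold rule on each item's accuracy rate. The snippet below is a minimal sketch of that rule, using the cut-offs stated above; the function name is illustrative, and the three example rates are taken from the item-level results in Section 4.1.

```python
def classify_item(accuracy: float) -> str:
    """Band an item by its accuracy rate (percentage of correct answers).

    Cut-offs as stated in the Discussion: below 30.37% -> difficult,
    above 75.01% -> easy (very easy above 90%), otherwise average.
    """
    if accuracy < 30.37:
        return "difficult"
    if accuracy > 90.0:
        return "very easy"
    if accuracy > 75.01:
        return "easy"
    return "average"


# Example accuracy rates taken from the item-level results in Section 4.1.
for item, rate in [(2, 25.15), (1, 53.48), (5, 89.19)]:
    print(f"Item {item}: {rate}% correct -> {classify_item(rate)}")
```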
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
CT | Computational thinking
BBACT | Bebras-based assessment for computational thinking
EFA | Exploratory factor analysis
1. The Bebras tasks used in the test were taken from various Bebras repositories, including Bebras Australia (https://www.amt.edu.au/bebras-365) (accessed on 1 March 2023), Bebras UK (https://www.bebras.uk/), Bebras USA (https://www.bebraschallenge.org/), and the main Bebras website (https://www.bebras.org/) (accessed on 1 March 2023).
References
Item Name | Components | Expected Difficulty | Success Rate (%) | Perceived Difficulty | Decision | Item Number in Final Version |
---|---|---|---|---|---|---|
Rings | Decomposition | Average | 40 | Average | Maintained | 1 |
Jumps | Decomposition | High | 20 | High | Changed | 2 |
Frieze | Decomposition/Patterns | Low | 68.75 | Average | Rephrased | 3
Nimm | Decomposition | High | 42.5 | Average | Maintained | 4 |
Bracelet | Patterns | Low | 90 | Low | Maintained | 5 |
Footprints | Patterns | Low | 92.5 | Low | Maintained | 6 |
Necklace | Patterns | Average | 75 | Average | Maintained | 7 |
Logs | Patterns | High | 56.25 | Average | Rephrased | 8 |
Robot heart | Algorithms | Low | 78.75 | Low | Changed | 9 |
Ball swap | Debugging/Algorithms | Average | 52.5 | Average | Changed | 10
Log order | Debugging/Algorithms | High | 55 | Average | Changed | 11
Log order b | Debugging/Algorithms | Average | 25 | High | Eliminated | -
Graph | Abstraction | High | 75 | Average | Maintained | 12 |
Flipflop | Abstraction | Average | 68.75 | Average | Maintained | 13 |
Roulette | Abstraction/Patterns | Low | 58.75 | Average | Changed | 14
Strip | Abstraction/Patterns | High | 25 | High | Maintained | 15
Houses | Modeling | Low | 62.5 | Average | Maintained | 16
Watchtower | Modeling | Average | 27.5 | High | Rephrased | 17
Item | Correct Answers (Total Answers) | % |
---|---|---|
1. | 806 (1507) | 53.48 |
2. | 376 (1495) | 25.15 |
3. | 860 (1503) | 57.22 |
4. | 547 (1482) | 36.91 |
5. | 1345 (1508) | 89.19 |
6. | 1210 (1508) | 80.24 |
7. | 253 (1503) | 16.83 |
8. | 764 (1503) | 50.83 |
9. | 1053 (1500) | 70.20 |
10. | 513 (1485) | 34.55 |
11. | 394 (1470) | 26.80 |
12. | 822 (1481) | 55.50 |
13. | 855 (1464) | 58.40 |
14. | 758 (1491) | 50.84 |
15. | 251 (1476) | 17.01 |
16. | 833 (1482) | 56.21 |
17. | 468 (1482) | 31.58 |
Items | Female | Male | Non-Binary | Other |
---|---|---|---|---|
1. | 344 (52%) | 433 (55%) | 3 (17%) | 16 (55%) |
2. | 159 (24%) | 206 (27%) | 2 (11%) | 7 (24%) |
3. | 380 (58%) | 450 (57%) | 6 (33%) | 16 (55%) |
4. | 241 (37%) | 284 (37%) | 6 (35%) | 8 (28%) |
5. | 580 (88%) | 709 (90%) | 14 (78%) | 27 (93%) |
6. | 529 (80%) | 631 (80%) | 14 (78%) | 25 (86%) |
7. | 114 (17%) | 127 (16%) | 2 (12%) | 6 (21%) |
8. | 338 (51%) | 398 (51%) | 8 (44%) | 14 (48%) |
9. | 472 (72%) | 544 (70%) | 8 (50%) | 18 (62%) |
10. | 229 (35%) | 265 (35%) | 4 (22%) | 10 (34%) |
11. | 181 (28%) | 206 (27%) | 2 (12%) | 4 (14%) |
12. | 361 (56%) | 423 (55%) | 6 (33%) | 22 (76%) |
13. | 373 (58%) | 456 (60%) | 7 (39%) | 9 (32%) |
14. | 323 (50%) | 408 (53%) | 7 (39%) | 12 (41%) |
15. | 109 (17%) | 136 (18%) | 3 (17%) | 3 (10%) |
16. | 366 (56%) | 437 (57%) | 8 (44%) | 13 (45%) |
17. | 211 (32%) | 235 (31%) | 8 (44%) | 9 (31%) |
Items | Female | Male | χ² (p Value) |
---|---|---|---|
1. | 344 (52%) | 433 (55%) | 1.402 (p = 0.236) |
2. | 159 (24%) | 206 (27%) | 0.823 (p = 0.364) |
3. | 380 (58%) | 450 (57%) | 0.008 (p = 0.931) |
4. | 241 (37%) | 284 (37%) | 0.014 (p = 0.904) |
5. | 580 (88%) | 709 (90%) | 1.970 (p = 0.160) |
6. | 529 (80%) | 631 (80%) | 0.022 (p = 0.881) |
7. | 114 (17%) | 127 (16%) | 0.250 (p = 0.617) |
8. | 338 (51%) | 398 (51%) | 0.005 (p = 0.943) |
9. | 472 (72%) | 544 (70%) | 0.521 (p = 0.470) |
10. | 229 (35%) | 265 (35%) | 0.011 (p = 0.915) |
11. | 181 (28%) | 206 (27%) | 0.049 (p = 0.825) |
12. | 361 (56%) | 423 (55%) | 0.014 (p = 0.904) |
13. | 373 (58%) | 456 (60%) | 0.492 (p = 0.483) |
14. | 323 (50%) | 408 (53%) | 1.074 (p = 0.300) |
15. | 109 (17%) | 136 (18%) | 0.183 (p = 0.669) |
16. | 366 (56%) | 437 (57%) | 0.070 (p = 0.791) |
17. | 211 (32%) | 235 (31%) | 0.507 (p = 0.477) |
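Each row of the table above reports a chi-square test on a 2×2 contingency table of correct versus incorrect answers for female and male students. The sketch below shows how such a test could be computed for item 1 with scipy; the per-gender totals (662 and 787) are rough back-calculations from the reported percentages rather than figures from the article, and whether a continuity correction was applied is likewise an assumption, so the result only approximates the 1.402 reported above.

```python
# Illustrative chi-square test for item 1 (female vs. male correct answers).
# The group totals (662 and 787) are approximate, back-calculated from the
# reported 52% and 55%; they are assumptions, not figures from the article.
from scipy.stats import chi2_contingency

female_correct, female_total = 344, 662   # ~52% correct
male_correct, male_total = 433, 787       # ~55% correct

table = [
    [female_correct, female_total - female_correct],
    [male_correct, male_total - male_correct],
]

# correction=False disables Yates' continuity correction; whether the
# authors applied it is not stated.
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.3f}, p = {p:.3f}, dof = {dof}")
```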
Performance | Decomposition | Patterns | Debugging/Algorithms | Modeling | Abstraction |
---|---|---|---|---|---|
Low | X | X | | | 
Average | X | X | X | X | X |
High | X | X | X | X |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).