Fostering Critical Thinking in STEM Education
Abstract
1. Introduction
1.1. Societal Relevance of Critical Thinking in STEM Education
1.2. STEM Education: Rationale, Definitions, and Goals
1.3. Challenges of Implementing and Assessing Critical Thinking
1.4. Research Gap
2. Theoretical Background
2.1. Societal Relevance and Educational Policy
2.2. The Challenge of Defining Critical Thinking
2.3. Connecting Critical Thinking to STEM Content
3. Methodology
3.1. Design Process
3.2. Procedure and Data Collection
- Does the rubric capture CT in STEM?
- Is it useful? Why or why not?
- Is it practical? Why or why not?
- Is it adaptable? Can you give an example?
- Are you willing to use it? Why or why not?
3.3. Data Analysis
3.4. Ethical Considerations
4. Results
4.1. Result of the Design Process
- Quality check of resources;
- Variety of methods;
- Argumentation and coherence of conclusions;
- Use of tools and data;
- Understanding and consideration of different norms and values;
- Communication of the results;
- Presence of goal-orientation and perseverance;
- Richness of reflection.
4.2. Preliminary Case: Workshop at the International Conference
4.3. Case: Austria
“[…], one thinks critically about the whole thing, and I do believe that it will definitely improve the quality of one’s teaching.”
“If I just take that into account, I don’t know if it will be effective. Because there is nothing in there about how you assess learning success or whether there is any learning success.”
“[…] but even so, in a discussion like this, I can only observe the group dynamics as a whole and not respond to the individual. And the problem is that when I reflect on the discussion, I always focus on the good students, and perhaps many students simply get lost in the shuffle.”
“So yes, I would probably use it, but I find it a bit difficult to draw a clear line between what is basic level, what is intermediate level, and what is advanced level.”
4.4. Case: Croatia
“Yes, it triggered my own critical thinking, so I would expect it to also do that in others. [The workshop] solidified my knowledge about critical thinking and forced me to think about its importance.”
“Students are too unique, so we can’t apply the same criteria to everyone.”
4.5. Case: Germany
“… I am missing some specific definitions to be able to apply this concretely in the MINT (STEM) domain.”
“… if you don’t have the subject knowledge, it’s difficult to check sources to see if they are right or wrong.”
“… ‘efficiency’ and ‘creativity’ are such subjective assessments.”
“… in science lessons, something like conducting an experiment and then writing a report.”
“… for me, ‘Intermediate’ should basically be the minimum standard; what falls below that is simply not critical thinking …”
“… not necessarily as a scoring sheet, but … to have a rubric where one could theoretically categorise critical thinking …”
“… as it is now, I would perhaps keep it in the back of my mind for myself … but wouldn’t put too much value on it yet, rather just use it as a rough orientation.”
4.6. Case: The Netherlands
“It is adaptable for the practice, especially for the biology context.”
“Yes. In a mathematical modelling task, students estimate the number of tiles needed to cover a floor. Neutral: just give a number without reasoning. Basic: calculate area, but don’t check assumptions. Proficient: reflect on measuring errors, consider tile gaps, etc.”
“I think for it to be useful for your lessons, you might need to make your own version for that specific assignment/project. Maybe more as a starting point for designing/development of a lesson/assignment.”
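The tile-estimation task quoted above can be sketched in code to make the level distinction concrete. This is purely illustrative: the function names, floor dimensions, and grout-gap width are hypothetical assumptions, not taken from the study.

```python
import math

def tiles_basic(floor_w: float, floor_h: float, tile_side: float) -> int:
    """'Basic' level: divide floor area by tile area, without checking assumptions."""
    return math.ceil((floor_w * floor_h) / tile_side**2)

def tiles_proficient(floor_w: float, floor_h: float, tile_side: float,
                     gap: float = 0.003) -> int:
    """'Proficient' level: count whole tiles per row and per column,
    allowing for a grout gap between tiles (offcuts are not reused)."""
    per_row = math.ceil(floor_w / (tile_side + gap))
    per_col = math.ceil(floor_h / (tile_side + gap))
    return per_row * per_col

# Hypothetical 4.1 m x 3.1 m floor tiled with 0.5 m square tiles:
print(tiles_basic(4.1, 3.1, 0.5))       # pure area estimate
print(tiles_proficient(4.1, 3.1, 0.5))  # row/column estimate with gaps
```

The two estimates diverge once layout assumptions are questioned, which is exactly the reflection the “Proficient” descriptor asks students to make.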
4.7. Cross-Case Analysis
5. Discussion
6. Limitations
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| CT | Critical Thinking |
| STEM | Science, Technology, Engineering, Mathematics |
Appendix A
| Levels | Description |
|---|---|
| Basic level | The basic level is characterised by modest use of critical thinking while solving a problem: |
| Intermediate level | The intermediate level is characterised by solving a problem with certain elements of critical thinking without using its full potential: |
| Advanced level | The advanced level is characterised by solving a problem based on extensive experience and a professional approach, exhibiting critical thinking in every aspect: |
| Levels | Description |
|---|---|
| Basic level | The basic level is characterised by modest use of critical thinking and creativity while solving a problem: |
| Intermediate level | The intermediate level is characterised by solving a problem with certain elements of critical thinking and creativity, without using its full potential: |
| Advanced level | The advanced level is characterised by solving a problem based on extensive experience and a professional approach, exhibiting critical thinking and creativity in every aspect: |
| Levels | Description |
|---|---|
| Neutral level | The neutral level is characterised by the absence of a critical approach to solving a problem: |
| Intermediate level | The intermediate level is characterised by solving a problem with certain elements of critical thinking and creativity, without using its full potential: |
| Advanced level | The advanced level is characterised by solving a problem based on extensive experience and a professional approach, exhibiting critical thinking and creativity in every aspect: |
| Dimension\Level | Basic | Intermediate | Advanced |
|---|---|---|---|
| Quality check of resources | | | |
| Variety of methods | | | |
| Originality of ideas and approaches | | | |
| Argumentation and coherence of conclusions | | | |
| Use of tools and data | | | |
| Understanding and consideration of norms and values | | | |
| Communication of results | | | |
| Presence of goal orientation | | | |
| Richness of reflection | | | |
| Dimension\Level | Basic | Intermediate | Advanced |
|---|---|---|---|
| Quality check of resources | no quality check of external sources of information | only limited sources of information used, and sources are poorly checked | multiple sources are considered and selected based on quality checks |
| Variety of methods | alternative methods are not taken into consideration | the variation of methods is limited to procedures shown by others in similar contexts | a variety of methods is used or even invented for the purpose of analysing and solving the problem |
| Argumentation and coherence of conclusions | explicit arguments for a decision are not given | arguments are given, but with limited knowledge and potential to be generalised | conclusions are coherent, logical, and supported by theoretical and empirical arguments, based on sources and considering norms and values relevant for the problem |
| Use of tools and data | tools are used with systematic errors | use of tools and data processing follows standard procedures, but it is flawed or misinterpreted | tools are used efficiently and in original ways |
| Understanding and consideration of norms and values | no normative considerations | occasional understanding of norms and values | multiple perspectives considered with care |
| Communication of results | presentation is sloppy and incoherent | all steps of the process are presented, but the structure, coherence, or attractiveness of the presentation could be improved | clear, structured, coherent, attractive, and comprehensive presentation |
| Presence of goal orientation | visible lack of motivation or perseverance | limited perseverance, based on external motivation | goal orientation is continuously present, and the capability to act reasonably and rationally is exhibited throughout the process |
| Richness of reflection | no reflection about the answer | in the reflection, the solution is checked, and the answer is evaluated inside the context, but the metacognitive relation to the whole process is rather weak | reflection is rich and includes the higher aims of the activity and the possibility to evaluate findings in a wider context |
- Do we understand the problem and its context?
- Which competences might help us in investigating the problem further?
- Do we know similar tasks and methods to solve them?
- How many different approaches can we come up with to solve the given problem?
- Have we allowed ourselves to think “outside the box”?
- Which sources of information do we consider using?
- Have we checked the information from multiple sources and verified their reliability?
- Have we included all the key information that is available to us?
- Do we have enough measurements, or might our sample be biased?
- How precise are our measuring tools?
- Do we respect the prescribed procedures for using the tools and processing data?
- What kind of answer do we accept, and which solutions do we consider to be “good”?
- Which values and norms condition our reasoning?
- Do we understand social values, norms, and rules well enough to act properly?
- Do we understand the motivation of the subject or object we are confronted with?
- Are we motivated to complete the task? What is our motivation?
- Do we have enough evidence for our conclusions?
- How do we make sure that our reasoning is correct?
- How much time did we invest in the solving process?
- How do we present our solution to others?
- Do we respect the norms and principles that our community uses in communication?
- Is the solution that we reached understandable and meaningful to us?
- Does the solution make sense with respect to the context of the problem?
- Are we able to reflect on, interpret, and evaluate the solution?
- Could a more general viewpoint provide us with a deeper understanding of the solution?
- Are there more aspects/dimensions that we can consider in our reflection?
- What have we learned from the interaction with this problem?
| Level | Description |
|---|---|
| Basic | No quality check of external sources of information. |
| Intermediate | Only limited sources of information are used, and sources are poorly checked. |
| Advanced | Multiple sources are considered and selected based on quality checks. |
| Level | Description |
|---|---|
| Basic | Only the step-counting strategy, with no reference to other measuring tools/strategies. |
| Intermediate | Includes a reflection on the limitation of step counting as a measuring strategy. |
| Advanced | Includes a reflection on the limitation of step counting as a measuring strategy and suggestions for alternative measuring strategies. |
| Category | Croatia | Austria | Germany | The Netherlands |
|---|---|---|---|---|
| Concept Validity (dimensions and levels) | Synthesises CT well. Served as an introduction to CT. Some concerns relate to the levels (descriptions are more advanced than the labels suggest). | Beneficial framework, but does not capture students’ learning. Too general. | Levels are seen as not perfectly calibrated. Adaptation needed, including specific definitions and a weighting system. | Broad agreement that the rubric captures key aspects of CT in STEM, but uncertainty about full coverage of all CT aspects. CT progression across levels seen as meaningful. Difficulty mapping diverse student responses to fixed level descriptions. Some require a better connection to theory. |
| Practicality and Barriers | Half of the students see practicality, while some concerns relate to complexity and the time needed for classroom use. | Positive for structure and overview. Its use requires additional workload. Limited use in group work or with introverted students. Goal orientation and reflection are difficult to observe. Hard to distinguish levels. | Knowledge barriers due to a lack of subject-specific knowledge. Subjectivity as a barrier: practicality is not yet established. Several constructs, including efficiency and creativity, were considered not operationalisable. Potential for open, subject-specific tasks. | Rubric perceived as helpful but demanding. Complexity and time effort seen as major barriers, with a risk of overburdening assessment practices. Better connection to theory required. Not all CT aspects considered necessary for every task. The rubric alone does not help to develop critical thinkers. |
| Willingness | Most students are positive about using the tool in a variety of situations (classroom, self-reflection, and teacher education). A concern is raised about levels becoming labels for diverse students. | Positive about use as a supporting tool for teachers. | Willing to use it for orientation and reflection. | Almost all declared a high intention to use the rubric if able to adapt it to personal needs. |
| Usefulness/Purpose | Positive about using it as a tool for learning, teaching, and assessment. | Positive. Supporting tool in teaching. Not for assessing student learning. | The tool was viewed positively as a diagnostic instrument, but several barriers were identified, leading to recommendations against its use in its current form due to the perceived subjectivity of key criteria. | Positive, as a tool to monitor students’ progress. Could work as starting point for designing/development of a lesson/assignment. |
| Adaptability | Perceived as a fixed tool. | Positive, but it requires additional workload. Needs adaptation for social forms. | Potential is recognised but needs to adapt. | The rubric is adaptable to specific disciplines (biology and mathematics). Strict adherence is not always possible, and there is a need for personalisation. |
© 2026 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Straser, O.; Bašić, M.; Doorman, M.; Weinberg, L.; Kapelari, S.; Maaß, K. Fostering Critical Thinking in STEM Education. Educ. Sci. 2026, 16, 461. https://doi.org/10.3390/educsci16030461

