1. Introduction
Scientific dissemination is no longer the exclusive domain of specialists but has become a cross-cutting competence in higher education. It is particularly strategic in initial teacher training, where future educators must not only master disciplinary content but also communicate it effectively to diverse audiences—students, families, and educational communities (
Fuentes-Cancell et al., 2026;
Flores Mejía et al., 2024;
O’Connor et al., 2021). In this way, communication serves as a bridge between science and everyday life, strengthening scientific literacy and critical thinking from the earliest stages of education (
Kankam et al., 2024).
In this study, we use the term scientific dissemination or science communication competence to refer to the capacity of future teachers to mediate scientific knowledge for non-specialist audiences in educational contexts. This competence involves identifying and critically selecting reliable sources, synthesising complex information without distorting evidence, and translating disciplinary concepts into accessible explanations linked to relevant social issues. It also includes organising messages into coherent narratives, using clear and engaging language, making informed decisions about the most appropriate media and formats—such as texts, videos, infographics, or podcasts—and applying visual and design principles to support understanding. Finally, science communication competence has an ethical and social component, which implies recognising the limits of available evidence, avoiding misleading or categorical claims, and promoting critical thinking in order to counter misinformation and reflect on the broader social implications of scientific knowledge (
Stofer & Wolfe, 2018;
L. S. Davis, 2014).
Mastering this skill demands adapting discourse for non-specialist audiences and employing clear, engaging language free from unnecessary technicality (
Baram-Tsabari & Lewenstein, 2017;
Dahlstrom, 2014). Communicative effectiveness is further enhanced through structured narratives and coherent oral presentations (
Dahlstrom, 2014). In digital contexts, dissemination entails selecting appropriate formats—texts, videos, infographics, or podcasts (
Al-Khresheh, 2024)—using visual and design principles strategically (
Allgaier, 2019;
Gioltzidou et al., 2024) and developing a critical understanding of the communicative logic of social media (
Gioltzidou et al., 2024).
In the context of teacher education, scientific dissemination fosters a deeper understanding of disciplinary content, reinforces professional identity, and positions future teachers as mediators between science and society. Consequently, it constitutes a complex professional competence—cognitive, communicative, digital, and ethical—that must be developed through authentic learning experiences supported by Project-Based Learning (PBL) and Experiential Learning (EL) methodologies.
1.1. Project-Based Learning and Experiential Learning in Higher Education
In higher education, the development of professional and cross-cutting competencies requires methodologies that transcend unidirectional knowledge transmission, integrating theory, practice, and critical reflection. Among these, PBL and EL stand out for their constructivist orientation and student-centered focus. Although they differ in emphasis, both approaches converge in situating learning within authentic contexts, linked to real-world problems and mediated by collaboration and reflective inquiry (
Shivni et al., 2021).
PBL frames a problem or research question as the nucleus of the learning process, promoting autonomy, active inquiry, and teamwork (
Kokotsaki et al., 2016). Its culmination in a verifiable product—such as a report, proposal, prototype, or presentation—serves as tangible evidence of learning and requires the integration of knowledge, skills, and attitudes around a shared goal (
Ammar et al., 2024). It has become an effective strategy for developing communication, management, and collaboration skills, while connecting academic content with professional and social demands (
Kong et al., 2024).
In turn, EL, grounded in Kolb’s experiential cycle and its subsequent revisions, places lived experience at the core of learning (
Morris, 2019). Knowledge emerges from contextualized experiences that demand active engagement and acceptance of uncertainty (
Jamison et al., 2022). Its sequence—concrete experience, reflective observation, abstract conceptualization, and active experimentation—fosters deep learning by critically analyzing experience, recognizing the provisional nature of knowledge, and transferring insights to new contexts (
Kavitha Devi & Thendral, 2023;
Tembrevilla et al., 2023).
Both approaches emphasize the role of active learners capable of linking theory with practice and reflecting on their own learning process (
Singha & Singha, 2024). While PBL structures learning around collaborative projects and concrete products, EL underscores situated experience, uncertainty, and critical reflection as drivers of transformative learning (
Su, 2024). Together, they provide a comprehensive pedagogical framework for developing complex competencies—communication, problem solving, collaboration, and adaptability—beyond the mere acquisition of disciplinary knowledge.
1.2. Previous Studies on the Development of Scientific Dissemination Skills
Recent research in higher education has highlighted that students often display limited skills in communicating scientific content to non-specialist audiences (
Ashcraft et al., 2020;
Leon Duarte et al., 2025). Studies on science communication training and dissemination-oriented tasks report recurrent difficulties in identifying reliable sources, translating technical language into accessible explanations, structuring coherent narratives, and making a critical and pedagogically grounded use of digital and multimodal resources (
R. Davis & D’Lima, 2020;
Brownell et al., 2013;
Burns et al., 2003). Although several programmes and courses have incorporated communication activities, these experiences are frequently peripheral, short-term, or focused on isolated products rather than on the systematic development of a broader competence in science communication (
Reincke et al., 2020;
Tillinghast et al., 2020).
In parallel, numerous studies have highlighted the potential of Project-Based Learning and Experiential Learning for developing scientific competencies in higher education. Both methodologies, grounded in constructivist principles, posit that knowledge is constructed more meaningfully when students actively engage in solving authentic problems, reflect on their experiences, and produce communicable results (
Maulida et al., 2024). Empirical and theoretical research converges in showing that these approaches strengthen cognitive, procedural, communicative, and social competencies, dimensions closely related to the development of scientific dissemination skills.
PBL provides an organisational framework that immerses students in complex, collaborative projects. The creation of a tangible, socially relevant final product requires integrating knowledge, planning processes, distributing responsibilities, and effectively communicating results. In contrast, Experiential Learning emphasises situated experience and critical reflection as catalysts for deep learning. By confronting students with real contexts, uncertainty, and decision-making, it fosters conceptual understanding and the transfer of knowledge to new scenarios (
Kokotsaki et al., 2016;
Morris, 2019). The convergence of both approaches is particularly evident in science education, where PBL promotes scientific competence—encompassing conceptual, procedural, and epistemic dimensions—through realistic challenges that demand modelling, experimental design, argumentation, validation of evidence, and communication of findings to diverse audiences (
Lavado-Anguera et al., 2024).
Empirical research in university contexts has also shown that working on collaborative projects enhances autonomy, creativity, leadership, and communication skills (
Toledo Morales & Sánchez García, 2018), while bibliometric analyses link these methodologies to improvements in scientific literacy, critical thinking, and research competence (
Misbah et al., 2024). Community-based experiences further support these findings: projects situated in real social contexts promote metacognitive awareness, the construction of a scientific identity (
Avila-Bront, 2025), and the development of communication, problem-solving, and adaptability skills (
Collins-Nelsen et al., 2021). When PBL and Experiential Learning are integrated into authentic projects, students not only learn disciplinary content but also develop the ability to translate and mobilise knowledge within educational and community environments.
Furthermore, Experiential Learning increases motivation and engagement through the sequence of doing, reflecting, thinking, and applying, which encourages cognitive, emotional, and behavioural involvement. Research in mathematics and STEM education confirms that these approaches foster creativity, participation, and academic performance, demonstrating that students learn more effectively in practical and communicative contexts (
Maulida et al., 2024;
Nguyen et al., 2025;
Uyen et al., 2022).
Overall, evidence indicates that PBL and Experiential Learning not only reinforce conceptual understanding but also cultivate comprehensive scientific competencies, such as inquiry, argumentation, critical reflection, collaboration, and communication. These competencies are inseparable from scientific dissemination, as they enable students to make scientific knowledge understandable, relevant, and socially meaningful.
However, despite the growing body of work on active and experiential methodologies, there is still a lack of empirical studies that focus specifically on the development of science communication competence in initial teacher education. Most contributions either address scientific competence in general, examine isolated communication activities, or focus on STEM undergraduates, while research on pre-service teachers remains scarce. Furthermore, few studies combine Project-Based Learning and Experiential Learning in a coherent pedagogical framework, and even fewer adopt mixed-methods designs that capture both the quantitative evolution and the qualitative meanings of students’ acquisition of science communication skills across cognitive, communicative, media–digital, and ethical–social dimensions. The present study seeks to contribute to this gap by examining how an integrated PBL and Experiential Learning proposal supports the development of science communication competence in two cohorts of pre-service teachers.
1.3. Research Objectives and Hypotheses
The overall purpose of this study was to analyse the development of science communication competence in initial teacher education through a teaching proposal grounded in Project-Based Learning and Kolb’s Experiential Learning model. Previous research and our own diagnostic evidence indicate that many pre-service teachers reach advanced stages of their degree with limited skills in identifying reliable sources, translating scientific knowledge for non-specialist audiences, and using digital and multimodal resources in pedagogically meaningful ways. This situation reveals a gap between the central role that science communication is expected to play in contemporary teaching and the fragmented or insufficient preparation typically offered in university curricula. The present study addresses this gap by examining the impact of an integrated PBL and Experiential Learning intervention on the development of science communication competence across cognitive, communicative, media–digital, and ethical–social dimensions.
Based on this problem, the general objective was to determine whether the intervention produced significant improvements in the four dimensions of science communication competence.
1.3.1. Quantitative Hypotheses
H1. Students will show a statistically significant improvement in the cognitive dimension of science communication competence between the pretest and posttest.
H2. Students will show a statistically significant improvement in the communicative dimension of science communication competence between the pretest and posttest.
H3. Students will show a statistically significant improvement in the media–digital dimension of science communication competence between the pretest and posttest.
H4. Students will show a statistically significant improvement in the ethical–social dimension of science communication competence between the pretest and posttest.
1.3.2. Qualitative Research Questions
To complement and deepen the quantitative results, the study addressed the following qualitative research questions:
1. How do students describe their experience with the instructional intervention based on Project-Based Learning and Experiential Learning?
2. What aspects of the intervention do students perceive as relevant in shaping their learning processes related to science communication competence?
3. How do students explain the ways in which the intervention may have influenced their approaches to communicating, interpreting, and mediating scientific knowledge?
2. Materials and Methods
2.1. Participants
The study involved a convenience sample of 79 students from the University of Valladolid (Duques de Soria Campus, Spain), who participated voluntarily after receiving detailed information about the study’s objectives and procedures. Participants were enrolled in the Bachelor’s Degree in Early Childhood Education (n = 36; 26 women; aged 19–23) and the Bachelor’s Degree in Primary Education (n = 43; 13 women; aged 20–23).
The selection of participants was intentional and context-based, corresponding to the educational and organizational characteristics of the courses in which the teaching intervention was implemented. Due to these contextual conditions, random assignment was not possible. All participants completed both the pretest and posttest assessments, and no dropouts occurred during the study.
The qualitative phase was conducted with the same group of participants, consistent with the logic of the sequential explanatory mixed-methods design (QUANT→QUAL), which aimed to explore in depth the perceptions and experiences of those who had taken part in the intervention.
2.2. Ethical Conditions
Participation was voluntary, and all students signed an informed consent form after receiving detailed information about the objectives, procedures, and guarantees of the study. The research was approved by the Ethics Committee of the University of Valladolid (reference: 2025-CEUVa-003-Z-1-D-4), complying with the ethical principles established in the Declaration of Helsinki and European data protection regulations.
2.3. Design
The study followed a sequential explanatory mixed-methods design (QUANT→QUAL) structured in two complementary phases. In the quantitative phase, the changes produced after the teaching intervention were identified and measured; in the qualitative phase, these results were explored in greater depth by analyzing participants’ perceptions and experiences. This approach enabled the integration of numerical and narrative evidence to generate interpretive meta-inferences aimed at achieving a broader understanding of the phenomenon under study (
Creswell & Plano Clark, 2018;
Fetters et al., 2013).
The quantitative component employed a quasi-experimental design with intact, nonequivalent groups and pretest–posttest measurements, without random assignment of participants. Such a design is suitable for educational contexts where randomization is constrained by ethical or organizational factors, yet it allows the impact of a teaching intervention to be examined under real classroom conditions (
Hernández Sampieri et al., 2018). The findings from this phase provided the empirical basis for the subsequent qualitative analysis, aimed at deepening the interpretation of the observed changes.
The subjects in which the intervention was implemented were taught in a face-to-face format, supplemented by supervised self-study sessions. Each degree program had a single theoretical class group; practical sessions were organized into one group in the Early Childhood Education degree and two parallel groups in the Primary Education degree. The intervention was carried out in two independent cohorts corresponding to different degree subjects to assess the consistency of impact across training contexts.
To mitigate threats to internal validity—such as maturation, history, or testing effects—the materials, teaching sequence, and assessment rubrics were standardized in both cohorts, and the same instructors implemented the intervention. This procedural consistency ensured comparability between programs and strengthened the reliability of the results.
The qualitative phase adopted an interpretive-descriptive approach, designed to explain and expand the quantitative findings by identifying the underlying learning processes. The same participants took part, in accordance with the principle of sequential connection (QUANT→QUAL). Three focus groups, each comprising 10–13 students, were organized to ensure balanced representation by degree program, gender, and level of engagement. Additionally, systematic pedagogical observations were conducted throughout the intervention.
Focus-group sessions were held after completion of the teaching experience, recorded, and fully transcribed. Their content was analyzed through thematic coding (
Braun & Clarke, 2006) until conceptual saturation was achieved.
Finally, the quantitative and qualitative results were integrated through joint interpretation, relating patterns of change to participants’ perceptions and experiences. This process led to the formulation of the following meta-inferential questions guiding the overall analysis:
4. How do students’ perceptions explain and complement the quantitative improvements observed in scientific dissemination skills?
5. What pedagogical mechanisms associated with PBL and EL emerge from the integration of quantitative and qualitative results within each competency dimension?
6. What meta-inferences can be derived from combining both data sets regarding the impact of the teaching proposal on initial teacher education?
2.4. Instruments
Three complementary instruments were used to assess the dimensions of scientific communication competence, selected according to the objectives of each phase of the sequential explanatory mixed-methods design (QUANT→QUAL).
In the quantitative phase, an ad hoc questionnaire was administered, composed of 16 items distributed across four dimensions: cognitive, communicative, media–digital, and ethical–social (see Table 1). The items were developed based on a systematic review of recent literature on science communication and aligned with key frameworks of digital teaching competence, including DigCompEdu (Redecker, 2017) and the Spanish Reference Framework for Digital Teaching Competence (INTEF, 2022).
Content validity was established through expert judgment involving 15 PhD specialists in Education, complemented by a pilot test with students whose characteristics were similar to those of the study sample. Construct validity was examined through exploratory factor analysis (EFA), and the instrument demonstrated satisfactory internal consistency (α = 0.80; ω = 0.86).
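For readers interested in reproducing this kind of reliability check, the brief sketch below shows one standard way to compute Cronbach's alpha from an item-response matrix. The 79 × 16 score matrix, the 1–5 scale, and the random values are placeholder assumptions for illustration only; they do not reproduce the study's data, and the omega coefficient would require a separate factor-analytic estimate.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                              # number of items
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the total score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Placeholder data: 79 respondents x 16 items rated on a 1-5 scale (not the study's data).
rng = np.random.default_rng(seed=0)
scores = rng.integers(1, 6, size=(79, 16))
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```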
In the qualitative phase, two complementary instruments were employed. The first was a structured pedagogical observation guide, designed according to the criteria proposed by Ruiz-Olabuénaga (2012) to systematically record students’ behaviors during practical activities. Content validity was ensured through review by three specialists in Education, and minor adjustments were incorporated prior to implementation.
The second instrument consisted of a semi-structured focus group protocol, designed to elicit participants’ perceptions of the scientific dissemination process. The protocol was reviewed by experts, piloted with a non-sample group, and subsequently applied to the full participant cohort across three sessions involving 10 to 13 students each. Illustrative guiding questions included: “How did project-based design and experiential learning contribute to developing your scientific dissemination skills?” and “How did you feel during the learning process?”
All sessions were audio-recorded, transcribed verbatim, and independently coded by two researchers. Discrepancies were resolved by consensus, ensuring inter-rater reliability and analytical rigor.
It should be noted that the questionnaire assesses students’ perceived competence rather than direct performance-based ability. This approach aligns with competence-assessment traditions in higher education, where self-perceived ability is considered an indicator of metacognitive awareness, confidence, and understanding of the processes involved in communicating scientific content. The qualitative phase, based on structured observations and focus groups, was included to complement this limitation by exploring how students enacted and articulated their communication strategies in authentic dissemination tasks.
2.5. Procedure
The research was conducted during the 2024–2025 academic year in two initial teacher training courses: Guidance and Tutoring for Students and Families (Early Childhood Education Degree, n = 36) and Methods of Innovation and Diagnosis in Education (Primary Education Degree, n = 43). Each course comprised 60 h of classroom instruction and 90 h of supervised independent work. The intervention lasted 15 weeks per cohort (September–January and February–June) and was structured into three successive phases: diagnostic, development, and evaluation.
To ensure that the intervention explicitly targeted the development of science communication competence, the PBL and Experiential Learning sequence was contextualised around the creation of scientific dissemination products for non-specialist audiences. Students selected a scientific topic, identified an intended audience, and analysed reliable scientific sources in order to translate technical information into accessible explanations. They planned and produced multimodal dissemination outputs such as videos, infographics, podcasts, or illustrated texts, applying principles of clarity, narrative coherence, audience adaptation, ethical communication, and responsible use of evidence. Iterative feedback sessions focused specifically on the effectiveness of students’ communication strategies, while structured reflection cycles based on Kolb’s model guided them to evaluate and refine how they mediated scientific knowledge. These actions ensured that the intervention was intentionally oriented toward strengthening science communication competence rather than functioning as a general PBL–EL experience.
In the initial phase, a diagnostic questionnaire assessing science communication competence was administered to establish a baseline across the cognitive, communicative, media–digital, and ethical–social dimensions (Figure 1).
The development phase, constituting the core of the intervention, was grounded in the principles of PBL and EL. Activities were designed to foster the progressive development of science communication competence through authentic experiences, collaboration, and critical reflection. In the Early Childhood Education course, students developed a Tutorial Action Plan focused on communication with families and students. In the Primary Education course, students prepared a literature review and an innovative educational proposal addressing a real educational challenge. Throughout the process, diverse strategies were integrated:
(a) The use of academic and digital resources (Scopus, Web of Science, UVaDoc, ResearchGate);
(b) The educational application of generative artificial intelligence (ChatGPT v3) to support content synthesis and reformulation;
(c) Multimodal production through infographics, summaries, and audiovisual capsules;
(d) Critical and ethical reflection on the products developed (Table 2).
Continuous structured pedagogical observations were conducted to record student performance and provide individualised feedback aimed at reinforcing strengths and addressing areas for improvement.
In the final phase, a post-intervention questionnaire was administered to assess the evolution of students’ competencies. Subsequently, the qualitative phase was developed to deepen the interpretation of the quantitative findings. Three focus groups, each comprising 10–13 participants selected from the intervention cohort, were organised at the end of the teaching experience. Sessions were audio-recorded, fully transcribed, and analysed through thematic coding (Braun & Clarke, 2006) until conceptual saturation was achieved. The focus groups were not intended to assess performance directly, but to explore students’ reflections on the strategies, criteria, and decisions they used while producing dissemination products, thereby providing insight into key metacognitive and strategic components of science communication competence.
Finally, the quantitative and qualitative results were jointly interpreted to integrate patterns of change with students’ perceptions and experiences.
Continuous formative assessment was implemented throughout the intervention through structured pedagogical observations, which made it possible to monitor students’ progress and adjust learning activities as needed. Individualized feedback was also provided to reinforce strengths—such as the accurate use of sources and the clarity of explanations—and to identify areas for improvement, particularly in the critical appraisal of information and the adaptation of scientific content for non-specialist audiences. This formative process supported the development of science communication competence by guiding students to make progressively more informed, coherent, and ethically responsible dissemination decisions.
2.6. Data Analysis
Data analysis followed a mixed sequential integration approach, combining quantitative and qualitative procedures to enhance the credibility and interpretive validity of the findings through the convergence of evidence.
In the quantitative phase, descriptive and inferential statistical analyses were conducted. The normality of the distributions was examined using the Shapiro–Wilk test, after which paired-sample t-tests were applied to compare pretest and posttest scores. The effect size was estimated using Pearson’s r coefficient and interpreted according to the criteria established by Cohen et al. (2002).
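To make this analytic sequence concrete, the sketch below illustrates how the normality check, the paired-samples t-test, an effect size r derived from the t statistic, and the complementary Wilcoxon signed-rank test reported in Section 3.1 could be computed with standard tools. The score vectors are hypothetical placeholders, not the study's data, and the conversion r = sqrt(t²/(t² + df)) is one common way of expressing the effect in terms of Pearson's r.

```python
import numpy as np
from scipy import stats

# Placeholder pretest/posttest scores for one competence dimension (1-5 scale).
pre = np.array([1.4, 1.2, 1.6, 1.1, 1.5, 1.3, 1.7, 1.2, 1.4, 1.6])
post = np.array([3.5, 3.2, 3.8, 3.1, 3.6, 3.4, 3.9, 3.3, 3.5, 3.7])

# 1. Normality of the pretest-posttest differences (Shapiro-Wilk).
shapiro_stat, shapiro_p = stats.shapiro(post - pre)

# 2. Paired-samples t-test comparing posttest with pretest scores.
t_stat, t_p = stats.ttest_rel(post, pre)

# 3. Effect size r derived from the t statistic: r = sqrt(t^2 / (t^2 + df)).
df = len(pre) - 1
r_effect = np.sqrt(t_stat**2 / (t_stat**2 + df))

# 4. Complementary nonparametric check (Wilcoxon signed-rank test).
wilcoxon_stat, wilcoxon_p = stats.wilcoxon(post, pre)

print(f"Shapiro-Wilk: W = {shapiro_stat:.3f}, p = {shapiro_p:.3f}")
print(f"Paired t-test: t({df}) = {t_stat:.2f}, p = {t_p:.4f}, r = {r_effect:.2f}")
print(f"Wilcoxon: W = {wilcoxon_stat:.1f}, p = {wilcoxon_p:.4f}")
```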
In the qualitative phase, data were analyzed through thematic content analysis, following the procedures proposed by Braun and Clarke (2006) and Strauss and Corbin (1995) for open and axial coding. Coding was conducted in two stages, based on the transcripts of focus groups and pedagogical observations, and organized into categories aligned with the theoretical model of scientific dissemination competencies. Two researchers independently performed the coding and resolved discrepancies through consensus, ensuring inter-coder reliability. The development of a codebook further ensured the transparency and traceability of the analytical process.
The results were integrated through a sequential connection strategy (QUANT→QUAL), whereby the quantitative findings guided the subsequent qualitative interpretation, enabling a dialogue between data sets. This articulation facilitated corroboration and interpretive convergence, allowing the results to be contrasted and expanded from complementary methodological perspectives. Moreover, the consistency of effects across the two participating cohorts—Early Childhood Education and Primary Education—was analyzed, and the findings were interpreted in light of the conceptual frameworks of Project-Based Learning (PBL) and Kolb’s (1984) Experiential Learning Cycle.
This procedure—consistent with current recommendations in mixed-methods research (
Creswell & Plano Clark, 2018;
Hernández Sampieri et al., 2018;
Fetters et al., 2013)—strengthened the internal validity and interpretive credibility of the study, providing a deeper and contextualized understanding of the impact of the teaching proposal on the development of scientific dissemination skills in initial teacher education.
3. Results
3.1. Quantitative Phase
Shapiro–Wilk tests of normality were applied to the differences between the initial and final assessments, revealing significant deviations from normality in some cases (p < 0.05). However, since the t-test for related samples is considered robust to minor violations of normality in moderate sample sizes (n = 36 and n = 43), it was retained as the primary analytical procedure. To strengthen the robustness of the findings, the analysis was complemented by the nonparametric Wilcoxon signed-rank test (see Table 3).
The results revealed a consistent pattern of improvement across all dimensions of scientific communication competence. In both cohorts, the differences between the initial and final measurements were statistically significant (p < 0.001), with very large effect sizes (Group 1: r = 0.89–0.91; Group 2: r = 0.93–0.97). Group 2 obtained slightly higher final means (3.73–3.76) compared to Group 1 (3.35–3.61), showing particularly strong effects in the communicative (r = 0.97) and media–digital (r = 0.96) dimensions.
Inferential analyses confirmed statistically significant gains in the four evaluated dimensions (p < 0.001), providing evidence of a substantial and consistent enhancement of scientific dissemination skills following the intervention. The effect sizes obtained indicate a large practical impact in both groups, especially within the communicative and media–digital domains.
As shown in Figure 2, the initial scores reflected low levels in all dimensions (M = 1.1–1.6 out of 5), particularly in the cognitive and communicative dimensions. This homogeneity suggests that, regardless of degree program or subject area, participants began with notable deficiencies in scientific communication skills. Within these dimensions, the indicators with the lowest initial performance corresponded to the identification of reliable sources (ID1) and the adaptation of discourse for non-specialized audiences (ID5).
After the intervention, scores increased to medium-to-high levels (M = 3.3–3.8), representing an average improvement of at least two points compared with the initial values.
In the cognitive dimension, both groups demonstrated significant progress in conceptual understanding and knowledge mediation. The indicator ID3 (explaining concepts using analogies) showed the greatest gains (r = 0.93 in G1; r = 0.98 in G2). In contrast, ID1 (identifying reliable sources) exhibited a more moderate increase (from 1.7 to 2.7 in G1; from 1.3 to 2.9 in G2).
The communicative dimension yielded the largest effect sizes, particularly in Group 2 (r = 0.97). Indicators ID5 (adapting discourse to non-specialized audiences) and ID8 (clarity in oral presentations) stood out, both showing remarkable improvements (r ≥ 0.92).
The media–digital dimension also displayed very large improvements in both cohorts (r = 0.91 in G1; r = 0.96 in G2). The most strengthened indicators were ID9 (selection of appropriate media) and ID10 (use of data visualization).
Within the ethical and social dimension, which began with relatively higher pretest scores (M ≈ 1.5), substantial progress was observed. Indicators ID13 (accuracy of information conveyed) and ID14 (recognizing the limits of evidence) showed high effect sizes (r ≈ 0.92–0.96). Conversely, ID15 (promoting critical thinking) and ID16 (considering social, cultural, and political impacts) presented slightly lower yet still large effects (r ≈ 0.81 in G1; r = 0.84–0.86 in G2).
Overall, the data analysis across both samples confirmed low initial levels in all dimensions (M = 1.1–1.6), followed by significant post-intervention increases to medium-high levels (M = 3.3–3.8). The overall effect sizes were very large (Group 1: r = 0.89–0.91; Group 2: r = 0.93–0.97). These findings demonstrate a substantial and sustained improvement in scientific communication competence as a result of the teaching intervention, with large effects across all evaluated dimensions.
3.2. Qualitative Phase
The focus groups conducted in Early Childhood Education (Group 1, three subgroups) and Primary Education (Group 2, four subgroups) provided explanatory nuances that deepened and broadened the interpretation of the quantitative results, revealing how students experienced the teaching approach based on Project-Based Learning (PBL) and Kolb’s Experiential Learning (EL) cycle.
Participants consistently described the intervention as a departure from traditional methodologies. During the initial weeks, this novelty generated uncertainty and tension, which gradually transformed into motivation and pride in their achievements. One student from subgroup 2 of Group 1 reflected: “At first it was stressful—we didn’t know if we could produce a quality product—but in the end, we felt proud because what we wanted to communicate was understood.”
Similarly, a participant from Group 2 noted: “It was a different way of learning, more practical and motivating than the usual classes.”
These perceptions help explain the quantitative increases in motivation and perceived usefulness of the proposal, establishing a direct link between the learning experience and the improvement of communicative competence.
The testimonies revealed advances across the four evaluated dimensions:
Cognitive dimension. Students emphasized the use of analogies as a key strategy for understanding and communicating complex concepts. A member of Group 1 stated, “Using analogies was the most useful; it helped us think differently,” while a participant from Group 2 added, “Analogies helped me reflect on how we learn ourselves.” These insights support the quantitative improvement in ID3 and underscore the connection between metacognitive reflection and experiential learning.
Communicative dimension. Participants highlighted the importance of speaking clearly and adapting discourse to different audiences. In Group 1, a student recalled, “When we presented to families, we realized we had to use fewer technical terms.” In Group 2, another noted, “The project helped us lose our fear of public speaking; now we feel more confident explaining complicated topics.” These accounts complement the quantitative gains in ID5 and ID8, clarifying why this dimension recorded the largest effect sizes—situated practice promoted self-assessment and communicative improvement.
Media–digital dimension. Students recognized the usefulness of technological resources in enhancing message clarity. One participant commented, “We had never used digital tools so much; it was a discovery to learn how to choose what best conveyed the message.” A participant from Group 2 added, “A good graphic or clear image is worth more than a long explanation.” These testimonies align with the improvements in ID9 and ID10, demonstrating how digital literacy strengthened communicative effectiveness.
Ethical and social dimensions. Students developed an awareness of the responsibility inherent in communicating scientific information accurately. A Group 1 student reflected, “We realized how important it is to convey information well so as not to confuse families.” A participant from Group 2 added, “We saw that communicating science also has social consequences; speaking inaccurately can lead to misinformation.” These reflections support the quantitative progress in ID13 and ID14, while also highlighting persistent challenges in ID1 (source literacy) and ID16 (consideration of social impacts), which were less developed in the questionnaires.
The emerging patterns revealed both convergences and tensions with the statistical findings. While the questionnaires documented generalized improvement, the qualitative evidence clarified the processes underlying these changes:
Motivation stemmed from overcoming initial uncertainty.
Communicative clarity emerged from authentic interaction with real audiences.
The ethical dimension deepened through awareness of the risks of misinformation.
Thus, the focus groups not only validated the quantitative results but also explained, nuanced, and expanded them, highlighting both strengths (ID3, ID5, ID8, ID13) and areas for improvement (ID1, ID16).
The thematic analysis (Braun & Clarke, 2006) made it possible to trace these perceptions, coding them into categories aligned with the dimensions and indicators of scientific communication competence. This analytic process enabled the articulation of qualitative experiences with the quantitative outcomes, yielding richer interpretive insights into the observed changes.
Across both cohorts, students expressed positive emotions—pride, confidence, and creativity—which acted as catalysts for experiential learning, reinforcing the pedagogical logic of PBL and Kolb’s EL cycle.
Table 4 summarizes this WHAT→WHY integration, presenting for each dimension the relationships among the questionnaire results, students’ perceptions, the degree of integration (strong, moderate, or complementary convergence), and the theoretical anchoring in PBL and Kolb’s Experiential Learning framework. The themes presented in Table 4 represent students’ perceptions of their learning processes, specifically how they understood, justified, and reflected on the strategies they employed to communicate scientific knowledge throughout the intervention.
The focus groups provided an integrated qualitative explanation of the quantitative results, illustrating how the teaching proposal not only fostered the acquisition of technical skills in scientific dissemination but also cultivated an awareness of the social responsibility inherent in communicating with rigor, clarity, and an educational purpose.
This integration of QUANTITATIVE→QUALITATIVE evidence demonstrates that the learning outcomes identified in the questionnaires were underpinned by collaborative dynamics, critical reflection, and experiential learning processes. Together, these findings reinforce the interpretive validity of the study and highlight its pedagogical relevance for initial teacher education in contemporary contexts of scientific literacy.
3.3. Integration of the Results Obtained
Table 5, presented as a joint display, summarizes in parallel the correspondence between the statistical results, students’ perceptions, and their theoretical interpretation, providing a visual synthesis of the convergence and complementarity between both strands of evidence.
The first integrative question explored how students’ perceptions explain and complement the quantitative improvements observed in scientific communication skills. A strong convergence was identified between the two strands in the cognitive, communicative, and media–digital dimensions.
In the cognitive dimension, the improvement in the ability to explain concepts using analogies (ID3) coincided with testimonies emphasizing their usefulness in promoting understanding—“Using analogies was the most useful; it helped us think differently.” This correspondence leads to Meta-inference 1: the systematic use of analogies functions as a pedagogical mechanism that facilitates the transition from concrete experience to abstract conceptualization, consistent with Kolb’s Experiential Learning Cycle.
The communicative dimension recorded the most pronounced effects (r up to 0.97 in Group 2), supported by qualitative reports of reduced stage fright and greater clarity in oral expression. This alignment supports Meta-inference 2: practice in front of authentic audiences enhances communicative competence in initial teacher training by fostering self-efficacy and reflective awareness.
In the media–digital dimension, improvements in media selection (ID9) and data visualization (ID10) were linked to reflections on communicative efficiency—“A good graphic is worth more than a long explanation.” This relationship underpins Meta-inference 3: digital literacy transcends technical mastery, fostering critical decision-making regarding the most effective media for scientific dissemination.
The second integrative question examined the pedagogical mechanisms associated with PBL and EL. The findings indicate that PBL acted as a catalyst for situated learning by engaging students in authentic, problem-centered tasks, while EL accounted for the progression from initial uncertainty to critical reflection and active experimentation. This integration suggests that progress extends beyond statistical improvement, reflecting deep learning processes mediated by practice, collaboration, and reflection.
The third integrative question addressed the implications for teacher education. In the ethical and social dimension, significant gains were found in information fidelity (ID13–ID14) and in awareness of the responsibility to communicate accurately, although persistent limitations remained in information literacy (ID1) and in the consideration of social impacts (ID16). From this emerges Meta-inference 4: the ethical development of scientific dissemination requires pedagogical scaffolding that strengthens critical reading, media literacy, and communicative responsibility.
Taken together, Table 5 and the derived meta-inferences demonstrate a substantive integration of results, where quantitative and qualitative approaches not only complement each other but also generate explanatory knowledge about the pedagogical mechanisms underpinning the development of scientific dissemination skills. The study is thus consolidated as a sequential explanatory mixed-methods design (QUANTITATIVE→QUALITATIVE) that provides both empirical evidence of effectiveness and a deep interpretive understanding of the training processes involved, thereby strengthening validity, internal consistency, and transferability to other educational contexts.
The integration of results corroborated the consistency of findings, broadened their interpretive scope, and reinforced the credibility and contextual relevance of the knowledge generated.
4. Discussion
The diagnostic phase results revealed low initial levels across all dimensions of scientific communication competence (M = 1.1–1.6). From a pedagogical perspective, this finding indicates that students, even at advanced stages of their university studies, showed notable limitations in identifying reliable sources (ID1), adapting discourse for non-specialized audiences (ID5), and considering the social implications of knowledge (ID16). Interpreted through Kolb’s Experiential Learning Model, participants appeared to remain in a phase of fragmented concrete experience, not yet having reached the stages of reflective observation and abstract conceptualization necessary for the reorganization of knowledge and the development of critical understanding.
These initial shortcomings were corroborated in the qualitative phase, where the focus groups revealed convergent perceptions: “At first, we had a hard time finding reliable sources” (G1, Sub1) and “I didn’t think I could explain a scientific concept” (G1, Sub3). Such testimonies underscore the value of Project-Based Learning (PBL) as a framework for promoting authentic, socially meaningful learning capable of compensating for the weaknesses detected in the initial diagnosis.
Following the implementation of the teaching proposal, data from the development phase reflected consistent and statistically significant progress in the four competency dimensions (M = 3.3–3.8; r ≥ 0.89–0.97). When interpreted alongside the qualitative results, this evidence demonstrates the internalization of new ways of understanding, communicating, and applying scientific knowledge.
In the cognitive dimension, a marked strengthening was observed in the ability to explain concepts through analogies (ID3)—a strategy explicitly valued by participants: “The exercise of using analogies was the most useful; it helped us think differently” (G1, Sub3); “Using analogies helped me reflect on how we learn ourselves” (G2, Sub4).
In the communicative dimension, the indicators of discourse adaptation (ID5) and clarity in oral presentations (ID8) showed remarkable progress, closely linked to processes of self-efficacy and confidence: “We had to speak more clearly and use fewer technical terms” (G1, Sub1); “The project helped me overcome my fear of public speaking” (G2, Sub1).
The media–digital dimension also exhibited significant improvement, particularly in media selection (ID9) and data visualization (ID10). These quantitative findings were complemented by student reflections on the communicative power of digital resources: “We had never used digital tools so much before; we learned to choose what best conveyed the message” (G1, Sub2); “A good graph is worth more than a long explanation” (G2, Sub2).
Finally, the ethical and social dimension revealed substantial progress in the accuracy of information conveyed (ID13) and in the recognition of the limits of evidence (ID14). The focus group discussions reflected a heightened awareness of the social responsibility inherent in rigorous dissemination: “We realized that disseminating science also has social consequences; you can’t communicate without rigor” (G2, Sub3). Nevertheless, the integration of both methodological strands revealed that identifying reliable sources (ID1) and considering social impacts (ID16) remain areas of lesser development, suggesting the need for sustained pedagogical support aimed at consolidating critical and slow-maturing skills.
4.1. Interpretation and Pedagogical Integration of the Results
The integration of results confirms the internal coherence of the didactic design and its methodological soundness within a sequential mixed-methods approach. In this sense, PBL functioned as a structural bridge between theory and practice, connecting learning with real-world problems and authentic audiences, while Kolb’s EL cycle provided the explanatory framework for the transformative process observed among students.
The progression identified—from initial uncertainty to the critical appropriation of scientific dissemination—can be interpreted as a transition through the phases of Kolb’s experiential cycle: concrete experience (development of authentic products), reflective observation (analysis of one’s own difficulties), abstract conceptualization (identification of effective communication strategies), and active experimentation (refinement and improvement of the developed products). This trajectory indicates that learning extended beyond the acquisition of technical skills, encompassing a cognitive and attitudinal reconstruction oriented toward communicating knowledge with clarity, rigor, and social purpose.
Students’ reported emotions of pride, confidence, and motivation—for instance, “When we saw the final result, we felt proud because what we wanted to communicate was understood” (G1, Sub2)—reinforce the interpretation that concrete experience and active experimentation evolved into reflection and critical conceptualization, consistent with the logic of Kolb’s model. Thus, positive emotions acted as affective mediators of experiential learning, fostering cognitive engagement and meaningful knowledge retention.
The consistency between quantitative and qualitative findings across two cohorts—Early Childhood Education and Primary Education—strengthens the internal validity and transferability of the proposed model. The evidence demonstrates that the teaching approach not only led to measurable improvements in scientific communication competencies but also generated a transformative shift in how future teachers approach, interpret, and disseminate scientific knowledge. Overall, participants evolved from being recipients of information to becoming active mediators of knowledge, capable of interpreting, adapting, and communicating science from a critical, ethical, and socially responsible perspective.
4.2. Integration of Findings with the Literature on Active and Experiential Methodologies
The findings of this study are clearly aligned with the international literature that underscores the transformative potential of active methodologies in higher education. Numerous studies have shown that EL and PBL foster deep understanding, affective engagement, and the development of cross-cutting competencies in complex educational contexts.
In particular, Kolano and Sanczyk (2022) demonstrate that narrative and digital learning experiences generate sustainable attitudinal change, a phenomenon consistent with the trust, responsibility, and self-regulation expressed by participants in this study. Similarly, the works of Bennett et al. (2016) and Tinkler et al. (2019) reveal that critical service-learning and community engagement projects strengthen social awareness, professional identity, and civic commitment—dimensions also evidenced here in the ethical–social advances (ID13–ID14) and in students’ reflection on the social impact of scientific communication (ID16).
Likewise, Zocher and Hougham (2020) confirm that linking academic content to real-world problems promotes critical reflection and the contextualization of knowledge, findings that resonate with the improvements observed in conceptual mediation (ID4) and in the understanding of dissemination as a situated social practice. From this perspective, the present study not only corroborates existing research but also extends it by showing how the systematic integration of PBL and EL operates synergistically to generate meaningful learning in initial teacher education.
Furthermore, recent studies focused on PBL (e.g., Castillo-Salvatierra et al., 2025) support the idea that designing and developing authentic projects enhances communication and media–digital skills, a pattern replicated in this study’s indicators ID5, ID8, ID9, and ID10. Likewise, Beissembayeva et al. (2025) emphasize that critical thinking, digital literacy, and scientific communication constitute essential pillars of contemporary teacher education—core components that this proposal addresses in an integrated and coherent manner.
Finally, Guaya et al. (2025) show that incorporating social networks and digital environments into educational projects enhances students’ ability to select, produce, and disseminate scientific information responsibly, a tendency mirrored in the significant advances in media and digital literacy observed in this research.
Taken together, the convergence between empirical results and theoretical evidence reinforces the external validity of the proposed model. The integration of PBL and EL not only enhances academic performance and self-perceived competence but also transforms the relationship that future teachers establish with scientific knowledge, fostering an ethical, communicative, and technologically critical understanding of science dissemination in contemporary society.
4.3. Educational Implications of the Study
The findings of this study have significant implications for higher education, particularly in the field of initial teacher education. First, the results confirm that active methodologies such as PBL and Kolb’s Experiential Learning constitute effective pedagogical frameworks for the development of transversal competencies, specifically those related to scientific communication and dissemination. The progression from low initial levels to medium-high performance across the four dimensions of competence demonstrates that universities can go beyond the transmission of disciplinary knowledge to also foster the communicative, digital, and ethical skills required for responsible teaching practice.
Second, the implemented teaching approach illustrates how PBL fosters situated and meaningful learning by engaging future teachers in authentic problem-solving tasks and the creation of products for real audiences (families, students, and the wider educational community). This type of experience transforms traditional university instruction into a participatory learning environment, where knowledge is applied, communicated, and evaluated in context. Such dynamics enhance motivation, engagement, and commitment—key conditions for persistence and high-quality learning in teacher preparation.
Third, Kolb’s experiential model offers a structural framework for understanding the training process experienced by the participants: concrete experience (creation of dissemination products), reflective observation (analysis of difficulties and learning), abstract conceptualization (formulation of communication and ethical strategies), and active experimentation (adjustment and improvement of outputs and presentations). Embedding this cycle in university curriculum design strengthens autonomy, self-regulation, and transferability of learning, ensuring that teacher education is both deep and transformative.
Finally, the educational experience described provides higher education institutions with a replicable and adaptable pedagogical model for diverse curricular contexts. In an era marked by information overload and digital misinformation, the ability to communicate science with rigor, clarity, and social awareness emerges as an essential professional competence for contemporary educators. Integrating PBL and EL into university teaching not only optimizes knowledge construction and transfer but also promotes a critical, ethical, and socially engaged education, preparing future teachers to act as cultural mediators between science and society.
4.4. Limitations of the Study
Despite its methodological coherence and internal consistency, this study presents certain limitations that should be considered when interpreting and transferring the findings.
First, although the sequential explanatory mixed design (QUANTITATIVE→QUALITATIVE) enabled the progressive integration of quantitative and qualitative results, it did not include random assignment of participants or the use of control groups, given the natural and contextualized nature of the educational settings involved. This circumstance prevents the establishment of strict causal relationships between the teaching proposal and the observed changes; however, it does allow for educational and theoretical inferences based on the consistency of the patterns detected across both cohorts.
Second, while the sample size was adequate for the statistical and interpretive analyses (n = 36 and n = 43), it limits the generalizability of the findings to other university contexts or teacher education programs. Nevertheless, the diversity of the degrees included and the consistency of the effects observed in two distinct training contexts reinforce the transferability of the model and suggest its potential applicability in comparable educational environments.
Third, qualitative data were collected through focus groups conducted after the intervention. Although these provided rich and complementary insights, they did not allow for a longitudinal assessment of the evolution of participants’ perceptions throughout the process. Additionally, a degree of social desirability bias may have influenced responses, as students could have expressed more favorable opinions due to the academic nature of the project.
Finally, although all four dimensions of scientific communication competence showed significant improvement, the more complex indicators, particularly the identification of reliable sources (ID1) and the consideration of social impacts (ID16), remained at incipient levels. This finding suggests that such competencies require longer development periods, along with conceptual scaffolding and reflective support, to reach full consolidation.
Overall, these limitations do not undermine the validity of the study but rather delineate its interpretive scope and indicate directions for future research aimed at strengthening methodological integration, expanding sample diversity, and deepening longitudinal understanding of competency development in teacher education.
4.5. Areas for Future Work
Based on the results obtained, several lines of future research have been identified to consolidate and expand knowledge on scientific communication and dissemination training within the university setting.
First, it is proposed to replicate and extend this study to other degree programs and institutional contexts, incorporating larger samples and longitudinal designs that enable the observation of the evolution of scientific dissemination skills over time. This approach would facilitate a deeper understanding of how learning derived from PBL and EL becomes consolidated, as well as the sustainability of its effects on teaching practice in the medium and long term.
Second, further research should focus on the indicators that showed more moderate progress, particularly literacy in reliable sources (ID1) and the consideration of social impacts (ID16). Future studies could incorporate specific instructional strategies for critical thinking and media literacy, combined with the reflective use of emerging technologies, to reinforce these more complex and slow-developing components of competence.
Third, it is recommended to adapt and apply the proposed teaching model to continuing professional development for in-service teachers, in order to examine its relevance at different stages of professional growth. Such research would allow for an evaluation of the model’s potential to promote scientific updating, effective communication, and social responsibility among educators in real classroom contexts.
Moreover, future work should integrate technological tools for digital performance analysis and objective metrics of media production, allowing for richer methodological triangulation and more robust empirical evidence regarding the quality, creativity, and impact of the dissemination products created by students.
In summary, these research directions contribute to advancing a teacher education model that organically integrates scientific, communicative, digital, and ethical competencies, thereby consolidating university social responsibility as the articulating axis of higher education in the 21st century.
5. Conclusions
This study provides empirical and theoretical evidence supporting the integration of PBL and Kolb’s Experiential Learning as effective strategies for developing scientific dissemination skills in initial teacher education. Using a sequential explanatory mixed-methods approach (QUANTITATIVE→QUALITATIVE), the results reveal a significant and consistent alignment between quantitative and qualitative data, demonstrating not only measurable improvements in performance but also transformations in how students understand, communicate, and apply scientific knowledge.
The findings indicate that future teachers can become active mediators between science and society when their training experiences promote reflection, situated practice, and ethical responsibility. The articulation of quantitative and qualitative phases enabled an understanding of how the observed progress is grounded in authentic learning processes, where engagement with real problems, experimentation with media resources, and collaborative work act as catalysts for competency development.
From a methodological perspective, the study highlights the value of mixed-methods designs in educational research, not merely as a combination of techniques but as an epistemological approach that facilitates the construction of comprehensive and contextually grounded knowledge about teacher training processes. By integrating data, perceptions, and theoretical foundations, this research achieves a deep understanding of the pedagogical impact of the implemented model, thereby strengthening the credibility, internal validity, and transferability of the results.
Finally, the study confirms that the development of scientific communication and dissemination skills constitutes a cornerstone of university social responsibility, as it equips teachers to communicate with rigor, clarity, and critical awareness in increasingly complex educational and social contexts. The proposed approach—replicable and adaptable to other disciplinary and institutional settings—contributes to the promotion of a more ethical, inclusive, and socially engaged higher education, committed to fostering scientific literacy and citizenship in the 21st century.