Article

Supporting the Development of the Perceived and Actual Research Competence of Pre-Service Teachers

Department of Educational Studies, Faculty of Education, University of Ljubljana, SI-1000 Ljubljana, Slovenia
*
Author to whom correspondence should be addressed.
Educ. Sci. 2025, 15(6), 652; https://doi.org/10.3390/educsci15060652
Submission received: 10 March 2025 / Revised: 9 May 2025 / Accepted: 23 May 2025 / Published: 25 May 2025
(This article belongs to the Section Teacher Education)

Abstract

The aim of this study was to investigate the development of the perceived and actual research competence of pre-service teachers enrolled in a master’s programme using a pre-test–post-test research design. Research competence, which encompasses a range of knowledge, skills, and attitudes and includes both perceived (self-reported) and actual (objective) concepts, is critical for effective teaching, teacher professional development, and the integration of research-based practices into the classroom. The study examined how perceived research competence, as measured by a web questionnaire, changed over the course of the semester and how these changes corresponded to actual research competence as measured by a knowledge test. The participants were pre-service teachers from the University of Ljubljana who completed both the web questionnaire and the knowledge test at two time points during a research-based course. The results indicate a significant increase in perceived research competence, especially in terms of research knowledge and skills, as well as a significant improvement in actual research competence. While there was a statistically significant increase in positive attitudes towards research, this increase was moderate. This study also emphasises the alignment between perceived and actual research competence over time, suggesting that practical research experiences help students to develop more realistic self-assessments. By furthering the understanding of how research competence develops over time, this study provides actionable insights for the design of more effective research-based teaching.

1. Introduction

The importance of integrating research into pre-service teacher education is increasingly recognised, as there is growing evidence that research is a crucial factor in the teaching profession. This growing recognition has been highlighted by many authors (Matjašič & Vogrinc, 2025; Saqipi & Vogrinc, 2016; Štemberger, 2020; Molina-Torres, 2022) who have categorised the benefits of pre-service teachers’ research work as follows: (1) research enables pre-service teachers to critically and reflectively examine their own practice; (2) through research, pre-service teachers advance their professional development; (3) when pre-service teachers know how to conduct and use research, they can apply the results of educational research to improve their own practice. These benefits were also confirmed by Van Katwijk et al. (2023) and the authors of the BERA-RSA (2014) report, who found that research has a positive impact on learning outcomes and showed that pre-service teachers who actively engage in research are better able to apply evidence-based practices, critically evaluate their teaching methods and make informed decisions that benefit student learning. In addition, Visser-Wijnveen et al. (2012) found that linking research and teaching can enhance students’ awareness of authentic research processes and help them to adopt a critical academic disposition.
Despite the recognised benefits, pre-service teachers often struggle to read, interpret and conduct research (Matos et al., 2023; Toquero, 2021; Van Katwijk et al., 2023); thus, it is important that pre-service teachers become competent both in using research findings and in contributing their own findings to their field (Matjašič & Vogrinc, 2025; Shank & Brown, 2007). In other words, they should develop research competence.
In educational research, a distinction can be made between perceived and actual research competence. Perceived research competence refers to self-assessed confidence and ability in conducting research, which is usually measured using self-report instruments. In contrast, actual research competence comprises objectively assessed skills and knowledge, which are often measured using knowledge tests (Matjašič & Vogrinc, 2024). Moreover, recent studies show that the two are only weakly correlated and that students tend to overestimate their abilities (Bauer et al., 2024; Mamolo & Sugano, 2020). Despite this discrepancy, most empirical studies are cross-sectional and rely almost exclusively on self-reporting (Matos et al., 2023), which means that in the literature, little is known about how the perceived and actual research competence of pre-service teachers develop during the master’s programme.
Given this context, this study aims to fill the gap in the literature regarding the development of perceived and actual research competence in pre-service teachers over time. In particular, the study investigates how these competencies develop during the first semester of a master’s programme. To guide this investigation, the following research questions were formulated:
RQ1: How do pre-service teachers rate their perceived research competence at the beginning of the master’s programme and at the end of the first semester?
RQ2: How does the actual research competence of pre-service teachers develop over the course of the master’s programme, especially from the beginning of the master’s programme to the end of the first semester?
RQ3: Are there significant differences between the perceived and actual research competence of pre-service teachers at the beginning of the master’s programme and at the end of the first semester?

2. Literature Review

2.1. Research Competence

Matjašič and Vogrinc (2024) define research competence as a concept that encompasses a set of knowledge, skills and attitudes that are essential for successful research. They emphasise that research competence requires key skills such as critical thinking, independent learning and the organisational ability to plan and conduct research activities. Developing research competence is crucial for pre-service teachers, as it supports reflective practice, enhances pedagogical effectiveness, and contributes significantly to their professional development (Magnaye, 2022; Thiem et al., 2023). According to Magnaye (2022), pre-service teachers’ involvement in authentic research experiences can strengthen their overall pedagogical competence, particularly in relation to classroom management and assessment, by promoting deeper reflection on how research-based evidence improves classroom practice. Therefore, pre-service teachers need to undergo intensive training in both educational theory and research to improve their skills as future teachers (Magnaye, 2022; Thiem et al., 2023). Furthermore, Saqipi and Vogrinc (2016) found that engaging in practical, collaborative research tasks is crucial for pre-service teachers, as it enhances their ability to evaluate and improve their teaching practices, which enables them to act as researchers in their own classrooms in the future.

2.2. Research-Based Teaching

Research competence plays an important role in promoting research-based professional practice among educators. This has led to a shift from a research-led teaching environment, where students are an audience, to a research-based teaching environment, in which many institutions now integrate research tasks directly into the curriculum and encourage students to engage in research or align their projects with ongoing institutional initiatives (Healey & Jenkins, 2009; Thiem et al., 2023). In addition, research-based teaching has become a popular approach that allows pre-service teachers to apply their research skills to educational challenges while collaborating with colleagues and faculty (Matjašič & Vogrinc, 2025). This practical engagement mirrors the strategies explored by Prince et al. (2007), who identified three key methods for more effective integration of research into teaching: (1) presenting research and research findings in lectures, (2) involving students in research projects, and (3) extending the model of research scholarship. They conclude that this can benefit students, lecturers and universities. Thus, research is not only important for the development of academic disciplines and practice, but also for the development of research competence, which significantly influences the quality of professional development of pre-service teachers (Bayrak Özmutlu, 2022; Gussen et al., 2023; Magnaye, 2022).

2.3. Gaps in the Literature

Although both perceived and actual research competence offer valuable perspectives, a comprehensive understanding of how these competences develop requires the examination of both. In reviewing the literature (Matjašič & Vogrinc, 2024), we found that there are few studies examining the perceived and actual research competence of pre-service teachers and that most studies examine perceived research competence from the subjective perspective of the individual’s own experiences. This reliance on self-assessment poses methodological limitations, as it overlooks the need for objective measures that could provide more reliable insights into the effectiveness of research-based teaching. For example, a study by Bauer et al. (2024) investigated perceived and actual competence during long-term internships in pre-service teacher education programmes and found that the two were not strongly correlated, which is consistent with previous research showing that self-perception does not always reflect actual competence. Furthermore, Mamolo and Sugano (2020) found that students’ self-perceived competence was generally higher than their actual competence. More specifically, students rated their self-perceived competence as “satisfactory”, while their actual performance as measured by objective tests was only “fair”. In addition, some studies (Salmento et al., 2021; Saqipi & Vogrinc, 2016) do not clearly distinguish between perceived and actual research competence and only describe research competence in general, which can lead to inaccurate measurements and misleading conclusions. A further gap in the current literature is that, apart from the few studies that have measured both perceived and actual research competence, most have not used objective measures such as knowledge tests to assess actual research competence, which limits the validity of the results. Instead, they relied on qualitative assessments such as interviews or focus groups (Matjašič & Vogrinc, 2024). Matos et al. (2023), for example, conducted a systematic literature review of 68 studies on teaching and learning research methods and found that the majority of studies (64.7%) relied on qualitative data collection and analysis, while only 13.2% used quantitative designs. This heavy reliance on qualitative and self-reporting approaches raises the question of whether current assessments fully capture the multifaceted nature of research competence. Furthermore, we could not find any study comparing the perceived and actual research competence of pre-service teachers over time. Most studies in this area (e.g., Gussen et al., 2023; Magnaye, 2022) are cross-sectional and do not capture the development of research competence over time, or they measure research competence over time but capture only either perceived or actual research competence, not both. To our knowledge, the only study that measured both concepts of research competence over time is that of Böttcher-Oschmann et al. (2021), which measured actual research competence in terms of “using research” (i.e., reflection on and use of evidence to solve problems in teaching practice) and not in terms of “establishing research” or engagement in research (i.e., formulating research questions, formulating hypotheses, selecting an appropriate research method, interpreting results, etc.).
This leads to an incomplete understanding of how research competence develops over the course of pre-service teacher education and how theoretical learning translates into practical research competence. This discrepancy between perceived and actual research competence represents a significant gap in the current literature.
The aim of the present study was therefore to address this gap by investigating both the perceived and actual research competence of pre-service teachers using a pre-test–post-test research design. This approach was applied in a mandatory first-semester research-based course designed to support the development of research competence, ensuring that both perceived and actual research competence could be accurately measured. By administering a validated web questionnaire and a knowledge test at two time points, this study aimed to provide a more accurate picture of the development of research competence and thus inform how pre-service teacher education programmes or courses can be modified to more effectively support the development of both concepts of research competence.

3. Materials and Methods

3.1. Context of the Study

All participants in this study had previously completed an introductory research-based course during their undergraduate studies, usually taken in the second year of a four-year undergraduate degree programme. At the undergraduate level, they were introduced to the basics of educational research, including the role of theory, the formulation of basic research questions, and the preliminary design of simple empirical studies. They also learned basic statistical concepts relevant to education, such as data organisation, descriptive statistics (e.g., mean, median, and variance), correlation analysis (e.g., Pearson correlation) and basic non-parametric tests (e.g., Chi-square test for independence). In addition to these basics, students familiarised themselves with common methods of data collection (e.g., surveys, interviews, observations) and ethical considerations (e.g., participant consent, confidentiality). Although this gave them a broad overview of research methodology at the undergraduate level, the emphasis was on basic analytical techniques rather than in-depth application.
Building on this background, the compulsory research-based course offered during the master’s programme was designed to significantly enhance this knowledge and these skills by enabling students to apply them in concrete contexts. Students went beyond the introductory undergraduate framework and explored more advanced research questions and statistical techniques. They formulated focused research problems and hypotheses, created questionnaires tailored to their own research interests, and learnt about survey methods, gaining hands-on experience with online survey tools and sampling methods suitable for different educational settings. They also learnt to apply both non-parametric and parametric statistical tests (e.g., t-tests, one-way ANOVA) using the Statistical Package for the Social Sciences (SPSS) and to critically interpret the results of these analyses. In addition, they dealt with different types of educational research such as action research and evaluation research and deepened their knowledge of the quality criteria for qualitative and quantitative research, as well as the ethical principles that apply to these different methodological approaches. By the end of the semester, they had not only practised identifying relevant theoretical frameworks, planning empirical research designs and conducting data analyses, but also applied this knowledge and these skills to concrete examples, making their methodological and statistical understanding more practice-oriented.

3.2. Participants

The participants of this study were master’s students enrolled in pre-service teacher education at the Faculty of Education, University of Ljubljana, in the academic year 2023/2024.
A total of 110 students participated in the initial data collection by completing the questionnaire on perceived research competence at the beginning of the semester. However, when the answers from the questionnaire before and after the first semester were combined, only 73 students provided valid identification codes that allowed for comparison across both time points. When the results of the knowledge tests before and after the first semester were combined, 64 students gave a valid identification code. Finally, comparing the students’ perceived research competence with their actual research competence reduced the sample size even further. A total of 58 students provided valid identification codes in both the questionnaires and the knowledge tests, so a complete analysis of perceived and actual research competence was possible. The discrepancies in sample size were mainly due to two factors: first, some participants chose not to participate in the entire research process, and second, there were inconsistencies in the identification codes, as several participants provided incomplete or incorrect codes, so their responses could not be linked across time points.
In order to maintain confidentiality and still match students’ responses across the different stages of the study, students created a unique identification code based on their parents’ personal details and home address. This anonymised coding procedure was used consistently for both the surveys and the knowledge tests.

3.3. Operationalising Perceived Research Competence

Perceived research competence was assessed using the validated questionnaire developed by Matjašič and Vogrinc (2025), which measures three key dimensions of research competence: research knowledge, research skills and research attitudes. Each dimension was measured using a series of items on a Likert scale from 1 (strongly disagree) to 5 (strongly agree). The questionnaire was administered as a web-based survey at two points in time: once at the beginning of the master’s programme and again at the end of the first semester. Both surveys were conducted during the lecture period to ensure a consistent setting and reduce the likelihood of external influences. Students were also informed of the aims of the study, that their participation was voluntary and that their anonymity was guaranteed.
Given the complexity of research competence as a multifaceted construct, an exploratory factor analysis (EFA) with Varimax rotation and the minimum residual extraction method (minres) was conducted to analyse the latent structure of the questionnaire. Due to the ordinal nature of the Likert-scale items used in the questionnaire, we also used polychoric correlations to estimate the relationships between the variables (Holgado-Tello et al., 2010).
Before conducting the EFA, we assessed the suitability of our data using the Kaiser–Meyer–Olkin (KMO) measure of sampling adequacy and Bartlett’s test of sphericity for each dimension. The KMO measure for research knowledge was 0.71, above the recommended threshold of 0.60 (Field, 2013), indicating good sampling adequacy. In addition, Bartlett’s test of sphericity was significant (p < 0.001), confirming that the items were sufficiently correlated for factor analysis. For research skills, the KMO value was 0.76 and Bartlett’s test of sphericity showed a significant result (p < 0.001). The same applies to research attitudes (KMO = 0.76, Bartlett’s test of sphericity p < 0.001), which confirms the applicability of the EFA to our data.
In the next step, the EFA was used to determine whether the questionnaire items could be grouped into different factors corresponding to the theoretical dimensions of research competence, namely research knowledge, skills and attitudes. To determine the appropriate number of factors to be retained, we first conducted several analyses, such as the scree test, parallel analysis, eigenvalue inspection and Velicer’s minimum average partial (MAP) test. Based on these analyses, we decided to retain three factors in each dimension for further analysis. In addition, for each factor in each dimension, only items with standardised factor loadings of 0.4 or higher were retained, which is in line with established guidelines (Field, 2013; Stevens, 2002). Finally, the reliability of each factor was assessed using Cronbach’s α, resulting in satisfactory internal consistency (Cronbach’s α > 0.6) (see Appendix A).
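To make these steps concrete, the sketch below shows how such an analysis could be run in R with the psych package, which was among the packages used; the data object and item names are hypothetical, and this is only a minimal illustration of the procedure described above.

```r
library(psych)

# knowledge_items: data frame with the Likert-scale items of one dimension
# (e.g., perceived research knowledge); hypothetical object and column names
poly <- polychoric(knowledge_items)$rho               # polychoric correlation matrix

KMO(poly)                                             # Kaiser-Meyer-Olkin sampling adequacy
cortest.bartlett(poly, n = nrow(knowledge_items))     # Bartlett's test of sphericity

# Factor retention diagnostics: scree plot and parallel analysis
fa.parallel(knowledge_items, fm = "minres", fa = "fa", cor = "poly")

# EFA: minimum residual extraction, Varimax rotation, polychoric correlations
efa <- fa(knowledge_items, nfactors = 3, fm = "minres",
          rotate = "varimax", cor = "poly")
print(efa$loadings, cutoff = 0.4)                     # keep loadings of 0.4 or higher

# Internal consistency of one retained factor (hypothetical item names)
alpha(knowledge_items[, c("know_q1", "know_q2", "know_q3")])
```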
While a factor analysis was conducted to address RQ1, for the purposes of addressing RQ3, we created a composite variable by averaging the scores of all items of the questionnaire, which represents the overall perceived research competence. Having a single variable to measure perceived research competence allowed us to compare it with actual research competence. A similar strategy was used by Magnaye (2022); perceived research competence was measured as a composite variable for the purpose of regression analysis. From a practical perspective, comparing perceived and actual research competence using a single composite variable reduces the complexity of the analysis. Field (2013) explained that working with multiple dimensions in comparison can lead to statistical complications, such as inflated error rates or multicollinearity.

3.4. Operationalising Actual Research Competence

One method with which to measure actual research competence more objectively is the use of knowledge tests (Bauer et al., 2024; Böttcher-Oschmann et al., 2021). Therefore, we developed a knowledge test that was administered at the beginning of the master’s programme (to establish a baseline of pre-service teachers’ actual research competence) and again at the end of the first semester to measure changes over time. In designing this test, we explicitly included items that reflected the content that had been introduced in the bachelor’s programme (e.g., basic statistical techniques and basic methodological concepts) and further developed in the master’s programme during the research-based course. The aim was to measure not only whether students remember basic concepts, but also whether they can apply them in specific educational contexts.
The knowledge test was designed to measure actual research competence through a combination of multiple-choice questions, short-answer questions and research-related tasks. It was also specifically designed to measure “establishing research” (i.e., engagement in the research process itself). The test included the following components:
  • A total of 14 true/false statements about statistics and research methodology, which measured participants’ understanding of research principles such as reliability, sampling, statistical knowledge and ethical considerations.
  • Three brief research scenarios, for each of which participants had to select the most appropriate research method from a range of given options.
  • A task requiring participants to select the most appropriate research design for given research objectives, distinguishing between quantitative and qualitative approaches and between specific types of research such as case studies or action research.
  • A task to formulate a research question for a given topic.
  • A task to formulate a hypothesis based on the formulated research question.
  • A task to select the most appropriate research method and data collection technique to investigate the formulated research question and hypothesis.
Before administering the knowledge test, we first validated it. We sent the test to three experts in the field of research methodology and research competence, who assessed its content and face validity. This step was taken to ensure reliability and accuracy in the assessment of actual research competence. A pilot study was then conducted with a small group of respondents to identify and clarify unclear or ambiguous questions, adjust the difficulty of the items and refine the test structure. Finally, we conducted an item analysis to assess the difficulty of the questions and their discriminatory power (D > 0.30, indicating good discriminatory power) to ensure that the test could distinguish between different levels of research competence.
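As an illustration, a classical item analysis of the piloted test could be computed as in the sketch below, assuming dichotomously scored pilot responses and the common upper–lower 27% definition of the discrimination index; the exact procedure and the object names are not taken from the paper and are assumptions.

```r
# pilot_resp: data frame of scored pilot items (1 = correct, 0 = incorrect),
# one row per pilot participant; hypothetical object name
difficulty <- colMeans(pilot_resp)            # item difficulty = proportion correct

# Discrimination index D: proportion correct in the upper 27% of total scores
# minus the proportion correct in the lower 27%
total <- rowSums(pilot_resp)
upper <- pilot_resp[total >= quantile(total, 0.73), ]
lower <- pilot_resp[total <= quantile(total, 0.27), ]
D <- colMeans(upper) - colMeans(lower)

names(D)[D > 0.30]                            # items with good discriminatory power
```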
The test was scored quantitatively (maximum 22 points) so that we could calculate the total score for both time points. To answer RQ2, a composite variable was created by summing the scores across all test items, representing actual research competence. Similar methods for assessing actual competence, in which composite variables were constructed from individual test components, have already been used in educational research (Baartman & Ruijs, 2011).
Finally, as the knowledge test was administered at two different time points, we assessed its test–retest reliability by analysing the correlation between the composite variables, which confirmed the stability of the test over time.
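A minimal sketch of this scoring and test–retest check, assuming data frames of item scores at the two time points matched by identification code (hypothetical object names):

```r
# test_t1, test_t2: scored test items at the two time points, rows matched by code
actual_t1 <- rowSums(test_t1)                    # composite score (maximum 22 points)
actual_t2 <- rowSums(test_t2)

cor(actual_t1, actual_t2, use = "complete.obs")  # test-retest reliability over time
```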

3.5. Data Analysis

All analyses were performed in R using the packages lavaan, psych, EFA.dimensions and Rnest.
Before addressing RQ1, we tested the data for normality using the Kolmogorov–Smirnov test and examined Q–Q plots. The results indicated that the data for perceived research competence were not normally distributed (p < 0.001), necessitating the use of non-parametric tests. Specifically, we used the Wilcoxon signed-rank test to assess changes in perceived research competence across the three dimensions (research knowledge, research skills, and research attitudes) at the two time points. We also calculated the Hodges–Lehmann estimator to quantify the median increase for each dimension of perceived research competence (Rosenkranz, 2010) and the effect size (rank-biserial correlation) to understand the practical significance of the observed changes (van Doorn et al., 2020).
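A sketch of this analysis for one dimension is given below (hypothetical vector names): the Hodges–Lehmann estimator is taken from the confidence-interval output of the Wilcoxon test, and the matched-pairs rank-biserial correlation is derived from the signed-rank statistic.

```r
# pre, post: one dimension of perceived research competence at the two time points,
# matched by identification code (hypothetical object names)
wt <- wilcox.test(pre, post, paired = TRUE, conf.int = TRUE)
wt$estimate                    # Hodges-Lehmann estimate of the median paired difference

# Matched-pairs rank-biserial correlation: share of the total rank sum on positive
# versus negative differences (zeros are dropped, as in wilcox.test)
d <- (pre - post)[pre != post]
n <- length(d)
V <- unname(wt$statistic)               # sum of ranks of positive differences
r_rb <- 2 * V / (n * (n + 1) / 2) - 1   # sign reflects the direction of differencing
abs(r_rb)                               # magnitude reported as the effect size
```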
As for the actual research competence, the Kolmogorov–Smirnov test (p > 0.05) and the Q–Q plots showed a normal distribution, so RQ2 was addressed using a paired t-test to compare the results of the knowledge test at two time points. In addition, Cohen’s d was calculated to measure the practical significance of the results obtained.
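A corresponding sketch for the knowledge-test comparison (hypothetical object names); note that Cohen’s d for paired data can be standardised in different ways, and the version below divides by the average of the two time-point standard deviations, which matches an interpretation in standard-deviation units of the scores.

```r
# actual_t1, actual_t2: total knowledge-test scores at the two time points
t.test(actual_t2, actual_t1, paired = TRUE)      # paired t-test

# Cohen's d standardised by the average of the two standard deviations
# (an alternative convention divides by the SD of the paired differences)
d <- (mean(actual_t2) - mean(actual_t1)) / ((sd(actual_t1) + sd(actual_t2)) / 2)
```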
Finally, in order to directly compare perceived and actual research competence (RQ3) measured on different scales, we first normalised the composite scores using a min–max transformation that rescales the scores to a common range (0–1). This approach preserves the relative differences between the observations and enables meaningful comparisons across measures. After normalisation, a Wilcoxon signed-rank test (due to the non-normality of the composite scores, as shown by the Kolmogorov–Smirnov test and the Q–Q plots) was conducted, together with the Hodges–Lehmann estimator and the rank-biserial correlation as the effect size.
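A sketch of this comparison for one time point, assuming the matched sample of students and hypothetical object names (the perceived composite is the item mean, the actual composite the knowledge-test total):

```r
# Composite scores for the matched sample (hypothetical object names)
perceived <- rowMeans(questionnaire_items)   # mean of all questionnaire items
actual    <- rowSums(test_items)             # knowledge-test total (max 22 points)

# Min-max normalisation to a common 0-1 range
minmax <- function(x) (x - min(x)) / (max(x) - min(x))
perceived_n <- minmax(perceived)
actual_n    <- minmax(actual)

# Paired comparison of the two normalised measures
wilcox.test(perceived_n, actual_n, paired = TRUE, conf.int = TRUE)
```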
At this point, it should be noted that although the Wilcoxon signed-rank test is based on the ranks of the differences between paired observations, reporting both the median and the interquartile range (IQR) provides a clear and interpretable measure of central tendency and variability, especially given the non-normal distribution of the data. In addition, the inclusion of the Hodges–Lehmann estimator supports the interpretation as it provides a non-parametric estimate of the median difference between paired observations.

4. Results

4.1. Perceived Research Competence

4.1.1. Perceived Research Knowledge

For RQ1, the median of each dimension of perceived research competence was compared across the two time points.
The results presented in Table 1 provide an assessment of changes in students’ perceived research knowledge at two time points. The analysis focuses on three key dimensions of perceived research knowledge obtained from the EFA: knowledge of research approaches (familiarity with different research approaches such as quantitative and qualitative research as well as appropriate data analysis techniques), conceptual knowledge (the ability to theoretically justify research problems, formulate objectives, research questions, and hypotheses) and methodological knowledge (selection of appropriate research methods, understanding of sampling procedures and familiarity with data collection tools).
The results in Table 1 show statistically significant changes (p < 0.001) in all three dimensions of perceived research knowledge at the end of the first semester. Starting with the factor knowledge of research approaches, we can see that the median value increased from 3.0 at the beginning of the master’s programme to 4.0 at the end of the semester. Furthermore, the Hodges–Lehmann estimator of −1.30 indicates a median paired increase of about 1.3 points, meaning that a student who rated their knowledge of research approaches as 3 (neutral) at the beginning typically rated it around 4 (good) at the end of the first semester. This is also confirmed by the large effect size (r = 0.86), which shows the practical significance of the difference in students’ perceived knowledge of research approaches. In the area of conceptual knowledge, the median score also increased from 3.75 to 4.25. The Hodges–Lehmann estimator (−0.88 points) and the effect size (r = 0.81) also confirm the practical significance of the difference in students’ perceived conceptual knowledge. The same applies to perceived methodological knowledge, where the median value rose from 3.0 to 3.8. Here, too, the value of the Hodges–Lehmann estimator (−1.00) and the effect size (r = 0.84) show the practical significance of the difference in the students’ perceived methodological knowledge.

4.1.2. Perceived Research Skills

Table 2 shows the results of the three key dimensions of perceived research skills obtained from the EFA: statistical analysis and interpretation (ability to evaluate data collection instruments, use statistical software, write statistical interpretations, and draw data-based conclusions), research planning (ability to design a research plan and create data collection instruments), and the preparation and presentation of results (describing the data collection process, writing empirical research reports and summaries, and developing action plans for implementing changes in educational practice based on research findings).
Similarly to perceived research knowledge, we found not only statistically significant changes (p < 0.001) at the end of the first semester, but also practical significance in all three dimensions of perceived research skills. For the factor statistical analysis and interpretation, the median value increased from 3.2 to 4.0. Again, the Hodges–Lehmann estimator (−0.90) and the large effect size (r = 0.83) emphasise the significant increase in students’ perceived statistical analysis and interpretation skills. In the area of perceived research planning, the median value also increased from 3.8 to 4.0, with the Hodges–Lehmann estimator (−0.50) and effect size (r = 0.63) confirming the practical significance of the change. For the factor preparation and presentation of results, the median value increased from 2.8 to 3.8. The Hodges–Lehmann estimator (−1.10) and the effect size (r = 0.84) again confirm the practical significance of the change in students’ perceived skills in the preparation and presentation of research results at the end of the first semester.

4.1.3. Research Attitude

The final dimension of perceived research competence is research attitude, for which the EFA revealed the following dimensions: evaluation of research (assessing the perceived importance of integrating personal research experience into teaching and the belief that research can solve practical problems in educational settings), positive attitudes towards research (measuring the enjoyment of reading research reports, the perceived value of research knowledge for academic success and overall interest in educational research), and negative attitudes towards research (capturing feelings of stress and boredom associated with engaging in research).
Table 3. Changes in research attitudes from the beginning to the end of the first semester.
Factor | At the Beginning of the Master’s Programme, Median (IQR) | At the End of the First Semester, Median (IQR) | Wilcoxon Signed-Rank Test (z-Value) | Hodges–Lehmann Estimator | Effect Size (r)
Evaluation of research | 3.71 (0.71) | 3.71 (0.71) | 2.08 * | −0.14 | 0.24
Positive attitude towards research | 3.25 (1.00) | 3.50 (0.75) | 4.54 *** | −0.38 | 0.53
Negative attitude towards research | 2.50 (1.00) | 3.00 (1.50) | 1.08 | −0.25 | 0.13
*** p < 0.001; * p < 0.05.
The results in Table 3 show that there were changes in students’ perceived research attitudes at the end of the first semester, although not all were significant. For example, the median value for the evaluation of research factor remained unchanged from the beginning to the end of the semester, at 3.71. Despite the statistical significance (z = 2.08, p < 0.05), there was a small and possibly negligible increase in students’ evaluation of research (Hodges–Lehmann estimator = −0.14, r = 0.24). In contrast, the positive attitude towards research showed a more notable and statistically significant improvement (z = 4.54, p < 0.001), with the median value increasing from 3.25 to 3.50. The Hodges–Lehmann estimator (−0.38) and the effect size (r = 0.53) indicate a moderate change in positive attitudes towards research. Interestingly, the median value for negative attitudes towards research increased from 2.5 to 3.0, indicating an increase in students’ negative attitudes. However, this change was neither statistically significant (z = 1.08, p > 0.05) nor meaningful (Hodges–Lehmann estimator = −0.25, r = 0.13). This is supported by the IQR, which shows greater variability in negative attitudes towards research, suggesting that while some students developed stronger negative perceptions, others did not, resulting in a more diverse range of negative attitudes.

4.2. Actual Research Competence

In order to answer RQ2, the mean values of the two knowledge tests carried out at the beginning of the master’s programme and at the end of the first semester were compared (Table 4).
A statistically significant difference was found between the results at the beginning of the master’s programme and at the end of the first semester (t = −7.81, p < 0.001), which means that the students’ results in the knowledge test at the end of the first semester improved by an average of 2.70 points. The effect size measured with Cohen’s d was 0.98, which means that the mean score at the end of the first semester was almost one standard deviation higher than the mean score at the beginning of the master’s programme.

4.3. Comparison of Perceived and Actual Research Competence

To answer RQ3, the medians of two normalised variables were compared (Table 5).
Table 5 shows that at the beginning of the master’s programme, the median value for perceived research competence was 0.61, while actual research competence, as measured by the knowledge test, had a median value of 0.44. This shows that the students’ perceived research competence exceeded their actual research competence; in other words, they overestimated their abilities (z = 2.92, p < 0.01). Looking at the Hodges–Lehmann estimator (−0.17) and the effect size (r = 0.38), the results reflect moderate practical significance. By the end of the first semester, however, the situation had changed. The median value for perceived research competence decreased to 0.36, while the median value for actual research competence increased to 0.50. The Wilcoxon signed-rank test (z = 2.30, p < 0.05) confirmed a significant difference, with the Hodges–Lehmann estimator (−0.11) and effect size (r = 0.30) showing a moderate change in students’ perceptions.

5. Discussion

The present study was designed to address the gaps in the literature related to the development of research competence in pre-service teachers through a pre-test–post-test research design that utilised web surveys and knowledge tests to measure both perceived and actual research competence over time. The results of this study offer several original insights into the development of research competence among pre-service teachers and are presented below.
RQ1: How do pre-service teachers rate their perceived research competence at the beginning of the master’s programme and at the end of the first semester?
The results showed a significant increase in almost all measured dimensions of perceived research competence. We found that pre-service teachers rated their perceived research competence significantly higher at the end of the first semester than at the beginning of the master’s programme. This suggests that their engagement with research tasks and assignments during the research-based course (and the master’s programme itself) contributed to a higher rating of their perceived research competence. In terms of research knowledge, the greatest changes were observed in the understanding of different research approaches. This improvement can be attributed to the comprehensive coverage of both qualitative and quantitative research methods in the course. For example, students were involved in designing, administering and analysing questionnaires, and applying their knowledge of research methods and statistical techniques. They developed research plans outlining their study objectives, sampling strategies and data collection methods, demonstrating their ability to translate theoretical knowledge into practical applications. We believe that these activities led to an improvement in their perceived research knowledge. In addition, pre-service teachers also rated their perceived research skills higher at the end of the first semester. In particular, the ability to conduct statistical analyses and interpret data showed one of the largest increases in perceived competence. This could be due to the research activities pre-service teachers were exposed to during the course (e.g., using research findings to interpret and present data using the Statistical Package for the Social Sciences (SPSS)). Similarly, students reported greater confidence in research planning and in the preparation and presentation of research findings, which could be attributed to their active engagement in research-based activities and independent research assignments. In terms of attitudes towards research, the results were mixed. While positive attitudes towards research improved moderately and statistically significantly, changes in other aspects of students’ attitudes towards research were minimal, resulting in a modest overall change. This contrasts with the findings of previous studies by Sizemore and Lewandowski (2009), who reported that while students’ knowledge of research methods improved significantly after completing a research methods course, their attitudes towards research did not, and in some cases actually decreased, particularly in relation to the perceived benefits of research. Similarly, Wessels et al. (2018) emphasised that the demanding nature of research requires additional affective–motivational dispositions beyond cognitive competence and that students often face affective–motivational challenges during the research process. Our findings suggest that practical research experiences can foster positive research attitudes even if they do not affect other dimensions of research attitudes (e.g., evaluation of research). This suggests that while the cognitive benefits are evident, changing students’ attitudes towards research remains complex and may require pedagogical approaches that more directly address affective–motivational factors.
Overall, the results of our study are consistent with the findings of Gussen et al. (2023), who found that pre-service teachers experienced a significant increase in their self-assessed research competence in the cognitive domain at the end of the semester. Specifically, students felt that their perceived methodological skills and ability to reflect on research had improved, similarly to the observed increase in perceived research knowledge and skills in our study. In addition, they also found a decrease in affective–motivational aspects such as interest and enjoyment in research, which contrasts with our findings regarding students’ research attitudes. Furthermore, Van der Linden et al. (2012) found that students perceive conducting research and applying research findings to be more important (cognitive aspect) than appealing (emotional aspect); that they are more likely to express how important research is than to actually conduct it (behavioural aspect); and that they feel more competent in conducting research (self-efficacy) than they are enthusiastic about carrying it out or applying it. They suggest various approaches that can positively influence students’ attitudes towards research, such as using authentic research examples (showing students how research can directly impact teaching practice, and using research tasks that are directly related to the practical challenges faced by teachers) and collaborative learning.
RQ2: How does the actual research competence of pre-service teachers develop over the course of the master’s programme, especially from the beginning of the master’s programme to the end of the first semester?
Our results show a significant improvement in knowledge test scores, both statistically (p < 0.001) and practically (d = 0.98). In addition, students scored an average of 60% at the beginning of the semester, while they scored 72% at the end of the first semester. The improvement in knowledge test scores suggests that the research-based course in which the pre-service teachers participated successfully supported and increased their actual research competence. The research-based course structure gave students the opportunity to explore research methods and statistical techniques. This allowed them to gradually make a connection between theory and practical application as they learnt statistical methods for educational research—such as parametric and non-parametric tests—using SPSS. We believe that the course structure promoted critical thinking and a deeper understanding of research designs and methods, especially when critically analysing previously published master’s theses and scientific articles. Our findings are consistent with those of Böttcher-Oschmann et al. (2021), who showed that engaging students in authentic research activities where they go through the entire research process, apply statistical methods and critically analyse research designs leads to measurable gains in actual research competence. Furthermore, Magnaye (2022) found a significant relationship between pedagogical competence (classroom management and assessment) and research competence among pre-service teachers. According to him, practical skills and the ability to apply knowledge in authentic contexts are crucial for the development of research competence. Furthermore, Albareda-Tiana et al. (2018) found that pre-service teachers can develop and demonstrate actual research competence when they have the opportunity to engage in meaningful research activities (e.g., conducting research projects). Our findings are also supported by the findings of Aspfors and Eklund (2017), who found that when teacher education includes explicit research activities (e.g., research seminars and workshops in which students critically analyse existing studies and discuss methodological approaches), it strengthens pre-service teachers’ ability to effectively apply research methods in the educational context.
RQ3: Are there significant differences between the perceived and actual research competence of pre-service teachers at the beginning of the master’s programme and at the end of the first semester?
We found that the pre-service teachers tended to overestimate their research competence at the beginning of the master’s programme. More specifically, they rated their research competence higher than their performance on the knowledge test indicated. This could be due to less practical experience or a limited understanding of how complex conducting educational research can be. However, as the semester progressed and they became more actively involved in the research-based course, as well as other master’s courses, their actual research competence improved, reflecting the skills and knowledge they had gained through the research-based teaching. This may suggest that as students became more research competent, they also developed a more realistic self-assessment of their abilities. This aligns with the findings of Saqipi and Vogrinc (2016), who found that pre-service teachers often focus more on the processes of research than on understanding the underlying objectives. Without adequate practical experience, students may have only a superficial idea of research activities, leading to an exaggerated self-perception. However, when they engage in authentic research experiences, their awareness of the challenges increases, leading to a re-evaluation of their research competence. In addition, Böttcher-Oschmann et al. (2021) observed that students initially overestimated their research competence, that their actual research competence improved the more they conducted research as part of their research-based projects and that their perceived research competence more closely matched their actual research competence. Böttcher-Oschmann et al. (2021) also discussed the phenomenon of response shift, where individuals adjust their internal standards and understanding when they have new experiences. This shift can lead to changes in self-evaluation that do not necessarily reflect actual changes in competence, but rather a deeper awareness of what competence means. In our study, the decrease in perceived competence despite an increase in actual research competence indicates such a response shift. As students became more immersed in research activities, they were able to develop a more nuanced understanding of the skills required, leading to a re-evaluation of their own skills.

6. Conclusions

This study makes several important contributions. One of the most important is the research design, which allowed us to measure and assess both perceived and actual research competence over time, thus providing a more comprehensive picture of the development of pre-service teachers’ research competence. First, we were able to measure and quantify the improvement in perceived and actual research competence over the course of the master’s programme. Our results show that pre-service teachers’ self-perceived competence improved significantly, and more importantly, that they improved their performance on the knowledge test by almost one standard deviation after participating in the research-based course. This shows that research-based teaching can significantly support the development of research competence. In addition, the study made it possible to uncover discrepancies between perceived and actual research competence, track how these constructs evolve with experience, and identify the phenomenon of response shift, in which students recalibrate their self-assessment standards as they gain practical experience. This finding suggests that practical experience causes pre-service teachers to better align their self-perceptions with their actual performance. By tracking these constructs over time, our study demonstrates that research-based teaching fosters both the development of research competence and a more accurate self-assessment process. The integration of research-based teaching into the course structure is another strength, as it encouraged students to engage in research activities such as statistical analysis using SPSS, conducting literature reviews, working in teams, critically analysing published research, designing research projects and presenting their findings. This practical approach emphasises the importance of providing pre-service teachers with hands-on opportunities to engage with research, bridging the gap between academic knowledge and its practical application in an educational context. Furthermore, the use of validated measurement tools, including web-based questionnaires and knowledge tests, ensured the reliability and validity of the data collected, thus strengthening the methodological rigour of the study. Because the study captures both perceived and actual research competence, it provides a nuanced understanding of how these skills develop together, which has important implications for the design of pre-service teacher education curricula. For example, integrating objective assessments in conjunction with self-reflective practices can help pre-service teachers gain a more accurate understanding of their skills while promoting both competence and confidence in research. Finally, the study offers practical recommendations for the design of research-based teaching that prioritise the integration of research activities to ensure that students are both confident in their research skills and able to apply them effectively in educational contexts. These findings are not only valuable for improving pre-service teacher education, but also contribute to a wider discussion about promoting research competence in education professionals.

Study Limitations and Future Work

As with any research endeavour, some possible limitations of the present study must be taken into account. First, the sample size of the study decreased considerably from 110 to 58 participants when comparing perceived and actual research competence. To determine whether attrition from the initial 110 participants to the final 58 participants resulted in systematic bias, we compared available academic and study-related characteristics such as gender, age, field of study, and prior research experience between the initial and final samples using Chi-square tests. These analyses revealed no significant differences (p > 0.05), suggesting that the final sample was not systematically biassed by dropouts. Nevertheless, the reduction in sample size may have affected the statistical power and limited the generalisability of the results. Future research should adopt strategies to improve participant retention (such as predefined identification codes) and consider employing statistical techniques, such as multiple imputation or sensitivity analysis, to address missing data. Second, the study only examined perceived and actual research competence at two points in time—at the beginning and end of the first semester—which limits the ability to assess long-term changes in competence development. In future studies, a more comprehensive longitudinal design with additional measurement time points and, ideally, a control group could help to clarify the causal effects of research-based teaching on the development of research competence. Third, this study focussed on only one faculty. Although the findings are generalisable to this faculty and to other faculties offering pre-service teacher education programmes (at least in Slovenia and other countries with a similar system of teacher education), future studies should extend this line of research to multiple institutions to validate and generalise the findings. Fourth, external variables, such as motivation to study, which could influence both perceived and actual research competence, were not controlled for. Fifth, future studies should incorporate qualitative methods (e.g., interviews) to capture the cognitive and affective processes behind students’ evolving self-evaluations to further improve our understanding of response bias. These qualitative findings could demonstrate how exposure to authentic research experiences prompts students to critically reflect on and adjust their self-assessment standards. Finally, while the measurement instruments showed acceptable internal consistency and supported the factor structures through the EFA, the knowledge test primarily captured “establishing research”. Future studies should consider incorporating additional performance-based assessments to capture the multifaceted nature of research competence more comprehensively.
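As an illustration of the attrition check mentioned above, the sketch below compares a categorical background variable between retained and dropped-out participants with a Chi-square test; the object and variable names are hypothetical, and the grouping into retained versus dropped-out participants is one possible way to operationalise the comparison.

```r
# background: data frame for all 110 initial participants with a logical column
# `retained` marking the students in the final matched sample and categorical
# background variables (hypothetical object and column names)
chisq.test(table(background$retained, background$gender))

# For sparse cells, Fisher's exact test can be used instead
fisher.test(table(background$retained, background$field_of_study))
```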

Author Contributions

Conceptualization, M.M. and J.V.; Methodology, M.M. and J.V.; Validation, M.M. and J.V.; Formal analysis, M.M.; Investigation, M.M.; Data curation, M.M.; Writing—original draft preparation, M.M.; Writing—review and editing, M.M. and J.V.; Supervision, J.V.; Project administration, J.V.; Funding acquisition, J.V. All authors have read and agreed to the published version of the manuscript.

Funding

The authors acknowledge the financial support of the Slovenian Research Agency under the research core funding Strategies for Education for Sustainable Development Applying Innovative Student-Centred Educational Approaches (ID: P5-0451).

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and was reviewed and approved by the Ethics Commission of the Faculty of Education of the University of Ljubljana. Approval code 14/2023, on 29 August 2023.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The derived data and the R code supporting the results of this study are available on request from the corresponding author. The data have been anonymised but are not publicly available due to privacy concerns.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Table A1. Standardised factor loadings and Cronbach’s α for the factor perceived research knowledge.
Factor | Item | Standardised Factor Loadings | Cronbach’s α
Knowledge of research approaches | I am familiar with the characteristics of the quantitative research approach. | 0.95 | 0.92
 | I am familiar with the characteristics of the qualitative research approach. | 0.94 |
 | I am familiar with the characteristics of action research. | 0.91 |
 | I am familiar with the characteristics of quantitative data analysis. | 0.62 |
 | I am familiar with the characteristics of qualitative data analysis. | 0.89 |
Conceptual knowledge | I can theoretically justify the research problem of my study. | 0.56 | 0.77
 | I can formulate the objectives of the research. | 0.73 |
 | I can formulate research questions. | 0.90 |
 | I can formulate hypotheses. | 0.81 |
Methodological knowledge | I can select the most appropriate research methodology for my research work. | 0.47 | 0.66
 | I am familiar with different sampling methods and selecting individuals appropriate for my research. | 0.62 |
 | I am familiar with the characteristics of various data collection instruments (e.g., questionnaire, knowledge test, interview question list). | 0.70 |
 | I am familiar with different methods of assessing the quality of data collection instruments (e.g., validity and reliability of the instrument). | 0.40 |
 | I am familiar with various statistical techniques used in educational research (e.g., measures of central tendency, measures of dispersion, correlation coefficients). | 0.46 |
Table A2. Standardised factor loadings and Cronbach’s α for the factor perceived research skills.
Factor | Item | Standardised Factor Loadings | Cronbach’s α
Statistical analysis and interpretation | I can assess the quality of a designed data collection instrument (its validity and reliability). | 0.92 | 0.78
 | I can use a programme for statistical data analysis (e.g., SPSS). | 0.49 |
 | I can present the results and findings of my research to a broader audience (e.g., to my classmates, at a conference). | 0.62 |
 | I can write a statistical interpretation of the applied statistical techniques. | 0.62 |
 | I can draw conclusions from the collected data for my research. | 0.58 |
Research planning | I can search for literature needed for the theoretical framework of my research. | 0.63 | 0.83
 | I can cite the literature I have read. | 0.77 |
 | I can summarise the main points of the literature I have read. | 0.89 |
 | I can design a research plan. | 0.71 |
 | I can create various data collection instruments (e.g., a questionnaire, knowledge test, interview questions). | 0.61 |
Preparation and presentation of results | I can describe the data collection process. | 0.40 | 0.70
 | I can write a report on the conducted empirical research. | 0.91 |
 | I can write a summary of the empirical research report (e.g., seminar paper). | 0.69 |
 | I can develop an (action) plan for implementing changes in educational practice based on the findings of the empirical research. | 0.40 |
 | I can prepare research conclusions based on the collected data. | 0.56 |
Table A3. Standardised factor loadings and Cronbach’s α for the factor research attitude.

Factor | Item | Standardised Factor Loading | Cronbach’s α
Evaluation of research | I find it important that a teacher uses examples from their own research during lectures. | 0.72 | 0.81
 | It is important for effective teaching at the university level that teachers and assistants engage in research. | 0.79 |
 | Teachers’ research in schools helps solve problems in everyday practice. | 0.79 |
 | When I work in a school, I will engage in research. | 0.61 |
 | Conducting research is a good way to improve a teacher’s teaching. | 0.49 |
 | Conducting research is a good way to increase the reputation of the teaching profession. | 0.40 |
 | It is important to me that my teachers and assistants conduct research. | 0.49 |
Positive attitude toward research | I enjoy reading research reports. | 0.76 | 0.82
 | Knowledge of research in the field of education helps me succeed in my studies. | 0.76 |
 | Knowledge of research in the field of education helps me understand theoretical concepts we have covered in various subjects. | 0.80 |
 | I am interested in research in the field of education. | 0.73 |
Negative attitude toward research | Research causes me stress. | 0.95 | 0.67
 | Research is boring. | 0.61 |
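
The reliability and factor-analytic statistics reported in Tables A1–A3 (standardised loadings for ordinal items and Cronbach’s α per subscale) can be obtained with standard R packages. The authors’ own R code is available on request (see the Data Availability Statement); the sketch below is only a minimal illustration, and the data frame `responses` and the item names `k1`–`k5` are hypothetical.

```r
## Minimal sketch (not the authors' code): Cronbach's alpha and standardised
## factor loadings for one hypothetical subscale, "Knowledge of research
## approaches". Assumes item responses are stored in a data frame `responses`
## with illustrative column names k1-k5.

library(psych)    # alpha()
library(lavaan)   # cfa(), standardizedSolution()

# Cronbach's alpha for the five knowledge items
knowledge_items <- responses[, c("k1", "k2", "k3", "k4", "k5")]
psych::alpha(knowledge_items)

# One-factor CFA treating the items as ordinal (polychoric correlations),
# in line with Holgado-Tello et al. (2010) in the reference list
model <- "knowledge =~ k1 + k2 + k3 + k4 + k5"
fit <- lavaan::cfa(model, data = responses,
                   ordered = c("k1", "k2", "k3", "k4", "k5"))

# Standardised factor loadings, analogous to those reported in Tables A1-A3
lavaan::standardizedSolution(fit)
```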

References

1. Albareda-Tiana, S., Vidal-Raméntol, S., Pujol-Valls, M., & Fernández-Morilla, M. (2018). Holistic approaches to develop sustainability and research competencies in pre-service teacher training. Sustainability, 10(10), 3698.
2. Aspfors, J., & Eklund, G. (2017). Explicit and implicit perspectives on research-based teacher education: Newly qualified teachers’ experiences in Finland. Journal of Education for Teaching, 43(4), 400–413.
3. Baartman, L. K. J., & Ruijs, L. (2011). Comparing students’ perceived and actual competence in higher vocational education. Assessment & Evaluation in Higher Education, 36(4), 385–398.
4. Bauer, M., Traub, S., & Kunina-Habenicht, O. (2024). The growth of knowledge and self-perceived competence during long-term internships: Comparing preparatory versus accompanying seminars in teacher education programs. Frontiers in Education, 9, 1194982.
5. Bayrak Özmutlu, E. (2022). Views of pre-service teachers on the research-based teacher education approach. Tuning Journal for Higher Education, 10(1), 113–153.
6. BERA-RSA. (2014). Research and the teaching profession: Building the capacity for a self-improving education system. Final report of the BERA-RSA inquiry into the role of research in teacher education. Available online: https://www.bera.ac.uk/wp-content/uploads/2013/12/BERA-RSA-Research-Teaching-Profession-FULL-REPORT-for-web.pdf (accessed on 4 April 2024).
7. Böttcher-Oschmann, F., Groß Ophoff, J., & Thiel, F. (2021). Preparing teacher training students for evidence-based practice: Promoting students’ research competencies in research-learning projects. Frontiers in Education, 6, 642107.
8. Field, A. (2013). Discovering statistics using IBM SPSS statistics (4th ed.). SAGE Publications.
9. Gussen, L., Schumacher, F., Großmann, N., González, L. F., Schlüter, K., & Großschedl, J. (2023). Supporting pre-service teachers in developing research competence. Frontiers in Education, 8, 1197938.
10. Healey, M., & Jenkins, A. (2009). Developing students as researchers. In S. K. Haslett, & H. Rowlands (Eds.), Linking research and teaching in higher education: Proceedings of the Newport NEXUS conference (Special Publication No. 1). Centre for Excellence in Learning and Teaching, University of Wales.
11. Holgado-Tello, F. C., Chacón-Moscoso, S., Barbero-García, I., & Vila-Abad, E. (2010). Polychoric versus Pearson correlations in exploratory and confirmatory factor analysis of ordinal variables. Quality and Quantity, 44(1), 153–166.
12. Magnaye, A. L. (2022). Pedagogical and research competence of the pre-service teachers. American Journal of Multidisciplinary Research and Innovation, 1(3), 81–88.
13. Mamolo, L. A., & Sugano, S. G. C. (2020). Self-perceived and actual competencies of senior high school students in General Mathematics. Cogent Education, 7(1), 1779505.
14. Matjašič, M., & Vogrinc, J. (2024). Research competence of pre-service teachers: A systematic literature review. European Journal of Educational Research, 13(2), 877–894.
15. Matjašič, M., & Vogrinc, J. (2025). Perceived research competence among master’s students in pre-service teacher education programmes. Center for Educational Policy Studies Journal.
16. Matos, J. F., Piedade, J., Freitas, A., Pedro, N., Dorotea, N., Pedro, A., & Galego, C. (2023). Teaching and learning research methodologies in education: A systematic literature review. Education Sciences, 13(2), 173.
17. Molina-Torres, M.-P. (2022). Project-based learning for teacher training in primary education. Education Sciences, 12(10), 647.
18. Prince, M. J., Felder, R. M., & Brent, R. (2007). Does faculty research improve undergraduate teaching? An analysis of existing and potential synergies. Journal of Engineering Education, 96(4), 283–294.
19. Rosenkranz, G. K. (2010). A note on the Hodges–Lehmann estimator. Pharmaceutical Statistics, 9(2), 162–167.
20. Salmento, H., Murtonen, M., & Kiley, M. (2021). Understanding teacher education students’ research competence through their conceptions of theory. Frontiers in Education, 6, 763803.
21. Saqipi, B., & Vogrinc, J. (2016). Developing research competence in pre-service teacher education. Pedagoška Obzorja, 31(2), 101–117.
22. Shank, G., & Brown, L. (2007). Exploring educational research literacy. Routledge Taylor & Francis Group.
23. Sizemore, O. J., & Lewandowski, G. W. (2009). Learning might not equal liking: Research methods course changes knowledge but not attitudes. Teaching of Psychology, 36(2), 90–95.
24. Stevens, J. (2002). Applied multivariate statistics for the social sciences (4th ed.). Lawrence Erlbaum Associates.
25. Štemberger, T. (2020). Educational research within the curricula of initial teacher education: The case of Slovenia. Center for Educational Policy Studies Journal, 10(3), 31–51.
26. Thiem, J., Preetz, R., & Haberstroh, S. (2023). How research-based learning affects students’ self-rated research competences: Evidence from a longitudinal study across disciplines. Studies in Higher Education, 48(7), 1039–1051.
27. Toquero, C. M. D. (2021). “Real-world:” Preservice teachers’ research competence and research difficulties in action research. Journal of Applied Research in Higher Education, 13(1), 126–148.
28. Van der Linden, W., Bakx, A., Ros, A., Beijaard, D., & Vermeulen, M. (2012). Student teachers’ development of positive attitude towards research and research knowledge and skills. European Journal of Teacher Education, 35(4), 401–419.
29. van Doorn, J., Ly, A., Marsman, M., & Wagenmakers, E. J. (2020). Bayesian rank-based hypothesis testing for the rank sum test, the signed rank test, and Spearman’s ρ. Journal of Applied Statistics, 47(16), 2984–3006.
30. Van Katwijk, L., Jensen, E., & Van Veen, K. (2023). Pre-service teacher research: A way to future-proof teachers? European Journal of Teacher Education, 46(3), 435–455.
31. Visser-Wijnveen, G. J., van Driel, J. H., van der Rijst, R. M., Visser, A., & Verloop, N. (2012). Relating academics’ ways of integrating research and teaching to their students’ perceptions. Studies in Higher Education, 37(2), 219–234.
32. Wessels, I., Rueß, J., Jenßen, L., Gess, C., & Deicke, W. (2018). Beyond cognition: Experts’ views on affective-motivational research dispositions in the social sciences. Frontiers in Psychology, 9, 1300.
Table 1. Changes in perceived research knowledge from the beginning to the end of the first semester.

Factor | At the Beginning of the Master’s Programme, Median (IQR) | At the End of the First Semester, Median (IQR) | Wilcoxon Signed-Rank Test (z-Value) | Hodges–Lehmann Estimator | Effect Size (r)
Knowledge of research approaches | 3.00 (0.80) | 4.00 (0.80) | 7.30 *** | −1.30 | 0.86
Conceptual knowledge | 3.75 (0.50) | 4.25 (0.75) | 6.79 *** | −0.88 | 0.81
Methodological knowledge | 3.00 (0.40) | 3.80 (0.60) | 7.11 *** | −1.00 | 0.84
*** p < 0.001.
Table 2. Changes in perceived research skills from the beginning to the end of the first semester.

Factor | At the Beginning of the Master’s Programme, Median (IQR) | At the End of the First Semester, Median (IQR) | Wilcoxon Signed-Rank Test (z-Value) | Hodges–Lehmann Estimator | Effect Size (r)
Statistical analysis and interpretation | 3.20 (0.60) | 4.00 (0.80) | 7.11 *** | −0.90 | 0.83
Research planning | 3.80 (0.60) | 4.00 (0.80) | 5.36 *** | −0.50 | 0.63
Preparation and presentation of results | 2.80 (0.80) | 3.80 (0.60) | 7.20 *** | −1.10 | 0.84
*** p < 0.001.
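
Tables 1, 2 and 5 report Wilcoxon signed-rank z-values, Hodges–Lehmann estimators, and effect sizes r. The authors’ analysis code is available on request; the fragment below is a minimal sketch of how such statistics can be obtained in base R, with `pre` and `post` as hypothetical vectors of paired factor scores. The z approximation from the p-value follows the convention described by Field (2013) and ignores tie corrections, so results may differ slightly from the reported values.

```r
## Minimal sketch (not the authors' code): paired Wilcoxon signed-rank test,
## Hodges-Lehmann estimator, and effect size r for two hypothetical vectors
## `pre` and `post` of paired factor scores.

res <- wilcox.test(pre, post, paired = TRUE, conf.int = TRUE)

# Hodges-Lehmann estimator: the pseudomedian of the paired differences
hl <- unname(res$estimate)

# Approximate z from the two-sided p-value (cf. Field, 2013) and convert
# it to the effect size r; conventions for the n used in r vary
n <- length(pre)                 # number of pairs
z <- abs(qnorm(res$p.value / 2))
r <- z / sqrt(n)

c(HL = hl, z = z, r = r)
```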
Table 4. Changes in actual research competence from the beginning to the end of the first semester.

Pair | Mean | SD | t-Value | Effect Size (d)
At the beginning of the master’s programme | 13.20 | 2.15 | −7.81 *** | 0.98
At the end of the first semester | 15.91 | 2.08 | |
*** p < 0.001.
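
Table 4 reports a paired-samples t-test with Cohen’s d. A minimal R sketch of such a comparison is given below; `pre` and `post` are hypothetical vectors of knowledge-test scores for the same students at the two time points, and d is computed here from the paired differences, which may differ slightly from the variant used by the authors.

```r
## Minimal sketch (not the authors' code): paired-samples t-test and Cohen's d
## for hypothetical knowledge-test scores `pre` and `post`.

t.test(post, pre, paired = TRUE)   # paired t-test (t-value, df, p-value)

# Cohen's d computed from the paired differences (other variants exist)
diffs <- post - pre
d <- mean(diffs) / sd(diffs)
d
```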
Table 5. Comparison of perceived and actual research competence at the beginning of the master’s programme and at the end of the first semester (normalised data).

Time Point | Perceived Research Competence, Median (IQR) | Actual Research Competence, Median (IQR) | Wilcoxon Signed-Rank Test (z-Value) | Hodges–Lehmann Estimator | Effect Size (r)
At the beginning of the master’s programme | 0.61 (0.34) | 0.44 (0.20) | 2.92 ** | −0.17 | 0.38
At the end of the first semester | 0.36 (0.35) | 0.50 (0.28) | 2.30 * | −0.11 | 0.30
** p < 0.01; * p < 0.05.
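
Table 5 compares perceived and actual research competence on normalised scales. The exact normalisation procedure is not restated in this section, so the min–max scaling below is only an illustrative assumption of how the two measures could be placed on a common 0–1 scale before the paired comparison; `perceived_scores` and `actual_scores` are hypothetical vectors.

```r
## Illustrative assumption only: min-max normalisation as one plausible way to
## place perceived (Likert-based) and actual (knowledge-test) scores on a
## common 0-1 scale. The authors' own normalisation procedure may differ.

min_max <- function(x) {
  (x - min(x, na.rm = TRUE)) / (max(x, na.rm = TRUE) - min(x, na.rm = TRUE))
}

perceived_norm <- min_max(perceived_scores)
actual_norm    <- min_max(actual_scores)

# Paired comparison of the two normalised measures (cf. Table 5)
wilcox.test(perceived_norm, actual_norm, paired = TRUE, conf.int = TRUE)
```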
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
