Article

Validation of the University Student Engagement Questionnaire in the Field of Education

by
Sara Cebrián Cifuentes
* and
Empar Guerrero Valverde
Faculty of Teaching and Educational Sciences, Universidad Católica de Valencia, c/Sagrado Corazón, 5, Godella, 46110 Valencia, Spain
*
Author to whom correspondence should be addressed.
Educ. Sci. 2024, 14(10), 1047; https://doi.org/10.3390/educsci14101047
Submission received: 5 July 2024 / Revised: 18 September 2024 / Accepted: 24 September 2024 / Published: 25 September 2024
(This article belongs to the Section Higher Education)

Abstract

In this study, the Student Engagement Questionnaire (SEQ) developed by Kember and Leung (2009) was validated using a Spanish sample. This instrument is intended to provide a comprehensive evaluation of teaching and learning processes within university contexts, offering feedback to educators and institutions to support the improvement of these processes. To achieve this, the SEQ was administered at the start of the first quarter to a sample of 561 university students in the field of education, drawn from two universities in Valencia: the Universidad de Valencia (32%), a public institution, and the Universidad Católica de Valencia (68%), a private one. The questionnaire is designed to assess both students’ competencies and teachers’ abilities to foster a conducive learning environment. Across its various versions, the SEQ has demonstrated considerable stability in its dimensions and the relationships between the variables it assesses. Confirmatory factor analysis has validated the proposed structure in the Spanish population. As a result, the SEQ serves as a suitable instrument for diagnosing and evaluating the development of university students’ competencies, as well as teachers’ capacity to structure teaching and assessment in a way that promotes a constructive and enriching learning environment. Furthermore, it assists in identifying areas for improvement in the methodological strategies employed, enabling educators to refine their pedagogical approaches to maximise the impact on student learning and development.

1. Introduction

Over the past few decades, society has been undergoing continuous and constant changes, driven in part by information and communication technologies (ICT) and the phenomenon of globalisation [1]. These changes have introduced new challenges and opportunities for progress and improvement across various social domains [2]. On the one hand, these transformations influence supply and demand within the labour market, requiring professionals equipped with the necessary skills to address both current and future challenges. On the other hand, they also impact societal transformation, prompting a need for socially competent and responsible citizens who can navigate this evolving environment [3,4,5,6].
In this uncertain and ever-changing society, it is essential to delve into a model that not only adapts to these shifts but is dynamic, focusing on the development of skills for academic and professional life, underpinned by values and respect for diversity [3]. According to [2], future citizens must be prepared for both existing jobs and those yet to emerge, for technologies not yet invented, and to solve problems that have yet to arise. Moreover, the impact of the pandemic has necessitated a re-evaluation of training methods, with an emphasis on fostering skills that can help individuals tackle new challenges [7].
For many years, debates regarding the alignment of university education with societal and labour demands have garnered significant interest within the academic community [8]. To meet these evolving socio-labour requirements, the educational system embarked on the creation of the European Higher Education Area (EHEA), leading to the development of the Tuning Project. This initiative emerged from the need to implement the processes initiated by the Bologna Declaration of 1999. Among other aspects, this declaration sought to align academic training more closely with the demands of the knowledge society, promoting a new teaching–learning model that adopts a comprehensive, student-centred approach aimed at skill development [9], which, in a globalised and continuously changing society, came to complement the mastery of content [10].
The Tuning Project resulted in establishing that European universities should provide training in both specific and generic skills. Over time, specific skills training was incorporated into the various subjects of each degree. However, while generic skills were recognised as important for the holistic development of students, enabling lifelong learning and fostering higher employability and active, democratic citizenship [3,11], they were often developed indirectly through technical subjects, elective courses, or complementary training activities [6,12].
Given the increasing importance of these generic skills, it is now considered essential for students to be trained in both specific and generic skills, as without this dual focus, “there is a risk of training future professionals who are experts in their specific field but lack the ability to manage their personal and social circumstances” [5]. Furthermore, it has been acknowledged [13] that specific skills may become obsolete over time, whereas generic skills tend to be more enduring. Consequently, according to [14], training in generic skills is now regarded as a vital issue for the university educational community, as their “purpose is to respond to social, educational and labour market needs” [14]. The current context demands individuals who possess openness to cultural diversity, critical thinking, autonomous learning, empathy, flexibility, adaptability, and the ability to cooperate, communicate, and resolve conflicts, among other qualities [11,15]. Despite the significant attention these skills have received from educational policymakers and stakeholders, and the efforts made by universities to include them as part of their educational objectives, expecting students to develop them throughout their academic journey, there remains a paucity of studies that can identify which aspects of the teaching–learning environment influence the development of these generic skills and the best methods for facilitating this process [12,15].
Research suggests that the success of skills development initiatives is partially dependent on the pedagogical methodologies and learning environments provided by educational institutions [16]. According to [12,17], fostering the development of generic skills requires the use of diverse teaching and learning methods, along with various pedagogical practices that allow for continuous and formative assessment of student learning. Furthermore, it is necessary to promote diverse forms of learning by offering students multiple opportunities to learn both inside and outside the classroom, as well as through extracurricular activities. Some studies indicate that generic skills are more effectively developed through specific pedagogical approaches, such as active learning methods and group activities [18,19,20]. In studies conducted by [17,21], it was found that the teaching–learning environment significantly influences students’ perceptions of the development of their generic skills.
For these reasons, high-quality instruments are needed to evaluate the teaching–learning process within the university context, in order to provide a solid diagnosis and propose measures for improvement to ensure educational success.
Although teaching and learning are closely interrelated, it is useful to describe these concepts separately, since teaching is designed to promote student learning [22].
The learning process itself can be conceptualised as an active, constructive, and meaningful process [23] through which individuals acquire or modify knowledge, abilities, behaviours, attitudes, and values through study, experience, instruction, reasoning, or observation. This process results in lasting changes in individuals, affecting their knowledge, abilities, and attitudes [22].
Although learning is an individual activity, it develops within a social and cultural context [24]. It is the outcome of cognitive processes that facilitate the assimilation and internalisation of new information—such as facts, concepts, procedures, and values—while enabling learners to construct meaningful and functional mental representations. These new representations enable learners to adopt new behaviours that can be applied in various contexts [25]. Learning extends beyond the memorisation of information; it also involves complex cognitive processes, including comprehension, application, analysis, synthesis, and evaluation [26].
The understanding of learning has evolved over time. The author of [22] has identified three predominant paradigms that have shaped the learning process from the 20th century to the present day: learning as the acquisition or strengthening of responses (behaviourism, during the first half of the 20th century), learning as the acquisition of knowledge (cognitive psychology/information processing theory, in the 1950s and 1960s), and learning as the construction of knowledge (constructivism and socio-cognitive approaches, from the 1970s onwards). In contemporary approaches, the focus is placed on the learner, while the role of the teacher is redefined as that of a facilitator and mediator who supports the student in the autonomous construction of knowledge.
Adopting this perspective requires learners to develop and refine their learning strategies and self-regulatory mechanisms, enabling them to manage their own learning processes and thereby fostering greater autonomy and self-regulation [27]. This approach is particularly relevant in university contexts, where students, as adult learners, must cultivate the skills, capacities, and abilities necessary to adapt to a society that demands continuous learning.
Like the learning process, the teaching process is inherently multidimensional. It involves a series of actions undertaken by the educator, whose aim is to facilitate the acquisition of educational content—such as knowledge, habits, standards, abilities, techniques, and attitudes—by a student or group of students. These actions are carried out through the application of specific methods and resources, and tailored to defined objectives within a particular context so that students integrate the information both cognitively and behaviourally [22].
From a traditional perspective, this process may be viewed as a unidirectional transmission of knowledge, with the educator, as the holder of knowledge, imparting information to the student, who is perceived as a passive recipient [28]. In contrast, more contemporary approaches, such as constructivism and the socio-cognitive perspective, conceptualise teaching as a process of support, scaffolding, guidance, and facilitation. In this framework, the educator designs enriched learning environments and provides diverse opportunities, resources, and tools that enable students, with the support and mediation of the educator, to construct, organise, internalise, and integrate information meaningfully into their cognitive structures, resulting in the generation of knowledge. In this approach, students are considered active participants and constructors of knowledge [29]. This perspective requires the educator to possess a wide repertoire of skills and abilities in order to manage the teaching process effectively.
Moreover, the conception of the teaching process has evolved over time, influenced by advancements in learning theories. Since theories of teaching and instruction must be grounded in learning theories [30], the interpretation of teaching has shifted in line with theoretical developments and research findings in instructional processes. As a result, to be an effective educator, a diverse set of teaching abilities, capacities, and skills must be acquired, in addition to certain personal attributes.
Within the university context, a substantial body of scientific literature has contributed to defining the repertoire of abilities, capacities, and skills associated with effective teaching. These skills are generally grouped into several key areas: planning of teaching, management of teaching–learning activities through the application of pedagogical methods adapted to specific objectives and contexts, assessment abilities, communication, interpersonal relationships, tutoring, classroom climate management, innovation, teamwork, and the use of information and communication technologies (ICTs).
Given the complexity of these processes, as mentioned above, it is practically impossible to conduct a fully comprehensive evaluation that encompasses all dimensions and components. For this reason, the scientific literature has increasingly focused on the development of conceptual constructs that provide relevant information on the most significant aspects of these processes.
In relation to the learning process, explanatory constructs have been developed that address key elements such as the ability and techniques to study, learning strategies, approaches and styles, student abilities, skills, and academic performance. Similarly, in the field of teaching, various teaching styles, pedagogical models, and orientations have been outlined, in addition to teaching abilities, capabilities, and skills.
In this study, we propose the examination of the integrated evaluation of teaching–learning processes within the university context using a single instrument, with students serving as the primary source of information. The objective is to determine the extent to which the teaching–learning environment designed by educators influences the development of students’ generic skills.
To achieve this, a range of instruments has been developed, each addressing different aspects of the teaching and learning process. Among these, this study focuses on the Student Engagement Questionnaire (SEQ), whose purpose is to assess the effectiveness of teaching based on students’ perceptions and the development of the skills they achieve through their learning.
It should be noted that this instrument has already been validated by [31]; however, this study aims to draw new conclusions by considering a different sample. Moreover, we aim to contrast our findings with those of other studies and to continue providing insights regarding its validity within the Spanish context.

2. Student Engagement Questionnaire (SEQ)

The goal of Kember and his collaborators in developing the Student Engagement Questionnaire (SEQ) was to create a diagnostic tool rigorous enough to identify both strengths and weaknesses within the teaching–learning process. This tool also aimed to provide feedback to educators and institutions to enhance the quality of their educational processes. Moreover, it was crucial that the instrument aligned with existing research on the teaching–learning environment and effectively captured the key constructs necessary for evaluation.
The questionnaire was to cover two major topics:
- The analysis of the skills that university students must acquire during the learning process.
- The evaluation of the learning environment that educators establish in the classroom to facilitate the acquisition of these skills.
To design the range of skills, Kember collaborated with a panel of professors from various faculties, who developed lists of attributes required of graduates from each academic discipline. The results from this process were compared with previous studies in the scientific literature, such as those reviewed by [32], which revealed a high degree of consistency between both. Based on these comparisons, item banks were created to assess the selected skills, which were refined and reduced through successive validation iterations. This process constitutes the first part of the questionnaire.
The second section of the questionnaire was designed to evaluate the key elements of the teaching–learning environment that educators organise to support effective learning. Initially, the constructs included in this section were based on existing literature on the student experience within the educational process. In early versions of the instrument, a limited number of scales were included, as it was assumed that evaluations by teachers were already sufficiently comprehensive. However, in later versions, the number of scales in this second section was expanded.
Throughout the successive validation processes, different versions of the questionnaire have incorporated a variable number of dimensions, scales, variables, or aspects within the two key areas mentioned earlier (student capacities and the teaching–learning environment), all of which have remained consistent across versions.
Finally, the structure of the questionnaire comprises five dimensions, two relating to capacities—Intellectual Capacities and Teamwork—and three relating to the teaching–learning environment—Teaching, the Teacher–Student Relationship, and the Relationship Among Students. Additionally, the questionnaire includes 15 observed variables, grouped into scales that are integrated into the aforementioned dimensions, with seven corresponding to students and eight to the learning environment (see Table 1).
In their model [19], the authors seek to examine the impact of the teaching–learning environment on the development of students’ capacities. According to the classification proposed by [19,33], the authors use nomological validity as evidence of the construct.
To achieve this, the eight scales describing the teaching–learning environment are considered latent indicators of the three previously mentioned dimensions: Teaching, the Teacher–Student Relationship, and the Relationship Among Students. These three constructs are correlated with one another, as indicated by [19,21,34].
Regarding the seven student competencies, which are grouped into two dimensions—Intellectual Capacities and Teamwork—the model establishes that Teamwork has an influence on Intellectual Capacities [19,21,34]. Furthermore, the proposed model demonstrates how the latent dimensions of the teaching–learning environment are linked to the constructs representing student capacities. Specifically, an influence of Teaching is observed on the two latent variables related to student capacities [19,21,34]. Likewise, there is a relationship of influence between the Relationship Among Students and Teamwork [19,21,34]. However, the relationship between the Relationship Among Students and Intellectual Capacities is indirect and mediated through Teamwork.
On the other hand, the authors did not identify a direct relationship between the Teacher–Student Relationship and the two latent variables representing student capacities [34]; instead, an indirect effect on these capacities is observed. The authors suggest that this relationship is interconnected with teaching–learning variables, serving as a prerequisite for effective teaching.
Finally, the model has been validated in subsequent research [20], presenting a consolidated theoretical structure regarding the dimensions and relationships between them, which have remained consistent over time.
The validation conducted by [31] within a Spanish population confirms the structure proposed by the original authors of the questionnaire, particularly in terms of the dimensions and relationships outlined in the model. For these reasons, we consider it appropriate to validate the questionnaire with university students in the field of education to further examine the stability of the proposed model.

3. Materials and Methods

A test validation design [35,36,37] was employed to corroborate the structure of the instrument.

3.1. Participants

The sample consisted of 561 university students from the field of education, attending two universities in the city of Valencia: one public, the Universidad de Valencia (32%), and one private, the Universidad Católica de Valencia (68%). Among the participants, 22.3% were male and 77.7% female, with an average age of 20.8 years. In terms of their academic degrees, 28.9% were pursuing a degree in Primary Education, 3.9% in Early Childhood Education, 21.4% a Dual Degree in Primary and Early Childhood Education, and 45.8% in Social Education. Regarding the year of study, 38.1% were in their first year, 26.6% in their second year, 26.2% in their third year, and 9.1% in their fourth year. Participants were selected through intentional non-probabilistic sampling, as the students were taught by lecturers who were part of the research team for this study. The criterion for selection was enrolment in degree programmes related to education.

3.2. Instrument

The Student Engagement Questionnaire (SEQ), developed by [19] and translated into Spanish by [31], was used in this study. The questionnaire consists of 35 items and assesses 15 variables, 7 related to student competencies and 8 to the teaching–learning environment (see Table 1). It is a Likert-type scale with five response options, ranging from “strongly disagree” to “strongly agree”. The questionnaire is provided in Appendix A (see Table A1).

3.3. Procedure

The university students participating in the study completed the questionnaire at the beginning of the first quarter, via the virtual campus platform, after providing their informed consent.

3.4. Data Analysis

For the analysis, the statistical software packages SPSS and AMOS version 24.0 were used, allowing for various essential statistical calculations to determine the structure of the questionnaire.
For the exploratory analysis, the varimax rotation method was employed, alongside the Kaiser–Meyer–Olkin (KMO) and Bartlett tests, both of which confirmed the suitability of the data for factor analysis using the Kaiser method.
In line with standard recommendations for assessing model fit, the Root Mean Square Error of Approximation (RMSEA) was selected as a key indicator, with values below 0.05 being considered acceptable. A 90% confidence interval was also considered. Additionally, the Comparative Fit Index (CFI) was calculated, yielding an acceptable value.
To assess the reliability of the instrument, Cronbach’s alpha was used [38], as proposed by [19], yielding a value of 0.94, which indicates excellent internal consistency.

4. Results

First, the descriptive statistics of the items are presented, followed by the analysis of the reliability and validation of the model.

4.1. Descriptive Statistics

The means for the various items ranged between 3.33 and 3.98 (see Table 2). The highest mean values were observed in items 5, 10, and 13, while the lowest were recorded for item 12. The standard deviation indicates that the students’ responses present a certain heterogeneity.

4.2. Internal Consistency Analysis

Through this analysis, the internal consistency of the instrument can be estimated; in this case, the Cronbach’s alpha obtained is 0.94, indicating excellent internal consistency (see Table 3).
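As an illustration of what this statistic computes, Cronbach’s alpha can be reproduced in a few lines; the sketch below uses a small hypothetical score matrix, not the study’s data:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of Likert scores."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical responses of 4 students to a 3-item scale (1-5 Likert points)
scores = np.array([[1, 1, 1],
                   [2, 2, 3],
                   [3, 3, 3],
                   [4, 4, 5]])
print(round(cronbach_alpha(scores), 3))
```

The alpha of 0.94 reported here lies well above the conventional 0.70 threshold for acceptable reliability, which is why it is described as excellent.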

4.3. Exploratory Factor Analysis

This statistical analysis allows us to explore the dimensions that make up the questionnaire. First, the varimax rotation method was applied, which aims to maximise factor loadings in order to determine the representativeness of the variables provided by the instrument and reduce them.
Following this, the KMO and Bartlett tests were applied to verify the correlations between the variables and the adequacy of the data. Both indicators are acceptable: the KMO value is greater than 0.60, and Bartlett’s test of sphericity is significant (p < 0.001, below the 0.05 threshold) (see Table 4).
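Both checks can be derived directly from the correlation matrix R and the sample size; the sketch below shows the underlying formulas using a small hypothetical correlation matrix, not the study’s 35-item data:

```python
import numpy as np
from scipy import stats

def bartlett_sphericity(R: np.ndarray, n: int):
    """Bartlett's test that R is an identity matrix (i.e., no correlations)."""
    p = R.shape[0]
    chi2 = -(n - 1 - (2 * p + 5) / 6.0) * np.log(np.linalg.det(R))
    df = p * (p - 1) // 2
    return chi2, df, stats.chi2.sf(chi2, df)  # statistic, dof, p-value

def kmo(R: np.ndarray) -> float:
    """Overall Kaiser-Meyer-Olkin measure of sampling adequacy."""
    A = np.linalg.inv(R)
    d = np.sqrt(np.diag(A))
    partial = -A / np.outer(d, d)          # partial correlations
    off = ~np.eye(R.shape[0], dtype=bool)  # off-diagonal mask
    r2 = (R[off] ** 2).sum()
    a2 = (partial[off] ** 2).sum()
    return r2 / (r2 + a2)

# Hypothetical: three moderately correlated items, n = 100 respondents
R = np.array([[1.0, 0.5, 0.5],
              [0.5, 1.0, 0.5],
              [0.5, 0.5, 1.0]])
chi2, df, p_val = bartlett_sphericity(R, n=100)
print(round(kmo(R), 2), df, p_val < 0.05)
```

A KMO above 0.60 and a significant Bartlett result, as in the example, are exactly the conditions the study checks before proceeding to factor extraction.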
These results allow for the application of factor analysis, based on the Kaiser method, enabling the use of the principal components method to synthesise the variables of the instrument. Following this, the total variance and the proportion of the variance explained by the common factors, referred to as “communality”, will be examined (see Table 5 and Table 6).
The explained variance reveals a total of 16 factors with eigenvalues greater than one, representing the main dimensions of this questionnaire. For instance, the first component alone accounts for 9.879% of the variance of the original data (see Table 5).
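The Kaiser method used here reduces to retaining the components of the correlation matrix whose eigenvalues exceed one; a minimal sketch of that rule, with a hypothetical three-item correlation matrix rather than the study’s data:

```python
import numpy as np

def kaiser_retained(R: np.ndarray):
    """Eigenvalues of a correlation matrix (descending), the number retained
    under the Kaiser criterion, and the % of variance each one explains."""
    eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]   # sort descending
    retained = int((eigvals > 1.0).sum())            # Kaiser: eigenvalue > 1
    explained_pct = 100.0 * eigvals / eigvals.sum()  # trace(R) = number of items
    return eigvals, retained, explained_pct

# Hypothetical: three items all correlated at 0.5
R = np.array([[1.0, 0.5, 0.5],
              [0.5, 1.0, 0.5],
              [0.5, 0.5, 1.0]])
vals, kept, pct = kaiser_retained(R)
print(kept, round(pct[0], 1))  # one component retained, explaining most variance
```

Applied to the 35-item questionnaire, the same rule yields the 16 components with eigenvalues above one reported in Table 5.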
Regarding the proportion of variance explained by the common factors in a variable, it is observed that the model is able to account for 80.5% of the variability of component 1, and similarly for the rest of the components (see Table 6).
Next, the items that make up each of the extracted components or factors will be analysed. This can be seen in Table 7.

4.4. Confirmatory Factor Analysis

In this type of analysis, the focus lies on confirming the structure of the data using empirical criteria. With this, it is possible to confirm the components of the data and determine the role that each element plays within the total set, based on three fundamental questions: the total variance explained by the factors, the variance explained by each factor, and the saturation (loading) of the items on the factors.
To do so, the fit of the data to the model must be assessed, which establishes the basis for the described procedure. Current statistical software packages offer several indices for this purpose; two of them are of particular interest for this study and are discussed below:
- Bentler’s CFI (Comparative Fit Index) builds on an earlier index, the BFI, and is corrected so that it cannot take values outside the range of 0 to 1. Its value should be around 0.95 to consider that the model fits the data adequately. In this study, a value of 0.937 was obtained, indicating an acceptable fit of the model to the data (see Table 8).
- The RMSEA indicates the amount of variance not explained by the model per degree of freedom. A value below 0.05 is taken to indicate a good fit, particularly when the 90% confidence interval (C.I.) lies between 0 and 0.05. The value obtained is 0.044, below the 0.05 threshold, indicating a good fit; the confidence interval value of 0.41 further supports this conclusion (see Table 9 and Figure 1).
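Both fit indices can be computed directly from the model and baseline chi-square statistics. The sketch below shows the standard formulas; the chi-square and degree-of-freedom values are illustrative numbers chosen to land near the reported RMSEA, not the actual AMOS output:

```python
import math

def rmsea(chi2: float, df: int, n: int) -> float:
    """Root Mean Square Error of Approximation (point estimate)."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def cfi(chi2: float, df: int, chi2_null: float, df_null: int) -> float:
    """Comparative Fit Index relative to the independence (null) model."""
    d_model = max(chi2 - df, 0.0)
    d_null = max(chi2_null - df_null, d_model)
    return 1.0 - d_model / d_null if d_null > 0 else 1.0

# Illustrative values (hypothetical, not the study's AMOS output)
print(round(rmsea(chi2=166.7, df=80, n=561), 3))                # near the reported 0.044
print(round(cfi(166.7, 80, chi2_null=2000.0, df_null=105), 3))  # near the reported 0.937
```

Note that both indices depend on the sample size and the null-model statistics, which is why software such as AMOS reports them alongside the chi-square test rather than in isolation.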
These questions construct the following structural model:

5. Discussion

By applying various statistical methods, we were able to characterise the behaviour of the Student Engagement Questionnaire (SEQ) and assess its feasibility. Firstly, we determined that the means of the items are similar, ranging between a minimum of 2.67 and a maximum of 4.31.
One of the items with a lower mean was number 32, associated with cooperative learning, suggesting that there is minimal interaction between the students when discussing content, both in and out of class. In addition, other items relate to this same issue, such as item 30, which suggests a lack of connection or identification with the group. Among the items with a higher mean, those that focus on the teacher’s work are particularly notable, specifically items 18, 19, 26, and 27, which indicate a committed teacher attentive to students’ needs.
Although these observations shed light on how the instrument operates, they are not the object of this study. The findings indicate that the instrument demonstrates a certain homogeneity, which concurs with the results achieved in other studies [19,31]. This consistency reflects the internal structure of the questionnaire, whose items collectively address the same issue and yield results that can be interpreted as a holistic measure of the functioning of the teaching–learning processes and the elements that influence them: intellectual capacities, teamwork, teaching, the teacher–student relationship, and the relationships between students, understood as a set of interconnected elements.
Likewise, it is worth noting another relationship with the aforementioned study [31], which also focused on the Spanish population. The means obtained in this study closely resemble those reported in the Kember and Leung studies [19,21,34], with slight variations, so that there seems to be some consistency, per se, in the Student Engagement Questionnaire (SEQ) in terms of its structure and results.
Secondly, Cronbach’s alpha shows high reliability, reinforcing the notion that the instrument is dependable and well-designed. Therefore, the Student Engagement Questionnaire (SEQ) applied to a Spanish population shows excellent internal consistency, and its intended purpose is effectively fulfilled. Furthermore, it should be noted that the measure achieved here is higher than that reported by the original authors [19,21,34] and similar to that obtained in the earlier Spanish validation [31].
Thirdly, the component analysis carried out allows us to determine that this questionnaire has 16 main factors, which indicates that this is the number of elements that can be measured from its structure (see Table 10). These are the following, all associated with the teaching and learning process:
It is worth mentioning here that the number of components identified and classified in this study is greater than the number proposed in the original studies [19,21,31,34], which suggested a total of 15 factors. This study yielded one additional factor and also categorised the others more precisely, based on the observed factor loadings shown in the previous table. This factor is referred to as the use of teaching methods by the teacher and the creation of spaces for participation. This aligns with other research [39,40] that explores the influence of teaching methodologies on the teaching–learning process.
In addition, it is important to consider the results obtained from the confirmatory factor analysis, which indicate that the responses obtained are sufficiently reliable to be considered adequate, such that they can be taken as a measure of the 16 components already mentioned. These are analysed first in isolation, considering the items linked to each of them, and then holistically, as a measure of the willingness of students and teachers towards the teaching and learning process.
Therefore, these results allow us to establish that the Student Engagement Questionnaire (SEQ) is an instrument that can be used to effectively assess its intended objective within the Spanish context, which is the willingness of students and teachers to engage in the teaching–learning process, based on the dimensions it contemplates and for two purposes:
  • Diagnosis of students’ capacities at the beginning of the course, at the end of a specific timeframe, before or after an intervention, etc.
  • Teacher evaluation during an academic period, to establish new guidelines to follow, etc.
Thanks to the test’s reliability, a wide range of procedures can be performed. For example, undertaking intervention processes that improve students’ attitudes towards the teaching and learning process, either comprehensively or by focusing on certain dimensions that had a lower mean. It is also possible to build a general overview of students’ abilities at the end of specific timeframes and at key moments, such as halfway through the degree or when the academic workload has been completed.
Similarly, considering the reliability of the Student Engagement Questionnaire (SEQ), assessing the teaching staff’s capabilities is also possible, in turn establishing strategies to ensure their ongoing training, proper treatment of students, and regular updates of teaching materials, resources, techniques, methods, and so forth.
In any case, it is a dependable instrument for Higher Education, especially when there is a need to understand how its core component, the teaching and learning process, functions and to propose new and improved approaches centred on it.

6. Conclusions

This study has analysed the reliability of the Student Engagement Questionnaire (SEQ), based on data collected from a sample of 561 university students enrolled in education degree courses at two universities in the city of Valencia: the Universidad de Valencia (32%), a public institution, and the Universidad Católica de Valencia (68%), a private one.
After a thorough examination from three angles, namely internal consistency analysis, component analysis, and factor analysis, it has been determined that the Student Engagement Questionnaire (SEQ) is a well-constructed instrument whose reliability in the Spanish context is satisfactory. It is therefore an optimal test for analysing the structure of teaching and learning processes in the university setting, as its internal composition has been shown to address the question posed effectively.
Furthermore, the statistics obtained, such as Cronbach's alpha, are higher than those reported by the creators of the instrument [19,21,34] and in a previous validation conducted in Spain [31]. This suggests a high level of adaptation to the country, yielding favourable results upon administration, such that the SEQ has the potential to become a valuable instrument for Spanish institutions and researchers, given its verified structure and scope.
One limitation of this study is the use of intentional, non-probabilistic sampling; although the sample cannot be considered representative of the Spanish population, the variability in degree programmes helps to mitigate this limitation. In any case, future research should test the model with larger samples.
Finally, the model could be expanded to include other constructs, such as learning strategies or learning approaches, to gain a better understanding of university students’ learning processes.

Author Contributions

All authors contributed equally to the research and writing of this paper. All authors have read and agreed to the published version of the manuscript.

Funding

This work has been carried out as part of the project “Intervention model and its application in the curricular design of generic competencies in university students in the field of education”, financed by the Department of Innovation, Universities, Science and Digital Society under the grant programme for Emerging Research Groups “CIGE/2022/95”.

Institutional Review Board Statement

All subjects gave informed consent for inclusion before participating in the study. The study was carried out in accordance with the Declaration of Helsinki and the protocol was approved by the Ethics Committee of the Catholic University of Valencia on 21 March 2021 (UCV/2020-2021/127).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Dataset available on request from the authors.

Acknowledgments

The authors wish to express their sincere gratitude to all the university students who participated in this study, as well as to the two participating universities.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Table A1. Student Engagement Questionnaire (SEQ).
Items | Strongly Disagree | Disagree | Undecided | Agree | Strongly Agree
I have developed my capacity to judge alternative perspectives.
I have become more willing to consider alternative perspectives.
I have been motivated to use my own initiative.
I have been challenged to come up with new ideas.
I feel I am able to take responsibility for my own learning.
I have greater confidence in my capacity to continue learning.
In this subject I have learned to be more adaptable.
I have become more willing to change my perspective and accept new ideas.
I have improved my ability to use knowledge to solve problems in my field of study.
I am able to offer information and different ideas to solve problems.
I have developed my ability to communicate effectively with others.
In this subject I have improved my ability to transmit ideas.
I have learned to be an effective team member during group work.
I feel confident dealing with a wide range of people.
I feel confident using computer applications when necessary.
I have learned more about using computers to present information.
The teacher uses a variety of teaching methods.
Students are given the opportunity to participate in classes.
The teacher makes an effort to help understand the course material.
The design of the course helps students understand its contents.
The explanations given by the teacher are useful when I have difficulties with the learning materials.
There is enough feedback on activities and tasks to ensure that we learn from the work we do.
The subject uses a variety of evaluation methods.
To do well when being evaluated in this subject you need to have good analytical skills.
The evaluation assesses our understanding of the key concepts in this subject.
There is good communication between the teacher and students.
The teacher helps when asked.
I am able to complete the programme requirements without feeling excessively stressed.
The workload assigned is reasonable.
I have a strong sense of belonging to my class group.
I regularly work with peers in my classes.
I have regularly discussed course ideas with other students outside of class.
Discussing the course material with other students outside of class has helped me gain a better understanding of the subject.
I can see how the subjects fit together to make a coherent study programme for my specialisation.
The study programme for my specialisation is well-integrated.

References

  1. Almerich, G.; Suárez Rodríguez, J.; Diaz García, I.; Orellana, N. Structure of 21st century competences in students in the sphere of education. Influential personal factors. Educ. XX1 2020, 23, 45–74. [Google Scholar] [CrossRef]
  2. Organisation for Economic Cooperation and Development. The Future of Education and Skills: Education 2030. Available online: https://hdl.handle.net/20.500.12365/17367 (accessed on 15 January 2024).
  3. Aguaded, E. Development of personal competencies in multicultural university contexts. In Development of Intercultural Competencies in University Contexts; Salmeron, H., Rodríguez, S., Cardona, M.L., Eds.; Publicacions de la Universitat de València: Valencia, Spain, 2012; pp. 55–92. [Google Scholar]
  4. Almerich, G.; Díaz-García, I.; Cebrián-Cifuentes, S.; Suárez-Rodríguez, J. Dimensional structure of 21st century competences in university student of education. Relieve 2018, 24, 1–21. [Google Scholar] [CrossRef]
  5. Crespí, P. How Higher Education can develop generic competences? Int. E-J. Adv. Educ. 2020, 6, 23–29. [Google Scholar] [CrossRef]
  6. Crespí, P.; García-Ramos, J.M. Generic skills at university. Evaluation of a training program. Educ. XX1 2021, 24, 297–327. [Google Scholar] [CrossRef]
  7. Comisión Europea. Bologna with Student Eyes 2020. Available online: https://www.esu-online.org/wp-content/uploads/2021/01/BWSE2020-Publication_WEB2.pdf (accessed on 21 February 2024).
  8. Rodríguez Pallares, M.; Segado-Boj, F. Journalistic Competences and Skills in the 21st Century. Perception of Journalism Students in Spain. Ad. Comun. 2020, 20, 67–94. [Google Scholar] [CrossRef]
  9. González-Ferreras, J.M.; Wagenaar, R. Una introducción a Tuning Educational Structures in Europe. In La contribución de las universidades al proceso de Bolonia; Publicaciones de la Universidad de Deusto: Bilbao, Spain, 2006; Available online: http://www.deustopublicaciones.es/deusto/pdfs/tuning/tuning12.pdf (accessed on 23 March 2024).
  10. Salido, P. Active methodologies in initial teacher training: Project-based learning (ABS) and artistic education. Profr. Rev. Currículum Form. Profr. 2020, 24, 120–143. [Google Scholar] [CrossRef]
  11. Council of Europe. A Educación en el Consejo de Europa, Competencias y Cualificaciones para la Vida en Democracia, 2017. Available online: https://rm.coe.int/09000016806fe63e (accessed on 20 April 2024).
  12. Virtanen, A.; Tynjälä, P. Factors explaining the learning of generic skills: A study of university student’s experiences. Teach. High Edu. 2019, 24, 880–894. [Google Scholar] [CrossRef]
  13. Corominas Rovira, E. Generic competencies in university training. Rev. Edu. 2001, 325, 299–321. Available online: https://redined.educacion.gob.es/xmlui/bitstream/handle/11162/75927/008200230385.pdf?sequence=1&isAllowed=y (accessed on 21 April 2024).
  14. Crespí, P.; García-Ramos, J.M. Design and evaluation of a program on transversal personal competencies at the university. In Proceedings of the XIX International Congress of Educational Research, Palma, Spain, 1–3 July 2019; pp. 963–969. [Google Scholar]
  15. Cheng, M.; Lee, K.; Chan, C. Generic skills development in discipline-specific courses in higher education: A systematic literature review. Curr. Teach. 2018, 33, 47–65. [Google Scholar] [CrossRef]
  16. Martín González, M.; Ortiz, S.; Jano, M. Do teaching and learning methodologies affect the skills acquired by master's students? Evidence from Spanish universities. Educ. Know. Soc. 2020, 28, 1–28. [Google Scholar]
  17. Gargallo, B.; Suárez-Rodríguez, J. Methodological formats used in the research and results. Methodological format in the subject Theory of Education. In Teaching Focused on Learning and Competency-Based Design at the University; Gargallo, B., Ed.; Tirant Humanidades: Valencia, Spain, 2017; pp. 287–294. [Google Scholar]
  18. Kember, D. Nurturing generic capabilities through a teaching and learning environment which provides practise in their use. High. Educ. 2009, 57, 37–55. [Google Scholar] [CrossRef]
  19. Kember, D.; Leung, D. Development of a questionnaire for assessing students’ perceptions of the teaching and learning environment and its use in quality assurance. Learn. Environ. Res. 2009, 12, 15–29. [Google Scholar] [CrossRef]
  20. Kember, D.; Leung, D. Disciplinary differences in student ratings of teaching quality. Res. High. Educ. 2011, 52, 278–299. [Google Scholar] [CrossRef]
  21. Kember, D.; Leung, Y.P.L.; Ma, R. Characterizing learning environments capable of nurturing generic capabilities in higher education. Res. High. Educ. 2007, 48, 609–632. [Google Scholar] [CrossRef]
  22. Mayer, R.E. Learning and Instruction; Alianza: Madrid, Spain, 2010. [Google Scholar]
  23. González-Pienda, J.A. The student: Personal variables. In Instructional Psychology I. Basic Variables and Processes; Beltrán, J., Genovard, C., Eds.; Síntesis: Madrid, Spain, 1999; pp. 148–191. [Google Scholar]
  24. Coll, C. The constructivist conception as an instrument for the analysis of school educational practices. In Psychology of School Education; Coll, C., Palacios, J., Marchesi, A., Eds.; Alianza: Madrid, Spain, 2000. [Google Scholar]
  25. Vidal-Abarca, E. Learning and teaching: A view from Psychology. In Learning and Personality Development; Vidal-Abarca, E., García-Ros, R., Pérez-González, F., Eds.; Alianza Editorial: Madrid, Spain, 2010; pp. 19–43. [Google Scholar]
  26. Monereo, C. Learning strategies in formal education: Teaching to think and about thinking. Infanc. Aprendiz. 1990, 50, 3–25. [Google Scholar] [CrossRef]
  27. Zimmerman, B.J. Investigating self-regulation and motivation: Historical background, methodological developments, and future prospects. Am. Edu. Res. J. 2008, 45, 166–183. [Google Scholar] [CrossRef]
  28. Doménech Betoret, F. Educational Psychology and Instruction: Its Application to the Classroom Context; Universitat Jaume I: Castellón, Spain, 2007. [Google Scholar]
  29. Coll, C. The constructivist conception as an instrument for the analysis of school educational practices. In Psychology of Instruction; Coll, C., Ed.; Teaching and Learning in Secondary Education Barcelona: Barcelona, Spain, 2008; p. 16.44. [Google Scholar]
  30. Ertmer, P.A.; Newby, T.J. Behaviorism, cognitivism, constructivism: Comparing critical features from an instructional design perspective. Intern. Impro. Quart. 2013, 6, 43–71. [Google Scholar] [CrossRef]
  31. Gargallo, B.; Suárez, J.; Almerich, G.; Verde, I.; Cebrià, A. The dimensional validation of the Student Engagement Questionnaire (SEQ) with a Spanish university population. Students’ capabilities and the teaching-learning environment. Anal. Psic. 2018, 34, 519–530. [Google Scholar] [CrossRef]
  32. Pascarella, E.T.; Terenzini, P.T. How College Affects Students: Findings and Insights from Twenty Years of Research; Jossey-Bass: San Francisco, CA, USA, 1991. [Google Scholar]
  33. Hair, J.F.; Black, W.C.; Babin, B.J.; Anderson, R.E. Multivariate Data Analysis, 7th ed.; Prentice Hall: Upper Saddle River, NJ, USA, 2010. [Google Scholar]
  34. Kember, D.; Leung, D. The influence of the teaching and learning environment on the development of generic capabilities needed for a knowledge-based society. Learn. Environ. Res. 2011, 8, 245–266. [Google Scholar] [CrossRef]
  35. Crocker, J.C.; Algina, J. Introduction to Classical and Modern Test Theory; Holt, Rinehart and Winston: New York, NY, USA, 1986. [Google Scholar]
  36. Jornet, J.M.; Suárez, J.M. Standardized Testing and Performance Assessment: Metric Uses and Characteristics. Rev. Invesig. Edu. 1996, 14, 141–163. [Google Scholar]
  37. Popham, J. Modern Educational Measurement; Allyn and Bacon: Boston, MA, USA, 1990. [Google Scholar]
  38. Cronbach, L. Coefficient alpha and the internal structure of tests. Psychometrika 1951, 16, 297–334. [Google Scholar] [CrossRef]
  39. Gargallo López, B.; Suárez Rodríguez, J.; Garfella Esteban, P.R.; Fernández March, A. The TAMUFQ Questionnaire (Teaching and Assessment Methodology of University Faculty Questionnaire). An Instrument to Assess the Teaching Methodology of University Faculty. Est. Educ. 2011, 21, 9–40. [Google Scholar] [CrossRef]
  40. Gargallo, B.; Morera, I.; Iborra, S.; Climent, M.J.; Navalón, S.; García, E. Methodology focused on learning. Its impact on learning strategies and academic performance of university students. Rev. Esp. Ped. 2014, 259, 415–435. [Google Scholar]
Figure 1. Structural model.
Table 1. Dimensions and variables of the SEQ questionnaire.
Student Capacity
Categories | Variables
Intellectual capacities
  • Critical thinking.
  • Creative thinking.
  • Self-managed learning.
Teamwork
  • Adaptability.
  • Problem resolution.
  • Interpersonal skills.
  • Communications skills.
Teaching–learning environment variables
Teaching
  • Active learning.
  • Teaching for understanding.
  • Assessment.
  • Curriculum coherence.
Teacher–student relationship
  • Teacher–student interaction.
  • Feedback to help learning.
Relationship between students
  • Relationship with other students.
  • Cooperative learning.
Table 2. Descriptive statistics.
Item | Mean | Std. Dev. | N
1. In this subject I develop my capacity to judge alternative perspectives. | 3.62 | 0.930 | 561
2. I have become more willing to consider alternative perspectives. | 3.71 | 0.871 | 561
3. I have been motivated to use my own initiative. | 3.76 | 0.887 | 561
4. I have been challenged to come up with new ideas. | 3.52 | 0.926 | 561
5. I feel I am able to take responsibility for my own learning. | 3.98 | 0.837 | 561
6. I have greater confidence in my capacity to continue learning. | 3.69 | 0.888 | 561
7. In this subject I have learned to be more adaptable. | 3.54 | 0.853 | 561
8. I have become more willing to change my perspective and accept new ideas. | 3.62 | 0.898 | 561
9. I have improved my ability to use knowledge to solve problems in my field of study. | 3.55 | 0.885 | 561
10. I am able to offer information and different ideas to solve problems. | 3.81 | 0.842 | 561
11. I have developed my ability to communicate effectively with others. | 3.65 | 0.927 | 561
12. In this subject I have improved my ability to transmit ideas. | 3.33 | 0.933 | 561
13. I have learned to be an effective team member during group work. | 3.88 | 0.818 | 561
14. I feel confident dealing with a wide range of people. | 3.70 | 0.886 | 561
15. I feel confident using computer applications when necessary. | 3.68 | 1.032 | 561
16. I have learned more about using computers to present information. | 3.50 | 0.986 | 561
17. The teacher uses a variety of teaching methods. | 3.75 | 0.860 | 561
18. In this subject, students are given the opportunity to participate in classes. | 4.19 | 0.861 | 561
19. The teacher makes an effort to help understand the course material. | 4.19 | 0.725 | 561
20. The design of this subject helps students understand its contents. | 3.73 | 0.890 | 561
21. The explanations given by the teacher are useful when I have difficulties with the learning materials. | 3.88 | 0.823 | 561
22. There is enough feedback on activities and tasks to ensure that we learn from the work we do. | 3.78 | 0.844 | 561
23. The subject uses a variety of evaluation methods. | 3.62 | 0.804 | 561
24. To do well when being evaluated in this subject you need to have good analytical skills. | 3.65 | 0.834 | 561
25. The evaluation assesses our understanding of the key concepts in this subject. | 3.63 | 0.813 | 561
26. There is good communication between the teacher of this subject and their students. | 4.11 | 0.762 | 561
27. The teacher of this subject helps when asked. | 4.31 | 0.692 | 561
28. I am able to complete the programme requirements without feeling excessively stressed. | 3.36 | 0.952 | 561
29. The workload assigned in this subject is reasonable. | 3.73 | 0.770 | 561
30. I have a strong sense of belonging to my class group. | 3.31 | 1.013 | 561
31. I regularly work with peers in my classes. | 3.84 | 0.877 | 561
32. I have regularly discussed course ideas with other students outside of class. | 2.67 | 1.150 | 561
33. Discussing the course material with other students outside of class has helped me gain a better understanding of the subject. | 3.12 | 1.008 | 561
34. I can see how the subjects fit together to make a coherent study programme for my specialisation. | 3.36 | 0.888 | 561
35. The study programme for my specialisation is well-integrated. | 3.42 | 0.949 | 561
Table 3. Cronbach’s alpha.
Cronbach’s AlphaNumber of Elements
0.94135
Table 4. KMO and Bartlett’s test.
Kaiser–Meyer–Olkin measure of sampling adequacy | 0.942
Bartlett's test of sphericity | Approx. Chi-squared: 9088.286; df: 595; Sig.: 0.000
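The Bartlett statistic reported in Table 4 follows the standard sphericity formula, chi-squared = -(n - 1 - (2p + 5)/6) ln|R|, with df = p(p - 1)/2, where |R| is the determinant of the item correlation matrix. A minimal Python sketch is given below; the determinant value used in the example is illustrative, not derived from the study's data, but the degrees of freedom for n = 561 respondents and p = 35 items reproduce the 595 shown in the table.

```python
import math

def bartlett_sphericity(corr_det, n, p):
    """Bartlett's test of sphericity.

    corr_det: determinant of the p x p item correlation matrix
    n: sample size; p: number of items
    Returns the chi-squared statistic and its degrees of freedom.
    """
    chi2 = -(n - 1 - (2 * p + 5) / 6) * math.log(corr_det)
    df = p * (p - 1) // 2
    return chi2, df

# With the study's n = 561 and p = 35 (determinant here is illustrative):
chi2, df = bartlett_sphericity(0.001, 561, 35)
print(df)  # prints 595, matching Table 4
```

The statistic is then compared against a chi-squared distribution with df degrees of freedom; the significance of 0.000 in Table 4 indicates the correlation matrix is not an identity matrix, so factor analysis is appropriate.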
Table 5. Total variance.
Component | Initial Eigenvalues: Total / % Variance / % Accumulated | Sums of Squared Loadings from the Extraction: Total / % Variance / % Accumulated | Sums of Squared Loadings from the Rotation: Total / % Variance / % Accumulated
1 | 12.144 / 34.698 / 34.698 | 12.144 / 34.698 / 34.698 | 3.458 / 9.879 / 9.879
2 | 2.221 / 6.345 / 41.043 | 2.221 / 6.345 / 41.043 | 3.240 / 9.257 / 19.135
3 | 1.941 / 4.687 / 45.730 | 1.941 / 4.687 / 45.730 | 1.970 / 5.628 / 24.764
4 | 1.893 / 4.266 / 49.997 | 1.893 / 4.266 / 49.997 | 1.725 / 4.929 / 29.693
5 | 1.841 / 3.545 / 53.542 | 1.841 / 3.545 / 53.542 | 1.683 / 4.809 / 34.503
6 | 1.760 / 3.315 / 56.857 | 1.760 / 3.315 / 56.857 | 1.679 / 4.796 / 39.299
7 | 1.662 / 2.750 / 59.607 | 1.662 / 2.750 / 59.607 | 1.535 / 4.385 / 43.684
8 | 1.629 / 2.655 / 62.262 | 1.629 / 2.655 / 62.262 | 1.521 / 4.347 / 48.030
9 | 1.555 / 2.442 / 64.705 | 1.555 / 2.442 / 64.705 | 1.521 / 4.345 / 52.375
10 | 1.502 / 2.292 / 66.997 | 1.502 / 2.292 / 66.997 | 1.504 / 4.296 / 56.672
11 | 1.466 / 2.189 / 69.186 | 1.466 / 2.189 / 69.186 | 1.476 / 4.216 / 60.888
12 | 1.435 / 2.099 / 71.285 | 1.435 / 2.099 / 71.285 | 1.467 / 4.191 / 65.079
13 | 1.317 / 2.050 / 73.335 | 1.317 / 2.050 / 73.335 | 1.438 / 4.109 / 69.188
14 | 1.250 / 1.858 / 75.192 | 1.250 / 1.858 / 75.192 | 1.246 / 3.561 / 72.749
15 | 1.126 / 1.788 / 76.980 | 1.126 / 1.788 / 76.980 | 1.228 / 3.508 / 76.257
16 | 1.007 / 1.679 / 78.659 | 1.007 / 1.679 / 78.659 | 0.841 / 2.402 / 78.659
17 | 0.970 / 1.628 / 80.287 | |
18 | 0.828 / 1.509 / 81.797 | |
19 | 0.709 / 1.453 / 83.249 | |
20 | 0.693 / 1.409 / 84.658 | |
21 | 0.687 / 1.392 / 86.051 | |
22 | 0.556 / 1.302 / 87.352 | |
23 | 0.553 / 1.293 / 88.645 | |
24 | 0.426 / 1.218 / 89.863 | |
25 | 0.402 / 1.148 / 91.011 | |
26 | 0.384 / 1.097 / 92.107 | |
27 | 0.374 / 1.068 / 93.176 | |
28 | 0.354 / 1.011 / 94.186 | |
29 | 0.349 / 0.999 / 95.185 | |
30 | 0.313 / 0.896 / 96.080 | |
31 | 0.309 / 0.882 / 96.963 | |
32 | 0.292 / 0.834 / 97.797 | |
33 | 0.283 / 0.809 / 98.606 | |
34 | 0.248 / 0.709 / 99.315 | |
35 | 0.240 / 0.685 / 100.000 | |
Table 6. Communalities.
Component | Initial | Extraction
1 | 1.000 | 0.805
2 | 1.000 | 0.770
3 | 1.000 | 0.731
4 | 1.000 | 0.848
5 | 1.000 | 0.907
6 | 1.000 | 0.700
7 | 1.000 | 0.781
8 | 1.000 | 0.740
9 | 1.000 | 0.711
10 | 1.000 | 0.835
11 | 1.000 | 0.753
12 | 1.000 | 0.686
13 | 1.000 | 0.819
14 | 1.000 | 0.907
15 | 1.000 | 0.873
16 | 1.000 | 0.795
17 | 1.000 | 0.705
18 | 1.000 | 0.745
19 | 1.000 | 0.741
20 | 1.000 | 0.758
21 | 1.000 | 0.743
22 | 1.000 | 0.720
23 | 1.000 | 0.795
24 | 1.000 | 0.813
25 | 1.000 | 0.740
26 | 1.000 | 0.777
27 | 1.000 | 0.790
28 | 1.000 | 0.807
29 | 1.000 | 0.803
30 | 1.000 | 0.767
31 | 1.000 | 0.851
32 | 1.000 | 0.844
33 | 1.000 | 0.787
34 | 1.000 | 0.835
35 | 1.000 | 0.844
Table 7. Rotated component matrix.
Component | Items (factor loading)
Component 1 | 26 (0.81), 27 (0.84)
Component 2 | 1 (0.73), 2 (0.75)
Component 3 | 28 (0.34), 29 (0.42), 34 (0.80), 35 (0.82)
Component 4 | 32 (0.90), 33 (0.80)
Component 5 | 9 (0.38), 10 (0.80)
Component 6 | 3 (0.56), 4 (0.82)
Component 7 | 23 (0.75), 24 (0.80), 25 (0.60)
Component 8 | 30 (0.69), 31 (0.84)
Component 9 | 13 (0.76), 14 (0.33)
Component 10 | 21 (0.58), 22 (0.38)
Component 11 | 11 (0.43), 12 (0.52)
Component 12 | 15 (0.85), 16 (0.63)
Component 13 | 7 (0.51), 8 (0.56)
Component 14 | 5 (0.88), 6 (0.37)
Component 15 | 19 (0.60), 20 (0.66)
Component 16 | 17 (0.27), 18 (0.54)
Table 8. CFI.
Model | NFI (Delta1) | RFI (rho1) | IFI (Delta2) | TLI (rho2) | CFI
Default model | 0.895 | 0.850 | 0.938 | 0.907 | 0.937
Saturated model | 1.000 | – | 1.000 | – | 1.000
Independence model | 0.000 | 0.000 | 0.000 | 0.000 | 0.000
Table 9. RMSEA.
Model | RMSEA | LO 90 | HI 90 | PCLOSE
Default model | 0.044 | 0.041 | 0.048 | 0.031
Independence model | 0.162 | 0.159 | 0.164 | 0.000
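The RMSEA point estimates in Table 9 can be obtained from one common formulation, RMSEA = sqrt(max(chi-squared - df, 0) / (df (n - 1))). A minimal Python sketch follows; the chi-squared and df values in the example are illustrative (chosen only to be of a similar magnitude to the default model's fit), since the model chi-squared is not reported here.

```python
import math

def rmsea(chi2, df, n):
    """Point estimate of the RMSEA (one common formulation).

    chi2: model chi-squared statistic
    df: model degrees of freedom
    n: sample size
    A perfectly fitting model (chi2 <= df) yields 0.
    """
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Illustrative values with the study's n = 561 (NOT the reported model fit):
print(round(rmsea(1100.0, 520, 561), 3))  # prints 0.045
```

Values below 0.05 are conventionally read as close fit, which is consistent with the default model's RMSEA of 0.044 and its 90% confidence interval in Table 9.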
Table 10. 16 factors.
Component | Factor
Component 1 | Communication between the student and the teacher.
Component 2 | Stimulation of critical thinking.
Component 3 | Workload and time management according to the study programme.
Component 4 | Cooperation between students.
Component 5 | Acquisition of tools to solve problems.
Component 6 | Strengthening student initiative.
Component 7 | Teacher evaluation methods and assessments.
Component 8 | Sense of belonging and teamwork.
Component 9 | Integration in teams and appreciation of diversity.
Component 10 | Teacher explanations and feedback on the activities.
Component 11 | Development of assertive communication and transmission of information.
Component 12 | Effective use of IT.
Component 13 | Training of ability to adapt.
Component 14 | Strengthening of self-confidence and accountability in learning.
Component 15 | Teacher support and subject design.
Component 16 | Use of teaching methods by the teacher and creation of participatory spaces.
