Article

How Chinese Undergraduate Students’ Perceptions of Assessment for Learning Influence Their Responsibility for First-Year Mathematics Courses

1 School of Mathematics and Statistics, Northeast Normal University, Changchun 130024, China
2 School of Mathematics and Information Science, Anshan Normal University, Anshan 114056, China
3 School of Humanities and Social Sciences, Beijing Institute of Technology, Beijing 100081, China
* Author to whom correspondence should be addressed.
Mathematics 2024, 12(2), 274; https://doi.org/10.3390/math12020274
Submission received: 11 December 2023 / Revised: 3 January 2024 / Accepted: 3 January 2024 / Published: 14 January 2024

Abstract:
Assessment for learning (AFL) has been associated with curriculum and teaching reform for the past three decades. However, studies of undergraduate students’ perceptions of their mathematics teachers’ AFL practices are still very limited in the Chinese higher education context. This quantitative study investigated three independent variables—teacher formal feedback and support, interactive dialog and peer collaboration, and learning-oriented assessment—that influence undergraduate students’ ability to take responsibility for their learning through the mediation of active engagement with subject matter in first-year mathematics courses. One hundred and sixty-eight students from a Chinese “double-first-class” university were recruited via convenience sampling and provided valid questionnaire data. Partial least-squares structural equation modeling (PLS-SEM) was used to analyze the data. The results showed that interactive dialog and peer collaboration, as well as learning-oriented assessment, had a direct effect on students’ active engagement with the subject matter and an indirect effect on undergraduate students taking responsibility for their learning in first-year mathematics courses. In addition, learning-oriented assessment was the strongest factor influencing undergraduate students’ ability to take responsibility for their learning in first-year mathematics courses. This study contributes by developing a conceptual model and providing new insights for Chinese higher education sectors on factors that can improve undergraduate students’ ability to take responsibility for their learning.

1. Introduction

Many students experience difficulties and challenges in taking mathematics courses at the tertiary level [1]. When students enter their first year of tertiary education at university, they face several shocks of transition from a situation where concepts have an empirical and intuitive foundation to one where concepts are clarified by formal definitions and properties are reconstructed by logical inferences [2,3,4]. The transformative process goes beyond changes in content to include changes in pedagogical approaches, learning methodologies, and assessment modalities [5]. Undoubtedly, these factors contribute to the high failure rates observed in foundational mathematics [6]. Many countries and regions have reported that students perform poorly in first-year mathematics courses, noting that unacceptably high failure rates in these courses have become a catalyst for many students to leave mathematics-intensive programs, delay progression from one academic level to the next, and drop out of university [7,8,9]. In China, the quality of first-year mathematics courses is also an issue under heated dispute, and factors such as students’ learning approaches and attitudes, teachers’ instructional methods, and the construction of institutional textbooks have been implicated in students’ poor performance [10,11,12,13]. Moreover, in order to build a high-quality system in the Chinese mathematics education context, formative assessment (FA) and assessment for learning (AFL) have received more attention in China in recent years [14,15,16], but few studies have been based on systematic empirical approaches, especially on how students experience and perceive their mathematics teachers’ FA and AFL practices.
Specifically, formative assessment (FA) and assessment for learning (AFL) signify a departure from the conventional paradigm that perceives learning as the mere accumulation and production of knowledge. Furthermore, they align with a contemporary paradigm that views learning as a dynamic process involving the discovery and construction of knowledge [17]. It should be noted that AFL emphasizes the learning and teaching process, whereas FA emphasizes the function of certain assessments [18]. In addition, pupils exercise agency and autonomy in AFL, yet in FA they are often seen as passive recipients of their teachers’ assessments [19]. For example, Black et al. point out that when teachers collect information from periodic tests and assessments of their current students for long-term curriculum improvement, in which case the beneficiaries will be pupils at some stage in the future, the assessment has the characteristics of FA but not of AFL [20]. In other words, teaching cannot exist independently of learning in AFL, which Wiliam describes as “keeping learning on track” [21]. In mathematics courses at universities, many theorems and proofs must be taught in a limited amount of time, so how teachers use feedback, interaction, and assessment is a key component of effective classroom instruction, and student perceptions of these activities and strategies affect their mathematics performance, course engagement, and learning approaches [22,23]. AFL advocates several related pedagogical strategies, namely formal feedback from tutors, dialogic teaching and peer interaction, appropriate assessment tasks, and opportunities for understanding [24]. In this paper, we prefer the term AFL because it incorporates a broader range of activities and strategies consistent with advanced pedagogical approaches, rather than approaches to assessment from a pure FA perspective [19].
A range of literature provides extensive evidence that AFL practices are embedded within the pedagogy of a wide number of teachers in higher education [25,26,27]. However, Carless noted, after a synthesis of the relevant literature, that research and development work congruent with AFL was little known in China [28], let alone in mathematics education in higher education. This study investigated students’ perceptions of their mathematics teachers’ AFL practices, including feedback, interaction, and assessment, and focused on how students’ perceptions of specific AFL approaches affected their active engagement with subject matter, which in turn influenced their responsibility for learning in first-year mathematics courses. Our findings will help fill the gap in AFL research in Chinese higher mathematics education, further examine the scope of AFL practices in recent reforms of the mathematics curriculum, and provide a basis for instructional interventions in the teaching and learning process. In addition, some findings differ from those in Western contexts, providing new perspectives and cases for global AFL research.
The remaining parts of this article are organized as follows: in Section 2, we review the relevant literature to introduce the educational system and assessment practices in China, the development of AFL practices and strategies, and the conceptual model of this study. Then, we describe the research methodology in Section 3 and present the research findings in Section 4. We then discuss the findings and their implications in Section 5 and Section 6, respectively. Finally, the article summarizes the main findings as conclusions and discusses limitations and future work in Section 7 and Section 8.

2. Literature Review and Hypothesis Development

2.1. Education System and Assessment Practices in China

China has built one of the largest higher education systems in the world. At the same time, this position inevitably increases various tensions within the higher education system and threatens the quality of teaching conditions in institutions [29,30]. For example, the rapid expansion of higher education has brought challenges such as a noticeable decline in resources and expenditure per student, intensifying the pressures faced by educators in terms of teaching and administration [31]. As a result, various stakeholders have argued that higher education in China must now shift its priority from expansion to quality improvement [32]. Simultaneously, the global emphasis on quality assurance in higher education has prompted numerous surveys, which can be divided into two categories [33]. The earlier type focused on students’ course experiences and their perceptions of the quality of teaching and learning, while the later type examined aspects of student engagement, including behaviors and institutional characteristics [34,35].
China has been playing an important role in the latter category. In 2009, based on the experience of the National Survey of Student Engagement (NSSE), the research team at Tsinghua University initiated the National Survey on Student Engagement in China [36].
Over time, this type of survey has become a cornerstone of higher education research in China, making substantial contributions to academic inquiry, institutional practices, and policy considerations, as demonstrated by a series of research efforts and practical applications [37]. Despite its contributions, this type of survey tends to offer a broad perspective on higher education as a whole and lacks an in-depth examination of what is taught and how undergraduate courses are conducted [38,39]. Since mathematics courses in higher education are complex and nuanced [3], there is a need to develop more theories and instruments for collecting and analyzing data about how students experience mathematics courses; this will help stakeholders develop a better understanding of where the pedagogical obstacles lie and what can be done to improve student success. As AFL is deeply incorporated into classroom teaching and assessment processes, educators in higher education can use it in several ways to evaluate students’ experience of assessment for learning in specific classrooms or institutions.

2.2. Evolution of AFL Practices or Strategies

Assessment for learning has evolved as a deliberate response to counterbalance the predominant focus on process assessment or summative assessment within classrooms and educational institutions [40,41]. Since the publication of Black and Wiliam’s systematic review of formative classroom assessment, the theory of “assessment for learning” has continued to be enriched and developed, and the higher education sector, particularly in the West, has continued to call for the organization of assessment for learning on a large scale to provide a focus for the reform of teaching and learning practices in higher education [28]. AFL practices have been recognized as central components of assessment for learning in both school and higher education settings and form the core of the assessment for learning construct [19]. Initially identified by Black and Wiliam, these practices include fostering productive classroom discussions, providing teacher feedback to encourage student progress, providing opportunities for students to share learning resources, and implementing various strategies that encourage students to take responsibility for their own learning [42]. More recently, Carless has slightly reformulated these AFL practices and added some illustrative processes for adaptation in higher education, such as productive assessment task design, which refers to the development of tasks that have the potential to meaningfully stimulate learning processes [28]. Although the articulation of these practices may differ between major studies, both formulations emphasize adherence to AFL principles and stress the importance of constructive feedback, interactive dialog, and engaging assessment tasks for students [43]. Furthermore, these practices or strategies still need to follow the alignment of teaching, learning, and assessment to empower students to engage in constructive and collaborative learning and become motivated, self-regulated learners [44].
There has also been a wide range of research activities and projects in higher education influenced explicitly or implicitly by AFL principles over the last 30 years or so. For example, Hattie showed that the practices attributed to visible learning were congruent with AFL: student self-evaluation and meta-cognitive strategies, formative evaluation and feedback, and collaborative learning through reciprocal teaching [45]. This is not the only case: there are several examples of positive, sustained implementation of practices and policies congruent with AFL in selected international settings, such as the UK and Australia [46,47]. Although there are some nascent initiatives to introduce a more formative orientation to assessment at the university level, the long history of competitive examinations represents a challenge to AFL in China [48].

2.3. Conceptual Framework

Rather than perceiving learners as mere recipients of AFL practices, contemporary studies emphasize the relationships between learners’ active engagement, learning approaches, and students’ perceptions of AFL activities and strategies [49,50,51]. For example, King, Schrodt, and Weisel proposed a connection between students’ use of teacher feedback, their self-efficacy, and academic performance [52]. Birenbaum explored the relationship between assessment and instructional preferences among undergraduate students and emphasized the need for increased interaction between teachers and students in instructional and assessment processes [53]. Another study by Hernández in an Irish university examined the facilitation of student learning using formative and summative assessment practices in continuous assessment [54]. In summary, these studies reflect the reality that AFL practices such as feedback, interaction, and assessment have an impact on student outcomes and behavior [55,56].
In China, AFL has mainly informed pedagogical interventions in the field of English language teaching. For instance, one study used a quantitative method to explore the relationships among students’ conceptions of feedback, self-regulation, self-efficacy, and language achievement [57]. However, few studies have investigated the role of perceptions of these types of activities or strategies in mathematics education, especially in higher education. An analysis of the Chinese educational system and the evolution of AFL provides insights into possible problems of such systems and identifies potential interventions. Hence, it is worth studying students’ perceptions and applications of AFL practices within the context of higher mathematics education, as these might explain inter-individual differences in responsibility in first-year mathematics courses. Therefore, this study among undergraduate students investigated the impact of students’ perceptions of mathematics teachers’ AFL practices on their engagement with subject matter and responsibility for learning. Based on studies suggesting that AFL activities and strategies predict student performance in mathematics and learning approaches in secondary schools [23,58], it is hypothesized that students’ perceptions of the nature and quality of AFL practices will predict their active engagement in mathematics coursework, thereby influencing their responsibility for learning at university. The conceptual model is shown in Figure 1.
The following initial hypotheses will be tested in this study.
Hypothesis 1 (H1).
Teacher formal feedback and support affects students’ active engagement with the subject matter.
Hypothesis 2 (H2).
Interactive dialog and peer collaboration affects students’ active engagement with the subject matter.
Hypothesis 3 (H3).
Learning-oriented assessment affects students’ active engagement with the subject matter.
Hypothesis 4 (H4).
Students’ active engagement with the subject matter affects their taking responsibility for their learning.
Additionally, this study analyzed the mediating effects of active engagement with subject matter on the relationships between teacher formal feedback and support, interactive dialog and peer collaboration, learning-oriented assessment, and students taking responsibility for their learning.

3. Methodology

This study used a quantitative approach to investigate the factors that influence undergraduate students’ responsibility for learning in first-year mathematics courses, as this approach allows for comparisons of different variables and produces rigorous, objective results [59]. In addition, the study examined the mediating effects within the conceptual model. The Assessment for Learning Experience Inventory (AFLEI) questionnaire served as the basic instrument for data collection, as its five factors operationalize the AFL conceptual model [44], and the partial least-squares structural equation modeling (PLS-SEM) approach was used to analyze the data.

3.1. Sample of the Study

Students in their second year at a “double-first-class” Chinese university were selected as respondents. The reason for this selection was that this group of students had already completed the basic mathematics courses according to the university curriculum, and their experience of the AFL activities and strategies of the first-year mathematics courses was closest to the time of our survey. Our questionnaire was distributed and collected via the Internet, and each student could respond only once. Using this convenience sampling method, we recruited a total of 168 second-year undergraduate students from non-mathematics majors to participate in this study. Their mean age was 18.99 years, with a standard deviation of 2.20. The sample was considered representative of a homogeneous population because all volunteers completed the same number of credits and content in mathematics courses and took the uniform exams [60]. In addition, a full year of learning experiences in mathematics courses would help students more accurately compare their actual learning experiences with those described in the survey and further contribute to the relevance of their responses.

3.2. Measuring Instrument

We used the AFLEI questionnaire, which was developed from the theory of AFL practices and strategies and from existing assessment experience questionnaires, to generate data on students’ perceptions of AFL activities and strategies in first-year mathematics courses [24,40,61]. Results from exploratory factor analyses (EFAs) showed that the Kaiser–Meyer–Olkin (KMO) statistic was 0.88, exceeding the minimum adequacy value of 0.50, and that Bartlett’s test of sphericity yielded a significant chi-square value of 1881.09 (p < 0.001); results from confirmatory factor analyses (CFAs) showed good model fit (SRMR = 0.07; CFI = 0.91). Together, the EFAs and CFAs provided the AFLEI questionnaire with a strong psychometric basis. The final version of the AFLEI questionnaire has 18 items divided into 5 subscales: teacher formal feedback and support, interactive dialog and peer collaboration, learning-oriented assessment, active engagement with subject matter, and students taking responsibility for their learning [44]. Table 1 shows each subscale of the AFLEI questionnaire, brief descriptions of each dimension, and sample items. The full English and Chinese translations of the questionnaire are available in Appendix A and Appendix B, respectively.
On the questionnaire, to avoid neutral options and to capture finer differentiation in students’ attitudes, students were asked to rate their agreement with each item statement on a six-point Likert scale: strongly disagree, disagree, slightly disagree, slightly agree, agree, and strongly agree. Additionally, we added items requesting students’ gender, age, and major. This information was kept confidential.
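As a minimal sketch of how such responses can be coded and aggregated (the six scale labels come from the description above; the item responses and the three-item subscale are hypothetical, not actual AFLEI items):

```python
# Numeric coding of the six-point Likert scale described above (1-6).
LIKERT = {
    "strongly disagree": 1, "disagree": 2, "slightly disagree": 3,
    "slightly agree": 4, "agree": 5, "strongly agree": 6,
}

def subscale_score(responses):
    """Average the numeric codes of one respondent's answers to the
    items of a single subscale."""
    codes = [LIKERT[r.lower()] for r in responses]
    return sum(codes) / len(codes)

# One hypothetical respondent's answers to a three-item subscale:
print(subscale_score(["agree", "strongly agree", "slightly agree"]))  # → 5.0
```

Averaging item codes in this way yields the dimension scores analyzed later (e.g., the subscale means reported in Table 2).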

3.3. Data Collection and Analysis

For our empirical study, we prepared an electronic version of the AFLEI questionnaire and administered it in the fall of 2022. All collected data can be found in Table S1. An initial screening indicated that the data contained no outliers or missing values that would pose a challenge to the subsequent analysis. Thus, we calculated the scores for the five dimensions of the AFLEI questionnaire by averaging the corresponding item scores. Subsequently, we analyzed the data using SPSS 26.0 and SmartPLS 4.0. First, SPSS 26.0 was used for descriptive analysis, including the internal consistency of the scales, the Kaiser–Meyer–Olkin statistic, and Bartlett’s test of sphericity. Then, the Shapiro–Wilk test was conducted to check the normality of the data. Finally, SmartPLS 4.0 was used to explore the factors that influence students taking responsibility for their learning, in terms of teacher formal feedback and support, interactive dialog and peer collaboration, learning-oriented assessment, and active engagement with subject matter in first-year mathematics courses. Because there were only 168 respondents in this study and the sample distribution is non-normal, PLS-SEM is considered a more appropriate SEM approach than the traditional covariance-based structural equation modeling (CB-SEM) approach [62,63]. Compared to the latter, the former requires a smaller sample size and does not require the data to conform to a normal distribution [64].
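A sketch of the normality screen that motivated the choice of PLS-SEM over CB-SEM; the scores below are simulated stand-ins (the real data are in Table S1), and `scipy.stats.shapiro` is used here as a standard implementation of the Shapiro–Wilk test:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated dimension scores for 168 respondents on a 1-6 Likert metric;
# these are illustrative only, not the survey data.
scores = rng.integers(1, 7, size=168).astype(float)

w_stat, p_value = stats.shapiro(scores)
# A p-value below 0.05 signals a significant departure from normality,
# one of the stated reasons PLS-SEM was preferred in this study.
print(f"W = {w_stat:.3f}, p = {p_value:.4f}")
```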

4. Results

The findings are presented in three parts. First, the students’ general perceptions of AFL activities and strategies were inferred from their responses to the questionnaire. The score for each dimension was obtained by averaging the students’ responses to the items of the corresponding dimension, while the score for each item was the average of the responses of all participating volunteers. Then, indicators of the questionnaire, such as composite reliability, convergent validity, and discriminant validity, were tested in the measurement model evaluation of the PLS-SEM. Finally, the PLS-SEM bootstrapping mode was used to show the effects of different aspects of AFL activities and strategies on Chinese undergraduate students’ ability to take responsibility for their learning in first-year mathematics courses, as well as the mediating effects within the AFL conceptual model.

4.1. Students’ Perceptions of AFL Practices in First-Year Mathematics Courses

The results presented in Table 2 show that students did not overwhelmingly agree that the AFL activities and strategies they received in the first-year mathematics courses were sufficient. This information usefully indicates that more assessment tasks should be developed to improve teaching and learning at both the class and institutional levels. Specifically, the active engagement with subject matter factor received a mean score of 4.73. Furthermore, teacher formal feedback and support, interactive dialog and peer collaboration, and learning-oriented assessment received mean scores of 5.09, 4.95, and 5.00, respectively. These outcomes indicated that students recognized the utility of the AFL activities and strategies employed in the first-year mathematics courses. On a more positive note, the factor of students taking responsibility for their learning achieved a mean score of 5.15, suggesting relatively widespread agreement among students about the importance of managing and taking responsibility for their learning.

4.2. Measurement Model Evaluation

The reliability coefficients obtained using Cronbach’s alpha for the subscales of the questionnaire were 0.938, 0.875, 0.885, 0.942, and 0.883. The Kaiser–Meyer–Olkin (KMO) statistic was 0.939, exceeding the minimum adequacy value of 0.50, and Bartlett’s test of sphericity yielded a significant chi-square value of 3126.848 (p < 0.001), which together indicated that the data were suitable for factor analysis [65]. In Table 3, the lowest loading, observed in the construct of interactive dialog and peer collaboration, was 0.794, which exceeded the recommended threshold of 0.708. Furthermore, all t-statistics of the outer loadings were greater than 2.58 (p < 0.001), indicating strong indicator reliability [66]. It should also be noted that the Cronbach’s alpha and composite reliability (CR) of all constructs were close to or above 0.9, exceeding the critical value of 0.7; therefore, the model had good internal consistency reliability [60]. Meanwhile, all variance inflation factor (VIF) values were below 10, with most below 5, showing that the model did not have a serious multicollinearity problem [67,68].
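The Cronbach's alpha coefficients reported above can be computed from raw item scores with a few lines of NumPy. The data below are simulated to mimic an internally consistent subscale (n = 168, as in the study); the reported values themselves come from the actual survey:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                       # number of items in the subscale
    item_vars = items.var(axis=0, ddof=1)    # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Simulated subscale: four items driven by one common factor plus noise.
rng = np.random.default_rng(42)
factor = rng.normal(size=(168, 1))
items = factor + 0.5 * rng.normal(size=(168, 4))
alpha = cronbach_alpha(items)
print(round(alpha, 3))  # lands near the ~0.9 range reported for the subscales
```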
Although the sample size was small, the presence of only five questionnaire factors, each with three or more items, together with acceptable indicator and internal consistency reliability, justified further validity analysis using PLS-SEM. Table 4 shows the convergent validity as Average Variance Extracted (AVE) and the discriminant validity as assessed by the Fornell–Larcker criterion, which emphasize, respectively, the correlation of items under the same latent variable and the differentiation between latent variables [60].
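AVE, composite reliability, and the Fornell–Larcker check reduce to simple formulas over standardized outer loadings. The loadings and inter-construct correlation below are hypothetical illustrations, not the values in Tables 3 and 4:

```python
import math

def ave(loadings):
    """Average Variance Extracted: mean of squared standardized loadings."""
    return sum(l ** 2 for l in loadings) / len(loadings)

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    s = sum(loadings)
    err = sum(1.0 - l ** 2 for l in loadings)
    return s ** 2 / (s ** 2 + err)

lams = [0.80, 0.85, 0.79]   # hypothetical loadings for one three-item construct
r_other = 0.65              # hypothetical correlation with another construct

print(round(ave(lams), 3))                    # → 0.662 (> 0.50: convergent validity)
print(round(composite_reliability(lams), 3))  # → 0.855 (> 0.70: reliability)
# Fornell-Larcker: sqrt(AVE) must exceed the construct's correlations with others.
print(math.sqrt(ave(lams)) > r_other)         # → True
```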
So far, indicator reliability, internal consistency reliability, convergent validity, and discriminant validity have been tested, and all values met the criteria. This meant that the evaluation of the measurement model was satisfactory to proceed to the next step.

4.3. Structural Model Evaluation

Since the measurement model was established with acceptable values in the previous subsection, we proceeded with the structural model to examine the associations among the latent variables. First, we tested the structural validity of the model. According to Henseler et al. [69], “the overall goodness-of-fit of the model should be the starting point of model assessment”. SRMR (standardized root mean square residual) and NFI (normed fit index) values are commonly used to evaluate the suitability and robustness of a model [70]. In this study, the SRMR value was 0.056, below the 0.08 threshold, and the NFI value was 0.833, slightly below 0.90 but still within an acceptable range [71]. Thus, this study had a good model for empirical research, and the estimated PLS-SEM is demonstrated in Figure 2.
Based on Figure 2, we present a summary of the hypothesis testing in Table 5. The first three hypotheses relate to the association between AFL activities and strategies and active engagement with the subject matter. Among these three, teachers’ formal feedback and support had an insignificant effect on students’ active engagement with the subject matter (β = 0.077, p = 0.526 > 0.05). Meanwhile, interactive dialog and peer collaboration significantly affected students’ active engagement with the subject matter (β = 0.267, p = 0.035 < 0.05). This result implied that students found their coursework more interesting and relevant to the wider world when they were exposed to more opportunities for discussion and experienced dialogic teaching and learning through peer interaction. Moreover, learning-oriented assessment greatly affected students’ active engagement with the subject matter (β = 0.451, p < 0.001). This result suggested that learning-oriented assessment, which encourages students to rehearse their subject knowledge, could enhance their enjoyment of their work. Overall, learning-oriented assessment was the strongest factor influencing students’ active engagement with subject matter. Beyond these findings, active engagement with subject matter strongly affected students’ responsibility for their learning (β = 0.723, p < 0.001), suggesting that increased engagement in coursework corresponds to a greater sense of responsibility for learning.
Furthermore, the results indicated that interactive dialog and peer collaboration (β = 0.193, p < 0.05) and learning-oriented assessment (β = 0.326, p < 0.001) had an indirect positive effect on students’ ability to take responsibility for their learning in their first-year mathematics courses. Both were mediated by students’ active engagement with the subject matter.
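As a quick arithmetic check, these indirect effects are simply the products of the direct path coefficients reported above (predictor → engagement, multiplied by engagement → responsibility, β = 0.723):

```python
# Direct path coefficients reported above (Figure 2 / Table 5).
beta_dialog_engage = 0.267  # interactive dialog & peer collaboration → engagement
beta_assess_engage = 0.451  # learning-oriented assessment → engagement
beta_engage_resp = 0.723    # engagement → responsibility for learning

# Indirect effect through the mediator = product of the two direct paths.
print(round(beta_dialog_engage * beta_engage_resp, 3))  # → 0.193
print(round(beta_assess_engage * beta_engage_resp, 3))  # → 0.326
```

Both products reproduce the reported indirect effects (0.193 and 0.326), consistent with active engagement fully carrying these influences as a mediator.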

5. Discussion

The survey showed that interactive dialog and peer collaboration, as well as learning-oriented assessment, were the main factors that influenced undergraduate students to actively engage with mathematical matters and take responsibility for their learning. In particular, learning-oriented assessment was the determinant factor that greatly affected undergraduate students’ course engagement and responsibility for learning. The following paragraphs will analyze each hypothesis in detail.
Teacher formal feedback and support had no significant impact on undergraduate students’ active engagement with the subject matter. In contrast to views that feedback is effective in triggering appropriate formative responses in students when it is perceived as supportive [72,73], this study found that not all feedback provided to learners contributes to their engagement with the subject matter. Given the realities of China’s higher mathematics education system, there is an imbalance between the numbers of teachers and students. One teacher often responds to hundreds of students, and individual feedback and support are almost impossible to provide in such large classes. In addition, Chinese university students tend to dislike teacher-centered pedagogy and prefer teaching styles that encourage creativity and enable collaborative work [74]. This suggests that mathematics teachers should focus on student-centered strategies when implementing feedback practices in mathematics classrooms, and that institutional administrators should assign more teachers or tutors to mathematics classrooms.
Interactive dialog and peer collaboration had a significant impact on undergraduate students’ active engagement with the subject matter. While most studies have focused on formative feedback and assessment, research on perceptions of interactive dialog and peer collaboration is still in its infancy, and very few studies have been reported in international journals. In fact, the factor of interactive dialog and peer collaboration is related to the nature of AFL as a knowledge-construction process involving interactive dialog [75]. This result addresses that gap and is consistent with previous studies showing that students who had the opportunity to experience dialogic teaching and learning through peer interaction were often active participants in the classroom [76,77,78]. This means that mathematics teachers and institutional administrators could give students more opportunities to discuss theorems and propositions through interaction and collaboration.
Learning-oriented assessment had the most significant impact on undergraduate students’ active engagement with the subject matter. Learning-oriented assessment is an approach that advocates the use of different assessment tasks that encourage students to test out ideas, practice relevant skills, and rehearse subject knowledge [79]. We speculated that formative assessment practices, such as having students reflect on their learning processes and on how to improve their learning at the end of an examination, and innovative assessment methods, such as portfolios and concept mapping to examine students’ mastery of the learning content, encourage students to actively engage with the subject matter. This underscores that mathematics teachers and institutional administrators need to be more careful about the assessment tasks they design for students, avoiding rote learning and focusing on flexibility.
Undergraduate students’ active engagement with the subject matter had a highly significant impact on students taking responsibility for their learning. Students who scored high on the former factor tended to be more active and motivated in their coursework, while students who scored high on the latter factor were often more reflective, self-managed, and self-regulated, and thus better able to take responsibility for their learning [44]. This finding is consistent with Kaplan’s review, which concluded that purposeful engagement is central to self-regulated action and noted its potential to guide the search for meaningful dimensions on which to typify self-regulation [80]. It therefore makes sense for mathematics teachers and institutional administrators to relate mathematical problems to the wider world and make course content interesting in order to engage students in mathematics courses.
Further mediation analysis revealed that the effects of interactive dialog and peer collaboration, as well as learning-oriented assessment, on students taking responsibility for their learning were mediated by active engagement with the subject matter. More specifically, interactive dialog and peer collaboration, as well as learning-oriented assessment, had statistically significant regression weights on students taking responsibility for their learning, whereas teacher formal feedback and support did not. These findings are not in line with previous studies showing that feedback is effective in triggering students’ responsibility for learning [73]. Given the contextual constraints of the Chinese education system, the long-term pressure of high-stakes exams pushes students to seek feedback directly from their teachers. This hinders the development of creative thinking and initiative, which are seen as prerequisites for students to take responsibility for their learning. It implies that efforts to promote students’ responsibility for learning should focus on how teachers deliver interactive activities and assessment tasks and on how they improve their feedback.
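The mediation logic above can be illustrated with a small numerical sketch. This is not the authors’ actual PLS-SEM analysis (which was run in dedicated software); it is a generic bootstrap test of an indirect effect, in which the indirect effect of a predictor on the outcome is the product of path a (predictor → mediator) and path b (mediator → outcome, controlling for the predictor). All data and variable names here are synthetic and purely illustrative.

```python
import numpy as np

def ols_slope(x, y):
    """OLS slope of y on x (with intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

def bootstrap_indirect_effect(x, m, y, n_boot=2000, seed=0):
    """Percentile bootstrap CI for the indirect effect a*b in x -> m -> y."""
    rng = np.random.default_rng(seed)
    n = len(x)
    estimates = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)           # resample cases with replacement
        a = ols_slope(x[idx], m[idx])         # path a: predictor -> mediator
        # path b: mediator -> outcome, controlling for the predictor
        X = np.column_stack([np.ones(n), x[idx], m[idx]])
        beta, *_ = np.linalg.lstsq(X, y[idx], rcond=None)
        estimates.append(a * beta[2])
    lo, hi = np.percentile(estimates, [2.5, 97.5])
    return lo, hi

# Synthetic data with a genuine mediation structure (illustrative names only)
rng = np.random.default_rng(42)
x = rng.normal(size=300)                       # e.g., learning-oriented assessment
m = 0.6 * x + rng.normal(scale=0.5, size=300)  # e.g., active engagement
y = 0.7 * m + rng.normal(scale=0.5, size=300)  # e.g., responsibility for learning

lo, hi = bootstrap_indirect_effect(x, m, y)
print(f"95% bootstrap CI for indirect effect: [{lo:.2f}, {hi:.2f}]")
# A CI excluding zero indicates a statistically significant indirect effect
```

A confidence interval that excludes zero, as in this sketch, is the usual criterion for concluding that the mediated (indirect) path is statistically significant.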
Given the unique contribution of this study, one might expect these findings to generalize to all Chinese universities and to benefit all students struggling in advanced mathematics courses. However, Chinese higher education has now moved from massification to universalization. Students’ mathematics backgrounds and teachers’ qualifications may vary across universities, so we need to be cautious in extrapolating our findings. Nevertheless, the current findings do provide insight into how Chinese undergraduate students experience and use their mathematics teachers’ AFL practices and offer teachers and institutional administrators implications for improving the learning, teaching, and assessment environment.

6. Implications

6.1. Theoretical Implications

Assessment for learning practices focus on both the teacher’s teaching process and the student’s learning experience, providing information and explanations for both teachers and students. In particular, generating and collecting data on students’ perceptions of their teachers’ AFL practices contributes to improving instructional processes. Thus, this study developed an AFL conceptual model to identify the factors that positively influence students’ ability to take responsibility for their learning in first-year mathematics courses. We found that some AFL practices and strategies, such as interactive dialog and peer collaboration, as well as learning-oriented assessment, had a direct effect on students’ active engagement with the subject matter and an indirect effect on undergraduate students taking responsibility for their learning in first-year mathematics courses. Furthermore, we noted that even students’ positive perceptions of teacher feedback and support practices do not necessarily result in effective engagement. These findings differ from previous work, in which Chinese students often appeared as objects rather than subjects of assessment, whereas this study fully implemented a student-centered philosophy. It further extends the study of mathematics education by showing that learning-oriented assessment contributed to students taking responsibility for their learning in first-year mathematics courses.

6.2. Practical Implications

In fact, assessment for learning has become increasingly important in curriculum reform and has been strongly promoted by a growing number of educational institutions [41]. For example, many Asian countries and regions at the OECD conference openly called for curricular reforms aimed at reducing the dominance of examinations by placing greater emphasis on the use of assessment to improve teaching and learning [81]. China shares the same examination culture, so collecting and generating high-quality data to evaluate the extent to which AFL practices have been incorporated into the revised university curriculum could inform pedagogical interventions in the curriculum, especially for first-year mathematics courses. The present empirical study therefore provides insights into how Chinese undergraduate students perceive and utilize teachers’ AFL practices in their first-year mathematics courses. Based on these findings, we suggest that teachers encourage students to form learning communities that enhance each other’s learning through interactive negotiation and the sharing of learning resources. For example, in mathematics class, teachers can help students break into study groups to discuss a proposition or theorem. Finally, we encourage diverse assessment approaches and content, including noncognitive skills such as students’ learning attitudes, application skills, and personal enhancement [82]. In recent years, the ubiquity of 5G has made this vision a reality, allowing big data and online platforms to be used to build a picture of the students involved in the mathematics curriculum [83]. It is worth noting that the defining characteristics of mathematics are the ability to think rationally through abstract ideas and logical reasoning and the ability to analyze and solve problems by integrating what has been learned [84], which should be taken into account when designing assessment tasks.

7. Conclusions

Generally speaking, this study examined the effects of undergraduate students’ perceptions of their teachers’ AFL practices in first-year mathematics courses on their sense of responsibility for learning in China. The results provide further evidence on the characteristics of AFL activities and strategies carried out in mathematics classrooms. They revealed several significant effects of undergraduate students’ perceptions of their teachers’ AFL practices on their sense of responsibility for learning, most of which were consistent with the findings of previous studies. Specifically, interactive dialog and peer collaboration, as well as learning-oriented assessment, were found to positively predict students’ active engagement with the subject matter and their taking responsibility for their learning. These results echo the pattern of relationships reported in previous studies conducted in Western contexts [85,86,87]. Given the desirable associations between these factors, instructors are advised to provide more opportunities for student collaboration and more varied assessment approaches and content in their first-year mathematics courses. These opportunities should focus on the development of students’ generic skills, such as communication and interaction skills. Meanwhile, student assessment should emphasize its formative function to facilitate students’ mastery and understanding of knowledge.
Compared with the findings of previous studies, this study also revealed some notable inconsistencies regarding the role of teacher formal feedback and support. In contrast to the findings of Pat-El et al. [58], teacher formal feedback and support did not have a significant relationship with students’ active engagement with the subject matter or their sense of responsibility for learning. This result differs markedly from previous studies conducted in Western contexts [49,73] and prompts us to rethink the role of university teaching in China. It may stem from the fact that university teaching in China has long been characterized by teacher-centeredness and traditional pedagogy [88]. As stated before, students are often seen as recipients of teaching who need only follow their teacher’s instruction and submit to their teachers’ will [32]. Although Chinese university students have been found to hold negative attitudes toward this passive teaching [89], the persistent belief that students are incapable of learning independently without teachers’ spoon-feeding didactic approaches remains in China [90], reinforcing students’ disinclination to self-regulate. Therefore, teachers should make greater efforts to improve the way they provide individualized feedback so that students can take responsibility for their learning. Likewise, we recommend professional development aimed at improving teacher feedback literacy and feedback delivery, as previous research shows that promoting positive student perceptions of teacher assessment practices enhances feedback use and student learning gains [60].

8. Limitations and Directions for Future Research

Using the PLS-SEM approach, the present study provides some preliminary findings to support the construction of quality systems in China. Nevertheless, there are still some limitations, which can also be seen as directions for future research. First, the sample of sophomores was small. To make the results more general and representative, further research could be conducted with a larger sample at a more appropriate time. Second, due to length limitations, the study did not examine differences based on student demographics, such as gender, disciplinary background, and type of institution, which are important issues to address. Finally, the current research used only quantitative methodology, which is inadequate for explaining the complexities associated with undergraduate students’ responsibility for learning. Future studies should use mixed approaches, complementing quantitative with qualitative methods, to explore the factors that influence students’ ability to take responsibility for their learning in first-year mathematics courses.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/math12020274/s1, Table S1: Student perception.

Author Contributions

Conceptualization, B.W. and Y.P.; methodology, B.W. and Y.P.; software, B.W.; validation, B.W., Y.P. and Z.C.; formal analysis, B.W.; investigation, B.W.; resources, Z.C.; Data curation, B.W.; writing—original draft, B.W.; writing—review & editing, B.W., Y.P. and Z.C.; visualization, B.W.; supervision, Z.C.; project administration, B.W. and Z.C.; funding acquisition, Z.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Beijing Social Science Fund Program of Beijing Philosophy and Social Science Planning Office, grant number 23JYC023.

Data Availability Statement

The data presented in this study are available in Supplementary Materials.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Assessment for Learning Experience Inventory

The purpose of this questionnaire is to explore your perceptions of feedback, interaction, and assessment in our first-year mathematics courses, as well as your self-assessment of your engagement in mathematics courses and responsibility for your learning. The results will be used for evaluation and research purposes only, and we promise to keep them confidential.
For each statement, show the extent of your agreement or disagreement by putting a cross in the box that best reflects your current view of courses so far.
Table A1. AFLEI questionnaire.
Response scale: Strongly Disagree / Disagree / Slightly Disagree / Slightly Agree / Agree / Strongly Agree
Q1. Staff were patient in explaining things that seemed difficult to grasp
Q2. The feedback given on my work by staff helped me to improve my ways of learning
Q3. Staff gave me the support I needed to help me approach the set course work
Q4. The feedback given on my work by staff helped to clarify things I hadn’t fully understood
Q5. In class, students had the opportunity to discuss ideas
Q6. In class, there was group cooperative work
Q7. In class, students have the opportunity to ask questions
Q8. I cooperate with other students when doing assignment work
Q9. Portfolios were used to assess student progress
Q10. Concept mapping was used to assess understanding of the learning contents
Q11. Students had the opportunity to decide on their learning objectives and goals
Q12. After each examination or assessment, staff encouraged students to reflect upon their learning processes and how to improve their learning
Q13. I found most of what I learned in my course really interesting
Q14. My course encouraged me to relate what I learned to issues in the wider world
Q15. I enjoyed being involved in my course
Q16. I developed my understanding of the subject content through communication with other students
Q17. I was prompted to think about how well I was learning and how I might improve
Q18. Students supported each other and tried to give help when it was needed

Appendix B. “学习性评价” 经历问卷

这份问卷旨在收集你在一年级高等数学课程学习过程中对于接收到的反馈、互动和评估活动的感受, 此外我们还想你自己做一个关于你的学习投入和学习责任感的自评。你回答的所有信息仅用于研究, 我们承诺对此完全保密.
对于每一个题项, 你可以通过打勾的方式表达你的满意或者不满意以及对应的程度.
Table A2. “学习性评价” 经历问卷.
非常不同意 / 不同意 / 略微不同意 / 略微同意 / 同意 / 非常同意
1. 老师耐心地给同学讲解难以掌握的内容
2. 老师对我的作业所给予的评语或建议有助于我改进我的学习方法
3. 在完成作业的过程中, 我得到了老师的支持
4. 老师对我的作业所给予的评语或建议有助于我理解先前我不太明白的内容
5. 课堂上, 学生有机会参加讨论
6. 课堂上有学生小组合作学习的活动
7. 在课堂上, 学生有机会问问题
8. 我和我的同学合作共同完成作业和功课
9. 通过记录平时学习过程和表现来衡量学习上的进步
10. 课堂上用概念图来检查对学习内容的理解和掌握
11. 学生自己决定个人学习目标和目的
12. 考试或评核之后, 老师引导同学反思功课哪些方面需要改进
13. 我觉得所学的大部分东西都很有趣
14. 我的课程鼓励我将我所学到的东西与课外的一些问题联系起来
15. 我喜欢学我的课程
16. 我通过和其他同学沟通来帮助我理解所学的内容
17. 在学习时, 我常被引发思考自己学得怎样以及我该怎样进一步提升自己
18. 同学间互相支持, 以及帮助有需要的同学

References

  1. Gueudet, G. Investigating the secondary-tertiary transition. Educ. Stud. Math. 2008, 67, 237–254. [Google Scholar] [CrossRef]
  2. Alcock, L.; Simpson, A. Convergence of sequences and series: Interactions between visual reasoning and the learner’s beliefs about their own role. Educ. Stud. Math. 2004, 57, 1–32. [Google Scholar] [CrossRef]
  3. Clark, M.; Lovric, M. Suggestion for a theoretical model for secondary-tertiary transition in mathematics. Math. Educ. Res. J. 2008, 20, 25–37. [Google Scholar] [CrossRef]
  4. Selden, A. Transitions and proof and proving at tertiary level. In Proof and Proving in Mathematics Education: The 19th ICMI Study; Hanna, G., De Villiers, M., Eds.; Springer: Dordrecht, The Netherlands, 2012; Volume 15, pp. 391–420. [Google Scholar]
  5. Bardelle, C.; Martino, D.P. E-learning in secondary–tertiary transition in mathematics: For what purpose? ZDM-Math. Educ. 2012, 44, 787–800. [Google Scholar] [CrossRef]
  6. Smith, W.M.; Rasmussen, C.; Tubbs, R. Introduction to the special issue: Insights and lessons learned from mathematics department in the process of change. PRIMUS 2021, 31, 239–251. [Google Scholar] [CrossRef]
  7. Laursen, S. Levers for Change: An Assessment of Progress on Changing STEM Instruction, 1st ed.; American Association for the Advancement of Science: Washington, DC, USA, 2019. [Google Scholar]
  8. Zakariya, Y.F. Undergraduate Students’ Performance in Mathematics: Individual and Combined Effects of Approaches to Learning, Self-Efficacy, and Prior Mathematics Knowledge; Department of Mathematical Sciences, University of Agder: Kristiansand, Norway, 2021. [Google Scholar]
  9. Ellis, J.; Fosdick, B.K.; Rasmussen, C. Women 1.5 times more likely to leave STEM pipeline after calculus compared to men: Lack of mathematical confidence a potential culprit. PLoS ONE 2016, 11, e0157447. [Google Scholar] [CrossRef] [PubMed]
  10. Liu, J.; Liu, Q.; Zhang, J.; Shao, Y.Y.; Zhang, Z.K. The trajectory of Chinese mathematics textbook development for supporting students’ interest within the curriculum reform: A grade eight example. ZDM-Math. Educ. 2022, 54, 625–637. [Google Scholar] [CrossRef]
  11. Li, S.; Shen, Y.; Jiao, X.; Cai, S. Using Augmented Reality to Enhance Students’ Representational Fluency: The Case of Linear Functions. Mathematics 2022, 10, 1718. [Google Scholar] [CrossRef]
  12. Wang, L. Effects of online learning communities on college students’ knowledge learning and construction. J. Interdiscip. Math. 2018, 21, 377–387. [Google Scholar] [CrossRef]
  13. Su, C.Y.; Chen, C.H. Investigating the effects of flipped learning, student question generation, and instant response technologies on students’ learning motivation, attitudes, and engagement: A structural equation modeling. Eurasia J. Math. Sci. Technol. 2018, 14, 2453–2466. [Google Scholar]
  14. Gu, F.; Gu, L. Characterizing mathematics teaching research specialists’ mentoring in the context of Chinese lesson study. ZDM-Math. Educ. 2016, 48, 441–454. [Google Scholar] [CrossRef]
  15. Li, N.; Cao, Y.; Mok, I.A.C. A framework for teacher verbal feedback: Lessons from Chinese mathematics classrooms. Eurasia J. Math. Sci. Technol. 2016, 12, 2465–2480. [Google Scholar] [CrossRef]
  16. Zhou, J.; Bao, J.; He, R. Characteristics of Good Mathematics Teaching in China: Findings from Classroom Observations. Int. J. Sci. Math. Educ. 2023, 21, 1177–1196. [Google Scholar] [CrossRef]
  17. Buhagiar, M.A. Classroom assessment within the alternative assessment paradigm: Revisiting the territory. Curric. J. 2007, 18, 39–56. [Google Scholar] [CrossRef]
  18. Wiliam, D. What is assessment for learning? Stud. Educ. Eval. 2011, 37, 3–14. [Google Scholar] [CrossRef]
  19. Swaffield, S. Getting to the heart of authentic assessment for learning. Assess. Educ. 2011, 18, 433–449. [Google Scholar] [CrossRef]
  20. Black, P.; Harrison, C.; Lee, C.; Marshall, B.; Wiliam, D. Assessment for Learning: Putting It into Practice, 1st ed.; Open University Press: Maidenhead, UK, 2003. [Google Scholar]
  21. Wiliam, D. Keeping learning on track: Formative assessment and the regulation of learning. In Second Handbook of Mathematics Teaching and Learning; Lester, F.K., Jr., Ed.; Information Age: Greenwich, CT, USA, 2007; pp. 1053–1098. [Google Scholar]
  22. Harris, L.R.; Brown, G.T.L.; Harnett, J.A. Understanding classroom feedback practices: A study of New Zealand student experiences, perceptions, and emotional responses. Educ. Assess. Eval. Acc. 2014, 26, 107–133. [Google Scholar] [CrossRef]
  23. Kyaruzi, F.; Strijbos, J.W.; Ufer, S. Students’ assessment for learning perceptions and their mathematics achievement in Tanzanian secondary schools. In Proceedings of the 40th Conference of the International Group for the Psychology of Mathematics Education, Szeged, Hungary, 3–7 August 2016; Csíkos, C., Rausch, A., Szitányi, J., Eds.; PME: Szeged, Hungary, 2016; pp. 69–84. [Google Scholar]
  24. McDowell, L.; Wakelin, D.; Montgomery, C.; King, S. Does assessment for learning make a difference? The development of a questionnaire to explore the student response. Assess. Eval. High. Educ. 2011, 36, 749–765. [Google Scholar] [CrossRef]
  25. Black, P.; Harrison, C.; Lee, C.; Marshall, B.; Wiliam, D. Working inside the Black Box: Assessment for Learning in the classroom. Phi Delta Kappan 2004, 86, 8–21. [Google Scholar] [CrossRef]
  26. Carless, D. Excellence in University Assessment: Learning from Award-Winning Practice, 1st ed.; Routledge: London, UK, 2015. [Google Scholar]
  27. Sambell, K.; McDowell, L.; Montgomery, C. Assessment for Learning in Higher Education, 1st ed.; Routledge: London, UK, 2013. [Google Scholar]
  28. Carless, D. Scaling Up Assessment for Learning: Progress and Prospects. In Scaling Up Assessment for Learning in Higher Education; Carless, D., Bridges, S.M., Chan, C.K.Y., Glofcheski, R., Eds.; Springer: Singapore, 2017; pp. 3–17. [Google Scholar]
  29. Zhang, H.; Foskett, N.; Wang, D.; Qu, M. Student satisfaction with undergraduate teaching in China: A comparison between research-intensive and other universities. High. Educ. Policy 2011, 24, 1–24. [Google Scholar] [CrossRef]
  30. Yin, H.; Lu, G.; Wang, W. Unmasking the teaching quality of higher education: Students’ course experience and approaches to learning in China. Assess. Eval. High. Educ. 2014, 39, 949–970. [Google Scholar] [CrossRef]
  31. Lee, J.C.K.; Huang, Y.X.; Zhong, B. Friend or foe: The impact of undergraduate teaching evaluation in China. High. Educ. Rev. 2012, 44, 5–25. [Google Scholar]
  32. Yin, H.; Wang, W.; Han, J. Chinese undergraduates’ perceptions of teaching quality and the effects on approaches to studying and course satisfaction. High. Educ. 2016, 71, 39–57. [Google Scholar] [CrossRef]
  33. Yin, H.; Ke, Z. Students’ course experience and engagement: An attempt to bridge two lines of research on the quality of undergraduate education. Assess. Eval. High. Educ. 2017, 42, 1145–1158. [Google Scholar] [CrossRef]
  34. Coates, H. The value of student engagement for higher education quality assurance. Qual. High. Educ. 2005, 11, 25–36. [Google Scholar] [CrossRef]
  35. Coates, H.; Mccormick, A.C. Introduction: Student engagement-a window into undergraduate education. In Engaging University Students: International Insights from System-Wide Studies; Coates, H., Mccormick, A.C., Eds.; Springer: Dordrecht, The Netherlands, 2014; pp. 1–12. [Google Scholar]
  36. Shi, J. The Transformation of Quality Assurance in Higher Education in China. In Higher Education in Asia/Pacific; Bigalke, T.W., Neubauer, D.E., Eds.; Palgrave Macmillan: New York, NY, USA, 2009; pp. 99–110. [Google Scholar] [CrossRef]
  37. Guo, F.; Gao, X.; Yang, J.; Shi, J. CCSS and Student Engagement In China: From an academic concept to institutional practices and policy implications. In Global Student Engagement: Policy Insights and International Research Perspectives; Coates, H., Gao, X., Guo, F., Shi, J., Eds.; Routledge: London, UK, 2022; pp. 27–42. [Google Scholar]
  38. Campbell, C.M.; Cabrera, A.F. How sound is NSSE: Investigating the psychometric properties of NSSE at a public, research-extensive institution. Rev. High. Educ. 2011, 35, 77–103. [Google Scholar] [CrossRef]
  39. Porter, S.R. Do college student surveys have any validity? Rev. High. Educ. 2011, 35, 45–76. [Google Scholar] [CrossRef]
  40. Pat-El, R.J.; Tillema, H.; Segers, M.; Vedder, P. Validation of Assessment for Learning Questionnaires for teachers and students. Br. J. Educ. Psychol. 2013, 83, 98–113. [Google Scholar] [CrossRef]
  41. Stiggins, R. From formative assessment to assessment for learning: A path to success in standards-based schools. Phi Delta Kappan 2005, 87, 324–328. [Google Scholar] [CrossRef]
  42. Black, P.; Wiliam, D. Assessment and classroom learning. Assess. Educ. 1998, 5, 7–74. [Google Scholar] [CrossRef]
  43. Assessment Reform Group. Assessment for Learning: 10 Principles. 2002. Available online: http://www.assessment-reform-group.org.uk/ (accessed on 26 October 2023).
  44. Gan, Z.; He, J.; Mu, K. Development and Validation of the Assessment for Learning Experience Inventory (AFLEI) in Chinese Higher Education. Asia-Pac. Educ. Res. 2019, 28, 371–385. [Google Scholar] [CrossRef]
  45. Hattie, J. Visible Learning: A Synthesis of over 800 Meta-Analyses Relating to Achievement, 1st ed.; Routledge: London, UK, 2009. [Google Scholar]
  46. Boud, D. Shifting views of assessment: From secret teachers’ business to sustaining learning. In Advances and Innovations in University Assessment and Feedback; Kreber, C., Anderson, C., Entwistle, N., McArthur, J., Eds.; Edinburgh University Press: Edinburgh, UK, 2014; pp. 13–31. [Google Scholar]
  47. Meyer, L.; Davidson, S.; McKenzie, L.; Rees, M.; Fletcher, R.; Johnston, P. An investigation of tertiary assessment policy and practice: Alignment and contradictions. High. Educ. Q. 2010, 64, 331–350. [Google Scholar] [CrossRef]
  48. Chen, Q.; Kettle, M.; Klenowski, V.; May, L. Interpretations of formative assessment in the teaching of English at two Chinese universities: A sociocultural perspective. Assess. Eval. High. Educ. 2013, 38, 831–846. [Google Scholar] [CrossRef]
  49. Jonsson, A. Facilitating productive use of feedback in higher education. Act. Learn. High. Educ. 2013, 14, 63–76. [Google Scholar] [CrossRef]
  50. Winstone, N.E.; Nash, R.A.; Parker, M.; Rowntree, J. Supporting learners’ agentic engagement with feedback: A systematic review and a taxonomy of recipience processes. Educ. Psychol. 2017, 52, 17–37. [Google Scholar] [CrossRef]
  51. Carless, D.; Boud, D. The development of student feedback literacy: Enabling uptake of feedback. Assess. Eval. High. Educ. 2018, 43, 1315–1325. [Google Scholar] [CrossRef]
  52. King, P.E.; Schrodt, P.; Weisel, J.J. The Instructional Feedback Orientation Scale: Conceptualizing and validating a new measure for assessing perceptions of instructional feedback. Commun. Educ. 2009, 58, 235–261. [Google Scholar] [CrossRef]
  53. Birenbaum, M. Assessment and instruction preferences and their relationship with test anxiety and learning strategies. High. Educ. 2007, 53, 749–768. [Google Scholar] [CrossRef]
  54. Hernández, R. Does continuous assessment in higher education support student learning? High. Educ. 2012, 64, 489–502. [Google Scholar] [CrossRef]
  55. Ajzen, I. The theory of planned behavior. Organ. Behav. Hum. Dec. 1991, 50, 179–211. [Google Scholar] [CrossRef]
  56. Ajzen, I. Attitudes, Personality and Behavior, 2nd ed.; Open University Press: Milton-Keynes, UK, 2005. [Google Scholar]
  57. Lu, S.; Cheng, L.; Chahine, S. Chinese university students’ conceptions of feedback and the relationships with self-regulated learning, self-efficacy, and English language achievement. Front. Psychol. 2022, 13, 1047323. [Google Scholar] [CrossRef]
  58. Pat-El, R.J.; Tillema, H.; Segers, M.; Vedder, P. Multilevel predictors of differing perceptions of assessment for learning practices between teachers and students. Assess. Educ. 2015, 22, 282–298. [Google Scholar] [CrossRef]
  59. Neuman, W.L. Social Research Methods: Qualitative and Quantitative Approaches, 7th ed.; Pearson: London, UK, 2009. [Google Scholar]
  60. Hair, J.F.; Black, W.; Babin, B.; Anderson, R. Multivariate Data Analysis, 8th ed.; Cengage Learning EMEA: Hampshire, UK, 2019. [Google Scholar]
  61. Gibbs, G.; Simpson, C. Measuring the response of students to assessment: The Assessment Experience Questionnaire. In Improving Student Learning: Theory, Research and Scholarship; Rust, C., Ed.; OCSLD: Oxford, OH, USA, 2004; pp. 171–185. [Google Scholar]
  62. Wijaya, T.T.; Zhou, Y.; Houghton, T.; Weinhandl, R.; Lavicza, Z.; Yusop, F.D. Factors Affecting the Use of Digital Mathematics Textbooks in Indonesia. Mathematics 2022, 10, 1808. [Google Scholar] [CrossRef]
  63. Wijaya, T.T.; Yu, B.; Xu, F.; Yuan, Z.; Mailizar, M. Analysis of Factors Affecting Academic Performance of Mathematics Education Doctoral Students: A Structural Equation Modeling Approach. Int. J. Environ. Res. Pub. Health 2023, 20, 4518. [Google Scholar] [CrossRef] [PubMed]
  64. Hair, J.F.; Sarstedt, M.; Ringle, C.M.; Mena, J.A. An assessment of the use of partial least squares structural equation modeling in marketing research. J. Acad. Mark. Sci. 2012, 40, 414–433. [Google Scholar] [CrossRef]
  65. George, D.; Mallery, P. SPSS for Windows Step by Step: A Simple Guide and Reference, 4th ed.; Allyn & Bacon: Boston, MA, USA, 2011. [Google Scholar]
  66. Hair, J.F.; Risher, J.J.; Sarstedt, M.; Ringle, C.M. When to use and how to report the results of PLS-SEM. Eur. Bus. Rev. 2019, 31, 2–24. [Google Scholar] [CrossRef]
  67. O’Brien, R.M. A caution regarding rules of thumb for variance inflation factors. Qual. Quant. 2007, 41, 673–690. [Google Scholar] [CrossRef]
  68. Hair, J.F.; Ringle, C.M.; Sarstedt, M. PLS-SEM: Indeed a silver bullet. J. Mark. Theory Pract. 2011, 19, 139–152. [Google Scholar] [CrossRef]
  69. Fornell, C.; Larcker, D.F. Evaluating structural equation models with unobservable variables and measure error. J. Mark. Res. 1981, 18, 39–50. [Google Scholar] [CrossRef]
  70. Schuberth, C.H.; Rademaker, M.E.; Henseler, J. Assessing the overall fit of composite models estimated by partial least squares path modeling. Eur. J. Mark. 2022, 57, 1678–1702. [Google Scholar] [CrossRef]
  71. Ziggers, G.W.; Henseler, J. The reinforcing effect of a firm’s customer orientation and supply-base orientation on performance. Ind. Mark. Manag. 2016, 52, 18–26. [Google Scholar] [CrossRef]
  72. Kyaruzi, F.; Strijbos, J.W.; Ufer, S.; Brown, G.T.L. Students’ formative assessment perceptions, feedback use and mathematics performance in secondary schools in Tanzania. Assess. Educ. 2019, 26, 278–302. [Google Scholar] [CrossRef]
  73. Lipnevich, A.A.; Berg, D.A.G.; Smith, J.K. Toward a model of student response to feedback. In The Handbook of Human and Social Conditions in Assessment; Brown, G.T.L., Harris, L.R., Eds.; Routledge: New York, NY, USA, 2016; pp. 169–185. [Google Scholar]
  74. Brown, G.T.L.; Wang, Z. Illustrating assessment: How Hong Kong university students conceive of the purposes of assessment. Stud. High. Educ. 2013, 38, 1037–1057. [Google Scholar] [CrossRef]
  75. Black, P.; Wiliam, D. Developing the theory of formative assessment. Educ. Assess. Eval. Acc. 2009, 21, 5–31. [Google Scholar] [CrossRef]
  76. Aguiar, O.G.; Mortimer, E.F.; Scott, P. Learning from and responding to students’ questions: The authoritative and dialogic tension. J. Res. Sci. Teach. 2010, 47, 174–193. [Google Scholar] [CrossRef]
  77. Kim, D.; Jang, S.E. Dialogic practices in using podcasting and blogging as teaching tools for teachers seeking ESOL certificate. J. Educ. Comput. Res. 2014, 51, 205–232. [Google Scholar] [CrossRef]
  78. Alkhouri, J.S.; Donham, C.; Pusey, T.S.; Signorini, A.; Stivers, A.H.; Kranzfelder, P. Look Who’s Talking: Teaching and Discourse Practices across Discipline, Position, Experience, and Class Size in STEM College Classrooms. BioScience 2021, 71, 1063–1078. [Google Scholar] [CrossRef]
  79. Carless, D. Learning-oriented assessment: Conceptual bases and practical implications. Innov. Educ. Teach. Int. 2007, 44, 57–66. [Google Scholar] [CrossRef]
  80. Kaplan, A. Clarifying metacognition, self-regulation, and self-regulated learning: What’s the purpose? Educ. Psychol. Rev. 2008, 20, 477–484. [Google Scholar] [CrossRef]
  81. Gallagher, M.J.; Malloy, J.; Ryerson, R. Achieving excellence: Bringing effective literacy pedagogy to scale in Ontario’s publicly-funded education system. J. Educ. Change 2016, 17, 477–504. [Google Scholar] [CrossRef]
  82. Heckman, J.J.; Rubinstein, Y. The benefits of skill: The importance of noncognitive skills: Lessons from the GED testing program. Am. Econ. Rev. 2001, 91, 145–154. [Google Scholar] [CrossRef]
  83. Boden, M.A. The Creative Mind: Myths and Mechanisms, 2nd ed.; Routledge: London, UK, 2004. [Google Scholar]
  84. Tall, D. Advanced Mathematical Thinking; Kluwer Academic Publishers: Dordrecht, The Netherlands, 1991. [Google Scholar]
  85. Boud, D. Enhancing Learning through Self-Assessment, 1st ed.; Kogan Page: London, UK, 1995. [Google Scholar]
  86. Boud, D. Sustainable assessment: Rethinking assessment for the learning society. Stud. Contin. Educ. 2000, 22, 151–167. [Google Scholar] [CrossRef]
  87. Boud, D.; Molloy, E. Rethinking models of feedback for learning: The challenge of design. Assess. Eval. High. Edu. 2013, 38, 698–712. [Google Scholar] [CrossRef]
  88. Yin, Y.M.; Mu, G.M. Examination-oriented or quality-oriented? A question for fellows of an alternative teacher preparation program in China. Aust. Educ. Res. 2022, 49, 727–742. [Google Scholar] [CrossRef]
  89. Tam, K.Y.; Heng, M.A.; Jiang, G.H. What undergraduate students in China say about their professors’ teaching. Teach. High. Educ. 2009, 14, 147–159. [Google Scholar] [CrossRef]
  90. Webster, R.J.; Chan, W.S.; Prosser, M.T.; Watkins, D. Approaches to studying and perceptions of the academic environment among university studies in Pakistan. Compare 2009, 41, 113–127. [Google Scholar]
Figure 1. AFL conceptual model.
Figure 2. Final model with path coefficients and p values.
Table 1. Sample items of each factor of the AFLEI questionnaire.

| Factor | Short Description | Item Number | Sample Item |
|---|---|---|---|
| F1. Teacher formal feedback and support | Students need to receive constructive guidance about how to improve | 1, 2, 3, 4 | Staff were patient in explaining things that seemed difficult to grasp |
| F2. Interactive dialog and peer collaboration | Knowledge construction process involving interactive dialog | 5, 6, 7, 8 | In class, students had the opportunity to discuss ideas |
| F3. Learning-oriented assessment | Different assessment tasks encourage students to test out ideas, skills, and knowledge | 9, 10, 11, 12 | Portfolios were used to assess student progress |
| F4. Active engagement with subject matter | Constructive alignment leads to students’ engagement with their coursework | 13, 14, 15 | I found most of what I learned in my course really interesting |
| F5. Students taking responsibility for their learning | Aspects of the AFL enable students to manage and take ownership of their learning | 16, 17, 18 | I developed my understanding of the subject content through communication with other students |
Table 2. Descriptive statistics of undergraduate students’ perceptions.

| Factor | Item Number | M | SD | Cronbach’s α |
|---|---|---|---|---|
| F1. Teacher formal feedback and support | 4 | 5.09 | 1.08 | 0.938 |
| F2. Interactive dialog and peer collaboration | 4 | 4.95 | 1.19 | 0.875 |
| F3. Learning-oriented assessment | 4 | 5.00 | 1.06 | 0.885 |
| F4. Active engagement with subject matter | 3 | 4.73 | 1.30 | 0.942 |
| F5. Students taking responsibility for their learning | 3 | 5.15 | 0.96 | 0.883 |
Table 3. Results for outer loadings, reliability, and multicollinearity.

| Factor | Item | CR | Outer Loadings | T-Statistics | p Values | VIF |
|---|---|---|---|---|---|---|
| F1. Teacher formal feedback and support | Q1 | 0.956 | 0.868 | 33.235 | 0.000 | 2.505 |
| | Q2 | | 0.951 | 84.705 | 0.000 | 6.072 |
| | Q3 | | 0.919 | 39.529 | 0.000 | 3.944 |
| | Q4 | | 0.936 | 61.810 | 0.000 | 5.163 |
| F2. Interactive dialog and peer collaboration | Q5 | 0.915 | 0.912 | 43.344 | 0.000 | 3.182 |
| | Q6 | | 0.849 | 30.449 | 0.000 | 2.433 |
| | Q7 | | 0.794 | 15.289 | 0.000 | 2.061 |
| | Q8 | | 0.857 | 23.806 | 0.000 | 2.397 |
| F3. Learning-oriented assessment | Q9 | 0.921 | 0.851 | 25.731 | 0.000 | 2.296 |
| | Q10 | | 0.855 | 30.454 | 0.000 | 2.317 |
| | Q11 | | 0.909 | 23.073 | 0.000 | 2.104 |
| | Q12 | | 0.955 | 46.410 | 0.000 | 3.308 |
| F4. Active engagement with subject matter | Q13 | 0.963 | 0.949 | 80.907 | 0.000 | 5.051 |
| | Q14 | | 0.936 | 86.427 | 0.000 | 4.540 |
| | Q15 | | 0.927 | 62.883 | 0.000 | 3.955 |
| F5. Students taking responsibility for their learning | Q16 | 0.927 | 0.891 | 47.288 | 0.000 | 3.344 |
| | Q17 | | 0.905 | 52.549 | 0.000 | 2.337 |
| | Q18 | | 0.865 | 22.558 | 0.000 | 2.443 |
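As a sanity check on the reliability statistics, composite reliability (CR) and average variance extracted (AVE) can be recomputed from the outer loadings with the standard PLS-SEM formulas. The sketch below is illustrative, not part of the original analysis; it uses the rounded F2 loadings from Table 3, and values recomputed from rounded loadings will not exactly reproduce the published figures for every factor.

```python
def composite_reliability(loadings):
    """CR (rho_c) = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    s = sum(loadings)
    error_variance = sum(1 - l ** 2 for l in loadings)  # 1 - lambda^2 per indicator
    return s ** 2 / (s ** 2 + error_variance)

def average_variance_extracted(loadings):
    """AVE = mean of the squared loadings."""
    return sum(l ** 2 for l in loadings) / len(loadings)

# Rounded outer loadings for F2 (interactive dialog and peer collaboration), items Q5-Q8.
f2_loadings = [0.912, 0.849, 0.794, 0.857]

print(round(composite_reliability(f2_loadings), 3))       # 0.915, matching Table 3
print(round(average_variance_extracted(f2_loadings), 3))  # 0.729, matching Table 4
```

A CR above 0.7 and an AVE above 0.5 are the conventional thresholds, which all five factors clear.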
Table 4. Results for convergent validity and discriminant validity.

| Factor | AVE | F1 | F2 | F3 | F4 | F5 |
|---|---|---|---|---|---|---|
| F1. Teacher formal feedback and support | 0.844 | 0.919 | | | | |
| F2. Interactive dialog and peer collaboration | 0.729 | 0.855 | 0.854 | | | |
| F3. Learning-oriented assessment | 0.745 | 0.848 | 0.850 | 0.863 | | |
| F4. Active engagement with subject matter | 0.896 | 0.687 | 0.716 | 0.743 | 0.947 | |
| F5. Students taking responsibility for their learning | 0.809 | 0.732 | 0.692 | 0.773 | 0.723 | 0.899 |
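The diagonal entries of the discriminant-validity matrix are the square roots of the AVEs (Fornell–Larcker criterion): each construct should correlate more strongly with its own indicators than with any other construct. A minimal Python check under that assumption, using the AVEs and off-diagonal correlations from Table 4:

```python
import math

# AVE per factor, from Table 4.
ave = {"F1": 0.844, "F2": 0.729, "F3": 0.745, "F4": 0.896, "F5": 0.809}

# Square roots of the AVEs reproduce the table's diagonal entries.
diagonal = {f: round(math.sqrt(v), 3) for f, v in ave.items()}
print(diagonal)  # {'F1': 0.919, 'F2': 0.854, 'F3': 0.863, 'F4': 0.947, 'F5': 0.899}

# Largest inter-construct correlation involving each factor (off-diagonals of Table 4).
max_corr = {"F1": 0.855, "F2": 0.855, "F3": 0.850, "F4": 0.743, "F5": 0.773}
for f in ave:
    verdict = ">" if diagonal[f] > max_corr[f] else "<="
    print(f, diagonal[f], verdict, max_corr[f])
```

Note that under these rounded values F2’s diagonal (0.854) is essentially equal to its correlation with F1 (0.855), so discriminant validity for that pair is borderline.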
Table 5. Summary of hypothesis testing.

| Hypothesis | Standardized Coefficient | Remark |
|---|---|---|
| H1: F1→F4 | 0.077 | Not supported |
| H2: F2→F4 | 0.267 * | Supported |
| H3: F3→F4 | 0.451 *** | Supported |
| H4: F4→F5 | 0.723 *** | Supported |
| Indirect/mediation effects | | |
| F1→F4→F5 | 0.055 | No mediation |
| F2→F4→F5 | 0.193 * | Partial mediation |
| F3→F4→F5 | 0.326 *** | Partial mediation |

*** p < 0.001, * p < 0.05.
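The indirect effects in Table 5 follow the standard product-of-coefficients approach: each mediated effect is the direct path into the mediator (F1/F2/F3 → F4) multiplied by the path from the mediator to the outcome (F4 → F5 = 0.723). The sketch below is an illustration of that arithmetic, not a re-estimation; the F1 product differs from the table’s 0.055 only by rounding.

```python
# Direct paths into the mediator F4 (active engagement), from Table 5.
paths_to_f4 = {"F1": 0.077, "F2": 0.267, "F3": 0.451}
f4_to_f5 = 0.723  # path from mediator to outcome (H4)

# Indirect effect = a * b (product of coefficients).
indirect = {f: round(a * f4_to_f5, 3) for f, a in paths_to_f4.items()}
print(indirect)  # {'F1': 0.056, 'F2': 0.193, 'F3': 0.326}
```

Significance of such products is typically assessed by bootstrapping the a·b distribution, as PLS-SEM software does, rather than from the point estimates alone.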
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

Wang, B.; Peng, Y.; Cao, Z. How Chinese Undergraduate Students’ Perceptions of Assessment for Learning Influence Their Responsibility for First-Year Mathematics Courses. Mathematics 2024, 12, 274. https://doi.org/10.3390/math12020274

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
