Article

Understanding Fourth-Grade Student Achievement Using Process Data from Student’s Web-Based/Online Math Homework Exercises

1 Russian School of Mathematics, Newton, MA 02459, USA
2 School of Education and Human Development, Boston College, Chestnut Hill, MA 02467, USA
* Author to whom correspondence should be addressed.
Educ. Sci. 2025, 15(6), 753; https://doi.org/10.3390/educsci15060753
Submission received: 22 May 2025 / Revised: 10 June 2025 / Accepted: 13 June 2025 / Published: 14 June 2025

Abstract

Understanding how students’ online homework behaviors relate to their academic success is increasingly important, especially in elementary education where such research is still emerging. In this study, we examined three years of online homework data from fourth-grade students enrolled in an after-school math program. Our goal was to see whether certain behaviors—like how soon students started their homework, how many times they tried to solve problems, or whether they uploaded their written work—could help explain differences in homework completion and test performance. We used multiple regression analyses and found that some habits, such as beginning homework soon after class and regularly attending lessons, were consistently linked to better homework scores across all curriculum levels. Test performance, however, was harder to predict and showed fewer consistent patterns. These findings suggest that teaching and encouraging specific online study behaviors may help support younger students’ academic growth in digital learning environments.

1. Introduction

Studying student learning behaviors related to homework performance is an important focus in educational research and practice (Yeary, 1978; Epstein, 1983; Cooper et al., 2006). Understanding these behaviors offers insight into academic achievement (Zimmerman & Kitsantas, 2005) and supports the development of effective teaching strategies (Rosário et al., 2015; Roschelle et al., 2016). Prior research on the process of completing homework has examined various aspects of student learning behaviors, including the amount of homework completed, time spent on assignments (Ilina et al., 2023; Ilina et al., 2024), time management (Valle et al., 2016), frequency of attempts, effort, and homework completion rates (Fan et al., 2017)—all in relation to their impact on homework performance and academic outcomes. Much of this research has aimed to improve instructional efficiency (Fateen & Mine, 2021; Algozzine et al., 2010) and identify students at risk of academic failure (Meylani et al., 2014; Balfanz et al., 2007).
Our experience as practitioners has shown that strong homework habits contribute to a wide range of positive academic outcomes. This perspective is supported by research highlighting the role of homework in promoting academic motivation (Bempechat, 2004) and the association between homework scores, time spent on assignments, and final grades (Fan et al., 2017; Alhassan et al., 2020). Additionally, studies suggest that homework policies and practices at the elementary level can shape the development of academic skills and learning behaviors needed for success in secondary school (Epstein, 1983). Homework can also support the retention of factual knowledge, enhance information processing, and foster self-discipline (Ramdass & Zimmerman, 2011; Wu et al., 2023).
In addition to a long tradition of linking survey and observational data on student behaviors to academic outcomes—such as standardized test scores and teacher-developed assessments—recent research has increasingly used technology to collect and analyze large volumes of online “process data” to better understand student behavior and performance (Mogessie et al., 2015; Feng & Roschelle, 2016; Namoun & Alshanqiti, 2020). Process data refers to information logged with timestamps while users interact with a computer; in this study, it refers specifically to data collected while students worked on homework using a computer (He & von Davier, 2016). When such data are analyzed in educational settings, the practice is commonly known as educational data mining (Alhassan et al., 2020).
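To make the idea of process data concrete, a hypothetical log of the kind described here might look like the following Python sketch; the field names and values are illustrative assumptions, not the RSM logging schema.

```python
from datetime import datetime

# One record per logged interaction; behavioral indicators are later derived
# by aggregating these events per student. Field names are assumptions for
# illustration only, not the RSM schema.
log_events = [
    {"student_id": "S001", "assignment": "HW-12", "problem": 3,
     "action": "answer_submitted", "correct": False,
     "timestamp": datetime(2023, 2, 14, 16, 42, 10)},
    {"student_id": "S001", "assignment": "HW-12", "problem": 3,
     "action": "answer_submitted", "correct": True,
     "timestamp": datetime(2023, 2, 14, 16, 44, 55)},
    {"student_id": "S001", "assignment": "HW-12", "problem": None,
     "action": "picture_uploaded", "correct": None,
     "timestamp": datetime(2023, 2, 14, 17, 1, 3)},
]
```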
Richards-Babb et al. (2011) found that online homework completion was positively associated with learning achievement among college students. At the university level, Kabakchieva (2013) successfully predicted academic outcomes using data mining techniques applied to students’ online activity data. In contrast, Feng and Roschelle (2016) developed metrics from seventh-grade process data to predict standardized test scores. They also tracked online homework behaviors among fifth-grade students, showing correlations with academic achievement (Shechtman et al., 2019). More recently, Wu et al. (2023) analyzed homework log records from 66 fourth-grade students and reported that “online homework completion exhibited a significant effect on learning achievement.” Collectively, these studies support the aims of the current research by demonstrating the value of online process data in understanding student homework behaviors and their relationship to academic success.

2. Literature Review

A growing body of research highlights the value of using process data from digital learning environments to better understand student behavior and achievement. This literature review synthesizes prior empirical and theoretical work relevant to the process variables analyzed in this study. These include indicators of behavioral engagement such as attendance, time of homework initiation, number of problem attempts, assignment revisits, time spent on homework, and evidence of problem-solving through uploaded written work. Each variable is conceptually grounded in existing research on student learning behaviors and educational outcomes.

2.1. Attendance and Instructional Engagement

Regular class attendance has been consistently linked to positive academic outcomes. Balfanz et al. (2007) demonstrated that high rates of attendance are a strong predictor of student success and school completion, particularly in urban school settings. In hybrid or after-school programs, attendance further enables students to benefit from ongoing feedback and instructional scaffolding (Rosário et al., 2015). In such contexts, missing a class can disrupt learning continuity and reduce the effectiveness of follow-up assignments, such as homework.

2.2. Timing of Homework Initiation

The timing of a student’s first homework engagement after instruction is often associated with self-regulation and effective time management. Zimmerman and Kitsantas (2005) emphasized that students who begin assignments promptly are more likely to employ proactive learning strategies. Valle et al. (2016) also found that early engagement helps students retain instructional content and manage academic responsibilities more effectively, particularly in elementary school populations where cognitive load and memory decay may be more pronounced between lessons.

2.3. Problem Attempts and Persistence

The total number of attempts made when solving homework problems serves as a behavioral indicator of persistence. While repeated attempts can signify perseverance, there is also evidence of diminishing returns. Bezirhan et al. (2020) modeled item revisit and response behaviors, finding that performance gains plateau beyond a certain threshold of repeated effort. Similarly, Wu et al. (2023) and Shechtman et al. (2019) showed that moderate repetition correlates positively with performance, but excessive trial-and-error may reflect confusion rather than mastery.

2.4. Revisits to Homework or Problems

Revisiting previously attempted or unsolved problems is another marker of engagement and metacognitive awareness. In studies of student behavior in digital learning systems, revisits often indicate reflection or delayed help-seeking (Feng & Roschelle, 2016). However, the value of revisits depends on context. As Namoun and Alshanqiti (2020) noted in their review of learning analytics, revisits contribute to learning gains when they occur in conjunction with feedback, scaffolding, or social interaction. Isolated revisits without resolution may indicate less productive patterns of behavior.

2.5. Time Spent on Homework

Time spent on task is one of the most commonly used variables in studies of homework efficacy. Cooper et al. (2006) concluded that moderate time investments in homework generally correlate with higher academic achievement, especially in mathematics and science. However, Fan et al. (2017) observed that the benefits of increased time plateau and may even reverse at higher thresholds. Thus, time spent must be interpreted in conjunction with measures of efficiency and performance quality.

2.6. Uploading Written Work as Evidence of Reasoning

In technology-enhanced instruction, uploading handwritten work can serve as a form of cognitive transparency. Shechtman et al. (2019) found that when students submit both final answers and their solution steps, teachers are better equipped to provide meaningful feedback. This dual submission format enables instructors to assess both the outcome and the process of learning. While this behavior has not been widely studied in elementary contexts, it aligns with formative assessment principles and offers insight into students’ conceptual understanding.

2.7. Summary

The variables explored in the present study each correspond to established educational constructs—such as self-regulation, persistence, engagement, and help-seeking—that influence student outcomes. Importantly, the literature underscores that these behaviors often exhibit nonlinear relationships with performance and must be understood in context. By analyzing large-scale process data from an online homework platform, this study builds on prior findings and offers a more granular, behaviorally grounded perspective on fourth-grade student achievement in blended learning environments.

3. Context

Our research team is a collaboration among Russian School of Mathematics (RSM) teachers, RSM high school students engaged in a multi-year project-based learning curriculum, and a university faculty advisor. RSM is an after-school program that provides mathematics instruction for students in grades K–12 (Russian School of Mathematics, 2020). The instructional philosophy of RSM is rooted in Vygotsky’s theory of child development and emphasizes the social construction of knowledge through collaboration (Vygotsky, 1926/1997). This approach aims to cultivate students’ problem-solving skills through peer-to-peer and student–teacher interactions.
RSM began as a traditional in-person school, but we later developed an online learning platform that includes a Homework Online component. This system allows students to submit homework electronically and receive immediate feedback. Teachers review assignments at the beginning of each lesson and assign new online homework at the end. The platform is used for all classes beginning in fourth grade and supports a blended learning model that integrates face-to-face and online instruction.
In this study, we focus on identifying which, if any, of our extensive process data variables can help predict or explain students’ homework achievement and end-of-semester summative exam scores. These outcomes are important to study because they inform promotion decisions and student placement for the following academic year. Our research questions are the following:
RQ1: Which, if any, of the process data indicators/variables are statistically significant predictors of homework achievement for grade four students?
RQ1a: Do the relationships among the predictors and homework stay consistent over three years of replicated data?
RQ1b: Do the relationships among the predictors and homework stay consistent over three years of replicated data for the three levels of student placement designed by RSM?
RQ2: Which, if any, of the process data indicators/variables are statistically significant predictors of summative test scores for grade four students?
RQ2a: Do the relationships among the predictors and test scores stay consistent over three years of replicated data?
RQ2b: Do the relationships among the predictors and test scores stay consistent over three years of replicated data for the three levels of student placement designed by RSM?

4. Method

4.1. Sample

Our sample includes fourth-grade RSM Online students from the 2020–2021, 2021–2022, and 2022–2023 academic years. We selected fourth grade for four reasons: (a) it is the youngest grade at RSM that received online homework assignments during these years; (b) homework assignments in grade four are relatively standardized across all RSM locations, unlike upper grades where teachers have more flexibility; (c) the fourth-grade curriculum remained unchanged over the three years; and (d) few studies have examined elementary school mathematics achievement across three consecutive years using online learning behavior data.
RSM students are placed into one of three instructional levels, ordered by increasing difficulty: accelerated, advanced, or honors. At enrollment, parents meet with a principal or an experienced teacher, who conducts an interview to assess the child’s mathematics knowledge. Based on this evaluation, the student is placed into a class aligned with their skill level. Continuing students are placed according to the teacher’s assessment of their performance over the prior year, including homework and summative exam results. Typically, the accelerated level is intended for new students who are adjusting to RSM’s curriculum, expectations, and homework practices. The advanced level is for students with at least one year of experience, consistent study habits, and independence in coursework. Honors is designed for students who require additional academic challenge.
The ages in our sample ranged from 9 to 12, with a mean age of approximately 10.2 years. The students are mostly from the United States. Table 1 shows the breakdown of students participating in the homework and test score analyses by curriculum level and girl/boy identification.

4.2. Data

Students complete weekly online homework assignments and take two summative assessments known as “principal tests” (PTs). These assessments are used throughout the year to monitor student progress and to help determine curriculum level placement for the following year. Developed by the RSM Training and Curriculum departments, each PT assesses students on foundational mathematical concepts expected to be mastered by the time of testing. The test consists of 10 core tasks and takes approximately 30 min to complete. While the three curriculum levels receive different versions of the test, all classes within the same grade and level are administered the same problems. Content validity is reviewed regularly as the curriculum evolves. For this study, we analyzed scores from the second PT, which is administered near the end of the academic year.
We also used an average homework performance metric (Avg_HW), calculated as the average percentage of problems correctly solved across all assignments. Each homework set typically includes 12 to 14 problems that mirror classroom problem types and are designed to take about an hour to complete. Items are scored as correct or incorrect, with partial credit given for multiple attempts when applicable. These assignments are developed by the RSM Curriculum Department and are standardized across students at the same curriculum level. In addition to submitting their answers, students are required to show their work on paper and upload it to the online homework portal.
The RSM online homework portal offers separate interfaces for students and teachers. Proprietary software captures a wide range of keyboard-based student behaviors, logging most interactions students have with the system while completing their assignments. Although the software does not track mouse clicks or tab/window changes, it records all keystrokes for each attempted problem, along with timestamps and automatic grading (correct or incorrect). The system also includes a badge-based reward system to recognize the number of correct answers. In our experience, badges serve as a meaningful incentive for fourth-grade students to engage more fully with their homework.
Numerous process data variables were available as potential predictors of both homework and PT performance. Common metrics include the average time spent per problem and the average number of attempts per item—frequently used in prior studies of student behavior (Bezirhan et al., 2020). Additional indicators, such as the average percent correct, the mean number of hint requests, and the percentage of assignments submitted on time, have also been shown to predict achievement outcomes (Feng & Roschelle, 2016). These, along with other indicators generated by RSM software, were analyzed using various statistical procedures to identify not only statistically significant predictors, but also behaviors that are actionable—those teachers can encourage and students and parents can reasonably implement. The final list of online behavior variables (Table 2) was used to model the two achievement outcomes presented in Table 3.
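As an illustration of how such indicators can be derived from raw event logs, the following sketch (assuming pandas, a log layout like the one shown in the Introduction, and hypothetical column names) computes three of the Table 2 predictors; it is not the RSM pipeline.

```python
import pandas as pd

def derive_features(events: pd.DataFrame, lesson_end: pd.Series) -> pd.DataFrame:
    """events: one row per logged action; lesson_end: assignment id -> lesson end time."""
    submits = events[events["action"] == "answer_submitted"]

    # problem_attempts: total answer submissions divided by the number of
    # distinct problems the student attempted across all assignments.
    attempts = submits.groupby("student_id").size()
    n_problems = (submits.drop_duplicates(["student_id", "assignment", "problem"])
                         .groupby("student_id").size())
    problem_attempts = attempts / n_problems

    # days_first_attempt: days between lesson end and the first submission,
    # averaged over each student's assignments.
    first = submits.groupby(["student_id", "assignment"], as_index=False)["timestamp"].min()
    first["delay_days"] = ((first["timestamp"] - first["assignment"].map(lesson_end))
                           .dt.total_seconds() / 86400)
    days_first_attempt = first.groupby("student_id")["delay_days"].mean()

    # picture_upload: uploads divided by the number of attempted assignments.
    uploads = (events[events["action"] == "picture_uploaded"]
               .groupby("student_id").size())
    attempted = submits.groupby("student_id")["assignment"].nunique()
    picture_upload = (uploads / attempted).reindex(attempted.index).fillna(0)

    return pd.DataFrame({"problem_attempts": problem_attempts,
                         "days_first_attempt": days_first_attempt,
                         "picture_upload": picture_upload})
```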

4.3. Data Cleaning and Outliers

During our initial data review, several student records were identified as atypical and excluded from the analysis. For instance, records were removed if students attended fewer than 10 classes or switched curriculum levels during the academic year, as such data were deemed unrepresentative and unreliable. Additional exclusions were made for cases with implausible behavior values that could not be reasonably corrected (e.g., a record showing that a student took one week to complete a single problem). These qualitative exclusion decisions were subsequently supported by regression diagnostics (Belsley et al., 1980). Furthermore, after evaluating the distributional normality of the data, we applied various transformations to both predictor and outcome variables.
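The following sketch illustrates, under assumed column names, the kind of exclusion rules and normalizing transformations described above; the specific thresholds and the arcsine variant shown are assumptions, not the authors' code.

```python
import numpy as np
import pandas as pd

def clean_and_transform(df: pd.DataFrame) -> pd.DataFrame:
    # Exclusions described above: very low attendance and mid-year level switches.
    df = df[(df["classes_attended"] >= 10) & (~df["switched_level"])].copy()

    # Drop records with implausible values, e.g., more than a week on one problem.
    df = df[df["max_seconds_on_one_problem"] <= 7 * 24 * 3600].copy()

    # Normalizing transformations (see Table 2 and Table 3): square roots for
    # count/time variables, an arcsine-type transform for proportion variables.
    for col in ["problem_attempts", "days_first_attempt", "hw_revisits",
                "unsolved_problem_revisits", "time_spent_on_hw"]:
        df[col] = np.sqrt(df[col])
    for col in ["attendance", "avg_hw", "pt_score"]:
        # The arcsine-square-root variant is shown here; the paper reports
        # "arcsine" without specifying the exact form, so this is an assumption.
        df[col] = np.arcsin(np.sqrt(df[col].clip(0, 1)))
    return df
```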

4.4. Statistical Models

Unlike predictive modeling—which aims to maximize the explained variance in an outcome—our explanatory modeling approach prioritized the selection of process data predictors specifically linked to homework habits that can be taught and reinforced. While various statistical methods have been applied in process behavior studies, including artificial neural networks, K-nearest neighbor, random forests, and naive Bayes (Meylani et al., 2014; Namoun & Alshanqiti, 2020), we employed ordinary least squares (OLS) regression to conduct a series of simple and multiple regressions for each curriculum level within each academic year.
The use of ordinary least squares (OLS) regression was consistent with the conditions required under the Gauss–Markov theorem (Neter & Wasserman, 1974). In checking those conditions, a few predictor variables required the addition of a squared term because of their nonlinear relationship to the outcomes. In addition, each OLS model was tested for multicollinearity. However, because the selection of predictors was theory-driven rather than based on statistical optimization, we applied the same set of predictors across all models, regardless of curriculum level or year. This approach enabled us to evaluate how the same predictors related differently to the two outcome variables. Although the degree of intercorrelation among predictors varied somewhat by year and curriculum level, we retained all specified predictors to preserve consistency and comparability across models. After verifying that the pattern of results was stable across the three academic years within each curriculum level, we aggregated the data by level and reanalyzed the models. Finally, we constructed a comprehensive model using data pooled across all curriculum levels and years.
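A minimal sketch of this modeling setup, using statsmodels and hypothetical column names, is shown below; the formula mirrors the squared terms reported in Tables 4–7, and the VIF loop illustrates one common multicollinearity check rather than the authors' exact diagnostic.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Squared terms follow Tables 4-7 (attendance, problem_attempts,
# unsolved_problem_revisits, and time_spent_on_hw have quadratic components).
HW_FORMULA = (
    "avg_hw ~ attendance + I(attendance ** 2) + hw_revisits"
    " + problem_attempts + I(problem_attempts ** 2) + picture_upload"
    " + unsolved_problem_revisits + I(unsolved_problem_revisits ** 2)"
    " + time_spent_on_hw + I(time_spent_on_hw ** 2) + days_first_attempt"
)

def fit_by_level_and_year(df: pd.DataFrame) -> dict:
    """Fit one OLS model per (curriculum level, academic year) cell."""
    results = {}
    for (level, year), cell in df.groupby(["level", "year"]):
        fit = smf.ols(HW_FORMULA, data=cell).fit()
        # Multicollinearity check: VIF for each non-intercept design column.
        exog = fit.model.exog
        vif = {name: variance_inflation_factor(exog, i)
               for i, name in enumerate(fit.model.exog_names) if name != "Intercept"}
        results[(level, year)] = {"fit": fit, "vif": vif}
    return results
```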

5. Results

Before constructing multiple regression models using all predictors simultaneously, we first examined the individual relationship between each predictor and the two outcomes. This involved calculating correlation coefficients and conducting simple regressions—one predictor at a time. This approach allowed us to determine the general direction of each bivariate relationship, assess the influence of potential outliers, and ensure the presence of practically meaningful associations. For most predictors, the relationship with the outcome was linear, either positive or negative. For two predictors, however, we observed a curvilinear (quadratic) relationship: in one case, the association began positively before shifting to negative; in the other, the pattern reversed, beginning negatively and later turning positive.
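This screening step can be illustrated with the following sketch, assuming a DataFrame `df` of cleaned predictors and outcomes; the conventional 0.05 criterion for the squared term is an assumption, since the paper does not state the exact rule used.

```python
import statsmodels.formula.api as smf

PREDICTORS = ["attendance", "hw_revisits", "problem_attempts", "picture_upload",
              "unsolved_problem_revisits", "time_spent_on_hw", "days_first_attempt"]

def screen_predictors(df, outcome="avg_hw", alpha=0.05):
    """For each predictor, fit simple linear and quadratic OLS models and flag
    predictors whose squared term is significant (a curvilinear pattern)."""
    flags = {}
    for x in PREDICTORS:
        linear = smf.ols(f"{outcome} ~ {x}", data=df).fit()
        quad = smf.ols(f"{outcome} ~ {x} + I({x} ** 2)", data=df).fit()
        flags[x] = {
            "slope": linear.params[x],                     # direction of the bivariate trend
            "curvilinear": quad.pvalues.iloc[-1] < alpha,  # squared-term significance
        }
    return flags
```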
In Figure 1 the bold red line (representing the quadratic relationship taken across all three years of data) shows the average number of attempts per homework problem starting off in a positive direction for the prediction of the average percent of solved homework problems (a pattern that most teachers would expect), but after a certain number of attempts, the increase in solved problems shows a decline (an unexpected but meaningful finding). It is noteworthy that this pattern holds for the three curriculum levels.
In Figure 2 we see the opposite pattern for the average number of revisits per unsolved homework problem predicting the average percentage of solved homework problems. Overall, most students do not benefit from an additional revisit to a problem. For some students, however, there is a point at which these efforts do succeed. Given the way that unsolved_problem_revisits is calculated, it is not clear at exactly what point revisits begin to have a positive impact, but increased revisits do help some students.
Proceeding now to multiple regressions, the same predictor variables (some with linear terms only, others with both their linear and squared terms) were used to predict the Avg_HW and PT outcome scores. This approach provided the greatest opportunity to assess how the same set of behavioral predictors functioned for the two outcomes for the three curriculum levels within and across the three years.
Table 4, Table 5 and Table 6 present the multiple regression results for the Avg_HW outcome for the accelerated, advanced, and honors curriculum levels (respectively) for each of the three years. Table 7 contains the regression results for the combined three years for each curriculum level. Table 8, Table 9 and Table 10 present the multiple regression results for the PT outcome for the accelerated, advanced, and honors curriculum levels (respectively) for each of the three years. Table 11 contains the regression results for the combined three years for each curriculum level.
Table 4, Table 5 and Table 6 reveal three notable patterns. First, the linear regression coefficients within each curriculum level are highly consistent across the three academic years. For example, the coefficients for hw_revisits are consistently negative for the accelerated and advanced levels, but consistently positive for the honors level across all years. This consistency supports the decision to aggregate data over time. Second, several predictors—such as picture_upload—demonstrate strong and significant effects across all levels and years, with consistently positive coefficients. Third, multiple quadratic predictors (identified by the term “squared” in their variable names and illustrated in Figure 1 and Figure 2) were statistically significant across years and levels. These quadratic effects offer important educational insights, which are discussed in the sections that follow.
Table 7 presents the results for (a) the three years of aggregated data for the curriculum levels, and (b) the aggregation of all years and levels (the right-most column in the table). An inspection of the “b” coefficients and “p” values down the columns of the three levels and then for the combined “all levels, all years” column reveals a remarkably consistent pattern of meaningful predictors of homework achievement. That is, across three separate years of data at three different curriculum levels, homework achievement is related to: more attendance (b = 0.169, p < 0.001), fewer hw_revisits (b = −0.197, p < 0.001), more problem_attempts (b = 0.279, p < 0.001) up to a point beyond which this effort no longer helps (b = −0.08, p < 0.001), more picture_upload (b = 0.41, p < 0.001), fewer unsolved_problem_revisits (b = −0.09, p < 0.001) up to a point where more are beneficial (b = 0.05, p < 0.001), and fewer days_first_attempt (b = −0.176, p < 0.001).
Second, the R-squared values are strong across all curriculum levels over the three academic years. The small differences between R-squared and Adjusted R-squared values suggest that the models consistently explain stable and statistically meaningful variance.
While R-squared represents the proportion of variance explained by the model, adjusted R-squared accounts for the number of predictors and sample size, offering a more conservative estimate of model fit. The difference between R-squared and adjusted R-squared reflects the degree of shrinkage due to potential overfitting, particularly in models based on smaller samples. According to guidelines from J. Cohen et al. (2013) and Yin and Fan (2001), differences less than 0.01 are considered very small, values between 0.01 and 0.03 are acceptable, differences greater than 0.05 may indicate concern, and values exceeding 0.10 suggest a major concern. In the tables above (and below), the only substantial differences occur in the honors-level models, which is attributable to the smaller sample sizes observed in those groups across the study years.
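As a worked illustration of this guideline, applying the standard adjusted R-squared formula to the honors 2020–2021 homework model reported in Table 6 reproduces the published values and shows why the small honors samples raise concern:

```python
def adjusted_r2(r2: float, n: int, k: int) -> float:
    """Standard adjusted R-squared for n observations and k predictors."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# Honors-level 2020-2021 homework model from Table 6: R^2 = 0.467, n = 53,
# k = 11 predictors (linear plus squared terms).
r2, n, k = 0.467, 53, 11
shrinkage = r2 - adjusted_r2(r2, n, k)
print(f"adjusted R^2 = {adjusted_r2(r2, n, k):.3f}, shrinkage = {shrinkage:.3f}")
# Prints adjusted R^2 = 0.324, shrinkage = 0.143 -- above the 0.10 "major
# concern" threshold, consistent with the small honors samples noted above.
```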
Turning to the PT outcomes presented in Table 8, Table 9 and Table 10, two key patterns emerge. First, no significant quadratic predictors were identified at any level across the three years. Second, the linear regression coefficients across years are less consistent than they were for predicting Avg_HW. That is, the behavioral predictors that reliably explained Avg_HW outcomes did not perform as well in predicting PT scores. For example, at the accelerated level (Table 8), only picture_upload was significant across all three years. At the advanced level (Table 9), only unsolved_problem_revisits met this criterion. At the honors level (Table 10), no predictors were consistently significant. Additionally, unlike the patterns in Table 4, Table 5, Table 6 and Table 7, hw_revisits did not significantly predict PT scores in any year or at any level. Notably, more behavioral predictors were statistically significant and consistent in the final two academic years than in the initial 2020–2021 period.
Table 8. Principal Test regression results for accelerated only—all three years separately.

All Three Years: Grade 4, Accelerated Level. Outcome: Principal Test (PT) score. Predictors: all predictors.

| Model fit              | 2020–2021      | 2021–2022     | 2022–2023     |
|------------------------|----------------|---------------|---------------|
| R-squared              | 0.397          | 0.267         | 0.205         |
| F (p)                  | 12.5 (<0.01) ⁹ | 8.71 (<0.01)  | 7.35 (<0.01)  |
| Adjusted R-squared     | 0.365          | 0.250         | 0.186         |
| Root Mean Square Error | 0.696          | 0.772         | 0.709         |
| Sample size (n)        | 141            | 303           | 293           |

| Predictor                 | b (2020–2021) | p     | b (2021–2022) | p     | b (2022–2023) | p     |
|---------------------------|---------------|-------|---------------|-------|---------------|-------|
| Intercept                 | −0.166        | 0.030 | −0.450        | 0.000 | −0.403        | 0.000 |
| attendance                | 0.069         | 0.385 | 0.294         | 0.000 | −0.033        | 0.518 |
| hw_revisits               | 0.019         | 0.842 | −0.078        | 0.219 | −0.064        | 0.367 |
| problem_attempts          | −0.069        | 0.445 | −0.310        | 0.000 | −0.103        | 0.113 |
| picture_upload            | 0.238         | 0.008 | 0.134         | 0.025 | 0.239         | 0.000 |
| unsolved_problem_revisits | −0.223        | 0.026 | −0.137        | 0.057 | −0.224        | 0.000 |
| time_spent_on_hw          | −0.538        | 0.000 | 0.026         | 0.683 | −0.029        | 0.663 |
| days_first_attempt        | −0.131        | 0.134 | −0.007        | 0.909 | −0.134        | 0.032 |
Table 9. Principal Test regression results for advanced only—all three years separately.

All Three Years: Grade 4, Advanced Level. Outcome: Principal Test (PT) score. Predictors: all predictors.

| Model fit              | 2020–2021       | 2021–2022    | 2022–2023    |
|------------------------|-----------------|--------------|--------------|
| R-squared              | 0.249           | 0.283        | 0.250        |
| F (p)                  | 4.97 (<0.01) ¹⁰ | 8.96 (<0.01) | 9.37 (<0.01) |
| Adjusted R-squared     | 0.199           | 0.263        | 0.235        |
| Root Mean Square Error | 0.607           | 0.583        | 0.684        |
| Sample size (n)        | 113             | 261          | 357          |

| Predictor                 | b (2020–2021) | p     | b (2021–2022) | p     | b (2022–2023) | p     |
|---------------------------|---------------|-------|---------------|-------|---------------|-------|
| Intercept                 | 0.035         | 0.654 | 0.247         | 0.000 | 0.133         | 0.003 |
| attendance                | 0.086         | 0.307 | 0.116         | 0.027 | 0.289         | 0.000 |
| hw_revisits               | 0.056         | 0.597 | 0.131         | 0.052 | 0.073         | 0.238 |
| problem_attempts          | −0.028        | 0.823 | −0.189        | 0.006 | −0.143        | 0.009 |
| picture_upload            | 0.067         | 0.402 | 0.003         | 0.945 | 0.047         | 0.309 |
| unsolved_problem_revisits | −0.435        | 0.000 | −0.292        | 0.000 | −0.267        | 0.000 |
| time_spent_on_hw          | −0.020        | 0.841 | −0.297        | 0.000 | −0.199        | 0.000 |
| days_first_attempt        | −0.2152       | 0.019 | −0.0300       | 0.593 | −0.0632       | 0.256 |
Table 10. Principal Test regression results for honors only—all three years separately.

All Three Years: Grade 4, Honors Level. Outcome: Principal Test (PT) score. Predictors: all predictors.

| Model fit              | 2020–2021       | 2021–2022    | 2022–2023    |
|------------------------|-----------------|--------------|--------------|
| R-squared              | 0.278           | 0.273        | 0.195        |
| F (p)                  | 2.31 (<0.05) ¹¹ | 5.11 (<0.05) | 2.88 (<0.05) |
| Adjusted R-squared     | 0.158           | 0.243        | 0.155        |
| Root Mean Square Error | 0.498           | 0.561        | 0.579        |
| Sample size (n)        | 50              | 176          | 147          |

| Predictor                 | b (2020–2021) | p     | b (2021–2022) | p     | b (2022–2023) | p     |
|---------------------------|---------------|-------|---------------|-------|---------------|-------|
| Intercept                 | 0.354         | 0.019 | 0.336         | 0.000 | 0.487         | 0.000 |
| attendance                | 0.162         | 0.164 | −0.031        | 0.615 | −0.013        | 0.853 |
| hw_revisits               | 0.044         | 0.831 | −0.131        | 0.119 | 0.127         | 0.190 |
| problem_attempts          | −0.120        | 0.521 | −0.170        | 0.055 | −0.024        | 0.810 |
| picture_upload            | 0.0140        | 0.901 | 0.142         | 0.015 | 0.192         | 0.008 |
| unsolved_problem_revisits | −0.201        | 0.240 | −0.164        | 0.028 | −0.191        | 0.017 |
| time_spent_on_hw          | −0.279        | 0.069 | −0.107        | 0.173 | −0.342        | 0.001 |
| days_first_attempt        | −0.359        | 0.020 | −0.092        | 0.153 | −0.051        | 0.520 |
Table 11 presents the results from (a) the aggregated data across three academic years for each curriculum level and (b) the fully aggregated dataset combining all years and levels (shown in the right-most column). Several noteworthy patterns emerge. First, many predictor–outcome relationships that were non-significant in individual years (see Table 8, Table 9 and Table 10) became statistically significant in the aggregated analyses. For instance, the variable picture_upload was not significant for advanced students in any single year (Table 9), but its aggregated coefficient is statistically significant in Table 11 (b = 0.076, p = 0.017). This shift reflects a general principle of statistical modeling: larger sample sizes increase the power to detect significant effects (J. C. Cohen, 1977). This finding carries important—though potentially cautionary—implications for using single-year models to explain current outcomes or forecast future performance.
Table 11. Principal Test regression results for all years combined.

Outcome: Principal Test (PT) score. Predictors: all predictors.

| Model fit              | 4_1 Accelerated  | 4_2 Advanced     | 4_3 Honors       | All levels, all years |
|------------------------|------------------|------------------|------------------|-----------------------|
| R-squared              | 0.229            | 0.217            | 0.224            | 0.188                 |
| F (p)                  | 29.03 (<0.05) ¹² | 28.63 (<0.05) ¹³ | 14.89 (<0.05) ¹⁴ | 59.69 (2 × 10⁻⁷⁷)     |
| Adjusted R-squared     | 0.219            | 0.207            | 0.205            | 0.184                 |
| Root Mean Square Error | 0.757            | 0.700            | 0.580            | 0.812                 |
| Sample size (n)        | 737              | 731              | 373              | 1841                  |

| Predictor                 | b (Accel.) | p     | b (Adv.) | p     | b (Honors) | p     | b (All)      | p     |
|---------------------------|------------|-------|----------|-------|------------|-------|--------------|-------|
| Intercept                 | −0.371     | 0.000 | 0.153    | 0.000 | 0.436      | 0.000 | −6.1 × 10⁻¹⁷ | 1.000 |
| attendance                | 0.120      | 0.001 | 0.148    | 0.000 | −0.019     | 0.637 | 0.097        | 0.000 |
| hw_revisits               | −0.061     | 0.153 | 0.072    | 0.098 | −0.013     | 0.825 | 5.5 × 10⁻⁵   | 0.998 |
| problem_attempts          | −0.173     | 0.000 | −0.149   | 0.000 | −0.071     | 0.241 | −0.196       | 0.000 |
| picture_upload            | 0.182      | 0.000 | 0.076    | 0.017 | 0.163      | 0.000 | 0.145        | 0.000 |
| unsolved_problem_revisits | −0.199     | 0.000 | −0.280   | 0.000 | −0.191     | 0.000 | −0.170       | 0.000 |
| time_spent_on_hw          | −0.100     | 0.015 | −0.217   | 0.000 | −0.226     | 0.000 | −0.173       | 0.000 |
| days_first_attempt        | −0.109     | 0.005 | −0.088   | 0.020 | −0.097     | 0.038 | −0.106       | 0.000 |
Second, consistent with the homework results, the direction of the regression coefficients for key predictors remains remarkably stable across the three curriculum levels. For instance, days_first_attempt exhibits a negative and statistically significant association with PT achievement across all levels when data are aggregated over three years. Third, the adjusted R-squared values are closely aligned with the R-squared values, indicating stable and likely replicable model fits.
This consistency justified aggregating all years and levels into the final model reported in the right-most column of Table 11. The results indicate that PT achievement is positively associated with higher attendance (b = 0.096, p < 0.001) and more frequent picture uploads (b = 0.145, p < 0.001), and negatively associated with a greater number of problem attempts (b = −0.196, p < 0.001), more unsolved problem revisits (b = −0.170, p < 0.001), increased time spent on homework (b = −0.173, p < 0.001), and delays in initial homework attempts (days_first_attempt, b = −0.106, p < 0.001).
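For reference, the pooled coefficients can also be written out as a simple prediction function. This merely restates the Table 11 values; the assumption that the predictors are standardized (suggested by the near-zero intercept) is ours, since the text does not state the scaling explicitly.

```python
# Pooled PT model from the right-most column of Table 11. Coefficients apply
# to the transformed (and presumably standardized) predictors -- an assumption
# about scaling, not stated in the paper. The intercept is effectively zero.
TABLE_11_ALL = {
    "attendance":                 0.097,
    "hw_revisits":                5.5e-05,
    "problem_attempts":          -0.196,
    "picture_upload":             0.145,
    "unsolved_problem_revisits": -0.170,
    "time_spent_on_hw":          -0.173,
    "days_first_attempt":        -0.106,
}

def predicted_pt(features: dict) -> float:
    """Linear combination of the Table 11 pooled coefficients."""
    return sum(TABLE_11_ALL[name] * value for name, value in features.items())
```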

6. Discussion

Our multi-year analysis affirms that strengthening homework-related behaviors in elementary students is an essential strategy for fostering both academic achievement and self-regulated learning. While much of the existing literature focuses on secondary and postsecondary populations (e.g., Kabakchieva, 2013; Richards-Babb et al., 2011), our work illustrates how online process data can be used at the elementary level to identify specific digital behaviors associated with student success. These behaviors—such as beginning homework promptly, making repeated solution attempts, and uploading handwritten work—are not only predictive but also teachable. This finding aligns with broader research highlighting the importance of promoting study habits that support academic motivation and achievement from an early age (Zimmerman & Kitsantas, 2005; Bempechat, 2004).
One of the clearest and most actionable findings was the strong and consistent relationship between starting homework soon after the lesson and overall homework performance. Across all curriculum levels and all three academic years, students who engaged with their homework within one to two days consistently outperformed peers who delayed. This echoes prior studies linking early engagement to content retention and successful transfer of learning (Ramdass & Zimmerman, 2011; Rosário et al., 2015), and supports our long-standing instructional observation: when fourth-grade students wait several days to begin their assignments, they often forget key concepts from class. Given that our classes meet once a week, we emphasize to parents that encouraging students to start homework early is not about urgency—it is about recall. Prompt engagement gives students a better chance to work independently while the material is still fresh.
To reinforce this habit, we have begun piloting a digital badge system within our online platform that rewards students for starting their assignments within 48 h of class. This incentive aligns with findings from online learning research showing that gamified feedback systems can positively influence student behavior (Wu et al., 2023). It also reflects our core belief at RSM that student achievement is most sustainable when it emerges from a partnership between teachers, students, and families.
Another key insight emerged from our analysis of the number of problem attempts. We found a curvilinear relationship with homework performance: initially, more attempts were associated with higher scores—a result that resonates with our educational aim to promote persistence. Students are encouraged to reflect on mistakes and try again, a message that has become integral to our classroom culture. However, beyond a certain threshold, additional attempts no longer produced performance gains, a finding also noted in studies of diminishing returns on repeated engagement (Bezirhan et al., 2020). In practice, we have observed that this turning point often coincides with students losing the opportunity for full credit—shifting from meaningful problem-solving to trial-and-error strategies. While we value the growth mindset this system promotes, it also reminds us of the importance of helping students develop productive struggle strategies and knowing when to seek help.
The pattern was reversed, but equally telling, for problem revisits—defined as returning to a problem more than 12 h later. Initially, revisits were negatively correlated with homework scores, suggesting that students rarely made successful progress after prolonged gaps. However, as revisit frequency increased, a positive trend reemerged. This may indicate that students who struggled initially eventually received support—perhaps from parents, peers, or classroom reviews. From our experience, this behavior is particularly common after we discuss homework solutions in class, and students go back to update their answers. We view this as a sign of maturity and persistence. Indeed, our scoring system intentionally allows partial credit for corrected answers, reinforcing a mastery-based approach to learning. But as with problem attempts, revisits must be interpreted within the broader instructional context: not all repeated effort is equally effective, and overreliance on second chances can sometimes undermine careful work on the first attempt.
While many behavioral predictors showed strong associations with homework scores, their relationships to end-of-year summative test scores (the Principal Test, or PT) were more complex and, in some cases, counterintuitive. For example, more problem attempts and longer homework completion times were negatively correlated with PT scores. Unlike homework, which is self-paced and allows for revisions, the PT is time-constrained and emphasizes first-pass accuracy. These results align with research on timed assessments showing that fluency and accuracy are more predictive of test performance than persistence alone (Feng & Roschelle, 2016). In our setting, students who solve homework quickly and with fewer retries often show greater conceptual clarity—an advantage on timed assessments.
Another behavior that consistently predicted both homework and PT outcomes was uploading written work. This small but powerful habit gives teachers access to students’ mathematical reasoning—a critical insight when the digital interface only captures final answers. We have often observed students score perfectly on online homework, only to struggle with similar questions on an in-class quiz. These experiences reinforce the instructional value of written work, both as a diagnostic tool and as a means of encouraging students to articulate their thinking. Prior research supports this view, noting that written explanations deepen understanding and enhance learning outcomes (Rosário et al., 2015; Wu et al., 2023).
Drawing on three years of data, our findings also affirm the value of replication in educational research. While most studies in this domain rely on single-year samples (J. C. Cohen, 1977), our repeated analyses across multiple cohorts and curriculum levels allowed us to identify patterns that are not only statistically significant but also practically reliable. These results suggest that the relationship between digital learning behaviors and academic outcomes is durable, and that targeted behavioral feedback can improve student performance over time.
Ultimately, this study underscores the importance of aligning online learning tools with developmentally appropriate expectations. Elementary students are still learning how to manage time, regulate effort, and interpret feedback. Our process data show that when these skills are modeled and reinforced—by teachers and parents alike—students can flourish in digital learning environments. We believe that our findings, though situated within the Russian School of Mathematics context, can inform practices in a broad range of technology-integrated programs. Supporting the development of productive homework habits not only promotes academic success but also prepares students for lifelong learning.

Author Contributions

Conceptualization, J.B.; methodology, A.M.; formal analysis and investigation, O.I., M.K., P.B., S.I., B.N., R.S. and M.S. (Milind Sharma); data curation, O.I. and M.K.; writing—original draft preparation, review and editing, L.L. and M.S. (Manasi Singhal); visualization, M.K.; supervision, O.I. and L.L.; project administration, S.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The datasets used in this article are not readily available because of the confidential nature of the data, which contain actual student information. Access to the original raw data is restricted to authorized members of the RSM data analytics team to ensure privacy and compliance with data protection policies. Requests for access to the dataset, statistical summaries, or data analyses may be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
RSM: Russian School of Mathematics

Notes

1. The red regression line represents the model for all curricula combined.
2. The highest p-value for any Accelerated level Homework model was 6 × 10⁻¹⁸.
3. See note 2.
4. The highest p-value for any Advanced level Homework model was 5 × 10⁻¹⁰.
5. The highest p-value for any Honors level Homework model was 0.006.
6. See note 2.
7. See note 4.
8. See note 5.
9. The highest p-value for any Accelerated level Homework model was 2 × 10⁻¹².
10. The highest p-value for any Advanced level Homework model was 6 × 10⁻⁵.
11. The highest p-value for any Honors level Homework model was 0.04.
12. See note 9.
13. See note 10.
14. See note 11.

References

  1. Algozzine, B., Wang, C., & Violette, A. S. (2010). Reexamining the relationship between academic achievement and social behavior. Journal of Positive Behavior Interventions, 13(1), 3–16. [Google Scholar] [CrossRef]
  2. Alhassan, A., Zafar, B., & Mueen, A. (2020). Predict students’ academic performance based on their assessment grades and online activity data. International Journal of Advanced Computer Science and Applications (IJACSA), 11(4), 2020. [Google Scholar] [CrossRef]
  3. Balfanz, R., Herzog, L., & Mac Iver, D. J. (2007). Preventing student disengagement and keeping students on the graduation path in urban middle-grades schools: Early identification and effective interventions. Educational Psychologist, 42(4), 223–235. [Google Scholar] [CrossRef]
  4. Belsley, D. A., Kuh, E., & Welsch, R. E. (1980). Regression diagnostics. John Wiley and Sons. [Google Scholar]
  5. Bempechat, J. (2004). The motivational benefits of homework: A social-cognitive perspective. Theory Into Practice, 43(3), 189–196. [Google Scholar] [CrossRef]
  6. Bezirhan, U., von Davier, M., & Grabovsky, I. (2020). Modeling item revisit behavior: The hierarchical speed–accuracy–revisits model. Educational and Psychological Measurement, 81(2), 363–387. [Google Scholar] [CrossRef] [PubMed]
  7. Cohen, J., Cohen, P., West, S. G., & Aiken, L. S. (2013). Applied multiple regression/correlation analysis for the behavioral sciences. Routledge. [Google Scholar]
  8. Cohen, J. C. (1977). Statistical power analysis for the behavioral sciences (Rev. ed.). Academic Press. [Google Scholar]
  9. Cooper, H., Robinson, J. C., & Patall, E. A. (2006). Does homework improve academic achievement? A synthesis of research, 1987–2003. Review of Educational Research, 76(1), 1–62. [Google Scholar] [CrossRef]
  10. Epstein, J. L. (1983). Homework practices, achievements, and behaviors of elementary school students. Center for Research on Elementary and Middle Schools. (ERIC Document Reproduction Service No. ED301322). Available online: https://eric.ed.gov/?id=ED250351 (accessed on 1 June 2025).
  11. Fan, H., Xu, J., Cai, Z., He, J., & Fan, X. (2017). Homework and student’s achievement in math and science: A 30-year meta-analysis, 1986–2015. Educational Research Review, 20, 35–54. [Google Scholar] [CrossRef]
  12. Fateen, M., & Mine, T. (2021). Predicting student performance using teacher observation reports. Available online: https://eric.ed.gov/?id=ED615587 (accessed on 1 June 2025).
  13. Feng, M., & Roschelle, J. (2016, April 25–26). Predicting students’ standardized test scores using online homework. Third (2016) ACM Conference on Learning@Scale, Edinburgh, UK. [Google Scholar] [CrossRef]
  14. He, Q., & von Davier, M. (2016). Analyzing process data from problem-solving items with N-grams: Insights from a computer-based large-scale assessment. In Y. Rosen, S. Ferrara, & M. Mosharraf (Eds.), Handbook of research on technology tools for real-world skill development (pp. 750–777). IGI Global. [Google Scholar] [CrossRef]
  15. Ilina, O., Antonyan, S., Mirny, A., Brodskaia, J., Kosogorova, M., Lepsky, O., Belakurski, P., Iyer, S., Ni, B., Shah, R., Sharma, M., & Ludlow, L. (2024, April 24). Predicting fourth-grade student achievement using process data from student’s web-based/online math homework exercises. Roundtable Presentation at the New England Educational Research Association (NEERO) Annual Meeting, Portsmouth, NH, USA. [Google Scholar]
  16. Ilina, O., Antonyan, S., Mirny, A., Lepsky, O., Brodskaia, J., Balsara, M., Keyhan, K., Mylnikov, A., Quackenbush, A., Tarlie, J., Tyutyunik, A., Venkatesh, R., & Ludlow, L. H. (2023). Fourth grade student mathematics performance in an after-school program before and after COVID-19: Project-based learning for teachers and students as co-researchers. Journal of Teacher Action Research, 10(2). Available online: https://jtar-ojs-shsu.tdl.org/JTAR/article/view/94 (accessed on 1 June 2025).
  17. Kabakchieva, D. (2013). Predicting student performance by using data mining methods for classification. Cybernetics and Information Technologies, 1, 61–72. [Google Scholar] [CrossRef]
  18. Meylani, R., Bitter, G., & Castaneda, R. (2014). Predicting student performance in statewide high stakes tests for middle school mathematics using the results from third party testing instruments. Journal of Education and Learning, 3(3), 135–145. [Google Scholar] [CrossRef]
  19. Mogessie, M., Riccardi, G., & Rochetti, M. (2015, October 21–24). Predicting student’s final exam scores from their course activities. 2015 IEEE Frontiers in Education Conference (FIE) (pp. 1–9), El Paso, TX, USA. [Google Scholar] [CrossRef]
  20. Namoun, A., & Alshanqiti, A. (2020). Predicting student performance using data mining and learning analytics techniques: A systematic literature review. Applied Sciences, 11(1), 237. [Google Scholar] [CrossRef]
  21. Neter, J., & Wasserman, W. (1974). Applied linear statistical models. Irwin. [Google Scholar]
  22. Ramdass, D., & Zimmerman, B. J. (2011). Developing self-regulation skills: The important role of homework. Journal of Advanced Academics, 22(2), 194–218. [Google Scholar] [CrossRef]
  23. Richards-Babb, M., Drelick, J., Henry, Z., & Robertson-Honecker, J. (2011). Online homework, help or hindrance? What students think and how they perform. Journal of College Science Teaching, 40(4), 81–93. [Google Scholar]
  24. Rosário, P., Núñez, J. C., Vallejo, G., Cunha, J., Nunes, T., Suárez, N., Fuentes, S., & Moreira, T. (2015). The effects of teachers’ homework follow-up practices on students’ EFL performance: A randomized-group design. Frontiers in Psychology, 6, 1528. [Google Scholar] [CrossRef]
  25. Roschelle, J., Feng, M., Murphy, R., & Mason, C. (2016). Online mathematics homework increases student achievement. AERA Open, 2(4). [Google Scholar] [CrossRef]
  26. Russian School of Mathematics. (2020). About RSM: The Russian school of mathematics. Available online: https://www.russianschool.com/about (accessed on 1 June 2025).
  27. Shechtman, N., Roschelle, J., Feng, M., & Singleton, C. (2019). An efficacy study of a digital core curriculum for grade 5 mathematics. AERA Open, 5(2). [Google Scholar] [CrossRef]
  28. Valle, A., Regueiro, B., Núñez, J. C., Rodríguez, S., Piñeiro, I., & Rosário, P. (2016). Academic goals, student homework engagement, and academic achievement in elementary school. Frontiers in Psychology, 7, 463. [Google Scholar] [CrossRef]
  29. Vygotsky, L. (1997). Educational psychology. St. Lucie Press. (Original work published 1926). [Google Scholar]
  30. Wu, D., Li, H., Zhu, S., Yang, H. H., Bai, J., Zhao, J., & Yang, K. (2023). Primary students’ online homework completion and learning achievement. Interactive Learning Environments, 32, 4469–4483. [Google Scholar] [CrossRef]
  31. Yeary, E. E. (1978). What about homework? Today’s Education, September–October, 80–82. [Google Scholar]
  32. Yin, P., & Fan, X. (2001). Estimating R2 shrinkage in multiple regression: A comparison of different analytical methods. The Journal of Experimental Education, 69(2), 203–224. [Google Scholar] [CrossRef]
  33. Zimmerman, B. J., & Kitsantas, A. (2005). Homework practices and academic achievement: The mediating role of self-efficacy and perceived responsibility beliefs. Contemporary Educational Psychology, 30, 397–417. [Google Scholar] [CrossRef]
Figure 1. Average attempts per homework problem vs. overall homework scores. ¹ The bold red line represents the quadratic relationship taken across all three years of data.
Figure 2. Average number of revisits per unsolved homework problem vs. overall homework scores. ² The bold red line represents the quadratic relationship taken across all three years of data.
Table 1. Combined number of students, by level, for the 2020/2021, 2021/2022, and 2022/2023 school years.

| Curriculum                                            | 4_1 Accelerated | 4_2 Advanced | 4_3 Honors | Combined |
|-------------------------------------------------------|-----------------|--------------|------------|----------|
| Number of students, target—PT                         | 737             | 731          | 373        | 1841     |
| Boys                                                  | 361             | 367          | 222        | 950      |
| Girls                                                 | 376             | 364          | 151        | 891      |
| Number of students, target—% of problems solved in HW | 1084            | 799          | 406        | 2289     |
| Boys                                                  | 548             | 405          | 245        | 1198     |
| Girls                                                 | 536             | 394          | 161        | 1091     |
Table 2. Predictors.

| Predictor | Description | Data Transformation |
|---|---|---|
| problem_attempts | Total number of times an answer was submitted to any of a student’s HW problems, divided by the total number of problems in all HW assignments. Indicator of persistence. | Square root |
| days_first_attempt | Average number of days between lesson end time and the first time the student submits an answer to any problem in the HW assignment. Indicator of good homework habits. | Square root |
| picture_upload | The number of times a student took a picture of their homework solution and uploaded it to RSM’s website, divided by the total number of attempted homework assignments. Indicator of good homework habits. | No transformation applied |
| time_spent_on_hw | The average time spent solving homework assignments. When time is calculated, periods of inactivity (no answer submitted) of 15+ minutes are excluded. Indicator of persistence. | Square root |
| unsolved_problem_revisits | If a problem was not solved during the first attempt—which can be several answer submissions in a row—this is how many times a student returned to it on average. A revisit is when a student worked on (submitted an answer to) another problem before returning to that one. Indicator of consistency. | Square root |
| hw_revisits | How many times a student comes back to their homework assignment on average. A revisit is a return after a break of 12+ hours. Indicator of consistency. | Square root |
| attendance | The percentage of lessons attended, excluding the first lesson. Some students join later or request a different weekday at the beginning of the year due to schedule conflicts, so we excluded the first lesson. Indicator of regular attendance. | Arcsine |
Table 3. Outcome variables.

| Outcome | Description | Data Transformation |
|---|---|---|
| PT | The score received on the Principal Test, a 30 min test given in the second half of the school year to all RSM students. All students in the same curriculum level receive identical questions. | Arcsine |
| Avg_HW | Average percentage of problems solved per homework assignment. | Arcsine |
Table 4. Homework regression results for accelerated only—all three years separately.

All Three Years: Grade 4, Accelerated Level. Outcome: Homework (HW) score. Predictors: all predictors.

| Model fit              | 2020–2021       | 2021–2022     | 2022–2023     |
|------------------------|-----------------|---------------|---------------|
| R-squared              | 0.382           | 0.438         | 0.389         |
| F (p)                  | 10.86 (<0.01) ³ | 22.53 (<0.01) | 21.74 (<0.01) |
| Adjusted R-squared     | 0.352           | 0.422         | 0.374         |
| Root Mean Square Error | 0.660           | 0.546         | 0.598         |
| Sample size (n)        | 241             | 386           | 457           |

| Predictor                         | b (2020–2021) | p     | b (2021–2022) | p     | b (2022–2023) | p     |
|-----------------------------------|---------------|-------|---------------|-------|---------------|-------|
| Intercept                         | −0.178        | 0.030 | −0.139        | 0.027 | −0.069        | 0.233 |
| attendance                        | 0.141         | 0.023 | 0.142         | 0.001 | 0.101         | 0.005 |
| attendance_squared                | −0.120        | 0.002 | −0.065        | 0.031 | −0.005        | 0.818 |
| hw_revisits                       | −0.045        | 0.507 | −0.228        | 0.000 | −0.265        | 0.000 |
| problem_attempts                  | 0.276         | 0.000 | 0.250         | 0.000 | 0.245         | 0.000 |
| problem_attempts_squared          | −0.040        | 0.174 | −0.078        | 0.000 | −0.052        | 0.002 |
| picture_upload                    | 0.440         | 0.000 | 0.513         | 0.000 | 0.450         | 0.000 |
| unsolved_problem_revisits         | −0.171        | 0.014 | −0.023        | 0.664 | −0.110        | 0.017 |
| unsolved_problem_revisits_squared | 0.057         | 0.049 | 0.033         | 0.340 | 0.057         | 0.029 |
| time_spent_on_hw                  | −0.078        | 0.212 | 0.113         | 0.023 | −0.000        | 0.978 |
| time_spent_on_hw_squared          | 0.057         | 0.106 | 0.014         | 0.653 | −0.081        | 0.005 |
| days_first_attempt                | −0.068        | 0.284 | −0.105        | 0.018 | −0.225        | 0.000 |
Table 5. Homework regression results for advanced only—all three years separately.

All Three Years: Grade 4, Advanced Level. Outcome: Homework (HW) score. Predictors: all predictors.

| Model fit              | 2020–2021      | 2021–2022     | 2022–2023     |
|------------------------|----------------|---------------|---------------|
| R-squared              | 0.477          | 0.388         | 0.523         |
| F (p)                  | 7.41 (<0.01) ⁴ | 13.83 (<0.01) | 31.87 (<0.01) |
| Adjusted R-squared     | 0.420          | 0.364         | 0.509         |
| Root Mean Square Error | 0.482          | 0.606         | 0.486         |
| Sample size (n)        | 113            | 298           | 389           |

| Predictor                         | b (2020–2021) | p     | b (2021–2022) | p     | b (2022–2023) | p     |
|-----------------------------------|---------------|-------|---------------|-------|---------------|-------|
| Intercept                         | 0.361         | 0.009 | 0.086         | 0.247 | 0.123         | 0.036 |
| attendance                        | 0.034         | 0.685 | 0.215         | 0.000 | 0.199         | 0.000 |
| attendance_squared                | 0.052         | 0.546 | 0.050         | 0.189 | −0.025        | 0.354 |
| hw_revisits                       | −0.196        | 0.045 | −0.244        | 0.000 | −0.115        | 0.021 |
| problem_attempts                  | 0.430         | 0.002 | 0.410         | 0.000 | 0.452         | 0.000 |
| problem_attempts_squared          | −0.182        | 0.022 | −0.133        | 0.000 | −0.115        | 0.000 |
| picture_upload                    | 0.400         | 0.000 | 0.357         | 0.000 | 0.434         | 0.000 |
| unsolved_problem_revisits         | −0.338        | 0.003 | −0.173        | 0.003 | −0.159        | 0.000 |
| unsolved_problem_revisits_squared | 0.073         | 0.306 | 0.049         | 0.037 | 0.037         | 0.179 |
| time_spent_on_hw                  | 0.092         | 0.312 | −0.037        | 0.557 | −0.102        | 0.025 |
| time_spent_on_hw_squared          | −0.106        | 0.174 | 0.029         | 0.469 | 0.036         | 0.182 |
| days_first_attempt                | −0.277        | 0.001 | −0.169        | 0.002 | −0.164        | 0.000 |
Table 6. Homework regression results for honors only—all three years separately.

All Three Years: Grade 4, Honors Level. Outcome: Homework (HW) score. Predictors: all predictors.

| Model fit              | 2020–2021      | 2021–2022    | 2022–2023    |
|------------------------|----------------|--------------|--------------|
| R-squared              | 0.467          | 0.355        | 0.327        |
| F (p)                  | 2.81 (<0.01) ⁵ | 8.19 (<0.01) | 5.36 (<0.01) |
| Adjusted R-squared     | 0.324          | 0.317        | 0.274        |
| Root Mean Square Error | 0.243          | 0.618        | 0.536        |
| Sample size (n)        | 53             | 199          | 154          |

| Predictor                         | b (2020–2021) | p     | b (2021–2022) | p     | b (2022–2023) | p     |
|-----------------------------------|---------------|-------|---------------|-------|---------------|-------|
| Intercept                         | 0.416         | 0.032 | 0.241         | 0.006 | 0.339         | 0.001 |
| attendance                        | 0.233         | 0.018 | 0.155         | 0.017 | 0.162         | 0.026 |
| attendance_squared                | −0.048        | 0.539 | 0.045         | 0.123 | −0.008        | 0.875 |
| hw_revisits                       | 0.104         | 0.502 | −0.215        | 0.010 | −0.269        | 0.006 |
| problem_attempts                  | 0.286         | 0.144 | 0.119         | 0.214 | 0.106         | 0.318 |
| problem_attempts_squared          | 0.002         | 0.989 | −0.092        | 0.021 | −0.040        | 0.308 |
| picture_upload                    | 0.250         | 0.009 | 0.387         | 0.000 | 0.378         | 0.000 |
| unsolved_problem_revisits         | −0.437        | 0.005 | −0.015        | 0.846 | −0.070        | 0.437 |
| unsolved_problem_revisits_squared | 0.157         | 0.098 | 0.072         | 0.094 | 0.049         | 0.256 |
| time_spent_on_hw                  | −0.239        | 0.082 | 0.054         | 0.492 | 0.172         | 0.064 |
| time_spent_on_hw_squared          | 0.030         | 0.666 | 0.005         | 0.913 | −0.004        | 0.936 |
| days_first_attempt                | −0.178        | 0.110 | −0.117        | 0.077 | −0.142        | 0.071 |
Table 7. Homework regression results for all years combined.

Outcome: Homework (HW) score. Predictors: all predictors.

| Model fit              | 4_1 Accelerated | 4_2 Advanced    | 4_3 Honors      | All levels, all years |
|------------------------|-----------------|-----------------|-----------------|-----------------------|
| R-squared              | 0.380           | 0.450           | 0.330           | 0.360                 |
| F (p)                  | 59.73 (<0.01) ⁶ | 58.57 (<0.01) ⁷ | 17.63 (<0.01) ⁸ | 116.41 (2 × 10⁻²¹¹)   |
| Adjusted R-squared     | 0.374           | 0.442           | 0.311           | 0.357                 |
| Root Mean Square Error | 0.611           | 0.554           | 0.565           | 0.640                 |
| Sample size (n)        | 1084            | 799             | 406             | 2289                  |

| Predictor                         | b (Accel.) | p     | b (Adv.) | p     | b (Honors) | p     | b (All) | p     |
|-----------------------------------|------------|-------|----------|-------|------------|-------|---------|-------|
| Intercept                         | −10.180    | 0.001 | 0.140    | 0.001 | 0.303      | 0.000 | 0.040   | 0.124 |
| attendance                        | 0.140      | 0.000 | 0.216    | 0.000 | 0.187      | 0.000 | 0.169   | 0.000 |
| attendance_squared                | −0.042     | 0.011 | 0.003    | 0.879 | 0.038      | 0.066 | −0.004  | 0.733 |
| hw_revisits                       | −0.201     | 0.000 | −0.170   | 0.000 | −0.2289    | 0.000 | −0.197  | 0.000 |
| problem_attempts                  | 0.267      | 0.000 | 0.449    | 0.000 | 0.123      | 0.057 | 0.279   | 0.000 |
| problem_attempts_squared          | −0.060     | 0.000 | −0.136   | 0.000 | −0.070     | 0.009 | −0.085  | 0.000 |
| picture_upload                    | 0.466      | 0.000 | 0.390    | 0.000 | 0.355      | 0.000 | 0.409   | 0.000 |
| unsolved_problem_revisits         | −0.094     | 0.002 | −0.183   | 0.000 | −0.062     | 0.237 | −0.092  | 0.000 |
| unsolved_problem_revisits_squared | 0.051      | 0.002 | 0.048    | 0.003 | 0.068      | 0.014 | 0.050   | 0.000 |
| time_spent_on_hw                  | 0.018      | 0.563 | −0.063   | 0.065 | 0.090      | 0.093 | −0.002  | 0.923 |
| time_spent_on_hw_squared          | −0.023     | 0.216 | 0.021    | 0.318 | −0.002     | 0.957 | −0.002  | 0.906 |
| days_first_attempt                | −0.149     | 0.000 | −0.181   | 0.000 | −0.140     | 0.002 | −0.176  | 0.000 |

Citation: Ilina, O.; Antonyan, S.; Kosogorova, M.; Mirny, A.; Brodskaia, J.; Singhal, M.; Belakurski, P.; Iyer, S.; Ni, B.; Shah, R.; Sharma, M.; Ludlow, L. Understanding Fourth-Grade Student Achievement Using Process Data from Student’s Web-Based/Online Math Homework Exercises. Educ. Sci. 2025, 15, 753. https://doi.org/10.3390/educsci15060753
