Article

Comparison Between Guided and Non-Guided Homework as a Tool for Learning Electric Circuit Theory

Department of Electronics Engineering, Pontificia Universidad Javeriana, Bogotá 110231, Colombia
*
Author to whom correspondence should be addressed.
Educ. Sci. 2025, 15(7), 857; https://doi.org/10.3390/educsci15070857
Submission received: 5 May 2025 / Revised: 31 May 2025 / Accepted: 17 June 2025 / Published: 4 July 2025

Abstract

The present work highlights the differences between two simultaneous groups taking the electric circuit theory course. Both groups were given the same non-mandatory homework prior to the evaluation; however, the instructions provided to the groups differed. Historically, this homework has been assigned without any guidelines, and instructors have not been aware of whether students completed the task. In this study, specific instructions were developed to simulate the same stressful conditions regarding time constraints for finding the correct answer that students face during an exam. The results indicate that the group of students who received categorized exercises along with a structured schedule performed better than the non-guided group (2.8/5 versus 2.3/5 on average). The study also reveals that a significant portion of students engaged with this optional activity during the initial weeks (45%), but this engagement decreased substantially over time (to 5%), showing a correlation between evaluation results and commitment to the exercises. Sharing these findings with students may help improve performance in a course that historically has had a pass rate below 50%.

1. Introduction

1.1. Context for Learning Electrical Circuits

The practical learning of electrical circuit theory represents a fundamental challenge in engineering education, especially in electrical and electronic engineering programs. This area of knowledge, characterized by its highly abstract and mathematical content, requires not only a solid conceptual foundation but also pedagogical strategies that foster the active understanding and practical application of concepts. In this context, homework assignments have traditionally been used as a key tool to reinforce learning outside the classroom.
However, the effectiveness of assignments depends mainly on their design and the level of guidance they provide to students. While guided assignments provide structured information and support, unguided assignments promote autonomy and self-regulation of learning. The educational literature has extensively debated the relative benefits of both approaches, but there is a growing need for specific empirical evidence in the context of teaching electrical circuits.
This study compares the impact of guided and unguided assignments on the learning of electrical circuit theory in university students. Using a quasi-experimental design, the effects of each task type on academic performance, perceptions of learning, and student motivation are analyzed. The results of this research provide evidence to guide teachers and curriculum designers in selecting the most effective task strategies to improve conceptual understanding in engineering courses.

1.2. Literature Review of Learning Electrical Circuits

Conceptual difficulties in learning electrical circuits have been widely documented, reinforcing the need for pedagogical strategies that guide and structure the learning process. Hussain et al. (2010) identified several common obstacles among electrical engineering students, including a lack of understanding of fundamental concepts and difficulty applying theories in practical contexts. These findings suggest that students could benefit from guided assignments that help them build a solid conceptual foundation. Similarly, Rankhumise and Sitwala (2014) demonstrated that the use of analogies, such as that of a bicycle, can be effective in correcting misconceptions regarding electrical circuits.
Other studies have likewise documented the persistent difficulties students face when learning fundamental concepts of electrical circuit analysis. Pitterson and Streveler (2016) emphasize that these difficulties are closely related to the interaction between students’ prior knowledge, the proposed learning activities, and the design of the educational environment. Along these lines, Johnson et al. (2014) demonstrated that the use of abstract or contextualized visual representations, as well as diagram labeling, significantly influences circuit comprehension, suggesting that the instructional design, including the type of assigned homework, can have a direct impact on learning. Fayyaz and Trueman (2022) identified persistent errors in fundamental circuit analysis, even after formal instruction, reinforcing the need for more effective pedagogical strategies. Relatedly, Sanches et al. (2018) found that students commonly struggle with DC circuits, even in inquiry-based laboratory settings, suggesting that task structure and support can be crucial in overcoming these obstacles.
Thus, teaching complex concepts in electrical circuit theory remains a significant challenge in engineering education. Pitterson and Streveler (2016) attribute students’ difficulties mainly to a lack of formal prior knowledge, the abstract nature of the content, and the absence of instructional strategies that generate conceptual conflict.
For these reasons, the need for effective pedagogical strategies for teaching circuit theory has motivated various innovative proposals in recent years. Li (2022) proposed using flowcharts as a tool to facilitate the rapid understanding of fundamental circuit theories, highlighting that a structured visual representation can help students organize and apply concepts more efficiently, which aligns with the principles of guided assignments. Freeborn (2022) evaluated the progress of students who participated in a pre-preparation course for circuit analysis, finding that structured prior instruction significantly improves subsequent performance, reinforcing the importance of guiding learning from an early stage. In a broader context, Zhang et al. (2022) addressed the reform of digital circuit teaching within the framework of high-level engineering training, underscoring the need for more active and student-centered teaching approaches, which can be achieved through carefully designed and guided assignments.

As we have shown, the understanding of electrical circuits has been widely studied from various pedagogical perspectives, which allowed us to identify practical approaches to improve learning. Paatz et al. (2004) analyzed how the use of analogies can facilitate the understanding of simple circuits, highlighting the importance of instructional strategies that connect with the students’ prior knowledge. In a comparative study, Ekmekci and Gulacar (2015) found that both computer simulations and hands-on activities can be effective in enhancing learning. However, their impact depends on the context and the design of the activity, suggesting that guided tasks could optimize these resources. More recently, Harjuna et al. (2024) demonstrated that a core-idea-based approach significantly enhances students’ conceptual understanding of electrical circuits, underscoring the importance of carefully structuring learning activities.
Finally, the systematic review by Espera and Pitterson (2019) supports the use of evidence-based instructional approaches for teaching circuit concepts, providing a robust framework for evaluating the effectiveness of guided versus unguided tasks in this field.
Interventions utilizing digital resources and collaborative strategies have gained relevance in the teaching of circuit analysis, particularly in designing tasks that promote active learning. Sambamurthy et al. (2021) analyzed the use of self-assessed activities in a web-based textbook for circuit analysis, finding that students tend to interact more with tasks that offer immediate feedback, which is characteristic of guided tasks. Complementarily, Skromme et al. (2024) proposed a step-by-step tutoring system that improves student performance in circuit analysis courses, highlighting the effectiveness of structured guidance in the problem-solving process. On the other hand, Crouch and Mazur (2001) demonstrated that peer instruction, a form of guided collaborative learning, significantly enhances conceptual understanding in physics, which can be applied to the teaching of electrical circuits. Finally, Reba et al. (2022) demonstrated that the cooperative jigsaw method enhances the teaching of circuit analysis by fostering active participation and the collective construction of knowledge, elements that can also be incorporated into guided assignments.
All of the above leads us to believe that learning electrical circuit theory requires both a solid conceptual foundation and effective pedagogical strategies that facilitate the progressive understanding of the content. In this sense, works such as those by Trzaska (2025), which address advanced topics in circuit theory, and those by Sundararajan (2020), which focus on introductory fundamentals, offer a robust theoretical framework to contextualize the design of academic assignments. Furthermore, recent publications in the journal IEEE Transactions on Learning Technologies (2021–2025) have explored the impact of educational technologies and instructional strategies, such as guided assignments, on engineering learning, highlighting their effectiveness in improving student performance. Resources such as those provided by TryEngineering.org have also promoted the use of structured lesson plans, which can be considered practical examples of guided tasks applied to teaching circuits.
Several studies have emphasized the importance of pedagogical design in assignments to enhance learning outcomes in STEM disciplines. Freeman et al. (2014) demonstrated through a meta-analysis that active learning strategies—such as those embedded in well-designed assignments—significantly improve student performance compared to traditional methods. Similarly, the Education Endowment Foundation (2021) concluded that the most effective assignments are those directly aligned with classroom content and accompanied by structured feedback, underscoring the value of guided over unguided tasks. In the context of engineering education, Kang et al. (2018) presented a pedagogical case involving active learning strategies for teaching Kirchhoff’s circuit laws, illustrating the value of personalized and interactive approaches when addressing complex topics in circuit theory.
The design of tasks that promote conceptual understanding and active learning has received increasing attention in circuit theory teaching. Becker and Hacker (2022) demonstrated that conceptual writing exercises in circuit analysis courses can significantly improve student understanding by encouraging reflection and the articulation of ideas, which are characteristics of guided tasks. In a complementary approach, Costa et al. (2007) applied problem-based learning (PBL) to teach circuit analysis, highlighting that this structured approach promotes autonomy and critical thinking, key elements in guided tasks. Carstensen and Bernhard (2009) also underlined the importance of task design in circuit learning, identifying critical aspects that need to be addressed to facilitate deeper understanding. More recently, Arredondo et al. (2025) explored the use of peer-produced and peer-reviewed explanatory videos as a learning strategy, finding that this approach improves both student understanding and motivation.
The effectiveness of tasks as a learning tool in circuit theory has been studied from various perspectives. Gerlein et al. (2016) compared the impact of using highly complex problems and frequent assessment on the learning of electrical circuits, concluding that a consistent and challenging work structure can significantly improve conceptual understanding. Along these lines, Kazakou et al. (2021) proposed an approach of structured, randomized, and self-assessed tasks, highlighting that this type of design promotes student autonomy without sacrificing pedagogical guidance, which aligns with the principles of guided tasks. For their part, Callaghan et al. (2021) demonstrated that transforming traditional tasks into deliberate practice exercises can enhance the effectiveness of active learning, particularly when elements of feedback and strategic repetition are incorporated. These studies reinforce the hypothesis that guided tasks, when carefully designed and aligned with specific learning objectives, can offer significant advantages over unguided tasks in teaching electrical circuit theory.
In this sense, guided tasks play a crucial role by structuring learning in a way that explicitly addresses the cognitive barriers documented above, thereby facilitating a deeper understanding of electrical phenomena. Together, these studies support the hypothesis that guided tasks, by providing a clearer structure and conceptual support, can be more effective than unguided tasks and constitute a valuable active learning tool with a very low implementation cost.
The remainder of this paper is organized as follows: Section 2 provides a detailed description of the study context and the instructional design implemented. Section 3 outlines the methodology, including the experimental design and data collection techniques. Section 4 presents the results obtained from both qualitative and quantitative analyses. Section 5 offers a discussion of the findings, considering the research objectives and the existing literature. Section 6 concludes the study and proposes directions for future research. Finally, the Acknowledgments section highlights the contributions and support received during the development of this work.

2. Description of the Study

In this section, we provide the context of the present study, including the study population, followed by a description of the material assigned each week and the corresponding weekly evaluation.

2.1. Description of the Fundamentals of the Electric Circuits Course

Fundamentals of Electric Circuits is a four-credit undergraduate course, structured as both a theoretical and practical course. It provides a foundational basis for other subjects related to the analysis and implementation of electric and electronic circuits across several engineering programs, including Electronics, Networks and Telecommunications, Mechatronics, Mechanical Engineering, and Bioengineering. Its primary objective is to present an engineering-based methodology for analyzing and designing circuits, enabling students to understand, identify, plan, and solve problems related to electrical systems.
The course content covers core electrical topics, including electrical variables, power, and energy; basic electrical components, such as sources, resistors, inductors, and capacitors; and fundamental interconnection laws, such as Kirchhoff’s Laws. Students engage with key concepts such as instantaneous, average, stored, and dissipated power and energy. They are introduced to analytical tools, including voltage and current dividers, source transformation, the superposition principle, and Thévenin and Norton equivalents. The course also includes the analysis of first-order circuits in both the time and frequency domains, the frequency response of basic circuits, and the analysis of circuits using ideal operational amplifiers. In addition to disciplinary knowledge, the course seeks to develop non-disciplinary competencies such as systems thinking and professional skills, including strategic and graphical communication. Each learning outcome is explicitly associated with a specific set of competencies and is assessed through detailed rubrics that evaluate student performance.
Pedagogically, the course is delivered through theoretical lectures supplemented by practical sessions, during which students directly interact with real circuits, often working in groups. Interaction between students and the instructor is strongly encouraged to foster active problem-solving and conceptual understanding. Group work is also promoted for laboratory sessions, although the syllabus does not provide explicit details regarding the group formation process.
The course evaluation is composed of several components. It begins with an initial assessment of mathematical prerequisites. During the semester, the students complete quizzes and assignments, which may be based on in-class activities or pre-class readings. Three midterm exams are administered throughout the term, followed by a final cumulative exam. The specific grading breakdown is as follows: the initial midterm in Week 2 accounts for 5% of the final grade; the second midterm in Week 6 accounts for 15%; the third midterm in Week 12 accounts for 20%; and the final exam in Week 17 accounts for 25%. The remaining components include assignments, workshops, quizzes, and other coursework (15%), as well as laboratory practices carried out throughout the semester (20%). To avoid bias introduced by course withdrawals, this analysis considered only quiz results from approximately the first nine weeks of the course, ending about three weeks before the official withdrawal date.
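The grading breakdown above can be sketched as a simple weighted average. This is an illustrative reconstruction only: the weights come from the syllabus described above, but the component names and the sample scores are invented for the example (grades range from 0 to 5).

```python
# Sketch of the course grading breakdown. Weights match the syllabus
# percentages; names and sample scores are illustrative placeholders.
weights = {
    "midterm_1": 0.05,   # Week 2
    "midterm_2": 0.15,   # Week 6
    "midterm_3": 0.20,   # Week 12
    "final_exam": 0.25,  # Week 17
    "coursework": 0.15,  # assignments, workshops, quizzes
    "laboratory": 0.20,  # lab practices across the semester
}
assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights cover 100%

def final_grade(scores):
    """Weighted average of component scores on the 0-to-5 scale."""
    return sum(weights[name] * score for name, score in scores.items())

# Invented example: a student near the 3.0 passing threshold.
sample = {"midterm_1": 3.5, "midterm_2": 2.8, "midterm_3": 3.0,
          "final_exam": 3.2, "coursework": 4.0, "laboratory": 4.2}
grade = final_grade(sample)
```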
The total workload for this four-credit course is 192 h, spread over 17 academic weeks. It is distributed as 64 h of in-person classroom instruction (4 h per week), 32 h of practical work (2 h per week), and 96 h of independent study (6 h per week). The detailed schedule spans from Week 1 through Week 17, including final evaluations. Laboratory sessions begin formally in Week 2 and are integrated into nearly every subsequent week of the course.
Based on data from all course groups between 2022 and 2024, an average of 40.6% of students withdrew before the end of the semester, only 45.4% completed the course with a passing grade, and 14% failed.

2.2. Characterization of the Two Groups

The grades for the exams at our university range from 0 to 5, with a passing grade of 3.0. Students from various engineering programs, including electronics, mechatronics, mechanics, bioengineering, and telecommunications, take the Electric Circuits course. Typically, there are six groups, each with around 24 students. Among these groups, we chose the two groups with the same instructor. Note that one group typically consists of sophomore students from various programs. In our case, both groups (control and intervention) contained students from different semesters and different programs, ranging from second to fifth semester, with an average of 3.65 semesters for the control group and 3.45 semesters for the intervention group. The control and intervention groups were also similar in terms of previous performance, with a GPA (Grade Point Average) of 3.73 for the control group and 3.68 for the intervention group, and in the number of students, with 23 and 22 students, respectively. This information is shown in Table 1.
A comparison was made of the GPA of the students in each group. Tests for statistical assumptions confirmed that the data were suitable for t-test analysis: a Shapiro–Wilk test indicated that both distributions were normal (control group: W(23) = 0.975, p = 0.807; intervention group: W(22) = 0.973, p = 0.782), and Levene’s test confirmed the assumption of equal variances (p = 0.0758).
The independent samples t-test revealed no significant difference in the GPA of the students in each group, with a p-value of 0.653.
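The pooled-variance t statistic used for this group comparison can be sketched in pure Python with the standard library. The GPA lists below are invented placeholders, since the study's per-student data are not published; the function itself is the standard Student's independent-samples formula.

```python
# Minimal sketch of the pooled-variance (Student's) t statistic used to
# compare the two groups' GPAs. Data below are invented placeholders.
import math
from statistics import mean, variance

def t_statistic(a, b):
    """Independent-samples t with pooled variance. Assumes normality
    (Shapiro-Wilk) and equal variances (Levene), as verified in the study."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    se = math.sqrt(sp2 * (1 / na + 1 / nb))
    return (mean(a) - mean(b)) / se

gpa_control = [3.5, 3.9, 3.7, 3.6, 3.8, 3.7, 3.9, 3.6]       # placeholder
gpa_intervention = [3.6, 3.7, 3.5, 3.8, 3.7, 3.6, 3.8, 3.7]  # placeholder
t = t_statistic(gpa_control, gpa_intervention)
# A |t| near zero is consistent with the reported p = 0.653 (no difference).
```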

2.3. Homework Description

Each homework assignment was sent weekly to both groups and contained the same practice exercises, totaling around nine points. For the control group, only the exercises were provided, followed by their answers for verification purposes. For the intervention group, the exercises were separated by level of difficulty, and a suggested time was provided for completing each point. There were around three points per level, and the recommendation was to advance to the next level only once the student could obtain the correct answer within the suggested time. The intervention group also received an optional worksheet for recording the time taken to complete every exercise.
A sample of the exercises from the homework corresponding to the second week is shown in Figure 1. It displays points from the three different levels. At this stage, the students have only seen the definitions of the electrical variables (voltage, current, and power), as well as Kirchhoff’s voltage and current laws, Ohm’s law, and the series and parallel interconnection and equivalence for resistors. Even in these basic topics, students have difficulties in appropriating and using DC circuit concepts (Leniz et al., 2014; Li & Singh, 2012).
Level 1 recognizes the student’s ability to identify series and parallel interconnections of resistors and utilize equivalences to simplify circuits. Level 2 introduces additional complexity, ensuring students understand the correct polarity of voltage using the passive element convention of Ohm’s law, as well as relating variables from different equivalent circuits. Level 3 exercises also require the integration of Kirchhoff’s voltage and current laws to find the variables, aiming for the self-discovery of new techniques that will be explained in the following weeks.
The weekly assignments for the Electric Circuits Fundamentals course are designed with the specific intent of supporting the development of the course’s learning outcomes. Each task is crafted to reinforce the student’s ability to analyze and design engineering solutions by engaging them in problem-solving activities that require understanding of a problem statement, the identification of relevant variables, the selection and interpretation of available information, planning of a strategy, and the application of mathematical tools to reach a solution. In Figure 2, a flowchart presents the approach to homework assignments.
The learning outcomes of the course emphasize the progression from understanding concepts to higher-order skills such as application and analysis. The assignments target specific outcomes, including describing electrical systems through their variables, laws, and interconnections, interpreting circuit behavior through equation formulation, analyzing operational amplifier circuits, and characterizing energy-storage elements under steady-state and first-order conditions. Additionally, students are expected to develop communication competencies by reporting their results in a structured manner, including the preparation of lab reports, tables, schematics, and graphs. The weekly assignments serve as a formative component that gradually enables students to meet these expectations through structured practice and feedback.

2.4. Weekly Evaluation

The evaluation of homework was carried out each week with an in-class quiz, as noted by Gerlein et al. (2016). The quiz was administered the week after the assignment and consisted of a single exercise, different from the homework problems but focused on the same subject.
To develop different quizzes of similar complexity for both groups, five exercises were written at one of the three levels while the homework was being prepared. Three of these five were randomly selected and placed in the homework, and the remaining two were reserved as quizzes, one for each group.
The group that took the quiz first was randomly assigned one of the two reserved exercises, and the second group received the remaining exercise.
The quizzes were peer-graded immediately after the test, showing a possible solution method and the evaluation rubric.
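The exercise-splitting procedure described above can be sketched as follows. This is a hedged illustration of the described workflow, not the authors' actual tooling; the exercise names and the `split_exercises` helper are invented for the example.

```python
# Sketch of the described procedure: five exercises are authored at a
# level, three are randomly placed in the homework, and the remaining
# two become the quizzes (one per group). Names are illustrative.
import random

def split_exercises(pool, rng):
    """Return (homework, quiz_first, quiz_second) from a pool of five."""
    assert len(pool) == 5
    shuffled = rng.sample(pool, k=5)        # random order, no replacement
    homework = shuffled[:3]                 # three exercises for homework
    quiz_first, quiz_second = shuffled[3:]  # one quiz exercise per group
    return homework, quiz_first, quiz_second

rng = random.Random(42)  # fixed seed so the example is reproducible
pool = [f"exercise_{i}" for i in range(1, 6)]
homework, quiz_a, quiz_b = split_exercises(pool, rng)
```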

3. Methodology

3.1. Research Questions

This study is driven by the need to promote student success through the use of timely and targeted pedagogical strategies. Accordingly, it addresses two main research questions. First, it investigates whether structured and guided homework instructions—categorized by difficulty level and accompanied by time recommendations—can improve students’ exam performance, conceptual understanding, and course approval rates in an undergraduate Fundamentals of Electric Circuits course. Second, it examines whether monitoring the time students spend on each task can help identify and strengthen weak conceptual areas based on their performance and self-reported effort.

3.2. Study Design

The study was conducted with students enrolled in two different sections of the same undergraduate course, both taught by the same instructor during a single academic term, as described in the Study Description section. The number of students in each section naturally determined the sample size, without any filtering or manipulation, thereby preserving the authentic classroom composition.
Although not a randomized trial, the study employed a quasi-experimental design, enabling meaningful comparisons between two parallel groups under similar instructional conditions. The intervention spanned a 10-week period during which students completed weekly homework assignments and periodic quizzes, as detailed earlier.
For the intervention group, each homework assignment was supplemented with guided instructions intended to scaffold learning. The control group, by contrast, received standard problem sets without additional support. Quizzes were unannounced and addressed the same topics in both groups. To ensure fairness, all assessments were administered under identical time constraints and evaluated using the same grading criteria. No late homework submissions were accepted.
Student performance was analyzed using standard quantitative techniques. Specifically, each student’s overall GPA was compared with their quiz performance, and trends across quizzes were examined. To complement the quantitative analysis, students’ feedback was examined through the standard institutional course evaluations completed at the end of the semester. These evaluations, which include both scaled responses (1–6) and open-ended comments, were qualitatively analyzed to identify perceptions explicitly related to the homework methodology. Comments were categorized to highlight recurring themes, such as the perceived difficulty of the exercises, the usefulness of the guided support, preferences for procedural guidance over time estimates, and suggestions for making homework mandatory. Additionally, classroom observations and student behaviors regarding support sessions were considered to contextualize the feedback. This qualitative analysis was employed to determine the intervention’s reception and to identify areas for pedagogical improvement.
Additionally, the worksheet completed by students, used to record the amount of time dedicated to each exercise, was reviewed every week. This review served a dual purpose: first, to monitor the level of engagement and time management among students in the intervention group, and second, to identify specific topics or concepts that consistently presented difficulties, thereby enabling timely instructional adjustments and targeted reinforcement. However, it is important to note that completion of these worksheets was voluntary. As a result, the consistency of the data declined over time, with most students discontinuing their reports after the first few weeks. Consequently, the utility of this information was limited to the initial stages of the intervention. Nonetheless, during that early period, the data provided meaningful insights that supported a more responsive and adaptive instructional approach.

3.3. Evaluation Criteria

Homework and quiz answers were graded using a standardized rubric designed by the instructional team based on Bloom’s taxonomy and university guidelines for performance assessment in STEM courses. The rubric focused on four dimensions: correctness of results, completeness of procedure, clarity of presentation, and evidence of conceptual understanding. The same rubric was used for both groups to ensure consistency and objectivity.

3.4. Data Analysis

To evaluate the impact of guided instructions on student performance, quiz results for both groups were compared individually for each assessment. Once all quizzes were completed, the average quiz score for each student was calculated and compared to their overall GPA, aiming to identify any correlation between prior academic performance and outcomes in this course. The remainder of this subsection presents the Pearson correlation analysis, while the following section provides detailed results.
The average quiz score for the control group was 2.32, while the intervention group achieved a higher average of 2.76, suggesting a possible positive effect of the guided homework. A summary of the descriptive statistics for both groups is presented in Table 2.
A Pearson correlation analysis was conducted to assess the relationship between GPA and quiz average. In the control group, the correlation coefficient was r = 0.187 with a p-value = 0.391. In the intervention group, the coefficient was r = 0.076 with a p-value = 0.735. In both cases, the results indicate no statistically significant correlation between students’ GPAs and their performance on quizzes.
To determine whether the difference in quiz averages between the two groups was statistically significant, an independent samples t-test was performed. The test yielded a t-statistic of −2.061 and a p-value of 0.045. Since the p-value is below the 0.05 threshold, the null hypothesis of no difference between the group means can be rejected. This indicates a statistically significant difference in favor of the intervention group, suggesting that the guided instructions may have had a positive influence on performance.
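The Pearson correlation between GPA and quiz average can be computed with a short pure-Python function. The paired values below are invented placeholders, not the study's data; only the formula itself is standard.

```python
# Pure-Python sketch of the Pearson correlation between GPA and quiz
# average, as computed per group. Data are invented placeholders.
import math

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

gpa = [3.4, 3.7, 3.9, 3.5, 3.8, 3.6]       # placeholder GPAs
quiz_avg = [2.1, 2.9, 2.4, 2.6, 2.2, 2.8]  # placeholder quiz averages
r = pearson_r(gpa, quiz_avg)
# A small |r| mirrors the weak correlations reported (r = 0.187, r = 0.076).
```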

4. Results

4.1. Comparison Between Control and Intervention Groups

Figure 3 illustrates the differences in the results of the first six quizzes administered to the students during the study. Except for the fifth quiz, the intervention group performed better, obtaining an average grade of 2.8, compared with the 2.4 obtained by the control group. This result suggests that providing instructions with the homework tasks improves the learning process or, at least, performance in the quizzes.

4.2. Comparison to Previous Results by Student

The GPA of each student was compared to the results of the quizzes they took, yielding the results shown in Figure 4. This figure indicates that even outstanding students were not guaranteed good results in this course.
Figure 4 shows a comparison of the GPA and the average grade achieved in the quizzes for each of the students in the control group (top) and intervention group (bottom).
The ratio between the quiz grades and the GPA of the students was 0.65 for the control group and 0.76 for the intervention group, indicating that the average grades in this course are lower than the GPA for most students, as shown in Figure 5. Nevertheless, an advantage was observed for the intervention group, with only one student achieving a ratio below 0.5.
These results show that, on average, the electric circuits course is more challenging for students compared to previous courses, but also that the detailed instructions in the homework help to close the gap between past results and the results in this course.
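The quiz-to-GPA ratio used above can be expressed as a one-line computation per student. The paired values here are invented placeholders; the `quiz_gpa_ratio` helper is named for illustration only.

```python
# Sketch of the per-student quiz-to-GPA ratio reported in Figure 5.
# Paired values are invented placeholders, not the study's data.
def quiz_gpa_ratio(quiz_avg, gpa):
    """A ratio below 1.0 means the student scored lower in this course
    than their historical GPA; below 0.5 signals a large gap."""
    return quiz_avg / gpa

students = [(2.4, 3.7), (2.9, 3.6), (2.1, 3.8)]  # (quiz average, GPA)
ratios = [quiz_gpa_ratio(q, g) for q, g in students]
group_ratio = sum(ratios) / len(ratios)  # group-level average ratio
```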
The detailed test data reveal a similar situation for the populations being evaluated. The number of students in the control group who completed the quizzes varied from 15 to 22 across the six sessions, with an average of 19.2 students per quiz. For the intervention group, the number varied from 19 to 21 across the six sessions, with an average of 20.2 students per quiz. Students who were absent or missed quizzes were excluded from the analysis.

4.3. Evolution of the Homework Used by the Intervention Group

The students in the intervention group could voluntarily record, in a worksheet, the time it took them to solve each exercise. However, this tool does not reveal how many students actually used the homework exercises to prepare for the evaluation. Figure 6 shows the evolution of worksheet completion: the number of students who completed the worksheets remained consistently below 50% and decreased as the semester advanced, ending with only one student (5%) submitting the report. Although students in this group stated verbally that they used the homework for preparation, they did not report it, since completing the worksheet was not mandatory. For this reason, it is essential to promote the use of this tool in upcoming semesters.

5. Discussion

At the end of each semester, students are required to complete a standard institutional evaluation. This evaluation is conducted in every course, assessing both the class and the instructor. Most of the questions are quantitative, based on a numerical scale (from 1 to 6), and an open-ended question is included at the end for comments.
These evaluations are shared with instructors after the semester concludes, allowing them to use the feedback to make improvements.
Various types of comments were made in these evaluations. Here, only those related to the methodology of the homework exercises are included. The following comments were received:
“I was frustrated at first because the suggested time was much shorter than the time it took me to complete the exercise.”
“I wish the exercises included the type of procedure that should be followed to save time.”
“The homework should be mandatory and graded.”
“I learned more than my classmates in other groups. The workshops and weekly quizzes helped me in this process.”
During class, additional comments were received and can be grouped into the following three categories.
The class content is challenging for students; therefore, offering workshops with exercises of varying difficulty supports the learning process. This suggests that, in addition to the theoretical content, students require extensive practice to reinforce their understanding and enhance their performance in assessments.
Students highly value support in solving the exercises. While in-class examples are important, they do not fully address the range of problems students encounter. Students prefer to have a teaching assistant (TA) or professor available to answer their questions. Although this request was made previously, and a designated time was scheduled for the professor to answer exercise-related questions, students only attended shortly before assessments. It would be worth investigating why students, even when their requests are fulfilled, do not utilize the resources offered. Based on personal experience, support tools requiring prior preparation may be less effective due to poor time management and a tendency to procrastinate.
Knowing which type of procedure or method to use to solve an exercise may be more valuable to students than knowing the estimated completion time. This is particularly true in the early stages of learning when conceptual understanding is still developing. Once they gain a better understanding of the subject, the estimated time becomes more important, as time management is critical in written assessments. If students cannot complete an exercise within the teacher’s suggested time, they may be unable to finish the entire exam.
Some student comments focus on improving their conceptual understanding, aligning with the findings reported by Bull et al. (2010). Based on a statistical analysis comparing students’ average GPA and assessment results, the following conclusions can be drawn.
The groups have statistically equivalent GPA characteristics; thus, there is no bias indicating stronger academic performance in one group.
There is no evidence of a correlation between students’ grade point averages (GPAs) and their assessment results in the course. According to observations made by instructors, student performance correlates more strongly with the effort invested during the course. This should be emphasized to students at the beginning of the semester so they understand that the course demands significant study and commitment.
Additionally, statistical analysis showed an improvement in the average academic performance of students in the intervention group compared to the control group.

6. Conclusions

Electrical circuits is a particularly challenging course for engineering students, as is evident both in the bibliographical references and in the daily work of instructors. The efforts, methodologies, and initiatives reported there aim to help students in their learning process, their understanding, and their application of concepts. Some of these interventions improve student performance, consequently enhancing academic results and grades, which in turn positively impact students' confidence, self-esteem, and satisfaction with their academic program.
Another aspect that highlights the difficulty of the electrical circuits course is the lack of statistical evidence of a correlation between students’ GPAs and their assessment results. This implies that greater attention must be paid to the teaching–learning process in these classes, and a greater commitment from students is necessary to achieve better results.
The implementation of a guided procedure to solve homework practice exercises—which includes indicating the difficulty level and the recommended resolution time for each task—has a demonstrated positive impact on students' performance in electric circuit theory exams. This structured approach not only supports the development of problem-solving skills but also helps students manage their study time more effectively, aligning their efforts with the expected complexity of each topic.
The results also show that, regardless of the student’s initial skill level or academic background, structured support mechanisms such as guided procedures and targeted feedback help to strengthen the correlation between consistent effort and academic achievement. In other words, while ability alone does not guarantee success, the presence of additional scaffolding can significantly improve learning outcomes and reduce performance variability.

7. Future Work

Building on the findings of this study, future work will focus on integrating weekly feedback mechanisms that reflect the actual time students dedicate to each task. This data will offer instructors valuable insights into students’ learning processes, serving as a diagnostic tool to identify patterns of difficulty or conceptual misunderstandings. Such insights can support timely instructional adjustments, enabling the targeted reinforcement of specific topics and clarification of problem-solving strategies.
In parallel, future research will continue examining the impact of structured and guided homework—especially when combined with student engagement data—on the early identification of at-risk students. By linking instructional design to measurable academic outcomes, this approach seeks to promote more responsive teaching practices and foster improvements in student performance, course comprehension, and retention rates.
Further work will also investigate the relationship between key academic events and student performance using both quantitative and qualitative methods. Particular attention will be paid to perceived shifts in motivation, task engagement, and participation in practical activities. Qualitative tools such as focus groups will be employed to deepen our understanding of these dynamics and their potential pedagogical implications.

Author Contributions

Conceptualization, R.D., A.F. and J.A.H.; Methodology, R.D., A.F. and J.A.H.; Formal analysis, A.F. and J.A.H.; Investigation, R.D. and A.F.; Data curation, R.D.; Writing—original draft, R.D., A.F. and J.A.H.; Writing—review & editing, R.D., A.F. and J.A.H. All authors have read and agreed to the published version of the manuscript.

Funding

The APC was funded by the Pontificia Universidad Javeriana.

Institutional Review Board Statement

This study was conducted in accordance with the Declaration of Helsinki, and the protocol was approved by the Research and Ethics Committee (CIE) of the School of Engineering at Pontificia Universidad Javeriana (FID-22-306, dated 21 October 2022).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data supporting the findings of this study are available from the corresponding author (J.A.H.) upon reasonable request.

Acknowledgments

The authors thank the Department of Electronics and the Electronics Laboratory of the Pontificia Universidad Javeriana for their institutional and technical support. We also acknowledge the academic program directors for providing anonymized data and the department heads for their valuable insights on preliminary findings. Financial support from the Pontificia Universidad Javeriana for the dissemination and publication of this work is also gratefully recognized. The authors used QuillBot (version 4.23.3), Grammarly (web-based version as of June 2025), and ChatGPT based on the GPT-4o model (June 2025 version) to assist in editing the language of this manuscript. All content was reviewed by the authors, who take full responsibility for its accuracy and integrity.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

  1. Arredondo, F., García, B., & Lijo, R. (2025). Learning through explanation: Producing and peer-reviewing videos on electric circuits problem solving. IEEE Transactions on Education, 68(1), 67–78. [Google Scholar] [CrossRef]
  2. Becker, J. P., & Hacker, D. J. (2022). Conceptual-based writing exercises in a circuit analysis course. IEEE Transactions on Education, 65(4), 544–552. [Google Scholar] [CrossRef]
  3. Bull, S., Jackson, T. J., & Lancaster, M. J. (2010). Students’ interest in their misconceptions in first-year electrical circuits and mathematics courses. International Journal of Electrical Engineering & Education, 47(3), 307–318. [Google Scholar] [CrossRef]
  4. Callaghan, K., Deslauriers, L., McCarty, L. S., & Miller, K. (2021). Increasing the effectiveness of active learning using deliberate practice: A homework transformation. Physical Review Physics Education Research, 17, 010129. [Google Scholar] [CrossRef]
  5. Carstensen, A. K., & Bernhard, J. (2009). Student learning in an electric circuit theory course: Critical aspects and task design. European Journal of Engineering Education, 34(4), 393–408. [Google Scholar] [CrossRef]
  6. Costa, L. R. J., Honkala, M., & Lehtovuori, A. (2007). Applying the problem-based learning approach to teach elementary circuit analysis. IEEE Transactions on Education, 50(1), 41–48. [Google Scholar] [CrossRef]
  7. Crouch, C. H., & Mazur, E. (2001). Peer instruction: Ten years of experience and results. American Journal of Physics, 69, 970–977. Available online: https://mazur.harvard.edu/publications/peer-instruction-ten-years-experience-and-results (accessed on 1 April 2025). [CrossRef]
  8. Education Endowment Foundation. (2021). Homework. Teaching and learning toolkit. Available online: https://educationendowmentfoundation.org.uk/education-evidence/teaching-learning-toolkit/homework (accessed on 1 April 2025).
  9. Ekmekci, A., & Gulacar, O. (2015). A case study for comparing the effectiveness of a computer simulation and a hands-on activity on learning electric circuits. Eurasia Journal of Mathematics, Science and Technology Education, 11(4), 765–775. [Google Scholar] [CrossRef]
  10. Espera, A. H., & Pitterson, N. P. (2019, June 15–19). Teaching circuit concepts using evidence-based instructional approaches: A systematic review. 2019 ASEE Annual Conference & Exposition, Tampa, FL, USA. [Google Scholar] [CrossRef]
  11. Fayyaz, F., & Trueman, C. (2022, June 18–22). Persistent mistakes in learning basic circuit analysis. Proceedings of the Canadian Engineering Education Association (CEEA), Toronto, ON, Canada. [Google Scholar] [CrossRef]
  12. Freeborn, T. (2022, August 26–29). Student progress after a learning in advance course to prepare engineering students for circuit analysis in electrical engineering. 2022 ASEE Annual Conference & Exposition, Minneapolis, MN, USA. [Google Scholar] [CrossRef]
  13. Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415. [Google Scholar] [CrossRef] [PubMed]
  14. Gerlein, E., Cruz, J. M., & Hurtado, J. (2016). Comparación del aprendizaje en la asignatura de circuitos eléctricos mediante uso de problemas de alta complejidad y evaluación frecuente [Comparison of learning in the electric circuits course through the use of high-complexity problems and frequent assessment]. Encuentro Internacional de Educación en Ingeniería. Available online: https://acofipapers.org/index.php/eiei/article/view/921 (accessed on 1 April 2025).
  15. Harjuna, R., Sutopo, S., & Munfaridah, N. (2024). Learning based on central idea on the topic of electrical circuits to improve students’ concept understanding. Kasuari: Physics Education Journal (KPEJ), 7(2), 446–456. [Google Scholar] [CrossRef]
  16. Hussain, N. H., Latiff, L. A., & Yahaya, N. (2010). Learning difficulties among electrical engineering students. The International Journal of Science in Society, 1(4), 1–12. [Google Scholar] [CrossRef]
  17. IEEE Transactions on Learning Technologies. (2021–2025). Various articles on guided learning and educational strategies in engineering. IEEE Xplore. Available online: https://ieeexplore.ieee.org (accessed on 1 April 2025).
  18. Johnson, A. M., Butcher, K. R., Ozogul, G., & Reisslein, M. (2014). Introductory circuit analysis learning from abstract and contextualized circuit representations: Effects of diagram labels. IEEE Transactions on Education, 57(3), 160–168. [Google Scholar] [CrossRef]
  19. Kang, R., Lin, Y., Wang, Y., Wu, H., Wu, M., & Teng, B. (2018). A pedagogical case on active learning regarding to Kirchhoff’s circuit laws. International Journal of Electrical Engineering & Education, 56(2), 179–190. [Google Scholar] [CrossRef]
  20. Kazakou, E., Edgcomb, A. D., Rajasekhar, Y., Lysecky, R., & Vahid, F. (2021, July 19–26). Randomized, structured, auto-graded homework: Design philosophy and engineering examples. 2021 ASEE Virtual Annual Conference Content Access, Virtual. [Google Scholar] [CrossRef]
  21. Leniz, A., Zuza, K., & Guisasola, J. (2014, July 30–31). Difficulties understanding the explicative model of simple DC circuits in introductory physics courses. Physics Education Research Conference 2014, Minneapolis, MN, USA. [Google Scholar]
  22. Li, J. (2022). Using flowchart to help students learn basic circuit theories quickly. Sustainability, 14(12), 7516. [Google Scholar] [CrossRef]
  23. Li, J., & Singh, C. (2012). Students’ difficulties with equations involving circuit elements. AIP Conference Proceedings, 1413, 243–246. [Google Scholar]
  24. Paatz, R., Ryder, J., Schwedes, H., & Scott, P. (2004). A case study analyzing the process of analogy-based learning in a teaching unit about simple electric circuits. International Journal of Science Education, 26(9), 1065–1081. [Google Scholar] [CrossRef]
  25. Pitterson, N. P., & Streveler, R. A. (2016, June 26–29). Teaching and learning complex circuit concepts: An investigation of the intersection of prior knowledge, learning activities, and design of learning environments. 2016 ASEE Annual Conference & Exposition, New Orleans, LA, USA. [Google Scholar] [CrossRef]
  26. Rankhumise, M. P., & Sitwala, N. I. (2014). Using a bicycle analogy to alleviate students’ alternative conceptions and conceptual difficulties in electric circuits. Mediterranean Journal of Social Sciences, 5(15), 297–302. [Google Scholar] [CrossRef]
  27. Reba, P., Susithra, N., Deepa, M., & Santhanamari, G. (2022). Effective teaching of electric circuit analysis through Jigsaw cooperative learning method. Journal of Engineering Education Transformations, 36(1), 49–59. [Google Scholar] [CrossRef]
  28. Sambamurthy, N., Kazakou, E., & Adibi, Y. (2021, July 26). Student usage of auto-graded activities in a web-based circuit analysis textbook. 2021 ASEE Virtual Annual Conference Content Access, Tagged Division: Electrical and Computer (p. 21), Virtual. [Google Scholar] [CrossRef]
  29. Sanches, V. T., Gomes Costa, G. G., Mariano dos Santos, J. F., & Catunda, T. (2018). Analysis of engineering students’ common difficulties with DC electric circuits in an inquiry-based laboratory. International Journal on Alive Engineering Education, 5(2), 13–22. [Google Scholar] [CrossRef]
  30. Skromme, B. J., O’Donnell, M. A., & Barnard, W. M. (2024, June 12). Step-by-step tutoring support for student success in circuit analysis courses. GLSVLSI ‘24: Proceedings of the Great Lakes Symposium on VLSI 2024 (pp. 347–350), New York, NY, USA. [Google Scholar] [CrossRef]
  31. Sundararajan, D. (2020). Introductory circuit theory. Springer. [Google Scholar]
  32. Trzaska, Z. (2025). Advanced topics in electric circuits. Lecture notes in electrical engineering. Springer. [Google Scholar]
  33. Zhang, J., Zhang, B., Yue, H., Liu, J., & Xu, D. (2022). Reform of digital circuit teaching in the context of the training of outstanding engineers. In L. Yan, H. Duan, & X. Yu (Eds.), Advances in guidance, navigation and control. Lecture notes in electrical engineering (Vol. 644). Springer. [Google Scholar] [CrossRef]
Figure 1. Sample of the three different levels of exercises from the second homework.
Figure 2. Flowchart describing the approach of the homework assignment.
Figure 3. A comparison of the results from the two courses reveals a better average outcome for the intervention group than for the control group.
Figure 4. Comparison of the GPA and the average grade achieved in the quizzes for each of the students in the control group (top) and intervention group (bottom).
Figure 5. Comparison of the two courses regarding the ratio between the average grade obtained in quizzes and the GPA of each student. Better results were observed for the intervention group.
Figure 6. Evolution of voluntary completion of the worksheet by the students in the intervention group.
Table 1. Comparison of the average GPA of students in both groups.

                     N    Mean   Variance   Median   Minimum   Maximum
Control group        23   3.73   0.06       3.72     3.24      4.29
Intervention group   22   3.67   0.15       3.63     2.77      4.37
Table 2. Comparison of the average quiz results of students in both groups.

                     N    Mean   Variance   Median   Minimum   Maximum
Control group        23   2.32   0.52       2.08     4.3       4.08
Intervention group   22   2.76   0.51       2.78     1.5       1.5

Share and Cite

Diez, R.; Fajardo, A.; Hurtado, J.A. Comparison Between Guided and Non-Guided Homework as a Tool for Learning Electric Circuit Theory. Educ. Sci. 2025, 15, 857. https://doi.org/10.3390/educsci15070857