Enhancing Teaching-Learning Effectiveness by Creating Online Interactive Instructional Modules for Fundamental Concepts of Physics and Mathematics

This study explored the effectiveness of online instructional modules for providing supplementary instruction in basic mathematics and physics concepts. The modules were developed in accordance with a cognitive apprenticeship model. Participants (N = 47) were students enrolled in a required Statics course at a midwestern university. Participants were randomly assigned to either an intervention or control group. The results show that the intervention group performed significantly better on post-tests after using the online instructional modules, while there was no significant improvement in performance in the control group. Based on survey results, students reported engagement with the instructional materials. Furthermore, they reported a self-paced learning experience, noting that the developed online instructional modules gave them control over the course materials. Survey results also indicated students' approval of using the modules as a supplement to classroom lectures.


Introduction
Engineering instructors face a variety of challenges in teaching introductory-level courses. One major challenge is that students frequently lack precursor knowledge of key mathematics and science concepts [1]. Students are generally required to take prerequisite courses in mathematics, physics, and chemistry before beginning their engineering coursework, but retention of knowledge from these prerequisite courses is often low. In order to develop expertise in the engineering field, learners must acquire a deep and well-organized understanding of the domain. This includes knowledge of key concepts (e.g., force, mass) and general principles (e.g., Newton's laws of motion), and awareness of the contextual conditions under which the knowledge is useful [2]. Connecting new ideas and concepts to previous knowledge and experience is a key element of developing this expertise [2].
The current study examines the effectiveness of an intervention focused on supplementary online video instruction in an undergraduate statics course. Our goal was to focus on modules for fundamental concepts of physics and mathematics that are utilized in teaching the engineering mechanics courses in Statics and Dynamics. To determine the relevant fundamental concepts, we surveyed the faculty members across the School of Engineering who teach engineering mechanics courses. Based on faculty feedback, the supplementary videos focused on three key concepts for which students often lack fundamental understanding: differentiation (including the product rule, the power rule, and the chain rule), integration (including the power rule, the quotient rule, and integrals of inverse tangents and natural logarithms), and vector calculus (including unit vectors, dot products, and cross products). The intervention is structured around several key educational principles: ensuring that relevant prior knowledge is accurate and active, integrating testing into study sessions, and allowing for individualized, self-paced learning.
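As a concrete illustration of the three concept areas the modules target, the snippet below works a chain-rule derivative, a power-rule integral, and dot and cross products symbolically. This is a hypothetical sketch using the SymPy library; it is not part of the study's materials, and the particular functions and vectors are illustrative choices.

```python
import sympy as sp

t = sp.symbols('t')

# Differentiation (chain rule): d/dt sin(t**2) = 2*t*cos(t**2)
deriv = sp.diff(sp.sin(t**2), t)

# Integration (power rule): the antiderivative of t**3 is t**4/4
integ = sp.integrate(t**3, t)

# Vector calculus: dot and cross products of the i and j unit vectors
i_hat = sp.Matrix([1, 0, 0])
j_hat = sp.Matrix([0, 1, 0])
dot = i_hat.dot(j_hat)      # 0, since the unit vectors are orthogonal
cross = i_hat.cross(j_hat)  # the k unit vector, [0, 0, 1]

print(deriv, integ, dot, list(cross))
```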

Prior Knowledge
Instructional strategies that promote connections with prior knowledge can promote fuller understanding and deeper learning of new material (e.g., [3,4]). For example, elaborative interrogation, which activates prior knowledge and encourages students to connect new information with what they already know, can facilitate learning [5,6]. However, in order to promote learning, prior knowledge must be appropriate, accurate, and sufficient for the task at hand [7].
Students arrive at each course with varying levels of prior knowledge; how to address these differences in prior knowledge is a common challenge for instructors [7]. One approach is to provide remedial instruction for students who lack understanding of key prior concepts. However, it is often not feasible or desirable to provide such instruction during regular class meetings, because there is not sufficient time to cover the material, or because students who already understand the material will become bored or disengaged. Thus, the ability to provide out-of-class supplementary instruction to ensure accurate and active prior knowledge is beneficial for both instructors and students.
In the current study, the relevant prior knowledge concepts were fundamental concepts of physics and mathematics that are primary concepts in second-year engineering mechanics courses such as Statics and Dynamics. These concepts were differentiation, integration, and vector calculus. Differentiation and integration are recognized as the two most central mathematical concepts in almost all engineering disciplines. Vector calculus was also selected because the primary audience for the modules was mechanical engineering students, and because the Dynamics course is mainly taught using a vector analysis approach.

Benefits of Online Supplementary Instruction
Currently, instructional videos are widely used in both campus-based and online courses [8][9][10][11][12][13]. The use of such videos in combination with face-to-face class instruction may be especially beneficial; several meta-analyses have found that hybrid or blended courses, which include both face-to-face and online instruction, are more effective than either traditional in-person courses or fully online courses [14,15].
Most recent online courses (and related research) adopt the student-centered approach [16][17][18][19][20]. Following a student-centered approach, online instructional materials (e.g., videos) are made to be as interactive as possible so that the learner controls the flow of the course [21][22][23].

Individualized, Self-Paced Instruction
Students benefit from individual instruction, but most classes do not allow for it [24,25], and students frequently do not take advantage of opportunities for in-person individual instruction, such as instructor office hours [26,27]. One common reason that students give for their lack of attendance at office hours is that the time and/or location of office hours was not convenient; Griffin et al. [27] found that convenience had a greater impact on student office hour attendance than a number of other factors, such as whether the instructor was perceived as approachable and whether material was explained clearly during class. Online tutoring sessions allow students to access material at a time and place that is convenient for them, and thus this may increase engagement with these sessions.
One of the main factors contributing to learning effectiveness in online courses, compared to traditional classroom-based courses, is the increased control that learners have over the course flow and content [28][29][30][31]. Contrary to popular belief, the major motivation for enrollment in distance education is not physical access, but rather, temporal freedom to move through a course of studies at a pace of the student's choice [32][33][34][35][36].

Presence in Online Instruction
Effective online teaching activities design, facilitate, and direct the cognitive and social processes for the purpose of promoting personally and educationally meaningful learning outcomes [37]. A sense of presence contributes to the effectiveness of online learning activities.
There are three types of presence in online teaching and learning environments: social, cognitive, and teaching presence. Social presence was defined as "the degree to which a person is perceived as a real person in mediated communication" [38] (p. 151). A study of the role of social presence has found that interaction among participants is critical in learning and cognitive development [39]. Students with high overall perceptions of social presence performed better regarding perceived learning [39][40][41].
In the current study, promoting social presence was partly achieved by introducing the instructor within the modules. We encouraged students to share their background with the instructor to help them build a connection with the instructor. Furthermore, all of the participants understood that their opinions on the modules were considered valuable, to promote a feeling of relevance to the instructor and the course.
Cognitive presence was defined as "the extent to which learners are able to construct and confirm meaning through sustained reflection and discourse in a critical community of inquiry" by Garrison, Anderson, and Archer [42] (p. 5). Cognitive presence has been identified as the most challenging type of presence to create in online learning environments [43]. To foster cognitive presence in the current study, we provided prompt feedback to students on their performance in modules, quizzes, and video-embedded questions. Innovative problem-solving techniques were also provided for the students by the instructional videos.
Teaching presence was defined by Anderson, Rourke, Garrison, and Archer [44] (p. 5) as the "design, facilitation, and direction of cognitive and social processes for the purpose of realizing personally meaningful and educationally worthwhile learning outcomes". Our modules promoted teaching presence by encouraging independent study among learners. Additionally, the modules were constructed such that student empowerment was promoted. Students had full control over the pace of progression through the video modules, and therefore, they were given the freedom to address their unique needs based on their own aspirations and goals. Information about technology requirements and available resources for technical help was provided to the students to facilitate the use of the modules. Furthermore, the instructions and rules of the course were outlined at the beginning of the study.

Cognitive Apprenticeship
Cognitive apprenticeship is an instructional model in which expert approaches to thinking and problem solving are made visible [45]. In cognitive apprenticeship approaches to instruction, an expert works through a problem while verbally describing and explaining the underlying cognitive processes in which the expert is engaged. Thus, the learner is able to see correct problem solving being modeled, while also understanding the decision making and problem solving processes that inform the chosen behaviors or strategies [45,46].
To make our videos consistent with this model, the verbal explanation of the problem-solving procedure was synchronized precisely with the writing portion of the video. For instance, in covering the product rule in the "Differentiation" module, the importance of the topic was explained in the video by providing examples of Statics-related problems. The basic concept of the rule, in addition to its implementation, was then covered thoroughly in a step-by-step manner. Therefore, students were able to follow the solution strategy and steps as they were explained and written down on paper. In addition, a number of examples (related to both Statics and Mathematics) were solved by following the step-by-step procedure, which was explained prior to the example solution. Furthermore, there were references at the end of each segment of the video to the already-explained concept in solving a problem. The segment was concluded with a question (video-embedded question) focused on the product rule in differentiation. This was done especially to help the students perceive the application of the taught concept in solving a problem, and to emphasize the importance of learning the fundamental concepts of mathematics.
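The narrated, step-by-step structure described above can be mirrored in a short symbolic check. This is a hypothetical sketch using SymPy; the function f(x) = x²·sin(x) is an illustrative choice, not one taken from the modules.

```python
import sympy as sp

x = sp.symbols('x')

# Step 1: identify the two factors in f(x) = u(x) * v(x).
u = x**2
v = sp.sin(x)

# Step 2: differentiate each factor separately.
du = sp.diff(u, x)  # 2*x
dv = sp.diff(v, x)  # cos(x)

# Step 3: assemble the product rule, (u*v)' = u'*v + u*v'.
step_by_step = du * v + u * dv

# Step 4: confirm the assembled result matches direct differentiation.
direct = sp.diff(u * v, x)
print(sp.simplify(step_by_step - direct))  # 0
```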

Integrating Self-Testing into Study Sessions
One benefit of online supplementary instruction is the ability to provide prompt feedback about the accuracy of a student's calculations and conclusions. As noted by Anderson [47], to provide motivation and to shape behavior and mental constructs most effectively, detailed feedback should be provided as quickly as possible following the performance of the assessed behavior. For this reason, machine evaluations, such as those provided in online multiple-choice test questions or in simulations, can be very effective learning devices.
A variety of studies have found that integrating self-testing into study sessions promotes better memory and deeper learning. For example, Roediger and Karpicke [48] found that dividing study time between material review and self-testing was more effective than devoting study time entirely to material review. Subsequent research indicates that recall testing of previously learned information can positively impact not only memory of previously learned information, but also learning of new information [49], and that integrated testing can improve attention to and learning from online lectures [50].
The presence of integrated quizzes also allows for immediate feedback. Prompt feedback helps students to build awareness of what they do and do not know and can prevent the persistence of misconceptions [7]. The instructional videos used in the current intervention included integrated questions that were posed to the viewers at the end of each chapter of the video. After viewing the chapter, students were required to answer the question in order to proceed to the next chapter. The questions were mostly focused on common mistakes that students would make in applying the concept of that chapter. Immediate feedback was provided to the students, which showed whether their answer was wrong or right. Then students were provided with the choice of continuing to the next chapter or replaying the chapter that they had just watched.
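The gating-and-feedback flow just described can be sketched as follows. This is a hypothetical illustration; the class, function, and question contents are ours, not the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class EmbeddedQuestion:
    prompt: str
    choices: list
    correct_index: int
    hint: str  # shown on a wrong answer, targeting a common mistake

def answer(question: EmbeddedQuestion, choice: int) -> str:
    """Return immediate feedback; the viewer may then continue or replay."""
    if choice == question.correct_index:
        return "Correct. Continue to the next chapter, or replay this one."
    return f"Incorrect. {question.hint}"

# An illustrative product-rule question of the kind the videos embed.
q = EmbeddedQuestion(
    prompt="d/dt [t * sin(t)] = ?",
    choices=["cos(t)", "sin(t) + t*cos(t)", "t*cos(t)"],
    correct_index=1,
    hint="Remember to differentiate both factors: (uv)' = u'v + uv'.",
)
print(answer(q, 0))
print(answer(q, 1))
```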

Procedure
Participation in the study was voluntary. Informed consent was obtained from each participant included in the study. Extra credit was provided as the benefit of participation to those who completed the study. Figure 1 summarizes the overall procedure in this study. Each stage of the study is briefly described as follows:
1. The consent form was distributed to all students in the course (in the classroom);
2. The students who chose to participate in the study returned the completed consent form and were randomly assigned to either the intervention or control group;
3. A questionnaire was provided and filled out by the study participants (in the classroom);
4. All participants took the pre-test (in the classroom);
5. Links to all the videos and related assignments were provided to the participants (intervention and control groups) and they were required to complete all parts of these online instructional modules within a specific 10-day period. All quizzes and videos were provided on the institution's Learning Management System (Blackboard) website (outside the classroom);
6. All participants took the post-test (in the classroom);
7. A survey form was provided and completed by all participants.

Participants
Students enrolled in the required Statics course in the fall 2016 semester were invited to participate in this study. Forty-seven students completed the study, with 25 in the intervention group and 22 in the control group. Group assignments were random, and participants did not know if they were members of the experimental or control group, as a single-blind experimental design was utilized. Of the participants in the intervention group, 80% were male and 20% female. Of the participants in the control group, 100% were male.

Modules
Based on the feedback from the faculty members across the School of Engineering who teach engineering mechanics courses, we focused on three fundamental concepts: differentiation, integration, and vector calculus; and we established the most important elements of each concept. Therefore, we developed three modules, one for each fundamental concept: differentiation, integration, and vector calculus. Each module consisted of: (1) a pre-test quiz; (2) an instructional video; and (3) a post-test quiz. The differentiation, integration, and vector calculus instructional videos were 11, 7, and 18 min in length, respectively. Each instructional video had a menu listing the module chapters, with each chapter focusing on a specific concept, and the menu enabling easy navigation between chapters. The chapters were short videos (4 min or less). For example, within the vector calculus instructional video, one chapter focused on taking the cross product between two vectors. Within each chapter, the concept was first explained (e.g., cross product), then a relevant example was elucidated thoroughly to demonstrate the concept, and finally a more advanced example problem was presented. Embedded questions were placed at the end of the majority of the chapters, and immediate feedback was provided upon answering these questions. Furthermore, students could either use the menu to navigate to a different chapter or they could use arrow-shaped links provided on the screen to continue to the next chapter, replay the previous chapter, or skip to another chapter.
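To make the module layout concrete, the sketch below models the three modules and the chapter-menu navigation described above. The video lengths come from the paper; the chapter lists, names, and function are illustrative assumptions, not the authors' code.

```python
# Each module bundles a pre-quiz, a chaptered instructional video, and a
# post-quiz; the menu allows jumping between chapters, and arrow-shaped
# links support "next" and "replay".
MODULES = {
    "differentiation": {"length_min": 11, "chapters": ["product rule", "power rule", "chain rule"]},
    "integration": {"length_min": 7, "chapters": ["power rule", "quotient rule", "inverse tan and ln"]},
    "vector calculus": {"length_min": 18, "chapters": ["unit vectors", "dot product", "cross product"]},
}

def navigate(module: str, current: str, action: str) -> str:
    """Resolve a navigation action from the current chapter."""
    chapters = MODULES[module]["chapters"]
    i = chapters.index(current)
    if action == "next":
        return chapters[min(i + 1, len(chapters) - 1)]
    if action == "replay":
        return current
    # Otherwise treat the action as a menu jump to a named chapter.
    return action if action in chapters else current

print(navigate("vector calculus", "dot product", "next"))  # cross product
```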
Unlike the intervention group, the control group watched three videos that were not directly related to the intervention content; these videos were solely focused on Statics concepts and problems. The videos covered mathematical concepts as well, as it is necessary to know mathematics in order to solve Statics problems. However, they differed from the ones available to the intervention group in that they were not focused on the mathematical concepts and their applications in Statics, while the intervention group watched videos that were specifically created to focus on particular mathematical concepts. These videos were provided to the control group in an attempt to avoid the Hawthorne effect, whereby subjects modify their behavior based on their knowledge of being observed. Although these videos did not have any embedded questions, the control group participants were required to watch all three videos to be considered as participants in the study. In addition, the control group still received the same information (on mathematical topics) during in-class lectures. The main difference between the control and intervention groups was the delivery method of the supplementary course content.

Measures
Questionnaire: Demographic, Academic, and Social Background Data
All participants completed a questionnaire containing sections for their age, gender, grade point average (GPA; both in high school and in college), social activity background, and awards and accomplishments information. These data were later assessed to draw correlations between the participants' performance and engagement in the online learning environment.

Pre-and Post-Quiz
For each module, all participants completed the same quiz before and after the instructional video exposure. Videos were developed according to the process previously described by Moradi, Liu, and Luchies [51]. Each pre- and post-quiz consisted of five multiple-choice questions. The pre- and post-quizzes contained the same multiple-choice questions, with answers provided in a random order. Two examples of quiz questions are:
• … is the displacement of a particle. Find its velocity. (Differentiation module)

• The acceleration of a car, which was initially at rest, is approximated as a(t) = (3/4)t³. What would be the velocity of the car at t = 2? (Integration module)
Each quiz was designed to assess participants' knowledge of topics related to the corresponding module, and student performance on these quizzes was not included in the data analysis. Furthermore, the quizzes were included in the modules to: (1) provide a pre-study opportunity for the student so he/she would have a better idea about the content of the instructional video before watching it; and (2) help the student to review the material after watching the video. Participants were only provided quiz performance feedback after the post-quiz. Based on their choice of answer, an appropriate form of feedback was provided to them. The feedback either referred to a particular part of the video, or reminded the student of a key point that would lead to the correct answer. For example, if the student answered incorrectly on the first example question about calculating particle velocity using the displacement function, they would be directed to the part of the related instructional video that reviewed the method of differentiation.
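The answer to the integration example above can be checked symbolically: integrating a(t) = (3/4)t³ with the car initially at rest (v(0) = 0) gives v(t) = (3/16)t⁴, so v(2) = 3. The snippet below is a sketch using SymPy and is not part of the study's materials.

```python
import sympy as sp

t = sp.symbols('t')

# Acceleration from the quiz item; the car starts at rest, so v(0) = 0.
a = sp.Rational(3, 4) * t**3

# Velocity is the antiderivative of acceleration; with v(0) = 0 the
# constant of integration vanishes.
v = sp.integrate(a, t)   # 3*t**4/16

print(v.subs(t, 2))      # 3
```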

Pre-and Post-Test
All participants took the pre-test at the beginning of the study and the post-test at the end of the study; both tests were administered in the classroom. Both tests consisted of 10 multiple-choice questions focused on the fundamental concepts: differentiation, integration, and vector calculus. The questions were different between the two tests, but they were related to the same concepts. The questions were designed to assess the students' understanding of the concepts, rather than their ability to perform computations. The participants' scores from both tests were recorded and included in the data analysis.

Survey
After the post-test was complete, each participant filled out a survey. The survey consisted of six statements, and participants were asked to provide their opinions on each statement by choosing a number from 1 to 5 (1 = strongly disagree, 5 = strongly agree). The quality of the modules and students' control over the pace of learning were the fundamental elements of the survey statements.

Data Analysis
A one-sample Kolmogorov–Smirnov (KS) test was employed to check the normality of the data in all statistical analyses. An independent t-test was conducted on the age of participants in both groups to verify that the two groups were similar in age.
After the normality of the data was checked, a t-test on the test scores was performed. Means and standard deviations of pre- and post-tests of both groups were calculated. An independent samples t-test was conducted on the pre-test total scores for the two groups to verify that the two groups were not statistically different before the video exposure. A paired sample t-test was conducted within each group across the pre- and post-test scores to determine the effect of the video exposure. The significance level of p ≤ 0.05 was used for all t-tests.
A normality check of the data revealed that the collected data were not normally distributed for the following cases: (1) participants' age in both groups; and (2) post-test scores of the intervention group. Although normality is one of the assumptions of the t-test, previous investigations have shown that t-test results are reliable in cases with relatively large sample sizes (n ≥ 30) [52][53][54]. A study by Ratcliffe [55] stated that increasing the sample size to greater than 15 makes the normality assumption of the t-test negligible. Lumley, Diehr, Emerson, and Chen [56] also stated that the findings of Ratcliffe [55] were related to one-tailed t-tests, which are more sensitive to the non-normality of data. Hence, since the sample size in the current study was greater than 15 and we report two-tailed t-tests, we concluded that performing a t-test on the data was reasonable.
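The analysis pipeline described above can be sketched with SciPy on simulated scores. The study's actual data are not reproduced here; only the group sizes (n = 25 and n = 22) follow the paper, and all score values below are made up.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical scores out of 10; the shapes match the study's group sizes.
pre_intervention = rng.normal(6.0, 1.5, 25)
post_intervention = pre_intervention + rng.normal(1.3, 0.8, 25)
pre_control = rng.normal(7.0, 1.5, 22)

# One-sample KS normality check (against a standard normal, after scaling).
z = (pre_intervention - pre_intervention.mean()) / pre_intervention.std(ddof=1)
ks_stat, ks_p = stats.kstest(z, "norm")

# Independent-samples t-test: were the groups similar before the videos?
t_ind, p_ind = stats.ttest_ind(pre_intervention, pre_control)

# Paired t-test within the intervention group: effect of video exposure.
t_rel, p_rel = stats.ttest_rel(post_intervention, pre_intervention)

print(f"KS p = {ks_p:.3f}; between-group p = {p_ind:.3f}; within-group p = {p_rel:.3f}")
```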

Results
Based on the results of the t-test, the two groups were similar in terms of the ages of the participants (Table 1). The age distribution of participants in both groups showed that most of the population was 19 years old (Table 2), with 20-year-old participants forming the second largest age group. Performing a KS test, all four sets of data for pre- and post-test scores of participants were verified to have a normal distribution. Students' performance in both groups is presented in Table 3. As the figures demonstrate, only a small number of intervention group participants showed a decrease in test scores, whereas the majority of the control group dropped in their test scores over the two tests. Furthermore, the number of students who performed better, and also the rate of increase, was greater in the intervention group than in the control group. Test results indicate that the control group had a better understanding of the targeted topics than the intervention group by 10% in the average of pre-test scores (Table 4), and this difference was statistically significant (Table 5). Table 6 presents the survey statements and the results of the intervention group survey. Numbers in each column indicate the number of intervention group participants who responded to each statement. Percentages of each Likert item corresponding to a survey statement are also illustrated (Figure 2a–f).

Discussion
As Table 3 demonstrated, the number of students whose scores increased, and also the rate of increase, was higher in the intervention group, which is in accordance with previous research [57]. Furthermore, since the intervention group was exposed to the online instructional modules, it was expected that they would improve their learning, which is consistent with previous research [58]. For the intervention group, the post-test scores were 13% higher than the pre-test scores (Table 4), and t-test analysis showed that the increase in test scores was significant (Table 5). These results suggest that the instructional videos had a significant impact on the intervention group and improved their post-test scores compared to the pre-test; this increase was expected, since the post-test covered the topics addressed in the modules. The control group watched videos that were not directly related to the test, and it was expected that their post-test scores would not change; the t-test result demonstrated that their post-test scores were not significantly different from their pre-test scores (Table 5).
Approximately 92% of the intervention group participants were satisfied with the quality of the developed modules, with quality defined as audio, visual, length, and buffering (Figure 2a). However, one participant was not satisfied with the video quality. The second statement indicates a wider spectrum of opinions. As demonstrated by Figure 2b, only 28% of participants preferred videos over classroom lectures, and 72% were either neutral or against having videos rather than lectures. This relatively low approval rate indicates that although participants were awarded extra credit, they most likely did not have a bias towards the videos. Therefore, it is reasonable to regard this experimental study as unbiased, which is very important when trying to interpret the results and propose new ideas. Only 8% disagreed with the third statement (Figure 2c), which suggests that the majority of the students found the modules helpful. According to Figure 2d, approximately 32% agreed that either the pre- or post-test was unnecessary, and 36% disagreed with the statement. Those who agreed were asked to single out which test they thought was unnecessary, and an equal 50% chose either the pre- or post-test. Although there is disagreement between participants about the necessity of either the pre- or post-test, it is of significant importance to design a pre- and post-assessment process. The pre- and post-test is necessary to evaluate the effectiveness of the implemented teaching method, and including them in future studies is strongly recommended. Also, the remaining 32% were neutral on the topic, which was another indicator that there is not enough evidence to draw any conclusion on excluding the assessment tools from a study framework.

Almost half of the participants felt that they had more control over the course flow and were engaged in the course, as shown in Figure 2e. This percentage is especially important for reviewing course materials or adjusting the pace of the course to one's own preferences, since individualized learning is considered an important factor in students' learning in online environments [59]. Because some students need extra time to fully understand the course materials, the developed modules help them gain control over the speed of their learning. This finding is especially important when considered alongside participants' opinions on the sixth statement, since their feedback on that statement indicates their level of satisfaction with the modules and, possibly, their engagement in the course and self-directed learning [60,61]. Figure 2f shows that 70% of students expressed a positive attitude towards the modules and recommended using such modules in the future. This high recommendation rate is promising for activities and policies that promote the use and development of online instructional modules, and it supports incorporating these modules into lecture-based classrooms [62]. Only about 8% were against using the modules in the future, a considerably smaller share than the 70% who approved of them. Another important indication from the feedback is that, although students did not prefer replacing lectures with videos, they were still satisfied with having access to the videos as an enhancement of the lectures. In other words, students prefer a blended course design in which they have access both to classroom lectures and to supplemental instructional videos. This result is consistent with previous research emphasizing the significance of providing students with videos as a supplemental instructional tool [51]. Furthermore, since these modules were provided as a tool for course preparation, the students' level of satisfaction is consistent with the findings of a previous study [63].

Conclusions
The present study focuses on enhancing the effectiveness of online instructional modules for students' learning. Since the modules cover basic concepts of mathematics and physics, developing and extending their use can reduce the time an instructor needs to spend reviewing these concepts in mechanical engineering mechanics courses. This study shows that students who had access to the modules performed significantly better on the post-test, with their scores improving by 13%.
The students' feedback indicates that they felt more engaged in the course when using the online instructional modules. Participant feedback provides evidence of the advantages of student-centered classrooms and of integrating technology into lecture-based courses. Almost 70% of the intervention group recommended using these modules to cover prerequisite course materials. They also felt that incorporating quizzes into the videos and taking short tests before and after each video helped them understand the course material. As these features add interactivity to a video, the data show that online instructional modules are useful in promoting students' learning and preparedness.
Based on the participant feedback, we suggest that online instructional modules are an effective tool when combined with a lecture-based course. However, further research is needed on instructors' opinions about the use of online instructional modules. Reviewing basic mathematics and physics concepts should remain the core of the developed online instructional modules so that they address specific needs within the targeted engineering course. A main consideration for future studies is to assess the effectiveness of such modules on students' performance in the Statics course by comparing the final grades of the two groups, as well as the modules' effect in downstream courses. Future studies should also run long enough for researchers to evaluate the effectiveness of these resources on students' perceptions not only of their currently enrolled course but also of downstream courses.

Figure 1. Flowchart of the study.

Figure 2. Distribution of survey question feedback in percentages.
1 Value of p indicates no significant difference between the compared data (p > 0.05).

Table 2. Participants' age distribution (in percentages of the population of each group) for the intervention and control groups.

Table 3. Comparison of participants' performance in the pre- and post-test (numbers indicate the percentage of participants in each group).

Table 4. Mean, median, and standard deviation of test scores.
1 Value of p indicates no significant difference between the compared data (p > 0.05); 2 value of p indicates a significant difference between the compared data (p ≤ 0.05).

Table 6 presents the survey statements and the results of the intervention group survey. The numbers in each column indicate how many intervention group participants gave each response to the corresponding statement. The percentages for each Likert item are also illustrated (Figure 2a-f).
