Article

Enhancing Teaching-Learning Effectiveness by Creating Online Interactive Instructional Modules for Fundamental Concepts of Physics and Mathematics

by Moein Moradi, Lin Liu, Carl Luchies, Meagan M. Patterson and Behnaz Darban

1 Department of Mechanical Engineering, University of Kansas, Lawrence, KS 66045, USA
2 Department of Educational Psychology, University of Kansas, Lawrence, KS 66045, USA
3 Department of Curriculum and Teaching, University of Kansas, Lawrence, KS 66045, USA
* Author to whom correspondence should be addressed.
Educ. Sci. 2018, 8(3), 109; https://doi.org/10.3390/educsci8030109
Submission received: 3 July 2018 / Revised: 23 July 2018 / Accepted: 30 July 2018 / Published: 2 August 2018

Abstract: This study explored the effectiveness of online instructional modules for providing supplementary instruction in basic mathematics and physics concepts. The modules were developed in accordance with a cognitive apprenticeship model. Participants (N = 47) were students enrolled in a required Statics course at a midwestern university and were randomly assigned to either an intervention or a control group. The results show that the intervention group performed significantly better on the post-test after using the online instructional modules, while the control group showed no significant improvement. Survey responses indicated that students felt engaged with the instructional materials and experienced self-paced learning, reporting that the modules gave them control over the course materials. Survey results were also indicative of students’ approval of using the modules as a supplement to classroom lectures.

1. Introduction

Engineering instructors face a variety of challenges in teaching introductory-level courses. One major challenge is that students frequently lack precursor knowledge of key mathematics and science concepts [1]. Students are generally required to take prerequisite courses in mathematics, physics, and chemistry before beginning their engineering coursework, but retention of knowledge from these prerequisite courses is often low. In order to develop expertise in the engineering field, learners must acquire a deep and well-organized understanding of the domain. This includes knowledge of key concepts (e.g., force, mass) and general principles (e.g., Newton’s laws of motion), and awareness of the contextual conditions under which the knowledge is useful [2]. Connecting new ideas and concepts to previous knowledge and experience is a key element of developing this expertise [2].
The current study examines the effectiveness of an intervention focused on supplementary online video instruction in an undergraduate statics course. Our goal was to focus on modules for fundamental concepts of physics and mathematics that are utilized in teaching the engineering mechanics courses in Statics and Dynamics. To determine the relevant fundamental concepts, we surveyed the faculty members across the School of Engineering who teach engineering mechanics courses. Based on faculty feedback, the supplementary videos focused on three key concepts for which students often lack fundamental understanding: differentiation (including the product rule, the power rule, and the chain rule), integration (including the power rule, the quotient rule, and integrals of inverse tangents and natural logarithms), and vector calculus (including unit vectors, dot products, and cross products). The intervention is structured around several key educational principles: ensuring that relevant prior knowledge is accurate and active, integrating testing into study sessions, and allowing for individualized, self-paced learning.

1.1. Prior Knowledge

Instructional strategies that promote connections with prior knowledge can promote fuller understanding and deeper learning of new material (e.g., [3,4]). For example, elaborative interrogation, which activates prior knowledge and encourages students to connect new information with what they already know, can facilitate learning [5,6]. However, in order to promote learning, prior knowledge must be appropriate, accurate, and sufficient for the task at hand [7].
Students arrive at each course with varying levels of prior knowledge; how to address these differences in prior knowledge is a common challenge for instructors [7]. One approach is to provide remedial instruction for students who lack understanding of key prior concepts. However, it is often not feasible or desirable to provide such instruction during regular class meetings, because there is not sufficient time to cover the material, or because students who already understand the material will become bored or disengaged. Thus, the ability to provide out-of-class supplementary instruction to ensure accurate and active prior knowledge is beneficial for both instructors and students.
In the current study, the relevant prior knowledge consisted of fundamental concepts of physics and mathematics that are central to second-year engineering mechanics courses such as Statics and Dynamics: differentiation, integration, and vector calculus. Differentiation and integration are recognized as the two most central mathematical concepts in almost all engineering disciplines. Vector calculus was also selected because the primary audience for the modules was mechanical engineering students, and because the Dynamics course is mainly taught using a vector analysis approach.

1.2. Benefits of Online Supplementary Instruction

Currently, instructional videos are widely used in both campus-based and online courses [8,9,10,11,12,13]. The use of such videos in combination with face-to-face class instruction may be especially beneficial; several meta-analyses have found that hybrid or blended courses, which include both face-to-face and online instruction, are more effective than either traditional in-person courses or fully online courses [14,15].
Most recent online courses (and related research) adopt a student-centered approach [16,17,18,19,20]. Following this approach, online instructional materials (e.g., videos) are made as interactive as possible so that the learner controls the flow of the course [21,22,23].

1.2.1. Individualized, Self-Paced Instruction

Students benefit from individual instruction, but most classes do not allow for it [24,25] and students frequently do not take advantage of opportunities for in-person individual instruction, such as instructor office hours [26,27]. One common reason that students give for their lack of attendance at office hours is that the time and/or location of office hours was not convenient; Griffin et al. [27] found that convenience had a greater impact on student office hour attendance than a number of other factors, such as whether the instructor was perceived as approachable and whether material was explained clearly during class. Online tutoring sessions allow students to access material at a time and place that is convenient for them, and thus this may increase engagement with these sessions.
One of the main factors contributing to learning effectiveness in online courses, compared to traditional classroom-based courses, is the increased control that learners have over the course flow and content [28,29,30,31]. Contrary to popular belief, the major motivation for enrollment in distance education is not physical access, but rather, temporal freedom to move through a course of studies at a pace of the student’s choice [32,33,34,35,36].

1.2.2. Presence in Online Instruction

Effective online teaching involves designing, facilitating, and directing cognitive and social processes for the purpose of promoting personally and educationally meaningful learning outcomes [37]. A sense of presence contributes to the effectiveness of online learning activities.
There are three types of presence in online teaching and learning environments: social, cognitive, and teaching presence. Social presence has been defined as “the degree to which a person is perceived as a real person in mediated communication” [38] (p. 151). Research on the role of social presence has found that interaction among participants is critical to learning and cognitive development [39], and students with high overall perceptions of social presence performed better in terms of perceived learning [39,40,41]. In the current study, social presence was promoted in part by introducing the instructor within the modules. We encouraged students to share their backgrounds with the instructor to help them build a connection with the instructor. Furthermore, all participants understood that their opinions on the modules were considered valuable, to promote a feeling of relevance to the instructor and the course.
Cognitive presence was defined by Garrison, Anderson, and Archer [42] (p. 5) as “the extent to which learners are able to construct and confirm meaning through sustained reflection and discourse in a critical community of inquiry”. Cognitive presence has been identified as the most challenging type of presence to create in online learning environments [43]. To foster cognitive presence in the current study, we provided prompt feedback to students on their performance in the modules, quizzes, and video-embedded questions. The instructional videos also presented innovative problem-solving techniques to the students.
Teaching presence was defined by Anderson, Liam, Garrison, and Archer [44] (p. 5) as the “design, facilitation, and direction of cognitive and social processes for the purpose of realizing personally meaningful and educationally worthwhile learning outcomes”. Our modules promoted teaching presence by encouraging independent study among learners. Additionally, the modules were constructed such that student empowerment was promoted. Students had full control over the pace of progression through the video modules, and therefore, they were given the freedom to address their unique needs based on their own aspirations and goals. Information about technology requirements and available resources for technical help were provided to the students to facilitate the use of the modules. Furthermore, the instructions and rules of the course were outlined at the beginning of the study.

1.2.3. Cognitive Apprenticeship

Cognitive apprenticeship is an instructional model in which expert approaches to thinking and problem solving are made visible [45]. In cognitive apprenticeship approaches to instruction, an expert works through a problem while verbally describing and explaining the underlying cognitive processes in which the expert is engaged. Thus, the learner is able to see correct problem solving being modeled, while also understanding the decision making and problem solving processes that inform the chosen behaviors or strategies [45,46].
To make our videos consistent with this model, the verbal explanation of the problem-solving procedure was synchronized precisely with the written work shown in the video. For instance, in covering the product rule in the “Differentiation” module, the video first explained the importance of the topic by providing examples of Statics-related problems. The basic concept of the rule, along with its implementation, was then covered thoroughly in a step-by-step manner, so that students could follow the solution strategy and steps as they were explained and written down on paper. In addition, a number of examples (related to both Statics and mathematics) were solved by following the step-by-step procedure explained prior to each example solution. Furthermore, at the end of each video segment, the already-explained concept was referenced in solving a problem. Each segment concluded with a video-embedded question focused on the product rule in differentiation. This was done specifically to help students perceive the application of the taught concept in solving a problem, and to emphasize the importance of learning the fundamental concepts of mathematics.
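As an illustration of the kind of step-by-step derivation narrated in the module (a generic example, not one of the actual in-video problems), a product-rule derivation might be written out as:

$$\frac{d}{dt}\left[t^{2}\sin t\right] = \frac{d}{dt}\left[t^{2}\right]\sin t + t^{2}\,\frac{d}{dt}\left[\sin t\right] = 2t\sin t + t^{2}\cos t.$$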

1.3. Integrating Self-Testing into Study Sessions

One benefit of online supplementary instruction is the ability to provide prompt feedback about the accuracy of a student’s calculations and conclusions. As noted by Anderson [47], to provide motivation and to shape behavior and mental constructs most effectively, detailed feedback should be provided as quickly as possible following the performance of the assessed behavior. For this reason, machine evaluations, such as those provided in online multiple-choice test questions or in simulations, can be very effective learning devices.
A variety of studies have found that integrating self-testing into study sessions promotes better memory and deeper learning. For example, Roediger and Karpicke [48] found that dividing study time between material review and self-testing was more effective than devoting study time entirely to material review. Subsequent research indicates that recall testing of previously learned information can positively impact not only memory of previously learned information, but also learning of new information [49], and that integrated testing can improve attention to and learning from online lectures [50].
The presence of integrated quizzes also allows for immediate feedback. Prompt feedback helps students to build awareness of what they do and do not know and can prevent the persistence of misconceptions [7]. The instructional videos used in the current intervention included integrated questions that were posed to the viewers at the end of each chapter of the video. After viewing the chapter, students were required to answer the question in order to proceed to the next chapter. The questions were mostly focused on common mistakes that students would make in applying the concept of that chapter. Immediate feedback was provided to the students, which showed whether their answer was wrong or right. Then students were provided with the choice of continuing to the next chapter or replaying the chapter that they had just watched.
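A minimal sketch of the chapter-gating logic described above, written as a console interaction; the actual modules were delivered as interactive videos, and the data structure, function, and question text here are hypothetical illustrations:

```python
# Sketch of chapter gating with embedded questions, immediate feedback,
# and a replay option. Names and content are illustrative, not the study's.

chapters = [
    {"title": "Product rule",
     "question": "d/dx [x * sin(x)] = ?",
     "choices": ["sin(x) + x*cos(x)", "cos(x)", "x*cos(x)"],
     "answer": 0,
     "feedback": "Differentiate each factor in turn and sum the two terms."},
    # ... further chapters (power rule, chain rule) would follow here
]

def play_chapter(chapter):
    """Present a chapter's embedded question and give immediate feedback."""
    print(f"--- {chapter['title']} ---")
    print(chapter["question"])
    for i, choice in enumerate(chapter["choices"]):
        print(f"  [{i}] {choice}")
    picked = int(input("Your answer: "))
    correct = (picked == chapter["answer"])
    print("Correct!" if correct else f"Incorrect. Hint: {chapter['feedback']}")
    return correct

for chapter in chapters:
    # The student must answer before proceeding to the next chapter,
    # then may choose to replay the chapter just watched.
    play_chapter(chapter)
    if input("Replay this chapter? (y/n): ").lower() == "y":
        play_chapter(chapter)
```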

2. Methods

2.1. Procedure

Participation in the study was voluntary. Informed consent was obtained from each participant included in the study. Extra credit was provided as the benefit of participation to those who completed the study. Figure 1 summarizes the overall procedure in this study.
Each stage of the study is briefly described as follows:
  • The consent form was distributed to all students in the course (in the classroom);
  • The students who chose to participate in the study returned the completed consent form and were randomly assigned to either the intervention or control group;
  • A questionnaire was provided and filled out by the study participants (in the classroom);
  • All participants took the pre-test (in the classroom);
  • Links to all the videos and related assignments were provided to the participants (intervention and control groups) and they were required to complete all parts of these online instructional modules within a specific 10-day period. All quizzes and videos were provided on the institution’s Learning Management System (Blackboard) website (outside the classroom);
  • All participants took the post-test (in the classroom);
  • A survey form was provided and completed by all participants.

2.2. Participants

Students enrolled in the required Statics course in the fall 2016 semester were invited to participate in this study. Forty-seven students completed the study, with 25 in the intervention group and 22 in the control group. Group assignment was random, and participants did not know whether they were in the intervention or control group, as a single-blind experimental design was used. In the intervention group, 80% of participants were male and 20% female; in the control group, 100% were male.

2.3. Modules

Based on the feedback from faculty members across the School of Engineering who teach engineering mechanics courses, we focused on three fundamental concepts: differentiation, integration, and vector calculus, and we established the most important elements of each concept. We therefore developed three modules, one for each fundamental concept. Each module consisted of: (1) a pre-quiz; (2) an instructional video; and (3) a post-quiz. The differentiation, integration, and vector calculus instructional videos were 11, 7, and 18 min in length, respectively. Each instructional video had a menu listing the module chapters, with each chapter focusing on a specific concept; the menu enabled easy navigation between chapters. The chapters were short videos (4 min or less). For example, within the vector calculus instructional video, one chapter focused on taking the cross product of two vectors. Within each chapter, the concept was first explained (e.g., the cross product), a relevant example was then worked through thoroughly to demonstrate the concept, and finally a more advanced example problem was presented. Embedded questions were placed at the end of the majority of the chapters, and immediate feedback was provided upon answering these questions. Furthermore, students could either use the menu to navigate to a different chapter or use arrow-shaped links on the screen to continue to the next chapter, replay the previous chapter, or skip to another chapter.
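As an illustration of that chapter’s content (the standard determinant expansion for generic vectors, not one of the in-video examples), the cross product can be written as:

$$\vec{A}\times\vec{B} = \begin{vmatrix} \hat{i} & \hat{j} & \hat{k} \\ A_x & A_y & A_z \\ B_x & B_y & B_z \end{vmatrix} = (A_yB_z - A_zB_y)\,\hat{i} - (A_xB_z - A_zB_x)\,\hat{j} + (A_xB_y - A_yB_x)\,\hat{k}.$$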
Unlike the intervention group, the control group watched three videos that were not directly related to the targeted concepts of the study. These videos focused on general Statics concepts and problems; they touched on mathematical concepts only incidentally, as mathematics is necessary for solving Statics problems. They thus differed from the intervention videos, which were created specifically to focus on particular mathematical concepts and their applications in Statics. The control videos were provided in an attempt to avoid the Hawthorne effect, the tendency of subjects to modify their behavior when they know they are being observed. Although these videos did not contain embedded questions, control group participants were required to watch all three videos to be counted as study participants. In addition, the control group still received the same information on the mathematical topics during in-class lectures. The main difference between the control and intervention groups was therefore the delivery method of the supplementary course content.

2.4. Measures

2.4.1. Questionnaire: Demographic, Academic, and Social Background Data

All participants completed a questionnaire containing sections for their age, gender, grade point average (GPA; both in high school and in college), social activity background, and awards and accomplishments information. These data were later assessed to draw correlations between the participants’ performance and engagement in the online learning environment.

2.4.2. Pre- and Post-Quiz

For each module, all participants completed the same quiz before and after viewing the instructional video. Videos were developed according to the process previously described by Moradi, Liu, and Luchies [51]. Each pre- and post-quiz consisted of five multiple-choice questions. The pre- and post-quizzes contained the same questions, with answer choices presented in random order. Two examples of quiz questions, with worked solutions sketched after the list, are:
  • $x(t) = e^{t^{2}}$ is the displacement of a particle. Find its velocity. (Differentiation module)
  • The acceleration of a car, which was initially at rest, is approximated as $a(t) = \frac{3}{4}t^{3}$. What would be the velocity of the car at $t = 2$? (Integration module)
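For reference, the worked solutions to these two sample questions (the multiple-choice answer options are not reproduced here) are:

$$v(t) = \frac{dx}{dt} = \frac{d}{dt}\,e^{t^{2}} = 2t\,e^{t^{2}}, \qquad v(2) = \int_{0}^{2} \frac{3}{4}\,t^{3}\,dt = \left[\frac{3}{16}\,t^{4}\right]_{0}^{2} = 3.$$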
Each quiz was designed to assess participants’ knowledge of topics related to the corresponding module; student performance on these quizzes was not included in the data analysis. The quizzes were included in the modules to: (1) provide a pre-study opportunity, so the student would have a better idea of the content of the instructional video before watching it; and (2) help the student review the material after watching the video. Participants were provided quiz performance feedback only after the post-quiz. Based on their choice of answer, an appropriate form of feedback was provided: the feedback either referred to a particular part of the video or reminded the student of a key point that would lead to the correct answer. For example, a student who answered the first example question (calculating particle velocity from the displacement function) incorrectly would be directed to the part of the related instructional video that reviewed the method of differentiation.

2.4.3. Pre- and Post-Test

All participants took the pre-test at the beginning of the study and the post-test at the end; both tests were administered in the classroom. Each test consisted of 10 multiple-choice questions on the fundamental concepts: differentiation, integration, and vector calculus. The questions differed between the two tests but addressed the same concepts, and they were designed to assess students’ understanding of the concepts rather than their ability to perform computations. Participants’ scores from both tests were recorded and included in the data analysis.

2.4.4. Survey

After the post-test was complete, each participant filled out a survey. The survey consisted of six statements, and participants rated each statement on a scale from 1 to 5 (1 = strongly disagree, 5 = strongly agree). The survey statements focused primarily on the quality of the modules and on students’ control over the pace of learning.
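A minimal sketch of how summary statistics of the kind reported in Table 6 (mean, standard deviation, mode) can be computed from raw Likert ratings; the ratings below are illustrative placeholders, not the study data:

```python
import statistics

# Illustrative Likert ratings (1-5) for one survey statement;
# placeholder values, not the actual study responses.
responses = [4, 5, 4, 3, 4, 2, 5, 4, 3, 4]

mean = statistics.mean(responses)
std = statistics.stdev(responses)   # sample standard deviation
mode = statistics.mode(responses)   # most frequent rating

print(f"Mean: {mean:.2f}, STD: {std:.2f}, Mode: {mode}")
```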

2.5. Data Analysis

One-sample Kolmogorov-Smirnov (KS) tests were employed to check the normality of the data in all statistical analyses. An independent t-test was conducted on the ages of participants in the two groups to verify that the groups were similar in age.
After the normality of the data was checked, a t-test on the test scores was performed. Means and standard deviations of pre- and post-tests of both groups were calculated. An independent samples t-test was conducted on the pre-test total scores for the two groups to verify that the two groups were not statistically different before the video exposure. A paired sample t-test was conducted within each group across the pre- and post-test scores to determine the effect of the video exposure. The significance level of p ≤ 0.05 was used for all t-tests.
The normality check revealed that the data were not normally distributed in two cases: (1) participants’ ages in both groups; and (2) post-test scores of the intervention group. Although normally distributed data is one of the assumptions of the t-test, previous investigations have shown that t-test results remain reliable with relatively large sample sizes (n ≥ 30) even when this assumption is violated [52,53,54]. Ratcliffe [55] found that increasing the sample size beyond 15 makes the effect of non-normality on the t-test negligible. Lumley, Diehr, Emerson, and Chen [56] noted that Ratcliffe’s [55] findings concerned one-tailed t-tests, which are more sensitive to non-normality. Hence, since the sample size in the current study was greater than 15 and we report two-tailed t-tests, we concluded that performing t-tests on the data was reasonable.
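A sketch of this analysis pipeline in Python with SciPy; as noted in the comments, the score arrays are synthetic placeholders (drawn to roughly match the Table 4 summary statistics), not the study data:

```python
import numpy as np
from scipy import stats

# Synthetic placeholder scores; the actual study data are not reproduced here.
rng = np.random.default_rng(0)
pre_intervention = rng.normal(6.68, 1.80, 25)
post_intervention = rng.normal(7.52, 1.68, 25)
pre_control = rng.normal(7.59, 1.71, 22)

# 1. Normality check: one-sample KS test of the scores against a normal
#    distribution parameterized by the sample mean and standard deviation.
ks_stat, ks_p = stats.kstest(
    pre_intervention, "norm",
    args=(pre_intervention.mean(), pre_intervention.std(ddof=1)),
)

# 2. Independent-samples t-test: pre-test scores, intervention vs. control,
#    to verify the groups did not differ before the video exposure.
t_ind, p_ind = stats.ttest_ind(pre_intervention, pre_control)

# 3. Paired-samples t-test: pre vs. post within the intervention group,
#    to estimate the effect of the video exposure (two-tailed by default).
t_rel, p_rel = stats.ttest_rel(pre_intervention, post_intervention)

print(f"KS normality: p = {ks_p:.3f}")
print(f"Independent t-test (pre-test, two groups): t = {t_ind:.2f}, p = {p_ind:.3f}")
print(f"Paired t-test (intervention pre vs. post): t = {t_rel:.2f}, p = {p_rel:.3f}")
```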

3. Results

Based on the t-test results, the two groups were similar in terms of participant age (Table 1).
The age distribution of participants in both groups showed that most participants were 19 years old (Table 2), with 20-year-olds forming the second largest age group.
A KS test verified that all four sets of pre- and post-test scores were normally distributed. Students’ performance in both groups is presented in Table 3. As the table shows, only a small proportion of the intervention group saw a decrease in test scores, whereas the majority of the control group scored lower on the post-test than on the pre-test. Furthermore, both the proportion of students whose scores increased and the size of the increase were greater in the intervention group than in the control group.
Pre-test results indicate that the control group began with a better understanding of the targeted topics than the intervention group, with an average pre-test score about 10% higher (Table 4); this difference was statistically significant (Table 5).
Table 6 presents the survey statements and the intervention group’s responses (mean, standard deviation, and mode for each statement). The percentage distribution of Likert responses for each survey statement is illustrated in Figure 2a–f.

4. Discussion

As Table 3 demonstrated, both the proportion of students whose scores increased and the magnitude of that increase were higher in the intervention group, which is in accordance with previous research [57]. Furthermore, since the intervention group was exposed to the online instructional modules, improved learning was expected, consistent with previous research [58]. For the intervention group, the post-test score was 13% higher than the pre-test score (Table 4), and the t-test analysis showed that this increase was significant (Table 5). This improvement was expected, since the post-test covered the topics addressed in the modules, and it suggests that the instructional videos had a significant impact on the intervention group. The control group watched videos that were not directly related to the test, and their scores were expected not to change; the t-test results confirmed that their post-test scores were not significantly different from their pre-test scores (Table 5).
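For reference, the 13% figure follows from the Table 4 means for the intervention group:

$$\frac{7.52 - 6.68}{6.68} \times 100\% \approx 12.6\% \approx 13\%.$$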
Approximately 92% of the intervention group participants were satisfied with the quality of the developed modules, with quality defined in terms of audio, visual, length, and buffering (Figure 2a); only one participant was not satisfied with the video quality. The second statement produced a wider spectrum of opinions. As Figure 2b shows, only 28% of participants preferred videos over classroom lectures, while 72% were neutral or opposed to having videos rather than lectures. This relatively low approval rate suggests that, although participants were awarded extra credit, they were unlikely to be biased in favor of the videos, which supports an unbiased interpretation of the results. Only 8% disagreed with the third statement (Figure 2c), suggesting that the majority of students found the modules helpful. According to Figure 2d, approximately 32% agreed that either the pre- or post-quiz was unnecessary, and 36% disagreed with the statement. Those who agreed were asked to single out which quiz they thought was unnecessary, and they split evenly (50% each) between the pre- and post-quiz. Although participants disagreed about the necessity of the pre- and post-assessments, such assessments are essential for evaluating the effectiveness of the implemented teaching method, and including them in future studies is strongly recommended. The remaining 32% were neutral on the topic, a further indication that there is not enough evidence to support excluding these assessment tools from a study framework.
Almost half of the participants felt that they had more control over the course flow and were engaged in the course (Figure 2e). This finding is especially important for reviewing course materials and adjusting course progress based on one’s own preferences, since individualized learning is considered an important factor in students’ learning in online environments [59]. Because some students need extra time to fully understand course materials, using the developed modules helps them gain control over the speed of their learning. This finding is especially meaningful alongside opinions on the sixth statement, where participants’ feedback indicates their level of satisfaction with the modules and, potentially, their engagement in the course and self-directed learning [60,61]. Figure 2f shows that 70% of students expressed a positive attitude towards the modules and recommended using such modules in the future. This high recommendation rate is promising for initiatives and policies that promote the use and development of online instructional modules, and it supports incorporating these modules into lecture-based classrooms [62]. Only about 8% were against using the modules in the future, a considerably smaller percentage than the 70% approval. Another important implication of the feedback is that although students did not prefer replacing lectures with videos, they were still satisfied to have access to the videos as an enhancement of the lectures. In other words, students prefer a blended course design in which they have access to both classroom lectures and supplemental instructional videos. This result is consistent with previous research emphasizing the value of providing students with videos as a supplemental instructional tool [51]. Furthermore, since these modules were provided as a tool for course preparation, the students’ level of satisfaction is consistent with the findings of a previous study [63].

5. Conclusions

The present study examined the effectiveness of online instructional modules for student learning. Since the modules covered basic concepts of mathematics and physics, developing and extending their use can reduce the time an instructor needs to spend reviewing these concepts within engineering mechanics courses. This study shows that students who had access to the modules performed significantly better on their post-test, with scores improving by 13%.
The students’ feedback indicates that they felt more engaged in the course when using the online instructional modules. Participant feedback provides evidence of the advantages of student-centered classrooms and of integrating technology into lecture-based courses. Almost 70% of the intervention group recommended using these modules to cover prerequisite course materials. Participants also felt that incorporating quizzes in the videos and having short tests before and after the videos helped them understand the course material. As these features add interactivity to a video, these data suggest that online instructional modules are useful in promoting students’ learning and preparedness.
Based on the participant feedback, we suggest that online instructional modules are an effective tool when combined with a lecture-based course. However, further research is needed on instructors’ opinions of online instructional modules. Reviewing basic mathematics and physics concepts should remain the core of such modules so that they address specific needs within the targeted engineering course. Future studies should also assess the effectiveness of such modules on students’ performance within the Statics course by comparing the final grades of the two groups, as well as the effect of the modules on downstream courses. Similar future studies should be long enough to enable researchers to evaluate the effectiveness of these resources on students’ perceptions of not only their currently enrolled course but also downstream courses. Nevertheless, the current study provides evidence that targeted online instructional modules are a strong tool for enhancing engineering students’ understanding of mathematics and physics fundamentals.

Author Contributions

L.L. and C.L. applied for and acquired the funding to hire and support M.M. to carry out the proposed research. In addition, M.P. contributed significantly to literature review, data analysis, and manuscript writing. B.D. helped with manuscript modification.

Funding

This research was funded by the National Science Foundation under Grant Number DUE1525775 and by the University of Kansas.

Acknowledgments

This work was supported by the National Science Foundation under Grant Number DUE1525775, the University of Kansas School of Engineering Initiative of “Engaging Minds, Amplifying Learning”, the University of Kansas Libraries Open Education Resources Grant Initiative, and the University of Kansas Mechanical Engineering Department. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. The authors appreciate the support from their sponsors.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Krause, S.; Decker, J.C.; Griffin, R. Using a materials concept inventory to assess conceptual gain in introductory materials engineering courses. In Proceedings of the 33rd Annual Frontiers in Education (FIE 2003), Westminster, CO, USA, 5–8 November 2003; p. T3D-7.
  2. Litzinger, T.; Lattuca, L.R.; Hadgraft, R.; Newstetter, W. Engineering education and the development of expertise. J. Eng. Educ. 2011, 100, 123–150.
  3. Cook, M.P. Visual representations in science education: The influence of prior knowledge and cognitive load theory on instructional design principles. Sci. Educ. 2006, 90, 1073–1091.
  4. Garfield, J.; del Mas, R.; Chance, B. Using students’ informal notions of variability to develop an understanding of formal measures of variability. In Thinking with Data; Psychology Press: New York, NY, USA, 2007; pp. 117–147.
  5. Martin, V.L.; Pressley, M. Elaborative-interrogation effects depend on the nature of the question. J. Educ. Psychol. 1991, 83, 113–119.
  6. Woloshyn, V.E.; Paivio, A.; Pressley, M. Use of elaborative interrogation to help students acquire information consistent with prior knowledge and information inconsistent with prior knowledge. J. Educ. Psychol. 1994, 86, 79–89.
  7. Ambrose, S.A.; Bridges, M.W.; Lovett, M.C.; DiPietro, M.; Norman, M.K. How Learning Works: 7 Research-Based Principles for Smart Teaching; John Wiley & Sons: Hoboken, NJ, USA, 2010.
  8. Bos, N.; Groeneveld, C.; Bruggen, J.; Brand-Gruwel, S. The use of recorded lectures in education and the impact on lecture attendance and exam performance. Br. J. Educ. Technol. 2016, 47, 906–917.
  9. Giannakos, M.N.; Jaccheri, L.; Krogstie, J. Exploring the relationship between video lecture usage patterns and students’ attitudes. Br. J. Educ. Technol. 2016, 47, 1259–1275.
  10. Heitink, M.; Voogt, J.; Verplanken, L.; van Braak, J.; Fisser, P. Teachers’ professional reasoning about their pedagogical use of technology. Comput. Educ. 2016, 101, 70–83.
  11. Amador, J.M. Video simulations to develop preservice mathematics teachers’ discourse practices. Technol. Pedag. Educ. 2017, 1–14.
  12. Major, L.; Watson, S. Using video to support in-service teacher professional development: The state of the field, limitations and possibilities. Technol. Pedag. Educ. 2017, 1–20.
  13. Admiraal, W. Meaningful learning from practice: Web-based video in professional preparation programmes in university. Technol. Pedag. Educ. 2014, 23, 491–506.
  14. Means, B.; Toyama, Y.; Murphy, R.; Bakia, M.; Jones, K. Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies; US Department of Education: Washington, DC, USA, 2009.
  15. Todd, E.M.; Watts, L.L.; Mulhearn, T.J.; Torrence, B.S.; Turner, M.R.; Connelly, S.; Mumford, M.D. A meta-analytic comparison of face-to-face and online delivery in ethics instruction: The case for a hybrid approach. Sci. Eng. Ethics 2017, 23, 1719–1754.
  16. Fraser, B. Classroom learning environments. In Encyclopedia of Science Education; Springer: Berlin, Germany, 2015; pp. 154–157.
  17. Papastergiou, M.; Antoniou, P.; Apostolou, M. Effects of student participation in an online learning community on environmental education: A Greek case study. Technol. Pedag. Educ. 2011, 20, 127–142.
  18. Park, J.Y. Student interactivity and teacher participation: An application of legitimate peripheral participation in higher education online learning environments. Technol. Pedag. Educ. 2015, 24, 389–406.
  19. Admiraal, W.; van Vugt, F.; Kranenburg, F.; Koster, B.; Smit, B.; Weijers, S.; Lockhorst, D. Preparing pre-service teachers to integrate technology into K–12 instruction: Evaluation of a technology-infused approach. Technol. Pedag. Educ. 2017, 26, 105–120.
  20. Schick, J.A.; de Felix, J.W. Using Technology to Help Teachers Meet the Needs of Language-minority Students in the USA. J. Inf. Technol. Teach. Educ. 1992, 1, 159–171.
  21. Allen, L.K.; Eagleson, R.; de Ribaupierre, S. Evaluation of an online three-dimensional interactive resource for undergraduate neuroanatomy education. Anat. Sci. Educ. 2016, 9, 431–439.
  22. Lee, K.T. Teachers using an integrated learning environment to cater for individual learning differences in Hong Kong primary classrooms. Technol. Pedag. Educ. 2005, 14, 371–389.
  23. Steils, N.; Tombs, G.; Mawer, M.; Savin-Baden, M.; Wimpenny, K. Implementing the liquid curriculum: The impact of virtual world learning on higher education. Technol. Pedag. Educ. 2015, 24, 155–170.
  24. Chung, G.K.; Delacruz, G.C.; Dionne, G.B.; Baker, E.L.; Lee, J.J.; Osmundson, E. Towards Individualized Instruction with Technology-Enabled Tools and Methods: An Exploratory Study. CRESST Report 854; National Center for Research on Evaluation, Standards, and Student Testing (CRESST): Los Angeles, CA, USA, 2016.
  25. Moores, D.F. One size does not fit all: Individualized instruction in a standardized educational system. Am. Ann. Deaf 2013, 158, 98–103.
  26. Cox, B.E.; McIntosh, K.L.; Terenzini, P.T.; Reason, R.D.; Quaye, B.R.L. Pedagogical signals of faculty approachability: Factors shaping faculty–student interaction outside the classroom. Res. High. Educ. 2010, 51, 767–788.
  27. Griffin, W.; Cohen, S.D.; Berndtson, R.; Burson, K.M.; Camper, K.M.; Chen, Y.; Smith, M.A. Starting the conversation: An exploratory study of factors that influence student office hour use. Coll. Teach. 2014, 62, 94–99.
  28. Adamopoulos, P. What Makes a Great MOOC? An Interdisciplinary Analysis of Student Retention in Online Courses. In Reshaping Society Through Information Systems Design, Proceedings of the 34th International Conference on Information Systems, Università Bocconi, Milan, Italy, 15–18 December 2013; Association for Information Systems, AIS Electronic Library (AISeL): Atlanta, GA, USA, 2013.
  29. Hew, K.F.; Cheung, W.S. Students’ and instructors’ use of massive open online courses (MOOCs): Motivations and challenges. Educ. Res. Rev. 2014, 12, 45–58.
  30. Lewis, S.; Whiteside, A.; Dikkers, A.G. Autonomy and responsibility: Online learning as a solution for at-risk high school students. Int. J. E-Learn. Distance Educ. 2014, 29, 1–11.
  31. Tseng, J.-L.; Pai, C. Analyzing the Learning Modes of Learners using Time-Management Modules in Self-Paced Learning. GSTF J. Comput. (JoC) 2014, 2, 53–58.
  32. Simonson, M.; Smaldino, S.; Zvacek, S.M. Teaching and Learning at a Distance: Foundations of Distance Education, 6th ed.; Information Age Publishing: Charlotte, NC, USA, 2014.
  33. Terras, M.M.; Ramsay, J. Massive open online courses (MOOCs): Insights and challenges from a psychological perspective. Br. J. Educ. Technol. 2015, 46, 472–487.
  34. Naidu, S. Openness and Flexibility Are the Norm, But What Are the Challenges; Taylor & Francis: Abingdon, UK, 2017.
  35. Anderson, T. Towards a theory of online learning. In Theory and Practice of Online Learning; Anderson, T., Ed.; Athabasca University Press: Athabasca, AB, Canada, 2004; pp. 109–119.
  36. Martín-Rodríguez, Ó.; Fernández-Molina, J.C.; Montero-Alonso, M.Á.; González-Gómez, F. The main components of satisfaction with e-learning. Technol. Pedag. Educ. 2015, 24, 267–277.
  37. Laat, M.D.; Lally, V.; Lipponen, L.; Simons, R.-J. Online teaching in networked learning communities: A multi-method approach to studying the role of the teacher. Instr. Sci. 2006, 35, 257–286.
  38. Gunawardena, C.N. Social presence theory and implications for interaction and collaborative learning in computer conferences. Int. J. Educ. Telecommun. 1995, 1, 147–166.
  39. Richardson, J.; Swan, K. Examining Social Presence in Online Courses in Relation to Students’ Perceived Learning and Satisfaction; J. Asynchronous Learning Networks (fmr), Online Learning Consortium (current): Newburyport, MA, USA, 2003.
  40. Zhan, Z.; Hu, M. Academic self-concept and social presence in face-to-face and online learning: Perceptions and effects on students’ learning achievement and satisfaction across environments. Comput. Educ. 2013, 69, 131–138.
  41. Zhang, D.J.; Allon, G.; Van Mieghem, J.A. Does Social Interaction Improve Learning Outcomes? Evidence from Field Experiments on Massive Open Online Courses. Manuf. Serv. Oper. Manag. 2017, 19, 347–367.
  42. Garrison, D.R.; Anderson, T.; Archer, W. Critical thinking, cognitive presence, and computer conferencing in distance education. Am. J. Distance Educ. 2001, 15, 7–23.
  43. Garrison, D.R.; Arbaugh, J.B. Researching the community of inquiry framework: Review, issues, and future directions. Internet High. Educ. 2007, 10, 157–172.
  44. Anderson, T.; Liam, R.; Garrison, D.R.; Archer, W. Assessing Teaching Presence in a Computer Conferencing Context; J. Asynchronous Learning Networks (fmr), Online Learning Consortium (current): Newburyport, MA, USA, 2001.
  45. Collins, A.; Brown, J.S.; Holum, A. Cognitive apprenticeship: Making thinking visible. Am. Educ. 1991, 15, 6–11.
  46. Dennen, V.P.; Burner, K.J. The cognitive apprenticeship model in educational practice. In Handbook of Research on Educational Communications and Technology; Routledge: Abingdon, UK, 2008; Volume 3, pp. 425–439.
  47. Anderson, T. Teaching in an online learning context. In Theory and Practice of Online Learning; Athabasca University Press: Athabasca, AB, Canada, 2004; p. 273.
  48. Roediger, H.L., III; Karpicke, J.D. The power of testing memory: Basic research and implications for educational practice. Perspect. Psychol. Sci. 2006, 1, 181–210.
  49. Pastötter, B.; Bäuml, K.-H.T. Distinct slow and fast cortical theta dynamics in episodic memory retrieval. Neuroimage 2014, 94, 155–161.
  50. Szpunar, K.K.; Khan, N.Y.; Schacter, D.L. Interpolated memory tests reduce mind wandering and improve learning of online lectures. Proc. Natl. Acad. Sci. USA 2013, 110, 6313–6317.
  51. Moradi, M.; Liu, L.; Luchies, C. Identifying Important Aspects in Developing Interactive Instructional Videos. In Proceedings of the 2016 American Society for Engineering Education Midwest Section Conference, Manhattan, KS, USA, 25 September 2016; pp. 1–8.
  52. Arnold, S.F. Mathematical Statistics; Prentice-Hall: Upper Saddle River, NJ, USA, 1990.
  53. Casella, G.; Berger, R.L. Statistical Inference; Thomson Learning: Boston, MA, USA, 2002.
  54. Moore, D.S.; McCabe, G.P. Introduction to the Practice of Statistics; WH Freeman: New York, NY, USA, 1993.
  55. Ratcliffe, J. The effect on the t distribution of non-normality in the sampled population. Appl. Stat. 1968, 17, 42–48.
  56. Lumley, T.; Diehr, P.; Emerson, S.; Chen, L. The importance of the normality assumption in large public health data sets. Ann. Rev. Public Health 2002, 23, 151–169.
  57. Erbas, A.K.; Ince, M.; Kaya, S. Learning Mathematics with Interactive Whiteboards and Computer-Based Graphing Utility. Educ. Technol. Soc. 2015, 18, 299–312.
  58. Hwang, G.-J.; Lai, C.-L. Facilitating and Bridging Out-Of-Class and In-Class Learning: An Interactive E-Book-Based Flipped Learning Approach for Math Courses. J. Educ. Technol. Soc. 2017, 20, 184–197.
  59. Delialioglu, O.; Yildirim, Z. Students’ perceptions on effective dimensions of interactive learning in a blended learning environment. Educ. Technol. Soc. 2007, 10, 133–146.
  60. Sahin, I.; Shelley, M.C. Considering Students’ Perceptions: The Distance Education Student Satisfaction Model. Educ. Technol. Soc. 2008, 11, 216–223.
  61. Kuo, F.-R.; Chin, Y.-Y.; Lee, C.-H.; Chiu, Y.-H.; Hong, C.-H.; Lee, K.-L.; Ho, W.-H.; Lee, C.-H. Web-based learning system for developing and assessing clinical diagnostic skills for dermatology residency program. Educ. Technol. Soc. 2016, 19, 194–207.
  62. Neo, M.; Neo, T.-K. Engaging students in multimedia-mediated constructivist learning-Students’ perceptions. Educ. Technol. Soc. 2009, 12, 254–266.
  63. Montrieux, H.; Vangestel, S.; Raes, A.; Matthys, P.; Schellens, T. Blending Face-to-Face Higher Education with Web-Based Lectures: Comparing Different Didactical Application Scenarios. Educ. Technol. Soc. 2015, 18, 170–182.
Figure 1. Flowchart of the study.
Figure 2. Distribution of survey question feedback in percentages.
Table 1. Age of participants in the two groups (α = 0.05).

             t-Test Results
t Stat       0.87
p (T ≤ t)    0.38 1
t Critical   2.01

1 The p value indicates no significant difference between the compared data (p > 0.05).
Table 2. Participants’ age distribution (in percentages of the population of each group) in the intervention and control groups.

               18 Years Old   19 Years Old   20 Years Old   21+ Years Old
Intervention   0%             48%            32%            20%
Control        9%             50%            23%            18%
Table 3. Comparison of participants’ performance in the pre- and post-test (numbers indicate the percentage of people in each group).

               Increased Score   Decreased Score   Same Score
Intervention   43%               14%               43%
Control        18%               59%               23%
Table 4. Mean, median, and standard deviation of test scores.

                     Pre-Test                   Post-Test
                     Intervention   Control     Intervention   Control
Mean                 6.68           7.59        7.52           6.73
Median               6              8           7              7
Standard Deviation   1.80           1.71        1.68           1.86
Table 5. t-test results for the collected data (α = 0.05).

             Age of Participants   Pre-Test Scores     Post- vs. Pre-Test Scores   Post- vs. Pre-Test Scores
             in Two Groups         of the Two Groups   (Intervention Group)        (Control Group)
t Stat       0.87                  −2.55               −3.67                       2.02
p (T ≤ t)    0.38 1                0.01 2              0.001 2                     0.06 1
t Critical   2.01                  2.02                2.06                        2.08

1 The p value indicates no significant difference between the compared data (p > 0.05); 2 the p value indicates a significant difference between the compared data (p ≤ 0.05).
Table 6. Survey results.

Survey Statement                                                              Mean   STD    Mode
Videos were good quality-wise (audio, visual, length, buffering)              4.08   0.62   4
Having lectures covered in videos was better than in classroom                3      0.84   3
Having a quiz helped me in understanding concepts better                      3.6    0.74   4
I think one of the two quizzes was unnecessary                                2.92   1.01   3
I felt more engaged to the course and I had a better control over course flow 3.24   1.06   4
I recommend using these modules to cover basic concepts                       3.76   0.94   4
