Review

How Scientific Is Cognitive Load Theory Research Compared to the Rest of Educational Psychology?

by Amedee Marchand Martella 1,2,*, Alyssa P. Lawson 3 and Daniel H. Robinson 4

1 Department of Psychological & Brain Sciences, University of California, Santa Barbara, CA 93106, USA
2 Department of Educational Psychology, University of Georgia, Athens, GA 30602, USA
3 Institute for Research and Training, Landmark College, Putney, VT 05346, USA
4 Department of Higher Education, Adult Learning and Organizational Studies, The University of Texas at Arlington, Arlington, TX 76019, USA
* Author to whom correspondence should be addressed.
Educ. Sci. 2024, 14(8), 920; https://doi.org/10.3390/educsci14080920
Submission received: 10 July 2024 / Revised: 14 August 2024 / Accepted: 19 August 2024 / Published: 22 August 2024
(This article belongs to the Special Issue Cognitive Load Theory: Emerging Trends and Innovations)

Abstract

Cognitive load theory (CLT) has driven numerous empirical studies for over 30 years and is a major theme in many of the most cited articles published between 1988 and 2023. However, CLT articles have not been compared to other educational psychology research in terms of the research designs used and the extent to which recommendations for practice are justified. As Brady and colleagues found, a large percentage of the educational psychology articles reviewed were not experimental and yet frequently made specific recommendations from observational/correlational data. Therefore, in this review, CLT articles were examined with regard to the types of research methodology employed and whether recommendations for practice were justified. Across several educational psychology journals in 2020 and 2023, 16 articles were determined to directly test CLT. In contrast to other articles, which employed mostly observational methods, all but two of the CLT articles employed experimental or intervention designs. For the two CLT articles that were observational, recommendations for practice were not made. Reasons for the importance of experimental work are discussed.

1. How Scientific Is Cognitive Load Theory Research Compared to the Rest of Educational Psychology?

1.1. Overview

Cognitive load theory (CLT) emerged in the late 1980s and early 1990s as a theory of learning intended to guide instructional design. The term “cognitive load” first appeared in an article title in 1988 by John Sweller [1]. In a recent study by Hassan et al. [2] that examined the most cited authors and articles appearing in 12 educational psychology journals from 1988 to 2023, five of the top 30 most cited articles addressed CLT, and the most cited author was John Sweller. Along with his colleagues, including Paul Chandler and Paul Ayres, Sweller advanced the notion of cognitive load into a more formalized theory in 1991, e.g., [3]. Strengths of CLT include being based on knowledge of human cognitive architecture [4] and having important implications for instructional design [5].
CLT rose in popularity within educational psychology and instructional design journals after its updated description in 1998 [4] and has maintained its prominence within the educational research literature since its introduction [4,6]. In fact, a Scopus database search using the terms “cognitive load” and “cognitive load theory” in articles published between 1988 and 2023 in the twelve educational psychology journals reviewed by Hassan et al. [2] returned 373 published articles with a combined citation count of 50,262. Year by year, both the number of articles that include “cognitive load” and/or “cognitive load theory” and their citation rate have generally increased (see Figure 1). There is no doubt that CLT has left an indelible mark on the field of educational psychology over the past 35 years.
However, despite this highly cited research base, CLT research has not been specifically compared with the rest of educational psychology in terms of methodological trends and recommendations for practice. Such a comparison is important given that, in the field of educational psychology as a whole, there has been a decline in intervention research and an increase in recommendations for practice being made from non-experimental studies [7]. With a decline in studies that provide strong causal evidence, current recommendations continue to be made on shaky ground [8]. As such, the question remains: can the same be said about CLT research?

1.2. Causal Conclusions and Recommendations for Practice

Several reviews have been conducted over the years to investigate the rigor of the methodology used in educational psychology research. For example, Hsieh and colleagues [9] examined studies published in 1983 and from 1995 to 2004 in three prominent educational psychology journals and one broader educational journal. They assessed the types of methodologies used in these studies and found that the frequency of intervention studies had decreased and that the “quality indicators” of these studies (e.g., length of intervention, treatment integrity) were rather poor.
In a similar line of investigation, Robinson and colleagues [10] investigated the types of methodologies used in the studies published in five teaching-and-learning research journals across two years, 1994 and 2004. They classified each article as an intervention study or as a nonintervention study (i.e., correlational, qualitative, or descriptive). They also examined whether causal conclusions were made in each study. Results indicated there was a decline in the percentage of intervention studies published in these journals from 1994 to 2004, with correlational and qualitative studies becoming increasingly popular. Further, there was an increase in the number of causal conclusions made, particularly in the nonintervention studies.
Extending the work of Hsieh et al. [9] and Robinson et al. [10], Reinhart and colleagues [11] also sought to understand trends in educational psychology concerning methodology and recommendations for practice. They investigated the use of statistical modeling analyses in studies published between 2000 and 2010. Similar to the previous reviews, they found that less than a quarter of the studies from these journals were categorized as intervention studies and that almost half of the nonintervention studies included recommendations for practice. Additionally, nonintervention studies that used modeling analyses were more likely to include recommendations for practice than those that did not.
In the most recent review, Brady et al. [7] found a decrease in the percentage of empirical articles that employed intervention and experimental methods, an increase in the percentage of articles that employed observational methods, and an increase in the percentage of the latter that included recommendations for practice (close to two-thirds as of 2020). These findings were in line with the findings from the previous reviews, making it evident that the field as a whole is moving away from experimental research and yet continues to make recommendations for practice.
As noted by Brady et al. [7], a myriad of empirical methods can be used to study questions within the field of educational psychology, each serving an important purpose in the “exploration-to-intervention study sequence” (p. 2). However, problems arise when the systematic, experimental test of a hypothesis, often the final step in that sequence, is skipped and cause-and-effect claims are made from correlational research. Most introductory research methods and statistics textbooks caution that correlation does not imply causation. Although most agree that causal claims and recommendations for practice should be based on experimental research, e.g., [12,13,14,15], others have argued that recommendations can be based on non-experimental studies, e.g., [16,17,18].
This debate over when causality has been established and when it is appropriate to make specific recommendations was recently featured in Educational Psychology Review, where several authors [8,14,16,17,19] responded to the Brady et al. [7] article. Some argued that experimental designs are not the only way to establish causal relationships and suggested that the right analysis (e.g., structural equation modeling; SEM) could support causal inference from correlational data, e.g., [16,17]. For example, Dumas and Edelsbrunner [16] presented a five-phase process in which SEM could be used to develop implications for educational practice. They emphasized that whether specific recommendations can be made is not a “yes/no” question but rather a matter of degree along a continuum based on the strength of the evidence.
With this continuum in mind, certain methods are stronger with regard to establishing causality and making subsequent recommendations. Mayer [14] rated modeling studies as having moderate strength in testing causal claims and experimental studies as having the highest strength. Randomized controlled trials have long been considered the gold standard and are the ideal design for establishing causality and making recommendations for practice [14,17,20,21]. The benefit of experimental designs is the ability to control for possible confounds and to isolate the independent variable as the causal variable. When non-experimental research is conducted, one runs the risk of extraneous or confounding variables compromising the internal validity of the study, that is, the ability to make valid inferences or draw causal conclusions [20,22,23]. As noted by statistician Donald Rubin, “when it comes to causal inference, design always trumps analysis” (personal communication).
To inform educational research and practice and improve the educational experiences of students, effective interventions need to be identified by isolating and controlling variables. This hallmark of scientific reasoning [24] helps to ensure that “researchers can conclude that changes in the dependent variable (i.e., outcome) are caused by manipulations to the independent variable” [25] (p. 111). With educational psychology moving away from one of the most rigorous methods of investigating research questions, there is a risk of making decisions on shaky ground, or at the very least on results that are not of the highest strength. Decisions made on such shaky ground could affect the validity of instructional practices and influence whether they are embraced and adopted by the educational practice community.

1.3. The Present Review

Given the shifts in the types of research designs being prioritized within educational psychology, the purpose of the present review was to compare the research designs and recommendations for practice found in CLT research with those found in other educational psychology research. The primary research question was: “What types of research designs and recommendations for practice are found within CLT research as compared to non-CLT research?”

2. Method

2.1. Journal Selection and Search Process

Brady et al. [7] reviewed 255 empirical articles that were published in one of five journals during the year 2020. The journals included Journal of Educational Psychology, Contemporary Educational Psychology, Cognition and Instruction, Journal of Experimental Education, and American Educational Research Journal. These journals are similar to those included in three previous studies by Hsieh et al. [9], Reinhart et al. [11], and Robinson et al. [10]. With permission, their 2020 database was accessed.
To identify CLT articles in the 2020 database, the question “What makes an article a CLT article?” first needed to be answered. The authors of the present review examined each of the 255 articles to determine whether CLT was an integral part of the article (herein termed a CLT article). “Integral” was operationalized as including measures of cognitive load within the study, using CLT as the overarching cognitive framework, using CLT in research hypotheses/predictions, or testing CLT as part of the study. In total, six articles were identified as CLT articles [26,27,28,29,30,31].
Because only a handful of CLT articles were identified in the original Brady et al. [7] database, the present review’s search was expanded to include (a) the most recent publication year of 2023 for the same five journals represented in the 2020 search and (b) an additional journal, Educational Psychology Review, given that it has published empirical CLT research in the past. Through this updated search, an additional 10 CLT articles were identified [32,33,34,35,36,37,38,39,40,41]. Thus, the total sample for 2020 and 2023 included 16 CLT articles.

2.2. Coding and Analysis

Brady et al. [7] categorized articles based on their research designs. Articles were coded as observational/correlational, intervention, experimental, qualitative, or mixed method/multi-method. If the independent variable was manipulated, the article was coded as either intervention or experimental; if an intervention study involved random assignment of individuals, it was further coded as experimental. If the independent variable was not manipulated, the article was categorized as observational/correlational, qualitative, or mixed method. Observational/correlational studies involved quantitative data, qualitative studies involved qualitative data, and mixed method/multi-method studies involved both quantitative and qualitative data. This same categorization system was used for the 2023 CLT articles.
In addition to categorizing the research design used in each article, Brady et al. [7] coded whether recommendations for practice (RFPs) were included in nonintervention studies. RFPs were defined as statements of the form “if practice X is adopted/avoided or increased/decreased, then teacher or student outcome Y will improve” [11] (p. 244). This same coding system was used for the CLT articles drawn from the six journals in 2023.
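To make these decision rules concrete, the following minimal Python sketch reproduces the design-classification and RFP-justification logic described above. The field names (e.g., iv_manipulated, random_assignment) and the helper functions are illustrative assumptions for this sketch, not part of the original coding materials used by Brady et al. [7] or the present review.

```python
# Minimal sketch of the coding logic described in Section 2.2.
# Field and function names are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class Article:
    iv_manipulated: bool        # was the independent variable manipulated?
    random_assignment: bool     # were individuals randomly assigned?
    has_quantitative_data: bool
    has_qualitative_data: bool

def classify_design(a: Article) -> str:
    """Return the design category following the Brady et al. [7] scheme."""
    if a.iv_manipulated:
        # Manipulated IV -> intervention; random assignment of individuals
        # upgrades the study to experimental.
        return "experimental" if a.random_assignment else "intervention"
    # No manipulation: category depends on the kind of data collected.
    if a.has_quantitative_data and a.has_qualitative_data:
        return "mixed method/multi-method"
    if a.has_qualitative_data:
        return "qualitative"
    return "observational/correlational"

def rfp_justified(design: str) -> bool:
    """Recommendations for practice are treated as justified only for
    intervention/experimental designs (see Section 3.2)."""
    return design in ("experimental", "intervention")

# Example: a study with a manipulated IV and random assignment of individuals.
example = Article(True, True, True, False)
print(classify_design(example))                       # -> "experimental"
print(rfp_justified("observational/correlational"))   # -> False
```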

3. Results

3.1. Research Designs

Because the number of CLT articles was so small, these articles were simply compared to all of the educational psychology articles reviewed by Brady et al. [7] without adjusting the denominator. See Table 1 for a breakdown of each CLT article and its coding.
Across the CLT articles published in 2020 and 2023, all but three used randomized experiments. One of the articles that was not experimental was an intervention study (without random assignment of individuals), while the other two were observational/correlational. These results are not surprising and confirm the claim by Sweller et al. [4] that CLT research is characterized by experimental studies. This contrasts sharply with much of the field of educational psychology: Brady et al. [7] found that 25% of the empirical articles published in 2020 were intervention studies and only 20% were experimental.

3.2. Recommendations for Practice

Recommendations for practice based on research findings are appropriate when the methodology (experimental or intervention) supports causal claims [7]. Fourteen of the CLT articles were experimental or intervention studies, and as such, any recommendations made in those articles could be justified. The two articles that were observational/correlational did not include such recommendations. This contrasts sharply with much of the field of educational psychology: as demonstrated by Brady et al.’s [7] analysis, 66% of the nonintervention articles published in 2020 included recommendations for practice.

4. Discussion

As confirmed by the several reviews conducted by Robinson and colleagues [9,10,11], a large proportion of educational psychology research does not involve experimental methods. The present review found that thirteen of the sixteen CLT articles published in the five reviewed journals in 2020 and the six reviewed journals in 2023 used experimental methods, with an additional article categorized as “intervention.” Only two articles were observational/correlational. These results differ greatly from the rest of the educational psychology literature examined by Brady et al. [7] and are not surprising given the emphasis on experiments and experimental data discussed throughout Sweller et al.’s [4] review summarizing CLT research over the past 20 years. In fact, failures of experiments to replicate helped cognitive load theory expand through the identification of boundary conditions [42]. As noted by Sweller [42], “Cognitive load theory’s continual adaptation to new data is one of its primary virtues. The fact that those data are collected primarily from full experiments using randomised, controlled trials with real students studying materials from their own curricula is also a virtue” (pp. 95–96).
Although each type of empirical approach can serve a distinct and important purpose in research, a decline in experimental research reduces the confidence researchers and practitioners can have that the independent variable caused changes in the dependent variable. Without randomization and the control of variables, recommendations for practice rest on shaky ground. To provide teachers with specific guidance on when and how to implement particular learning strategies and teaching approaches in their classrooms, strong experimental research needs to be conducted to establish causality. CLT research, however, has consistently used experimental methods, and, as such, the recommendations for practice it offers rest on solid ground. For example, CLT research on explicit instruction and problem solving is rooted in randomized controlled trials, strengthening recommendations for teachers looking to teach their students effectively.
Despite these results and the justified recommendations from CLT research and other experimental educational psychology research, there remains a gap between research and practice [43]. In areas such as science education, much of the research-validated data on effective instructional practices have been overlooked in science educational policy. Sweller [44] noted, “it is regrettably rare for instructional design to be based on human cognitive architecture. Frequently, instructional design principles are promulgated as though human cognition either does not exist or if it does exist, it has no implications for instruction” (p. 37). Issues concerning the adoption of evidence-based practices are nothing new and were noted by Dempster [45] before CLT was introduced. Given the wealth of experimental data on CLT, particularly compared with educational psychology research in general, its implications for education should not be overlooked.

4.1. Limitations and Future Directions

The present review has two primary limitations. First, the sample of CLT articles was small and drawn from only five journals in 2020 and six in 2023, so it may not be representative of all CLT research published elsewhere. However, based on Sweller and colleagues’ [4] observations, CLT research appearing in other journals would be expected to be mostly experimental. Second, the present review’s operational definition of a CLT article may not match other researchers’ definitions, which could change which articles were included or excluded.
As for future directions, in addition to examining research designs and recommendations for practice as performed by Brady et al. [7] and in the present review, it is recommended that researchers conduct reviews of how well educational research controls variables. Controlling variables is essential in rigorous research, with confounds threatening the internal validity of a study. Different threats to internal validity could be assessed, as was done by Martella et al. [20] and Lawson et al. [22], to determine whether the experimental/intervention research being conducted is rigorous. Although recommendations for practice are inappropriately made in non-experimental/nonintervention research, they may also be inappropriately made in experimental/intervention research if variables have not been well controlled. As stated by John Sweller, “the ‘C’ in ‘RCT’ is at least as important as the ‘R’… Randomisation without control of variables gives random results” (personal communication).

4.2. Conclusions

The present review found that the CLT research published in five educational psychology journals in 2020 and six educational psychology journals in 2023 was primarily experimental. When articles were not experimental or intervention studies, recommendations for practice were not made. These results differ from those for non-CLT educational psychology research as a whole. The importance of experimental research cannot be overstated, as it is an essential component of the “exploration-to-intervention study sequence” [7] (p. 2). Experimental research provides the strongest evidence and allows causal conclusions to be drawn on solid rather than shaky ground. As such, CLT research and the accompanying recommendations appear to rest on solid ground.

Author Contributions

Conceptualization: A.M.M. and D.H.R.; Methodology: A.M.M., A.P.L. and D.H.R.; Validation: D.H.R.; Formal analysis: A.M.M., A.P.L. and D.H.R.; Writing—original draft preparation: A.M.M., A.P.L. and D.H.R.; Writing—review and editing: A.M.M., A.P.L. and D.H.R.; Visualization: A.P.L. All authors have read and agreed to the published version of the manuscript.

Funding

The first author acknowledges support from the National Science Foundation Postdoctoral Research Fellowship Program under award number 2222208. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

Acknowledgments

The authors would like to acknowledge and thank Waseem Hassan for his assistance in examining the frequency with which the terms “cognitive load” and “cognitive load theory” have appeared in 12 educational psychology journals since 1988.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Sweller, J. Cognitive load during problem solving: Effects on learning. Cogn. Sci. 1988, 12, 257–285. [Google Scholar] [CrossRef]
  2. Hassan, W.; Martella, A.M.; Robinson, D.H. Identifying the most cited articles and authors in educational psychology journals from 1988–2023. Educ. Psychol. Rev. 2024, in press. [Google Scholar]
  3. Sweller, J.; Chandler, P. Evidence for cognitive load theory. Cogn. Instr. 1991, 8, 351–362. [Google Scholar] [CrossRef]
  4. Sweller, J.; van Merriënboer, J.J.G.; Paas, F. Cognitive architecture and instructional design: 20 years later. Educ. Psychol. Rev. 2019, 31, 261–292. [Google Scholar] [CrossRef]
  5. Schnotz, W.; Kürschner, C. A reconsideration of cognitive load theory. Educ. Psychol. Rev. 2007, 19, 469–508. [Google Scholar] [CrossRef]
  6. de Jong, T. Cognitive load theory, educational research, and instructional design: Some food for thought. Instr. Sci. 2010, 38, 105–134. [Google Scholar] [CrossRef]
  7. Brady, A.; Griffin, M.M.; Lewis, A.R.; Fong, C.J.; Robinson, D.H. How scientific is educational psychology research? The increasing trend of squeezing causality and recommendations from non-intervention studies. Educ. Psychol. Rev. 2023, 35, 37. [Google Scholar] [CrossRef]
  8. Robinson, D.H.; Wainer, H. It’s just an observation. Educ. Psychol. Rev. 2023, 35, 83. [Google Scholar] [CrossRef]
  9. Hsieh, P.; Acee, T.; Chung, W.-H.; Hsieh, Y.-P.; Kim, H.; Thomas, G.D.; You, J.-I.; Levin, J.R.; Robinson, D.H. Is educational intervention research on the decline? J. Educ. Psychol. 2005, 97, 523–529. [Google Scholar] [CrossRef]
  10. Robinson, D.H.; Levin, J.R.; Thomas, G.D.; Pituch, K.A.; Vaughn, S. The incidence of “causal” statements in teaching-and-learning research journals. Am. Educ. Res. J. 2007, 44, 400–413. [Google Scholar] [CrossRef]
  11. Reinhart, A.L.; Haring, S.H.; Levin, J.R.; Patall, E.A.; Robinson, D.H. Models of not-so good behavior: Yet another way to squeeze causality and recommendations for practice out of correlational data. J. Educ. Psychol. 2013, 105, 241–247. [Google Scholar] [CrossRef]
  12. Alexander, P.A. In praise of (reasoned and reasonable) speculation: A response to Robinson et al.’s moratorium on recommendations for practice. Educ. Psychol. Rev. 2013, 25, 303–308. [Google Scholar] [CrossRef]
  13. Harris, K.R. Disallowing recommendations for practice and policy: A proposal that is both too much and too little. Educ. Psychol. Rev. 2013, 25, 309–316. [Google Scholar] [CrossRef]
  14. Mayer, R.E. How to assess whether an instructional intervention has an effect on learning. Educ. Psychol. Rev. 2023, 35, 64. [Google Scholar] [CrossRef]
  15. Renkl, A. Why practice recommendations are important in use-inspired basic research and why too much caution is dysfunctional. Educ. Psychol. Rev. 2013, 25, 317–324. [Google Scholar] [CrossRef]
  16. Dumas, D.; Edelsbrunner, P. How to make recommendations for educational practice from correlational data using structural equation models. Educ. Psychol. Rev. 2023, 35, 48. [Google Scholar] [CrossRef]
  17. Grosz, M.P. Should researchers make causal inferences and recommendations for practice on the basis of nonexperimental studies? Educ. Psychol. Rev. 2023, 35, 57. [Google Scholar] [CrossRef]
  18. Grosz, M.P.; Rohrer, J.M.; Thoemmes, F. The taboo against explicit causal inference in nonexperimental psychology. Perspect. Psychol. Sci. 2020, 15, 1243–1255. [Google Scholar] [CrossRef]
  19. Zitzmann, S.; Machts, N.; Hübner, N.; Schauber, S.; Möller, J.; Lindner, C. The yet underestimated importance of communicating findings from educational trials to teachers, schools, school authorities, or policy makers (Comment on Brady et al., 2023). Educ. Psychol. Rev. 2023, 35, 65. [Google Scholar] [CrossRef]
  20. Martella, A.M.; Martella, R.C.; Yatcilla, J.K.; Newson, A.; Shannon, E.N.; Voorhis, C. How rigorous is active learning research in STEM education? An examination of key internal validity controls in intervention studies. Educ. Psychol. Rev. 2023, 35, 107. [Google Scholar] [CrossRef]
  21. Shavelson, R.J.; Towne, L. Scientific Research in Education; National Academy Press: Washington, DC, USA, 2002. [Google Scholar]
  22. Lawson, A.P.; Martella, A.M.; LaBonte, K.; Delgado, C.Y.; Zhao, F.; Gluck, J.A.; Munns, M.E.; Wells LeRoy, A.; Mayer, R.E. Confounded or controlled? A systematic review of media comparison studies involving immersive virtual reality for STEM education. Educ. Psychol. Rev. 2024, 36, 69. [Google Scholar] [CrossRef]
  23. Martella, R.C.; Nelson, J.R.; Morgan, R.L.; Marchand-Martella, N.E. Understanding and Interpreting Educational Research; The Guilford Press: New York, NY, USA, 2013. [Google Scholar]
  24. Chen, Z.; Klahr, D. All other things being equal: Acquisition and transfer of the control of variables strategy. Child Dev. 1999, 70, 1098–1120. [Google Scholar] [CrossRef]
  25. Kaya, C. Internal validity: A must in research designs. Educ. Res. Rev. 2015, 10, 111–118. [Google Scholar] [CrossRef]
  26. Bichler, S.; Schwaighofer, M.; Stadler, M.; Bühner, M.; Greiff, S.; Fischer, F. How working memory capacity and shifting matter for learning with worked examples—A replication study. J. Educ. Psychol. 2020, 112, 1320–1337. [Google Scholar] [CrossRef]
  27. de Koning, B.B.; Rop, G.; Paas, F. Learning from split-attention materials: Effects of teaching physical and mental learning strategies. Contemp. Educ. Psychol. 2020, 61, 101873. [Google Scholar] [CrossRef]
  28. Merkt, M.; Lux, S.; Hoogerheide, V.; van Gog, T.; Schwan, S. A change of scenery: Does the setting of an instructional video affect learning? J. Educ. Psychol. 2020, 112, 1273–1283. [Google Scholar] [CrossRef]
  29. Miller-Cotto, D.; Byrnes, J.P. What’s the best way to characterize the relationship between working memory and achievement? An initial examination of competing theories. J. Educ. Psychol. 2020, 112, 1074–1084. [Google Scholar] [CrossRef]
  30. Schneider, S.; Nebel, S.; Beege, M.; Rey, G.D. The retrieval-enhancing effects of decorative pictures as memory cues in multimedia learning videos and subsequent performance tests. J. Educ. Psychol. 2020, 112, 1111–1127. [Google Scholar] [CrossRef]
  31. Zu, T.; Hutson, J.; Loschky, L.C.; Rebello, N.S. Using eye movements to measure intrinsic, extraneous, and germane load in a multimedia learning environment. J. Educ. Psychol. 2020, 112, 1338–1352. [Google Scholar] [CrossRef]
  32. Buchin, Z.L.; Mulligan, N.W. Retrieval-based learning and prior knowledge. J. Educ. Psychol. 2023, 115, 22–35. [Google Scholar] [CrossRef]
  33. Ehrhart, T.; Lindner, M.A. Computer-based multimedia testing: Effects of static and animated representational pictures and text modality. Contemp. Educ. Psychol. 2023, 73, 102151. [Google Scholar] [CrossRef]
  34. Hoch, E.; Sidi, Y.; Ackerman, R.; Hoogerheide, V.; Scheiter, K. Comparing mental effort, difficulty, and confidence appraisals in problem-solving: A metacognitive perspective. Educ. Psychol. Rev. 2023, 35, 61. [Google Scholar] [CrossRef]
  35. Martin, A.J.; Ginns, P.; Nagy, R.P.; Collie, R.J.; Bostwick, K.C.P. Load reduction instruction in mathematics and English classrooms: A multilevel study of student and teacher reports. Contemp. Educ. Psychol. 2023, 72, 102147. [Google Scholar] [CrossRef]
  36. Park, B.; Korbach, A.; Ginns, P.; Brünken, R. How learners use their hands for learning: An eye-tracking study. Educ. Psychol. Rev. 2023, 35, 116. [Google Scholar] [CrossRef]
  37. Pengelley, J.; Whipp, P.R.; Rovis-Hermann, N. A testing load: Investigating test mode effects on test score, cognitive load and scratch paper use with secondary school students. Educ. Psychol. Rev. 2023, 35, 67. [Google Scholar] [CrossRef]
  38. Rau, M.A.; Beier, J.P. Exploring the effects of gesture-based collaboration on students’ benefit from a perceptual training. J. Educ. Psychol. 2023, 115, 267–289. [Google Scholar] [CrossRef]
  39. Sondermann, C.; Merkt, M. What is the effect of talking heads in educational videos with different types of narrated slides? Contemp. Educ. Psychol. 2023, 74, 102207. [Google Scholar] [CrossRef]
  40. Wang, F.; Cheng, M.; Mayer, R.E. Improving learning-by-teaching without audience interaction as a generative learning activity by minimizing the social presence of the audience. J. Educ. Psychol. 2023, 115, 783–797. [Google Scholar] [CrossRef]
  41. Yang, X.; Wang, F.; Mayer, R.E.; Hu, X.; Gu, C. Ocular foundations of the spatial contiguity principle: Designing multimedia materials for parafoveal vision. J. Educ. Psychol. 2023, 115, 1125–1140. [Google Scholar] [CrossRef]
  42. Sweller, J. The development of cognitive load theory: Replication crises and incorporation of other theories can lead to theory expansion. Educ. Psychol. Rev. 2023, 35, 95. [Google Scholar] [CrossRef]
  43. Zhang, L.; Kirschner, P.A.; Cobern, W.W.; Sweller, J. There is an evidence crisis in science educational policy. Educ. Psychol. Rev. 2022, 34, 1157–1176. [Google Scholar] [CrossRef]
  44. Sweller, J. Chapter two: Cognitive load theory. Psychol. Learn. Motiv. 2011, 55, 37–76. [Google Scholar] [CrossRef]
  45. Dempster, F.N. The spacing effect: A case study in the failure to apply the results of psychological research. Am. Psychol. 1988, 43, 627–634. [Google Scholar] [CrossRef]
Figure 1. Articles from 1988 to 2023 that contain the terms “Cognitive Load” and/or “Cognitive Load Theory”.
Table 1. Breakdown of each CLT article included.

Authors | Journal | Method | Recommendations for Practice?
2020
 Bichler et al. [26] | Journal of Educational Psychology | Experimental | --
 de Koning et al. [27] | Contemporary Educational Psychology | Experimental | --
 Merkt et al. [28] | Journal of Educational Psychology | Experimental | --
 Miller-Cotto & Byrnes [29] | Journal of Educational Psychology | Observational/Correlational | No
 Schneider et al. [30] | Journal of Educational Psychology | Experimental | --
 Zu et al. [31] | Journal of Educational Psychology | Experimental | --
2023
 Buchin & Mulligan [32] | Journal of Educational Psychology | Experimental | --
 Ehrhart & Lindner [33] | Contemporary Educational Psychology | Experimental | --
 Hoch et al. [34] | Educational Psychology Review | Experimental | --
 Martin et al. [35] | Contemporary Educational Psychology | Observational/Correlational | No
 Park et al. [36] | Educational Psychology Review | Experimental | --
 Pengelley et al. [37] | Educational Psychology Review | Experimental | --
 Rau & Beier [38] | Journal of Educational Psychology | Intervention | --
 Sondermann & Merkt [39] | Contemporary Educational Psychology | Experimental | --
 Wang et al. [40] | Journal of Educational Psychology | Experimental | --
 Yang et al. [41] | Journal of Educational Psychology | Experimental | --