Knowledge Acquisition of Biology and Physics University Students—the Role of Prior Knowledge
Abstract
1. Introduction
1.1. Theories of Knowledge
“[Prior knowledge comprises a student’s] cognitive entry behaviors [as] those prerequisite types of knowledge, skills, and competencies which are essential to the learning of a particular new task or set of tasks” [8] (p. 32).
1.2. Assessment of Knowledge Types
1.3. Learning and Knowledge Acquisition in Science at University Level
2. Purpose
3. Materials and Methods
3.1. Test Development and Design
3.2. Procedure
3.3. Sample
3.4. Scoring and Data Analysis
3.4.1. Knowledge of Facts
3.4.2. Knowledge of Meaning
3.4.3. Integration of Knowledge
3.4.4. Application of Knowledge
3.5. Scaling
3.6. Data Analysis
4. Results
4.1. Precondition: Validity
4.2. Precondition: Reliability and Item Fit Analysis
4.3. Distinguishing HS High and Low Achieving Students
4.4. Knowledge Type Acquisition: Biology and Physics Sample
5. Discussion
5.1. Preconditions: Dimensionality and Reliability
5.2. Knowledge Type Acquisition: Biology and Physics Sample
6. Conclusions and Outlook
Supplementary Materials
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Chen, X. STEM Attrition: College Students’ Paths Into and Out of STEM Fields (NCES 2014-001); National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education: Washington, DC, USA, 2013.
- Hailikari, T.; Nevgi, A. How to Diagnose At-risk Students in Chemistry: The case of prior knowledge assessment. Int. J. Sci. Educ. 2010, 32, 2079–2095.
- Hailikari, T.; Nevgi, A.; Lindblom-Ylänne, S. Exploring alternative ways of assessing prior knowledge, its components and their relation to student achievement: A mathematics based case study. Stud. Educ. Eval. 2007, 33, 320–337.
- Hailikari, T. Assessing University Students’ Prior Knowledge: Implications for Theory and Practice; University of Helsinki: Helsinki, Finland, 2009.
- Trapmann, S.; Hell, B.; Weigand, S.; Schuler, H. Die Validität von Schulnoten zur Vorhersage des Studienerfolgs – eine Metaanalyse [The validity of school grades for predicting academic success – a meta-analysis]. Z. Pädagogische Psychol. 2007, 21, 11–27.
- Trapmann, S.; Hell, B.; Hirn, J.O.W.; Schuler, H. Meta-Analysis of the relationship between the Big Five and academic success at university. Z. Psychol./J. Psychol. 2007, 215, 132–151.
- Robbins, S.B.; Lauver, K.; Le, H.; Davis, D.; Langley, R.; Carlstrom, A. Do psychosocial and study skill factors predict college outcomes? A meta-analysis. Psychol. Bull. 2004, 130, 261.
- Bloom, B.S. Human Characteristics and School Learning; McGraw-Hill: New York, NY, USA, 1976.
- De Jong, T.; Ferguson-Hessler, M.G.M. Types and Qualities of Knowledge. Educ. Psychol. 1996, 31, 105–113.
- Dochy, F.; Alexander, P.A. Mapping prior knowledge: A framework for discussion among researchers. Eur. J. Psychol. Educ. 1995, 10, 225–242.
- Dochy, F.; De Rijdt, C.; Dyck, W. Cognitive prerequisites and learning: How far have we progressed since Bloom? Implications for educational practice and teaching. Act. Learn. High. Educ. 2002, 3, 265–284.
- Bratianu, C.; Bejinaru, R. The Theory of Knowledge Fields: A Thermodynamics Approach. Systems 2019, 7, 20.
- Richter-Beuschel, L.; Grass, I.; Bögeholz, S. How to Measure Procedural Knowledge for Solving Biodiversity and Climate Change Challenges. Educ. Sci. 2018, 8, 190.
- Alexander, P.A.; Judy, J.E. The interaction of domain-specific and strategic knowledge in academic performance. Rev. Educ. Res. 1988, 58, 375–404.
- Posner, M.I.; Boies, S.J. Components of attention. Psychol. Rev. 1971, 78, 391.
- Messick, S. The Psychology of Educational Measurement. J. Educ. Meas. 1984, 21, 215–237.
- Corbett, A.T.; Anderson, J.R. Knowledge Tracing: Modeling the Acquisition of Procedural Knowledge. User Model. User-Adapt. Interact. 1995, 4, 253–278.
- Mayer, R.E. Rote Versus Meaningful Learning. Theory Pract. 2002, 41, 226–232.
- Krathwohl, D.R. A Revision of Bloom’s Taxonomy: An Overview. Theory Pract. 2002, 41, 212–218.
- Dochy, F. Assessment of Prior Knowledge as a Determinant for Future Learning; Lemma B.V.: Utrecht, The Netherlands; Jessica Kingsley Publishers: London, UK, 1992.
- Binder, T.; Sandmann, A.; Friege, G.; Sures, B.; Theyßen, H.; Schmiemann, P. Assessing prior knowledge types as predictors of academic achievement in the introductory phase of biology and physics study programmes using logistic regression. Int. J. STEM Educ. 2019, 6, 33.
- Hailikari, T.; Katajavuori, N.; Lindblom-Ylänne, S. The relevance of prior knowledge in learning and instructional design. Am. J. Pharm. Educ. 2008, 72, 113.
- Kyllonen, P.C.; Stephens, D.L. Cognitive abilities as determinants of success in acquiring logic skill. Learn. Individ. Differ. 1990, 2, 129–160.
- Asikainen, M.A. Probing University Students’ Pre-Knowledge in Quantum Physics with QPCS Survey. Eurasia J. Math. Sci. Technol. Educ. 2017, 13, 1615–1632.
- Hailikari, T.; Nevgi, A.; Komulainen, E. Academic self-beliefs and prior knowledge as predictors of student achievement in Mathematics: A structural model. Educ. Psychol. 2008, 28, 59–71.
- Bissonnette, S.A.; Combs, E.D.; Nagami, P.H.; Byers, V.; Fernandez, J.; Le, D.; Realin, J.; Woodham, S.; Smith, J.I.; Tanner, K.D. Using the Biology Card Sorting Task to Measure Changes in Conceptual Expertise during Postsecondary Biology Education. CBE Life Sci. Educ. 2017, 16, ar14.
- Crowe, A.; Dirks, C.; Wenderoth, M.P. Biology in bloom: Implementing Bloom’s Taxonomy to enhance student learning in biology. CBE Life Sci. Educ. 2008, 7, 368–381.
- Reid, A.; Wood, L.N.; Smith, G.H.; Petocz, P. Intention, Approach and Outcome: University Mathematics Students’ Conceptions of Learning Mathematics. Int. J. Sci. Math. Educ. 2005, 3, 567–586.
- Wang, W.; Coll, R.K. An Investigation of Tertiary-level Learning in Some Practical Physics Courses. Int. J. Sci. Math. Educ. 2005, 3, 639.
- Geller, C.; Neumann, K.; Boone, W.J.; Fischer, H.E. What Makes the Finnish Different in Science? Assessing and Comparing Students’ Science Learning in Three Countries. Int. J. Sci. Educ. 2014, 36, 3042–3066.
- Liu, L.; Lee, H.; Hofstetter, C.; Linn, M. Assessing Knowledge Integration in Science: Construct, Measures, and Evidence. Educ. Assess. 2008, 13, 33–55.
- Liu, O.L.; Lee, H.S.; Linn, M.C. Measuring knowledge integration: Validation of four-year assessments. J. Res. Sci. Teach. 2011, 48, 1079–1107.
- Messick, S. Validity; ETS Research Report Series; Educational Testing Service: Princeton, NJ, USA, 1987; pp. i–208.
- Yin, Y.; Vanides, J.; Ruiz-Primo, M.A.; Ayala, C.C.; Shavelson, R.J. Comparison of two concept-mapping techniques: Implications for scoring, interpretation, and use. J. Res. Sci. Teach. 2005, 42, 166–184.
- Buntting, C.; Coll, R.K.; Campbell, A. Student Views of Concept Mapping Use in Introductory Tertiary Biology Classes. Int. J. Sci. Math. Educ. 2006, 4, 641–668.
- Jonassen, D.H.; Grabowski, B.L. (Eds.) Handbook of Individual Differences, Learning and Instruction; Erlbaum: Hillsdale, NJ, USA, 1993.
- McClure, J.R.; Sonak, B.; Suen, H.K. Concept Map Assessment of Classroom Learning: Reliability, Validity, and Logistical Practicality. J. Res. Sci. Teach. 1999, 36, 475–492.
- Rice, D.C.; Ryan, J.M.; Samson, S.M. Using concept maps to assess student learning in the science classroom: Must different methods compete? J. Res. Sci. Teach. 1998, 35, 1103–1127.
- Chi, M.T.H.; Feltovich, P.J.; Glaser, R. Categorization and Representation of Physics Problems by Experts and Novices; Learning Research and Development Center, University of Pittsburgh: Pittsburgh, PA, USA, 1981.
- Moseley, B.J.; Okamoto, Y.; Ishida, J. Comparing US and Japanese elementary school teachers’ facility for linking rational number representations. Int. J. Sci. Math. Educ. 2007, 5, 165–185.
- Ruiz-Primo, M.A.; Schultz, S.E.; Li, M.; Shavelson, R.J. Comparison of the reliability and validity of scores from two concept-mapping techniques. J. Res. Sci. Teach. 2001, 38, 260–278.
- Rye, J.A.; Rubba, P.A. Scoring concept maps: An expert map-based scheme weighted for relationships. Sch. Sci. Math. 2002, 102, 33–44.
- Boone, W.J.; Scantlebury, K. The role of Rasch analysis when conducting science education research utilizing multiple-choice tests. Sci. Educ. 2006, 90, 253–269.
- Neumann, I.; Neumann, K.; Nehm, R. Evaluating Instrument Quality in Science Education: Rasch-based analyses of a Nature of Science test. Int. J. Sci. Educ. 2011, 33, 1373–1405.
- Rasch, G. Probabilistic Models for Some Intelligence and Attainment Tests; University of Chicago Press: Chicago, IL, USA, 1960.
- Masters, G.N. A Rasch model for partial credit scoring. Psychometrika 1982, 47, 149–174.
- Sykes, R.C.; Yen, W.M. The Scaling of Mixed-Item-Format Tests with the One-Parameter and Two-Parameter Partial Credit Models. J. Educ. Meas. 2000, 37, 221–244.
- Burnham, K.P.; Anderson, D.R. Model Selection and Multimodel Inference: A Practical Information-Theoretic Approach, 2nd ed.; Springer: New York, NY, USA, 2010.
- Bond, T.G.; Fox, C.M. Applying the Rasch Model: Fundamental Measurement in the Human Sciences, 2nd ed.; Lawrence Erlbaum Associates: Mahwah, NJ, USA, 2007.
- Hartig, J.; Kühnbach, O. Estimating change using the plausible value technique within multidimensional Rasch models. In Veränderungsmessung und Längsschnittstudien in der empirischen Erziehungswissenschaft [Measurement of Change and Longitudinal Studies in Empirical Educational Science]; Ittel, A., Merkens, H., Eds.; VS Verlag für Sozialwissenschaften: Wiesbaden, Germany, 2006; pp. 27–44.
- Wilson, M. Constructing Measures: An Item Response Modeling Approach; Psychology Press: New York, NY, USA, 2005.
- Warm, T.A. Weighted likelihood estimation of ability in item response theory. Psychometrika 1989, 54, 427–450.
- R Development Core Team. R: A Language and Environment for Statistical Computing; R Foundation for Statistical Computing: Vienna, Austria, 2008. Available online: http://www.R-project.org/ (accessed on 17 June 2018).
- Kiefer, T.; Robitzsch, A.; Wu, M.L. TAM: Test Analysis Modules. 2018. Available online: http://cran.r-project.org/web/packages/TAM/index.html (accessed on 20 January 2019).
- Cohen, J. Statistical power analysis. Curr. Dir. Psychol. Sci. 1992, 1, 98–101.
- Kane, M.T. Validating the interpretations and uses of test scores. J. Educ. Meas. 2013, 50, 1–73.
- Friege, G.; Lind, G. Types and Qualities of Knowledge and their Relations to Problem Solving in Physics. Int. J. Sci. Math. Educ. 2006, 4, 437–465.
- Chi, M.T.H.; Glaser, R.; Rees, E. Expertise in Problem Solving. In Advances in the Psychology of Human Intelligence; Sternberg, R.J., Ed.; Erlbaum: Hillsdale, NJ, USA, 1982; pp. 7–75.
- Binder, T.; Theyßen, H.; Schmiemann, P. Erfassung von fachspezifischen Problemlöseprozessen mit Sortieraufgaben in Biologie und Physik [Assessing Subject-specific Problem Solving Processes Using Sorting Tasks in Biology and Physics]. Zeitschrift für Didaktik der Naturwissenschaften 2019.
- Adams, R.J. Reliability as a measurement design effect. Stud. Educ. Eval. 2005, 31, 162–172.
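The Rasch-based analyses referenced above were run in R with the TAM package (Kiefer, Robitzsch, and Wu, cited above). The following is a minimal sketch of that toolchain, not the authors’ analysis script: the simulated responses and the item-to-dimension assignment in the Q-matrix are hypothetical placeholders used only to illustrate the model comparison, person estimation, and item fit steps reported in the tables below.

```r
# Minimal sketch of the cited R/TAM toolchain (not the authors' script).
# The simulated data and the Q-matrix assignment are hypothetical.
library(TAM)

set.seed(1)
# 200 simulated persons answering 20 dichotomous Rasch-type items
resp <- sim.raschtype(rnorm(200), b = seq(-2, 2, length.out = 20))

# Unidimensional model (1D): all items measure a single knowledge dimension
mod1 <- tam.mml(resp)

# Four-dimensional model (4D): the Q-matrix assigns each item to one of the
# four knowledge types (facts, meaning, integration, application)
Q <- matrix(0, nrow = 20, ncol = 4)
Q[cbind(1:20, rep(1:4, each = 5))] <- 1   # hypothetical item assignment
mod4 <- tam.mml(resp, Q = Q,
                control = list(snodes = 1000))  # Monte Carlo integration
                                                # keeps 4D estimation tractable

# Information criteria (deviance, AIC, BIC) and likelihood-ratio comparison,
# analogous to the model comparisons in the tables below
mod1$ic
anova(mod1, mod4)

# Person abilities via weighted likelihood estimation (WLE; Warm, 1989)
wle <- tam.wle(mod1)

# Item fit: Infit/Outfit MNSQ, as reported in the reliability table below
fit <- tam.fit(mod1)
```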
Subject | Dimensions | NP (Number of Parameters) | Deviance | AIC | BIC
---|---|---|---|---|---
Biology | knowledge (1D) | 107 | 17478.24 | 17692.24 | 18022.61
Biology | declarative versus procedural knowledge (2D) | 109 | 17418.82 | 17636.82 | 17973.37
Biology | knowledge types (4D) | 116 | 17278.10 | 17510.10 | 17868.27
Physics | knowledge (1D) | 125 | 11156.02 | 11406.02 | 11731.67
Physics | declarative versus procedural knowledge (2D) | 127 | 11111.87 | 11365.87 | 11696.73
Physics | knowledge types (4D) | 134 | 10986.42 | 11254.42 | 11603.52
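These information criteria follow the standard definitions, so the table can be checked directly against the deviance and NP columns (the sample size n is not stated in this excerpt and is inferred below):

$$\mathrm{AIC} = \mathrm{Deviance} + 2\,\mathrm{NP}, \qquad \mathrm{BIC} = \mathrm{Deviance} + \mathrm{NP}\,\ln n.$$

For the biology 1D model, AIC = 17478.24 + 2 × 107 = 17692.24, as reported; its BIC of 18022.61 implies ln n = (18022.61 − 17478.24)/107 ≈ 5.09, i.e., roughly n ≈ 162 calibrated biology students. In both subjects the four-dimensional model shows the lowest AIC and BIC.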
Subject | Comparison | df | χ² (Empirical) | χ² (Critical) | p
---|---|---|---|---|---
Biology | 1D versus 2D | 2 | 59.42 | 18.42 | <0.001
Biology | 2D versus 4D | 7 | 140.72 | 29.84 | <0.001
Biology | 1D versus 4D | 9 | 200.14 | 33.70 | <0.001
Physics | 1D versus 2D | 2 | 44.15 | 18.42 | <0.001
Physics | 2D versus 4D | 7 | 125.45 | 29.84 | <0.001
Physics | 1D versus 4D | 9 | 169.60 | 33.70 | <0.001
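Each row is a likelihood-ratio test between nested models, computable from the preceding table: the test statistic is the deviance difference and the degrees of freedom are the difference in parameter counts,

$$\chi^2_{\mathrm{emp}} = \mathrm{Deviance}_{\mathrm{restricted}} - \mathrm{Deviance}_{\mathrm{general}}, \qquad df = \mathrm{NP}_{\mathrm{general}} - \mathrm{NP}_{\mathrm{restricted}}.$$

For example, biology 1D versus 2D gives χ² = 17478.24 − 17418.82 = 59.42 with df = 109 − 107 = 2, well above the critical value of 18.42; the four-dimensional model is favored in every comparison.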
Measurement Point | Latent Correlations (Biology Model) | Latent Correlations (Physics Model)
---|---|---
1 | 0.47 < r_lat < 0.67 | 0.52 < r_lat < 0.78
2 | 0.24 < r_lat < 0.59 | 0.61 < r_lat < 0.86
Subject | Test (Knowledge Type) | Measure | Measurement Point 1 | Measurement Point 2
---|---|---|---|---
Biology | knowledge of facts | Infit MNSQ (reliability) | 0.94–1.06 (0.60) | 0.85–1.15 (0.74)
Biology | knowledge of meaning | Infit MNSQ (reliability) | 0.90–1.09 (0.71) | 0.82–1.15 (0.72)
Biology | integration of knowledge | Infit MNSQ (reliability) | 0.94–1.05 (0.58) | 0.92–1.19 (0.60)
Biology | application of knowledge | Infit MNSQ (reliability) | 0.91–1.23 (0.71) | 0.89–1.10 (0.68)
Physics | knowledge of facts | Infit MNSQ (reliability) | 0.88–1.16 (0.77) | 0.84–1.26 (0.62)
Physics | knowledge of meaning | Infit MNSQ (reliability) | 0.85–1.15 (0.68) | 0.85–1.12 (0.61)
Physics | integration of knowledge | Infit MNSQ (reliability) | 0.86–1.23 (0.75) | 0.94–1.13 (0.52)
Physics | application of knowledge | Infit MNSQ (reliability) | 0.73–1.22 (0.77) | 0.75–1.22 (0.66)
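For reference, the infit statistic reported here is the information-weighted mean square of the residuals between observed and model-expected item scores, the standard Rasch fit measure (cf. Bond and Fox, cited above):

$$\mathrm{Infit\ MNSQ}_i = \frac{\sum_n \left( x_{ni} - E_{ni} \right)^2}{\sum_n W_{ni}},$$

where $x_{ni}$ is person $n$'s observed score on item $i$, $E_{ni}$ the expected score under the model, and $W_{ni}$ the model variance. Values near 1 indicate good fit; all ranges in the table fall within the commonly used 0.7–1.3 band.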
Subject | n | Md (Median HS GPA) | SD | Group Name
---|---|---|---|---
Biology | 36 | 1.50 | 0.20 | Low HS GPA (HS high achievers)
Biology | 34 | 2.10 | 0.16 | Medium HS GPA (HS medium achievers)
Biology | 34 | 2.70 | 0.30 | High HS GPA (HS low achievers)
Physics | 18 | 1.35 | 0.20 | Low HS GPA (HS high achievers)
Physics | 20 | 1.90 | 0.15 | Medium HS GPA (HS medium achievers)
Physics | 19 | 2.60 | 0.30 | High HS GPA (HS low achievers)

Note: on the German grading scale, lower GPA values indicate better performance (1.0 is the best grade), so the low-GPA groups are the high achievers.
Subject | Factor(s) | Sum of Squares | df | F | p | Partial η² | Observed Power
---|---|---|---|---|---|---|---
Biology | Time | 40.99 | 1 | 62.49 | <0.001 | 0.382 | 1.0
Biology | Group | 26.69 | 2 | 3.94 | 0.022 | 0.072 | 0.69
Biology | Time × Group | 0.25 | 2 | 0.187 | 0.830 | 0.004 | 0.08
Biology | Time × Knowledge type × Group | 0.353 | 3.89 ¹ | 0.114 | 0.976 | 0.002 | 0.07
Physics | Time | 32.81 | 1 | 64.50 | <0.001 | 0.544 | 1.0
Physics | Group | 97.98 | 2 | 14.62 | <0.001 | 0.351 | 0.99
Physics | Time × Group | 1.02 | 2 | 1.00 | 0.375 | 0.036 | 0.21
Physics | Time × Knowledge type × Group | 5.25 | 3.44 ¹ | 1.59 | 0.191 | 0.056 | 0.43
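As a consistency check, the reported partial η² values follow from the F statistics through the standard identity (the error degrees of freedom below are inferred from the group sizes in the preceding table, not stated in this excerpt):

$$\eta_p^2 = \frac{F \cdot df_{\mathrm{effect}}}{F \cdot df_{\mathrm{effect}} + df_{\mathrm{error}}}.$$

For the biology Time effect, n = 36 + 34 + 34 = 104 students gives df_error = 101, so 62.49/(62.49 + 101) ≈ 0.382, matching the table; the physics Time effect (n = 57, df_error = 54) likewise yields 64.50/(64.50 + 54) ≈ 0.544.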
Knowledge Type | HS Achievement | M (SD) | Sig.¹
---|---|---|---
Knowledge of meaning | HS low achiever | −0.45 (0.89) | –
Knowledge of meaning | HS medium achiever | −0.29 (1.07) | p = 1.000
Knowledge of meaning | HS high achiever | −0.19 (0.74) | p = 0.012
Integration of knowledge | HS low achiever | −0.58 (0.71) | –
Integration of knowledge | HS medium achiever | −0.52 (0.88) | p = 1.000
Integration of knowledge | HS high achiever | −0.03 (0.75) | p = 0.011
Knowledge Type | HS Achievement | M (SD) | Sig.¹
---|---|---|---
Knowledge of facts | HS low achiever | −0.45 (0.70) | –
Knowledge of facts | HS medium achiever | 0.15 (0.61) | p = 0.042
Knowledge of facts | HS high achiever | 0.31 (0.90) | p = 0.008
Knowledge of meaning | HS low achiever | −0.34 (0.65) | –
Knowledge of meaning | HS medium achiever | 0.60 (0.74) | p = 0.001
Knowledge of meaning | HS high achiever | 0.51 (0.69) | p < 0.001
Application of knowledge | HS low achiever | −0.87 (1.75) | –
Application of knowledge | HS medium achiever | 0.47 (1.40) | p = 0.021
Application of knowledge | HS high achiever | 0.97 (1.26) | p < 0.001
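Because Cohen’s effect size conventions (Cohen, cited above) are referenced in the methods, a standardized mean difference can be recovered from the reported M (SD) values; as a rough check for the largest contrast in this table (application of knowledge, HS high versus low achievers), pooling the two SDs and ignoring unequal group sizes:

$$d = \frac{0.97 - (-0.87)}{\sqrt{\left(1.26^2 + 1.75^2\right)/2}} \approx 1.2,$$

a large effect by Cohen’s conventions.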
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Citation: Binder, T.; Schmiemann, P.; Theyssen, H. Knowledge Acquisition of Biology and Physics University Students—the Role of Prior Knowledge. Educ. Sci. 2019, 9, 281. https://doi.org/10.3390/educsci9040281