Measuring Oral Reading Fluency (ORF) Computer-Based and Paper-Based: Examining the Mode Effect in Reading Accuracy and Reading Fluency
Abstract
1. Introduction
1.1. Test Mode and CBM
1.2. Mode Effects on CBM
2. Present Study
- Does the computer-based ORF test administration have similar reliability to the paper-based one?
- Are there differences in the sum scores of computer-based and paper-based ORF procedures?
- Are there differences in the percentage correct of computer-based and paper-based ORF procedures?
- Are there differences in item functioning across test modes and student background characteristics?
2.1. Participants
2.2. Instrument
2.3. Procedure
2.4. Measures
2.5. Analyses
3. Results
3.1. Model Fits
3.2. Differences in Sum Scores and Percentage Correct
3.3. Differential Item Functioning
3.4. Multilevel Models
4. Discussion
5. Limitations and Future Directions
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| | Count (N) | Percent |
|---|---|---|
| Total Sample | 359 | 100.0% |
| With Migration Background | 227 | 63.2% |
| With Special Education Needs | 53 | 14.7% |
| Male | 198 | 55.2% |
| 3rd Year | 193 | 53.8% |
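The percentage column above can be recovered directly from the raw counts. As a quick sanity check (counts and group labels taken from the table; this is not part of the original analysis), the computed shares agree with the reported percentages to within rounding:

```python
# Recompute the percentage column of the participant table from raw counts.
TOTAL = 359  # total sample size from the table

counts = {
    "With Migration Background": 227,
    "With Special Education Needs": 53,
    "Male": 198,
    "3rd Year": 193,
}
reported = {
    "With Migration Background": 63.2,
    "With Special Education Needs": 14.7,
    "Male": 55.2,
    "3rd Year": 53.8,
}

for group, n in counts.items():
    pct = 100 * n / TOTAL
    # Reported values match to within 0.1 percentage points (rounding).
    assert abs(pct - reported[group]) <= 0.1
    print(f"{group}: {n}/{TOTAL} = {pct:.1f}%")
```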
| | Sum Score Paper-Based M (SE) | Sum Score Computer-Based M (SE) | Theta Paper-Based M (SE) | Theta Computer-Based M (SE) |
|---|---|---|---|---|
| Female | 0.00 (0.06) | 0.00 (0.06) | −0.05 (0.07) | −0.03 (0.06) |
| Migration Background | −0.13 (0.05) ** | −0.12 (0.05) * | −0.15 (0.05) ** | −0.10 (0.06) x |
| Special Education Needs | −0.34 (0.06) *** | −0.38 (0.06) *** | −0.33 (0.06) *** | −0.42 (0.05) *** |
| Third-Year | 0.59 (0.16) *** | 0.45 (1.8) *** | 0.52 (0.20) *** | 0.23 (0.24) |
| R² within classrooms | 0.14 (0.05) *** | 0.17 (0.05) *** | 0.14 (0.04) ** | 0.20 (0.04) *** |
| R² between classrooms | 0.35 (0.19) | 0.20 (0.16) *** | 0.27 (0.21) | 0.05 (0.11) |
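The last two rows above separate variance explained within classrooms from variance explained between classrooms. The study's multilevel models were fit with dedicated statistical software; purely as an illustration of the within/between decomposition behind such statistics, here is a toy sketch using hypothetical classroom scores (not the study's data):

```python
import statistics

# Toy illustration of within- vs. between-classroom variance decomposition.
# The scores below are invented for demonstration only.
classrooms = {
    "A": [10, 12, 11, 13],
    "B": [18, 20, 19, 21],
    "C": [14, 15, 16, 15],
}

# Between-classroom variance: variance of the classroom means.
between = statistics.pvariance(
    [statistics.mean(scores) for scores in classrooms.values()]
)
# Within-classroom variance: average of the per-classroom variances.
within = statistics.mean(
    statistics.pvariance(scores) for scores in classrooms.values()
)
# Intraclass correlation: share of total variance lying between classrooms.
icc = between / (between + within)
print(f"between={between:.2f}, within={within:.2f}, ICC={icc:.2f}")
```

A high ICC (as in this toy example, where classrooms differ strongly) is what motivates modeling classroom membership explicitly rather than pooling all students.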
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Jungjohann, J.; DeVries, J.M.; Gebhardt, M. Measuring Oral Reading Fluency (ORF) Computer-Based and Paper-Based: Examining the Mode Effect in Reading Accuracy and Reading Fluency. Educ. Sci. 2023, 13, 624. https://doi.org/10.3390/educsci13060624