iSTART StairStepper—Using Comprehension Strategy Training to Game the Test
Abstract
1. Introduction
1.1. Reading Comprehension Strategies
1.2. Intelligent Tutoring Systems
1.3. iSTART
1.4. StairStepper
1.4.1. Reading Comprehension Strategies for Standardized Testing
1.4.2. Text Set and Questions
1.4.3. Game Play
1.5. Present Study
- How do students respond to the StairStepper game-based practice?
- How will participants progress through the StairStepper game based on text adaptivity and scaffolded feedback?
- How do iSTART training and StairStepper practice influence participant performance on a comprehension test and standardized assessment?
2. Results
2.1. Perceptions of the StairStepper Game
2.2. System Data
2.3. Reading Comprehension
2.3.1. GMRT
2.3.2. Science Comprehension
3. Discussion
Limitations and Future Directions
4. Materials and Methods
4.1. Participants
4.2. Learning Measures
4.3. Perceptions Measures
4.4. Procedure
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| Question | Reading Skill Level | N | Mean (SD) | t | p |
| --- | --- | --- | --- | --- | --- |
| Enjoyed the practice environment | Low | 21 | 3.29 (0.72) | 3.45 | 0.001 ** |
| | High | 26 | 2.42 (0.95) | | |
| Feedback was helpful | Low | 21 | 3.33 (0.91) | 1.84 | 0.073 |
| | High | 26 | 2.77 (1.14) | | |
| Interface had game-like features | Low | 21 | 3.24 (0.94) | 2.16 | 0.036 * |
| | High | 26 | 2.54 (1.27) | | |
| Provided a purpose for actions | Low | 21 | 3.33 (1.02) | 0.938 | 0.353 |
| | High | 26 | 3.04 (1.11) | | |
| I set goals during practice | Low | 21 | 3.19 (1.12) | 0.809 | 0.423 |
| | High | 26 | 2.92 (1.13) | | |
| Visual appearance made the practice more enjoyable | Low | 21 | 3.19 (1.03) | 1.62 | 0.111 |
| | High | 26 | 2.65 (1.20) | | |
| Objects were easy to control | Low | 21 | 3.67 (0.97) | −0.22 | 0.826 |
| | High | 26 | 3.73 (1.00) | | |
| I wanted to perform well | Low | 21 | 3.71 (0.85) | −0.479 | 0.634 |
| | High | 26 | 3.85 (1.00) | | |
| I would use for other skills | Low | 21 | 3.24 (0.83) | 1.98 | 0.054 * |
| | High | 26 | 2.65 (1.13) | | |
| Environment responded accurately | Low | 21 | 3.43 (0.98) | 0.287 | 0.775 |
| | High | 26 | 3.35 (0.98) | | |
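For readers who want to see how the group comparisons in this table are computed, the minimal sketch below runs an independent-samples t-test comparing low- and high-skill ratings. The rating vectors are invented placeholders (the item-level data are not part of this excerpt), and the code assumes only that scipy is available.

```python
# Minimal sketch of the low- vs. high-skill comparison reported in the table above.
# The ratings below are hypothetical placeholders, not the study data.
from scipy import stats

low_skill = [4, 3, 3, 4, 3, 2, 4, 3, 3, 4, 3, 3, 4, 3, 3, 4, 3, 3, 3, 4, 3]                  # n = 21
high_skill = [2, 3, 1, 2, 3, 2, 1, 3, 2, 2, 3, 2, 4, 2, 3, 2, 1, 3, 2, 3, 2, 2, 3, 2, 3, 2]  # n = 26

# Student's t-test with pooled variance, matching df = n1 + n2 - 2 = 45
t, p = stats.ttest_ind(low_skill, high_skill, equal_var=True)
print(f"t(45) = {t:.2f}, p = {p:.3f}")
```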
| Test | Mean (SD) | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1. GMRT Pretest | 0.50 (0.21) | 1.00 | | | | | | | |
| 2. GMRT Posttest | 0.50 (0.24) | 0.89 ** | 1.00 | | | | | | |
| 3. Pretest Mean | 0.35 (0.21) | 0.56 ** | 0.62 ** | 1.00 | | | | | |
| 4. Posttest Mean | 0.36 (0.21) | 0.65 ** | 0.67 ** | 0.63 ** | 1.00 | | | | |
| 5. Pretest Textbase | 0.28 (0.23) | 0.44 ** | 0.48 ** | 0.85 ** | 0.52 ** | 1.00 | | | |
| 6. Posttest Textbase | 0.41 (0.24) | 0.57 ** | 0.60 ** | 0.57 ** | 0.87 ** | 0.49 ** | 1.00 | | |
| 7. Pretest Inference | 1.66 (1.01) | 0.53 ** | 0.59 ** | 0.88 ** | 0.57 ** | 0.50 ** | 0.50 ** | 1.00 | |
| 8. Posttest Inference | 0.32 (0.24) | 0.57 ** | 0.57 ** | 0.53 ** | 0.87 ** | 0.42 ** | 0.51 ** | 0.49 ** | 1.00 |
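As an illustration of how a descriptives-plus-Pearson-correlation matrix like the one above is computed from participant-level scores, the sketch below builds a small DataFrame and calls pandas' corr method. The column names, the sample size of 47 (taken from the perceptions table), and the random values are all placeholders, not the study data.

```python
# Minimal sketch of computing descriptives and a Pearson correlation matrix
# from participant-level scores. All values here are random placeholders.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n = 47  # assumed combined sample (21 low + 26 high from the perceptions table)
scores = pd.DataFrame({
    "gmrt_pretest": rng.uniform(0, 1, n),
    "gmrt_posttest": rng.uniform(0, 1, n),
    "comp_pretest": rng.uniform(0, 1, n),
    "comp_posttest": rng.uniform(0, 1, n),
})

print(scores.agg(["mean", "std"]).round(2))    # Mean (SD) column
print(scores.corr(method="pearson").round(2))  # pairwise Pearson r
```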
| Study | Session 1 | Session 2 | Session 3 |
| --- | --- | --- | --- |
| A (delayed-treatment control) | Demographics questionnaire; GMRT pretest; self-explanation and comprehension test for Red Blood Cells | GMRT posttest; self-explanation and comprehension test for Cell Repair; iSTART self-explanation instruction (60 min) | 90 min playing StairStepper |
| B (training) | Demographics questionnaire; GMRT pretest; self-explanation and comprehension test for Red Blood Cells; iSTART self-explanation instruction (60 min) | 90 min playing StairStepper | GMRT posttest; self-explanation and comprehension test for Cell Repair |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).