Improving Science Assessments by Situating Them in a Virtual Environment
Abstract
1. Introduction
- How can an IVE be designed for assessment with emphasis on integrating scientific inquiry with content?
- Once designed, how is this IVE perceived by students and teachers in terms of engagement and usability?
- What is the impact of the visual context on students’ ability to demonstrate science learning?
- What evidence is there in what students do and say that gives insight into their understanding of scientific inquiry and content?
2. Theoretical and Research Context
2.1. Scientific Content and Inquiry
2.2. Developing and Assessing Students’ Scientific Inquiry and Content
“Janet has four identical containers. In each container there are 200 grams of a different colored sand, as shown below. All the sand is at the same temperature and has the same grain size. Janet leaves the containers out in the full sun for three hours. Then she measures the temperature of the sand in each container. Her results are shown below. Explain why the temperature of the sand in each container is different.”
2.3. Better High Stakes Tests
2.4. Immersive Virtual Environments and Assessment
2.5. SAVE Science
2.6. SAVE Science and the Four Assessment Conditions
“The picture (see Figure 1) below shows a type of fish that is adapted to live in the weedy areas of freshwater lakes. How is this fish adapted to live in the weedy areas in freshwater lakes?
- a) The upper fin looks like another fish.
- b) The lower fins look like the legs of a turtle.
- c) The stripes of the fish look like plants in the water.
- d) The mouth of the fish looks like the bottom of a lake.”
3. Methodology
3.1. Research Questions
- How can an IVE be designed for assessment with emphasis on integrating scientific inquiry with content?
- Once designed, how is this IVE perceived by students and teachers in terms of engagement and usability?
- What is the impact of providing visual versus textual context for assessment questions on students’ ability to demonstrate learning?
- What evidence is there in what students do and say that gives insight into their understanding of scientific inquiry and content?
3.2. Site and Sample
3.3. Procedure
3.4. Design of Sheep Trouble Assessment
- “Changes in environmental conditions can affect the survival of populations and entire species”—students need to recognize that the new flock of sheep has been transplanted to an environment very different from the one to which it is adapted and, drawing on their classroom learning about adaptations, infer that this may be affecting the flock’s survivability.
- “Describe the structures of living things that help them function effectively in specific ways (e.g., adaptations, characteristics)”—students describe the structural differences between the two flocks as indications of adaptations to different environments.
- “Explain how different adaptations in individuals of the same species may affect survivability or reproduction success”—students explain how the different adaptations impact survivability of the two flocks.
- “Apply appropriate measurement systems (e.g., time, mass, distance, volume, temperature) to record and interpret observations under varying conditions”—students should choose the appropriate tools (ruler, scale, graphing tool) for gathering, recording and interpreting data.
- “Interpret data/observations”—students collect data, but before reporting what they have found to the Farmer at the end, they must make sense of these data.
- “Use evidence, such as observations or experimental results, to support inferences about a relationship”—students need to identify the evidence that supports their hypothesized relationship.
- “Use evidence from investigations to clearly communicate and support conclusions”—students must communicate their findings and conclusions to the Farmer at the end of the test, using evidence they have collected to support their conclusions.
3.5. Analysis
4. Results and Discussion
4.1. Engagement and Usability
| Engagement | Assessment aspects | Overall |
|---|---|---|
| it was fun with the evidence that you had to find on the two kinds of sheep | It seemed like a real-life question | I liked how you could interact with the different people |
| the game was very intriguing. It was a brain puzzle but still lots of fun | it was sort of a challenge | Give a second hint about what the problem is |
| Its really fun | I think the barriers of the game was too small | I need more stuff to interact with |
| You get to figure out what’s wrong | It was fairly easy | It was really fun because you got to go around and explore why the new sheep were sick |
| the most interesting part was trying to find out what was wrong with sheep. | It was fun and realistic. It didn’t feel like we were just taking a test on a blank screen. | Make a bit harder and longer |
| its real enough looking that I can really get into it. | I think this a great way for students to test their skills | But the story will get old and it would probably be better if there were different challenges |
4.2. Evidence for Scientific Inquiry Understanding
4.3. Knowledge of Adaptation Content
4.4. Aspects for Improvement
5. Conclusion and Future Research
- Integrated content with scientific inquiry as opposed to separate questions
- Contextualized questions to help student apply their learning
- Efficient means for grading
- Statistically reliable and valid assessments
Acknowledgements
Conflict of Interest
References
- National Research Council. Classroom Assessment and the National Science Education Standards; The National Academies Press: Washington, DC, USA, 2001.
- National Research Council. A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas; The National Academies Press: Washington, DC, USA, 2012.
- National Research Council. Inquiry and the National Science Education Standards: A Guide for Teaching and Learning; The National Academies Press: Washington, DC, USA, 2000.
- Bybee, R. Teaching Science as Inquiry. In Inquiring into Inquiry Learning and Teaching in Science; Minstrell, J., Van Zee, E.H., Eds.; American Association for the Advancement of Science (AAAS): Washington, DC, USA, 2000; Chapter 3; pp. 20–46.
- Dewey, J. Democracy and Education (First Free Press paperback 1966 ed.); Macmillan Company: New York, NY, USA, 1944.
- National Research Council. National Science Education Standards: Observe, Interact, Change, Learn; The National Academies Press: Washington, DC, USA, 1996.
- Rutherford, F.J. Vital Connections: Children, Books, and Science. In Vital Connections: Children, Science, and Books; Saul, W., Jagusch, S.A., Eds.; Library of Congress: Washington, DC, USA, 1991; pp. 21–30.
- Li, J.; Klahr, D. The psychology of scientific thinking: Implications for science teaching and learning. In Teaching Science in the 21st Century, 1st ed.; Rhoton, J., Shane, P., Eds.; National Science Teachers Association: Arlington, VA, USA, 2006; pp. 307–327.
- Massachusetts Department of Education. Massachusetts Science and Technology/Engineering Curriculum Framework. 2006. Available online: http://www.doe.mass.edu/frameworks/scitech/1006.pdf (accessed on 15 January 2008).
- Anderson, R. Reforming science teaching: What research says about inquiry. J. Sci. Teach. Educ. 2002, 13, 1–12.
- Gibson, H.; Chase, C. Longitudinal impact of an inquiry-based science program on middle school students’ attitudes toward science. Sci. Educ. 2002, 86, 693–705.
- Savage, L.; Ketelhut, D.J.; Varnum, S.; Stull, J. Raising Interest in Science Careers through Informal After-School Experiences. Paper presented at the National Association for Research in Science Teaching Annual Meeting, Philadelphia, PA, USA, 2010.
- Leonard, W.H.; Speziale, B.J.; Penick, J.E. Performance assessment of a standards-based high school biology curriculum. Am. Biol. Teach. 2001, 63, 310–316.
- Alberts, B. Some Thoughts of a Scientist on Inquiry. In Inquiring into Inquiry Learning and Teaching in Science; Minstrell, J., Van Zee, E.H., Eds.; American Association for the Advancement of Science: Washington, DC, USA, 2000; Chapter 1; pp. 3–13.
- Blanchard, M.; Southerland, S.; Osborne, J.; Sampson, V.; Annetta, L.; Granger, E. Is inquiry possible in light of accountability? A quantitative comparison of the relative effectiveness of guided inquiry and verification laboratory instruction. Sci. Educ. 2010, 94, 577–616.
- Marx, R.; Blumenfeld, P.; Krajcik, J.; Fishman, B.; Soloway, E.; Geier, R.; Tal, R. Inquiry-based science in the middle grades: Assessment of learning in urban systemic reform. J. Res. Sci. Teach. 2004, 41, 1063–1080.
- Tai, R.; Liu, C.; Maltese, A.; Fan, X. Planning early for careers in science. Science 2006, 312, 1143–1144.
- Jorgenson, O.; Vanosdall, R. The death of science: What we risk in our rush towards standardized testing and the three R’s. Phi Delta Kappan 2002, 83, 601–605.
- Nelson, B.; Ketelhut, D.J. Designing for real-world inquiry in virtual environments. Educ. Psychol. Rev. 2007, 19, 265–283.
- Carnegie Corporation. The Opportunity Equation: Transforming Mathematics and Science Education for Citizenship and the Global Economy; Carnegie Corporation of New York: New York, NY, USA, 2009.
- Krajcik, J.S.; McNeill, K.L.; Reiser, B.J. Learning-goals-driven design model: Developing curriculum materials that align with national standards and incorporate project-based pedagogy. Sci. Educ. 2007, 92, 1–32.
- Lave, J.; Wenger, E. Situated Learning: Legitimate Peripheral Participation; Cambridge University Press: New York, NY, USA, 1991.
- Brown, J.S.; Collins, A.; Duguid, P. Situated cognition and the culture of learning. Educ. Res. 1989, 18, 32–42.
- Songer, N.; Wenk, A. Measuring the Development of Complex Reasoning in Science. Paper presented at the American Educational Research Association (AERA) Annual Meeting, Chicago, IL, USA, 25 April 2003.
- Michael, J. Conceptual assessment in the biological sciences: A National Science Foundation sponsored workshop. Adv. Physiol. Educ. 2007, 31, 389–391.
- Resnick, L.B.; Resnick, D.P. Assessing the Thinking Curriculum: New Tools for Educational Reform. In Changing Assessments: Alternative Views of Aptitude, Achievement, and Instruction; Gifford, B., O’Connor, M., Eds.; Kluwer Academic Publishers: Norwell, MA, USA, 1992; pp. 37–75.
- Southerland, S.A.; Smith, L.K.; Sowell, S.P.; Kittleson, J.M. Resisting unlearning: Understanding science education’s response to the United States’ national accountability movement. Rev. Res. Educ. 2007, 31, 45–77.
- National Research Council. America’s Lab Report: Investigations in High School Science; The National Academies Press: Washington, DC, USA, 2005.
- Harlow, A.; Jones, A. Why students answer TIMSS science test items the way they do. Res. Sci. Educ. 2004, 34, 221–238.
- Shavelson, R.J.; Baxter, G.P. What we’ve learned about assessing hands-on science. Educ. Leadersh. 1992, 49, 20–25.
- Behrens, J.T.; Frezzo, D.; Mislevy, R.; Kroopnick, M.; Wise, D. Structural, Functional, and Semiotic Symmetries in Simulation-Based Games and Assessments. In Assessment of Problem Solving Using Simulations; Baker, E., Dickieson, J., Wulfeck, W., O’Neil, H., Eds.; Lawrence Erlbaum Associates: New York, NY, USA, 2007.
- Stecher, B.M.; Klein, S.P. The cost of science performance assessments in large-scale testing programs. Educ. Eval. Policy Anal. 1997, 19, 1–14.
- National Assessment Governing Board (NAGB). Science Framework for the 2009 National Assessment of Educational Progress; NAGB, U.S. Department of Education: Washington, DC, USA, 2008.
- Barab, S.; Arici, A.; Jackson, C. Eat your vegetables and do your homework: A design based investigation of enjoyment and meaning in learning. Educ. Technol. 2005, 45, 15–20.
- Nelson, B. Exploring the use of individualized, reflective guidance in an educational multi-user virtual environment. J. Sci. Educ. Technol. 2007, 16, 83–97.
- Nelson, B.; Erlandson, B.; Denham, A. Global channels for learning and assessment in complex game environments. Br. J. Educ. Technol. 2011, 42, 88–100.
- Steele, M. Teaching science to middle school students with learning problems. Sci. Scope 2005, 29, 50–51.
- Ketelhut, D.J. The impact of student self-efficacy on scientific inquiry skills: An exploratory investigation in river city, a multi-user virtual environment. J. Sci. Educ. Technol. 2007, 16, 99–111.
- Shute, V.J.; Ventura, M.; Bauer, M.I.; Zapata-Rivera, D. Melding the Power of Serious Games and Embedded Assessment to Monitor and Foster Learning: Flow and Grow. In The Social Science of Serious Games: Theories and Applications; Ritterfeld, U., Cody, M.J., Vorderer, P., Eds.; Routledge/LEA: Philadelphia, PA, USA, 2009; Chapter 18; pp. 295–321.
- Clark, D.; Nelson, B.; Sengupta, P.; D’Angelo, C. Rethinking Science Learning through Digital Games and Simulations: Genres, Examples, and Evidence. An NAS Commissioned Paper. Available online: http://www7.nationalacademies.org/bose/Clark_Gaming_CommissionedPaper.pdf (accessed on 7 October 2009).
- Ketelhut, D.J.; Dede, C. Alternative Assessments of Students’ Understanding of Scientific Inquiry via a Multi-User Virtual Environment. Invited paper presented at the Distributed Learning and Collaboration (DLAC-II) Symposium, Singapore, 11 June 2007.
- The Commonwealth of Pennsylvania. Pennsylvania System of State Assessment. 2011. Available online: http://www.portal.state.pa.us/portal/server.pt/community/pennsylvania_system_of_school_assessment_(pssa)/8757/resource_materials/507610 (accessed on 30 December 2011).
- Du Bay, W. The Principles of Readability. 2004. Available online: http://www.nald.ca/library/research/readab/readab.pdf (accessed on 21 March 2013).
- Creswell, J.W. Qualitative Inquiry and Research Design; Sage: Thousand Oaks, CA, USA, 1998.
© 2013 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/).
Ketelhut, D.J.; Nelson, B.; Schifter, C.; Kim, Y. Improving Science Assessments by Situating Them in a Virtual Environment. Educ. Sci. 2013, 3, 172–192. https://doi.org/10.3390/educsci3020172