Systems 2016, 4(4), 38; doi:10.3390/systems4040038

Article
Designing Computer-Supported Complex Systems Curricula for the Next Generation Science Standards in High School Science Classrooms
1 Graduate School of Education, University of Pennsylvania, Philadelphia, PA 19104, USA
2 MIT Scheller Teacher Education Program, Massachusetts Institute of Technology, Cambridge, MA 02139, USA
3 Governor’s Academy, Byfield, MA 01922, USA
4 Faculty of Arts and Social Sciences, Lancaster University, Lancashire LA1 4YD, UK
5 Ministry of Education, Singapore
* Author to whom correspondence should be addressed.
Academic Editor: Nam Nguyen
Received: 28 August 2016 / Accepted: 1 December 2016 / Published: 3 December 2016

Abstract: We present a curriculum and instruction framework for computer-supported teaching and learning about complex systems in high school science classrooms. This work responds to a need in K-12 science education research and practice for the articulation of design features for classroom instruction that can address the Next Generation Science Standards (NGSS) recently launched in the USA. We outline the features of the framework, including curricular relevance, cognitively rich pedagogies, computational tools for teaching and learning, and the development of content expertise, and provide examples of how the framework is translated into practice. We follow this up with evidence from a preliminary study conducted with 10 teachers and 361 students, aimed at understanding the extent to which students learned from the activities. Results demonstrated gains in students’ complex systems understanding and biology content knowledge. In interviews, students identified influences of various aspects of the curriculum and instruction framework on their learning.
Keywords:
computer-supported science curricula; NGSS; complex systems; student learning outcomes

1. Introduction

Ranging from a single fertilized egg to the climate of the entire planet, complex systems are part of the fabric of our natural and social environments. Knowledge of these systems is essential for a nuanced understanding of the world around us. Recognizing their importance, the Next Generation Science Standards (NGSS) feature complex systems prominently in their goals for educating K–12 students in the United States, most visibly in the Crosscutting Concepts, which include Systems and System Models; Energy and Matter: Flows, Cycles, and Conservation; and Stability and Change. In the pure and applied sciences, the National Academies Keck Futures Initiative [1] has stressed the need to investigate the fundamental science behind these systems, as well as their benefits to society. However, the very nature of complex systems—with their many moving parts, nonlinear relationships, and unpredictability—makes them challenging to teach and to understand. To address these challenges, we present a computer-supported curriculum and instruction framework for teaching and learning about complex systems in high school science classrooms. In this design, development, and implementation project, we describe the features of the framework and document our work in translating it into practice in high school biology classrooms. We provide evidence from a preliminary study, aimed at gauging the extent to which students learned from these curriculum and instruction activities, which shows promising outcomes for further exploration and scaling.

1.1. Teaching and Learning about Complex Systems: Motivations and Theoretical Considerations

Although complex systems vary in their physical components, a common feature of all such systems is the presence of multiple interconnected elements that are bounded by structure and function. The interactions among the elements form a collective network of relationships that exhibit emergent properties not observable at subsystem levels. When perturbations occur, the network may self-organize in unpredictable ways, allowing new properties to emerge. The manner in which complex systems communicate, respond to perturbations, and self-organize is understood by studying the evolution of dynamic processes [2,3]. For example, human activities such as the burning of fossil fuels that emit carbon dioxide have impacted the carbon cycle in the atmosphere, which has in turn affected global weather systems in ways that are difficult to predict, e.g., when and where hurricanes will strike. The lower-level interactions between human behavior and the environment perturb normal atmospheric carbon cycling, causing the system to adapt and self-organize; the result emerges as extreme weather patterns at the macro level. What makes complex systems challenging to understand is that the mechanisms that fuel them, such as adaptation and self-organization, are often hidden and also take place over a long period of time. Nevertheless, learning about complex systems is a critical undertaking because so many human and environmental issues depend on our ability to investigate and manage them, e.g., the spread of disease, power grid robustness, and biosphere sustainability [1].
In order to understand the hidden dimensions and mechanisms of systems, scientists use computer simulations to visualize the dynamics, capture relational data at large scales, and make predictions [1]. In developing our computer-supported curriculum and instruction framework for complex systems, we are motivated by two strands of research in science education and the learning sciences: research documenting the value of computer simulations in the study of complex systems at the high school level, and research calling for the need to identify which facets of new curriculum and instruction activities are the most effective and efficient for the broadest possible audiences.

1.1.1. Using Computer Simulations to Teach Complex Systems

Computer modeling of complex systems, particularly within science education, has been an emerging focus in educational and learning sciences research. The ability to visualize patterns based on non-intuitive processes such as emergence or decentralization [4,5] is thought to be better enabled through computational simulations than through static images or descriptions found in textbooks and other traditional curricular resources [6]. Several software applications and associated curricula, including StarLogo, NetLogo, Biologica, and handheld Participatory Simulations [6,7,8,9,10,11], have been created to teach complex systems and models. These projects, which include several of our own, represent initial steps in the design and development cycle of computational and curricular supports intended to improve students’ understanding of complex systems. Much of this initial work focused on developing simulations of phenomena and understanding how students learn from the simulations. However, more work is needed in order to fully integrate these supports with existing classroom resources and to align them with new NGSS reforms. Specifically, we need to develop frameworks for implementation that can accommodate a range of variables, such as availability of computers and teacher knowledge, that influence classroom practice. In a recent National Research Council [12] report on implementing NGSS, the authors write,
Teachers need resources … that bring together target core ideas, crosscutting concepts, and practices. Such resources may include text materials, online resources (such as simulations or access to data), materials and equipment for investigative and design activities, and teacher manuals that include supports for pedagogical strategies needed to enact these lessons.
(p. 53)
At the time that report was written, the authors noted that no curricular resources had yet been built explicitly for the NGSS, although we now know that a number of initiatives are well underway, for example, Amplify Science and iQWST for elementary and middle school levels (see https://www.amplify.com/ and http://www.activatelearning.com/iqwst/). In the work reported here, we aim to support systems learning by incorporating many of the scientific practices and crosscutting themes articulated in the NGSS for high school science.

1.1.2. Understanding Implementation of Educational Tools for Teaching Complex Systems

Our study is also motivated by a need to determine which curriculum and instruction activities have the potential to succeed in a variety of learning environments and to influence the greatest number of students and teachers. This research challenge has been cogently described by Penuel and his colleagues [13]. To illustrate the necessary design elements for innovative educational tools and practices to take hold, Penuel’s team developed a design-based implementation research (DBIR) framework. The first element of the DBIR framework is the formation of teams of multiple stakeholders who are focused on persistent problems of practice. Such collaborative teams of researchers and practitioners should address important problems that present challenges to implementation, with practitioners having a say in defining the problems. The second element is a commitment to iterative collaborative design, in which teams focus their efforts on designing, collecting and analyzing data, learning from, and redesigning project activities that may address curriculum construction, professional development (PD), and organizational change. The third element entails developing theories about how both students and teachers learn in particular contexts, and the activities, resources, and organizations that support this learning. Finally, the fourth element of the DBIR framework concerns developing capacity for sustaining change within systems through intentional efforts to develop organizational infrastructures.
The DBIR framework has guided us throughout the design and implementation of our study. In constructing and carrying out our curriculum and instruction framework, for example, we attended a great deal to the first two DBIR elements (see Section 2.1. for more description about how we worked with teachers as collaborators in the design). We addressed the third element by incorporating a number of theories (see Section 1.2.2.) on how people learn in particular contexts (e.g., teachers in PD) with particular resources (e.g., computer simulations). We build on previous research on integrating curricula and tools into science classroom instruction (e.g., [14]) and professional development activities that support these efforts (e.g., [15]), with the goal of contributing new insights and extensions to these theories and research. Finally, in this article, we set our sights on the fourth DBIR element by offering a modular framework for other researchers and practitioners to adopt, assess, and adapt when building robust learning opportunities centered on complex systems for use in science classrooms.

1.1.3. Research on Science Instruction and Modeling Tools

In developing a curriculum and instruction framework for teaching and learning about complex systems, we build on research that has harnessed technology’s potential to scaffold learning experiences in the science classroom [16]. While a complete review of this research is beyond the scope of this paper, we briefly describe two long-standing projects that have shaped our efforts in developing classroom experiences for science students and teachers through anchoring curriculum in inquiry-based instruction, developing student scientific practices through simulations, and offering insights into how to work with teachers in professional development activities. WISE has pioneered a curriculum and instruction design that focuses on making thinking visible through modeling tools, developing activities that allow all students to access science content by bridging it to students’ prior knowledge, enabling collaboration between students to construct understanding, and promoting autonomous learning by engaging students in student-centered inquiry activities [17]. Similarly, BGuILE has developed technology-infused curricular units that help students construct scientific explanations through the negotiation of in-depth empirically collected data [18]. In the BGuILE curriculum design, students are guided to make connections between observed experimental patterns and domain theories through scientific explanations and scientific practices like argumentation [19]. Digital environments are used to simulate experiences that scientists might encounter in the real world. For example, Tabak and Reiser [20] describe an investigation strategy in evolutionary biology, enabled by the BGuILE modeling tool, that asks students to observe patterns of variable interactions, compare structural and behavioral outcomes induced by variable changes, relate structural and behavioral changes to function, and explain theories that emerge from the investigation process.
In addition to developing curriculum and technology-based tools for science instruction, researchers have focused on the PD activities needed to support teachers in the classroom. A review of the literature on PD for technology-enhanced inquiry science by Gerard and colleagues [15] suggests that teachers are more likely to improve students’ experiences of inquiry-based science learning when PD enables teachers to engage in constructivist learning processes. Such PD involves capitalizing on what teachers know, allowing them time to integrate ideas, and providing space for them to customize the curriculum through successive classroom tests and refinements. Professional guidance to help teachers customize the curriculum is also an important factor in this process, as is PD that continues for more than a year (which is typically the amount of time teachers need to make the necessary adjustments based on their unique instructional practices and classroom environments). Moreover, science inquiry projects in which the technology is embedded within the curricular units support implementation more than projects that rely on teachers to determine the technology’s affordance or to develop their own curricular units. Finally, partnerships between researchers and teachers help address a number of issues, including the technical challenges that impede instruction—a point that further supports the need for PD that lasts longer than the initial year of implementation. Beyond these best practices, we also considered the supports needed for (a) teaching and learning about complex systems as a content domain; and (b) for students programming their own computer models [9,21].

1.2. Complex Systems Curriculum and Instruction Framework

Our complex systems curriculum and instruction framework is depicted in Figure 1.
The framework has four major categories: curricular relevance, cognitively rich pedagogies, tools for learning and teaching, and content expertise. These categories build on the previously reviewed literature and are additionally aligned with the literature on needs and best practices for STEM teaching and learning through computer-supported modeling tools and complex systems content.

1.2.1. Curricular Relevance

This first category of the framework focuses on developing 21st-century competencies [22], ensuring alignment with standards [23], and supporting collaboration with teachers to promote teacher ownership [24,25,26]. All of these aspects of education work toward creating utility for students in terms of the scientific knowledge and practices they are engaged in as well as ensuring that teachers find relevance and utility in their teaching. In our project, we emphasized building 21st-century skills by incorporating into the curriculum problem solving, critical thinking, and self-directed learning through self-paced experiments performed in teams of two. We ensured alignment with standards by drawing on content, practices, and crosscutting concepts outlined in the NGSS. For example, each unit required that students interpret graphs, and in many cases, construct their own graphs from the data they collected. Modeling and interpreting models of biological phenomena, which highlight aspects of systems, allowed students to participate in practices that cut across science fields while deepening their content knowledge. We collaborated with teachers as research partners by seeking continual feedback about challenges in classroom implementation and by working collectively with teachers to solve problems, improve the project, and promote optimal implementation. Finally, we facilitated peer sharing through an online database where teachers could post lesson plans and comment on implementation details.

1.2.2. Cognitively Rich Pedagogies

The second framework category involves pedagogies that promote (a) the social construction of knowledge through collaboration and argumentation [27] and (b) constructionist learning through the construction of models [28]. For example, in every unit, experimental results are examined and discussed through a process of argumentation whereby student groups are asked to reflect on evidence that can be brought to bear on claims made about the results. Students are also required to provide the reasoning linking their claims and evidence, with small-group responses shared in the larger group to enable group members to check their understanding. Figure 2 provides an example of an argumentation activity from the unit on evolution. Furthermore, in this category, using the StarLogo Nova application (described in more detail below), students learn how simulations work by deconstructing existing simulations (getting guided tours of the code behind them) and constructing their own simulations (often adding on to existing simulations to give them a head start). Both of these activities connect the code in the simulation to the underlying scientific phenomena in these units.

1.2.3. Tools for Teaching and Learning

The third category builds knowledge through computational modeling tools [29]. Participants use an agent-based modeling platform called StarLogo Nova that combines programming based on graphical blocks with a 3D game-like interface, as illustrated in Figure 3a,b. Figure 3a shows the graphical programming language in which computational procedures are built into easily assembled blocks [30,31] to execute commands. Using the blocks programming language, students can simply drag and drop blocks of code, which are organized into categories. This eliminates the need for students to know or remember command names and the accompanying syntax, which are both significant barriers to novice programmers. Figure 3b shows how the language is translated into the graphical interface of the system being modeled. These images depict an ecological system in which fish and algae interact.
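The agent-based paradigm that StarLogo Nova implements can be illustrated with a minimal text-based sketch. The code below is Python rather than StarLogo blocks, and the agent names, parameters, and update rules are our own simplification for illustration, not the project's actual fish-and-algae model:

```python
import random

class Fish:
    """A simple agent with an energy store; it dies when energy reaches zero."""
    def __init__(self, energy=10):
        self.energy = energy

def step(fish_list, algae, eat_gain=3, move_cost=1, breed_at=20):
    """One tick: each fish pays a movement cost, may eat one unit of algae,
    and splits into two fish when its energy crosses the breeding threshold."""
    survivors = []
    for f in fish_list:
        f.energy -= move_cost                    # cost of moving this tick
        if algae > 0 and random.random() < 0.5:  # chance of finding food
            algae -= 1
            f.energy += eat_gain
        if f.energy >= breed_at:                 # reproduce: split energy
            f.energy //= 2
            survivors.append(Fish(f.energy))     # offspring agent
        if f.energy > 0:
            survivors.append(f)
    algae += 5                                   # algae regrowth per tick
    return survivors, algae

def run(ticks=50, n_fish=20, algae=100, seed=0):
    """Run the simulation and record (fish count, algae) each tick."""
    random.seed(seed)
    fish = [Fish() for _ in range(n_fish)]
    history = []
    for _ in range(ticks):
        fish, algae = step(fish, algae)
        history.append((len(fish), algae))       # the data a student would graph
    return history
```

The per-tick population counts in `history` correspond to the real-time graphs students read alongside the simulation; emergent boom-and-bust dynamics arise from the local agent rules rather than from any global equation.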
Throughout the project, students interacted with models that visually represent scientific phenomena such as ecological and evolutionary systems. The models help students understand the dynamic processes of systems, such as self-organization and emergence, by visually representing system states and processes at multiple scales. Students use the models to conduct experiments by collecting and analyzing data and drawing evidence-based conclusions. Teacher guides and student activity packs promote teacher and student autonomy and also provide suggestions for adapting and extending practice, which we encourage teachers to do. Both the teacher guides and student activity packs make explicit connections to scientific practices, model representations of scientific ideas, and complex system processes. See Figure 4 for an excerpt from the teacher guide to the unit on evolution. Here we see instruction in collecting data from multiple trials (scientific practices) and in how random variation is programmed into fish movement and manifests in the model. Other examples of scientific practices include aggregating data for more accuracy and precision, controlling variables, changing variables to test hypotheses and make claims, arguing from evidence, and making sense of emerging system patterns through graphs that run in real time alongside the simulation. Other complex system characteristics that students can see through studying and manipulating the code include how agents interact with other agents, how agents and the environment interact, and the effects of perturbations on system states, e.g., equilibrium.

1.2.4. Content Expertise

The fourth category builds deeper understanding of complex systems [2] and biology [32]. The project team built instructional sequences for five high school biology units—Genetics, Evolution, Ecology, the Human Body, and Animal Systems. There is no set sequence for the units; instead, teachers can implement the units in the order that suits their school curriculum. The curricular materials for each unit take two or three days to complete and include popular and academic literature about complex systems, short movies, PowerPoint presentations, and vocabulary lists. These are provided to teachers.
An additional area of expertise that we intended to develop tangentially is computational thinking. Models are set up to allow students to explore the program that executes the model with the goal of developing skills related to computation, such as algorithmic thinking; as stated earlier, some models require students to manipulate the program and construct their own systems. However, we wanted to use these notions of computational thinking in the service of developing content expertise and did not explicitly measure its development in the project. For example, in predator–prey interactions, students become familiar with variables, how they are controlled, and how the variables can be programmed to exhibit system behaviors on the screen.

2. Methodology

In the following sections we describe our preliminary investigation of student learning outcomes from a yearlong implementation of project units in high school biology classrooms.

2.1. Context

The data and outcomes reported here result from classroom implementations of the project’s five curricular units during the 2013–2014 school year by teachers who trained with us over the course of two years (week-long summer PD sessions and workshops during the academic year). An extended description of PD activities can be found elsewhere, in addition to information on teacher learning and change in instructional practices [6]. However, we provide some brief characteristics of the PD in order to demonstrate the efforts made to ensure a high-quality training experience. We designed and conducted the PD following professional judgments about what constitutes high-quality PD: (a) aligned content; (b) active learning opportunities; (c) coherence with professional demands; (d) at least 20 h in duration and spread over a semester; and (e) collective participation of teachers from the same school, grade, or subject [23,33]. Of these five characteristics, we considered active learning to be particularly important. Due to the well-documented, steep learning curve teachers experience in adopting new technologies in their classroom [24,34], we emphasized exposure to computers [25] and extensive training on computers [35] through active engagement to give teachers needed experiences to support adoption efforts. We also incorporated the other characteristics judged to be important for a quality intervention. For example, we aimed to achieve coherence with professional demands by working closely with teachers to understand and incorporate daily issues they may face in negotiating new technology-based programs. A critical element in this partnership with teachers was the iterative program design cycles, in which teachers became co-designers in upgrading the simulation and curricular activities with the goal of improving classroom implementation. We delivered 80 h of face-to-face PD (40 in the first year and 40 in the second year). 
We also focused on collective participation by working only with high school biology teachers, which provided them with opportunities to work with others in the same content area. In some cases, several teachers from the same school were able to work together, which gave them opportunities to share knowledge of situational affordances and constraints.
After the summer PD in each year, teachers were expected to implement the five project units in their biology classrooms throughout the year. The units typically spanned two to three days of instruction. Students worked in pairs and completed inquiry prompts that required them to run experiments, make predictions, collect and interpret data, and answer argumentation questions based on the StarLogo Nova simulation for that unit. The units covered the biology topics of Genetics, Evolution, Ecology, the Human Body, and Animal Systems. Nine of the teachers completed all five units, and one teacher completed three of the units. We provide data and results from their second year of implementation.

2.2. Participants

We recruited 10 teachers—seven women and three men—from seven Boston area public schools. The teachers came from a diverse set of schools. One school’s enrollment was 71% ethnic/racial minority students, while another school was almost entirely white (3% minority). School-level percentages of low-income students ranged from 14% to 59%. The percentage of students who were proficient or advanced on the state standardized science test ranged from 65% to 94%. Teachers, on average, had eight years of teaching experience, with a range of 3.5 to 19 years. We collected student data in 10 teachers’ classrooms from a total of 361 students ranging from freshman to 12th grade college prep and honors levels. The schools did not release individual student demographic and achievement data to us, so we cannot report accurate sample data in these areas. However, due to the range of classrooms and academic levels, we believe that the students we worked with are a relatively representative sample of the population-level statistics that are reported. See Table 1 and Table 2 for detailed demographic information and numbers of students distributed across grade levels.

2.3. Data Sources and Analyses

To investigate learning outcomes, we conducted a mixed methods evaluation of students over the course of the 2013–2014 school year. Both before and after the intervention, measurements were collected on student understanding of biology and complex systems. We also conducted focus group interviews with students. Biology content understanding was assessed using 14 multiple-choice questions compiled from several state and national standardized science exams (we consulted the 1999–2001, 2010, and 2011 New York State Board of Regents Biology exams; the 2009 California Standards Test in Biology; and the 2000 and 2009 National Assessment of Educational Progress (NAEP) exams). Students completed the test at both the start and end of the school year so that learning gains could be assessed. A paired t-test was conducted to analyze growth in student biology understanding.
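A paired t-test compares each student’s pre- and post-test scores rather than the two group means independently. A minimal sketch of the computation (the scores here are illustrative, not the study’s data):

```python
import math
import statistics

def paired_t(pre, post):
    """Paired t-test: t = mean(d) / (sd(d) / sqrt(n)), where d = post - pre.
    Returns the t statistic and degrees of freedom (n - 1)."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    se = statistics.stdev(diffs) / math.sqrt(n)  # standard error of the mean diff
    return statistics.mean(diffs) / se, n - 1

# Hypothetical pre/post scores for five students (not the study's data)
pre  = [6, 7, 8, 5, 9]
post = [8, 9, 9, 7, 10]
t, df = paired_t(pre, post)  # t ≈ 6.53, df = 4
```

Pairing removes between-student variability from the error term, which is why the same mean gain can be significant in a paired design but not in an independent-samples one.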
Students also completed an assessment of complex systems understanding, at both the start and the end of the school year. This instrument consisted of an open-ended ecology prompt described below, asking students to write down anticipated changes in a biological complex system.
Imagine a flock of geese arriving in a park in your town or city, where geese haven’t lived before. Describe how the addition of these geese to the park affects the ecosystem over time. Consider both the living and non-living parts of the ecosystem.
Responses were scored on a scale of 1 (not complex) to 3 (completely complex) for each of four different dimensions of complex system understanding. These components were derived from earlier research [2,3,36,37]; the categories are listed in Table 3. Aggregate scores on this exam ranged from 4 to 12.
For the coding, three raters were rigorously trained over several weeks. Due to the complexity of this task, all raters were asked to code 20% of the data. We then ran reliability tests between pairs of raters in order to determine which raters had the highest alpha scores in each of the complex systems categories. Alpha scores for each pair ranged between 0.731 and 0.823. The pair of raters then scored all of the student responses in their assigned categories. All discrepant codes were discussed and a single code was assigned. Student growth in complex systems understanding was determined from their scored responses through a paired t-test.
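The study reports alpha reliability between rater pairs; as a simpler stand-in for readers, pairwise chance-corrected agreement can be illustrated with Cohen’s kappa, a related two-rater statistic (the codes below are hypothetical, not the study’s data):

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters: (p_o - p_e) / (1 - p_e),
    where p_o is observed agreement and p_e is agreement expected by chance."""
    n = len(r1)
    p_o = sum(a == b for a, b in zip(r1, r2)) / n      # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    labels = set(r1) | set(r2)
    p_e = sum(c1[k] * c2[k] for k in labels) / (n * n)  # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes on the 1-3 complexity scale for ten responses
rater_a = [1, 2, 2, 3, 1, 2, 3, 3, 1, 2]
rater_b = [1, 2, 3, 3, 1, 2, 3, 2, 1, 2]
kappa = cohens_kappa(rater_a, rater_b)  # ≈ 0.70
```

Chance-corrected statistics such as kappa and alpha discount the agreement two raters would reach by guessing alone, which matters on a three-point scale where raw agreement is easy to achieve.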
Focus groups were also conducted at the end of each classroom observation with four to five students who volunteered to be interviewed. Teachers were instructed to select a range of students who they considered to be high, medium, and low achievers. The interview consisted of five structured and semi-structured questions to probe student learning of complex systems and biology when interacting with the simulations and curricula. The questions were: (1) How was your learning experience similar or different to how you learn normally in a classroom?; (2) Did you understand how to answer the argumentation questions?; (3) What was the main biology idea represented in this unit?; (4) What do you know about complex systems?; and (5) Have you ever used computer simulations or models before in this class or any other classroom? In total, 12 focus group interviews of 50 students (28 female, 22 male) were conducted in seven schools, lasting in total 6 h and 3 min. The focus group transcripts were coded according to a framework adapted from the NGSS to assess the impact of the curriculum and instructional framework on student learning. A categorization manual was constructed using descriptions of the five disciplinary core ideas, the eight scientific practices, and the seven crosscutting concepts from the NGSS [38]. Table 4 shows definitions used for three of the 20 NGSS categories (for a full description of the categorization manual, see [39]).
Initial coding was completed through a modified method of interaction analysis (IA). This typically involves analyzing video and/or audio data to examine the details of social interactions [40]. The analysis is normally performed in collaborative work groups. In our study, four researchers read through the transcripts of four focus group interviews (33% of the total number of focus groups interviewed) and identified responses from students that indicated learning in the NGSS categories. Responses could be coded in multiple categories. After an acceptable level of agreement was reached, one researcher coded the remaining eight focus group interviews.
The following excerpt provides further details about how students’ responses were coded. To answer the first interview question, “How was your learning experience similar or different to the way you learn normally in a classroom?” one student said,
Visualization helps and it helps that it’s moving so you know how long it takes, how fast the graphs, if you just look at the graph without any idea of what the simulation is, it would just make no sense to anybody. But if you watch the fish swim around and eat the algae, you saw the population plummet. It just shows you the exact numbers versus something that you don’t really understand what’s going on.
(Focus Group 5, 17 October 2013)
This student’s comment was coded in the first two categories listed in Table 4. Here the student discusses using the simulation to understand the relationship between the fish and algae in the ecology model, showing an understanding of developing and using models. He also mentions the use of the graph to provide evidence of population changes, showing an understanding of cause and effect relationships.

3. Results

The results reported below illustrate the curriculum and instruction framework’s influence on student learning. This section is organized around student content expertise, curricular relevance to the NGSS, and what students believed supported their learning.

3.1. Content Expertise

The project sought to develop content expertise in the topics of biology and complex systems. From students’ pre- and post-surveys, the results indicated gains in both content areas. Table 5 displays results from the paired t-test of biology scores, conducted at the beginning and end of the 2013–2014 academic year. The results show that student scores increased significantly from pre- to post-assessment—from a mean of 7.67 (SD = 2.36) to 9.43 (SD = 2.47), where t(345) = 12.50, p < 0.001, and the effect size is 0.67 (Cohen’s d).
Analysis of responses from the complex systems ecology prompt shows similar gains. Table 6 shows that students’ complex systems understanding demonstrated significant positive growth, moving from a mean of 5.80 (SD = 1.23) to 6.79 (SD = 1.29), where t(360) = 12.26, p < 0.001, and the effect size is 0.65 (Cohen’s d).
Our research design does not allow us to identify whether or not these gains would have occurred in the absence of our curriculum and instruction framework. As this was a preliminary study, we did not randomly assign students or classes to control groups. It is also important to note that average gains in student understanding were small in absolute terms. The typical student improved his or her score by only one or two points or questions over the course of the year. Nonetheless, statistical tests demonstrate that learning clearly occurred, and the effect size of students’ respective growth in content expertise is encouraging. The effect sizes of 0.67 and 0.65 are interpreted as medium effects in Cohen’s d terms [41]. Moreover, according to Bloom and colleagues, ninth graders typically experience an effect size of around only 0.19 in science learning over the course of their freshman year as measured by several nationally normed tests [42]. As ninth grade was the modal, or most common, grade level in our study, it is quite possible that the science learning under our framework exceeded that of traditional science classrooms.
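The reported effect sizes are consistent with the common paired-design convention d = t/√n (equivalently, the mean of the gain scores divided by their standard deviation): 12.50/√346 ≈ 0.67 and 12.26/√361 ≈ 0.65. A minimal Python sketch of this computation, using hypothetical scores rather than the study data:

```python
import math
from statistics import mean, stdev

def paired_t_and_d(pre, post):
    """Paired t statistic, effect size d = t / sqrt(n), and degrees of freedom."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / math.sqrt(n))  # paired-samples t
    d = t / math.sqrt(n)  # equals mean(diffs) / stdev(diffs)
    return t, d, n - 1

# Hypothetical pre/post scores for four students (illustration only)
pre = [1, 2, 3, 4]
post = [3, 3, 5, 5]
t, d, df = paired_t_and_d(pre, post)
print(round(t, 2), round(d, 2), df)  # 5.2 2.6 3
```

Note that this d is computed on the difference scores; other conventions (e.g., standardizing by the pooled pre/post SD) give smaller values for correlated measures.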

3.2. Curricular Relevance

A central goal of the framework was to align curriculum and instruction with the NGSS. From the student focus group interviews, we coded 1056 responses indicating student learning outcomes that illustrated the standards. Figure 5 shows the frequency of responses that fell into each category. Students identified learning instances in 18 out of the 20 categories. Not surprisingly, given the emphasis of our simulation tool and curricula, the categories with the most responses were Developing and using models; Cause and effect; and Systems and system models. However, we were encouraged to see that most of the other NGSS categories were represented in what students thought they learned. Other categories showing relatively high frequencies of responses were Planning and Carrying out Investigations, Analyzing and Interpreting Data, Using Mathematics and Computational Thinking, and Patterns. That students discussed their learning experiences with respect to these scientific practices and crosscutting themes is important as the science education community aims to identify resources that enable NGSS goals to be realized in science instruction. We discuss further alignment of NGSS categories in the section on students’ perceptions of learning supports.
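The frequency analysis behind a figure like Figure 5 amounts to flattening multi-label codes and tallying per category. A short sketch with hypothetical coded responses (category names abbreviated):

```python
from collections import Counter

# Hypothetical coded responses: each student response may carry several NGSS codes
coded_responses = [
    ["Developing and using models"],
    ["Developing and using models", "Cause and effect"],
    ["Systems and system models", "Cause and effect"],
    ["Patterns"],
    ["Developing and using models", "Systems and system models"],
]

# Flatten the multi-label codes and count occurrences per category
tally = Counter(code for response in coded_responses for code in response)
for category, count in tally.most_common():
    print(f"{category}: {count}")
```

Because responses can be coded in multiple categories, the category counts sum to more than the number of responses, as in the study's 1056 coded instances.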

3.3. Students’ Perceptions of Learning Supports

Students articulated benefits accrued from participating in cognitively rich pedagogies. All focus groups mentioned the positive impact on their learning as a result of participating in the argumentation process. In the following excerpt from a focus group interview, one student spoke about the learning affordances in figuring out what her evidence showed in order to answer one of the argumentation questions, which we included as a process to encourage the social construction of ideas (see Figure 2 for an example):
Well, I think the hardest part about ... it was like figuring out what the claim was like ... what does your evidence show? Why did I just collect all of this evidence? What did I discover off that? But after you figured out what your claim was you had a bunch of evidence to back it up. So everything from there on was pretty easy.
(Focus Group 7, 17 October 2013)
In this excerpt, we can see that this student was cognitively involved in understanding the purpose of the experiment and how the evidence was brought to bear on her understanding.
Another example of student learning resulting from project efforts to build in cognitively rich pedagogies happened while working with the unit “Catching Flies,” in which students learned to program their own simulations. The following quote illustrates how building the simulation supported this student’s particular style of learning:
I feel like I’m much more of a hands-on learner so for them to be projecting on the board and us watching it’s not as helpful for me than us doing it on the computer … when you get to do it on a computer by yourself because you get to actually experience it and that’s kind of important for learning. You can see all your mistakes and you can try to fix them yourself instead of having the teacher do it the first time and getting it right.
(Focus Group 9, 12 December 2013)
Through the actual manipulation of the program, students were able to examine and modify their own errors, which potentially led to deeper learning. In the following exchange, it is clear that through constructing simulations students were also able to begin testing their developing ideas of complex systems:
I think that all the coding, it was a lot of different things that came together, but then when it was on the simulation it was just a simple few flies moving around randomly and it didn’t look that different, but when you went back into the code it had a lot of different parameters.
(Focus Group 9, 12 December 2013)
In this unit, students program multiple variables into the system and learn about how different initial conditions and the inclusion of different variables can significantly impact the patterns that emerge. The excerpt shows that building the simulation, which was included as a constructivist activity, helped students make connections between the code, behaviors on the screen, and the complex concept that was represented.
Active experimentation with the simulations similarly enabled students to develop a deeper understanding of complex system processes. For example, in the following interview excerpt concerning the unit on enzymes, one student spoke about the ability to manipulate the inputs (the number of starches and enzymes present) in the model, which enabled her to gain a greater understanding of the complex system concept of predictability:
Yeah, I didn’t know if they sought out all the time or if they were just moving randomly most of the time. So I tried I think it was like 20 starches and then I added like 10 enzymes. I thought because it was 10 and 20 that it would come out like that, but it didn’t. It came out to totally different numbers and that just made me understand how no matter the number, you can always have different outcomes.
(Focus Group 10, 21 November 2013)
An accurate understanding of complex systems processes suggests that it is not possible to predict the exact outcome of system effects each time. What is important to note from the above quote is that the student recognized that outcomes of system processes may vary even under the same initial conditions. The student was able to gain this knowledge by experimenting with a simulation in which random variability was built into the system.
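The behavior the student describes, identical initial conditions producing different outcomes, follows directly from stochastic agent rules. The following is not the StarLogo Nova enzyme model itself, but a toy Python analogue with assumed parameters that makes the point:

```python
import random

def run_enzyme_model(n_starch=20, n_enzymes=10, steps=30, p_encounter=0.05, seed=None):
    """Toy stochastic model: each step, each enzyme independently has a small
    chance of encountering and breaking down one remaining starch molecule.
    Parameters are illustrative assumptions, not values from the curriculum."""
    rng = random.Random(seed)
    starch = n_starch
    for _ in range(steps):
        for _ in range(n_enzymes):
            if starch > 0 and rng.random() < p_encounter:
                starch -= 1
    return starch

# Same initial conditions every run; only the random seed differs
outcomes = [run_enzyme_model(seed=s) for s in range(10)]
print(outcomes)
```

Running the model repeatedly yields a distribution of remaining starch counts rather than a single fixed answer, which is exactly the unpredictability the student discovered.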
Beyond manipulating the simulations, and using evidence to understand phenomena, we were also interested in learning about the impact of the guided inquiry processes included in the curricula. All interviews indicated that students’ improvement in understanding the scientific phenomena was enabled by the StarLogo Nova simulations. Moreover, many students spoke about the utility of the student activity packs in providing them with room to explore and learn on their own. In the following excerpt, one student discusses how the dynamic visualization helped her to see how the system changed over time:
I think the hands-on experience just helps you process the information better because it’s right there and you can see it going on. You can see the process. You can see what’s happening. Even though it’s a short amount of time you understand what would happen over a long amount of time instead of just being told what happens.
(Focus Group 7, 17 October 2013)
Students also spoke about the hidden information that was revealed by working with the simulations. After the unit on diffusion of sugar into the bloodstream, a student noted how she understood the actual movement of the sugar molecules:
Yeah, and I learned that the molecules bounce off the walls randomly. I thought they just went in a straight line to the bloodstream.
(Focus Group 12, 11 February 2014)
With respect to the student activity packs, students discussed the curricula’s ability to offer additional opportunities to experiment and to answer student-generated questions, as documented in this excerpt:
But learning it beforehand and then having to follow specific directions and then doing it, it really helps because if you’re following the directions and it works right it’s going to be like “Oh, okay, I understand now. If I do this and I do that, then I can do that.” But then you can make other questions and hypotheses and everything like that and then you can be like “If I do this will this happen?” And you can actually try it because you have that freedom of trying it on the simulation.
(Focus Group 10, 21 November 2013)
While the activity packs provided specific questions to guide student learning, this student used the simulation to answer her own questions.

4. Discussion

The study of complex systems in education has been supported through policy, scholarship, and the allocation of resources. For example, all seven of the crosscutting concepts in the new NGSS reflect important aspects of complex systems such as Scale, and Structure and Function. This has raised challenges for educators who must follow the NGSS alongside other contextual and professional demands. Thus, understanding optimal methods for constructing educational experiences about complex systems is critical. In this paper, we introduced a framework for teaching and learning about complex systems that addressed needs in designing curriculum for classroom implementation [12]. The framework builds on previous research in science education on inquiry-based projects [17,20] and included other known best practices and recent research, demonstrating the importance of developing programs that have curricular alignment, engage in cognitively rich pedagogies, use appropriate tools for teaching and learning about complex systems, and develop content expertise in science and complex systems content.
This framework moves the field of complex systems in science education forward by offering a comprehensive program for classroom implementation that has been vetted by teachers as co-designers in the iterative design cycle. In addition to improving curriculum and instruction activities, this co-design model can also be understood as a necessary variable in producing successful implementation and student learning outcomes. Both the collaboration and the iteration represent the first two elements in the DBIR framework [13] in the service of tackling issues in adopting innovative technology-based programs.
In our curriculum and instruction framework design, we also focused heavily on constructing learning experiences that build on what we know about how learning best happens, especially in the domain of science. Much of our curriculum was constructed to capitalize on the ability of computer simulations to easily enable iterative experimentation, provide multiple sources and inputs for data collection, and present dynamic visual details about different aspects of systems that would not be accessible through static images or text-based descriptions. Based on student perceptions of what supported their learning, constructing the simulations also played a central role in understanding the components of the system and how they interacted with each other and the environment. With the inclusion of other important scientific practices like argumentation and analyzing and interpreting data in the learning activities, students were able to demonstrate robust learning in the content areas of biology and complex systems with effect sizes that exceed known yearly learning gains in science classrooms [42]. This is an encouraging result despite the fact that we did not have a control group for comparison. Our next step will be to run an experimental efficacy trial to support these preliminary findings. In other publications, (e.g., [39,43]) we have discussed some of the challenges associated with professional development and teacher learning as a result of implementation, which will also be investigated further in subsequent studies.
However, one of our main goals in this paper was to develop and provide promising evidence of the effectiveness of a modular framework for other similar systems researchers and practitioners to assess and use, which is the last element of the DBIR process. We believe this is an important pursuit, particularly as we embark on investigating science programs that can instantiate qualities of the NGSS to improve science learning for all students. In this preliminary study, we hope we have advanced this conversation, and we eagerly seek commentary in the service of advancing complex systems in education research and practice.

Acknowledgments

This research was funded by a grant from the U.S. National Science Foundation, #1019228.

Author Contributions

E.K., J.S. and D.W. contributed to the computer resources, the professional development, and overall vision of the project; I.S. and H.S. constructed the curriculum and conducted professional development; E.A. and J.K.-Y. contributed to the professional development and data collection; S.G. contributed to the development of the curriculum and instruction framework; E.A., J.K.-Y., M.O. and C.E. contributed to the data analysis; S.Y. wrote the paper and conceived of the project with E.K.; E.A. contributed writing, editing, and formatting of the paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. The National Academies. Keck Futures Initiative: Complex Systems: Task Group Summaries; The National Academies Press: Washington, DC, USA, 2009. [Google Scholar]
  2. Yoon, S. An evolutionary approach to harnessing complex systems thinking in the science and technology classroom. Int. J. Sci. Educ. 2008, 30, 1–32. [Google Scholar] [CrossRef]
  3. Yoon, S.A. Using social network graphs as visualization tools to influence peer selection decision-making strategies to access information about complex socioscientific issues. J. Learn. Sci. 2011, 20, 549–588. [Google Scholar] [CrossRef]
  4. Penner, D. Explaining systems: Investigating middle school students’ understanding of emergent phenomena. J. Res. Sci. Teach. 2000, 37, 784–806. [Google Scholar] [CrossRef]
  5. Resnick, M.; Wilensky, U. Diving into complexity: Developing probabilistic decentralized thinking through role-playing activities. J. Learn Sci. 1998, 7, 153–172. [Google Scholar] [CrossRef]
  6. Yoon, S.; Klopfer, E.; Wang, J.; Sheldon, J.; Wendel, D.; Schoenfeld, I.; Scheintaub, H.; Reider, D. Designing to improve biology understanding through complex systems in high school classrooms: No simple matter! In Proceedings of the Computer Supported Collaborative Learning Conference, Madison, WI, USA, 2013. [Google Scholar]
  7. Colella, V.; Klopfer, E.; Resnick, M. Adventures in Modeling; Teachers College Press: New York, NY, USA, 2001. [Google Scholar]
  8. Klopfer, E.; Yoon, S. Developing games and simulations for today and tomorrow’s tech savvy youth. Tech Trends. 2005, 49, 33–41. [Google Scholar] [CrossRef]
  9. Klopfer, E.; Scheintaub, H.; Huang, W.; Wendel, D.; Roque, R. The simulation cycle: Combining games, simulations, engineering and science using StarLogo TNG. E-Learn. Digit. Media 2009, 6, 71–96. [Google Scholar]
  10. Gobert, J. Leveraging technology and cognitive theory on visualization to promote students’ science learning. In Visualization in Science Education; Gilbert, J., Ed.; Springer: Dordrecht, The Netherlands, 2005; pp. 73–90. [Google Scholar]
  11. Wilensky, U.; Reisman, K. Thinking like a wolf, a sheep, or a firefly: Learning biology through constructing and testing computational theories—An embodied modeling approach. Cogn. Instr. 2006, 24, 171–209. [Google Scholar] [CrossRef]
  12. National Research Council. Guide to Implementing the Next Generation Science Standards; Committee on Guidance on Implementing the Next Generation Science Standards, Board on Science Education, Division of Behavioral and Social Sciences and Education; The National Academies Press: Washington, DC, USA, 2015. [Google Scholar]
  13. Penuel, W.R.; Fishman, B.J.; Cheng, B.H.; Sabelli, N. Organizing research and development at the intersection of learning, implementation, and design. Educ. Res. 2011, 40, 331–337. [Google Scholar] [CrossRef]
  14. Sandoval, W.A.; Daniszewski, K. Mapping trade-offs in teachers’ integration of technology-supported inquiry in high school science classes. J. Sci. Educ. Tech. 2004, 13, 161–178. [Google Scholar] [CrossRef]
  15. Gerard, L.F.; Varma, K.; Corliss, S.B.; Linn, M. Professional development for technology-enhanced inquiry science. Rev. Educ. Res. 2011, 81, 408–448. [Google Scholar] [CrossRef]
  16. Kim, M.C.; Hannafin, M.J. Scaffolding problem solving in technology-enhanced learning environments (TELEs): Bridging research and theory with practice. Comput. Educ. 2011, 56, 403–417. [Google Scholar] [CrossRef]
  17. Slotta, J.; Linn, M. WISE Science: Web-Based Inquiry in the Classroom; Teachers College Press: New York, NY, USA, 2009. [Google Scholar]
  18. Reiser, B.J.; Tabak, I.; Sandoval, W.A.; Smith, B.; Steinmuller, F.; Leone, T.J. BGuILE: Strategic and conceptual scaffolds for scientific inquiry in biology classrooms. In Cognition and Instruction: Twenty Five Years of Progress; Carver, S.M., Klahr, D., Eds.; Erlbaum: Mahwah, NJ, USA, 2001; pp. 263–305. [Google Scholar]
  19. Sandoval, W.A.; Millwood, K.A. The quality of students’ use of evidence in written scientific explanations. Cogn. Instr. 2005, 23, 23–55. [Google Scholar] [CrossRef]
  20. Tabak, I.; Reiser, B.J. Software-realized inquiry support for cultivating a disciplinary stance. Pragmat. Cognit. 2008, 16, 307–355. [Google Scholar] [CrossRef]
  21. Jacobson, M.; Wilensky, U. Complex systems in education: Scientific and educational importance and implications for the learning sciences. J. Learn. Sci. 2006, 15, 11–34. [Google Scholar] [CrossRef]
  22. National Research Council. Education for Life and Work: Developing Transferable Knowledge and Skills in the 21st Century; Committee on Defining Deeper Learning and 21st Century Skills; Pellegrino, J.W., Hilton, M.L., Eds.; The National Academies Press: Washington, DC, USA, 2012. [Google Scholar]
  23. Desimone, L. Improving impact studies of teachers’ professional development: Toward better conceptualizations and measures. Educ. Res. 2009, 38, 181–199. [Google Scholar] [CrossRef]
  24. Ertmer, P.A.; Ottenbreit-Leftwich, A.T.; Sendurur, E.; Sendurur, P. Teacher beliefs and technology integration practices: A critical relationship. Comput. Educ. 2012, 59, 423–435. [Google Scholar] [CrossRef]
  25. Mueller, J.; Wood, E.; Willoughby, T.; Ross, C.; Specht, J. Identifying discriminating variables between teachers who fully integrate computers and teachers with limited integration. Comput. Educ. 2008, 51, 1523–1537. [Google Scholar] [CrossRef]
  26. Thompson, J.; Windschitl, M.; Braaten, M. Developing a theory of ambitious early-career teacher practice. Am. Educ. Res. J. 2013, 50, 574–615. [Google Scholar] [CrossRef]
  27. Osborne, J. Arguing to learn in science: The role of collaborative, critical discourse. Science 2010, 328, 463–466. [Google Scholar] [CrossRef] [PubMed]
  28. Kafai, Y.B. Playing and making games for learning instructionist and constructionist perspectives for game studies. Game Cult. 2006, 1, 36–40. [Google Scholar] [CrossRef]
  29. Epstein, J. Why Model? J. Artif. Soc. Simul. 2008, 11. Available online: http://jasss.soc.surrey.ac.uk/11/4/12.html (accessed on 15 December 2015). [Google Scholar]
  30. Begel, A. LogoBlocks: A Graphical Programming Language for Interacting with the World. Advanced Undergraduate Thesis, Massachusetts Institute of Technology, Cambridge, MA, USA, 1996. Unpublished work. [Google Scholar]
  31. Roque, R.V. OpenBlocks: An Extendable Framework for Graphical Block Programming Systems. Master’s Thesis, Massachusetts Institute of Technology, Cambridge, MA, USA, 2007. Unpublished work. [Google Scholar]
  32. Lewis, J.; Wood-Robinson, C. Genes, chromosomes, cell division and inheritance—Do students see any relationship? Int. J. Sci. Educ. 2000, 22, 177–195. [Google Scholar] [CrossRef]
  33. Garet, M.S.; Porter, A.C.; Desimone, L.; Birman, B.F.; Yoon, K.S. What makes professional development effective? Results from a national sample of teachers. Am. Educ. Res. J. 2001, 38, 915–945. [Google Scholar] [CrossRef]
  34. Aldunate, R.; Nussbaum, M. Teacher adoption of technology. Comput. Hum. Behav. 2013, 29, 519–524. [Google Scholar] [CrossRef]
  35. Pierson, M.E. Technology integration practice as a function of pedagogical expertise. J. Res. Tech. Educ. 2011, 33, 413–430. [Google Scholar] [CrossRef]
  36. Jacobson, M.; Kapur, M.; So, H.; Lee, J. The ontologies of complexity and learning about complex systems. Instr. Sci. 2011, 39, 763–783. [Google Scholar] [CrossRef]
  37. Pavard, B.; Dugdale, J. The contribution of complexity theory to the study of socio-technical cooperative systems. In Unifying Themes in Complex Systems; Minai, A.A., Braha, D., Bar-Yam, Y., Eds.; Springer: Berlin, Germany, 2000; pp. 39–48. [Google Scholar]
  38. NGSS Lead States. Next Generation Science Standards: For States, By States; The National Academies Press: Washington, DC, USA, 2013. [Google Scholar]
  39. Yoon, S.; Koehler-Yom, J.; Anderson, E.; Lin, J.; Klopfer, E. Using an adaptive expertise lens to understand the quality of teachers’ classroom implementation of computer-supported complex systems curricula in high school science. Res. Sci. Tech. Educ. 2015, 33, 237–251. [Google Scholar] [CrossRef]
  40. Jordan, B.; Henderson, A. Interaction analysis: Foundations and practice. J. Learn. Sci. 1995, 4, 39–103. [Google Scholar] [CrossRef]
  41. Cohen, J. Statistical Power Analysis for the Behavioral Sciences, 2nd ed.; Lawrence Erlbaum: Mahwah, NJ, USA, 1988. [Google Scholar]
  42. Bloom, H.S.; Hill, C.J.; Black, A.R.; Lipsey, M.W. Performance trajectories and performance gaps as achievement effect-size benchmarks for educational interventions. J. Res. Educ. Eff. 2008, 2, 289–328. [Google Scholar] [CrossRef]
  43. Yoon, S.; Anderson, E.; Koehler-Yom, J.; Evans, C.; Park, M.; Sheldon, J.; Schoenfeld, I.; Wendel, D.; Scheintaub, H.; Klopfer, E. Teaching about complex systems is no simple matter: Building effective professional development for computer-supported complex systems instruction. Instr. Sci. 2016. [Google Scholar] [CrossRef]
Figure 1. Complex systems curriculum and instruction framework.
Figure 2. Curricular example of scaffolding the scientific argumentation process with the simulation activity.
Figure 3. (a) The blocks-based programming interface for StarLogo Nova, in which students control properties of their virtual systems; (b) StarLogo Nova translates blocks of computer code into a virtual 3D ecosystem of fish, grasses, and algae.
Figure 4. Excerpt of the teacher guide for the evolution unit. The callouts provide background information and suggestions for running the simulation in class.
Figure 5. Frequency of student focus group responses as coded in each NGSS category.
Table 1. School level demographic information.

Teacher ID | School ID | % Low Income (School) | % Above Proficient MCAS (School) | % Minority (School) * | % Minority (Sample) *
1 | 3 | 59.0% | 65% | 71.7% | 79.7%
2 | 3 | 59.0% | 65% | 71.7% | 79.7%
3 | 4 | 27.7% | 75% | 38.9% | 43.4%
4 | 1 | 36.0% | 71% | NA | 31.6%
5 | 3 | 59.0% | 65% | 71.7% | 79.7%
7 | 4 | 27.7% | 75% | 38.9% | 43.4%
8 | 6 | 14.8% | 92% | 29.9% | 42.2%
9 | 7 | 14.2% | 83% | 3.2% | 20.0%
10 | 5 | 18.0% | 94% | 36.7% | 38.2%
Avg. | | 35.6% | 75.6% | 42.7% | 40.4%

* Minority is defined as non-white.
Table 2. Teacher experience and grade level taught with numbers of students in each grade.

Teacher | Years of Teaching Experience | Grade 9 | Grade 10 | Grade 11 | Grade 12 | Total Number of Students
1 | 3.5 | 45 | 0 | 0 | 0 | 45
2 | 14 | 12 | 0 | 0 | 0 | 12
3 | 8 | 13 | 8 | 1 | 0 | 22
4 | 12 | 0 | 16 | 0 | 0 | 16
5 | 5 | 45 | 1 | 0 | 0 | 46
6 | 8 | 0 | 0 | 18 | 5 | 23
7 | 5 | 0 | 48 | 0 | 0 | 48
8 | 6 | 0 | 0 | 63 | 0 | 63
9 | 19 | 0 | 2 | 10 | 9 | 21
10 | 5 | 65 | 0 | 0 | 0 | 65
Total | | 180 | 75 | 92 | 14 | 361
Table 3. Categories of complex system components.

Predictability: The emphasis is on the predictability of the effects caused by the agent in question. According to the clockwork framework, the way in which a part or agent operates or affects other components of the system is predictable. In a complex framework, it is impossible to anticipate precisely the behavior of the system. This is because the actions of agents cannot be predicted (as random forces or chance factors can affect an agent’s actions) even if we know the rules or characteristics of the agent.
Processes: Processes refer to the dynamism of the mechanisms that underlie the phenomena (i.e., how the system works or is thought to work). In a clockwork framework, there is a beginning, middle, and end in the system. The system is composed of static events. While perturbations (actions by/on parts) in the system may cause change to occur, the change terminates once an outcome is achieved. In a complex system framework, there is no definite beginning or end to the activity. System processes are ongoing and dynamic.
Order: The focus is the organization of the system or phenomenon as centralized or decentralized. A clockwork framework assumes that all systems are controlled by a central agent (e.g., all action is dictated by a leader). Order is established top-down or determined with a specific purpose in mind. In a complex systems framework, control is decentralized and distributed to multiple parts or agents. Order in the system is self-organized or ‘bottom-up’ and emerges spontaneously.
Emergence and Scale: Emergence refers to the phenomenon where the complex entity manifests properties that exceed the summed traits and capacities of individual components. In other words, these complex patterns simply emerge from the simpler, interdependent interactions among the components. In a clockwork framework, parts of the system are perceived to be isolated, with little interdependency among them. This is because of the linear nature that characterizes these relationships. Thus, there are no large, global patterns that emerge from actions imposed on the system. Rather, these actions cause only localized changes (e.g., geese eat plants, causing a decrease in grass). In a complex system, because parts or agents are interdependent in multiple ways, an action (small or large) that is imposed on the system may have large and far-reaching consequences on the numerous parts and agents of the system. This may in turn result in large-scale change and evolution.
Table 4. Sample NGSS category definitions.

Developing and using models: Modeling can begin in the earliest grades, with students’ models progressing from concrete “pictures” and/or physical scale models (e.g., a toy car) to more abstract representations of relevant relationships in later grades, such as a diagram representing forces on a particular object in a system. Modeling in 9–12 builds on K–8 experiences and progresses to using, synthesizing, and developing models to predict and show relationships among variables between systems and their components in the natural and designed worlds.
Cause and effect: Mechanism and explanation. Events have causes, sometimes simple, sometimes multifaceted. A major activity of science is investigating and explaining causal relationships and the mechanisms by which they are mediated. Such mechanisms can then be tested across given contexts and used to predict and explain events in new contexts. In grades 9–12, students understand that empirical evidence is required to differentiate between cause and correlation and to make claims about specific causes and effects. They suggest cause and effect relationships to explain and predict behaviors in complex natural and designed systems. They also propose causal relationships by examining what is known about smaller scale mechanisms within the system. They recognize that changes in systems may have various causes that may not have equal effects.
Systems and system models: Defining the system under study—specifying its boundaries and making explicit a model of that system—provides tools for understanding and testing ideas that are applicable throughout science and engineering. In grades 9–12, students can investigate or analyze a system by defining its boundaries and initial conditions, as well as its inputs and outputs. They can use models (e.g., physical, mathematical, and computer models) to simulate the flow of energy, matter, and interactions within and between systems at different scales. They can also use models and simulations to predict the behavior of a system, and recognize that these predictions have limited precision and reliability due to the assumptions and approximations inherent in the models. They can also design systems to do specific tasks.
Table 5. Results for paired t-test biology content learning.

 | Pre-test | Post-test | t value | df | Cohen’s d
Biology Content | 7.673 (2.36) | 9.428 (2.47) | 12.50 ** | 345 | 0.67

** p ≤ 0.001; standard deviations listed in parentheses.
Table 6. Results for paired t-test complex systems content learning.

 | Pre-test | Post-test | t value | df | Cohen’s d
Complex Systems | 5.803 (1.23) | 6.789 (1.29) | 12.26 ** | 360 | 0.65

** p ≤ 0.001; standard deviations listed in parentheses.