Article

Characterizing Computational Thinking in the Context of Model-Planning Activities

by Joseph A. Lyon 1,*, Alejandra J. Magana 2 and Ruth A. Streveler 3

1 Engineering Honors Program, Purdue University, West Lafayette, IN 47909, USA
2 Polytechnic Institute, Purdue University, West Lafayette, IN 47909, USA
3 School of Engineering Education, Purdue University, West Lafayette, IN 47909, USA
* Author to whom correspondence should be addressed.
Modelling 2022, 3(3), 344-358; https://doi.org/10.3390/modelling3030022
Submission received: 26 May 2022 / Revised: 26 July 2022 / Accepted: 29 July 2022 / Published: 2 August 2022

Abstract: Computational thinking (CT) is a critical skill for STEM professionals, and educational interventions that emphasize CT are needed. In engineering, one potential pedagogical tool for building CT is modeling, an essential skill in which students apply their scientific knowledge to real-world problems by planning, building, evaluating, and reflecting on systems created to simulate the real world. However, in-depth studies of how modeling is done in the classroom in relation to CT are limited. We used a case study methodology to evaluate a model-planning activity in a final-year undergraduate engineering classroom, designed to elicit CT practices as students planned their modeling approach. Thematic analysis was applied to student artifacts to triangulate and identify the diverse ways in which students used CT practices. We find that model-planning activities give students useful practice in many aspects of CT, such as abstraction, algorithmic thinking, and generalization. We report implications for instructors wanting to implement model-planning activities in their classrooms.

1. Introduction

Computational and data science are quickly emerging as needed skills across various industries and academic disciplines, yet educational institutions often fail to implement these skills into curricula [1,2]. This is partly because it has been challenging to assign departments within universities to teach computational and data science skills to undergraduates [1]. This issue is amplified by instructors who may not feel comfortable teaching these topics and expect students to learn these skills in other courses [3]. The net result is university graduates who cannot think through complex computational and data science problems. In order to address this critical educational gap, we need to more explicitly understand the specifics of students’ thinking as they develop CT skills.
These issues are especially apparent within engineering curricula, which many engineering faculty already perceive as packed with courses [3]. One alternative to adding more courses to the curriculum is integrating computational thinking skills into existing courses. Research has shown that incorporating modeling activities into engineering classrooms allows students to simultaneously learn disciplinary and computational skills [4,5,6,7]. An additional benefit is that programming and disciplinary learning emerge through modeling, and computational thinking emerges through the applied model-building process as well [6]. However, few studies have tracked the emergence of CT through modeling processes in a classroom.
Models allow students to apply mathematical and computational concepts to solve real-world problems, and they have been heavily studied in engineering [8,9,10,11]. Often the modeling process is divided into multiple phases, including planning, building, evaluating, and revising [6,12,13,14]. Each phase is crucial to developing practical, real-world modeling skills in engineering practice.
Here we investigated how computational thinking emerged during the planning stages of the modeling process. This work is part of a larger study examining computational thinking across the modeling cycle and builds on a previous study of the computational thinking that emerges during the model-building process [6]. Our research question is: What computational thinking practices emerged when students participated in model-planning activities? This question is important because it addresses research gaps around further defining CT, showing explicit examples of CT in the classroom, and making explicit connections between the modeling process and CT.

2. Background

Multiple bodies of literature informed this study: (1) computational thinking, (2) project planning, and (3) model-eliciting activities. Collectively, these bodies of literature informed the design of our data collection and the methods used to analyze the data. We give background on each of the three below.

2.1. Computational Thinking (CT)

In recent years, computational thinking (CT) has been of particular interest to researchers and educators alike [15,16]. CT is essentially an umbrella term for different common thinking practices derived from computational sciences [17]. While the exact definition of CT has been debated for some time, many definitions include common elements such as abstraction, generalization, algorithmic thinking, evaluation, and problem decomposition [15,18,19,20].
Additionally, studies incorporating CT in the classroom remain greatly outnumbered by work discussing CT in more general and definitional ways [16,21,22]. When CT is operationalized, it is most often in K12 settings rather than at the undergraduate level [23,24]. The K12 literature base has covered a host of topics, such as the definition of CT [25], how to assess CT [26,27], and how to teach CT [28]. However, this focus on K12 has left a significant gap in the literature regarding concrete implementation strategies and best practices for developing CT in undergraduate educational settings. Specifically, what exactly does CT look like when it is used by students, and what are ways of operationalizing it in the classroom? Few studies delve into the specifics of computational thinking, opting instead for more definitional approaches [16,21,22], and when CT is implemented, studies often use it to structure course content rather than seeking concrete evidence of emergent CT practices in students’ work [16]. This study addresses the gaps of operationalizing CT and connecting it to classroom practice by focusing on evidence-based emergent CT practices using a defined CT framework [18,20].

2.2. Solution Planning

In terms of solving complex problems, planning often involves setting the goals or deliverables of a problem and deciding on the tasks and actions needed to solve it [29]. While planning, students must additionally consider time constraints and the other criteria of the given problem [30]. Planning a solution pathway is a cognitive skill that plays a vital role in students’ final solutions [31]. This effect is even stronger when the problem being planned is ill-structured or requires substantial transfer beyond material students are comfortable with [32]. Thus, planning the model before solving it is essential for student success on ill-structured modeling activities, such as the one in this study.
While the literature has shown that planning is a valuable mechanism for problem-solving, many modeling interventions tend not to include it as a cognitive or metacognitive step before constructing models. Some modeling-based learning frameworks have students collect their experiences and observations before building their models, so having some level of planning is not without precedent [12]. Within modeling interventions, students who plan their solution before starting often exhibit behaviors more aligned with subject-matter experts, focusing on their conceptual understanding of the material and avoiding trial-and-error solutions [33].

2.3. Model-Eliciting Activity (MEA)

Model-eliciting activities (MEAs) were originally used in mathematics contexts and have subsequently been used in engineering contexts for using real-world modeling problems in the engineering classroom [34,35,36,37]. MEAs help students learn complex problem-solving, mathematical modeling, and team skills. The activities are built on six core principles [34]: (1) the model-construction principle; (2) the reality principle; (3) the self-assessment principle; (4) the model-documentation principle; (5) the generalizability principle; and (6) the effective prototype principle. Collectively, these inform the creation of the activity used for this study.
MEAs are embedded in real-world contexts and involve creating a model of the context in which the activity is embedded. MEAs require that teams of students document each step of the modeling process and develop models that can be assessed by the students themselves through criteria given in the problem. The end solutions should be generalizable to other situations and contexts. In addition to learning from the mathematical model, this structure allows instructors to create modeling activities for their classrooms. Such activities enable students to transfer information from other areas of their education, engage students through team-based activities, and push the curriculum to be more student-centered and active [8,35,38,39]. For these reasons, the MEA framework was used to structure the modeling activities for this study.

3. Methods

We used a specific case and the MEA framework to design the learning intervention [34]. Classroom artifacts were then analyzed using thematic analysis to identify computational thinking themes throughout the assignments.

3.1. Participants and Context

Participants (N = 26) in this study were mainly in the final year of their undergraduate education within an Agricultural and Biological Engineering program. All students in the course were majoring in bioengineering or food process engineering. At the time of the research, the program reported having slightly more students who identified as female than as male. This was the only section of the course and was required of all students, so the class was approximately representative of the department.
The course was the first of a two-part capstone class covering unit operations within the food and pharmaceutical industries, focusing on designing such systems using thermodynamics, heat and mass transfer, and reaction kinetics, which had all been covered previously in the program. The class met twice a week for a one-hour lecture and twice a week for a two-hour lab. The students had previous computer programming training in earlier engineering coursework, although the coursework and prior experience varied. Much of the prior experiences in computer programming came from introductory engineering coursework, an intro to computer science course, and previous projects from other classes within their major.

3.2. Learning Intervention

This study investigated the initial modeling and planning phases, in which students were instructed to plan out a computational solution to a real-world engineering problem. The intervention was implemented in the only section of the course. The learning intervention was a four-part modeling sequence that followed the structure of an MEA, consisting of: (1) planning the model; (2) building the model; (3) evaluating the model; and (4) reflecting on the model. The results presented here focus primarily on the first phase of the learning intervention, planning the model. The entire four-part modeling sequence, along with intervention documents, can be found in previous publications by the authors (Lyon et al., 2019; Lyon & Magana, 2021).
During the planning the model phase, students were tasked with modeling the temperature distribution inside a can during a sterilization process. Students were given the problem in the form of an email from a systems engineer that gave them specific properties of foods that were being processed and blueprints for the processing line they were modeling. This was done to simulate a realistic scenario, a vital tenet of the MEA framework [8]. Students were divided into teams of three or four to plan out how they would model the process. Nearly all groups consisted of four team members, with only a few consisting of three. The students were able to choose their own groups, meaning no background or demographic information was used to create the student groupings. There were 15 total groups in the class. The questions posed to the students included:
  • What equations would they use?
  • What food properties were needed for their model?
  • What assumptions would they need to make?
  • What computational technique would they use?
  • How would they use all of these things together to construct a computational model?
Students were given a brief introduction to the problem and related concepts before the planning phase. Thus, they were prompted to recall details from previous coursework to try and create a potential modeling solution. The teams of three to four students were given the entire two-hour lab period to develop a potential plan for their computational model.
The problem was sufficiently complex that it required students to recall information from multiple previous courses, including heat transfer, thermodynamics, unit operations, mathematical modeling, and programming. The problem was also ill-defined, meaning that it had multiple possible solutions: it included nonrelevant details, and some relevant variables were provided only as ranges. Such details require students to understand what they are doing and to make difficult decisions and assumptions about how to use the given information. This type of ill-structured problem is useful for promoting multiple solutions and building capacity for future problem-solving [40].
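To make the modeling task concrete, the following is a minimal sketch of the kind of computational solution students were asked to plan: an explicit finite-difference model of radial heat conduction into a cylindrical can held at retort temperature. This is an illustration only, not the authors’ reference solution; all property values (thermal diffusivity, can radius, temperatures, simulated time) are assumed placeholders rather than data from the study.

```python
# Sketch: explicit finite-difference model of 1D radial heat conduction
# in a can during retort sterilization. All property values below are
# illustrative assumptions, not data from the study.
import numpy as np

def simulate_can_heating(alpha=1.5e-7,   # thermal diffusivity (m^2/s), assumed
                         radius=0.04,    # can radius (m), assumed
                         t_initial=25.0, # initial food temperature (deg C)
                         t_retort=121.0, # retort wall temperature (deg C)
                         n_r=41,         # radial grid points
                         t_end=600.0):   # simulated time (s)
    """Return the radial temperature profile after t_end seconds,
    assuming 1D conduction in an infinitely long cylinder."""
    dr = radius / (n_r - 1)
    # Explicit-scheme stability requires dt <= dr^2 / (4 * alpha)
    dt = 0.25 * dr**2 / alpha
    n_steps = int(t_end / dt)

    T = np.full(n_r, t_initial)
    T[-1] = t_retort  # wall node held at retort temperature
    r = np.linspace(0.0, radius, n_r)

    for _ in range(n_steps):
        T_new = T.copy()
        # Interior nodes: dT/dt = alpha * (d2T/dr2 + (1/r) * dT/dr)
        i = np.arange(1, n_r - 1)
        d2T = (T[i + 1] - 2 * T[i] + T[i - 1]) / dr**2
        dT = (T[i + 1] - T[i - 1]) / (2 * dr * r[i])
        T_new[i] = T[i] + dt * alpha * (d2T + dT)
        # Centerline (r = 0): symmetry gives dT/dt = 4*alpha*(T1 - T0)/dr^2
        T_new[0] = T[0] + dt * alpha * 4 * (T[1] - T[0]) / dr**2
        T = T_new
    return T

profile = simulate_can_heating()
print(f"Centerline temperature after 10 min: {profile[0]:.1f} deg C")
```

Even this sketch surfaces the planning decisions the activity targeted: the assumed geometry, the boundary conditions, the needed food properties, and the choice of numerical technique.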

3.3. Data Collection

Two types of data were collected as part of this study: audio/video recordings (of four of the teams during the two-hour lab period) and planning templates (which were collected from each of the teams at the end of the lab period). One researcher transcribed audio/video data of the group planning process to transform it into a format suitable for thematic analysis. Table 1 shows the data collection and the total participants from the study.

3.4. Data Analysis

We used thematic analysis and followed the methodology as defined by Braun and Clarke [41] (p. 87). Their six-step process includes: (1) familiarizing yourself with your data; (2) generating initial codes; (3) searching for themes; (4) reviewing themes; (5) defining and naming themes; and (6) producing the report. We coded transcripts and course assignments (one assignment from each group for a total of 15) to develop a codebook of computational thinking themes that emerged from the model-planning process. First, each data point (transcripts and course assignments) was coded for significant computational thinking practices listed in Table 2, with the codebook being used and adapted from previous studies [6,20].
After coding for these broad categories, we conducted a second coding round to create subthemes within each CT category, using the subthemes for each category defined by Lyon and Magana [6] as a starting point. The frequency of each category and subtheme was tracked throughout the analysis to understand which CT categories were emerging and how often they appeared in student work. Depending on what emerged from the model-planning data, one researcher removed or added subthemes throughout the analysis process. This process resulted in a new codebook with some of the same subthemes (some were removed because they did not emerge) and some new codes from the data. Thus, some of the definitions and subthemes are the same as or overlap significantly with the original codebook [6].
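The frequency tracking described above amounts to a simple tally over coded statements; a sketch follows, in which the (category, subtheme) labels are hypothetical examples rather than the study’s actual coded data.

```python
# Sketch of tallying CT category and subtheme frequencies across coded
# statements. The labels are hypothetical examples, not study data.
from collections import Counter

coded_statements = [
    ("abstraction", "geometric relationships"),
    ("abstraction", "aligning with similar systems"),
    ("algorithmic thinking", "stepwise approach"),
    ("abstraction", "geometric relationships"),
    ("generalization", "drawing from previous experiences"),
]

# Counts per broad CT category and per (category, subtheme) pair
category_counts = Counter(category for category, _ in coded_statements)
subtheme_counts = Counter(coded_statements)

print(category_counts.most_common())  # categories ordered by frequency
```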

3.5. Trustworthiness

Multiple approaches were taken to ensure trustworthiness. First, multiple types of data were used as a form of triangulation, with classroom observations via audio/video recordings being used to understand the content of the student artifacts more deeply. Additionally, after the first researcher completed the coding, a second coder (a qualitative educational researcher with a background in the physical sciences) coded at least 20% of the statements previously coded by the first researcher for subthemes. This was done to verify that the subtheme definitions were sound and that there was little overlap between the different subthemes the first researcher had identified in the data. After each coding round, the two researchers discussed differences and adjusted the codebook accordingly. This was repeated until the two researchers reached greater than 80% agreement on the dually-coded statements.
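The agreement check described above is a simple percent-agreement calculation over the dually-coded statements; a sketch follows, with hypothetical subtheme labels rather than the study’s actual codes.

```python
# Sketch of the percent-agreement check between two coders.
# Subtheme labels are hypothetical, not from the study's codebook.

def percent_agreement(coder_a, coder_b):
    """Fraction of dually-coded statements assigned the same subtheme."""
    assert len(coder_a) == len(coder_b), "coders must label the same statements"
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

coder_a = ["stepwise", "geometric", "stepwise", "debugging", "reuse"]
coder_b = ["stepwise", "geometric", "parallel", "debugging", "reuse"]
print(percent_agreement(coder_a, coder_b))  # -> 0.8
```

Percent agreement is the simplest intercoder-reliability measure; it does not correct for chance agreement the way statistics such as Cohen’s kappa do, but it matches the threshold-based procedure described here.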

4. Results

The results are first presented as an overview of the outcomes seen across the dataset. From there, each subsection covers one of the CT practices with an in-depth look at the outcomes identified.

4.1. Overview of the Results

Many of the CT outcomes and definitions identified were similar or the same as the previous study [6]. However, some key differences to the codebook appear in bold text in Table 3.
As one can see from Table 3, all CT practices other than evaluation had at least one new identified CT outcome that was observed from the model-planning process. Additionally, some of the practices that were identified in the model-building process [6] were not identified in the model planning data.

4.2. Abstraction

From previous work, abstraction was one of the most commonly identified CT practices in the model-building process [6]. Our results indicate that the model-planning process is no different, with abstraction being the most frequently identified CT practice within the model-planning data. Table 4 below shows the finalized definitions and representative quotes of each CT outcome identified from the model-planning data.
There were two new codes for abstraction found during the thematic analysis. The first concerns aligning properties with similar systems. It makes sense that this would appear during the model-planning process, as students are seeing the problem for the first time and are thinking through ways to simplify it by relating it to systems they have seen in previous problems. The second new CT outcome is geometric relationships. This likely shows up in the model-planning data because students have to decide fairly early on what geometry they plan to assume to solve the problem; by the model-building process, this decision has likely already been made and is no longer at the forefront of their minds. One item to note is the scarcity of the infinite to finite code during the model-planning process compared to the model-building process. This could be because students were not yet thinking much about numerical methods or programming, which are central to the model-building process.

4.3. Algorithmic Thinking

Algorithmic thinking was seen throughout the dataset as a common CT practice, with multiple observed outcomes. Table 5 shows the different CT outcomes observed in the data from the practice of algorithmic thinking, with both definitions and representative quotes.
As shown in Table 5, there are two new subthemes identified. The first, named stepwise approach, is where students outline a solution process step by step by diagramming or through thought experiments. Compared to the model-building phase, this likely showed up because students needed to plan out their solutions, which often included putting together a step-by-step plan of how they intended to solve the problem. Figure 1 below shows a diagram submitted as part of a student report that shows this stepwise approach code.
Additionally, considering parallel methods was a new code, where students often pointed out overlapping or duplicative information that showed multiple or parallel ways of solving the same problem. The majority of the CT outcomes observed in the practice of algorithmic thinking were an indication of later use/effect. This is likely because students needed to project forward what they needed to solve the problem and why they needed it by indicating how they planned to use the information or how it affected their algorithm.

4.4. Evaluation

Evaluation was the one category where no new outcomes emerged from the coding process; however, nearly all outcomes from the previous codebook were observed. Table 6 outlines the definitions and quotes associated with each observed outcome.
One of the most significant results from the analysis is that debugging is absent as an outcome in the model-planning process. This is likely due to debugging being closely tied to programming the solution, which is more central during the model-building process. Additionally, many of the coded statements focused on the mathematical method for the solution rather than the computational structuring of the solution. This demonstrates that students were much more in the mindset of solving the problem mathematically during the model-planning process. They were often evaluating their solution in a more mathematical sense.

4.5. Generalization

The CT practice of generalization was much more prevalent in the model-planning data than in the model-building data. Two new outcomes were identified for generalization, and one outcome from the original codebook was absent. Table 7 outlines the definitions of the identified outcomes and representative quotes.
This was the only CT practice that had completely new themes from the original codebook. Whereas the original codebook had reuse of code as an identified outcome, during the model-planning activity, two new outcomes emerged. The first, drawing from previous experiences, naturally connects to any planning activity. Students were often making explicit references to previous coursework, assignments, or work experiences and how they were helpful in the current problem. The other new outcome, projecting to other or future applications, had students looking at how different decisions they made during the planning phase would affect the future applications that their model could have.

4.6. Decomposition

The CT practice of decomposition was found in the dataset at a relatively low frequency. However, two different outcomes were identified through the model-planning process. Table 8 shows the different outcomes, definitions, and quotes identified.
One newly identified outcome was allocating resources, where students decomposed the problem solution by dividing up the work or carrying out different parts of the planning process in parallel. Another practice was creating diagrams as an organizational tool during the planning process. Figure 2 shows an example of this coded behavior.
Diagrams such as Figure 1 and Figure 2 demonstrate two processes that were individually coded. First, the problem space is broken into multiple pieces, a decomposition practice coded as the organization of a larger solution method. Second, an algorithmic practice of putting these pieces into order was coded as a stepwise approach.

5. Discussion

The results of this study continue to provide evidence that modeling activities are effective methods for integrating CT into the undergraduate engineering classroom. While the connection between CT and modeling has been made previously in the literature [19], the results of this study further this connection by showing how CT is specifically practiced by students going through the planning process of a modeling cycle. Our results show that all five types of CT practices (i.e., abstraction, algorithmic thinking, evaluation, generalization, and decomposition) were found in students’ model-planning solutions. This is understandable because modeling and CT are two processes that have often been tied together in the literature [3,19,42]. Our results further develop the nature of this relationship by showing that CT is used within the planning stage of the modeling process and that unique and different outcomes seem to emerge as students plan their computational solutions.
One consistent theme throughout the results was the lack of emphasis on the programming aspects of the assignment. During the building phase of the modeling process, students often focus on the code or the computational elements of the modeling problem [6]. Many of the more computationally focused themes from the original codebook, such as debugging and code reuse, were not observed during the planning phase. However, even with this change in emphasis, many of the same outcomes appeared in the planning phase as in the building phase of the modeling process. The lack of focus on the computational aspects of the problem may be attributed to many different factors, such as students’ lesser familiarity with programming compared to mathematics [3] or their lack of self-efficacy in making programming or computing decisions [43,44]. In either case, students’ planning processes lacked emphasis on the computational elements of their modeling solutions.
Students’ plans seemed to focus on many of the realistic or mathematical elements of the problem, likely as they attempted to abstract the information from the problem statement into a mathematical form. This aligns with the modeling literature, which describes the first step of the modeling process as one where students are collecting as much relevant information from the real world as it relates to their solution [10,12,45]. The planning phase is where students are “breaking down the phenomenon under study into small pieces that can be incorporated into an external model” [46] (p. 241). Our results align with this in that students break the problem statement down from a real-world context into smaller pieces that they can make sense of and understand how they could put together into a mathematical context.
Another significant finding of this work is the prevalence of generalization in the planning phase of the modeling process. We found two new outcomes for generalization, one backward-looking and one forward-looking. The first new generalization theme, drawing from previous experiences, is an outcome where students look at ideas, concepts, or methods they have learned in the past that can be used for the presented problem. The second new generalization theme, projecting to other or future applications, is a primarily forward-looking outcome in which students think about how decisions about their model may make it more useful, or more limited, in other contexts. The backward-looking outcome is expected because our previous experiences are foundational to education and learning [47]. Yet the forward-looking outcome goes one step further: students are transferring information not into the same context but to new contexts. This type of knowledge transfer to new contexts is indicative of expert-like practice [48,49].
This is an encouraging finding that students are not only thinking externally about their presented problem but also thinking of ways to transfer the information to new contexts. These backward- and forward-looking generalization practices are likely emerging as students try to fit the current problem into their mental model of the phenomenon. The process of modeling is where students externalize their mental model or their detailed understanding of a phenomenon and how it works into a physical, real-world mathematical, or computational model [50,51]. It appears that during this planning phase of the modeling process, students are engaging with their understandings of the phenomenon and generalizing what information will be useful to the current problem and how that may make the model useful or limit its use in the future. These practices are desirable as students develop their adaptive expertise in computation [33].
Both new generalization themes concern transferring information, from previous situations and to future situations, in ways indicative of expert-like practice [52,53]. The literature around adaptive expertise has shown that transferring information into new and novel situations is one way experts are differentiated from novices [48,49,54]. Our learning design of giving students very little instruction before planning their models may have encouraged them to adapt and innovate on their prior experiences to make sense of the problem.

5.1. Implications for Teaching and Learning

This study provides two main implications for instructors looking to implement modeling activities in their classrooms. The first is the set of unique CT outcomes that students can practice by planning their modeling solutions. While the benefits of metacognitive practices in modeling are already well documented [33,55,56], our study extends these implications by showing that building CT practices is an additional benefit of making planning part of the modeling process. One issue with computational modeling problems is that some students start coding without necessarily planning out their solutions, a hallmark of novice problem-solvers [33]. If instructors set aside dedicated time for students to plan out their solutions, it may help students move toward expert-like practice and away from the novice-like practice of immediately attempting to solve the problem.
Furthermore, instructors should tailor modeling activities they are developing to amplify outcomes that were observed in our results. That means activity designers should incorporate opportunities for students to explore their previous experiences, consider future model applications, or integrate flow diagrams to incorporate algorithmic thinking or decomposition. By tailoring activities in this way, instructors can amplify naturally occurring outcomes during the modeling process. One opportunity to do so would be to engage in design-based research, similar to this study, which allows educators to understand the connection between the pedagogical design decisions and student outcomes [57,58].

5.2. Limitations and Future Work

There are some limitations to our study. First, the activity was situated within the specific context of food process engineering. Second, demographic information on the participants was not collected as part of this study. These two factors may limit the study’s transferability to other contexts and should be considered in future research. Additionally, the work is limited to what the students talked about or were aware of as they solved the modeling problem. This may be a critical limitation because the mental processes underlying CT elements like abstraction may occur at an implicit level and thus may not be articulated easily. There are likely other computational thinking outcomes that students enacted but did not verbalize or were not aware they were using.
Our future work will consider additional steps of the modeling cycle and how CT is elicited in those settings, in order to track CT across the entire modeling cycle. This study is part of a larger effort to understand CT within the modeling and simulation process, and future efforts will continue to examine other steps of the modeling cycle. Additionally, understanding how different learning designs for modeling and different disciplinary contexts shape CT will be crucial to understanding the entire scope of CT within modeling activities, and could thus improve the design of learning activities intended to elicit CT.

6. Conclusions

Computational thinking is becoming a critical skill for learners across disciplines. Modeling is an engineering activity that naturally lends itself to students practicing all aspects of CT but is understudied within a naturalistic classroom setting. We studied advanced engineering students planning their modeling solutions, and we identified various themes of CT outcomes that emerged while going through the planning process. Our results indicate that the planning process has distinctly different CT outcomes from the model-building process and amplifies various aspects of CT. Practices such as generalization were more prominent in the planning phase as students demonstrated specific outcomes indicative of expert-like practice. Because of this, instructors should prioritize incorporating time for students to plan their solutions when incorporating modeling activities into their classroom if their goal is to give students ample practice in all aspects of CT.

Author Contributions

Conceptualization, J.A.L. and A.J.M.; methodology, J.A.L. and A.J.M.; validation, J.A.L.; formal analysis, J.A.L.; investigation, J.A.L. and A.J.M.; funding acquisition, J.A.L. and A.J.M.; writing—original draft preparation, J.A.L.; writing—review and editing, A.J.M. and R.A.S.; supervision, A.J.M. and R.A.S. All authors have read and agreed to the published version of the manuscript.

Funding

This material is based on work supported in part by the National Science Foundation Graduate Research Fellowship Program under Grant Number DGE-1842166 as well as the National Science Foundation under Award Number DBI-2120200. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

Institutional Review Board Statement

The study was approved by the Institutional Review Board of Purdue University (protocol number 1806020715).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare that they have no conflict of interest.

References

  1. Finzer, W. The Data Science Education Dilemma. Technol. Innov. Stat. Educ. 2013, 7. Available online: http://escholarship.org/uc/item/6jv107c7 (accessed on 25 May 2022).
  2. Irizarry, R.A. The Role of Academia in Data Science Education. Harv. Data Sci. Rev. 2020, 2.
  3. Magana, A.J.; Coutinho, G.S. Modeling and simulation practices for a computational thinking-enabled engineering workforce. Comput. Appl. Eng. Educ. 2017, 25, 62–78.
  4. Fennell, H.W.; Lyon, J.A.; Magana, A.J.; Rebello, S.; Rebello, C.M.; Peidrahita, Y.B. Designing hybrid physics labs: Combining simulation and experiment for teaching computational thinking in first-year engineering. In Proceedings of the 2019 IEEE Frontiers in Education Conference (FIE), Covington, KY, USA, 16–19 October 2019.
  5. Magana, A.; Falk, M.; Reese, M.J. Introducing Discipline-Based Computing in Undergraduate Engineering Education. ACM Trans. Comput. Educ. 2013, 13, 1–22.
  6. Lyon, J.A.; Magana, A.J. The use of engineering model-building activities to elicit computational thinking: A design-based research study. J. Eng. Educ. 2021, 110, 184–206.
  7. Vieira, C.; Magana, A.J.; García, R.E.; Jana, A.; Krafcik, M. Integrating Computational Science Tools into a Thermodynamics Course. J. Sci. Educ. Technol. 2018, 27, 322–333.
  8. Diefes-Dux, H.A.; Hjalmarson, M.; Zawojewski, J.S.; Bowman, K. Quantifying aluminum crystal size part 1: The model-eliciting activity. J. STEM Educ. Innov. Res. 2006, 7, 51–63.
  9. Lyon, J.A.; Fennell, H.W.; Magana, A.J. Characterizing students’ arguments and explanations of a discipline-based computational modeling activity. Comput. Appl. Eng. Educ. 2020, 28, 837–852.
  10. Lyon, J.A.; Magana, A.J. A Review of Mathematical Modeling in Engineering Education. Int. J. Eng. Educ. 2020, 36, 101–116.
  11. Jung, H.; Diefes-Dux, H.A.; Horvath, A.K.; Rodgers, K.J.; Cardella, M.E. Characteristics of feedback that influence student confidence and performance during mathematical modeling. Int. J. Eng. Educ. 2015, 31, 42–57.
  12. Louca, L.T.; Zacharia, Z. Modeling-based learning in science education: Cognitive, metacognitive, social, material and epistemological contributions. Educ. Rev. 2012, 64, 471–492.
  13. Shiflet, A.B.; Shiflet, G.W. Introduction to Computational Science: Modeling and Simulation for the Sciences; Princeton University Press: Princeton, NJ, USA, 2014.
  14. Magana, A.J. Modeling and Simulation in Engineering Education: A Learning Progression. J. Prof. Issues Eng. Educ. Prof. 2017, 143, 04017008.
  15. Kalelioğlu, F.; Gülbahar, Y.; Kukul, V. A Framework for Computational Thinking Based on a Systematic Research Review. Balt. J. Mod. Comput. 2016, 4, 583–596.
  16. Lyon, J.A.; Magana, A.J. Computational thinking in higher education: A review of the literature. Comput. Appl. Eng. Educ. 2020, 28, 1174–1189.
  17. Wing, J.M. Computational thinking. Commun. ACM 2006, 49, 33–35.
  18. Selby, C.C.; Woollard, J. Computational Thinking: The Developing Definition. In Proceedings of the 18th Annual Conference on Innovation and Technology in Computer Science Education, Canterbury, UK, 29 June–3 July 2013; pp. 5–8.
  19. Weintrop, D.; Beheshti, E.; Horn, M.; Orton, K.; Jona, K.; Trouille, L.; Wilensky, U. Defining Computational Thinking for Mathematics and Science Classrooms. J. Sci. Educ. Technol. 2016, 25, 127–147.
  20. Curzon, P.; Dorling, M.; Selby, C.; Woollard, J. Developing computational thinking in the classroom: A framework. Comput. Sch. 2014. Available online: http://eprints.soton.ac.uk/id/eprint/369594 (accessed on 26 May 2022).
  21. Grover, S.; Pea, R. Computational Thinking in K-12. Educ. Res. 2013, 42, 38–43.
  22. Ilic, U.; Haseski, H.I.; Tugtekin, U. Publication Trends Over 10 Years of Computational Thinking Research. Contemp. Educ. Technol. 2018, 9, 131–153.
  23. Ehsan, H.; Rehmat, A.P.; Cardella, M.E. Computational thinking embedded in engineering design: Capturing computational thinking of children in an informal engineering design activity. Int. J. Technol. Des. Educ. 2021, 31, 441–464.
  24. Rehmat, A.P.; Ehsan, H.; Cardella, M.E. Instructional strategies to promote computational thinking for young learners. J. Digit. Learn. Teach. Educ. 2020, 36, 46–62.
  25. Tsarava, K.; Moeller, K.; Román-González, M.; Golle, J.; Leifheit, L.; Butz, M.V.; Ninaus, M. A cognitive definition of computational thinking in primary education. Comput. Educ. 2022, 179, 104425.
  26. Kanaki, K.; Kalogiannakis, M. Assessing Algorithmic Thinking Skills in Relation to Gender in Early Childhood. Educ. Process Int. J. 2022, 11, 44–45.
  27. Kalliopi, K.; Michail, K. Assessing Computational Thinking Skills at First Stages of Schooling. In ACM International Conference Proceeding Series; ACM: Barcelona, Spain, 2019; pp. 135–139.
  28. Ung, L.-L.; Labadin, J.; Mohamad, F.S. Computational thinking for teachers: Development of a localised E-learning system. Comput. Educ. 2022, 177, 104379.
  29. Lawanto, O. Students’ metacognition during an engineering design project. Perform. Improv. Q. 2010, 23, 117–136.
  30. Dvir, D.; Raz, T.; Shenhar, A.J. An empirical analysis of the relationship between project planning and project success. Int. J. Proj. Manag. 2003, 21, 89–95.
  31. Lucangeli, D.; Tressoldi, P.E.; Cendron, M. Cognitive and Metacognitive Abilities Involved in the Solution of Mathematical Word Problems: Validation of a Comprehensive Model. Contemp. Educ. Psychol. 1998, 23, 257–275.
  32. Shin, N.; Jonassen, D.H.; McGee, S. Predictors of well-structured and ill-structured problem solving in an astronomy simulation. J. Res. Sci. Teach. 2003, 40, 6–33.
  33. Magana, A.J.; Fennell, H.W.; Vieira, C.; Falk, M.L. Characterizing the interplay of cognitive and metacognitive knowledge in computational modeling and simulation practices. J. Eng. Educ. 2019, 108, 276–303.
  34. Diefes-Dux, H.; Moore, T.; Zawojewski, J.; Imbrie, P.K.; Follman, D. A framework for posing open-ended engineering problems: Model-eliciting activities. In Proceedings of the 34th Annual Frontiers in Education 2004 (FIE 2004), Savannah, GA, USA, 20–23 October 2004.
  35. Moore, T.J.; Miller, R.L.; Lesh, R.A.; Stohlmann, M.S.; Kim, Y.R. Modeling in Engineering: The Role of Representational Fluency in Students’ Conceptual Understanding. J. Eng. Educ. 2013, 102, 141–178.
  36. Lesh, R.; Hoover, M.; Hole, B.; Kelly, A.; Post, T. Principles for developing thought-revealing activities for students and teachers. In The Handbook of Research Design in Mathematics and Science Education; Kelly, A., Lesh, R., Eds.; Lawrence Erlbaum Associates: Mahwah, NJ, USA, 2000; pp. 591–646.
  37. Liu, Z.; Xia, J. Enhancing computational thinking in undergraduate engineering courses using model-eliciting activities. Comput. Appl. Eng. Educ. 2021, 29, 102–113.
  38. Moore, T.J.; Guzey, S.S.; Roehrig, G.H.; Stohlmann, M.; Park, M.S.; Kim, Y.R.; Callender, H.L.; Teo, H.J. Changes in Faculty Members’ Instructional Beliefs while Implementing Model-Eliciting Activities. J. Eng. Educ. 2015, 104, 279–302.
  39. Hjalmarson, M.; Diefes-Dux, H.A.; Bowman, K.; Zawojewski, J.S. Quantifying aluminum crystal size part 2: The model-development sequence. J. STEM Educ. Innov. Res. 2006, 7, 64–73.
  40. Kapur, M. Productive Failure. Cogn. Instr. 2008, 26, 379–424.
  41. Braun, V.; Clarke, V. Using thematic analysis in psychology. Qual. Res. Psychol. 2006, 3, 77–101.
  42. Sengupta, P.; Kinnebrew, J.S.; Basu, S.; Biswas, G.; Clark, D. Integrating computational thinking with K-12 science education using agent-based computation: A theoretical framework. Educ. Inf. Technol. 2013, 18, 351–380.
  43. Busch, T. Gender Differences in Self-Efficacy and Attitudes toward Computers. J. Educ. Comput. Res. 1995, 12, 147–158.
  44. Hutchison, M.A.; Follman, D.K.; Sumpter, M.; Bodner, G.M. Factors Influencing the Self-Efficacy Beliefs of First-Year Engineering Students. J. Eng. Educ. 2006, 95, 39–47.
  45. Magana, A.J.; Vieira, C.; Fennell, H.W.; Roy, A.; Falk, M.L. Undergraduate Engineering Students’ Types and Quality of Knowledge Used in Synthetic Modeling. Cogn. Instr. 2020, 38, 503–537.
  46. Louca, L.; Zacharia, Z.C. Toward an Epistemology of Modeling-Based Learning in Early Science Education. In Towards a Competence-Based View on Models and Modeling in Science Education; Springer: Cham, Switzerland, 2019; pp. 237–256.
  47. Dewey, J. Experience and Education; Macmillan: New York, NY, USA, 1938.
  48. McKenna, A.F. An Investigation of Adaptive Expertise and Transfer of Design Process Knowledge. J. Mech. Des. 2007, 129, 730–734.
  49. Hatano, G.; Inagaki, K. Two courses of expertise. Res. Clin. Cent. Child Dev. Annu. Rep. 1984, 6, 27–36.
  50. Jonassen, D.H. Externally Modeling Mental Models. In Learning and Instructional Technologies for the 21st Century; Moller, L., Bond Huett, J., Harvey, D.M., Eds.; Springer: New York, NY, USA, 2009; pp. 49–74.
  51. Nersessian, N.J. Model-Based Reasoning in Distributed Cognitive Systems. Philos. Sci. 2007, 73, 699–709.
  52. Hmelo-Silver, C.E. Comparing expert and novice understanding of a complex system from the perspective of structures, behaviors, and functions. Cogn. Sci. 2004, 28, 127–138.
  53. Carbonell, K.B.; Stalmeijer, R.E.; Könings, K.; Segers, M.; van Merriënboer, J.J. How experts deal with novel situations: A review of adaptive expertise. Educ. Res. Rev. 2014, 12, 14–29.
  54. Delany, D. Advanced concept mapping: Developing adaptive expertise. In Concept Mapping-Connecting Educators: Proceedings of the Third International Conference on Concept Mapping, Tallinn, Estonia and Helsinki, Finland, 22–25 September 2008; Volume 3 (Posters), pp. 32–35. Available online: https://cmc.ihmc.us/cmc-proceedings (accessed on 26 May 2022).
  55. Vieira, C.; Magana, A.J.; Roy, A.; Falk, M.L. Student Explanations in the Context of Computational Science and Engineering Education. Cogn. Instr. 2019, 37, 201–231.
  56. Jaiswal, A.; Lyon, J.A.; Zhang, Y.; Magana, A.J. Supporting student reflective practices through modelling-based learning assignments. Eur. J. Eng. Educ. 2021, 46, 987–1006.
  57. Barab, S.; Squire, K. Design-Based Research: Putting a Stake in the Ground. J. Learn. Sci. 2004, 13, 1–14.
  58. The Design-Based Research Collective. Design-Based Research: An Emerging Paradigm for Educational Inquiry. Educ. Res. 2003, 32, 5–8.
Figure 1. Diagram coded as stepwise approach as a type of algorithmic thinking.
Figure 2. Example diagram demonstrating organization of larger solution method.
Table 1. Data overview for the current study.

Data Source | N | Description
Audio/Video Files of In-Lab Discussions | 4 groups (15 students total) | Four of the groups were randomly selected to be audio/video recorded during their entire planning process during lab time.
Planning Model Templates | 15 (one for each group) | All groups turned in a planning model template, one of which is used in this analysis for each group in the study.

* Four participants overlap the two data sources, for a total of N = 26 participants.
Table 2. Practices within computational thinking (CT) and definitions from the literature [6].

Computational Thinking Practice | Definition
Abstraction | Making problems easier to solve/think about by selectively removing/hiding unnecessary complexity, by making assumptions about how the problem should operate, or by neglecting or adding components to the problem.
Algorithmic Thinking | Setting up or talking about problems so that they can be solved in a stepwise manner by looking at the procedural steps/aspects of the solution, or by acknowledging the uses and effects of solution structures temporally and procedurally throughout the solution.
Evaluation | Comparing one’s own solution against various criteria, either given by the problem itself or brought to the problem by the students themselves. This can also be a statement describing desirable qualities of the final solution in terms of function, display, or other final-form characteristics.
Generalization | Reusing previous solutions, methods, or constructs for the current problem; or using current solutions, methods, or constructs from the current problem to hypothesize about their use in future contexts.
Decomposition | Thinking about how the pieces of the problem can be used separately, strategically breaking down the problem for easier solution or use, or breaking the problem down into subcomponent structures, pieces, or areas.
Table 3. Finalized codebook of CT practices for model-planning.

Abstraction: Ranges to values; Multiple to single; Dynamic to static; Geometric relationships; Similar systems; Infinite to finite
Algorithmic Thinking: Indication of later use/effect; Stepwise approach; Parallel methods; Conditional logic
Evaluation: Solution accuracy; Solution complexity; Time efficiency; Design criteria; Solution usability; Solution flexibility
Generalization: Previous coursework or experience; External applications
Decomposition: Organization of larger solution method; Allocation of resources

Bold denotes a newly identified CT outcome.
Table 4. Definitions and quotes of CT outcomes identified for abstraction (letters denote individual deidentified students).

Outcome | Definition | Quote
Ranges to values | Simplifying a range or list of values into a single value (e.g., worst-case scenario, average across a range, or picking the highest value) or a single function (assume that x follows function y). | “So z-value is the number of degrees that requires that variation. So I feel like we should choose the highest one again.”
Multiple to single | Simplifying aspects of the problem that have multiple dimensions or factors by reducing the dimensions or factors considered in the solution, through choice of one or neglect of factors; or creating limits or boundaries to what is considered to be affecting or influencing the considered system. | “A: The tin has, can we assume like no heat loss with the… I don’t know what I’m trying to think. Can we assume the can is negligible, like the metal part? D: Oh yeah, conduction through can wall is negligible because it’s so thick.”
Dynamic to static | Making factors or variables constant, uniform, or unchanging that would normally vary with respect to other variables or aspects of the problem. | “D: Density, like constant pressure means you have a density that’s not relying on pressure. It’s more relying on the temperature.”
Geometric relationships | Simplifying problems by assuming a geometric characteristic or relationship between variables, constants, or factors within the problem space. | “D: Yeah so we will have to assume.. uh C: Cylindrical. D: Cylindrical yeah. So we’ll get.”
Similar systems | Aligning properties of a current unknown system, or property of a system, with knowns of a system that is well known or familiar. | “B: What assumptions will you make to solve the problem? C: Assume, pumpkin pie filling is about like applesauce.”
Infinite to finite | Making aspects of the problem or system that are infinite or continuous into ones that are finite or discrete in nature. | “But uh depending on our can dimensions, we could do the whole like infinity, can split by infinity slabs, or just infinite cylinder.”
Table 5. Definitions and quotes of CT outcomes identified for algorithmic thinking (letters denote individual deidentified students).

Outcome | Definition | Quote
Indication of later use/effect | Indicating that information will be useful for a later process or a later point in the code; indicating what certain variables or structures will cause later in the solution or code, or explaining the location of code for effect. | “All I know is that we are going to need to determine like the alpha, so we need like the k’s and all of those things so that might be important then, like how much moisture there is, to determine the k value and like the cp.”
Using a stepwise approach | Giving things a logical order or listing steps to follow in order to solve the problem. This can include listing steps explicitly (e.g., 1, 2, 3…), diagramming, giving things logical order (i.e., we should do x first, y second, and then z third), or thought experiments (i.e., if we do x, then y will happen, and then z will likely follow). | “It’s saying, yeah. It is trying to say, okay, I don’t know my next value, but I know my temperatures right now throughout my area. Okay. Estimate, if I step one point in time, like one second, what will the temperature at that point I want to look at become. And then you take that value, you just guessed that your model and then plug it back into the equation for the next time around.”
Considering parallel methods | Indication that there are parallel or redundant solution methods or information given. | “D: So in this equation they use Ea, since uh temperature changes [inaudible] effect. And that’s what z value does. A: Oh okay, so z already takes that into account? D: Yeah so z and D are intertwined like Ea and D are intertwined. […] Since we have both D and z we don’t need Ea.”
Conditional logic | Making an if/else statement or conditional statement based on previous or future conditions (i.e., if I do x, y will happen). | “I don’t, I don’t remember them. But like for a tuna can you probably shouldn’t assume it, because it’s really flat and not much curvature. But for like a soup can you might get away with it. [laughs] Why’d you choose it? Engineers instinct.”
Table 6. Definitions and quotes of CT outcomes identified for evaluation (letters denote individual deidentified students).

Outcome | Definition | Quote
Solution accuracy | Evaluates the solution or the code with regard to the actual or perceived accuracy of the solution method decisions. | “The exact properties of Nacho Cheese may not be available. I think it will limit accuracy.”
Solution complexity | Evaluates the solution or code with regard to the amount of work, time, or effort needed to create the solution. | “What are the benefits of this technique? Uh, it’s simplistic, it’s not insane.”
Time efficiency | Evaluates the solution method based on the amount of time the code or computer takes or wastes to arrive at the final solution. | “I chose this method because it is computationally fast.”
Design criteria | Evaluates the code or solution against the actual or perceived wishes and requirements of the stakeholders or problem statement. | “This way if the moisture content is lower, we will still be within the desired amount of sterilization.”
Solution usability | Evaluates the solution or code with regard to ease of use for themselves or for others. | “… allow us to create a usable solution.”
Solution flexibility | Evaluates the solution or code based on the ability to easily change it or pursue future changes. | “This method gives us a lot of control over the parameters and the ‘machinery’ of the program.”
Table 7. Definitions and quotes of CT outcomes identified for generalization (letters denote individual deidentified students).

Outcome | Definition | Quote
Drawing from previous experiences | Use of previous coursework or personal experiences to inform the current solution method. This can include previous coursework, professional experiences, or explicitly related concepts from previous courses. | “C: Oh I remember this equation from [another course], the d equals d0, times… B: Oh yeah. C: or [another course] I don’t know, d equals d0 times ten to the t0 minus t over z.”
Projecting to other or future applications | Reference to other problem spaces, external to the current problem, where the current solution would or would not be useful based on the nature of the current problem or solution. | “So the model will not account for really hot or really cold days.”
Table 8. Definitions and quotes of CT outcomes identified for decomposition (letters denote individual deidentified students).

Outcome | Definition | Quote
Organization of larger solution method | Decomposing the code or solution method as an organizational tool. | See Figure 2.
Allocating resources | Breaking up a problem amongst group members in order to reduce complexity of the work required or to increase quality of the solution method. | “Do we wanna split up this in any way? Is that allowed? What we’re doing. Um. So like we’ll have to individually look up a bunch of equations and the limitations and benefits of the computer systems we decide to use. Or like the solving method we decide to use.”
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
