Article

The Associations Between Computational Thinking and Learning to Play Musical Instruments

School of Education, Tel Aviv University, Tel Aviv-Yafo 6997801, Israel
*
Author to whom correspondence should be addressed.
Educ. Sci. 2025, 15(3), 306; https://doi.org/10.3390/educsci15030306
Submission received: 29 October 2024 / Revised: 27 January 2025 / Accepted: 25 February 2025 / Published: 2 March 2025
(This article belongs to the Special Issue Measuring Children’s Computational Thinking Skills)

Abstract
This paper explores the association between computational thinking (CT) skills and learning to play musical instruments. While CT has often been linked to programming and STEM fields, its application to non-digital contexts remains underexplored. The two studies presented here address this gap. In the first, quantitative study (N = 91), young adults with varied musical backgrounds completed self-report questionnaires and CT tests. We found a strong positive association between musical experience and CT performance, with some nuanced associations depending on the characteristics of participants' playing experience. In the second, qualitative study (N = 10), we interviewed high school students who are highly experienced in music performance, aiming to identify the CT skills they use while learning to play musical pieces. The analysis revealed that they employ a wide range of CT skills, and that the manifestation of these skills differs by stage of learning. As the two studies complement each other, this paper sheds new light on the associations between CT and music education.

1. Introduction

Computational Thinking (CT) is the conceptual foundation required for solving complex problems effectively and efficiently. CT is instinctively associated with the use of computers, most often with programming, but not justifiably so; it comprises a set of skills that are important for problem-solving across domains and contexts and is therefore key in today's world, and, specifically, to learning across the curriculum.
Over the last two decades or so, the associations between CT and various subject matters have been studied extensively from various points of view: how CT can contribute to different disciplines, how learning in other domains can enrich CT, and how CT is intertwined in various educational contexts. As part of these explorations, some have particularly emphasized the development of CT.
However, what is still lacking in the current literature is an understanding of whether extensive experience in a specific domain can contribute to the development of CT when this experience is fully independent of CT. This is the gap we aim to bridge in this contribution. We do so in the context of playing musical instruments during childhood and adolescence.
Learning to play musical instruments has many advantages for the cognitive, meta-cognitive, and affective development of young people, and is therefore very popular among school-age children worldwide. Some aspects of learning to play musical instruments may seem to be associated with CT: decomposing the big task of learning a complicated musical piece into smaller, more manageable components and then bringing them together; identifying patterns in musical notation to understand it more deeply; automating the mechanical aspects of playing; or iterating to improve performance. However, a thorough study of the contribution of CT to learning to play musical instruments is still lacking. To address this need, we set up the following research questions:
  • CT performance by experience in, and characteristics of, playing musical instruments, among young adults.
    1.1.
    What are the differences in CT competency, based on experience in learning to play musical instruments?
    1.2.
    What are the differences in CT competency—for those who have experience in learning to play musical instruments—based on the characteristics of their musical experience?
  • Manifestation of CT skills among teens when learning to play musical instruments.
    2.1.
    How are CT skills manifested implicitly when teens learn to play musical instruments?
    2.2.
    What are the associations between manifested CT skills and the stage of learning?
    2.3.
    How do the CT components acquired during music studies contribute to problem-solving in computer science?

2. Literature Review

2.1. Computational Thinking Across Settings and Contexts

The term Computational Thinking (CT) was coined to describe a systematic approach to how humans solve complex problems. Since its original conception by Papert (1980), it has been widely accepted as an imperative skill for today's learners (Grover & Pea, 2013; Moreno-Leon et al., 2018; Shute et al., 2017). Major organizations, such as the World Economic Forum and the United Nations Educational, Scientific and Cultural Organization (UNESCO), consider CT to be part of the new literacies necessary for being a contributing member of society (Scott, 2015; World Economic Forum, 2015).
The history of CT over the last few decades has made it strongly associated with programming, or with computer-related activities at large. Indeed, CT stemmed from the recognition of the constant presence of computers around us, and was first introduced, by Papert (1980), in the context of programming; later, Wing’s (2006) paper, which helped to popularize the term, stated that CT draws on the concepts fundamental to computer science. However, Papert and Wing did not claim that CT is limited to the realm of computers. On the contrary, Papert argued that his approach could influence “how people think even when they are far removed from physical contact with a computer” (p. 4), and Wing suggested that CT will be part of a skill set of not only scientists, “but of everyone else” (p. 34). Today, operational definitions of CT often include both sub-skills that are strongly related to programming and computing—e.g., documenting and understanding software, analyzing and visualizing data, or familiarization with foundational programming concepts—and others that are related to general problem-solving across domains, e.g., evaluation, logical thinking, or problem decomposition (Tang et al., 2020). However, as recent literature reviews tell us, CT is still mostly practiced in the context of computational artifacts (Ezeamuzie & Leung, 2021; Khoo et al., 2022; Nordby et al., 2022; Ogegbo & Ramnarain, 2022; J. Su & Yang, 2023; Subramaniam et al., 2022), and mostly in the STEM disciplines (Chen et al., 2023; Cutumisu et al., 2019; Jin & Cutumisu, 2024; Lyon & Magana, 2020; Rao & Bhagat, 2024; Saad & Zainudin, 2022; Tatar & Eseryel, 2019). Recent incorporations of CT and music, for either making CT interdisciplinary or accessible to young students, demonstrate that this integration is still performed in a way that is closely related to programming (McCall et al., 2024; J.-M. Su et al., 2024; Zheng et al., 2023). 
We would like to break this twofold association, and to refer to CT as a general problem-solving skill that is relevant across the curriculum and across lifelong learning, with or without digital artifacts.

2.1.1. CT Without Digital Artifacts

Some of the most common theoretical frameworks of CT are explicitly based on experience with computer-based environments; for example, Scratch informed Brennan and Resnick's (2012) CT framework, and NetLogo computational models informed Weintrop et al.'s (2016) definition of CT in the STEM classroom. Placing CT as part of an international effort to promote digital literacy (Law et al., 2018) further supports the linkage between CT and digital environments.
Importantly, CT—as a mental process—is not dependent on computers, which is why we adhere to Shute et al.’s (2017) framework of CT, based on which CT is “the conceptual foundation required to solve problems effectively and efficiently (i.e., algorithmically, with or without the assistance of computers) with solutions that are reusable in different contexts” (p. 151). This definition comprises six facets of CT: (1) Decomposition—dissecting a complex problem into manageable parts; (2) Abstraction—extracting the essence of a complex problem using data collection and analysis, pattern recognition, or modeling; (3) Algorithms—designing logical and ordered instructions for rendering a solution to a problem using algorithm design, parallelism, efficiency, or automation; (4) Debugging—detecting, identifying, and fixing errors; (5) Iteration—repeating design processes to refine solutions; and (6) Generalization—transferring CT skills to a wide range of contexts. Each of these can be taught and practiced in various contexts that are not computer dependent, hence CT can be thought of as a non-computer-dependent skill (Caeli & Yadav, 2020; del Olmo-Muñoz et al., 2020; Huang & Looi, 2021), which makes it accessible to a wide range of learners, across various settings. As Roncoroni Osio and Bailón Maxi (2020) argued, CT can be used to solve any computable problem with just our brains, hands, and simple tools, like pencils, paper, or daily use objects. Indeed, even coding-related skills, like pattern recognition, algorithmic thinking, and debugging, can be taught by simply using cards and simple forms of representation (Tank et al., 2024).
Using unplugged activities is quite common when wishing to engage young children with CT. For example, decomposition was taught in primary school by letting children break down a long sequence of hand jive, tutting, and clapping in order to learn it more effectively; and abstraction was taught by letting students create a dough model that would include as few details as possible but would still allow them to represent whatever they choose to represent (Rijke, 2017). The notion of algorithms has been taught with young students leading their peers through a maze by using code-cards, with no computers involved (Kwon et al., 2024).
The use of pen-and-paper worksheets for teaching and practicing CT-related skills has also been popular. In this context, tasks could ask students, e.g., to describe in detail how to plant a tree (implementing decomposition and algorithms), to find the shortest path in a maze (pattern recognition and algorithms), or to instruct a peer to draw a given shape (abstraction and algorithms) (Puhlmann et al., 2019). We should recall that common CT assessment tools, like the CTt and its derivatives, take a classical pen-and-paper approach with similar tasks (El-Hamamsy et al., 2022; Román-González, 2015). A different type of CT-related task that can be administered with pen and paper is the Bebras challenge, in which students are required to use various CT-related skills to solve the task (Lockwood & Mooney, 2018).

2.1.2. CT Across the Curriculum and Lifelong Learning

CT has been acknowledged for its importance in developing knowledge and understanding of concepts across subject matters (Pollock et al., 2019; Tang et al., 2020). Although most heavily implemented in the STEM areas, there have been successful attempts to promote other disciplines using CT. A simple way of achieving this is by integrating programming and educational technology into the discipline. For example, historical databases were presented and analyzed using programming in a history class, in order to engage students with authentic data (Vlahović & Biškupić, 2023), and a voice-based programming app was used in order to engage young children with storytelling (Dietz et al., 2021). Indeed, using digital artifacts—mostly, programming-related—has been a common practice when integrating CT into various disciplines (Lyon & Magana, 2020; H. Ye et al., 2023), as coding has various important facets which give it potentially impactful cognitive, meta-cognitive, and socio-emotional dimensions (Brennan & Resnick, 2012; Melro et al., 2023).
Looked at from a different perspective, the integration of CT-related skills can be performed in a way that is more inherently integrated with the discipline, even in non-STEM areas. For example, entity-relationship diagrams were used as a means for implementing modeling—an imperative CT-related skill—in language lessons; students used these diagrams to identify the main characters and the relationships between them (Rottenhofer et al., 2021). In another case, abstraction and algorithms were inherently integrated into dance-focused lessons as a means to communicate dance moves and sequences to oneself and peers (Fairlie, 2023).
More broadly, we should recall that CT is essentially a skill for solving complex problems, hence it can be used to solve complex learning-related problems. For example, for English as a foreign language (EFL) students, writing essays in English is a complex task that can be supported using CT skills, as demonstrated by Nurhayati et al. (2022): decomposition is used to break down the writing topic into sub-topics; abstraction is used to conceptualize paragraphs and make sure they are coherent; and algorithmic thinking is used to create a logical flow of the text. On a more micro level of English learning by EFL students, CT skills were used to learn sentence-level grammar (Nurhayati et al., 2022); for example, using pattern analysis to identify correct forms of verbs, and using abstraction to model the structure of a sentence. These examples are highly relevant to the focus of our paper, as they show how CT can be viewed as a set of strategic tools that serve the successful completion of a complex learning task. Learning to play musical instruments is such a complex task; therefore, it is interesting to ask which CT-related strategies are associated with it. Furthermore, considering that practicing CT in the context of a specific domain builds both CT-related and domain-related knowledge (H. Ye et al., 2023), it is interesting to better understand the intrinsic relationship between learning to play musical instruments and CT, since playing could potentially enhance CT. These are the two issues that we now review.
Before doing so, it is important to note that CT and music have so far been integrated in various ways, either in the context of programming—or computational artifacts at large—or without them (Bell & Bell, 2018; Fanchamps et al., 2024; Petrie, 2022). As Chong (2018) clearly put it, there are some inherent associations between CT and music. For example, abstraction is strongly related to the very symbols used to represent notes and chords, or to the relationships between notes that together form chords and harmonies; patterns are observable along musical pieces in the form of repeated units or suspensions; and algorithms are evident in the form of written (symbolized) instructions given to individual players or bands that allow them to play a musical piece.

2.2. Learning to Play a Musical Piece: Stages and Strategies

There is no full agreement in the literature regarding the stages of learning to play a musical piece. It is important to remember that learning to play a musical instrument heavily involves individual practice, which may cause strategies for self-development to be created by learners (Ericsson et al., 1993; Hallam et al., 2012).
Interestingly, when Chaffin et al. (2003) studied this complex issue of learning to play a musical piece, they referred to it as a problem-solving experience, which makes it easy for us to associate it with CT. Their findings point to four phases of learning: (1) Scouting-it-out, which consists of sight-reading the piece from end to end, and which we may associate with data collection and analysis; (2) Section-by-section, in which the piece is broken into sections, each of which is practiced separately, and which we can conceptualize as decomposition; (3) The gray stage, during which the whole piece is played in a way that is somewhere between conscious directedness and full automation; and finally (4) Maintenance, which includes reflection and improvement and is thus strongly associated with iteration and debugging. Similarly, Lehmann et al. identified three stages of learning: first understanding the big picture of the piece, then technically practicing its different parts, and lastly polishing it; broadly speaking, these stages correspond to abstraction, decomposition, and debugging, respectively.
Austin and Berg (2006) grouped studying strategies used by musicians to master a musical piece based on the different behavior categories associated with self-regulated learning. Here, too, we can associate reported strategies with CT dimensions. For example, under Planning, strategies are related to abstraction—namely, looking over the music before practicing, and taking a systematic approach—and debugging, namely, focusing on difficult parts; under Strategies, repetition of difficult measures can be associated with the CT dimension of iteration, and decomposition and iteration are evident in, e.g., slowing down or working on parts of the piece before playing it all; and under Monitor Progress, debugging can be associated with working on the technique. Taken together, we conclude that while learning to play musical pieces, musicians develop various strategies, which may be associated with different stages of learning, and which can be linked to CT-related skills.

2.3. Cognitive Skills as Mediating Between Music and Computational Thinking

Over the last few decades, vast research has focused on the associations between music and improved cognition. In the early 1990s, Rauscher et al.’s (1993) seminal study ignited this line of research, demonstrating how listening to music, specifically to one of Mozart’s sonatas, led to a temporary enhancement of spatial reasoning; this became known as the “Mozart Effect”. Although the original study was limited to spatial reasoning, using a given set of testing tools, a report in the New York Times claimed that the researchers “have determined that listening to Mozart actually makes you smarter”, which led many scholars to test this claim. A review of many such studies over almost two decades concluded that listening to music—not just Mozart’s—can change how people feel, which, in turn, influences how they perform on cognitive tests, mostly impacting cognition in the short term, hence supporting the arousal and mood hypothesis (Schellenberg, 2012). Still, some important, long-term cognitive contributions have been associated with music training or with listening to music—for both children and adults—which makes it difficult to a priori reject associations between music and cognition (Habibi et al., 2018; Schneider et al., 2019).

3. CT Performance by Experience in, and Characteristics of, Playing Musical Instruments

In this section, we report on findings from a study designed to answer RQ1. The study employed a quantitative methodology (N = 91 young adults), using online questionnaires administered to a demographically diverse cohort of young adults with varied musical experience.

3.1. Methodology

3.1.1. Research Population and Data Collection

Our sample consists of 91 young adults from Israel. Dissemination of the survey was facilitated through diverse WhatsApp and Facebook groups, targeting respondents aged 18–25 without additional demographic constraints. During the preliminary data curation phase, 4 participants were excluded for incomplete responses on the computational thinking test. Consequently, the effective sample size for subsequent statistical analyses amounted to N = 87.
The average age of the participants is 21.2 years (SD 2.3), with 62% females and 38% males. Almost half of our population is either currently pursuing an academic degree or has already completed at least a Bachelor’s. Approximately 65% of the participants reported having received musical instruction in the past. The mean programming experience level among the participants was 2.5 (SD 1.4; on a scale of 1–5). A concise description of our sample is provided in Table 1.

3.1.2. Independent Research Variables and Their Measurement

Independent variables regarding participants’ backgrounds and their experience in playing music were measured using an online survey. Below we present the items we used in this survey (in parentheses, next to each variable).

Background

  • Gender (“Gender”) [Male/Female/Other/Prefer not to disclose].
  • Age (“Age”) [Numeric].
  • Education Level (“What is the highest education level that you completed?”) [Partial High School/High School/Non-Academic Post-High School/Currently pursuing a bachelor’s degree/Bachelor’s degree and above]—Data were transformed into an ordinal variable [1–5, respectively].
  • Programming Experience (“To what extent are you experienced in programming?”) [1–5]—Measured using a Likert scale, ranging from 1 (“None”) to 5 (“Proficient”); there were no labels for the intermediate values.

Playing Music

  • Musical Experience (“Have you ever played a musical instrument for over a year?”) [Yes/No] (Those participants who responded with “No” were not presented with further questions regarding the variables below).
  • Number of Years of Musical Study (“How many years have you—or had you—played on the main musical instrument?”) [Numeric].
  • Perceived Playing Level (“In your opinion, what is the playing level that you achieved on the main musical instrument?”) [1–5]—Measured using a Likert scale, ranging from 1 (“Basic”) to 5 (“Professional”); there were no labels for the intermediate values.
  • Proficiency in Reading Sheet Music, Chords, and Tabs (“To what extent are you proficient in reading music in each of these?”) [1–5]—Measured using a Likert scale, for each type separately, with the following value-labels: 1, “Cannot read”; 2, “Reads a little”; 3, “Partially reads”; 4, “Knowledgeable”; 5, “Proficient”.
  • Understanding of Music Theory and Harmony (“What is your level of proficiency in understanding music theory and harmony?”) [1–5]—Measured using a Likert scale, ranging from 1 (“Basic”) to 5 (“Professional”); there were no labels for the intermediate values.
  • Musical Instruments Played (“On which musical instrument did you or do you play?”) [Piano/Keyboard/Guitar/Wind Instruments/String Instruments/Drums and Percussion] (Multiple choice)
  • Playing Style (“Which music style are you playing?”) For each style [Classical/Jazz/Rock-Pop/Metal/Klezmer], there was a selection of [Primary/Secondary/Occasional].
  • Experience in Playing in Ensembles (“Have you ever played in one or more of these ensembles?”) [Orchestra/Band/Big Band/Vocal Ensemble/Chamber Ensemble] (Multiple choice).
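The ordinal and Likert codings listed above can be sketched in code. This is an illustrative scheme only; the function and variable names are ours, not taken from the study's instruments:

```python
# Sketch of coding labeled survey responses into 1-based ordinal values.
# The label set mirrors the education-level item described above.

EDUCATION_LEVELS = [
    "Partial High School",
    "High School",
    "Non-Academic Post-High School",
    "Currently pursuing a bachelor's degree",
    "Bachelor's degree and above",
]

def encode_ordinal(response: str, levels: list[str]) -> int:
    """Map a labeled response to its 1-based ordinal code."""
    return levels.index(response) + 1

# e.g., a respondent who completed high school is coded 2
```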

3.1.3. Dependent Variables (Computational Thinking) and Their Measurement

In order to assess the participants’ computational thinking level, we administered an online questionnaire with ten computational thinking test (CTt)-style questions and four Bebras tasks; in each case, the score was calculated as the percentage of correct responses. Together, the CTt and the Bebras tasks form a comprehensive evaluation of the participants’ CT skills. According to Román-González et al. (2019), it is recommended to integrate various tools to comprehensively assess an individual’s CT abilities, as the information derived from each possesses distinct characteristics: diagnostic assessment tools (such as the CTt) provide information on how students ‘remember’ and ‘understand’ various CT concepts, whereas skill transfer assessment tools (such as Bebras tasks) offer insights into students’ ability to ‘analyze’ and ‘apply’ their CT skills in diverse contexts (Román-González et al., 2019).
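Scoring each instrument as the percentage of correct responses amounts to a one-line computation; a trivial sketch for concreteness (the function name is ours):

```python
def percent_correct(responses: list[bool]) -> float:
    """Score a test as the percentage of correct responses."""
    return 100.0 * sum(responses) / len(responses)

# e.g., 7 correct answers out of 10 questions yields a score of 70.0
```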

CT Test—CT Diagnostic Assessment Tool

The ten diagnostic assessment questions are in the computational thinking test (CTt) style, based on the test developed by Román-González (2015) and adapted from the Callysto Computational Thinking Test (Cutumisu et al., 2021). This test examines cognitive CT skills involving concepts and practices (sequences, i.e., the importance of the order of operations needed to solve a problem; loops, i.e., repeating units; and conditions) and was found to have relatively high internal consistency (Cronbach’s alpha of 0.76). This test was selected because it is based on Román-González’s CTt but comprises only ten questions (as opposed to the original 28) and has already been validated. The original test is in English and was translated into Hebrew by the researchers for the purposes of the questionnaire; the translation was performed jointly by the first and third authors over a few sessions until full agreement was achieved.

Bebras Tasks—CT Skill Transfer Assessment Tool

Four assessment questions for the transfer of CT skills are Bebras tasks. The international Bebras competition was established in Lithuania in 2003 with the goal of promoting interest and excellence in computer science among students worldwide from a CT perspective (Cartelli et al., 2010; Dagiene et al., 2014, 2019). Each year, the competition offers a set of tasks whose essence is solving real, meaningful problems through the transfer and demonstration of CT skills. Since its inception, the competition has grown to a level where more than 1.3 million people in Europe and other parts of the world participated in the Bebras Challenge during the November 2015 Bebras Week (Dagienė & Sentance, 2016). Although Bebras did not originate as an assessment tool, researchers have noted that its items can be mapped to the problem-solving structures underlying programming and CT, such as algorithmic thinking and working with structures and patterns (Barendsen et al., 2015).
The Bebras tasks in the current research are derived from the study of Lockwood and Mooney (2018), who created a test to assess CT skills in two parallel versions, based on Bebras tasks, with 13 questions in each version. In their article, they report on a study in which they ranked the difficulty level of each question. The test was found to be valid and effective for ages 15–19 (Lockwood & Mooney, 2018; Mooney & Lockwood, 2020). Previous studies have also used Bebras tasks to assess the CT skills of undergraduate students (Boom et al., 2018; Dolgopolovas et al., 2015). For this study, four Bebras tasks with different difficulty levels and from different domains were selected from the test created by Lockwood and Mooney, following a preliminary pilot conducted with three male and three female participants aged 18–21. The pilot participants reported that the questions were challenging yet interesting, taking an average of five minutes each.
The correlation between the CT test score and the Bebras score was moderately high and significant (r = 0.72, p < 0.001). A Wilcoxon signed-rank test (analogous to a paired-samples t-test) examining the difference between the median scores of the two tests did not yield a significant result: T = 1549, p = 0.97. However, given the limited prior research supporting the combination of these measures, we decided not to combine the two dependent variables; statistical analyses are therefore performed separately for each test.
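For readers who want to reproduce the correlation computation on their own data, the Pearson r between two score vectors can be computed with the standard library alone. This is a sketch, not the study's analysis code (the analyses were run in JASP), and the score vectors below are hypothetical:

```python
from math import sqrt

def pearson_r(x: list[float], y: list[float]) -> float:
    """Pearson product-moment correlation between two equal-length vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Perfectly linearly related score vectors yield r close to 1.0.
```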

3.1.4. Research Process

The questionnaire was administered through the Google Forms online platform during two distinct periods: from January 2023 to April 2023, and again a year later, from mid-February to mid-March 2024. The two samples were compared to ensure there were no major differences between the populations across these periods.

3.1.5. Data Analysis

Statistical analyses were conducted using the JASP program, predominantly employing parametric tests. When needed, the Mann–Whitney test was utilized as a non-parametric equivalent of the independent-samples t-test. When considering effect size for the Mann–Whitney test, we used the rank-biserial correlation (RBC) (J. Cohen, 1988; Cureton, 1956).
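The rank-biserial correlation can be derived directly from the Mann–Whitney U statistic. The following stdlib-only sketch shows the relationship (JASP computes this internally; the function names are ours):

```python
def mann_whitney_u(group_a: list[float], group_b: list[float]) -> float:
    """U statistic for group_a: count of (a, b) pairs with a > b; ties count 0.5."""
    u = 0.0
    for a in group_a:
        for b in group_b:
            if a > b:
                u += 1.0
            elif a == b:
                u += 0.5
    return u

def rank_biserial(group_a: list[float], group_b: list[float]) -> float:
    """Rank-biserial correlation: r = 2U / (n1 * n2) - 1, ranging over [-1, 1]."""
    u = mann_whitney_u(group_a, group_b)
    return 2.0 * u / (len(group_a) * len(group_b)) - 1.0

# r = 1 when every value in group_a exceeds every value in group_b;
# r = 0 when neither group stochastically dominates the other.
```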

3.2. Findings

In this section, the research findings are presented. The first subsection displays the descriptive statistics of the research variables (background variables, musical experience, and test scores), and the second subsection describes the relationships found between the independent variables (background variables and variables related to playing music) and the two dependent variables: the score on the computational thinking test (CTt) and the score on the Bebras tasks.

3.2.1. Descriptive Statistics for Determining the Course of Data Analysis

We tested the normality of the variables measuring age, musical experience, and CT. For medium-sized samples (50 < n < 300), if the absolute Z-value is less than 3.29 (corresponding to an alpha level of 0.05), we fail to reject the null hypothesis and can treat the distribution as approximately normal (Kim, 2013). Based on our analysis, it is appropriate to use parametric statistical tests for these variables; the findings are summarized in Table 2. For the scale-based variables, we use non-parametric analyses.
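The rule of thumb above divides the sample skewness by its standard error and compares |Z| to 3.29 (Kim, 2013). A stdlib sketch of that statistic for skewness (the kurtosis version is analogous; the function name is ours):

```python
from math import sqrt

def skewness_z(data: list[float]) -> float:
    """Z-statistic for sample skewness: bias-corrected skewness divided by its
    standard error. Per the rule of thumb above, |z| < 3.29 is treated as
    consistent with normality for medium-sized samples (50 < n < 300)."""
    n = len(data)
    mean = sum(data) / n
    m2 = sum((x - mean) ** 2 for x in data) / n
    m3 = sum((x - mean) ** 3 for x in data) / n
    g1 = m3 / m2 ** 1.5                      # biased sample skewness
    G1 = g1 * sqrt(n * (n - 1)) / (n - 2)    # bias-corrected skewness
    se = sqrt(6.0 * n * (n - 1) / ((n - 2) * (n + 1) * (n + 3)))
    return G1 / se

# Symmetric data gives z near 0; a strongly right-skewed sample gives a large z.
```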

3.2.2. Associations Between Background (Independent) Variables and CT (RQ2.1)

Following is a description of the associations between the background variables, accompanied by descriptive statistics in Table 3 and Table 4.
Gender. The gender variable was found to be significantly associated with the scores on both CT tests. As presented in Table 5, the average female score on the CTt was lower than the average male score; this difference is statistically significant, with a medium effect size (Cohen’s d = 0.65). Similarly, the average female score on the Bebras tasks was lower than the average male score, with this difference also reaching significance, exhibiting a small-medium effect size (Cohen’s d = 0.44).
Age, level of education, and programming experience did not show strong correlations with computational thinking skills, as shown in Table 4.

3.2.3. Associations Between Music-Related Independent Variables and CT (RQ1.2)

The next aspect we aimed to understand was the correlation between the playing-related independent variables and the test scores. We found a moderate, positive, significant correlation between years of playing music and both CT tests, with r = 0.49 for CTt and r = 0.36 for Bebras, both at p < 0.001 (N = 87). To capture meaningful experience in playing music, we defined a threshold of playing for over a year; for participants who fulfill this condition (N = 52), the correlations are stronger, with r = 0.50 for CTt and r = 0.46 for Bebras, both at p < 0.001. We continue the analysis in this subsection for this sub-population. Interestingly, the perceived level of playing did not show significant associations with either of the CT tests, with ρ = 0.17 for CTt, at p = 0.24, and ρ = 0.20 for Bebras, at p = 0.17.
Reading Music. The ability to read sheet music was positively correlated with the CTt score and marginally significantly correlated with the Bebras score; the ability to read music as tabs was negatively correlated with both CTt and Bebras scores, suggesting that participants who are more proficient in reading tabs tend to perform worse on these tests. Knowledge of music theory did not show significant correlations with the test scores. The findings are summarized in Table 5.
Musical Instruments Played. In this variable, participants could indicate all the musical instruments they played. Thirty-eight participants (73%) reported playing only one type of instrument, nine (17%) played two types of instruments, four (8%) played three types of instruments, and one (2%) played five types of instruments. No significant correlations were found between the number of types of instruments played and CT scores, with ρ = 0.02 for CTt, at p = 0.89, and ρ = −0.06 for Bebras score, at p = 0.69.
Among the participants, those who played wind instruments (N = 19) achieved higher average scores on the CTt than those who did not. Due to the relatively small groups, we used the Mann–Whitney U test. The difference was statistically significant, suggesting that playing wind instruments may be associated with enhanced CT skills. Findings are summarized in Table 6. Other instruments did not show significant differences in CT; since only one participant played a string instrument other than guitar, we did not use this variable.
Playing Style. Here, respondents could choose “main style”, “secondary style”, “sometimes play”, or not mark anything for each of the styles played (classical, jazz, rock–pop, metal, or klezmer). To examine the relationship between each style and the scores on the two tests, an ANOVA test was performed. The data for all groups are summarized in Table 7.
Findings show a significant difference only for the jazz style, for both CT tests, with small-medium effects (η2 = 0.33 for the CTt and η2 = 0.41 for the Bebras score). Tukey post hoc tests revealed that respondents who reported playing jazz only sometimes scored higher on the CTt than those who reported playing it as a secondary style, with t = 3.8, at p < 0.05. For the Bebras test, those who reported playing jazz only sometimes scored higher than both those who reported playing it as a secondary style (t = 4.5, at p < 0.001) and those who reported playing it as a main style (t = 3.1, at p < 0.05).
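For reference, the η² effect size reported with the ANOVA is the between-group sum of squares divided by the total sum of squares. The sketch below computes it for made-up score groups (not the study data):

```python
import numpy as np

def eta_squared(*groups):
    """Eta-squared for a one-way ANOVA: SS_between / SS_total."""
    all_scores = np.concatenate([np.asarray(g, dtype=float) for g in groups])
    grand_mean = all_scores.mean()
    ss_between = sum(len(g) * (np.mean(g) - grand_mean) ** 2 for g in groups)
    ss_total = ((all_scores - grand_mean) ** 2).sum()
    return ss_between / ss_total

# Hypothetical CTt scores by jazz-playing category (illustrative only)
main_style = [18, 20, 19, 21]
secondary = [17, 18, 16, 19]
sometimes = [24, 26, 25, 27]
eta2 = eta_squared(main_style, secondary, sometimes)
```

η² ranges from 0 (group membership explains none of the score variance) to 1 (it explains all of it).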
Experience in Playing in Ensembles. In this variable, respondents were able to list all the ensembles they had participated in, with options including orchestra, band, big band, chamber ensemble, or vocal ensemble. Six respondents indicated that they did not participate in any ensembles, 12 played in one type of ensemble, 17 played in two types of ensembles, 11 played in three types of ensembles, 3 played in four types of ensembles, and 3 participated in five different types of ensembles. No significant correlations were found between the number of types of ensembles in which respondents played and their scores on the CT tests, with ρ = 0.04 for CTt, at p = 0.80, and ρ = 0.03 for Bebras score, at p = 0.85.
Due to the relatively small group sizes, we used the Mann–Whitney U test. As shown by the findings presented in Table 8, we found that those who reported playing in an orchestra achieved significantly higher average scores, both on the CTt and on the Bebras tasks, than those who did not play in an orchestra, with a medium-small effect size (RBC = 0.37).
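The rank-biserial correlation (RBC) reported alongside the Mann–Whitney U test can be derived directly from the U statistic. A hedged sketch with hypothetical groups follows; note that scipy’s `mannwhitneyu` returns the U statistic for the first sample:

```python
from scipy import stats

def mann_whitney_with_rbc(a, b):
    """Two-sided Mann-Whitney U test plus the rank-biserial correlation."""
    u, p = stats.mannwhitneyu(a, b, alternative="two-sided")
    # Rank-biserial correlation: (U1 - U2) / (n1 * n2), where U1 + U2 = n1 * n2
    rbc = 2 * u / (len(a) * len(b)) - 1
    return u, p, rbc

# Hypothetical CTt scores (illustrative only, not the study data)
orchestra = [24, 27, 25, 28, 26, 29]
no_orchestra = [21, 23, 22, 24, 20, 25]
u, p, rbc = mann_whitney_with_rbc(orchestra, no_orchestra)
```

An RBC of 0 indicates no tendency for one group to outrank the other, while ±1 indicates complete separation; a value of 0.37, as reported above, corresponds to a small-to-medium effect.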

3.3. Discussion

The present study aimed to explore the differences in CT competency, based on experience in learning to play musical instruments, and to understand those differences based on the characteristics of the subjects’ musical experience. The study investigated various background and musical experience variables, focusing on different components characterizing musical involvement. Additionally, it examined CT skills using two measures: the computational thinking test (CTt), which provides a diagnostic assessment, and the Bebras tasks, which evaluate the application of computational thinking skills in problem solving. The findings offer several insights into the factors that influence CT test performance and highlight interesting associations between musical experience and CT.

3.3.1. Differences in CT Competency Based on Personal Characteristics

Gender

Our findings indicate significant gender differences in performance on both CTt and Bebras tasks. Men outperformed women on both tests, with medium effect sizes observed. This finding aligns with previous research suggesting similar gender differences for younger populations (Guggemos, 2021; Román-González et al., 2017); however, overall, past research has shown mixed results regarding the associations between CT and gender (Lin & Wong, 2024; Sun et al., 2022). Specific to Bebras tasks, it was found that boys perform better in tasks that have more text and simpler representation, while girls perform better in tasks that have less text and more complex representation (Budinská & Mayerová, 2017), and it might be that the tasks that we chose were biased towards this gender-based advantage. Future research should delve deeper into the underlying causes of these differences to develop targeted educational interventions that can help bridge the performance gap. Interestingly, assessing CT via digital games showed little or no difference between boys and girls (Guenaga et al., 2021; Israel-Fishelson et al., 2021), which may lead us to think about computer-based interventions.

Age

Age was found to be negatively correlated with scores on both the CTt and Bebras tasks. This indicates that younger participants tended to perform better on these cognitive tests. The consistency of this correlation across both test types suggests a stable association between age and CT performance. This finding is in line with previous research showing that cognitive abilities can decline with age starting as early as in the late teens and early twenties (Salthouse, 2009). However, it is essential to consider that other factors, such as differences in educational background or familiarity with the test content, might also play a role in this observed decline.

Programming Experience

The study found that programming experience was negatively correlated with performance on the Bebras tasks and was not correlated with the CTt. This unexpected finding could suggest that individuals with programming experience might approach Bebras tasks with a different mindset, potentially overcomplicating the solution process. Further research is needed to explore this relationship and understand the nuances of how programming experience impacts different types of cognitive tasks. Previous studies found mixed findings regarding the associations between programming experience and CT when assessed using digital games, with either a positive association (Guenaga et al., 2021) or no association (Israel-Fishelson et al., 2021).

3.3.2. CT Competency and Music Playing Characteristics

Musical Experience

One of the most striking findings was the significant positive association between musical experience and performance on the CTt. Participants with musical experience scored significantly higher on CTt compared to those without such a background, with a large effect size. This is consistent with previous studies that have documented the positive correlation between years of playing an instrument and cognitive performance (Bugos et al., 2007; Habibi et al., 2018; Hansen et al., 2013; Lee et al., 2007; Schellenberg, 2004). Based on the design of our study, we cannot conclude causal relationships and can only point to previous studies which, by their very design, suggest such causality. For example, Schellenberg (Schellenberg, 2004) claims that music lessons cause small increases in IQ, while Habibi et al. (2018) concluded that music training leads to some brain changes in school-age children, and Bugos et al. (2007) suggest that individualized piano instruction may serve as an effective cognitive intervention. Interestingly, musical experience did not show a significant association with Bebras tasks. This suggests that cognitive benefits of musical training may be more specific to certain types of problem-solving and logical reasoning skills rather than general computational thinking abilities.
Note that the current frequency of playing did not significantly correlate with scores on either test, suggesting that the overall duration of musical experience may be more critical than recent activity level. Notably, the perceived level of playing did not show a significant correlation with test scores, which may be attributed to common discrepancies between self-perceived and actual competencies (e.g., Mamolo et al., 2020; Porat et al., 2018). Therefore, it is interesting to refer to actual indicators of playing level, like instruments being played, participation in ensembles, playing style, and ability to read music. We refer to these now.

Reading Music

Reading music showed mixed results. The ability to read music notes was positively correlated with CTt scores, and was marginally positively correlated with Bebras tasks, suggesting that reading complex musical notation may be associated with certain aspects of computational thinking. However, reading music as chords and tabs did not show a significant positive correlation with test scores, and in fact, reading tabs was negatively correlated with both CTt and Bebras tasks. This may reflect the different cognitive demands of these music reading skills and implies that reliance on tab notation, which is often less complex than traditional music notation, might not support the cognitive skills measured by the CTt and Bebras tasks.

Musical Instruments Played

The findings indicate that participants who played wind instruments achieved significantly higher scores on the CTt compared to those who did not play wind instruments. One possible explanation for the higher CT scores among wind instrument players is the complexity and cognitive demands associated with playing these instruments. Previous research suggests that playing instruments like the piano or wind instruments requires higher levels of cognitive activation and complex skill sets compared to instruments such as keyboards (that are not piano), percussion, and guitar, which might be perceived as more straightforward or routine (Moriuchi et al., 2020; Shu, 2021). Another explanation could be that wind instrument players commonly practice sight reading, i.e., reading and performing pieces the player has not seen before; this is a challenging task, and could further develop their cognitive abilities (Galyen, 2005). In contrast, instruments like keyboards and guitars may often involve basic chord reading, and percussion instruments primarily require rhythm reading rather than melody reading, which may explain the lower CT scores associated with these instruments. Further research is needed to explore these differences in more depth and to understand the varying impacts of different instruments on cognitive skills.

Playing Style

Examining different playing styles, we found an interesting pattern regarding playing jazz. Those who only occasionally played jazz scored higher on CT tests than those who played it regularly. A possible explanation may be related to the fact that jazz is considered the most creative style of music playing, and jazz musicians demonstrate higher levels of creativity (Benedek et al., 2014).
Creativity may be a link between music-playing and CT. As creativity and CT refer to knowledge-construction processes and involve an overlapping set of thinking tools (Román-González et al., 2017), it is reasonable to look for associations between these two skills. In a way, these two skills share even deeper common ground, as both refer to finding new ways of thinking when facing problems that need to be solved (Israel-Fishelson & Hershkovitz, 2022b). Indeed, the International Society for Technology in Education and the Computer Science Teachers Association included creativity in their definition of CT (ISTE & CSTA, 2011). Different studies have highlighted the growing acceptance of the bidirectional association between computer science—and CT in particular—and creativity (Dagiene et al., 2019; Kong, 2019; Miller et al., 2013; Pérez Poch et al., 2016; Seo & Kim, 2016). On the one hand, creativity may catalyze solving algorithmic problems, creating computational products and developing new knowledge (Dagiene et al., 2019; Israel-Fishelson & Hershkovitz, 2022a; Romeike, 2007). On the other hand, creative thinking can be cultivated by practicing skills related to CT, e.g., observation, visualization, abstraction, imagination, and pattern recognition (Clements & Gullo, 1984; Seo & Kim, 2016; World Bank, 2019; Yadav & Cooper, 2017). In this light, the negative association between playing jazz and CT is in line with other studies that showed a negative relationship between creativity and CT (Hershkovitz et al., 2019; Israel-Fishelson et al., 2021).

Ensemble Experience

Ensemble experience was positively correlated with CTt scores but not with Bebras tasks, indicating that participating in multiple ensembles might enhance specific cognitive skills related to CT. Playing in an orchestra, in particular, was associated with higher scores on both tests, highlighting the potential cognitive benefits of collaborative and structured musical activities. It is possible that the complexity required of musicians in such ensembles contributes to these benefits; when a musician participates in an ensemble, in addition to the cognitive skills needed for playing their instrument, they must also be aware of everything happening around them, such as the conductor and the other musicians, which requires musicians to exert higher cognitive-auditory effort (Habibi et al., 2018).
Another related issue may be spatial perception, which is relevant both to playing in an ensemble and to CT. A study of the mathematics-classroom abilities of children who played musical instruments concluded that their spatial–temporal reasoning was positively affected (Tezer et al., 2016). This is evident, for example, in their perception of geometry-related content, or in their ability to see details and envision an object without seeing it. Moreover, a meta-analysis of 15 studies showed that active music instruction of two years or less could lead to dramatic improvements in performance on spatial–temporal measures (Hetland, 2000), a finding that was also supported in a brain imagery study (Gaser & Schlaug, 2003), as well as in a study that demonstrated superiority in mental rotation abilities of students of music over students of education (Pietsch & Jansen, 2012). The spatial advantage is often explained by the fact that musicians have practiced turning visual, paper-based representations into motor movements in the real world. In the realm of CT, spatial reasoning and mental rotation have proven important. This is especially evident in the context of simple maze-based CT tasks; such tasks revolve around leading an agent—whether physical or virtual—from point A to point B, using some representation of instructions, e.g., text-based or block-based programming, or symbol-based cards. More complex tasks could involve leading an agent through various spaces to achieve some pre-defined goals, e.g., collecting coins or scoring a goal. Spatial reasoning is clearly crucial in these settings, hence it is not surprising that it develops through engaging with such tasks (Berson et al., 2023; Leonard et al., 2023), and that it is associated with errors in solving them (Ben-Yaacov & Hershkovitz, 2023).
From a broader perspective, sequencing, decomposition, pattern recognition, and even abstraction—in the sense of hiding details—are connected to spatial reasoning (Chan et al., 2023), which may explain recurring associations observed between CT and spatial reasoning (e.g., Città et al., 2019; Soleimani et al., 2016).
Musical group interactions, defined as a situation where two or more people play music together—as evident in ensembles—bring participants into states of “togetherness” (Cross, 2009), in which special attention is given not only to physical actions but also to emotional states (Cross et al., 2012; Rabinowitch et al., 2013). This leads us to link our findings to the theory of mind, which refers to the capacity to understand other people by attributing mental states to them (cf. Byom & Mutlu, 2013; Frith & Frith, 2005); indeed, some interesting associations between theory of mind and computational thinking have been recently shown (J. Su et al., 2024).

4. Manifestation of CT Skills When Learning to Play Musical Instruments

In this section, we report on findings from a study that was designed to answer RQ2. In this study, we took a qualitative approach (N = 10), collecting data via interviews with 12th-grade students who have reached the professional level of playing required for the final matriculation exam in music performance (Recital Bagrut), and who were also studying in an advanced Computer Science program.

4.1. Methodology

4.1.1. Research Field

This study was conducted in Israel, where the education system is mostly public and centralized, and is typically divided into three school levels: elementary schools (1st–6th grades), middle schools (7th–9th grades), and high schools (10th–12th grades). At the end of high school, students are required to take matriculation exams, some of which are mandatory (Hebrew, History, Civics, Mathematics, English, and Religious Studies), while others are elective, chosen from a wide range of subjects. Students can take exams at various levels of difficulty, determined by the number of Study Units (usually between 1 and 5).
To take the five-Study-Unit exam in Recital, Composition, Orchestration, and Conducting in Music, students are required to have meaningful, continuous experience of playing a musical instrument in one of the 85 conservatories of music approved by the Ministry of Education, or with an approved private teacher. The Recital exam can be performed in five different music styles: Classical, Jazz, Rock and Pop, Arabic, and Middle Ages and Renaissance music. Each music style has its own requirements and evaluation methods, but, in general, all styles require a similar volume of preparation. An average exam includes the preparation of five different music pieces chosen by the student and performed live before a jury, and only once. The duration of the entire program for one student is between 25 and 40 min of playing music. According to officials in the Department of Music Education at the Ministry of Education1, in 2024, 881 students (out of 1400 12th-grade conservatory students) took this exam.
The course of study that is aimed at the matriculation exam in Computer Science at the level of five Study Units usually requires passing a screening exam. This course of study usually takes either two or three years, during which students learn both the fundamentals of computer science and programming.
As part of the research, interviews were conducted with 12th-grade students from six different conservatories under the supervision of the Ministry of Education. These students completed a matriculation exam in music performance and a matriculation exam in computer science as part of their studies in 2022. The interviews were transcribed and subjected to qualitative content analysis using a “bottom-up” model.

4.1.2. Research Population

Our sample includes 10 students (8 males and 2 females), aged 17–18, who were in the 12th grade in 2022, studying towards their matriculation exams in—among other subjects—Recital, Composition, Orchestration, and Conducting in Music and Computer Science, both at the level of five Study Units. The students performed a final recital exam in the classical field (7 students) or in the jazz field (3 students). The participating students played a variety of musical instruments: flute (1 student), clarinet (1 student), saxophone (3 students), tuba (2 students), electric guitar (1 student), piano (1 student), and drum set (1 student). Most of the students live in the center of the country (8 students), and one student lives in the north of the country. The cities and towns where the participants live are mostly characterized by a high socioeconomic status: all but one belong to Clusters 8 or 9 of the geographical units, where Cluster 1 includes the lowest socioeconomic level and Cluster 10 the highest; one participant lives in a city that belongs to Cluster 6 (Central Bureau of Statistics, 2024).

4.1.3. Research Tool and Research Process

Data collection was carried out using a semi-structured interview consisting of three parts, as described below. Interviews were conducted remotely, through Zoom, lasting between 25 and 45 min.

Part I—Demographic Details

After signing an informed consent form and before the interview, the students were asked about demographic details regarding their age, gender, years of education, place of residence, school, background in musical training, and achievements in music and computer science subjects. The students filled in these details in an online questionnaire.

Part II—Description of Thinking Processes When Solving Musical Problems

During the interview, the students were asked to describe their personal learning process in music, regarding, e.g., learning a piece of music, rehearsing the composition, and playing and practicing at home. They were also asked to detail their thought process when encountering problems or challenges—in the context of playing a musical piece—that required a solution. The questions in this study refer to how the student trains to perform the piece optimally. They aim to clarify the student’s work process, decision-making process, problem-solving method, and regular learning habits. The questions are formulated as guiding questions, as is customary in a semi-structured interview.

Part III—Transfer of Skills Between Music Education and Computer Science

Finally, the students were asked to identify connections between their ways of thinking in solving musical problems and their approaches to solving problems in computer science. They were asked how much they believed the skills developed through musical education contributed to their abilities in computer science and what skills were established and strengthened through learning and problem-solving in other fields.
Below are the key questions that guided the interviews, which are designed to explore these aspects thoroughly:
  • Please describe your practice routine when you receive a new piece from your teacher
    • How do you begin working on a new piece?
    • How do you deal with the challenging sections?
    • Are you able to identify your mistakes? If so, how?
    • How do you correct those mistakes?
  • Does your teacher provide you with specific instructions for home practice, or have you developed your own practice routine?
    • How long does it typically take you to learn a piece?
    • Can you describe your practice environment?
  • How do you think the learning process you just described is like your learning process in computer science?
    • How do you think this learning process helps you succeed in computer science?
    • Can you provide examples where you felt that your years of musical training helped you in computer science?

4.1.4. Data Analysis

The interviews were fully transcribed before analysis. The full transcripts were analyzed by both authors using the conventional content analysis approach, in which coding categories are derived directly from the text data. Categories and themes were defined in an iterative converging process, upon full agreement between the two coders, moving from low-level codes to a coherent, high-level scheme (Elo & Kyngäs, 2008; Hsieh & Shannon, 2005; Mayring, 2000), with the basic unit of analysis being an interviewee’s statement. The coding and analysis of the data were carried out using the atlas.ti software. The interview questions were based on a reliable tool from a study that examined learning strategies among music students (Hallam et al., 2012). These questions covered four areas: learning strategies in training, training organization, concentration, and attitudes during training. For this study, we used statements related to training strategies that were noted for their reliability (average answers of 5.5 or higher on the Likert scale) among students in their eighth year of study or higher (Hallam et al., 2012).

4.2. Findings

The following section presents the key findings of the study, focusing on the students’ learning strategies during their independent practice at home, as well as the potential influence of these strategies on their success in computer science. The findings are structured according to three main phases of practice—Review, Working, and Reflection—each representing distinct steps within the broader learning process. For each phase, specific strategies and computational thinking skills that emerged from the students’ practice routines are discussed. At the end of this section, we present the findings regarding the contribution of musical practice to the development of computational thinking skills in computer science.

4.2.1. The Three Phases Structure of Practice (RQ1.1)

Overall, we identified three phases in the students’ practice routine: Review, Working, and Reflection. These phases are further divided into sub-phases, each representing specific strategies that target different aspects of learning. For example, the Review phase includes strategies such as listening to professional recordings and initial sight-reading, while the Working phase includes strategies like breaking down the piece into smaller sections, error identification, and fixing errors. The Reflection phase encompasses self-evaluation and targeted exercises to enhance weaker areas and skill refinement. In the following section, we explore each phase in detail, discussing the strategies employed by students and how these strategies support the overall learning process of mastering a musical piece.

The Review Phase

This initial stage involves practice strategies such as pre-playing listening to the whole piece, sight reading through the composition (prima vista), identifying its form, and gathering background information on its creation. These strategies expose students to the material, assist in organizing information, and promote a holistic understanding of the piece. These activities align with several CT skills. Specifically, in this phase, students exhibited cognitive processes associated with abstraction (particularly data collection and modeling) and pattern recognition.
  • Pre-Playing Listening to the Whole Piece
All but one of the participants reported that their initial practice strategy involves listening to the entire musical piece prior to playing it: “First of all, I listen to it several times, as many times as possible for several days” (S7). This preparatory listening phase enables students to grasp the fundamental music elements of the piece, as described by Student S2: “[It helps] understand how the piece is rhythmically [structured], in terms of pitch and notes, how the piece sounds at the most basic level”. Furthermore, this phase extends beyond merely gathering performance-related information. According to S8, pre-playing listening facilitates an understanding of “the whole [musical] style”, or, as an alternative, exposure to “the interpretation of a specific performance”, which assists students in shaping their personal interpretations for performing the piece.
During this sub-phase, we identified two CT-related categories of strategies that helped our participants to enhance their understanding and approach to the musical composition. These are data collection and modeling.
Data Collection. During this initial listening period, students gather a variety of data and information about the piece, including its general mood, tempo, scale, overall structure, orchestration, texture, main themes, recurrent musical phrases, virtuosic sections, and potential interpretations. This data collection process promotes modeling among musicians, enabling them to perceive the overarching “big picture” (S6) of the piece. As one student reflected, “When I start learning a new piece, I always listen to a few professional recordings. It helps me gather a sense of how the music should sound and what challenges I might face. This way, I know where the tricky parts are and what to focus on” (S3). By gathering such information through educated listening, students can “understand the structure of the work […] the musical form in which it is written” (S10) even before they begin the working phase of the learning process.
  • Sight Reading (Prima Vista)
Sight reading, a strategy used by six students, emerged as a key exercise in their practice routine. One student explained that during the first playthrough, he will “try to read through the piece more broadly—instead of really delving into each part, whether technically or musically” (S9), emphasizing the importance of understanding the overall structure rather than getting caught up in the details. Other students shared similar approaches, with one noting that they “play it all the way through […] with mistakes or without mistakes” (S1), and another stating, “I just go through it as it is—not at the correct speed, not with dynamics, just note reading” (S4).
Sight reading emerged as a key strategy, highlighting processes of abstraction through data collection, pattern recognition, and modeling.
Data Collection. This preliminary run-through allows students to collect essential data, such as “sense of its [the piece] idea, the rhythm, the sounds, the style” (S1). Additionally, this phase helps identify challenging sections that may disrupt the flow, which are marked for focused practice later. For instance, a student remarked, “In the sight reading, I might notice […] there’s a tough transition here, or there’s a jump that I know is challenging [for me]” (S5).
Pattern recognition. Through this process, students also begin to recognize recurring patterns and motifs within the music, contributing to a deeper understanding of the composition. As one student observed, “It [the piece] is built in certain sections, sections that repeat themselves” (S4). This recognition of patterns and systemic elements enables students to approach the piece as a coherent whole rather than a series of isolated notes. Another student, a drummer, described a focus on understanding the overall “vibe” or “groove” rather than aiming to play every note precisely (S7). Finally, S10 highlights the significance of this strategy in facilitating the abstraction process:
“[with sight reading] I can immediately find similar sections and start thinking about how I plan to develop this, how I intend to vary between the repetitions of the theme, for example” (S10).
Modeling. By playing the entire piece in one go, students gain a broad perspective on its form. This allows students to grasp the overall structure of the composition and the “idea of the piece” (S1) or “general sense” (S5) of it, meaning the broad and general picture of the whole work. As one student explained, “I try to read through the piece more broadly—instead of really delving into each part, whether technically or musically” (S9). Another student explains the importance of understanding both the whole and the details: “You’re still familiarizing yourself with the piece, and you want to understand it both as a whole and [also to know] the notes better—so you play from start to finish” (S4). Additionally, one participant, a flute player who needs to save his energy through his performances, noted that this approach also contributes to planning endurance for the final performance: “It [sight reading] also gives me a sense, of course, it’s still far off, but for the continuous and long final performance, to plan my stamina—[for example] where to invest more [resources] and where to invest less [resources]” (S10). For very long pieces, having a general view of the entire work allows students to prioritize sections and techniques that require significant learning and practice in the upcoming stage—the working phase.

The Working Phase

The second and most prominent stage of the learning process is the working phase, in which students train both their minds and bodies to execute the musical instructions with precision. This phase begins with the identification of errors in playing—whether through comparing their performance to a professional performance, hearing their own playing in real time, or analyzing the sheet music. Once an error is identified, students focus on correcting it using various strategies, including simplifying the task, repeated practice, isolating specific elements of the performance, and working on short segments of the piece. These strategies reflect CT skills, particularly debugging and decomposition, as students engage in recognizing and correcting mistakes within the musical performance.
  • Error Identification
In the working phase, various strategies for error identification emerged, reflecting CT skills in the domain of debugging. The interviews revealed three primary strategies used by students: identifying errors through listening to professional performances, real-time auditory feedback during practice, and identifying errors through reading the sheet music. Nearly all students (9 out of 10) relied on professional recordings to detect discrepancies between their performance and the expected standard, as one student described, “I might be making a musical mistake, something doesn’t fit, and then when I listen to the piece, performed by someone experienced or with accompaniment, I realize where I should come in, what I did wrong” (S5). This approach, echoed by others, shows how listening can uncover errors that may not be apparent during practice: “Often I listen to a recording, and it helps me understand whether I made a mistake or not” (S1), or “I’ll listen to the piece on YouTube afterward and realize that I didn’t play it as close to the performance I heard” (S2).
Another significant strategy involves students recognizing mistakes as they play, relying on their developed ear to notice when “something doesn’t sound right” (S1). This real-time self-feedback is crucial, as one student noted, “I often say to myself: ‘Wait, wait, wait, I extended something here, this transition doesn’t sound good’” (S5). The ability to detect errors during performance reflects the students’ auditory training, enabling them to identify issues such as incorrect rhythms or pitches. For instance, one participant mentioned, “Sometimes the musical phrase sounds strange to me. I check if it’s really supposed to be like that […] sometimes I realize there’s an accidental I missed” (S3). Another student, a tuba player, shared, “If it’s a pitch issue, I usually detect it through listening—it just doesn’t sound right to me” (S1).
The third strategy for identifying mistakes involves closely reading the sheet music, which helps students to spot deviations from the intended notes. As one student explained, “Reading the music helps me realize when my playing contains errors compared to what’s required in the text” (S4). This method is particularly useful for identifying technical challenges or nuances in dynamics that might otherwise be overlooked during playing: “Sometimes I notice small things in the music, like a transition between two notes or a section that’s the focus of the problem” (S4). Another student, a pianist, emphasized the importance of practicing with the sheet music even after memorization, noting how it helps in catching overlooked details:
“I practice with the sheet music—even when I know it by heart, I practice with the sheet music, and I look at the notes. So, I might notice there’s a mistake. Whether it’s in a passage that I can’t quite hear all the small notes, I’ll notice… I’ll think to myself, ‘Maybe it’s supposed to be like this? Maybe there should be a crescendo here?’ And then I’ll look at the sheet music and see, ‘Oh, okay, it says there’s a crescendo here’” (S9).
These strategies of error identification lay the groundwork for the next crucial step: effectively fixing the identified mistakes to enhance the overall performance.
  • Fixing Errors
In the strategies employed for fixing errors, students demonstrated expressions of CT, particularly in the areas of decomposition, debugging, and iteration. Debugging plays a central role in this process, as students actively work to identify, analyze, and correct mistakes in their performance. The interviews revealed that students address identified mistakes immediately rather than moving on to other sections of the piece: “I don’t just continue forward” (S2). Instead, they dedicate time and resources to correcting their errors, a practice reported by all interviewees. The correction process involves several learning strategies, often used simultaneously or in combination, all of which were mentioned by all of our participants: working in short segments, isolating and focusing on a single dimension of the performance, lowering the difficulty level of the section being performed, and repeatedly practicing the musical phrase or pattern to internalize the correct performance. The first three strategies involve decomposition—in various forms—and the fourth involves iteration.
Decomposition of Challenging Parts. First, students break down the challenging sections where errors were detected into smaller, more manageable segments. As one student noted “I divide the piece into sections” (S3); “I try to play just the specific measure where the mistake occurred” (S1). One student described the approach as, “These sixteen measures [the challenging section], out of which, the four measures I didn’t succeed in—I’ll first correct those four measures” (S4). Another added, “I break it down into the smallest pieces” (S7), while a different student noted, “I’ll take a measure or two from where the mistake happened” (S10).
Decreasing Dimensionality. This decomposition extends beyond merely breaking the musical phrase into shorter parts; it also involves isolating different dimensions of performance, such as pitch, rhythm, dynamics, and coordination between hands (and in the case of drumming, between feet as well). One student explained how they focus on pitch by playing only the mouthpiece, separated from the instrument:
“If it’s high notes, usually with brass instruments, I play on the mouthpiece—it helps develop the embouchure, get used to the high notes, and internalize them better” (S1).
This strategy of isolating performance dimensions and focusing on one at a time allows for efficient error correction. Another student shared a similar approach:
“If I have a tricky rhythm, I work on the rhythm. If I need to work on something like altissimo [producing high notes with complex fingerings on the saxophone], I’ll work on that… I might play it without a metronome, without anything. Maybe not pay too much attention to the rhythm” (S3).
Decreasing Difficulty. The third major strategy involves lowering the difficulty of the challenging section where errors were detected, primarily by reducing the tempo. Slowing down the tempo allows students to “absorb all the information that’s written” (S3), enabling concentration and thoughtful correction of the necessary dimensions: “I’ll play the easier passages at tempo and the harder ones at under-tempo, trying to pay attention to all the dynamic changes and articulations” (S3). Another student mentioned, “If the tempo feels like part of the problem, I lower the tempo to solve it” (S2), while another reflected, “When I correct something, I always start much slower. To understand the mistake, I play the specific section slower because when you play slower, it’s easier to distinguish between each note” (S1).
Repeated Practice. The final strategy identified in the interviews is the repeated practice of the segment where the mistake was found, with the goal of embedding the correction into the student’s performance so they can play the segment without errors. This strategy corresponds to the CT skill of iteration. One student described his use of repetition, “I’ll play it a few more times to make sure it sticks and that it wasn’t just a fluke” (S1). Another added, “If a note didn’t come out right or something didn’t work, I’ll play the measure again” (S2). One more student elaborated, “Sometimes I’ll work just on that segment and try to bring it up to full tempo until I can play it from start to finish at the same tempo” (S3). This iterative process of playing the passage repeatedly helps solidify the correct performance and eliminate errors: “I repeat it until I get it right; that’s the way… preferably two or three times, to make sure you really get it” (S4).
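The error-fixing strategies described above—decomposing a piece into short segments, lowering difficulty, and repeating until correct—can be sketched as a debugging loop in code. The sketch below is a loose analogy only, not a procedure elicited from the participants; the function names, difficulty values, and numeric parameters are all hypothetical toy constructs.

```python
# A loose analogy (not the study's methodology) between the students'
# error-fixing strategies and a debugging loop. All names and numbers
# here are hypothetical toy parameters.

def decompose(piece, size=2):
    """Decomposition: split a piece (a list of per-measure difficulties,
    each in [0, 1]) into short segments, as the students describe."""
    return [piece[i:i + size] for i in range(0, len(piece), size)]

def practice_segment(difficulty, gain=0.1, target=0.95):
    """Iteration: repeat a segment until the (toy) performance level
    meets the target; each repetition raises performance by `gain`."""
    performance, repetitions = 1.0 - difficulty, 0
    while performance < target:
        repetitions += 1                       # repeated practice
        performance = min(1.0, performance + gain)
    return repetitions

def practice(piece):
    """Working phase: spend more repetitions on the harder segments."""
    return [sum(practice_segment(d) for d in seg) for seg in decompose(piece)]
```

Under these toy assumptions, `practice([0.5, 0.0, 0.9, 0.1])` returns `[5, 10]`: the segment containing the hardest measure absorbs twice the repetitions, mirroring the interviewees’ reports of concentrating effort where errors occur.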
After addressing the identified errors, the final phase of the practice process involves reflection, allowing students to internalize their progress and refine their understanding of the piece.

The Reflection Phase

As outlined earlier, the reflection phase is the final stage in students’ learning strategies. Unlike the previous stages, the boundaries of this phase are less defined, with elements often overlapping with the working phase or even the preparation for the next piece in the student’s repertoire. This stage involves learning processes focused on enhancing the student’s playing abilities based on self-critique, typically through strengthening and developing musical skills and refining personal musical expression and interpretation. Two primary strategies emerged during this phase: enhancing musical abilities through exercises (identified in 6 out of 10 interviews) and seeking guidance from the teacher (found in 5 out of 10 interviews). Computational thinking skills, such as iteration and modeling, are subtly present in this reflective stage.
  • Developing Musical Abilities through Exercises
Iteration and Modeling. The findings suggest that in this strategy, iteration and modeling work hand in hand when students feel their performance does not meet their standards. In these cases, students focus on strengthening their foundational musical abilities through exercises and etudes that may not be directly related to the current piece but are essential for broader skill development. These exercises not only improve basic technique but also cultivate a deeper understanding of musical patterns, style, and form (modeling), allowing students to build the technical and conceptual framework needed to approach the challenges of the piece more effectively. This extended work on exercises thus goes beyond learning the specific piece: it develops the skills needed to overcome the difficulties presented by the music in a more efficient and comprehensive manner (iteration). As one student explained, “Doing exercises improves basic techniques. Then everything I do improves” (S2). This iterative process polishes and fine-tunes their abilities, leading to more efficient and effective problem-solving during practice.
S8 highlighted the importance of exercises in improvisation:
“Sometimes improvisation requires me to do various scale exercises, going up and down, or practicing approach notes, or breaking down chords, so I can improvise over a chart that’s really difficult—where you move through many keys and there are lots of chords. So I need to practice scales more, and how to connect them—then it’s more about different exercises.” (S8).
Another student described how they incorporate exercises into their routine to address specific issues:
“If I feel something isn’t sharp enough, I try to integrate it into my practice routine beyond just that specific part of the piece. I’ll do various exercises […] So in the end, I reach a point where I’m satisfied with how I play that part of the piece. And this way, I also advance myself beyond the specific thing that’s required—I try to improve the whole category I felt I was missing. I incorporate it into my practice” (S5).
  • Getting Assistance from the Personal Teacher
In the strategy where students seek assistance from their teacher, or where the teacher provides feedback and improves the student’s playing by offering solutions, no expressions of computational thinking were found. According to the interviews, students receive help from their teacher to solve problems they could not resolve during the working phase. As one student mentioned, “[If] I can’t fix it, I ask my teacher for help” (S5). Another shared, “I try to do it on my own a lot of the time until I succeed, and if it’s really hard, I might write to my teacher” (S6). Another student explained, “If there are things I don’t quite understand—or chords I don’t understand, then I bring them to the lesson with the teacher so I can ask them—so they can explain it to me” (S8). One student even noted, “It’s also not my favorite style—so there, for example, it’s harder for me to spot those mistakes. That’s where my teacher helps me more” (S9).

4.2.2. The Contribution of Musical Practice to Computational Thinking in Computer Science (RQ1.2)

The findings of this research indicate that there is no conclusive evidence to support the notion that musical practice directly enhances computational thinking (CT) skills relevant to computer science. However, students provided mixed insights regarding the similarities between their musical training strategies and their approach to computer science, as well as the potential benefits of their musical experience in their computer science studies.
  • Similarities Between Musical Practice and Computer Science Problem-Solving
Approximately half of the students (5 out of 9) identified some similarities between their approach to practicing music and solving tasks in computer science. While their responses varied, the connections they drew were often tentative. For example, one student noted, “Now that I think about it, yes, the stages [of learning strategies] do match writing code” (S1). Another student reflected, “I can link them […] it’s a bit strange for me though” (S3). A third participant emphasized, “It’s a bit like focusing on the problem [in music], which reminds me of computer science” (S4).
Interestingly, each student highlighted different aspects of their learning approach that they believed aligned with their work in computer science. For instance, S1 compared the way they tackle a new programming problem to how they first approach a musical piece: “When writing code, you begin by understanding the overall idea. You read the question, plan what steps are needed, without actually writing code yet—just like playing a new piece ‘fresh’ for the first time” (S1). Similarly, S3 found a parallel between debugging code and correcting errors in a musical passage, saying, “When I work on a piece, I step back to fix mistakes—just like when debugging code, I step back and work on it in sections” (S3). Another student, S5, likened the initial stages of programming to the early phases of musical practice: “When I write a piece of code, I make a sketch—it’s like the first run-through of a piece. Then I go back and refine it” (S5).
  • The Impact of Musical Strategies on Computer Science Performance
While most students did not report a strong connection between specific musical learning strategies and their performance in computer science, four students hinted at a broader influence. These students suggested that the discipline developed through musical training, rather than specific strategies, positively impacted their ability to study computer science. However, they did not point to any learning phase or strategy as a direct contributor.
As S2 bluntly stated, “I don’t know if you can say it [music] helps or not. I’m going to help you prove that it doesn’t” (S2). Similarly, S7 noted, “I can’t really see a direct link” (S7). Despite these comments, some students indicated that their long-term engagement with music had cultivated valuable learning habits that benefited their computer science studies.
  • General Contributions of Music to Learning in Computer Science
Most students (9 out of 10) did acknowledge a general benefit of musical practice on their learning abilities in computer science. The primary contribution cited was the development of patience and self-discipline, both crucial qualities for success in computer science. As S9 pointed out, “First of all, it’s mainly the patience required in music” (S9). Another student elaborated on how this patience is essential when debugging code: “Playing music for years has taught me patience. When I make a mistake, I understand it takes a lot of work to fix it. It’s the same when I encounter an error in code” (S10). S6 added, “I’m much more experienced in just sitting down and trying to figure out what’s wrong, sometimes for hours” (S6).
Several students also highlighted the broader perspective music gave them, particularly when thinking systematically. S6 compared working in an orchestra to managing variables or classes in programming: “In an orchestra, you have to think, ‘Okay, I’m one musician in a group of 40. We all need to work together.’ It’s easy to relate that to programming concepts like variables or models” (S6). Similarly, S8 mentioned that playing with others in a musical group helps develop awareness of both one’s individual role and the larger context: “Music has made me think more about what’s happening around me—because when you play with others, you need to listen to them and think about your own part as well as theirs” (S8).
These reflections suggest that, while specific musical strategies may not directly translate to computer science, the broader skills of patience, discipline, and systemic thinking cultivated through years of musical practice do provide valuable support for students in the computational field.

4.2.3. Summary of Findings

The students’ interviews revealed that computational thinking skills can be found in each of the three phases. In the preparation phase, evidence of abstraction, data collection, pattern recognition, and modeling was identified through pre-playing listening, initial sight-reading playthroughs (prima vista) of the piece, and recognition of its form. In the working phase, skills such as debugging, decomposition, and iteration were evident through strategies like identifying errors from listening to professional performances, hearing the student’s own playing, and reading the sheet music and performance instructions. Regarding decomposition, students frequently broke down complex musical phrases into manageable segments, reduced the difficulty by slowing the tempo, and separated the various dimensions of performance (fingering, dynamics, articulation, intonation, etc.). Additionally, students repeatedly practiced the challenging phrases to internalize the corrections. In the final phase, the reflection phase, students’ testimonies indicated that computational thinking skills were present. Table 9 concisely outlines the learning strategies employed across all phases of practice, along with the corresponding facets of computational thinking skills that emerge from these strategies. Findings are summarized visually in Figure 1.

4.3. Discussion

The aim of this study was to map personal practices of advanced music students and to identify how computational thinking (CT) skills are acquired during those music practices at home. To achieve this, interviews were conducted with ten 12th-grade students who have significant experience in both music and computer science and who took both a recital exam in music and a matriculation exam in computer science during the 2021–2022 academic year. This section summarizes the main findings from the interview analysis, comparing them with previous research.

4.3.1. Practice Strategies in the Three Phases of Learning Music

The findings suggest that the students use a variety of well-documented practice strategies that can be structured into a three-phase learning model: the review phase, the working phase, and the reflection phase. These phases represent a learning process that begins with reviewing and gathering relevant information, continues with persistent practice aimed at mastering the material, and concludes with internal and external critique of the performance. This tripartite division aligns with the literature on analyzing musical pieces among music students confronting new works (Lehmann et al., 2007).
The review phase is the students’ first encounter with the material. In this phase, learning strategies that enable effective data collection, which subsequently influence decision-making during the working phase and later in the reflection phase, were identified. This phase corresponds to the planning or review stages in the literature (Austin & Berg, 2006). The primary strategies found include listening to a professional performance of the piece before playing it and performing an initial sight-reading, characterized by smooth, general playing that does not delve into details. This approach is aimed at obtaining a general overview of the piece, its form, central motifs, and challenging sections that will require focused work later.
The working phase is the second and most prolonged stage. In this phase, students focus their efforts on embedding the musical instructions into their minds and bodies. During this embedding process, students invest most of their resources in identifying and correcting mistakes. This phase includes elements related to strategy and monitoring in the literature, as well as what has been referred to as the ‘grey phase,’ which lacks clear strategies (Chaffin et al., 2003). Interestingly, all the students considered their inability to play as written as a “mistake” that needed correction, even if they could perform the piece from start to finish without errors. This phase, being the longest, involves various strategies for identifying and correcting playing mistakes. Error identification includes three main avenues: identifying mistakes by listening to a professional performance, thereby comparing the desired output reflected in the performance with the actual output during their playing; identifying mistakes by listening while playing; and identifying mistakes by reading the sheet music and realizing that the written instructions do not match their actual performance.
Embodied Learning, which integrates cognitive tasks with physical activities, has been found to enhance students’ self-perception and develop problem-solving and debugging skills (Sung et al., 2022). Additionally, this type of learning increases attention, aids memory, and reduces the likelihood of errors (Jusslin et al., 2022). It is possible that since music practice is inherently an embodied learning experience, it allows students to concentrate their efforts in one phase, thereby enabling them to navigate the intensive learning process during practice.
The reflection phase is the third and final stage. According to the literature, this corresponds to the phases of receiving support or maintenance, and sometimes parts of the ‘grey phase’ are included here. This phase is not clearly defined in terms of time, unlike the previous phases, and it is often integrated within the working phase. The main focus of this phase is the student’s reflective critique. This critique is conducted by the students themselves or through the guidance of their personal teacher. The goal is to ‘polish’ the student’s performance skills, either by revisiting corrections and improving the playing of the current piece or by developing broader musical skills through playing general exercises and etudes. Engaging in reflective processes enhances the learning of complex subjects and promotes professionalism and excellence. These reflective processes can be seen as components of Self-Directed Learning (SDL), where the learner directs their own learning process. Indeed, in music education, most of the learning is student-led, with the student practicing at home and independently dealing with new challenges each time (Lebler, 2007). SDL helps improve students’ attitudes and facilitates the exploration of challenging topics (Garrison, 1997; Winkel et al., 2017). To realize the full potential of SDL, students need to apply metacognitive strategies, such as those demonstrated by the participants in this study, namely reflective processes (Shannon, 2008). The reflection strategies identified in the study included self-questioning about their mastery of the piece, selecting learning strategies, making corrections, consulting with their personal teacher, and self-assessing their performance. The students’ reflective actions in this study can be explained in two ways. 
First, the students consciously engaged in reflective processes, even though they were not aware of metacognitive strategies or had not formally learned how to engage in reflection within SDL. In this case, it is important to highlight these learning methods to maximize the potential of these strategies on the one hand and to transfer such skills to other domains on the other (D. Cohen, 2017). The second possible explanation is that these students did not fully engage in SDL, particularly in structured reflective processes. This explanation also supports the finding that the reflection phase is the least clear regarding students’ use of learning strategies in music practice. In this case, it is important to strengthen learning habits related to reflective processes and metacognitive strategies to improve learning efficiency and quality (G. Cohen et al., 2022).
The three-phase model presented in this study is reminiscent of Pólya’s (1965) problem-solving model in mathematics. Pólya’s model consists of four stages: understanding the problem, planning the solution, executing the plan, and looking back for reflection. As indicated by the findings, during the review phase, students encounter the piece for the first time, which raises issues related to the gaps between their playing abilities and the musical material in the piece. After the review phase, students have (both explicitly and implicitly) a set of tasks to accomplish to achieve the overall goal—performing the piece at an adequate level. The planning stage, according to this model, does not occur in the conventional sense but as an immediate product of the review phase. Students reported that after reviewing the piece, they immediately began working and correcting the errors identified during the review phase. Sometimes, students stated that they worked systematically according to a planned priority order, while at other times, they simply began working on the challenges to which they were subconsciously drawn. Pólya’s “looking back” reflection stage, which involves abstracting the problem through the solution obtained, aligns with the reflection phase in this study, where students seek technical pieces and additional exercises to strengthen the playing skills required in the piece being studied. Therefore, while Pólya created a fundamental learning model that allows for efficient and effective problem-solving, the students in this study are not fully realizing the potential of their learning process because they are unaware of the learning stages and their roles within the learning process. When students follow these models, they can approach their musical practice with a clear plan, identify and solve problems that arise during practice, evaluate their progress, and make adjustments to their learning habits. 
This approach can lead to more efficient and effective practice sessions, thereby improving musical performance. Furthermore, recognizing the importance of reflection in the “looking back” stage can help students develop self-awareness and critical thinking skills, enabling them to identify their strengths and weaknesses as learner-musicians and develop targeted strategies for future learning improvement (Yapatang & Polyiem, 2022).
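The correspondence drawn above between Pólya’s stages and the three-phase practice model can be sketched as a simple control-flow skeleton. This is an illustrative analogy only; the phase functions are hypothetical placeholders supplied by the caller, not procedures reported by the participants.

```python
# An illustrative skeleton (not from the study) of the three-phase
# practice model read as Polya-style problem solving.

POLYA_TO_PRACTICE = {
    "understand the problem": "review phase: listening and sight reading",
    "plan the solution": "review phase: an implicit list of challenges",
    "execute the plan": "working phase: identifying and fixing errors",
    "look back": "reflection phase: exercises and teacher feedback",
}

def practice_session(piece, review, work, reflect):
    """`review` returns the challenges found on first encounter, `work`
    resolves one challenge, and `reflect` closes the session; all three
    are hypothetical placeholders supplied by the caller."""
    for challenge in review(piece):   # understand the problem + plan
        work(piece, challenge)        # execute the plan
    return reflect(piece)             # look back
```

Note how, consistent with the findings, the “plan the solution” stage collapses into the review: the list of challenges produced on first encounter *is* the plan, which students then work through rather than planning as a separate deliberate step.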
It is important to note that at no point did the students indicate in their interviews that the three-phase practice model was explicitly explained to them. Moreover, the students did not use stage descriptions, including the three-phase division, when describing their work. Structuring the practice process into three main phases seems natural, and it is possible that when the student faces a new piece, they naturally feel the need to review it first, understand how to work on it, and finally critique their playing. It is also possible that the personal teacher imparted these habits to the student but did so without constructing a clear model in the student’s mind. Additionally, the Israel Ministry of Education’s guidelines do not mention or recommend a structured learning process in music education, including phase division.

4.3.2. Computational Thinking Facets in Music Practice Strategies

As described in the literature review, studies attempting to link music and computer science often focus more on areas related to music composition and theoretical music subjects and less on performance and playing. The few studies that connected CT with the field of music did so concerning theoretical music subjects, including music theory and harmony (Bell & Bell, 2018; Chong, 2018), or alternatively in the field of composition (Edwards, 2011; Montero & Pihlainen, 2017). Thus, one might wonder why this topic does not attract more interest among researchers in the field of music or in integrating technologies in learning. It can be hypothesized that the field of theoretical music education is more organized and regulated than the field of playing, as it is conducted under the auspices of the state school system and is more significantly supervised by the authorities. Another possibility is that computer science researchers focus on music, rather than music researchers focusing on computer science, leading to studies that concentrate on musical creation using programming languages. Nonetheless, some parallels can be drawn to the findings of this study.
We found evidence of Computational Thinking (CT) skills throughout the learning process, with different phases of practice being characterized by distinct CT skills. As noted, during the review phase, students used abstraction skills such as data collection and modeling, which belong to the abstraction domain within the CT framework. Abstraction is a critical component of CT as it allows learners to focus on the most important aspects of a problem by filtering out unnecessary details, generalize solutions to similar problems, and identify patterns and relationships in data (Ezeamuzie et al., 2022). In fact, it appears that the skill of abstraction is crucial to the quality and efficiency of practice. Conducting the review phase thoroughly leads to a more efficient and faster working phase. Students decide to delay the urge to start learning the piece immediately upon receiving it in favor of gathering information about the piece as a whole, including identifying challenging areas in relation to their playing abilities.
Another important skill in this phase is pattern recognition, which is also part of the abstraction domain. Students reported that they often manage to identify recurring phrases, patterns, or motifs. This insight allows them to work on and practice one pattern but actually learn several sections where the pattern reappears, thereby maximizing practice efficiency. The study did not examine how students successfully identify recurring patterns—whether through listening to the music or by recognizing recurring patterns in the sheet music. Understanding this would be important for developing methods to improve music learning, opening avenues for further research. It is possible that pattern recognition is linked to the visual representation embodied in the sheet music. For example, a study that examined students without prior knowledge of negative numbers showed improved understanding and results in this area simply through the use of pattern recognition based on geometric shapes (Periasamy & Sivasubramaniam, 2018). Therefore, it would be interesting to explore whether treating musical notations as recurring shapes helps students recognize musical patterns in the pieces they learn. If so, focusing on abstraction skills allows for a smart and early planning process that will come in handy during the working phase.
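The pattern-recognition strategy described above—working on a motif once and reusing that work wherever the motif recurs—can be illustrated with a short sketch that scans a note sequence for repeated subsequences. The note names and window length below are invented for illustration and are not drawn from the study’s data.

```python
# A toy illustration of pattern recognition over a note sequence.
# The notes and motif length are hypothetical examples.
from collections import defaultdict

def find_motifs(notes, length=4):
    """Pattern recognition: return every subsequence of `length` notes
    that occurs more than once, with the indices where it starts.
    Practicing one such motif covers all the places it reappears."""
    starts = defaultdict(list)
    for i in range(len(notes) - length + 1):
        starts[tuple(notes[i:i + length])].append(i)
    return {motif: pos for motif, pos in starts.items() if len(pos) > 1}
```

For instance, in the sequence C D E C C D E C G, the motif C-D-E-C occurs at positions 0 and 4, so a student who recognizes it need master it only once—the efficiency gain the interviewees describe.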
During the working phase, students focus on debugging and decomposition. Debugging is a skill that, when mastered, enables students to identify a problem, fix it, and, in the musical context, embed the correction in both thought and movement. In the context of teaching English, it has been found that combining physical gestures with sound production and auditory clarifications may help students identify and correct errors (Nguyen, 2016; Smotrova, 2017); this combination also appears in music practice. Indeed, in this study, students indicated that they use three channels: two auditory, by listening to professional performances and by identifying mistakes during their own playing, and one visual-auditory, by reading the sheet music while playing and spotting an error upon noticing a discrepancy between what is written and what they hear themselves play. If students fail to correct a mistake satisfactorily, they may resort to additional exercises that strengthen their musical skills during the reflection phase.
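The visual-auditory channel described above, noticing a mismatch between the written score and the produced sound, is essentially a diff operation. A minimal sketch, under the purely illustrative assumption that notes can be represented as strings:

```python
def find_discrepancies(score, performance):
    """Return the positions where the played notes diverge from the
    written score, together with the expected and actual notes."""
    return [(i, expected, played)
            for i, (expected, played) in enumerate(zip(score, performance))
            if expected != played]

# One wrong note at position 1: the score says D, the student played F.
print(find_discrepancies(["C", "D", "E"], ["C", "F", "E"]))  # [(1, 'D', 'F')]
```

Each reported position is a candidate for the targeted correction work that debugging entails.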
The reflection phase, as mentioned, is relatively ambiguous (often referred to as the 'grey phase' in the literature), focusing mainly on the student's reflective view of their musical performance. Within this reflection, students may realize that their current abilities are insufficient to fix the problems identified during the review phase, which they tried to correct during the working phase. To strengthen their musical abilities, students turn to exercises (etudes) aimed at reinforcing the specific skills they lack, such as executing a complex passage, transitioning between positions or fingerings, varying dynamics, and articulatory precision. This strategy of skill reinforcement during the reflection phase aligns with the process of iteration in CT. Iteration involves repeating a process until the desired outcome is achieved; it is a central component of problem-solving and algorithmic thinking, as it allows for the automation of repetitive tasks and the refinement of algorithms through repeated testing and modification. Thus, beyond iteration manifesting as repeated correction of a mistake until it is embedded, it also manifests in a deeper form when students choose to engage in similar exercises to strengthen their musical abilities before returning to the piece and its challenges.
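The iterative structure of practice (play, detect mistakes, correct, repeat until clean) mirrors a standard test-and-refine loop. A minimal sketch under illustrative assumptions: `play` and `fix` stand in for the musician's listening and adjustment, and are not constructs from the study.

```python
def practice(passage, play, fix, max_rounds=10):
    """Rehearse a passage iteratively: stop once `play` reports no
    errors, otherwise `fix` each detected mistake and try again."""
    for round_no in range(1, max_rounds + 1):
        errors = play(passage)
        if not errors:        # desired outcome reached: stop iterating
            return round_no
        for err in errors:    # refine through correction
            fix(passage, err)
    return max_rounds

# Toy model: the passage starts with three mistakes; each fix removes one.
def play(p):
    return list(range(p["mistakes"]))

def fix(p, err):
    p["mistakes"] -= 1  # each correction removes one mistake

passage = {"mistakes": 3}
print(practice(passage, play, fix))  # 2: clean on the second round
```

The loop terminates as soon as a round produces no errors, with `max_rounds` bounding the number of repetitions, just as a practice session is bounded in time.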

5. General Discussion

In this paper, we explored the associations between computational thinking (CT) and learning to play musical instruments, both quantitatively (N = 91 young adults, 18–25 years old) and qualitatively (N = 10 senior high-school students, 17–18 years old). Taken together, the two studies complement each other and portray a rich, nuanced picture of these associations.
Overall, there are positive associations between learning to play musical instruments and CT. This is evident by the correlations found in the quantitative study and can be explained by the mechanisms that we uncovered in the qualitative study. A quote attributed to Johann Sebastian Bach (1685–1750), one of the greatest composers in the history of Western classical music, says that it is easy to play any musical instrument, as all you have to do is touch the right key at the right time and the instrument will play itself. However, in practice, playing a musical instrument is a complex task involving the integration of rich physical, cognitive, and affective experiences. From our findings, we would like to highlight four important points.
First, CT is relevant across domains and contexts; however, as we established in the literature review, it has mostly been studied in relation to STEM domains and to computational artifacts. Here, we see connections between CT and a non-STEM domain in a context that is not related to computational artifacts. This is another step towards solidifying Wing’s (2006) seminal message about CT being “a fundamental skill for everyone [… hence] we should add computational thinking to every child’s analytical ability” (p. 33). As we saw here, CT is an imperative skill for solving problems in music education, a field still understudied in the context of CT. Indeed, we have seen that learning to play a musical instrument is a complex problem, and that CT skills are intertwined with the strategies students use to solve it. Note, however, that participants in both studies were not asked directly about these associations, which leads us to the next point.
Second, engagement with complex tasks often necessitates the development of strategies and skills to solve them effectively and efficiently. This is evident from the two studies presented here: while the qualitative study showed how CT is implemented de facto for solving the complex problem of learning to play a musical piece, the quantitative study echoed it on a larger scale and also demonstrated how different complexities, such as different playing styles, reading music, and playing in ensembles, correlate with CT. Indeed, it has been shown that various skills can be acquired without explicit instruction when they are beneficial, as in the cases of problem-solving (Sandi-Urena et al., 2023), reflection (Pretorius & Ford, 2016), and data literacy (Michaeli et al., 2020). Of course, this will not necessarily happen for all people, hence the importance of explicitly teaching CT, and skills at large.
Third, transfer of CT skills is seemingly evident in the findings of our quantitative study; however, it is probably not explicit. Indeed, as our qualitative study demonstrates, there was no informed transfer of the strategies used while mastering musical instruments to the different context of studying computer science, despite the obvious associations between CT and programming (e.g., Agbo et al., 2019). Most probably, the participants in the qualitative study were not aware of the very concept of CT, as it is rarely studied in the Israeli education system and is not yet a common term in public discourse. Transfer of skills is generally very difficult for people to achieve; therefore, it should be verbalized explicitly while teaching, implementing, and practicing skills across contexts (Billing, 2007; Jackson et al., 2019; McKeachie, 1987). This explicit verbalization probably does not happen here, as suggested by the lack of association between knowledge of music theory and CT in the quantitative study. Importantly, when CT is used in teaching and learning, both students and teachers appreciate its importance and consider it useful for future learning (Sabitzer et al., 2018). Moreover, CT-based interventions have been shown to be effective in promoting learning across disciplines (J. Ye et al., 2022). The very transfer of CT skills from one context to another is also part of Shute et al.’s (2017) framework of CT, a facet that did not come up at all in our qualitative study, which leads us to the fourth and last point.
Finally, it is important to mention the methodological contribution of this work. We explored associations between CT and a domain that has not yet been closely associated with CT, i.e., music education. We found Shute et al.’s (2017) framework useful for the qualitative, bottom-up exploration, as it is domain-independent and can be easily applied to various contexts. Indeed, all but one facet (generalization) of this framework were prominent in our analysis, which demonstrates its comprehensiveness on the one hand, and makes us ponder the necessity of the generalization facet in the conceptualization of CT on the other. Moreover, using two CT assessments (i.e., the CTt and Bebras tasks) in the quantitative, top-down study allowed us to emphasize differences between the tools in their associations with some personal and music experience-related characteristics. Which facets of Shute et al.’s framework are associated with each of these, and other, CT assessment tools should be studied further, to align theoretical grounds with empirical tools.

6. Conclusions, Implications and Future Research

The studies presented here explore the nuanced relationships between various elements of musical training and computational thinking (CT), highlighting the mutual links between music education and cognitive skills. Integrating music into educational practices may equip students with a robust toolkit for CT, and vice versa. By recognizing the cognitive benefits associated with musical experience, educators can develop more holistic curricula that support diverse aspects of cognitive development, ultimately fostering well-rounded students. Conversely, when teaching music, it is important to incorporate CT as a means for students to progress in effective, meaningful ways.
Our study is, of course, not without limitations. Most importantly, our analysis is based on data collected in a single country (Israel). Geographic location and its corresponding cultural and social settings may shape educational, technological, and cultural beliefs and practices, all of which may affect the way music education is carried out. Moreover, we do not assume that our samples are representative of music learners in this country. Finally, our quantitative data may suffer from the biases and contaminations that often characterize online data collection (Andrade, 2020). Therefore, future research should further study the associations between music education and CT with larger and more diverse samples, to generalize the results and explore other aspects of musical experience more comprehensively. Examining larger populations would also allow for more sophisticated analyses, specifically multivariable or hierarchical ones, which would make it possible to test interactions between variables (e.g., how gender interacts with other variables). Longitudinal studies would also be valuable for understanding the long-term impact of musical training on the development of CT skills and for exploring causal relationships. Additionally, examining the mechanisms underlying the observed gender differences and the specific cognitive benefits of different musical activities could provide deeper insights into the interplay between music and cognition.

Author Contributions

Conceptualization, T.R.C., B.A. and A.H.; Methodology, B.A.; Validation, A.H.; Formal analysis, T.R.C. and B.A.; Resources, A.H.; Data curation, T.R.C. and B.A.; Writing—original draft, T.R.C. and B.A.; Writing—review & editing, A.H.; Visualization, B.A.; Supervision, A.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Ethics Committee of Tel Aviv University (0005577-1, 19 October 2022).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data will be available upon request.

Conflicts of Interest

The authors declare no conflicts of interest.

Note

1. Personal communication, October 2024.

References

  1. Agbo, F. J., Oyelere, S. S., Suhonen, J., & Adewumi, S. (2019). A systematic review of computational thinking approach for programming education in higher education institutions. In ACM international conference proceeding series. Association for Computing Machinery. [Google Scholar] [CrossRef]
  2. Andrade, C. (2020). The limitations of online surveys. Indian Journal of Psychological Medicine, 42(6), 575–576. [Google Scholar] [CrossRef] [PubMed]
  3. Austin, J. R., & Berg, M. H. (2006). Exploring music practice among sixth-grade band and orchestra students. Psychology of Music, 34(4), 535–558. [Google Scholar] [CrossRef]
  4. Barendsen, E., Mannila, L., Demo, B., Grgurina, N., Izu, C., Mirolo, C., Sentance, S., Settle, A., & Stupuriene, G. (2015). Concepts in K-9 computer science education. In ITiCSE-WGP 2015—Proceedings of the 2015 ITiCSE conference on working group reports (pp. 85–116). Association for Computing Machinery. [Google Scholar] [CrossRef]
  5. Bell, J., & Bell, T. (2018). Integrating computational thinking with a music education context. Informatics in Education, 17(2), 151–166. [Google Scholar] [CrossRef]
  6. Benedek, M., Borovnjak, B., Neubauer, A. C., & Kruse-Weber, S. (2014). Creativity and personality in classical, jazz and folk musicians. Personality and Individual Differences, 63, 117–121. [Google Scholar] [CrossRef]
  7. Ben-Yaacov, A., & Hershkovitz, A. (2023). Types of errors in block programming: Driven by learner, learning environment. Journal of Educational Computing Research, 61(1), 178–207. [Google Scholar] [CrossRef]
  8. Berson, I. R., Berson, M. J., McKinnon, C., Aradhya, D., Alyaeesh, M., Luo, W., & Shapiro, B. R. (2023). An exploration of robot programming as a foundation for spatial reasoning and computational thinking in preschoolers’ guided play. Early Childhood Research Quarterly, 65, 57–67. [Google Scholar] [CrossRef]
  9. Billing, D. (2007). Teaching for transfer of core/key skills in higher education: Cognitive skills. Higher Education, 53(4), 483–516. [Google Scholar] [CrossRef]
  10. Boom, K. D., Bower, M., Arguel, A., Siemon, J., & Scholkmann, A. (2018, July 2–4). Relationship between computational thinking and a measure of intelligence as a general problem-solving ability. Annual Conference on Innovation and Technology in Computer Science Education, ITiCSE (pp. 206–211), Larnaca, Cyprus. [Google Scholar] [CrossRef]
  11. Brennan, K., & Resnick, M. (2012, April 13–17). New frameworks for studying and assessing the development of computational thinking. 2012 Annual Meeting of the American Educational Research Association (pp. 1–25), Vancouver, BC, Canada. [Google Scholar]
  12. Budinská, L., & Mayerová, K. (2017, November 14). Graph tasks in Bebras contest—What does it have to do with gender? 6th Computer Science Education Research Conference (pp. 83–90), Helsinki, Finland. [Google Scholar] [CrossRef]
  13. Bugos, J. A., Perlstein, W. M., McCrae, C. S., Brophy, T. S., & Bedenbaugh, P. H. (2007). Individualized Piano Instruction enhances executive functioning and working memory in older adults. Aging and Mental Health, 11(4), 464–471. [Google Scholar] [CrossRef]
  14. Byom, L. J., & Mutlu, B. (2013). Theory of mind: Mechanisms, methods, and new directions. Frontiers in Human Neuroscience, 7, 413. [Google Scholar] [CrossRef]
  15. Caeli, E. N., & Yadav, A. (2020). Unplugged approaches to computational thinking: A historical perspective. TechTrends, 64(1), 29–36. [Google Scholar] [CrossRef]
  16. Cartelli, A., Dagiene, V., & Futschek, G. (2010). Bebras contest and digital competence assessment. International Journal of Digital Literacy and Digital Competence, 1(1), 24–39. [Google Scholar] [CrossRef]
  17. Central Bureau of Statistics. (2024). Characterization and classification of geographical units by the socio-economic level of the population 2017; Central Bureau of Statistics.
  18. Chaffin, R., Imreh, G., Lemieux, A. F., & Chen, C. (2003). “Seeing the big picture”: Piano practice as expert problem solving. Music Perception, 20(4), 465–490. [Google Scholar] [CrossRef]
  19. Chan, S. W., Kim, M. S., Huang, W., & Looi, C.-K. (2023). Affordances of computational thinking activities in the development of spatial reasoning. In W. L. D. Hung, A. Jamaludin, & A. Abdul Rahman (Eds.), Applying the science of learning to education: An insight into the mechanisms that shape learning (pp. 267–286). Springer Nature. [Google Scholar]
  20. Chen, P., Yang, D., Metwally, A. H. S., Lavonen, J., & Wang, X. (2023). Fostering computational thinking through unplugged activities: A systematic literature review and meta-analysis. International Journal of STEM Education, 10(1), 47. [Google Scholar] [CrossRef]
  21. Chong, E. K. M. (2018, October 29). Teaching and learning music through the lens of computational thinking. International Conference on Art and Arts Education (pp. 1–7), Depok, Indonesia. [Google Scholar]
  22. Città, G., Gentile, M., Allegra, M., Arrigo, M., Conti, D., Ottaviano, S., Reale, F., & Sciortino, M. (2019). The effects of mental rotation on computational thinking. Computers & Education, 141, 103613. [Google Scholar] [CrossRef]
  23. Clements, D. H., & Gullo, D. (1984). Effects of computer programming on young children’s cognition. Journal of Educational Psychology, 76(6), 1051–1058. [Google Scholar] [CrossRef]
  24. Cohen, D. (2017). What do 7th-grade students learn from “Roots” project? [Master’s thesis, Tel Aviv University]. (In Hebrew). [Google Scholar]
  25. Cohen, G., Assi, A., Cohen, A., Bronshtein, A., Glick, D., Gabbay, H., & Ezra, O. (2022). Video-assisted self-regulated learning (SRL) training: COVID-19 edition. In Educating for a new future: Making sense of technology-enhanced learning adoption. EC-TEL 2022 (pp. 59–73). Springer. [Google Scholar] [CrossRef]
  26. Cohen, J. (1988). Statistical power analysis for the behavioral sciences. Routledge. [Google Scholar]
  27. Cross, I. (2009). La naturaleza evolucionista de la significación musical. Musicae Scientiae, 13(2 Suppl.), 179–200. [Google Scholar] [CrossRef]
  28. Cross, I., Laurence, F., & Rabinowitch, T.-C. (2012). Empathy and creativity in group musical practices: Towards a concept of empathic creativity. In The Oxford handbook of music education, volume 2 (pp. 337–353). Oxford University Press. [Google Scholar] [CrossRef]
  29. Cureton, E. E. (1956). Rank-biserial correlation. Psychometrika, 21(3), 287–290. [Google Scholar] [CrossRef]
  30. Cutumisu, M., Adams, C., Glanfield, F., Yuen, C., & Lu, C. (2021). Using structural equation modeling to examine the relationship between preservice teachers’ computational thinking attitudes and skills. IEEE Transactions on Education, 1–7. [Google Scholar] [CrossRef]
  31. Cutumisu, M., Adams, C., & Lu, C. (2019). A scoping review of empirical research on recent computational thinking assessments. Journal of Science Education and Technology, 28(6), 651–676. [Google Scholar] [CrossRef]
  32. Dagiene, V., Futschek, G., & Stupuriene, G. (2019). Creativity in solving short tasks for learning computational thinking. Constructivist Foundations, 14(3), 382–396. [Google Scholar]
  33. Dagiene, V., Mannila, L., Poranen, T., Rolandsson, L., & Stupuriene, G. (2014). Reasoning on children’s cognitive skills in an informatics contest: Findings and discoveries from Finland, Lithuania, and Sweden. In Informatics in schools. Teaching and learning perspectives. Springer. [Google Scholar]
  34. Dagienė, V., & Sentance, S. (2016). It’s computational thinking! Bebras tasks in the curriculum. In Informatics in schools: Improvement of informatics knowledge and Perception. Springer. [Google Scholar]
  35. del Olmo-Muñoz, J., Cózar-Gutiérrez, R., & González-Calero, J. A. (2020). Computational thinking through unplugged activities in early years of Primary Education. Computers & Education, 150, 103832. [Google Scholar] [CrossRef]
  36. Dietz, G., Le, J. K., Tamer, N., Han, J., Gweon, H., Murnane, E. L., & Landay, J. A. (2021, May 8–13). Storycoder: Teaching computational thinking concepts through storytelling in a voice-guided app for children. 2021 CHI Conference on Human Factors in Computing Systems (pp. 1–15), Yokohama Japan. [Google Scholar] [CrossRef]
  37. Dolgopolovas, V., Savulionienė, L., & Dagiene, V. (2015, July 1–3). On evaluation of computational thinking of software engineering novice students. IFIP TC3 Working Conference “A New Culture of Learning: Computing and next Generations”, Vilnius, Lithuania. [Google Scholar] [CrossRef]
  38. Edwards, M. (2011). Algorithmic composition. Communications of the ACM, 54(7), 58–67. [Google Scholar] [CrossRef]
  39. El-Hamamsy, L., Zapata-Cáceres, M., Marcelino, P., Bruno, B., Dehler Zufferey, J., Martín-Barroso, E., & Román-González, M. (2022). Comparing the psychometric properties of two primary school Computational Thinking (CT) assessments for grades 3 and 4: The Beginners’ CT test (BCTt) and the competent CT test (cCTt). Frontiers in Psychology, 13, 1082659. [Google Scholar] [CrossRef]
  40. Elo, S., & Kyngäs, H. (2008). The qualitative content analysis process. Journal of Advanced Nursing, 62(1), 107–115. [Google Scholar] [CrossRef] [PubMed]
  41. Ericsson, K. A., Krampe, R. Th., & Tesch-Romer, C. (1993). The role of deliberate practice in the acquisition of expert performance. Psychological Review, 100(3), 363–406. [Google Scholar] [CrossRef]
  42. Ezeamuzie, N. O., & Leung, J. S. C. (2021). Computational thinking through an empirical lens: A systematic review of literature. Journal of Educational Computing Research, 60(2), 481–511. [Google Scholar] [CrossRef]
  43. Ezeamuzie, N. O., Leung, J. S. C., & Ting, F. S. T. (2022). Unleashing the potential of abstraction from cloud of computational thinking: A systematic review of literature. Journal of Educational Computing Research, 60(4), 877–905. [Google Scholar] [CrossRef]
  44. Fairlie, F. (2023, November 13–18). Encouraging the development of computational thinking skills through structured dance activities (Discussion paper). 23rd Koli Calling International Conference on Computing Education Research (pp. 1–8), Koli, Finland. [Google Scholar] [CrossRef]
  45. Fanchamps, L. J. A., Van Gool, E., van Dongen, J., Straus, M., & De Meyst, K. (2024, May 28–30). The effect of music producing on computational thinking among primary school students in childcare. 8th APSCE International Conference on Computational Thinking and STEM Education (pp. 7–11), Beijing, China. [Google Scholar]
  46. Frith, C., & Frith, U. (2005). Theory of mind. Current Biology, 15(17), R644–R645. [Google Scholar] [CrossRef]
  47. Galyen, S. D. (2005). Sight-reading ability in wind and percussion students: A review of recent literature. Update: Applications of Research in Music Education, 24(1), 57–70. [Google Scholar] [CrossRef]
  48. Garrison, D. R. (1997). Self-directed learning: Toward a comprehensive model. Adult Education Quarterly, 48(1), 18–33. [Google Scholar] [CrossRef]
  49. Gaser, C., & Schlaug, G. (2003). Brain structures differ between musicians and non-musicians. The Journal of Neuroscience, 23(27), 9240–9245. [Google Scholar] [CrossRef]
  50. Grover, S., & Pea, R. (2013). Computational thinking in K–12: A review of the state of the field. Educational Researcher, 42(1), 38–43. [Google Scholar] [CrossRef]
  51. Guenaga, M., Eguíluz, A., Garaizar, P., & Gibaja, J. (2021). How do students develop computational thinking? Assessing early programmers in a maze-based online game. Computer Science Education, 31(2), 259–289. [Google Scholar] [CrossRef]
  52. Guggemos, J. (2021). On the predictors of computational thinking and its growth at the high-school level. Computers and Education, 161, 104060. [Google Scholar] [CrossRef]
  53. Habibi, A., Damasio, A., Ilari, B., Sachs, M. E., & Damasio, H. (2018). Music training and child development: A review of recent findings from a longitudinal study. Annals of the New York Academy of Sciences, 1423(1), 73–81. [Google Scholar] [CrossRef]
  54. Hallam, S., Rinta, T., Varvarigou, M., Creech, A., Papageorgi, I., Gomes, T., & Lanipekun, J. (2012). The development of practising strategies in young people. Psychology of Music, 40(5), 652–680. [Google Scholar] [CrossRef]
  55. Hansen, M., Wallentin, M., & Vuust, P. (2013). Working memory and musical competence of musicians and non-musicians. Psychology of Music, 41(6), 779–793. [Google Scholar] [CrossRef]
  56. Hershkovitz, A., Sitman, R., Israel-Fishelson, R., Eguíluz, A., Garaizar, P., & Guenaga, M. (2019). Creativity in the acquisition of computational thinking. Interactive Learning Environments, 27(5–6), 628–644. [Google Scholar] [CrossRef]
  57. Hetland, L. (2000). Learning to make music enhances spatial reasoning. Journal of Aesthetic Education, 34(3/4), 179. [Google Scholar] [CrossRef]
  58. Hsieh, H. F., & Shannon, S. E. (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15(9), 1277–1288. [Google Scholar] [CrossRef]
  59. Huang, W., & Looi, C.-K. (2021). A critical review of literature on “unplugged” pedagogies in K-12 computer science and computational thinking education. Computer Science Education, 31(1), 83–111. [Google Scholar] [CrossRef]
  60. Israel-Fishelson, R., & Hershkovitz, A. (2022a). Cultivating creativity improves middle school students’ computational thinking skills. Interactive Learning Environments, 32(2), 431–446. [Google Scholar] [CrossRef]
  61. Israel-Fishelson, R., & Hershkovitz, A. (2022b). Studying interrelations of computational thinking and creativity: A scoping review (2011–2020). Computers & Education, 176, 104353. [Google Scholar] [CrossRef]
  62. Israel-Fishelson, R., Hershkovitz, A., Eguíluz, A., Garaizar, P., & Guenaga, M. (2021). The associations between computational thinking and creativity: The role of personal characteristics. Journal of Educational Computing Research, 58(8), 1415–1447. [Google Scholar] [CrossRef]
  63. ISTE & CSTA. (2011). Operational definition of computational thinking. Available online: https://cdn.iste.org/www-root/Computational_Thinking_Operational_Definition_ISTE.pdf (accessed on 1 February 2025).
  64. Jackson, D., Fleming, J., & Rowe, A. (2019). Enabling the transfer of skills and knowledge across classroom and work contexts. Vocations and Learning, 12(3), 459–478. [Google Scholar] [CrossRef]
  65. Jin, H.-Y., & Cutumisu, M. (2024). Cognitive, interpersonal, and intrapersonal deeper learning domains: A systematic review of computational thinking. Education and Information Technologies, 29, 22723–22756. [Google Scholar] [CrossRef]
  66. Jusslin, S., Korpinen, K., Lilja, N., Martin, R., Lehtinen-Schnabel, J., & Anttila, E. (2022). Embodied learning and teaching approaches in language education: A mixed studies review. Educational Research Review, 37, 100480. [Google Scholar] [CrossRef]
  67. Khoo, N. A. K. A. F., Ishak, N. A. H. N., Osman, S., Ismail, N., & Kurniati, D. (2022). Computational thinking in mathematics education: A systematic review. AIP Conference Proceedings, 2633, 030043. [Google Scholar] [CrossRef]
  68. Kim, H.-Y. (2013). Statistical notes for clinical researchers: Assessing normal distribution (2) using skewness and kurtosis. Restorative Dentistry & Endodontics, 38(1), 52. [Google Scholar] [CrossRef]
  69. Kong, S. (2019). Components and methods of evaluating computational thinking for fostering creative problem-solvers in senior primary school education. In S. Kong, & H. Abelson (Eds.), Computational thinking education (pp. 119–142). Springer. [Google Scholar]
  70. Kwon, K., Jeon, M., Zhou, C., Kim, K., & Brush, T. A. (2024). Embodied learning for computational thinking in early primary education. Journal of Research on Technology in Education, 56(4), 410–430. [Google Scholar] [CrossRef]
  71. Law, N., Woo, D., de la Torre, J., & Wong, G. (2018). A global framework of reference on digital literacy skills for indicator 4.4.2. Information Paper no. 51. UNESCO Institute for Statistics. [Google Scholar]
  72. Lebler, D. (2007). Getting smarter music: A role for reflection in self-directed music learning [Ph.D. dissertation, Queensland University of Technology]. [Google Scholar]
  73. Lee, Y.-s., Lu, M.-j., & Ko, H.-p. (2007). Effects of skill training on working memory capacity. Learning and Instruction, 17(3), 336–344. [Google Scholar] [CrossRef]
  74. Lehmann, A. C., Sloboda, J. A., & Woody, R. H. (2007). Psychology for musicians. Oxford University Press New York. [Google Scholar] [CrossRef]
  75. Leonard, J., Djonko-Moore, C., Francis, K. R., Carey, A. S., Mitchell, M. B., & Goffney, I. D. (2023). Promoting computational thinking, computational participation, and spatial reasoning with LEGO robotics. Canadian Journal of Science, Mathematics and Technology Education, 23(1), 120–144. [Google Scholar] [CrossRef]
  76. Lin, S., & Wong, G. K. W. (2024). Gender differences in computational thinking skills among primary and secondary school students: A systematic review. Education Sciences, 14(7), 790. [Google Scholar] [CrossRef]
  77. Lockwood, J., & Mooney, A. (2018). Developing a computational thinking test using Bebras problems. In A. Piotrkowicz, R. Dent-Spargo, S. Dennerlein, I. Koren, P. Antoniou, P. Bailey, T. Treasure-Jones, I. Fronza, & C. Pahl (Eds.), Joint proceedings of the CC-TEL 2018 and TACKLE 2018 workshops, co-located with 13th European conference on technology enhanced learning (EC-TEL 2018). Maynooth University Research Archive Library. Available online: https://ceur-ws.org/Vol-2190/TACKLE_2018_paper_1.pdf (accessed on 28 February 2025).
  78. Lyon, J. A., & Magana, A. J. (2020). Computational thinking in higher education: A review of the literature. Computer Applications in Engineering Education, 28(5), 1174–1189. [Google Scholar] [CrossRef]
  79. Mamolo, L. A., Grace, S., & Sugano, C. (2020). Self-perceived and actual competencies of senior high school students in General Mathematics. Cogent Education, 7, 1779505. [Google Scholar] [CrossRef]
  80. Mayring, P. (2000). Qualitative Content Analysis. Forum: Qualitative Social Research, 1(2), 20. [Google Scholar] [CrossRef]
  81. McCall, L., Allen, B., Freeman, J., Garrett, S., Grossman, S., Paz, J., Edwards, D., McKlin, T., & Lee, T. (2024). Developing computational thinking in middle school music technology classrooms. In Proceedings of the 55th ACM technical symposium on computer science education V. 2 (pp. 1746–1747). Association for Computing Machinery. [Google Scholar] [CrossRef]
  82. McKeachie, W. J. (1987). Cognitive skills and their transfer: Discussion. International Journal of Educational Research, 11(6), 707–712. [Google Scholar] [CrossRef]
  83. Melro, A., Tarling, G., Fujita, T., & Kleine Staarman, J. (2023). What else can be learned when coding? A configurative literature review of learning opportunities through computational thinking. Journal of Educational Computing Research, 61(4), 901–924. [Google Scholar] [CrossRef]
  84. Michaeli, S., Kroparo, D., & Hershkovitz, A. (2020). Teachers’ use of education dashboards and professional growth. International Review of Research in Open and Distance Learning, 21(4), 61–78. [Google Scholar] [CrossRef]
  85. Miller, L. D., Soh, L. K., Chiriacescu, V., Ingraham, E., Shell, D. F., Ramsay, S., & Hazley, M. P. (2013, October 23–26). Improving learning of computational thinking using creative thinking exercises in CS-1 computer science courses. 2013 IEEE Frontiers in Education Conference (FIE) (pp. 1426–1432), Oklahoma City, OK, USA. [Google Scholar] [CrossRef]
  86. Montero, C. S., & Pihlainen, K. (2017). Let’s play!: Music improvisation as a medium to facilitate computational thinking. In Proceedings of the 17th Koli Calling international conference on computing education research (pp. 199–200). Association for Computing Machinery. [Google Scholar] [CrossRef]
  87. Mooney, A., & Lockwood, J. (2020). The analysis of a novel computational thinking test in a first year undergraduate computer science course. All Ireland Journal of Teaching and Learning in Higher Education, 12(1), 27. [Google Scholar]
  88. Moreno-Leon, J., Roman-Gonzalez, M., & Robles, G. (2018, April 17–20). On computational thinking as a universal skill: A review of the latest research on this ability. 2018 IEEE Global Engineering Education Conference (EDUCON) (pp. 1684–1689), Santa Cruz de Tenerife, Spain. [Google Scholar] [CrossRef]
  89. Moriuchi, T., Ishii, R., Nakata, H., Jaberzadeh, S., Nakashima, A., Nakamura, J., Anan, K., Nishi, K., Matsuo, T., Hasegawa, T., Mitsunaga, W., Iso, N., & Higashi, T. (2020). The vividness of motor imagery is correlated with corticospinal excitability during combined motor imagery and action observation. Frontiers in Human Neuroscience, 14, 581652. [Google Scholar] [CrossRef]
  90. Nguyen, M.-H. (2016). A micro-analysis of embodiments and speech in the pronunciation instruction of one ESL teacher. Issues in Applied Linguistics, 20(1), 111–134. [Google Scholar] [CrossRef]
  91. Nordby, S. K., Bjerke, A. H., & Mifsud, L. (2022). Computational thinking in the primary mathematics classroom: A systematic review. Digital Experiences in Mathematics Education, 8(1), 27–49. [Google Scholar] [CrossRef]
  92. Nurhayati, N., Silitonga, L. M., Subiyanto, A., Murti, A. T., & Wu, T.-T. (2022). Computational thinking approach: Its impact on students’ English writing skills. In Innovative technologies and learning (pp. 423–432). Springer. [Google Scholar] [CrossRef]
  93. Ogegbo, A. A., & Ramnarain, U. (2022). A systematic review of computational thinking in science classrooms. Studies in Science Education, 58(2), 203–230. [Google Scholar] [CrossRef]
  94. Osio, U. R., & Bailón Maxi, J. (2020). Computational thinking: Digital literacy without computers. Revista ICONO14 Revista Científica de Comunicación y Tecnologías Emergentes, 18(2), 379–405. [Google Scholar] [CrossRef]
  95. Papert, S. (1980). Mindstorms: Children, computers, and powerful ideas. Basic Books, Inc. [Google Scholar]
  96. Periasamy, E., & Sivasubramaniam, P. (2018). Pattern recognition concept in learning negative numbers subtraction operation. International Research Journal of Education and Sciences, 2(1), 24–30. [Google Scholar]
  97. Petrie, C. (2022). Interdisciplinary computational thinking with music and programming: A case study on algorithmic music composition with Sonic Pi. Computer Science Education, 32(2), 260–282. [Google Scholar] [CrossRef]
  98. Pérez Poch, A., Olmedo Torre, N., Sánchez Carracedo, F., Salán Ballesteros, M. N., & López Álvarez, D. (2016). On the influence of creativity in basic programming learning at a first-year Engineering course. International Journal of Engineering Education, 32(5B), 2302–2309. [Google Scholar]
  99. Pietsch, S., & Jansen, P. (2012). Different mental rotation performance in students of music, sport and education. Learning and Individual Differences, 22(1), 159–163. [Google Scholar] [CrossRef]
  100. Pollock, L., Mouza, C., Guidry, K. R., & Pusecker, K. (2019). Infusing computational thinking across disciplines. In Proceedings of the 50th ACM technical symposium on computer science education (pp. 435–441). Association for Computing Machinery. [Google Scholar] [CrossRef]
  101. Porat, E., Blau, I., & Barak, A. (2018). Measuring digital literacies: Junior high-school students’ perceived competencies versus actual performance. Computers & Education, 126, 23–36. [Google Scholar] [CrossRef]
  102. Pólya, G. (1965). How to solve it: A new aspect of mathematical method. Princeton University Press. Available online: https://press.princeton.edu/books/paperback/9780691164076/how-to-solve-it (accessed on 1 February 2025).
  103. Pretorius, L., & Ford, A. (2016). Reflection for learning: Teaching reflective practice at the beginning of university study. International Journal of Teaching and Learning in Higher Education, 28(2), 241–253. [Google Scholar]
  104. Brackmann, C. P., Barone, D. A. C., Boucinha, R. M., & Reichert, J. (2019). Development of computational thinking in Brazilian schools with social and economic vulnerability: How to teach computer science without machines. International Journal of Innovation Education and Research, 7(4), 79–96. [Google Scholar] [CrossRef]
  105. Rabinowitch, T. C., Cross, I., & Burnard, P. (2013). Long-term musical group interaction has a positive influence on empathy in children. Psychology of Music, 41(4), 484–498. [Google Scholar] [CrossRef]
  106. Rao, T. S. S., & Bhagat, K. K. (2024). Computational thinking for the digital age: A systematic review of tools, pedagogical strategies, and assessment practices. Educational Technology Research and Development, 72, 1893–1924. [Google Scholar] [CrossRef]
  107. Rauscher, F. H., Shaw, G. L., & Ky, C. N. (1993). Music and spatial task performance. Nature, 365(6447), 611. [Google Scholar] [CrossRef] [PubMed]
  108. Rijke, W. J. (2017). Computational thinking in primary school: An examination of abstraction and decomposition for different ages. Informatics in Education, 1, 77–92. [Google Scholar] [CrossRef]
  109. Román-González, M. (2015, July 6–8). Computational thinking test: Design guidelines and content validation. The 7th Annual International Conference on Education and New Learning Technologies (pp. 2436–2444), Barcelona, Spain. [Google Scholar]
  110. Román-González, M., Moreno-León, J., & Robles, G. (2019). Combining assessment tools for a comprehensive evaluation of computational thinking interventions. In S. Kong, & H. Abelson (Eds.), Computational thinking education (pp. 79–98). Springer. [Google Scholar] [CrossRef]
  111. Román-González, M., Pérez-González, J.-C., & Jiménez-Fernández, C. (2017). Which cognitive abilities underlie computational thinking? Criterion validity of the Computational Thinking Test. Computers in Human Behavior, 72, 678–691. [Google Scholar] [CrossRef]
  112. Romeike, R. (2007, June 27–29). Three drivers for creativity in computer science education. IFIP-Conference on Informatics, Mathematics and ICT 2007, Boston, MA, USA. [Google Scholar]
  113. Rottenhofer, M., Sabitzer, B., & Rankin, T. (2021). Developing computational thinking skills through modeling in language lessons. Open Education Studies, 3(1), 17–25. [Google Scholar] [CrossRef]
  114. Saad, A., & Zainudin, S. (2022). A review of project-based learning (PBL) and computational thinking (CT) in teaching and learning. Learning and Motivation, 78, 101802. [Google Scholar] [CrossRef]
  115. Sabitzer, B., Demarle-Meusel, H., & Jarnig, M. (2018, April 17–20). Computational thinking through modeling in language lessons. IEEE Global Engineering Education Conference (pp. 1913–1919), Santa Cruz de Tenerife, Spain. [Google Scholar]
  116. Salthouse, T. A. (2009). When does age-related cognitive decline begin? Neurobiology of Aging, 30(4), 507–514. [Google Scholar] [CrossRef]
  117. Sandi-Urena, S., Cooper, M., & Stevens, R. (2023). Effect of cooperative problem-based lab instruction on metacognition and problem-solving skills. Chemical Education Research, 89, 700–706. [Google Scholar] [CrossRef]
  118. Schellenberg, E. G. (2004). Music lessons enhance IQ. Psychological Science, 15(8), 511–514. [Google Scholar] [CrossRef]
  119. Schellenberg, E. G. (2012). Cognitive performance after listening to music: A review of the Mozart Effect. In R. MacDonald, G. Kreutz, & L. Mitchell (Eds.), Music, health, and wellbeing (pp. 324–338). Oxford University Press. [Google Scholar]
  120. Schneider, C. E., Hunter, E. G., & Bardach, S. H. (2019). Potential cognitive benefits from playing music among cognitively intact older adults: A scoping review. Journal of Applied Gerontology, 38(12), 1763–1783. [Google Scholar] [CrossRef] [PubMed]
  121. Scott, C. L. (2015). The futures of learning 2: What kind of learning for the 21st century? UNESDOC. [Google Scholar]
  122. Seo, Y.-H., & Kim, J.-H. (2016). Analyzing the effects of coding education through pair programming for the computational thinking and creativity of elementary school students. Indian Journal of Science and Technology, 9(46), 1–5. [Google Scholar] [CrossRef]
  123. Shannon, S. (2008). Using metacognitive strategies and learning styles to create self-directed learners. Institute for Learning Styles Research Journal, 1, 14–28. [Google Scholar]
  124. Shu, Y. (2021). Influence of piano playing on logical thinking formation of future musicians. Thinking Skills and Creativity, 42, 100961. [Google Scholar] [CrossRef]
  125. Shute, V. J., Sun, C., & Asbell-Clarke, J. (2017). Demystifying computational thinking. In Educational research review (Vol. 22, pp. 142–158). Elsevier Ltd. [Google Scholar] [CrossRef]
  126. Smotrova, T. (2017). Making pronunciation visible: Gesture in teaching pronunciation. TESOL Quarterly, 51(1), 59–89. [Google Scholar] [CrossRef]
  127. Soleimani, A., Green, K. E., Herro, D., & Walker, I. D. (2016). A tangible, story-construction process employing spatial, computational-thinking. In Proceedings of the 15th international conference on interaction design and children (pp. 157–166). Association for Computing Machinery. [Google Scholar] [CrossRef]
  128. Su, J., & Yang, W. (2023). A systematic review of integrating computational thinking in early childhood education. Computers and Education Open, 4, 100122. [Google Scholar] [CrossRef]
  129. Su, J., Yang, W., Yim, I. H. Y., Li, H., & Hu, X. (2024). Early artificial intelligence education: Effects of cooperative play and direct instruction on kindergarteners’ computational thinking, sequencing, self-regulation and theory of mind skills. Journal of Computer Assisted Learning, 40, 2917–2925. [Google Scholar] [CrossRef]
  130. Su, J.-M., Lin, Y.-E., Hsu, W.-F., & Wu, T.-T. (2024, July 6–12). An interactive picture storybook scheme for lower-grade elementary students to learn music and computational thinking. 2024 16th IIAI International Congress on Advanced Applied Informatics (IIAI-AAI) (pp. 712–713), Takamatsu, Japan. [Google Scholar] [CrossRef]
  131. Subramaniam, S., Maat, S. M., & Mahmud, M. S. (2022). Computational thinking in mathematics education: A systematic review. Cypriot Journal of Educational Sciences, 17(6), 2029–2044. [Google Scholar] [CrossRef]
  132. Sun, L., Hu, L., & Zhou, D. (2022). Programming attitudes predict computational thinking: Analysis of differences in gender and programming experience. Computers & Education, 181, 104457. [Google Scholar] [CrossRef]
  133. Sung, W., Ahn, J., & Black, J. B. (2022). Elementary students’ performance and perceptions of robot coding and debugging: Embodied approach in practice. Journal of Research in Childhood Education, 36(4), 681–696. [Google Scholar] [CrossRef]
  134. Tang, X., Yin, Y., Lin, Q., Hadad, R., & Zhai, X. (2020). Assessing computational thinking: A systematic review of empirical studies. Computers and Education, 148, 103798. [Google Scholar] [CrossRef]
  135. Tank, K. M., Ottenbreit-Leftwich, A., Moore, T. J., Yang, S., Wafula, Z., Kim, J., Fagundes, B., & Chu, L. (2024). Investigating sequencing as a means to computational thinking in young children. International Journal of Computer Science Education in Schools, 6(3), 67–77. [Google Scholar] [CrossRef]
  136. Tatar, C., & Eseryel, D. (2019, October 21–25). A literature review: Fostering computational thinking through game-based learning in K-12. The 42nd Annual Convention of The Association for the Educational Communications and Technology (pp. 288–297), Las Vegas, NV, USA. [Google Scholar]
  137. Tezer, M., Cumhur, M., & Hürsen, E. (2016). The spatial-temporal reasoning states of children who play a musical instrument, regarding the mathematics lesson: Teachers’ views. EURASIA Journal of Mathematics, Science and Technology Education, 12(6), 1487–1498. [Google Scholar] [CrossRef]
  138. Vlahović, I., & Biškupić, I. O. (2023, May 22–26). Fostering critical and computational thinking in the field of primary and secondary education in non-stem subjects by using data sets and applications. 2023 46th MIPRO ICT and Electronics Convention (MIPRO) (pp. 672–677), Opatija, Croatia. [Google Scholar] [CrossRef]
  139. Weintrop, D., Beheshti, E., Horn, M., Orton, K., Jona, K., Trouille, L., & Wilensky, U. (2016). Defining computational thinking for mathematics and science classrooms. Journal of Science Education and Technology, 25(1), 127–147. [Google Scholar] [CrossRef]
  140. Wing, J. M. (2006). Computational thinking. Communications of the ACM, 49(3), 33–35. [Google Scholar] [CrossRef]
  141. Winkel, A. F., Yingling, S., Jones, A.-A., & Nicholson, J. (2017). Reflection as a learning tool in graduate medical education: A systematic review. Journal of Graduate Medical Education, 9(4), 430–439. [Google Scholar] [CrossRef] [PubMed]
  142. World Bank. (2019). Children learning to code: Essential for 21st century human capital. World Bank. [Google Scholar]
  143. World Economic Forum. (2015). New vision for education unlocking the potential of technology. World Economic Forum. [Google Scholar]
  144. Yadav, A., & Cooper, S. (2017). Fostering creativity through computing. Communications of the ACM, 60(2), 31–33. [Google Scholar] [CrossRef]
  145. Yapatang, L., & Polyiem, T. (2022). Development of the mathematical problem-solving ability using applied cooperative learning and Polya’s problem-solving process for grade 9 students. Journal of Education and Learning, 11(3), 40–46. [Google Scholar] [CrossRef]
  146. Ye, H., Liang, B., Ng, O.-L., & Chai, C. S. (2023). Integration of computational thinking in K-12 mathematics education: A systematic review on CT-based mathematics instruction and student learning. International Journal of STEM Education, 10(1), 3. [Google Scholar] [CrossRef]
  147. Ye, J., Lai, X., & Wong, G. K. (2022). The transfer effects of computational thinking: A systematic review with meta-analysis and qualitative synthesis. Journal of Computer Assisted Learning, 38(6), 1620–1638. [Google Scholar] [CrossRef]
  148. Zheng, R., Keller, D., Timoney, J., Lazzarini, V., & Messina, M. (2023, November 2–4). Toward lite coding: Designing the emugel prototype to boost music-computational thinking. UbiMus2023, II-Ubiquitous Music Symposium, Derry, UK. [Google Scholar]
Figure 1. Computational thinking facets in music practice strategies.
Table 1. Characteristics of the research population.

| Variable | Category | Value |
|---|---|---|
| Age | N, Mean (SD) | 87, 21.23 (2.29) |
| Gender | Female | 54 (62%) |
|  | Male | 33 (38%) |
| Musical Experience | Learned to play | 57 (65%) |
|  | Did not learn to play | 30 (35%) |
| Education Level | Partial High School | 3 (3.4%) |
|  | High School | 33 (37.9%) |
|  | Non-academic post-high school | 10 (11.5%) |
|  | Currently pursuing a bachelor’s degree | 29 (33.3%) |
|  | Bachelor’s degree and above | 12 (13.8%) |
| Programming Experience | N, Mean (SD) | 87, 2.48 (1.4) |
Table 2. Distribution data of the background variables. Bolded values violate the normality assumption.

| Statistic | Age | Years of Playing * | CTt Score | Bebras Score |
|---|---|---|---|---|
| N | 87 | 54 | 87 | 87 |
| Mean | 21.2 | 6.6 | 53.6 | 53.5 |
| SD | 2.30 | 4.3 | 31.4 | 32.2 |
| Skewness | 0.31 | 0.39 | 0.14 | 0 |
| SE Skewness | 0.26 | 0.33 | 0.26 | 0.26 |
| Kurtosis | −1.11 | −1.13 | −1.46 | −1.11 |
| SE Kurtosis | 0.51 | 0.64 | 0.51 | 0.51 |
| Z_Skewness | 1.19 | 1.18 | 0.54 | 0 |
| Z_Kurtosis | **−2.18** | −1.77 | **−2.86** | **−2.18** |

* For participants who have played for more than a year.
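The normality screen behind Table 2 divides each skewness and kurtosis statistic by its standard error to obtain a z-score; |z| > 1.96 flags a violation at α = .05. A minimal sketch of that arithmetic, using the values reported in the table (the function and variable names are ours, for illustration only):

```python
# Normality screen from Table 2: z = statistic / SE; |z| > 1.96 flags a
# violation of the normality assumption at alpha = .05.

def z_score(stat, se):
    """Standardize a distribution statistic by its standard error."""
    return stat / se

# (statistic, SE) pairs copied from Table 2
skewness = {"Age": (0.31, 0.26), "Years of Playing": (0.39, 0.33),
            "CTt": (0.14, 0.26), "Bebras": (0.00, 0.26)}
kurtosis = {"Age": (-1.11, 0.51), "Years of Playing": (-1.13, 0.64),
            "CTt": (-1.46, 0.51), "Bebras": (-1.11, 0.51)}

for name in skewness:
    z_s, z_k = z_score(*skewness[name]), z_score(*kurtosis[name])
    verdict = "violated" if max(abs(z_s), abs(z_k)) > 1.96 else "ok"
    print(f"{name}: Z_skew = {z_s:.2f}, Z_kurt = {z_k:.2f} -> {verdict}")
```

Running this reproduces the Z_Skewness and Z_Kurtosis rows of the table, with the kurtosis z-scores of Age, CTt, and Bebras exceeding the 1.96 threshold in absolute value.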
Table 3. Correlations between the independent music-related variables and the CT scores, for participants with over a year of musical experience (N = 52).

| Independent Variable | CTt | Bebras |
|---|---|---|
| Can read notes | ρ = 0.31 * | ρ = 0.27, p = 0.06 |
| Can read chords | ρ = 0.17, p = 0.23 | ρ = 0.25, p = 0.08 |
| Can read tabs a | ρ = −0.34 * | ρ = −0.34 * |
| Know music theory | ρ = −0.21, p = 0.14 | ρ = −0.14, p = 0.33 |

* p < 0.05; a For this variable, N = 49.
Table 4. Relationship between background variables and test scores.

| Variable (n = 87) | CTt | Bebras |
|---|---|---|
| Age | r = −0.22 * | r = −0.22 * |
| Level of Education | ρ = −0.17, p = 0.12 | ρ = −0.17, p = 0.11 |
| Prog. Experience | ρ = −0.17, p = 0.13 | ρ = −0.23 * |

* p < 0.05.
Table 5. Relationship of gender to the test scores.

| Variable | Score’s Mean (SD) | t Test |
|---|---|---|
| Gender—CTt | Female (n = 54): 46.1 (30.1); Male (n = 33): 65.8 (30.1) | t(85) = 2.95 **, Cohen’s d = 0.65 |
| Gender—Bebras | Female (n = 54): 48.2 (31.4); Male (n = 33): 62.1 (31.9) | t(85) = 2 *, Cohen’s d = 0.44 |

* p < 0.05; ** p < 0.01.
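The Cohen's d values in Table 5 follow directly from the reported group means, SDs, and sizes. A sketch assuming the standard pooled-SD formulation (function name and variables are ours):

```python
import math

def cohens_d(m1, sd1, n1, m2, sd2, n2):
    """Cohen's d for two independent groups, using the pooled standard deviation."""
    pooled = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled

# CTt (Table 5): males 65.8 (30.1), n = 33; females 46.1 (30.1), n = 54
print(round(cohens_d(65.8, 30.1, 33, 46.1, 30.1, 54), 2))  # 0.65

# Bebras (Table 5): males 62.1 (31.9), n = 33; females 48.2 (31.4), n = 54
print(round(cohens_d(62.1, 31.9, 33, 48.2, 31.4, 54), 2))  # 0.44
```

Both values match the effect sizes reported in the table, which supports the pooled-SD reading of the analysis.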
Table 6. Test scores within the different instruments played.

| Instrument | Measure | Playing | Not Playing | Comparison Test (Mann–Whitney) |
|---|---|---|---|---|
| Piano | n | 16 | 35 |  |
|  | CTt | 61.9 (29.7) | 60.0 (33.2) | U = 270.0, p = 0.85 |
|  | Bebras | 57.8 (32.6) | 57.1 (34.6) | U = 278.0, p = 0.98 |
| Keyboard (not piano) | n | 10 | 41 |  |
|  | CTt | 54.0 (31.7) | 62.2 (21.1) | U = 239.5, p = 0.42 |
|  | Bebras | 55.0 (36.9) | 57.9 (33.3) | U = 214.5, p = 0.83 |
| Drums/Percussion | n | 10 | 41 |  |
|  | CTt | 52.0 (36.8) | 62.7 (30.7) | U = 239.5, p = 0.42 |
|  | Bebras | 45.0 (43.8) | 60.4 (30.6) | U = 253.5, p = 0.24 |
| Guitar | n | 17 | 34 |  |
|  | CTt | 56.5 (32.0) | 62.6 (32.0) | U = 329, p = 0.43 |
|  | Bebras | 52.9 (37.4) | 59.6 (32.0) | U = 317, p = 0.57 |
| Wind Instruments | n | 19 | 33 |  |
|  | CTt | 77.4 (25.6) | 51.8 (31.6) | U = 171.5 *, p = 0.018, RBC = 0.45 |
|  | Bebras | 67.1 (27.7) | 51.5 (35.3) | U = 230.5, p = 0.11 |

* p < 0.05.
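The RBC column in Table 6 is a rank-biserial correlation, a common effect size for the Mann–Whitney U test. Assuming the usual formulation RBC = 1 − 2U/(n₁·n₂), the reported value can be reproduced from the table (function name is ours):

```python
def rank_biserial(u, n1, n2):
    """Rank-biserial correlation from a Mann-Whitney U statistic: 1 - 2U/(n1*n2)."""
    return 1 - (2 * u) / (n1 * n2)

# Wind instruments, CTt (Table 6): U = 171.5, n = 19 playing vs. n = 33 not playing
print(round(rank_biserial(171.5, 19, 33), 2))  # 0.45
```

The same formula also recovers the RBC = 0.37 reported for orchestra players (U = 201.5, n = 32 vs. 20) later in Table 8, which is consistent with this reading.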
Table 7. Playing style and the test scores.

| Style | Test | Main | Secondary | Sometimes | Comparison Test |
|---|---|---|---|---|---|
| Classical Music | CTt | n = 32, 62.5 (31.4) | n = 7, 51.4 (36.7) | n = 6, 51.7 (30.6) | F = 0.62, p = 0.59 |
|  | Bebras | n = 32, 57.0 (32.5) | n = 6, 46.4 (44.3) | n = 7, 62.5 (34.5) | F = 0.39, p = 0.68 |
| Jazz Music | CTt | n = 16, 55.0 (33.3) | n = 7, 25.7 (23.0) | n = 10, 78.0 (19.9) | F = 7.25 **, η2 = 0.33 |
|  | Bebras | n = 16, 50.0 (36.5) | n = 7, 21.4 (26.7) | n = 10, 87.5 (17.7) | F = 10.43 ***, η2 = 0.41 |
| Rock–Pop Music | CTt | n = 14, 52.1 (32.0) | n = 7, 70.0 (29.4) | n = 10, 50.0 (34.6) | F = 0.87, p = 0.43 |
|  | Bebras | n = 14, 53.6 (39.0) | n = 7, 64.3 (34.9) | n = 10, 47.5 (43.2) | F = 0.37, p = 0.69 |
| Metal Music | CTt | n = 3, 50.0 (43.6) | n = 10, 41.0 (30.0) | n = 10, 56.0 (35.7) | F = 0.49, p = 0.62 |
|  | Bebras | n = 3, 33.3 (57.7) | n = 10, 32.5 (31.3) | n = 10, 57.5 (37.4) | F = 1.24, p = 0.31 |
| Klezmer Music a | CTt | n = 1, 20 (14.14) | n = 12, 42.31 (28.33) | n = 13, 60 (34.42) | U = 107.5, p = 0.11 |
|  | Bebras | n = 1, 25 (35.35) | n = 12, 38.46 (34.78) | n = 13, 57.14 (35.93) | U = 110.0, p = 0.08 |

** p < 0.01; *** p < 0.001; a For this variable, since only one participant chose “Main”, we used the Mann–Whitney test to compare the “Secondary” and “Sometimes” groups.
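The η2 (eta-squared) effect sizes accompanying the significant F tests in Table 7 can be recovered from the F statistic and its degrees of freedom via η2 = F·df_between / (F·df_between + df_within). The degrees of freedom below are our inference from the reported group sizes, not stated explicitly in the table:

```python
def eta_squared(f, df_between, df_within):
    """Eta-squared recovered from a one-way ANOVA F statistic and its degrees of freedom."""
    return (f * df_between) / (f * df_between + df_within)

# Jazz frequency groups (Table 7): k = 3 groups, N = 16 + 7 + 10 = 33,
# so df_between = k - 1 = 2 and df_within = N - k = 30.
print(round(eta_squared(7.25, 2, 30), 2))   # CTt: 0.33
print(round(eta_squared(10.43, 2, 30), 2))  # Bebras: 0.41
```

Both computed values match the η2 figures reported for the jazz comparisons, supporting this reconstruction of the degrees of freedom.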
Table 8. Experience in playing in different types of ensembles.

| Ensemble | Measure | Playing | Not Playing | Comparison Test |
|---|---|---|---|---|
| Orchestra | n | 32 | 20 |  |
|  | CTt | 68.12 (30.1) | 46 (29.63) | U = 201.5 *, p = 0.025, RBC = 0.37 |
|  | Bebras | 64.84 (31.02) | 43.75 (33.32) | W = 200 *, p = 0.021, RBC = 0.37 |
| Band | n | 25 | 26 |  |
|  | CTt | 54.8 (33.18) | 63.08 (30.04) | U = 369.5, p = 0.4 |
|  | Bebras | 52 (35.3) | 61.54 (31.8) | W = 378, p = 0.31 |
| Big Band | n | 14 | 37 |  |
|  | CTt | 55.71 (33.22) | 60.27 (31.31) | U = 279.5, p = 0.67 |
|  | Bebras | 50 (33.97) | 59.46 (33.52) | W = 183, p = 0.1 |
| Chamber | n | 23 | 29 |  |
|  | CTt | 65.22 (33.01) | 55.17 (30.19) | U = 271.5, p = 0.25 |
|  | Bebras | 63.04 (33.6) | 51.72 (32.69) | U = 263.5, p = 0.19 |
| Vocal | n | 14 | 38 |  |
|  | CTt | 61.43 (29.83) | 58.95 (32.53) | U = 262, p = 0.94 |
|  | Bebras | 53.57 (29.18) | 57.89 (34.92) | U = 286.5, p = 0.67 |

* p < 0.05.
Table 9. Computational thinking facets in music practice strategies.

| Phase | Practice Strategy | Sub-Strategy | Computational Thinking (CT) Skills |
|---|---|---|---|
| Review Phase | Pre-Playing | Listening to the Whole Piece | Data Collection and Analysis, Modeling |
|  |  | Sight Reading | Data Collection and Analysis, Pattern Recognition, Modeling |
| Working Phase | Error Identification | Error identification by listening to a professional performance | Debugging |
|  |  | Error identification by hearing through playing | Debugging |
|  |  | Error identification by reading | Debugging |
|  | Fixing Errors | Decomposition of Challenging Parts | Decomposition, Debugging |
|  |  | Decreasing Dimensionality | Decomposition, Debugging |
|  |  | Decreasing Difficulty | Decomposition, Debugging |
|  |  | Repeated Practice | Debugging, Iteration |
| Reflection Phase | Developing Musical Abilities through Exercises |  | Iteration, Modeling |
Share and Cite

MDPI and ACS Style

Regev Cohen, T.; Armon, B.; Hershkovitz, A. The Associations Between Computational Thinking and Learning to Play Musical Instruments. Educ. Sci. 2025, 15, 306. https://doi.org/10.3390/educsci15030306