Article

Augmented Reality to Facilitate Learning of the Acoustic Guitar

by Jorge Martin-Gutierrez 1,*, Marta Sylvia Del Rio Guerra 2, Vicente Lopez-Chao 3, René Hale Soto Gastelum 2 and Jose Fernando Valenzuela Bojórquez 2

1 Department of Techniques and Projects in Engineering and Architecture, Universidad de La Laguna, 38071 Tenerife, Spain
2 Department of Computer Science, Universidad de Monterrey, Nuevo León 66238, Mexico
3 Department of Architectural Graphics, Universidade da Coruña, 15001 Coruña, Spain
* Author to whom correspondence should be addressed.
Appl. Sci. 2020, 10(7), 2425; https://doi.org/10.3390/app10072425
Submission received: 6 March 2020 / Revised: 27 March 2020 / Accepted: 29 March 2020 / Published: 2 April 2020
(This article belongs to the Special Issue Advanced Technologies in Lifelong Learning)

Featured Application

A basic assumption shared by the fields of human–computer interaction and usability studies is that user-interfaces should be designed so that they deliver effective performance, a satisfactory user experience, and are easy to use.

Abstract

Many people wishing to learn a musical instrument opt for alternative or informal methods instead of the traditional Master–Apprentice model, which imposes a greater cognitive load. This paper presents an augmented reality (AR)-based application designed to teach and train guitar chords, with the novelty that it is also used to teach short melodies consisting of four chord transitions, so that users have to change hand and finger positions. The app uses high-quality 3D models of an acoustic guitar and an animated hand to indicate correct finger positions and the movements required when changing from one chord to another. To follow the animated instructions, the learner overlaps the 3D model onto the neck of the physical guitar and his or her own hand. A system usability scale (SUS) questionnaire was used to measure the usability of the application. A score of 82.0 was obtained, well above the benchmark average of 68 points, which indicates the application is good from a user experience perspective and thus satisfies the purpose for which it was created. Having analysed the data for both groups (individuals with no prior experience of playing a musical instrument versus individuals with prior experience), it was concluded that the application provided a useful learning approach for all participants involved in the study, regardless of experience. That said, those possessing prior experience of playing an instrument learnt faster. It should be noted that the research revealed a significant difference in learning by gender, with male participants learning faster than female participants. Similar results have been detected in other research performed in the field of music, as well as in other fields. As this study required spatial reasoning when viewing the 3D model, the differences identified in this case may well have arisen as a consequence of differences in men and women's spatial awareness, thereby leaving open an alternative line of research.

1. Introduction

Learning how to play a musical instrument is a demanding process, and learners must dedicate years to developing a variety of complex skills [1]. In general, the Master–Apprentice model is used in the teaching–learning process involved in musical training [2]. However, informal learning processes [3] offering a more fast-track approach may also be adopted. Individuals eager to learn how to play a musical instrument quickly often opt for informal methods, despite the fact that no specific informal method offers the guarantees offered by formal training [4].
Menin and Schiavio argue that the ease with which one can interact with an instrument provides the basis for musical comprehension. If we subscribe to this line of reasoning, it may well prove to be a fundamental element in learning how to play an instrument [5]. In light of this reasoning, we have designed a training approach in which the guitar learner can interact with the instrument.
Technologies are now being used for training across a range of different fields. Keebler points out that technologies are very efficient at helping people with skills training and establishes a close relationship between STEAM and guitar learning through AR systems [6]. In the field of informal musical instrument education, it is worth mentioning a proposal for a real-time system for online learning-based visual transcription of piano music [7], a visualisation system based on augmented reality (AR) for correct hand and finger placement on the guitar [8] or, equally, a system for teaching yourself guitar [9]. In 2003, Cakmakci, Berard and Coutaz became the first authors to apply AR-based instant tracking to a musical instrument; their aim was to reduce students' cognitive load in comparison to traditional teaching methods [10]. To do so, these researchers came up with an AR-based learning system for the electric bass guitar that uses fiducial markers to indicate which strings need to be pressed for any given chord.
The ARPiano system by Trujano offers an example of a well-established piece of AR technology being used for musical training. This tool has been used to assist those wishing to learn how to play the piano for some time now. As a tool, it allows students to learn in an easier and more didactic manner [11]. It does so by precisely locating the keys and contours of the keyboard and then displaying the score as virtual fingers play the correct keys on the keyboard. The student merely has to follow the virtual fingers in order to play the score correctly [12].
Hackl and Anthes designed and implemented a piano training application called HoloKeys, developed for Microsoft's HoloLens to facilitate piano learning [13]. The app can superimpose the keys that must be played onto a real piano. However, due to field-of-vision limitations, researchers were unable to draw conclusive results from the HoloKeys study aimed at establishing the most suitable didactic models for this interesting app.
This ability to render an image with robust real-time 3D tracking using the recognition of textures and edges [14] has helped in the virtualization of elements, which in turn has facilitated learners' understanding of musical chords on the piano [8]. However, such tracking on the guitar has proven more complex; any objects or structures behind (or around) this instrument make it harder for detection algorithms [14] to define, and thus detect, the edges of the guitar.
Keebler’s study indicates that technology and informal learning environments facilitate learning during initial basic training [6]. Van Nimwegen et al. [15] also argue that formal learning techniques prove more difficult for students who are complete beginners. They state that, at times, formal techniques may negatively affect motivation, and argue that even though formal techniques assist students in retaining greater volumes of information in the long run, individuals have a greater tendency to give up and stop learning the instrument [16] when too many obstacles are encountered at the start.
Technologies such as the AR-based Fretlight© guitar reduce cognitive load and the effort required to learn [6]. The learning system used in Fretlight© overlays information directly on the neck of the guitar, indicating where and when to place fingers.
It is important to note that both students and teachers experience difficulties with the traditional Master–Apprentice model. Trainers need to design a teaching strategy that organizes time based on: student experience; face-to-face training; and periods of autonomous learning during which the student's progress will very much depend on his or her own personal skill set [17]. In reviewing the main obstacles faced by students receiving musical training, it was found that students typically encounter two main problems: developing their musical ear [18], and understanding the instructional material used for learning (e.g., diagrams or drawings). In terms of the latter, AR can be used to address this challenge as it can facilitate the visualisation of elements such as finger positioning [19]. A good example of this is the AR system proposed by Löchtefeld et al. [20], who designed and implemented an AR model that uses a mobile projector mounted on the guitar's headstock to project instructions that facilitate correct finger placement. Another recent proposal for learning the guitar suggests using coloured finger markers and then projecting the corresponding colours onto the strings and frets using AR to instruct learners how to perform a chord [21]. The aforementioned studies do not display the hand movements involved in the chord transitions of short melodies.
Building on the aforementioned work, this paper describes a new AR system, developed by the authors, that allows learners to see the correct position and movement of fingers when playing chords and chord transitions. The system facilitates visualization and, consequently, imitation by learners who are complete beginners. This is achieved by means of colour-coded finger markers and a semi-transparent 3D model of a hand. Both elements are superimposed onto the neck and frets of the physical guitar displayed onscreen. For the purpose of this study, research has focused on individuals who are right-handed. The focus of the research has been to develop and test the usability of a system that provides user-centred training to individuals learning the guitar. An informal learning method has been proposed that has a low cognitive load, and which is capable of teaching musical chords in a short period of time.

2. Objectives and Hypothesis

The general objective of this study is to research the use of AR in informal learning processes, and to establish whether it facilitates learning. In this case, the authors test whether an AR application for beginner guitar learning helps apprentices adopt the correct hand posture and finger positioning to play chords, and establish the level of usability of the proposed AR model [22] using a system usability scale (SUS) instrument. The apprentices were arranged into two groups: those who had experience playing an instrument other than the guitar and those who had no experience playing any musical instrument.
The following research hypotheses (HR) and their corresponding null hypotheses (Ho) have been defined for this study:
  • HR1: Apprentices correctly perform chords by imitating the hand movements and finger positioning of a three-dimensional animation superimposed on the neck of the physical guitar. Null Hypothesis: Ho = Apprentices do not correctly perform chords by imitating the hand movements and finger positioning of a three-dimensional animation superimposed on the neck of the physical guitar.
  • HR2: Learners with experience playing musical instruments other than the guitar perform chords more quickly than learners without experience playing musical instruments. Null Hypothesis: Ho = Learners with experience playing musical instruments other than the guitar do not perform chords more quickly than learners without experience playing musical instruments.
  • HR3: Learners with experience playing a musical instrument other than the guitar perform chord transitions more quickly than learners without experience playing musical instruments. Null Hypothesis: Ho = Learners with experience playing a musical instrument other than the guitar do not perform chord transitions more quickly than learners without experience playing musical instruments.
  • HR4: Age influences training when AR technologies are used to teach guitar. Null Hypothesis: Ho = Age does not influence training when AR technologies are used to teach guitar.
  • HR5: Gender influences training when AR technologies are used to teach guitar. Null Hypothesis: Ho = Gender does not influence training when AR technologies are used to teach guitar.
Although many studies have demonstrated the benefits of using AR in education, only a few have measured its usability and efficiency [23]. In the case of this study, the authors have taken steps to measure both the training benefits and the degree of usability of the proposed concept. In this sense, the authors have tried to avoid the traditional side of pure education [24]. The authors of this paper have opted to use a model that focuses on the user and the training process [25], rather than using models based on deep learning that require complicated cognitive processes.

3. Conceptual Design

This section provides an overview of system hardware and software specifications, and provides details on how the individual components interrelate.
The focus of this research is on identifying whether AR facilitates training; in this case, whether it can help complete beginners learn how to play chords on the guitar correctly by demonstrating correct finger placement. By using AR to create interactions between the real and virtual world, it is possible to deliver a more engaging and satisfactory training experience [23]. Taking this into account, the research team developed a 3D model of an acoustic guitar that is implemented in AR to display the neck and frets of the guitar. When building this model, the intention was to construct a tool that would allow any novice to identify the strings and frets that needed to be played for a particular chord. To achieve this, a static colour is assigned to each finger. Figure 1 below shows the distribution of colours by finger.
Using this colour code, learners are instructed how to position fingers on the strings when playing any given chord. As there was a perceived risk that learners might quickly forget or confuse the colours assigned to each finger, researchers also modelled a semi-transparent 3D hand to indicate correct hand and finger placements. This hand was made semi-transparent for improved visibility. Figure 2 below shows how the G chord is rendered by the software. In this figure it is possible to discern both of the aforementioned elements, the colour code used for finger placements and the 3D hand.
Some chords require more than one string to be pressed down by the same finger (barre chords). For such chords, a vertical line was created that passes through the fret/strings that need to be pressed down simultaneously by a single finger.
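The colour-coding scheme described above lends itself to a simple data model. The sketch below is a hypothetical Python illustration, not the app's actual implementation: the finger-to-colour palette, the encoding of a chord as (string, fret, finger) triples, and the G major fingering shown are all assumptions made for demonstration.

```python
# Hypothetical data model for colour-coded chord diagrams; the colour
# assigned to each finger is illustrative, not the app's actual palette.
FINGER_COLOURS = {1: "green", 2: "blue", 3: "yellow", 4: "red"}  # index..pinky

# A chord as (string, fret, finger) triples; strings numbered 1 (high e) to 6.
# One common fingering of G major, used here purely as an example.
G_MAJOR = [(6, 3, 3), (5, 2, 2), (1, 3, 4)]

def placements(chord):
    """Yield human-readable placement instructions with finger colours."""
    for string, fret, finger in chord:
        colour = FINGER_COLOURS[finger]
        yield f"string {string}, fret {fret}: finger {finger} ({colour})"

for line in placements(G_MAJOR):
    print(line)
```

A barre chord would extend this model with a triple spanning several strings pressed by one finger, matching the vertical-line rendering described above.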
The learner can visualize the model using a mobile device, can zoom in and out at will, and can also freely rotate the model 360° to place it in the position that is easiest for them to observe.
To determine whether a chord or short melody is played correctly, researchers used the smartphone app Chord Detector. The time taken to perform chords or chord transitions was measured using a chronometer. Figure 3 below details the scenario design used to test the research hypotheses and measure usability.

4. Methodology

The ability to interpret musical notes is a fundamental skill required of any musician. Nonetheless, even professional musicians find it extremely difficult to read and perform musical notes accurately whilst giving a performance [26]. As such, the challenge of this research lay in developing an AR prototype that presents musical notes in a manner that is easy to visualise in order that participants can quickly and easily identify the correct finger movements and placements required to play chords.
For this research, the ease with which participants learnt to play the guitar was measured using the SUS model. The data gathered from the survey were used to assist in identifying the success of the 3D model in terms of the proposed objective. The influence of the variable age was analysed as a function of the ease with which participants were able to learn to play the guitar using the proposed system.
The following data was recorded to establish participants’ success rates when tested: the time taken to learn how to position fingers correctly; the time taken to play the chords correctly; and the number of times chords are successfully played in the set time (success rate).

4.1. Participants

For the purpose of this study a total of 36 university students were recruited from different programs run by the University of Monterrey in Mexico. None of the participants had ever studied the guitar. All participants were right-handed.
Participants’ ages ranged from 18 to 25; the mean age (M) of the participants was 20.42 years, with a standard deviation (SD) of 1.24.
Participants were then selected at random and given a screener to identify whether they had any previous experience playing musical instruments. From this screener, two archetypes were established: (a) Students who had no prior experience playing any instrument, and (b) students who had prior experience playing an instrument (not the guitar). Those participants with a musical background had experience playing the flute, the piano, the drums and the violin, with an average of 4.8 years of experience for their respective instruments. Based on these archetypes, the 36 participants were divided into two groups, each containing 18 participants (9 men and 9 women):
Group 1—Students who had no prior experience playing any instrument.
Group 2—Students who had prior experience playing an instrument.
The experiment was supervised by a professional music teacher.

4.2. Equipment

The experiment required two steel-string acoustic guitars, an iPhone XR, an Apple TV device, and the following mobile apps: Augment3D to project the digital guitar, Chord Detector to analyse chords as they were played, and Stopwatch to measure the time it takes to achieve the first success (integrated in the Chord Detector app).
The Augment3D application used to display the 3D model [27] was run on an iPhone XR with the following specifications: LCD screen with a resolution of 1792 × 828 pixels at 326 ppi; IOS 12 operating system; 64 GB internal storage; Bionic A12 processor; 4 GB RAM; and a 12 MP camera.
An Apple TV device allowed the iPhone screen to be mirrored in a larger format to ensure that participants could see the image being rendered on the mobile phone. In this instance, researchers used a 42-inch TV.
The Chord Detector app [28] was run on a Samsung Galaxy J4 with the following specifications: OLED screen with a resolution of 1280 × 720 pixels; Android 8.0 operating system; Processor Exynos 7570; 32 GB internal storage; 2 GB RAM; and a 13 MP camera.
During the tests, researchers used Chord Detector version 1.0.6. This smartphone application detects and analyses the frequency of audio sources to identify musical chords. It then displays the chord that has been identified onscreen and shows the stopped timer.

4.3. Procedure

The experiment consisted of delivering 1 h training sessions that were held in classrooms at the university. All participants, regardless of archetype, received the same training. Sessions were designed so that there were two students per session. In each session, participants were asked to learn chords and chord transitions on an acoustic guitar.
In total, 18 one-hour sessions were run to train all 36 participants. These sessions were delivered over a period of 5 days. To achieve this, four one-hour sessions were run per day: two sessions in the morning and two sessions in the afternoon. On the fifth day, two additional sessions were required to complete training.
Participants received a brief introduction prior to commencing the aforementioned sessions. In this introduction they were thanked for volunteering and provided details of the two sets of tasks they would be asked to perform (see Table 1).
It was explained that Task 1 consists of playing seven different chords independently of one another, and that Task 2 consists of playing six chord transitions (short melodies). All participants were told they would be playing chords following the chord sequence listed in Table 1. They were also informed that they would learn these chords by imitating a 3D model (Figure 4).
Task 1 serves two purposes: Firstly, it is intended to help familiarise participants with the system; secondly, it is designed to test whether it is easier to play chords using AR than traditional sheet music. The latter is established based on the experience of the professional music teacher-supervisor running the sessions.
These tasks were designed to allow researchers to analyse whether the use of AR facilitates training. Thus, in this instance, researchers tested whether AR helped learners who were complete beginners, and studying the guitar informally, to master guitar chords and chord transitions by themselves. The tasks, therefore, serve as a means through which to verify the hypotheses.
After Task 1 was complete, participants took a five-minute break before starting Task 2.
Time limits were set for playing chords. These time limits were defined based on the work of Yuan [29], which demonstrated that the first two minutes of performing music are dedicated to becoming familiar with the rhythm and finger placement. As such, a maximum limit of 1.5 min was set for individual chords, whilst a maximum of 5 min was set for chord transitions. Participants were not informed of time limits so as not to induce stress. The chord execution tasks were intended to test whether participants confused frets, or misplaced fingers when changing their positions. A protocol was established during the tests: If participants exceeded the set time limit on a task, the supervisor would log the task as a failed attempt. Nonetheless, in such cases the supervisor would wait an additional 30 s before stopping the student to see whether the task could be completed. After these additional 30 s, the participant would be informed that the task was finished and that they would need to move on to the next one. In an effort to keep stress levels to a minimum, the supervisor ensured participants were not made to feel there was a problem if any given task could not be completed.
A mobile phone was used to render the 3D AR model. However, to ensure the participant could easily visualise the 3D model being superimposed onto the physical guitar, the model was duplicated and displayed on a TV screen to enlarge the image.
The Chord Detector application was used to check if the participant had played the corresponding chord correctly. The time taken to perform the chord was recorded. Time was measured from the moment the instruction was given to play the first chord until the app recognizes the sound, at which point the stopwatch is stopped. This process was repeated for each chord, and the success rate and timings for each chord were logged. In terms of optimal results, the shorter the time taken to perform the chord the better.
Once a pair of participants completed the session, the next set of participants was brought in to receive their instructions. Those who had finished were asked to complete the SUS survey. This survey is used to establish the usability of the proposed prototype.

5. Results

This section contains the results according to the aforementioned variables. Section 5.1 presents details of how easy participants found the proposed tasks. Section 5.2 describes the usability of the 3D model. Section 5.3 shows performance differences by Age and by Gender. Finally, Section 5.4 presents observational findings. Table 2 below contains the statistical description of data that was compiled for each of the experimental groups (Timings and success rates).

5.1. Ease of Playing the Guitar

A total of 36 participants were evaluated in the experiment. These participants were divided into two groups, each containing 18 participants (9 men and 9 women). Group 1 contained participants with no prior experience playing musical instruments at all, whilst Group 2 contained participants who had some prior experience playing an instrument. None of the participants had any prior experience of playing the guitar.
Using the times of success of each participant for Task 1 and Task 2, a single-factor ANOVA was performed for each group. Beforehand, the normality of the data was verified using a Shapiro-Wilk test. From the ANOVA analysis, the resulting p-values for "time taken" were 0.016 for Task 1 and 0.023 for Task 2. This indicates that there are significant differences between the two groups in terms of the time taken to play individual chords (Task 1) and to play chord transitions (Task 2). Notably, timings are shorter for individuals with prior experience of playing a musical instrument (Group 2). Therefore, research hypothesis HR2 is accepted: "Learners with experience playing musical instruments other than the guitar perform chords more quickly than learners without experience playing musical instruments".
Based on the success rate and timings for each task, hypothesis HR3 is also accepted: "Learners with experience playing a musical instrument other than the guitar perform chord transitions more quickly than learners without experience playing musical instruments".
Both groups successfully performed the tasks; as such, hypothesis HR1 is accepted: "Apprentices correctly perform chords by imitating the hand movements and finger positioning of a three-dimensional animation superimposed on the neck of the physical guitar".
A comparative analysis of each task was performed using Welch's t-test (see Table 2). This analysis identified significant differences between each task by group. In all instances, a significant difference was identified (p-value < 0.05). In other words, there is a significant difference between both groups in terms of the time taken to perform each task. With the exception of the task that consisted of playing the C chord, Group 2 performed all chords and chord transitions faster than Group 1. In the case of the C chord, there was also a significant difference (p-value = 0.021); however, in this instance it was Group 1 that performed the task faster than Group 2 (see Table 3).
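The analysis pipeline reported in this section (a Shapiro-Wilk normality check, a single-factor ANOVA, and Welch's t-test) can be sketched with SciPy as follows. The timing samples below are synthetic placeholders generated for illustration, not the study's data; the group sizes mirror the 18 + 18 design.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical "time to first correct chord" samples (seconds), one value
# per participant; means and spreads are illustrative assumptions.
group1 = rng.normal(loc=55.0, scale=8.0, size=18)  # no prior instrument experience
group2 = rng.normal(loc=45.0, scale=8.0, size=18)  # prior experience, other instrument

# 1) Check the normality of each group with a Shapiro-Wilk test.
for name, sample in (("Group 1", group1), ("Group 2", group2)):
    w, p = stats.shapiro(sample)
    print(f"{name}: Shapiro-Wilk W={w:.3f}, p={p:.3f}")

# 2) Single-factor ANOVA comparing the two groups' timings.
f_stat, p_anova = stats.f_oneway(group1, group2)
print(f"One-way ANOVA: F={f_stat:.2f}, p={p_anova:.4f}")

# 3) Welch's t-test (no equal-variance assumption) for the per-task comparison.
t_stat, p_welch = stats.ttest_ind(group1, group2, equal_var=False)
print(f"Welch's t-test: t={t_stat:.2f}, p={p_welch:.4f}")
```

With two groups, the one-way ANOVA is equivalent to a pooled-variance t-test (F = t²); Welch's variant relaxes the equal-variance assumption, which is why both appear in the analysis.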

5.2. Usability of 3D Model

To calculate the usability level of the model, participants were asked to complete the well-known system usability scale (SUS) survey [22,30] once they had finished all tasks. This survey was used to establish their perceptions and to measure the usability of the AR app for training users to place the hand and position the fingers for each chord when playing the guitar.
According to Lewis and Sauro [31], usability studies using the SUS should have sample sizes of at least 12. In this case, all 36 participants answered the ten questions on the SUS. Interpreting scoring can be complex: "The participant's scores for each question are converted to a new number, added together and then multiplied by 2.5 to convert the original scores of 0–40 to 0–100. Though the scores are 0–100, these are not percentages and should be considered only in terms of their percentile ranking" [30]. SUS is a highly robust and versatile tool for usability professionals and, based on research by Bangor [32], a SUS score above 68 would be considered above average, whilst anything below 68 would be considered below average; however, the best way to interpret results involves "normalizing" the scores to produce a percentile ranking [32,33].
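The scoring procedure quoted above follows the standard SUS conversion: odd-numbered (positively worded) items contribute their response minus 1, even-numbered (negatively worded) items contribute 5 minus their response, and the resulting 0–40 sum is multiplied by 2.5. A minimal Python sketch (the example responses are hypothetical):

```python
def sus_score(responses):
    """Compute a single SUS score from ten Likert responses (1-5).

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response).
    The 0-40 total is multiplied by 2.5 to map onto a 0-100 scale.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten responses, each between 1 and 5")
    total = sum(r - 1 if i % 2 == 0 else 5 - r  # i is 0-based: even index = odd item
                for i, r in enumerate(responses))
    return total * 2.5

# Hypothetical respondent: strong agreement on odd items, strong
# disagreement on even items, i.e. the best possible usability rating.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # -> 100.0
```

Averaging these per-participant scores across the 36 respondents is what yields the study's single figure of 82.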
An average score of 82 out of 100 was obtained from the SUS questionnaire administered to participants. This ranks the 3D AR-based model as "very useful" and places it well above the average of 68.

5.3. Performance Differences by Age and Gender

In this experiment, the majority of participants were aged between 18 and 21 years old (81%), with the remainder aged between 22 and 25 years old (19%). Based on the results obtained for these age groups, there is no significant difference in the speed with which they were able to complete the tasks. Hypothesis HR4 is therefore not accepted.
A two-way ANOVA was applied to establish comparisons between the times taken by both groups and the variable Gender (see Table 4). The time taken by participants was considered the dependent variable, whilst experimental group and Gender were considered independent variables. Two ANOVA analyses were performed, the first considering all times recorded for Task 1 (Table 5) and the second considering all times recorded for Task 2 (Table 6).
The results indicate there are differences in the times taken by each group to perform chords (Task 1), and that there are differences between the times taken by men and women, and even in the interaction between both.
In Task 2, there are differences by group and Gender, but there is no significant difference in the interaction between Gender and Experimental group based on their prior experience. Therefore, research hypothesis HR5 is accepted: “Gender influences training when AR technologies are used to teach guitar”.
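For a balanced design such as this one (two experimental groups × two genders × nine participants per cell), the two-way ANOVA above can be sketched directly from its sum-of-squares decomposition. The code below is an illustrative implementation on synthetic data, not the study's analysis; the cell means and noise level are assumptions chosen to mimic a group effect and a gender effect with no interaction.

```python
import numpy as np
from scipy.stats import f as f_dist

def two_way_anova(cells):
    """Balanced two-way ANOVA with replication.

    `cells[i][j]` holds the n observations for level i of factor A
    (experimental group) and level j of factor B (gender).
    Returns {effect: (F, p)} for A, B and the A x B interaction.
    """
    y = np.asarray(cells, dtype=float)  # shape (a, b, n)
    a, b, n = y.shape
    grand = y.mean()
    mean_a = y.mean(axis=(1, 2))        # factor A marginal means
    mean_b = y.mean(axis=(0, 2))        # factor B marginal means
    mean_ab = y.mean(axis=2)            # cell means

    ss_a = b * n * ((mean_a - grand) ** 2).sum()
    ss_b = a * n * ((mean_b - grand) ** 2).sum()
    ss_ab = n * ((mean_ab - mean_a[:, None] - mean_b[None, :] + grand) ** 2).sum()
    ss_e = ((y - mean_ab[:, :, None]) ** 2).sum()

    df_a, df_b, df_ab, df_e = a - 1, b - 1, (a - 1) * (b - 1), a * b * (n - 1)
    ms_e = ss_e / df_e
    out = {}
    for name, ss, df in (("A", ss_a, df_a), ("B", ss_b, df_b), ("AxB", ss_ab, df_ab)):
        f_val = (ss / df) / ms_e
        out[name] = (f_val, f_dist.sf(f_val, df, df_e))
    return out

# Synthetic 2 (group) x 2 (gender) design with 9 replicates per cell,
# mirroring the study's 36-participant layout; values are made up.
rng = np.random.default_rng(1)
base = np.array([[50.0, 58.0], [42.0, 50.0]])  # cell means with additive effects only
cells = base[:, :, None] + rng.normal(0, 4.0, size=(2, 2, 9))
for effect, (f_val, p) in two_way_anova(cells).items():
    print(f"{effect}: F={f_val:.2f}, p={p:.4f}")
```

With additive cell means as above, the main effects come out significant while the interaction term does not, matching the pattern reported for Task 2.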

5.4. Observational Findings

In addition to gathering quantitative results to test the research hypotheses, the research team also closely monitored the behaviour of the participants during the experiment to obtain observational findings. During these observations, researchers identified that one of the main problems encountered during tasks was distinguishing the colours of strings and fingers on the projected image. That said, participants commented that the model was very intuitive given that they could zoom in or rotate it 360° to get a better view of the guitar. They felt this feature allowed them to complete tasks faster. The translucent 3D model of a hand was easy to follow and did not cause confusion. Being able to view a mirror image of the 3D model on screen that was superimposed onto their own hand facilitated training, as the learner would imitate and follow the model’s instructions.
Finally, participants with no prior experience playing musical instruments proved to be particularly enthused by the model and were pleasantly surprised by the experience. They said this training approach appealed to them, and, in many cases, it had even motivated them to continue learning the guitar in the future.

6. Conclusions and Future Work

It was shown that people who had played musical instruments other than the guitar before the experiment were able to perform the basic chords (C, D, E, F, G, A, B), and some chord transitions, with greater ease than individuals without any prior experience with musical instruments. Those with no musical instrument experience needed a longer amount of time to correctly perform the chords. That said, there was one anomaly in the results: The C chord proved to be the easiest chord for participants without musical knowledge to play, and the times recorded for this group proved faster than the times recorded for the group containing individuals with prior experience playing musical instruments.
As participants were able to practice how to play chords in a single 1 h session, the proposed AR system is deemed to have produced satisfactory results.
Observations made during the experiment revealed that experienced participants demonstrated greater patience when using the AR model, while those with no experience of musical instruments sought to play the chord as quickly as possible (and often incorrectly).
All participants, regardless of age, fully accepted the proposed technology. That said, the authors recognise that the analysed age ranges differ only slightly (18–21 and 22–25). Therefore, it would be advisable to gather and analyse data from a wider range of ages, for example children or middle-aged adults.
Previous research on learning to play the guitar using AR systems often focuses on the effectiveness of the system. Regarding user perception of the proposed system, Liarokapis [9] mentions that "all users agreed that the system is easy to use and that the visualization process is satisfactory. Besides, participants found the interaction techniques very useful and easy to use in contrast to commercial software learning tools. In terms of the technical part of the evaluation, as expected, most of the users preferred the monitor-based visualization versus the Head Mounted Displays-based visualization". Users also noted the drawback that the marker must always remain in front of the camera. Del Rio et al. [21] conducted a SUS evaluation like the one presented in this article and obtained a similar usability rating. Where user perception is mentioned, participants indicated that the system is easy to use and that they were interested in learning.
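The SUS evaluations mentioned here follow Brooke's standard scoring rule [22]: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5 to yield a 0–100 score. As a minimal sketch (the sample responses below are invented for illustration, not data from either study):

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0-100) from ten
    1-5 Likert responses, using Brooke's standard scoring rule:
    odd items contribute (response - 1), even items (5 - response),
    and the sum is scaled by 2.5."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Invented responses from one fairly positive participant
print(sus_score([5, 2, 4, 1, 4, 2, 5, 1, 4, 2]))  # 85.0
```

A study-level SUS result, such as the 82.0 reported in this article, is the mean of these per-participant scores.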
The proposed training system delivers results that reveal a significant difference by gender, both in playing chords and in playing short melodies. The data show that men perform both tasks faster than women. However, based on the research data and our observational findings, it was not possible to determine why men are faster. The tasks require that participants possess both physical dexterity and spatial skills.
Given that they need the ability to interpret the movements demonstrated by the 3D model, it is worth questioning whether spatial ability might be a determining factor. Several studies have indicated that, generally speaking, men have better spatial awareness than women as a result of different factors. This statement has even been reaffirmed in conclusions drawn from more recent research [34,35,36,37].
The research described in this paper showed that it is easier for individuals with prior experience playing musical instruments to play acoustic guitar using AR. The degree of usability of the model is acceptable. Therefore, students feel comfortable interacting with the model and can use it to train in their free time without the need for a teacher [38], which supports the view that AR technology is appropriate in the first stages of training.
The results obtained show that the 3D model also has a positive impact on people without any prior experience playing a musical instrument. The data show that more than 85% of the tests were completed within the established time limit. Based on the findings, the authors recommend teaching beginners the C chord first.
Finally, the results of this study open the door for designers looking to create applications in which AR models interact directly with a physical guitar. The study provides a guide for modelling such applications, which at this stage cannot be based on common design standards [23].
When planning future AR application design that relates to teaching acoustic guitar, developers are recommended to implement animated models that allow users to visualise smoother finger movements. It can be confirmed that the model used in this research has an acceptable level of usability; this opens the door for further research comparing learning acoustic guitar using AR versus the traditional learning model. For this to have positive results, the AR model must be well-designed to ensure that augmented reality offers an intuitive and motivating tool for learning [39].
In summary, informal music learning could benefit from removing the barrier of a formal learning environment, in which cognitive load carries significant weight, and from centring learning on an easy-to-follow training system such as augmented reality. The benefits of these systems can motivate learners to keep playing, to the point that, once they have reached a certain level of competence, they actively seek a teacher in order to learn more traditional forms of music.

Author Contributions

The contributions to this paper are as follows: conceptualization, investigation and methodology M.S.D.R.G., J.M.-G., R.H.S.G. and J.F.V.B.; software, validation, R.H.S.G. and J.F.V.B.; formal analysis, data curation J.M.-G.; writing—original draft preparation, R.H.S.G. and J.F.V.B.; supervision, J.M.-G. and M.S.D.R.G.; writing—review and editing, J.M.-G. and V.L.-C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Acknowledgments

Our warmest thanks goes out to the students of the University of Monterrey who participated in this study and made this project possible.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Schafer, R. The Soundscape: Our Sonic Environment and the Tuning of the World; Destiny Books: Rochester, VT, USA, 1993. [Google Scholar]
  2. Ramirez, R.; Perez, A.; Waddell, G.; Williamon, A.; Canepa, C.; Ghisio, S.; Kolykhalova, K.; Mancini, M.; Volta, E.; Volpe, G.; et al. Enhancing Music Learning with Smart Technologies. In Proceedings of the 5th International Conference on Movement and Computing—MOCO ’18, Genoa, Italy, 28–30 June 2018; pp. 1–4. [Google Scholar]
  3. Keebler, J.R.; Wiltshire, T.J.; Smith, D.C.; Fiore, S.M.; Bedwell, J.S. Shifting the Paradigm of Music Instruction: Implications of Embodiment Stemming from an Augmented Reality Guitar Learning System. Front. Psychol. 2014, 5, 471. [Google Scholar] [CrossRef] [PubMed]
  4. Green, L. Group Cooperation, Inclusion and Disaffected Pupils: Some Responses to Informal Learning in the Music Classroom. Presented at the RIME Conference 2007, Exeter, UK. Music Educ. Res. 2008, 10, 177–192. [Google Scholar] [CrossRef]
  5. Menin, D.; Schiavio, A. Rethinking Musical Affordances. AVANT 2012, III, 202–215. [Google Scholar]
  6. Keebler, J.R.; Wiltshire, T.J.; Smith, D.C.; Fiore, S.M. Picking up STEAM: Educational Implications for Teaching with an Augmented Reality Guitar Learning System. In Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Berlin/Heidelberg, Germany, 2013. [Google Scholar]
  7. Akbari, M.; Liang, J.; Cheng, H. A Real-Time System for Online Learning-Based Visual Transcription of Piano Music. Multimed. Tools Appl. 2018, 77, 25513–25535. [Google Scholar] [CrossRef]
  8. Motokawa, Y.; Saito, H. Support System Using Augmented Reality Display for Use in Teaching the Guitar. J. Inst. Image Inf. Telev. Eng. 2007, 61, 789–796. [Google Scholar] [CrossRef]
  9. Liarokapis, F. Augmented Reality Scenarios for Guitar Learning. In Proceedings of the Theory and Practice of Computer Graphics 2005, TPCG 2005—Eurographics UK Chapter Proceedings, Canterbury, UK, 14 March 2005; pp. 163–170. [Google Scholar]
  10. Cakmakci, O.; Bèrard, F.; Coutaz, J. An Augmented Reality Based Learning Assistant for Electric Bass Guitar. In Proceedings of the 10th International Conference on Human-Computer Interaction, Crete, Greece, 23–27 June 2003. [Google Scholar]
  11. Trujano, F.; Khan, M.; Maes, P. ARPiano Efficient Music Learning Using Augmented Reality; Springer: Berlin/Heidelberg, Germany, 2018. [Google Scholar]
  12. Huang, F.; Zhou, Y.; Yu, Y.; Wang, Z.; Du, S. Piano AR: A Markerless Augmented Reality Based Piano Teaching System. In Proceedings of the 2011 3rd International Conference on Intelligent Human-Machine Systems and Cybernetics, IHMSC 2011, Hangzhou, China, 26–27 August 2011; pp. 47–52. [Google Scholar]
  13. Hackl, D.; Anthes, C. HoloKeys—An Augmented Reality Application for Learning the Piano. In Proceedings of the 10th Forum Media Technology and 3rd All Around Audio Symposium, St. Pölten, Austria, 29–30 November 2017; pp. 140–144. [Google Scholar]
  14. Vacchetti, L.; Lepetit, V.; Fua, P. Combining Edge and Texture Information for Real-Time Accurate 3D Camera Tracking. In Proceedings of the ISMAR 2004: Third IEEE and ACM International Symposium on Mixed and Augmented Reality, Arlington, VA, USA, 5 November 2004. [Google Scholar]
  15. Van Nimwegen, C.; Van Oostendorp, H.; Schijf, H.J.M. Externalization vs. Internalization: The Influence on Problem Solving Performance. In Proceedings of the IEEE International Conference on Advanced Learning Technologies, ICALT 2004, Joensuu, Finland, 30 August–1 September 2004; pp. 311–315. [Google Scholar]
  16. McDermott, J.; Gifford, T.; Bouwer, A.; Wagy, M. Should Music Interaction Be Easy? In Music and Human-Computer Interaction; Holland, S., Wilkie, K., Mulholland, P., Seago, A., Eds.; Springer: Berlin/Heidelberg, Germany, 2013. [Google Scholar]
  17. Hietanen, L.; Ruokonen, I.; Ruismäki, H.; Enbuska, J. Student Teachers’ Guided Autonomous Learning: Challenges and Possibilities in Music Education. Procedia Soc. Behav. Sci. 2016, 217, 257–267. [Google Scholar] [CrossRef] [Green Version]
  18. Kraus, N.; Chandrasekaran, B. Music Training for the Development of Auditory Skills. Nat. Rev. Neurosci. 2010, 11, 599–605. [Google Scholar] [CrossRef] [PubMed]
  19. Kerdvibulvech, C.; Saito, H. Guitarist Fingertip Tracking by Integrating a Bayesian Classifier into Particle Filters. Adv. Hum. Comput. Interact. 2008, 2008, 1–10. [Google Scholar] [CrossRef] [Green Version]
  20. Löchtefeld, M.; Krüger, A.; Gehring, S.; Jung, R. GuitAR—Supporting Guitar Learning through Mobile Projection. In Proceedings of the Conference on Human Factors in Computing Systems, Vancouver, BC, Canada, 7–12 May 2011; pp. 1447–1452. [Google Scholar]
  21. Del Rio-Guerra, M.S.; Martin-Gutierrez, J.; Lopez-Chao, V.A.; Flores Parra, R.; Ramirez Sosa, M.A. AR Graphic Representation of Musical Notes for Self-Learning on Guitar. Appl. Sci. 2019, 9, 4527. [Google Scholar] [CrossRef] [Green Version]
  22. Brooke, J. SUS-A Quick and Dirty Usability Scale. Usability Eval. Ind. 1996, 189, 4–7. [Google Scholar]
  23. da Silva, M.M.O.; Teixeira, J.M.X.N.; Cavalcante, P.S.; Teichrieb, V. Perspectives on How to Evaluate Augmented Reality Technology Tools for Education: A Systematic Review. J. Braz. Comput. Soc. 2019, 25, 3. [Google Scholar] [CrossRef] [Green Version]
  24. Merriam, S.B.; Tisdell, E.J. Qualitative Research: A Guide to Design and Implementation, 4th ed.; John Wiley & Sons: Hoboken, NJ, USA, 2015. [Google Scholar]
  25. Ooaku, T.; Linh, T.D.; Arai, M.; Maekawa, T.; Mizutani, K. Guitar Chord Recognition Based on Finger Patterns with Deep Learning. In Proceedings of the 4th International Conference on Communication and Information Processing, Qingdao, China, 2–4 November 2018; pp. 54–57. [Google Scholar]
  26. Lu, C.-I.; Greenwald, M.L.; Lin, Y.-Y.; Bowyer, S.M. Reading Musical Notation versus English Letters: Mapping Brain Activation with MEG. Psychol. Music 2019, 47, 255–269. [Google Scholar] [CrossRef]
  27. 3D and Augmented Reality Product Visualization Platform Augment. Available online: https://www.augment.com/ (accessed on 25 February 2020).
  28. Chords Detector 2.0 for Android. Available online: https://chords-detector.es.aptoide.com/ (accessed on 25 February 2020).
  29. Yuan, B.; Folmer, E. Blind Hero: Enabling Guitar Hero for the Visually Impaired. In Proceedings of the ASSETS’08: The 10th International ACM SIGACCESS Conference on Computers and Accessibility, Halifax, NS, Canada, 13–15 October 2008; pp. 169–176. [Google Scholar]
  30. System Usability Scale (SUS). Usability.Gov. Available online: https://www.usability.gov/how-to-and-tools/methods/system-usability-scale.html (accessed on 25 February 2020).
  31. Lewis, J.R.; Sauro, J. The Factor Structure of the System Usability Scale. In Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Berlin/Heidelberg, Germany, 2009. [Google Scholar]
  32. Bangor, A.; Kortum, P.; Miller, J. Determining What Individual SUS Scores Mean: Adding an Adjective Rating Scale. J. Usability Stud. 2009, 4, 114–123. [Google Scholar]
  33. Sauro, J. MeasuringU: Measuring Usability With The System Usability Scale (SUS). MeasuringU. 2011. Available online: http://measuringu.com/sus/ (accessed on 28 October 2019).
  34. Self, C.M.; Gopal, S.; Golledge, R.G.; Fenstermaker, S. Gender-Related Differences in Spatial Abilities. Prog. Hum. Geogr. 1992, 16, 315–342. [Google Scholar] [CrossRef]
  35. Yuan, L.; Kong, F.; Luo, Y.; Zeng, S.; Lan, J.; You, X. Gender Differences in Large-Scale and Small-Scale Spatial Ability: A Systematic Review Based on Behavioral and Neuroimaging Research. Front. Behav. Neurosci. 2019, 128. [Google Scholar] [CrossRef] [PubMed]
  36. Wong, W.I.; Yeung, S.P. Early Gender Differences in Spatial and Social Skills and Their Relations to Play and Parental Socialization in Children from Hong Kong. Arch. Sex. Behav. 2019, 48, 1589–1602. [Google Scholar] [CrossRef] [PubMed]
  37. Wong, M.; Castro-Alonso, J.C.; Ayres, P.; Paas, F. Investigating Gender and Spatial Measurements in Instructional Animation Research. Comput. Hum. Behav. 2018, 89, 446–456. [Google Scholar] [CrossRef] [Green Version]
  38. Kularbphettong, K.; Roonrakwit, P.; Chutrtong, J. Effectiveness of Enhancing Classroom by Using Augmented Reality Technology. In Advances in Intelligent Systems and Computing; Springer: Berlin/Heidelberg, Germany, 2019. [Google Scholar]
  39. Patzer, B.; Smith, D.C.; Keebler, J.R. Novelty and Retention for Two Augmented Reality Learning Systems. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2014, 58, 1164–1168. [Google Scholar] [CrossRef]
Figure 1. Distribution of colours by finger.
Figure 2. 3D model showing correct finger positions for the G chord.
Figure 3. Scenario demonstrating how learners use the mobile augmented reality (AR) app.
Figure 4. Example of task: Student performing the G chord.
Table 1. Proposed learning tasks.
Task 1: Play Chords    Task 2: Play Chord Transitions
1. Play C major        Ch1. C-G-A-D
2. Play D major        Ch2. E-D-A-G
3. Play E major        Ch3. E-A-D-C
4. Play F major        Ch4. G-E-D-C
5. Play G major        Ch5. D-C-A-G
6. Play A major        Ch6. A-G-C-E
7. Play B major
Table 2. Summary of statistical description of data: Mean values in seconds and standard deviation.
Group 1 — Without Experience Playing Musical Instruments              Group 2 — With Experience Playing Other Musical Instruments
Average Time (SD)   Success Rate   Task      Average Time (SD)   Success Rate
47 (18)             2              C         50 (15)             1
108 (21)            1              D         25 (7)              3
18 (8)              4              E         13 (5)              4
54 (12)             1              F         29 (6)              2
37 (9)              2              G         13 (4)              4
58 (13)             1              A         16 (6)              4
48 (9)              1              B         31 (5)              2
275 (29)            1              Chord_1   116 (7)             2
324 (33)            1              Chord_2   53 (10)             3
133 (21)            1              Chord_3   43 (12)             3
185 (19)            1              Chord_4   97 (15)             2
252 (15)            1              Chord_5   159 (17)            1
341 (26)            1              Chord_6   217 (14)            1
Table 3. Welch t-test for Task 1 (playing individual chords) and Task 2 (playing chord transitions).
Task       C      D      E      F      G      A      B      Ch1    Ch2    Ch3    Ch4    Ch5    Ch6
p-Value    0.021  0.011  0.027  0.003  0.034  0.045  0.015  0.402  0.013  0.026  0.020  0.037  0.047
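The Welch t-test behind Table 3 compares the two groups without assuming equal variances. A minimal sketch of the statistic and the Welch–Satterthwaite degrees of freedom, using invented per-participant times rather than the study's raw data (the p-value would then come from the t-distribution, e.g. via scipy.stats.t.sf):

```python
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's unequal-variances t statistic and the
    Welch-Satterthwaite degrees of freedom."""
    n1, n2 = len(sample_a), len(sample_b)
    v1, v2 = variance(sample_a), variance(sample_b)  # sample variances (n-1)
    se_sq = v1 / n1 + v2 / n2
    t = (mean(sample_a) - mean(sample_b)) / math.sqrt(se_sq)
    df = se_sq ** 2 / ((v1 / n1) ** 2 / (n1 - 1) + (v2 / n2) ** 2 / (n2 - 1))
    return t, df

# Invented times (seconds) for one chord, six participants per group
group1 = [47, 62, 35, 51, 44, 58]   # no prior instrument experience
group2 = [50, 41, 55, 48, 60, 46]   # prior experience
t, df = welch_t(group1, group2)
# Two-sided p-value: scipy.stats.t.sf(abs(t), df) * 2
```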
Table 4. Summary of statistical description of data by group and gender.
                    Task 1: Play Chords   Task 2: Play Chord Transitions
Group 1   Males     53.90 (16.02)         241.92 (24.21)
          Females   51.81 (23.11)         261.41 (34.53)
Group 2   Males     17.46 (5.20)          101.50 (14.54)
          Females   33.11 (11.3)          126.83 (25.21)
Table 5. Two-way ANOVA: Time taken to play musical chords.
Source             Type III Sum of Squares   df    Mean Square   F          Sig.
Corrected Model    54,636.107 (a)            3     18,212.036    49.967     0.000
Intercept          386,810.036               1     386,810.036   1061.268   0.000
Groups             47,150.125                1     47,150.125    129.363    0.000
Gender             2780.125                  1     2780.125      7.627      0.006
Groups * Gender    4706.036                  1     4706.036      12.912     0.000
Error              90,390.857                248   364.479
Total              531,837.000               252
Corrected Total    145,026.964               251
Table 6. Two-way ANOVA: Time taken to play chord transitions (short melodies).
Source             Type III Sum of Squares   df    Mean Square     F          Sig.
Corrected Model    1,050,175.125 (a)         3     350,058.375     81.932     0.000
Intercept          7,223,745.375             1     7,223,745.375   1690.730   0.000
Groups             1,022,175.334             1     1,022,175.334   239.242    0.000
Gender             27,540.334                1     27,540.334      6.446      0.012
Groups * Gender    459.375                   1     459.375         0.108      0.743
Error              905,782.500               212   4272.559
Total              9,179,703.000             216
Corrected Total    1,955,957.625             215
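For a balanced design, the sums of squares underlying Tables 5 and 6 decompose exactly into group, gender, interaction, and error components. A minimal sketch of that decomposition with invented data (two groups by two genders with three observations per cell; the actual study used SPSS-style Type III sums of squares, which coincide with this decomposition only when the design is balanced):

```python
from statistics import mean

def two_way_anova(data):
    """Sums of squares for a balanced two-way ANOVA with interaction.
    data[i][j] holds the replicate observations for level i of
    factor A (group) and level j of factor B (gender)."""
    a, b = len(data), len(data[0])
    n = len(data[0][0])                        # replicates per cell
    cells = [[mean(data[i][j]) for j in range(b)] for i in range(a)]
    grand = mean(y for i in range(a) for j in range(b) for y in data[i][j])
    row = [mean(cells[i]) for i in range(a)]   # factor-A level means
    col = [mean(cells[i][j] for i in range(a)) for j in range(b)]
    ss_a = b * n * sum((m - grand) ** 2 for m in row)
    ss_b = a * n * sum((m - grand) ** 2 for m in col)
    ss_ab = n * sum((cells[i][j] - row[i] - col[j] + grand) ** 2
                    for i in range(a) for j in range(b))
    ss_err = sum((y - cells[i][j]) ** 2
                 for i in range(a) for j in range(b) for y in data[i][j])
    return ss_a, ss_b, ss_ab, ss_err

# Invented times (seconds): group 1 vs. group 2, males vs. females
data = [[[54, 50, 56], [52, 49, 55]],   # group 1: males, females
        [[18, 16, 19], [34, 31, 35]]]   # group 2: males, females
ss_a, ss_b, ss_ab, ss_err = two_way_anova(data)
# F statistics follow as (ss / df) divided by (ss_err / df_err),
# with df = 1 for each factor here and df_err = 8.
```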
