
Informatics 2019, 6(3), 30; https://doi.org/10.3390/informatics6030030

Article
Video Games and Collaborative Learning in Education? A Scale for Measuring In-Service Teachers’ Attitudes towards Collaborative Learning with Video Games
Department of Didactics, Organization and Research Methods, University of Salamanca, Salamanca 37008, Spain
* Author to whom correspondence should be addressed.
Received: 30 June 2019 / Accepted: 1 August 2019 / Published: 5 August 2019

Abstract

Students’ motivation is a fundamental factor in the educational process, and it can be fostered through new methodologies and technologies, including gamification, video games and collaborative learning, and in particular through the methodology called “collaborative learning with video games”, understood as the implementation of educational activities in which students have to work together to achieve a goal and in which the main resource of the activity is a video game. However, if teachers themselves are not motivated, or if they lack a positive attitude towards implementing these new methodologies, it will be difficult for students to feel motivated when approaching these resources. It is therefore important to know what teachers’ attitudes towards them are. The aim of this research is the creation of a scale of attitudes towards collaborative learning with video games, aimed at in-service primary school teachers. Its construction followed several methodological steps, such as the analysis of items and the verification of their reliability, resulting in a rigorous attitude scale of 33 items with a reliability of α = 0.947. The measurement instrument is thus validated and allows one to know the attitudes of in-service primary school teachers towards a new methodology related to the implementation of video games in education.
Keywords:
video games; collaborative learning; education; teacher; attitudes; primary education; technology; ICT; Likert scale

1. Introduction

Students’ motivation is a fundamental factor in the educational process at any educational stage. It can be facilitated through the implementation of new methodologies in the educational process, including project-based learning and problem-based learning, but also through the use of different technologies, such as robotics, augmented reality and virtual reality. In this text, we focus on gamification and video games, and in particular, on the methodology called “collaborative learning with video games”, which deals with the use of video games in collaborative learning activities. These tools generate motivation and promote learning in students. However, if teachers do not have a positive attitude towards the implementation of these new tools, it will be difficult for students to feel motivated when these resources are implemented in their learning process. Therefore, it is important to know what teachers’ attitudes towards them are [1,2,3].
The aim of this study was the creation of a scale of attitudes towards collaborative learning with video games, aimed at in-service primary school teachers. With that in mind, this article is divided into several sections. We start with the theoretical framework, in which the literature on the key concepts of the article is reviewed, including topics such as gamification and video games, motivation, learning gains and teachers’ attitudes. A description of the study is presented in the following sections, including the methodology used and the results related to the creation of the instrument. Finally, the conclusions are addressed.

2. Theoretical Framework

Student motivation is a key factor for their learning, as indicated by various studies [4,5,6]. In fact, the implementation of gamification and video games in education has improved motivation for primary education students [7,8,9], secondary education students [10,11,12] and even higher education students [13,14,15,16]. In the same way, the implementation of collaborative learning methodologies has also contributed to students’ motivation [6,17,18].
Furthermore, gamification and video games in education also enable learning gains for students in terms of knowledge, skills and attitudes, for different education system stages, including primary education [19,20,21], secondary education [22,23,24] and higher education [16,25,26]. At the same time, the implementation of collaborative learning methodologies also contributes to student learning [17,27,28,29].
This motivated us to create a methodology called “collaborative learning with video games”, which brings together, in a single methodology, the advantages and criteria of implementing video games and collaborative learning in education. In that sense, it can be understood as the implementation of educational activities in which students have to work together, sharing responsibilities to achieve a goal (for instance, to do a task, to do a project, to complete a chart, to create a digital presentation, to write an essay, etc.), while discussing different perspectives and contributing with their ideas. The main resource of this activity is a video game [30]. It is important to highlight that collaborative learning between students can happen inside the video game, outside the game, or in both spaces (inside and outside the video game) depending on the type of educational strategy or activity the teacher chooses to implement. Educational experiences using this methodology are found in the literature [31,32,33,34]. Martín [35] provided further examples.
Considering these methodologies, students feel more motivated to face the educational process and, in turn, obtain learning gains. This poses new questions: are teachers motivated to implement these new technologies in their educational practices? What are teachers’ attitudes, opinions and perspectives about them? If teachers are not motivated to implement them, it will be difficult for students to feel motivated when these resources are presented as part of the learning process. In fact, as Tejedor and García-Valcárcel [3] said, one of the biggest factors influencing the integration of any pedagogical innovation, new methodology or new technological resource into educational practice is the attitude of the teachers affected. Therefore, it is important to understand teachers’ viewpoints on these issues.
In this regard, several studies address in-service and pre-service teachers’ attitudes towards video games in education [1,2,36,37,38] and towards gamification in educational settings [39,40,41,42]. In general, this research shows that teachers’ attitudes towards these approaches are positive.
This article focuses on collaborative learning with video games; in particular, it reveals the creation and subsequent validation of a Likert-type attitude scale towards collaborative learning with video games, aimed at in-service primary education teachers.

3. Materials and Methods

The creation of an attitude scale requires a rigorous process in order to obtain an appropriate and validated instrument to measure the target attitude. At the same time, it is important to highlight that there are different types of attitude scales. In our specific case, we selected a Likert-type attitude scale. Likert-type attitude scales use a construction method that adapts to the measurement of different types of attitudes [43]. We took the ideas of multiple authors into account in creating this instrument [44,45].
Morales, Urosa and Blanco [44] synthesized the creation process for this type of scale into specific stages, with the definition of the specific attitude to be measured as a key first step. The next steps are: (1) the preparation of the instrument, through the writing of several items and the preparation of additional information; (2) the collection of data from an adequate sample; and (3) the item analysis, the calculation of reliability, the analysis of the scale content structure and the selection of the definitive items.
It is fundamental to define the attitude to be measured, which, in our case, is teachers’ attitudes towards collaborative learning with video games. This can be defined as the relatively stable predisposition of teachers to respond favorably or unfavorably to the implementation of educational activities in which students have to work together, sharing responsibilities to achieve a goal (for instance, to do a task, to do a project, to complete a chart, to create a digital presentation, or to write an essay), while discussing different perspectives and contributing with their ideas, where the main resource of this activity is a video game.
Regarding the writing of the items of the scale, it is necessary to create several sentences. As Morales [46] said, they are usually written in the form of opinions with which the person may or may not agree. In addition, other instruments that measure the same or similar attitudes can be taken into account during elaboration. For that reason, we took into account the measurement instrument called “Semantic differential: learning through collaborative projects with Information and Communication Technologies (ICT)” [47] and the questionnaire “Opinion about collaborative learning methodology” [17]. A total of 75 preliminary items were prepared at this stage. As Morales, Urosa and Blanco [44] pointed out, provisional items should be reviewed by more than one person, allowing for modification or elimination as needed. In our case, they were reviewed by an in-service primary education teacher (that is to say, a professional with a status similar to the final recipients of the instrument), by an expert in written comprehension and composition, and by an expert in educational technology. This process gave rise to 64 items. The following response mode was established: (1) strongly disagree (SD), (2) disagree (D), (3) indifferent (I), (4) agree (A), and (5) strongly agree (SA). As Morales, Urosa and Blanco [44] also pointed out, additional information must be prepared, so 20 further questions were incorporated into the complete questionnaire to gather, for instance, age, gender, courses and disciplines taught that year, and university degrees completed.
Once we had a sufficient number of items and a response mode, it was necessary to validate the preliminary instrument through a methodologically appropriate procedure. Specifically, the instrument developed up to this point had to undergo a content validation process by expert judgment. As Cabero and Barroso [48] said, the use of experts as a strategy to assess teaching materials, data collection instruments (as in our case) or the methodologies used in the educational process is quite common in educational research. Six experts took part in our study. Since several were experts in more than one field, it is worth noting that four were primary school teachers, five were experts in the implementation of information and communication technologies (ICT) or video games in educational settings, two were experts in collaborative learning, and two were experts in the construction and validation of measuring instruments and research methods. Drawing on their own opinions and knowledge, the experts rated the validity of each item from 1 (very bad) to 5 (very good) in relation to the objective of the scale and the specific attitude to be measured. They could also submit other suggestions, about the instrument in general and about specific items, in a space provided in the validation questionnaire. Regarding the selection criteria at this stage, that is, which preliminary items were kept in the instrument and which were not, we took into account the percentage of experts who considered an item good or very good (the criterion being above 60% of experts), the mean rating obtained (the criterion being a mean equal to or greater than 4), and the specific comments and suggestions provided by the experts.
In addition, when an item did not meet these criteria, we took its utility for the scale into account and could still keep it for the next step, in which the items were used with a sample. As a result of this process, we kept 57 of the initial 64 items, namely those applicable to in-service primary school teachers.
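The two numeric criteria above (more than 60% of experts rating an item 4 or 5, and a mean rating of at least 4) can be sketched as a small check. The function name and the example ratings below are hypothetical, and the experts’ qualitative comments, the third criterion, are not modeled:

```python
def keep_item(ratings):
    """Apply the two numeric expert-judgment criteria described above.

    ratings: one score per expert, from 1 (very bad) to 5 (very good).
    An item passes when more than 60% of experts rated it good or very
    good (4 or 5) AND the mean rating is 4 or higher.
    """
    good_share = sum(1 for r in ratings if r >= 4) / len(ratings)
    mean = sum(ratings) / len(ratings)
    return good_share > 0.60 and mean >= 4.0

# Hypothetical ratings from six experts for two preliminary items:
print(keep_item([5, 4, 4, 5, 3, 4]))  # True: 5/6 rated it well, mean ≈ 4.17
print(keep_item([3, 4, 2, 4, 3, 3]))  # False: only 2/6 rated it well
```

Items failing these checks could still be retained on conceptual grounds, as noted above, so the check is an aid rather than an automatic filter.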
We then moved to the next step of creating the scale: obtaining data from an appropriate sample. Specifically, the sample included 223 Spanish in-service primary school teachers. With these data, we carried out the item analysis, the reliability calculation, the analysis of the scale content structure and the final selection of the items, using the statistical software SPSS 22 (IBM Corporation, Armonk, NY, USA). These aspects are presented in the following section.

4. Data Analysis and Results

Firstly, we present the results that characterize the sample. The total sample comprised 223 Spanish in-service primary school teachers, of whom 113 were men (50.7%) and 110 were women (49.3%). The respondents were 21–60 years old; the statistical mode was 38 years old (5.4% of the sample) and the mean age was 36.09 years. With regard to the levels they teach (considering that primary education in Spain consists of six years, for pupils from 6 to 12 years old), 79 teachers taught the first year, 71 the second, 78 the third, 81 the fourth, 87 the fifth, and 87 the sixth. It is important to highlight that in Spain the same teacher can work across different age levels depending on the discipline. With regard to their university-level education, beyond the undergraduate degree required to be a primary school teacher, 17 teachers (7.6%) were enrolled in a master’s degree and 36 teachers (16.1%) held a master’s degree at the time of answering the questionnaire. Furthermore, nine teachers (4%) were enrolled in a PhD, and five teachers (2.24%) had finished their doctoral studies.
Once we obtained data from a sample, the next step was to analyze the items and verify the reliability of the scale. We had to check whether each item in the initial version measured the same attitude as the other items. This is fundamental in order to know whether each item score can be summed into a total score that measures the attitude under study, given that the total score of each person is what will later be interpreted. This check was done through item analysis, using the item–total correlation procedure: properly speaking, the correlation of each item with the sum of all the others, or the correlation of each item with the total minus the item, i.e., the corrected item–total correlation [44]. We wanted to check whether scoring high on an item does, in fact, go along with a high total score on the rest of the scale. When selecting items, those whose scores correlate most highly with the sum of all the others have the most in common with them, and we can assume they measure the same thing as the rest; items that show non-significant or very low correlations with the rest must be eliminated from the scale [44]. The process should not be automatic, however; the researchers’ ideas about what they are trying to measure also need to be considered, so conceptual criteria must be taken into account as well. Appendix A shows the item–total statistics for the 57-item version of the scale, and Appendix B shows the item–total statistics for the final version of 33 items; that is, the definitive 33 items that form the scale (the complete items, translated from Spanish, can be seen in Appendix C).
It is important to highlight that definitive items are numbered in the appendices and in tables according to their sequence in the final instrument and in Appendix C (from ‘item 1’ to ‘item 33’), while the eliminated items have been named through letters (from ‘eliminated item A’ to ‘eliminated item X’).
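As an illustration of the procedure just described, the corrected item–total correlation and Cronbach’s alpha can be sketched as follows. The function names and the toy data are our own; the actual analysis was run in SPSS:

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def corrected_item_total(scores):
    """Corrected item-total correlations: each item against the sum of
    all OTHER items (the total minus the item), as described above.
    scores: one list of item responses per respondent."""
    n_items = len(scores[0])
    result = []
    for j in range(n_items):
        item = [row[j] for row in scores]
        rest = [sum(row) - row[j] for row in scores]
        result.append(pearson(item, rest))
    return result

def cronbach_alpha(scores):
    """Cronbach's alpha: (k/(k-1)) * (1 - sum of item variances /
    variance of total scores), using sample variance (ddof = 1)."""
    k = len(scores[0])
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = sum(var([row[j] for row in scores]) for j in range(k))
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - item_vars / total_var)
```

For perfectly parallel items (every respondent gives the same answer to both), the corrected item–total correlations are 1 and alpha is 1, its theoretical maximum.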
Considering the above, this gives rise to an attitude scale of 33 items, with a reliability of α = 0.947. Furthermore, we carried out a factor analysis to check the construct validity of the final 33-item version. As Morales, Urosa and Blanco [44] pointed out, factor analysis with rotated factors allows us to appreciate whether we are measuring what we say we measure, by clarifying the constructs that underlie several variables, which items define each factor and how these factors relate to each other, helping us to clarify the structure of the instrument and the construct. Table 1 shows data about the factors extracted in the analysis, and Table 2 shows the rotated component matrix for the 33-item final version of the scale.
As we can see in Table 1, six factors were extracted that together explain 60.882% of the total variance; to determine the number of factors to extract, components with eigenvalues greater than 1 are retained [49]. This also fulfils the criterion indicated by Nunnally [43], that factors in which no variable has a loading above 0.30 must be eliminated: as can be appreciated in Table 2, every factor has at least one variable with a loading greater than this. In addition, only factors defined by at least three items should be retained [44], which is the case for all six of our factors (as can be seen in the columns for each component in Table 2).
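The retention rule just described (keep components with eigenvalues greater than 1, each explaining eigenvalue/k of the total variance for k standardized items) can be sketched as follows. The first six eigenvalues are reconstructed from the variance percentages reported here (share × 33 items); the tail values are placeholders, not the actual SPSS output:

```python
def kaiser_retained(eigenvalues):
    """Kaiser criterion: keep components whose eigenvalue exceeds 1.

    For k standardized items, eigenvalue / k is the share of total
    variance a component explains, since every item contributes
    variance 1 to the correlation matrix."""
    k = len(eigenvalues)
    retained = [v for v in eigenvalues if v > 1]
    return retained, [v / k for v in retained]

# First six eigenvalues reconstructed from the reported variance shares
# (e.g., 38.9% of 33 items ≈ 12.84); the remaining 27 are placeholders.
eigs = [12.837, 2.115, 1.866, 1.129, 1.101, 1.043] + [0.478] * 27
retained, shares = kaiser_retained(eigs)
print(len(retained))                 # 6 factors retained
print(round(100 * sum(shares), 1))   # ≈ 60.9% of total variance explained
```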
In order to interpret the factor structure, we examined the loadings (saturations) that the items of the scale obtained on each factor [49], according to the results in Table 2. We attended mainly to those items with the largest loadings [44] and, where an item saturated on more than one factor, we placed it with the factor on which it saturated most.
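This placement rule, assigning each cross-loading item to the factor on which it saturates most, can be sketched as follows (the function name and the loading values are illustrative, not the actual rotated matrix from Table 2):

```python
def assign_items(loadings):
    """Assign each item to the factor with its largest absolute loading,
    the rule used when an item saturates on more than one factor.

    loadings: one row per item, one value per rotated factor.
    """
    return [max(range(len(row)), key=lambda f: abs(row[f])) for row in loadings]

# Three hypothetical items: the third loads on both factors but is
# placed with the second (index 1), where it saturates most.
print(assign_items([[0.81, 0.12],
                    [0.22, 0.74],
                    [0.41, 0.52]]))  # [0, 1, 1]
```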
As we can see in Table 2, the first factor comprises items 4, 6, 10, 12, 14, 17, 18, 19, 20, 22, 23, 24 and 28, and explains 38.9% of the variance (third column of Table 1). We named it “educational possibilities”, because its items highlight the educational possibilities of collaborative learning with video games: for instance, greater interaction between the teacher and the students, greater student autonomy in learning, the development of students’ capacity for initiative and the possibility to explore ideas and concepts more fully.
The second factor comprises items 2, 21, 25, 27, 31, 32 and 33, and explains 6.41% of the variance. We called it “positive disposition to implement activities”, as it incorporates items whose formulations show interest, inclination or attraction towards planning collaborative learning activities with video games, for example, interest in collaborating with other teachers who implement these kinds of activities or in working in a school where this methodology is supported.
The third factor comprises items 1, 26, 29 and 30, and explains 5.656% of the variance. We called it “denial as educational methodology” because its items relate to the rejection of collaborative learning with video games as a methodology to be applied in educational practice, describing its implementation as impossible and inappropriate. All the items in this factor are negatively worded, so their scores were reversed in the analysis.
The fourth factor comprises items 5, 8 and 13, and explains 3.421% of the variance. We named it “concerns about neglecting learning”, as it incorporates items related to teachers’ concerns that this kind of methodology leads students to neglect, or not give the required importance to, their learning, such as taking learning lightly or not putting effort into educational tasks. As in the previous factor, all the items are negatively worded and their scores were reversed in the analysis.
The fifth factor comprises items 9, 11 and 16, and explains 3.335% of the variance. We named it “useful and inclusive learning strategy”, as it incorporates formulations related to the idea of collaborative learning with video games as a learning strategy that allows the inclusion of all students and the learning of matters relevant to their lives in the complex and diverse world in which we live.
Finally, the sixth factor comprises items 3, 7 and 15, and explains 3.161% of the variance. We called it “teacher denial due to loss of time”, as it incorporates formulations in which the teacher rejects this approach, considering it a waste of class time and personal time. The three items are negatively worded and their scores were also reversed in the analysis.
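Reversing the score of a negatively worded item on a 5-point Likert scale, as done for the items of factors 3, 4 and 6, is a simple transformation; a minimal sketch with a hypothetical function name:

```python
def reverse_score(response, low=1, high=5):
    """Reverse a Likert response so that negatively worded items point
    the same way as positive ones: on a 1-5 scale, 1 <-> 5, 2 <-> 4,
    and the midpoint 3 stays 3."""
    return high + low - response

print([reverse_score(r) for r in [1, 2, 3, 4, 5]])  # [5, 4, 3, 2, 1]
```

After this transformation, a high score on any item, positive or negative, indicates a favorable attitude, so item scores can be summed into the total.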
The item analysis shown and the factor analysis carried out led us to confirm the selection of the 33 items for the final version of the scale. The scale has a reliability of α = 0.947, with the reliability of each factor as follows:
  • Factor 1 “educational possibilities”: α = 0.921.
  • Factor 2 “positive disposition to implement activities”: α = 0.876.
  • Factor 3 “denial as educational methodology”: α = 0.762.
  • Factor 4 “concerns about neglecting the learning”: α = 0.814.
  • Factor 5 “useful and inclusive learning strategy”: α = 0.662.
  • Factor 6 “teacher denial due to loss of time”: α = 0.696.
Finally, it should also be noted that, although we tried to have the same number of items in the affective, cognitive and behavioural fields, the scale has the following final structure:
  • Twelve items related to the affective field (items 2, 3, 5, 7, 8, 13, 21, 25, 26, 27, 29 and 33): α = 0.873.
  • Thirteen items related to the cognitive field (items 1, 4, 6, 9, 10, 12, 14, 17, 18, 20, 22, 24 and 28): α = 0.904.
  • Eight items related to the behavioural field (items 11, 15, 16, 19, 23, 30, 31 and 32): α = 0.832.

5. Conclusions

Video games, gamification and collaborative learning can be implemented in education because they generate student motivation and contribute to student learning at different stages of the education system, as we saw in the theoretical framework of this article. However, the motivation and learning that students gain when teachers implement these resources can also be influenced by the teachers’ attitudes towards them. In fact, a teacher’s attitude towards a new resource or methodology is one of the main factors determining its implementation in educational practice. It therefore becomes relevant to know teachers’ attitudes towards new methodologies and resources; in our case, towards collaborative learning with video games, a methodology that brings together the advantages and criteria of implementing video games and collaborative learning in education. Knowing and analyzing these attitudes requires validated and reliable instruments, built through a rigorous construction process. In this text, we showed the creation of a scale of attitudes towards collaborative learning with video games aimed at in-service primary school teachers, developed in a rigorous way that enables us to know teachers’ attitudes towards this specific methodology. We believe that the availability of this instrument will help other researchers interested in the area to study this variable, as well as to develop other measurement instruments related to video games, gamification and collaborative learning.

Author Contributions

Conceptualization, M.M.-d.-P.; methodology, M.M.-d.-P., A.G.-V.M.-R. and A.H.M.; formal analysis, M.M.-d.-P. and A.G.-V.M.-R.; investigation, M.M.-d.-P., A.G.-V.M.-R. and A.H.M.; writing—original draft preparation, M.M.-d.-P.; writing—review and editing, M.M.-d.-P., A.G.-V.M.-R. and A.H.M.; funding acquisition, M.M.-d.-P.

Funding

For the first author, this research was made possible by an FPU predoctoral grant (FPU13/02194) from the Ministry of Education, Culture and Sport of Spain.

Acknowledgments

We would like to thank the experts who participated in the revision of the instrument and the in-service primary school teachers who answered the initial version of the attitudes scale.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

Appendix A

Table A1. Item–total statistics of the items in the 57-item scale version.
Items 3 | Scale Mean If Item Deleted | Scale Variance If Item Deleted | Corrected Item–Total Correlation | Cronbach’s Alpha If Item Deleted
Eliminated item A | 220.63 | 662.559 | 0.602 | 0.958
Item 2 | 220.62 | 665.282 | 0.551 | 0.958
Eliminated item B | 220.87 | 668.153 | 0.483 | 0.958
Eliminated item C | 220.62 | 672.471 | 0.386 | 0.959
Eliminated item D | 221.07 | 664.586 | 0.448 | 0.958
Item 3 | 220.56 | 664.834 | 0.664 | 0.958
Eliminated item E | 220.41 | 671.550 | 0.437 | 0.958
Eliminated item F | 220.96 | 661.998 | 0.529 | 0.958
Eliminated item G | 220.65 | 669.445 | 0.490 | 0.958
Item 16 | 220.64 | 667.935 | 0.582 | 0.958
Item 9 | 220.61 | 669.943 | 0.442 | 0.958
Eliminated item H | 220.93 | 665.716 | 0.456 | 0.958
Eliminated item I | 221.23 | 665.819 | 0.592 | 0.958
Eliminated item J | 221.12 | 669.692 | 0.423 | 0.958
Eliminated item K | 221.36 | 665.303 | 0.518 | 0.958
Item 14 | 221.05 | 664.462 | 0.577 | 0.958
Item 7 | 220.55 | 665.627 | 0.561 | 0.958
Eliminated item L | 220.50 | 671.828 | 0.507 | 0.958
Eliminated item M | 220.79 | 667.020 | 0.475 | 0.958
Item 17 | 221.23 | 663.963 | 0.611 | 0.958
Item 15 | 220.53 | 672.845 | 0.501 | 0.958
Item 12 | 221.16 | 662.172 | 0.575 | 0.958
Eliminated item N | 220.65 | 672.014 | 0.433 | 0.958
Item 11 | 220.79 | 669.255 | 0.524 | 0.958
Item 10 | 221.18 | 666.826 | 0.479 | 0.958
Item 13 | 221.61 | 660.157 | 0.485 | 0.958
Item 19 | 220.91 | 667.343 | 0.554 | 0.958
Item 5 | 221.61 | 654.418 | 0.575 | 0.958
Item 23 | 220.79 | 666.146 | 0.593 | 0.958
Item 20 | 220.95 | 665.993 | 0.575 | 0.958
Item 18 | 220.99 | 664.225 | 0.629 | 0.958
Item 22 | 221.14 | 661.394 | 0.615 | 0.958
Item 27 | 220.74 | 660.407 | 0.705 | 0.957
Item 21 | 220.69 | 662.710 | 0.671 | 0.958
Item 6 | 220.93 | 664.099 | 0.639 | 0.958
Item 32 | 220.70 | 670.357 | 0.546 | 0.958
Eliminated item O | 221.51 | 682.638 | 0.082 | 0.961
Item 24 | 220.86 | 663.345 | 0.657 | 0.958
Item 4 | 221.13 | 662.594 | 0.566 | 0.958
Item 28 | 220.95 | 661.240 | 0.669 | 0.958
Eliminated item P | 221.62 | 662.156 | 0.461 | 0.958
Eliminated item Q | 221.37 | 663.378 | 0.541 | 0.958
Eliminated item R | 221.19 | 660.165 | 0.615 | 0.958
Item 8 | 221.37 | 652.107 | 0.646 | 0.958
Eliminated item S | 221.32 | 666.831 | 0.421 | 0.959
Eliminated item T | 221.23 | 666.808 | 0.447 | 0.958
Eliminated item U | 220.68 | 671.652 | 0.439 | 0.958
Item 26 | 220.88 | 662.683 | 0.560 | 0.958
Item 25 | 220.75 | 662.943 | 0.581 | 0.958
Item 29 | 220.95 | 662.006 | 0.568 | 0.958
Eliminated item V | 220.70 | 664.472 | 0.593 | 0.958
Item 30 | 220.71 | 662.721 | 0.604 | 0.958
Item 31 | 220.88 | 663.152 | 0.630 | 0.958
Eliminated item W | 220.69 | 669.108 | 0.488 | 0.958
Item 1 | 220.85 | 664.544 | 0.511 | 0.958
Item 33 | 220.68 | 666.913 | 0.565 | 0.958
Eliminated item X | 220.65 | 663.824 | 0.585 | 0.958
3 Cronbach’s Alpha with 57 items: 0.959.

Appendix B

Table A2. Item–total statistics of the items in the final version of 33 items.
Items 4 | Scale Mean If Item Deleted | Scale Variance If Item Deleted | Corrected Item–Total Correlation | Cronbach’s Alpha If Item Deleted
Item 1 | 126.69 | 256.478 | 0.475 | 0.947
Item 2 | 126.46 | 256.384 | 0.536 | 0.946
Item 3 | 126.39 | 257.132 | 0.602 | 0.946
Item 4 | 126.97 | 253.008 | 0.614 | 0.945
Item 5 | 127.45 | 250.159 | 0.545 | 0.946
Item 6 | 126.77 | 254.438 | 0.677 | 0.945
Item 7 | 126.39 | 256.724 | 0.541 | 0.946
Item 8 | 127.21 | 249.408 | 0.596 | 0.946
Item 9 | 126.44 | 258.554 | 0.454 | 0.947
Item 10 | 127.02 | 255.779 | 0.521 | 0.946
Item 11 | 126.63 | 257.846 | 0.553 | 0.946
Item 12 | 127.00 | 253.189 | 0.606 | 0.945
Item 13 | 127.45 | 254.627 | 0.427 | 0.948
Item 14 | 126.89 | 255.253 | 0.587 | 0.946
Item 15 | 126.37 | 261.685 | 0.453 | 0.947
Item 16 | 126.48 | 258.088 | 0.564 | 0.946
Item 17 | 127.07 | 254.887 | 0.624 | 0.945
Item 18 | 126.83 | 254.574 | 0.664 | 0.945
Item 19 | 126.75 | 256.054 | 0.608 | 0.945
Item 20 | 126.79 | 255.798 | 0.604 | 0.945
Item 21 | 126.53 | 253.989 | 0.692 | 0.945
Item 22 | 126.98 | 252.099 | 0.672 | 0.945
Item 23 | 126.63 | 256.179 | 0.611 | 0.945
Item 24 | 126.70 | 254.671 | 0.665 | 0.945
Item 25 | 126.59 | 254.450 | 0.585 | 0.946
Item 26 | 126.72 | 256.447 | 0.484 | 0.947
Item 27 | 126.58 | 251.875 | 0.753 | 0.944
Item 28 | 126.78 | 252.557 | 0.710 | 0.945
Item 29 | 126.78 | 254.197 | 0.559 | 0.946
Item 30 | 126.55 | 254.483 | 0.602 | 0.945
Item 31 | 126.72 | 254.600 | 0.635 | 0.945
Item 32 | 126.53 | 258.484 | 0.581 | 0.946
Item 33 | 126.52 | 257.323 | 0.553 | 0.946
4 Cronbach’s Alpha with 33 items: 0.947.

Appendix C

Items of “collaborative learning with video games attitudes scale” for in-service primary school teachers (translated from Spanish).
  • Implementing video games for collaborative learning in educational practices is impossible.
  • I would like to implement collaborative learning activities with video games in educational practices.
  • If I implemented collaborative learning activities with video games in educational practices, I would feel that I am wasting class time.
  • Collaborative learning with video games allows for greater interaction between the teacher and his/her students.
  • I worry that collaborative learning activities with video games encourage students not to put effort into educational tasks and activities.
  • Collaborative learning with video games allows students to jointly build knowledge about curricular content.
  • I think receiving training in collaborative learning with video games is a waste of time.
  • I worry that collaborative learning with video games is a distraction from the course syllabus.
  • Collaborative learning with video games is a good strategy for the inclusion of students with special education needs.
  • When working with video games in groups, students would pay attention to the opinions of other students.
  • I would implement collaborative learning activities with video games to help students learn to share responsibilities.
  • By working collaboratively with video games in educational practices, students would relate to each other more easily.
  • I worry that collaborative learning with video games encourages students to take learning lightly.
  • Collaborative learning with video games allows students to learn to work autonomously.
  • If my students asked me to carry out collaborative learning activities with video games in educational practices, I would refuse.
  • I would implement collaborative learning activities with video games to help students develop useful life skills.
  • Collaborative learning activities with video games help to explore ideas and concepts more fully.
  • Students have greater autonomy in their learning when they take part in collaborative learning activities with video games.
  • I would implement collaborative learning activities with video games to increase the students’ self-esteem.
  • Students would put more effort to share knowledge among them if they worked collaboratively with video games.
  • I would like to encourage the curiosity of students through collaborative learning with video games.
  • Video games facilitate the implementation of collaborative activities with students.
  • I would implement collaborative learning activities with video games to develop the students’ capacity for initiative.
  • When working collaboratively with video games, the explanations that group members give one another facilitate the understanding of the concepts.
  • I would like to work in a school where the implementation of collaborative learning activities with video games with students was supported.
  • I would be overwhelmed if I had to implement collaborative learning activities with video games with students.
  • I would like to develop the students’ creativity through collaborative learning with video games.
  • When working collaboratively with video games in educational practices, the interaction generated with classmates increases the level of student learning.
  • I do not believe that collaborative learning with video games is an appropriate classroom methodology that improves education.
  • If I had to implement new activities in educational practices, they would never be collaborative learning activities with video games.
  • If there were sufficient resources within the school, I would frequently implement collaborative learning activities with video games.
  • I would implement collaborative learning activities with video games to help students learn the course syllabus.
  • I would like to collaborate with other teachers who implement collaborative learning activities with video games in their educational practices.

Table 1. Data about the factors extracted in the analysis (in the final version with 33 items).
Total Variance Explained 1
| Component | Initial: Total | Initial: % of Variance | Initial: Cumulative % | Extraction: Total | Extraction: % of Variance | Extraction: Cumulative % | Rotation: Total | Rotation: % of Variance | Rotation: Cumulative % |
|---|---|---|---|---|---|---|---|---|---|
| 1 | 12.837 | 38.900 | 38.900 | 12.837 | 38.900 | 38.900 | 6.402 | 19.400 | 19.400 |
| 2 | 2.115 | 6.410 | 45.310 | 2.115 | 6.410 | 45.310 | 4.360 | 13.211 | 32.612 |
| 3 | 1.866 | 5.656 | 50.966 | 1.866 | 5.656 | 50.966 | 2.742 | 8.308 | 40.920 |
| 4 | 1.129 | 3.421 | 54.386 | 1.129 | 3.421 | 54.386 | 2.415 | 7.319 | 48.239 |
| 5 | 1.100 | 3.335 | 57.721 | 1.100 | 3.335 | 57.721 | 2.106 | 6.383 | 54.621 |
| 6 | 1.043 | 3.161 | 60.882 | 1.043 | 3.161 | 60.882 | 2.066 | 6.261 | 60.882 |
| 7 | 0.963 | 2.919 | 63.802 | | | | | | |
| 8 | 0.940 | 2.849 | 66.651 | | | | | | |
| 9 | 0.783 | 2.373 | 69.024 | | | | | | |
| 10 | 0.754 | 2.284 | 71.308 | | | | | | |
| 11 | 0.710 | 2.150 | 73.458 | | | | | | |
| 12 | 0.689 | 2.088 | 75.546 | | | | | | |
| 13 | 0.639 | 1.937 | 77.483 | | | | | | |
| 14 | 0.613 | 1.859 | 79.342 | | | | | | |
| 15 | 0.577 | 1.749 | 81.091 | | | | | | |
| 16 | 0.546 | 1.656 | 82.746 | | | | | | |
| 17 | 0.501 | 1.518 | 84.265 | | | | | | |
| 18 | 0.466 | 1.412 | 85.677 | | | | | | |
| 19 | 0.457 | 1.385 | 87.062 | | | | | | |
| 20 | 0.442 | 1.340 | 88.402 | | | | | | |
| 21 | 0.416 | 1.262 | 89.664 | | | | | | |
| 22 | 0.399 | 1.210 | 90.874 | | | | | | |
| 23 | 0.390 | 1.180 | 92.054 | | | | | | |
| 24 | 0.348 | 1.053 | 93.107 | | | | | | |
| 25 | 0.329 | 0.997 | 94.104 | | | | | | |
| 26 | 0.307 | 0.929 | 95.033 | | | | | | |
| 27 | 0.282 | 0.853 | 95.886 | | | | | | |
| 28 | 0.278 | 0.842 | 96.728 | | | | | | |
| 29 | 0.258 | 0.782 | 97.509 | | | | | | |
| 30 | 0.235 | 0.711 | 98.221 | | | | | | |
| 31 | 0.229 | 0.694 | 98.915 | | | | | | |
| 32 | 0.211 | 0.640 | 99.555 | | | | | | |
| 33 | 0.147 | 0.445 | 100.000 | | | | | | |

Initial = initial eigenvalues; Extraction = extraction sums of squared loadings; Rotation = rotation sums of squared loadings.
1 Extraction method: principal component analysis.
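The extraction summarized in Table 1 (components retained when the eigenvalue of the inter-item correlation matrix exceeds 1, i.e., the Kaiser criterion) can be sketched in Python. The response matrix below is synthetic and purely illustrative; it stands in for the study's 33-item survey data, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for the survey responses: 200 respondents x 33 items.
X = rng.normal(size=(200, 33))

R = np.corrcoef(X, rowvar=False)         # 33 x 33 inter-item correlation matrix
eigvals = np.linalg.eigvalsh(R)[::-1]    # eigenvalues, sorted descending

pct = 100 * eigvals / eigvals.sum()      # % of variance explained per component
cum = np.cumsum(pct)                     # cumulative % of variance explained
n_components = int((eigvals > 1).sum())  # Kaiser criterion: eigenvalue > 1
```

Because the eigenvalues of a correlation matrix sum to the number of items, each component's share of variance is its eigenvalue divided by 33; in Table 1 the six components retained this way jointly explain 60.882% of the variance.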
Table 2. Rotated component matrix for the final version of the instrument with 33 items.
Rotated Component Matrix 2
| Item | 1 | 2 | 3 | 4 | 5 | 6 | Other |
|---|---|---|---|---|---|---|---|
| Item 4 | 0.730 | | | | | | |
| Item 22 | 0.702 | | | | | | |
| Item 12 | 0.690 | | | | | | 0.305 |
| Item 18 | 0.685 | 0.311 | | | | | |
| Item 28 | 0.680 | | | | | | |
| Item 6 | 0.663 | | | | | | |
| Item 10 | 0.611 | | | | | | 0.309 |
| Item 19 | 0.604 | | | | | | 0.333 |
| Item 17 | 0.593 | | | | | | |
| Item 20 | 0.545 | | | | | | |
| Item 14 | 0.535 | | | | | | 0.440 |
| Item 24 | 0.529 | 0.369 | | | | | |
| Item 23 | 0.528 | 0.385 | | | | | |
| Item 33 | | 0.727 | | | | | |
| Item 25 | | 0.722 | | | | | |
| Item 21 | 0.355 | 0.711 | | | | | |
| Item 27 | 0.491 | 0.616 | | | | | |
| Item 2 | | 0.571 | | | | | |
| Item 31 | | 0.551 | | | | | |
| Item 32 | 0.313 | 0.501 | 0.316 | | | | 0.357 |
| Item 1 | | | 0.690 | | | | |
| Item 29 | | | 0.640 | | | | |
| Item 30 | | 0.320 | 0.616 | | | | 0.311 |
| Item 26 | | 0.391 | 0.508 | | | | |
| Item 13 | | | | 0.866 | | | |
| Item 5 | 0.314 | | | 0.784 | | | |
| Item 8 | | | 0.416 | 0.610 | | | |
| Item 9 | | | | | 0.704 | | |
| Item 16 | | | | | 0.633 | | |
| Item 11 | 0.443 | | | | 0.463 | | |
| Item 15 | | | | | | 0.664 | |
| Item 3 | | | | | | 0.573 | 0.348 |
| Item 7 | | | | | | 0.526 | 0.349 |

Loadings below 0.30 are suppressed; the "Other" column lists additional secondary loadings above that threshold.
2 Extraction method: principal component analysis. Rotation method: varimax with Kaiser normalization.
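The varimax rotation named in the footnote can be reproduced with the classic iterative SVD-based algorithm, sketched below; the loading matrix used here is a small made-up example, not the study's 33-item solution.

```python
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-8):
    """Orthogonally rotate a factor-loading matrix by the varimax criterion.

    Iteratively seeks the rotation that maximizes the variance of the
    squared loadings within each column, driving each item toward a
    single dominant component (the "simple structure" visible in Table 2).
    """
    L = np.asarray(loadings, dtype=float)
    n, k = L.shape
    R = np.eye(k)  # accumulated rotation matrix
    d = 0.0
    for _ in range(max_iter):
        Lr = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (Lr ** 3 - (1.0 / n) * Lr @ np.diag((Lr ** 2).sum(axis=0)))
        )
        R = u @ vt
        d_new = s.sum()
        if d_new < d * (1 + tol):  # criterion stopped improving: converged
            break
        d = d_new
    return L @ R

# Toy unrotated loadings: 4 items on 2 components (illustrative only).
L = np.array([[0.7, 0.5],
              [0.6, 0.6],
              [0.5, -0.6],
              [0.6, -0.5]])
rotated = varimax(L)
```

Because the rotation is orthogonal, each item's communality (its sum of squared loadings across components) is preserved; the variance is merely redistributed among the components, which is why the cumulative 60.882% in Table 1 is identical before and after rotation.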

© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).