Article

Efficient Use of Clickers: A Mixed-Method Inquiry with University Teachers

George Cheung, Kelvin Wan and Kevin Chan
Department of Applied Social Sciences, The Hong Kong Polytechnic University, Hung Hom, Kowloon, Hong Kong, China
* Author to whom correspondence should be addressed.
Educ. Sci. 2018, 8(1), 31; https://doi.org/10.3390/educsci8010031
Submission received: 12 December 2017 / Revised: 9 February 2018 / Accepted: 14 February 2018 / Published: 1 March 2018
(This article belongs to the Special Issue Collaborative Learning with Technology—Frontiers and Evidence)

Abstract
With the advancement of information technology and policies encouraging interactivity in teaching and learning, the use of the student response system (SRS), commonly known as clickers, has grown substantially in recent years. The reported effectiveness of SRS, however, has varied. Based on the framework of technological pedagogical content knowledge (TPACK), the current study explored this disparity in the efficiency of adopting SRS. A concurrent mixed-method design was adopted to delineate factors conducive to efficient adoption of SRS through closed-ended survey responses and qualitative data. Participants were purposefully sampled from diverse academic disciplines and backgrounds. Seventeen teachers from various disciplines (i.e., tourism management, business, health sciences, applied sciences, engineering, and social sciences) at The Hong Kong Polytechnic University took part in teacher focus groups for the current study. In the facilitated focus groups on issues relating to the efficient use of clickers, participants explored questions on teachers' knowledge of various technologies, knowledge of their subject matter, methods and processes of teaching, and how to integrate all of this knowledge into their teaching. The TPACK model was adopted to guide the discussions. Emergent themes were extracted from the discussions using NVivo 10 for Windows and categorized according to the TPACK framework. The survey, implemented on an online platform, solicited participants' self-reports of teachers' knowledge and technology acceptance. The closed-ended survey comprised 30 items based on the Technological Pedagogical Content Knowledge (TPACK) framework and 20 items based on the Unified Theory of Acceptance and Use of Technology (UTAUT). Participating teachers concurred that the use of clickers is instrumental in engaging students in learning and in formatively assessing students' progress. Converging with the survey results, several major themes contributing to the successful implementation of clickers, namely technological, technological-pedagogical, technological-content, and technological-pedagogical-content knowledge, were identified from the teacher focus groups. The most and second most frequently cited themes were technological-pedagogical-content knowledge and technological knowledge, respectively. Findings from the current study triangulated with previous findings on TPACK and the use of clickers, particularly the influence of technological-pedagogical-content knowledge and technological knowledge on the successful integration of innovations in class. Furthermore, the current study highlighted the impact of technological-pedagogical and technological-content knowledge, calling for further research to unfold technology adoption with these TPACK configurations and for support to frontline academics in integrating technology and pedagogy.

1. Background

Advancement of information technology in recent decades has opened new arrays of pedagogical strategies for incorporating technology. One of the most widely adopted technologies in education is the student response system (SRS). Also known as a learner response system or "clickers", the SRS is a medium that allows students in class to respond to teachers' questions instantly with simple portable keypads or smart devices. Previous research revealed that SRS is effective in enhancing students' engagement and facilitating interactions between teachers and students, particularly in large classes [1,2,3,4]. Still, as teachers have a certain degree of autonomy in choosing technology that matches their pedagogical needs, not all of them choose to adopt SRS in their classes. The current study explored the underlying determinants of using SRS for collaborative learning, in particular the influence of teachers' knowledge on their beliefs, with a mixed-method approach.
Past studies on the use of SRS for collaborative learning have demonstrated its associations with effective classroom engagement of students and interactive formative assessment [5,6,7]. With regard to teaching, SRS helps to promote contingent teaching [8], collect instant feedback from students [9], actualize just-in-time teaching [10], improve class engagement [11], and facilitate students' attendance and retention of knowledge [7,12,13].
In understanding the acceptance and adoption of SRS and other emerging technologies, beliefs are a crucial and recurrent theme [14,15,16,17] in predicting the behavioral intention to use an innovation: perceived consequences of an innovation tend to shape patterns of its use. The Unified Theory of Acceptance and Use of Technology (UTAUT) [18] offers a parsimonious account of efficient technology deployment, proposing how individual beliefs affect behavioral intention to use technology. Integrating eight models and theories relevant to technology acceptance, namely the Theory of Reasoned Action [19], the Technology Acceptance Model [17], the Motivational Model [20], the Theory of Planned Behavior [21], the Combined Technology Acceptance Model and Theory of Planned Behavior [22], the Model of PC Utilization [23], the Innovation Diffusion Theory [24], and the Social Cognitive Theory [25] (for details of these models and how they were integrated, see Venkatesh et al.'s work on technology adoption [18]), the UTAUT posits four core belief constructs affecting behavioral intention to use technology: (i) performance expectancy; (ii) effort expectancy; (iii) social influence; and (iv) facilitating conditions.
Performance expectancy is the belief concerning the instrumental consequences of using the technology. Effort expectancy describes beliefs about the cognitive burden of learning to use the technology. Social influence refers to beliefs about whether important others endorse using the technology. Facilitating conditions are beliefs about the organizational and technical infrastructure supporting the technology's use. Among these belief factors, performance expectancy and effort expectancy are widely considered the key antecedents of technology adoption. The current study focuses on exploring the determinants of these two factors.
Recent systematic reviews reveal that an abundance of empirical studies have adopted the UTAUT to investigate the impact of beliefs on behavioral intention and technology usage [26,27]. Nevertheless, to contextualize the UTAUT in education settings, more work is needed to examine the determinants of these beliefs. One prominent external variable is knowledge of information technology. A growing body of literature reveals that IT knowledge, computer self-efficacy, and computer anxiety are significant predictors of effort expectancy and performance expectancy [14,28,29,30,31]. People with higher computer self-efficacy tend to be less frustrated by hindrances encountered, more likely to overcome problems, and more inclined to appreciate the usefulness of the technology, which increases the likelihood that they will use it in the future.
Nonetheless, efficient adoption of an innovation in education settings requires an array of proficiencies beyond the mere skill of operating the technology. It also involves the knowledge to use technology in productive and innovative ways, ultimately to deliver the subject content through the technological medium. Mishra and Koehler [32,33] conceptualized teaching in terms of three types of core knowledge: besides knowledge of using technology, successful delivery of knowledge to students is also contingent upon teachers' knowledge of subject content and pedagogy, as well as the interactions among these bodies of knowledge. These interactions between knowledge about technology, content, and pedagogy form the foundation of the Technological Pedagogical Content Knowledge (TPACK) framework. Developed from Shulman's work on Pedagogical Content Knowledge (PCK) [34], TPACK proposes a model of technology integration for effective teaching. TPACK is a generic model and has been widely adopted in various disciplinary and multidisciplinary efforts [35,36]. To date, educators and researchers are still actively developing and refining the framework conceptually, theoretically, and empirically [35,36]. Previous research suggested that knowledge of technology is relevant to effort expectancy and performance expectancy [14,31]. The current study addresses this research gap and extends prior efforts by investigating whether other domains of TPACK, not just technology knowledge, contribute to the determinants of UTAUT. The study is thus a novel and systematic effort to examine whether the various domains of TPACK play a significant role in effort expectancy and performance expectancy.
An intriguing aspect of the TPACK framework is that the nature of teachers' knowledge is context-driven: the types of knowledge teachers apply in their teaching are subtly influenced by the context [37]. Education researchers have categorized contextual factors into three hierarchical levels [38]: the micro, meso, and macro levels refer to classroom, institutional, and societal conditions, respectively, within which teachers construct their teaching knowledge and decide how to integrate technology into their classes. Context is imperative in research on TPACK, yet it is often, if not always, missing from prior studies [38]. Stretching the ecological insight of the study, the current effort addressed the role of context in the qualitative inquiry and probed the complexity of interactions among teachers, students, and the learning environment.

2. Materials and Methods

This study features a concurrent mixed-method design [39] for simultaneous triangulation of quantitative and qualitative data sources, using a closed-ended survey and focus groups to examine factors conducive to efficient adoption of SRS. Combining qualitative and quantitative methods not only allowed us to dissect the multifaceted nature of the problem at hand, but also exploited the strengths of both methods while compensating for the weaknesses inherent in each. The quantitative study allowed exploration of the associative strength of relevant factors, while the qualitative study offered an in-depth understanding of the associations among them.

2.1. Participants

A closed-ended survey was administered to teachers at a university in Hong Kong with a high adoption rate of SRS. To improve the representativeness of the sample, participating teachers were purposefully drawn in consideration of their disciplines, genders, and professional ranks. Fifty-two teachers completed and returned the surveys. All participating teachers with valid survey returns were seasoned SRS users in their own classes, having adopted SRS for more than one year (Table 1).
Regarding the focus group study, 17 of the participating teachers (10 males and 7 females) who completed the closed-ended survey were invited to join the focus group study. The 17 participants were also drawn from diverse academic disciplines and backgrounds (i.e., tourism management, business, health sciences, applied sciences, engineering, and social sciences) to maintain the representativeness of the sample (Table 2).

2.2. Instruments

The closed-ended survey comprised 70 items tapping into various dimensions of the Technological Pedagogical Content Knowledge (TPACK) model and the Unified Theory of Acceptance and Use of Technology (UTAUT). Items for TPACK and UTAUT were adapted from previous studies [18,40]. Responses were recorded on a five-point Likert scale, ranging from 1 (strongly disagree) to 5 (strongly agree).
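As a minimal sketch of how such subscale scores are typically derived (our illustration, not the authors' code; the item column names are hypothetical), each construct can be scored as the mean of its Likert items:

```python
# Minimal sketch: averaging 5-point Likert items (1 = strongly disagree,
# 5 = strongly agree) into construct scores. Item column names are hypothetical.
import pandas as pd

SUBSCALES = {
    "technology_knowledge": [f"tk{i}" for i in range(1, 8)],    # 7 items (Table A1)
    "performance_expectancy": [f"pe{i}" for i in range(1, 5)],  # 4 items
    "effort_expectancy": [f"ee{i}" for i in range(1, 5)],       # 4 items
}

def score_subscales(responses: pd.DataFrame) -> pd.DataFrame:
    """Return one mean score per subscale, kept on the original 1-5 scale."""
    return pd.DataFrame(
        {name: responses[items].mean(axis=1) for name, items in SUBSCALES.items()}
    )
```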
A focus group protocol was developed for the qualitative study. Questions were constructed around themes on teachers’ knowledge and technology adoption for the facilitation of semi-structured discussion in the focus group.

2.3. Procedure

The closed-ended survey was launched on an online survey platform between May and July of 2015. Teachers from seven faculties, namely the Faculty of Applied Science and Textiles, Faculty of Business, Faculty of Construction and Environment, Faculty of Engineering, Faculty of Health and Social Sciences, Faculty of Humanities, and School of Hotel and Tourism Management, participated in the survey. A total of 52 teachers completed the survey.
Among those who completed the survey, 17 were invited to join our focus group study. Five focus groups were conducted between December 2015 and February 2016. A semi-structured questioning method was adopted in the group discussions, allowing both consistency and flexibility in the questions asked across focus groups. The discussions were digitally recorded, and the audio files were transcribed verbatim for further analysis. Each group lasted 60 min and was facilitated by a moderator. All moderators had received training in psychology and research methods and were experienced in conducting focus group research.

2.4. Data Analysis

All data from the closed-ended survey were analyzed with the Statistical Package for the Social Sciences (SPSS), version 22. All transcriptions of the focus group discussions were coded using NVivo 10 for further analysis. All coders were well trained in psychology, education, and qualitative research methods. In the inquiry into how and why teachers' knowledge exerts influence on technology adoption, a grounded approach was employed: themes regarding the various domains of teachers' knowledge and their relevance to effort expectancy and performance expectancy were continuously analyzed and sampled from the transcriptions until no further themes emerged.
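For readers who prefer a scriptable equivalent, the following sketch mirrors the descriptive and correlational analyses reported below (the study itself used SPSS 22; this Python version is our illustration, and the column names are hypothetical):

```python
# Sketch of the quantitative analysis (the study used SPSS 22; this is an
# equivalent computation in Python, not the authors' code).
import pandas as pd
from scipy import stats

def describe_constructs(scores: pd.DataFrame) -> pd.DataFrame:
    """Minimum, maximum, mean, and SD per construct, as reported in Table 3."""
    return scores.agg(["min", "max", "mean", "std"]).T

def pearson(scores: pd.DataFrame, x: str, y: str) -> tuple[float, float]:
    """Two-tailed Pearson correlation between two construct scores (Table 4)."""
    r, p = stats.pearsonr(scores[x], scores[y])
    return r, p

# Hypothetical usage:
# r, p = pearson(scores, "technology_knowledge", "effort_expectancy")
# print(f"r = {r:.3f}, p = {p:.3f}")
```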

3. Results

3.1. Results of the Quantitative Study

The descriptive statistics for the various constructs of UTAUT and TPACK are presented in Table 3. With a minimum possible rating of 1 (strongly disagree) and a maximum of 5 (strongly agree), the mean ratings for the UTAUT dimensions fall between 3.30 and 3.88. The highest mean scores are facilitating condition (M = 3.88, SD = 0.58) and behavioral intention (M = 3.88, SD = 0.77), whilst the lowest is social influence (M = 3.30, SD = 0.78).
With regard to TPACK, the means of the various scores range from 3.43 to 4.15. The highest mean score is content knowledge (M = 4.15, SD = 0.74), and the two lowest are technology knowledge (M = 3.43, SD = 0.78) and TPACK knowledge (M = 3.75, SD = 0.72).
A correlation analysis was conducted to examine the associations among the constructs of TPACK and UTAUT (Table 4). Technology knowledge was positively related to performance expectancy (r = 0.32, p < 0.05), effort expectancy (r = 0.41, p < 0.01), and behavioral intention (r = 0.30, p < 0.05). Teachers with higher scores in technology knowledge tended to appreciate the usefulness and usability of SRS and were more likely to report that they would keep using SRS in the future. In addition, positive associations were found between TPACK knowledge and performance expectancy (r = 0.38, p < 0.01), effort expectancy (r = 0.37, p < 0.01), and behavioral intention (r = 0.28, p < 0.05). Teachers with higher TPACK knowledge reported that SRS was useful and easy to use, and they would be more likely to use SRS in the future.

3.2. Results of the Qualitative Study

The findings of the qualitative study were triangulated with the variables in the quantitative study. Major themes and patterns were extracted from the discussions in the five focus groups until theoretical saturation was achieved. The themes were then aligned with the constructs of TPACK and UTAUT to obtain an in-depth understanding of how teachers' knowledge influences the performance expectancy and effort expectancy of using SRS.

3.2.1. Impact of Technology Knowledge on Performance Expectancy

The SRS offers a platform for instant question-and-answer sessions in class. Yet with greater proficiency in technology, teachers were able to recognize how SRS could help them achieve other goals, enhancing performance expectancy.
a. Facilitating class management
Class management is the process by which teachers establish order, engage students, and elicit their cooperation in class [41]. Good class management strategies facilitate learning, improve teacher–student interaction, and minimize off-task behavior [42]. Some teachers reported that SRS alleviated administrative hassles such as taking attendance. Moreover, the results of SRS sessions could be automatically uploaded to the learning management system (LMS). These features greatly improved the efficiency of teaching in class:
“I adopted clickers to replace paper and pen for taking attendance. It saved much time and teachers and students could focus more on teaching and learning.”
—A teacher in electrical engineering.
b. Promoting teacher–student interaction
Supportive teacher–student interactions promote student engagement and learning and have an appreciable effect on learning outcomes [43,44]. Nevertheless, with the increasing number of large classes in universities, opportunities for discourse between teachers and students have abated. In the focus groups, some teachers reported being surprised by how SRS improved classroom dynamics and student engagement:
“It’s the only way they (students) interact with me. SRS is crucial. In big classes without this technology, it is just silence. It is just one way. That’s not the way to learn. They have to be involved in the class.”
—A teacher in mathematics.
c. Fostering formative assessment
Formative assessment refers to the recurring use of assessment-based information to recognize and respond to students' learning in order to enhance the learning process [45]. It is an important teaching practice in education settings. With the facilitation of SRS, teachers were able to follow students' learning progress more easily, and formative assessment became feasible even in large classes:
“You posed a question to test the basic understanding of students, to evaluate how efficient your teaching was, and to assess how well they absorbed the information.”
—A teacher in computer science.
“…I could check the results instantly, and see how many students can understand the concepts. If not many students could understand, I would spend more time for explanation. It is helpful for my teaching as well as for the students’ learning.”
—A teacher in textile and clothing.
d. Keeping records of students’ performance for further analysis
Responses obtained from SRS not only allowed teachers to keep track of students' performance; the data could also be used for further statistical analysis. Based on the data, teachers could design and refine teaching strategies to fit students' needs.
“It (SRS) is very efficient. You don’t have to input them (students’ responses). They are already in the soft copy.”
—A teacher in nursing.
e. Better utilizing built-in functions for particular subjects
Not all teachers had adopted all the functions embedded in SRS. Some teachers admitted that fully appreciating the usefulness of SRS in teaching required certain technological knowledge in preparing an SRS session. For instance, a teacher in applied mathematics reported that to display formulas in the SRS environment, teachers should have some practical knowledge of LaTeX.
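By way of illustration (a hypothetical question of our own; the paper does not reproduce an actual clicker item), a question stem containing a formula might require markup such as:

```latex
% Hypothetical clicker-question stem: displaying the formula in the SRS
% environment requires LaTeX markup along these lines.
Which value of $x$ satisfies $\int_0^x 2t \, dt = 9$?
% (a) $x = 3$ \quad (b) $x = 9$ \quad (c) $x = \sqrt{3}$ \quad (d) $x = 81$
```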

3.2.2. Impact of Technological Pedagogical Content Knowledge on Performance Expectancy

a. Facilitating innovative pedagogy with SRS in particular subjects
Technological pedagogical content knowledge also facilitated innovative pedagogical strategies with SRS in particular contexts. Peer assessment, peer instruction, and class experiments were made feasible with SRS:
“A student completed a piece of work for sharing in class, and other students would use clickers (SRS) to give a score. I found the whole process convenient. The results could be exported, which is much better than entering the results manually one by one.”
—A teacher in health technology informatics.
“I would put up a case, which was deemed confusing to students, and let them have a discussion. Then I would give them one more piece of information, and then asked them switch around for another round of discussion…”
—A teacher in accounting and finance.
“In my class, I used the students as the participants. The experiment is very simple. (I) just asked two questions, with only some different wordings. Because of the different wordings, they get different results …”
—A teacher in psychology.

3.2.3. Impact of Technology Knowledge on Effort Expectancy

a. Proficiency in information technology
Teachers agreed that SRS facilitated their teaching. Nevertheless, running an SRS session smoothly required knowledge of, and implementation skills in, the featured information technology. Numerous studies emphasize the role of information technology proficiency in predicting effort expectancy [46,47,48]. Concurring with prior studies, many teachers in the focus groups reported how knowledge of information technology influenced their perception of SRS. To integrate SRS efficiently in class, teachers need to feel comfortable with the software environment. Beyond software operation, knowledge of the compatibility of computers and other information technology infrastructure also ensured seamless adoption of the student response system. As some technologically anxious teachers commented:
“I was worried the first time I used it. Mostly technological issues: internet/Wi-Fi connection, software compatibility with computers in different lecture rooms.”
—A teacher in microbiology.
“The technological challenges vary from room to room, student to student.”
—A teacher in psychology.
With good technological knowledge, teachers were able to manage the SRS confidently and found it easier to balance the time given to SRS sessions and to lecturing, benefiting knowledge delivery and consolidation:
“If an SRS session was too long, students might do something else and be distracted. Just like changing tires. I had to do it quickly…”
—A teacher in psychology.

3.2.4. Impact of Technological Pedagogical Content Knowledge on Effort Expectancy

a. Openness to incorporate innovative pedagogy and technology in particular subjects
Openness or resistance to change is one of the critical factors in effort expectancy and technology adoption [49,50]. One interesting finding from our focus group study is that teachers with higher TPACK were more ready to embrace innovative technological and pedagogical strategies. They were more willing to spend time and other resources exploring how to integrate new ideas and technologies into their subjects, as several teachers with high TPACK reflected:
“This semester, I use a combination of TodaysMeet, and this thing called a Web White Board, and SRS in math instruction.”
—A teacher in mathematics.
“I used clickers to conduct peer instruction, and to see if this can help students to enhance learning effectiveness, and manage difficult concepts in accounting.”
—A teacher in accounting and finance.
b. Higher perceived enjoyment (Technological Pedagogical Content Knowledge)
Perceived enjoyment refers to the degree of enjoyment perceived in using the technology [20]. The association between perceived enjoyment and effort expectancy is supported by prior studies on technology adoption [51,52]. In our focus group study, teachers with higher TPACK tended to find it intrinsically enjoyable to prepare a class with SRS, even though the preparation usually took longer:
“To create questions, insert pictures, set up answers, make sure it looks nice... On average it takes an hour for each lecture. Fine. I think students are much more engaged, and to be honest, it is less boring for me, too. I can do a static lecture with my document projector. Anyone can do that. But actually I would be very bored at that type of class.”
—A teacher in mathematics.

3.3. Aligning Results of Quantitative and Qualitative Studies

Current research evidence suggests that the technology domain is central to the TPACK framework in the planning, design, and implementation of technology integration [53]. Findings from the current quantitative and qualitative inquiries supported this observation and delineated how TPACK as a whole, not only its technology domain, is central to the adoption of SRS for collaborative learning in class through effort expectancy and performance expectancy. Except for the association between technological content knowledge and effort expectancy, the themes extracted from the qualitative study aligned well with the associations revealed in the quantitative study. The themes extracted from the focus group studies are summarized in Table 5 below.

4. Discussion and Conclusions

The current study explored the factors behind successful adoption of SRS and, in particular, how teachers' knowledge shapes their beliefs. Findings from the current study echoed previous findings on TPACK and the use of SRS, especially the influence of technological-pedagogical-content knowledge and technological knowledge on the successful integration of innovations in class [54]. The associations between teachers' knowledge and the factors influencing usage behavior were first examined with the quantitative analysis. Results demonstrated that technology knowledge and TPACK knowledge were positively correlated with performance expectancy and effort expectancy. Qualitative data from the focus groups were then examined for an in-depth exploration of how teachers' knowledge affected effort expectancy and performance expectancy.
On how teachers' knowledge enhanced performance expectancy, teachers in the focus groups revealed that, with knowledge of the technology, SRS could facilitate class management, teacher–student communication, formative assessment, and record-keeping of students' performance, and allowed better use of subject-specific functions, whilst TPACK knowledge enabled teachers to incorporate innovative pedagogical strategies with SRS that matched the needs of particular subjects. On how teachers' knowledge enhances effort expectancy in using SRS, several themes emerged: proficiency in information technology (technological knowledge), openness to embracing new pedagogies and technologies in teaching (technological pedagogical content knowledge), and higher perceived enjoyment (technological pedagogical content knowledge). The current study echoed prior research on technology adoption in finding that technology knowledge is an imperative determinant of effort expectancy and performance expectancy. Yet the current effort took a step further, enriching the concept of technology knowledge by taking into account the impact of subject content and pedagogy. The quantitative and qualitative findings both offered evidence that TPACK also exerts influence on effort expectancy and performance expectancy.
Both the quantitative and qualitative findings supported the view that teachers' knowledge, technological and TPACK alike, influences the performance expectancy and effort expectancy of using SRS. Accordingly, to promote adoption of SRS in university settings, resources should be allocated to training in the use of the technology, so that teachers' transaction cost of switching from conventional unidirectional lectures to interactive classrooms facilitated by emerging innovations remains insignificant, or so that the benefits simply outweigh the transaction costs of introducing SRS into their classes.
The implications for training are easy to discern, and they are especially crucial for the dissemination and implementation of an emerging technology given the low mean scores of both technology knowledge and TPACK knowledge. First, training should empower teachers to think about and work with the technology in a variety of ways, rather than merely teach them how to use it. In our focus group study, teachers with better technology knowledge did not view SRS as just a simple response device or application; they had a broad understanding of the technology and used it in different ways (e.g., taking attendance, collecting data for statistical analysis). Second, it is essential to regularly upgrade not just teachers' knowledge of the technology but, more importantly, their TPACK, that is, the methods of applying the technology in their own disciplines to match their unique teaching needs, so as to optimize the effectiveness of the technology. Furthermore, the content of the training should value simplicity, to ease technologically anxious teachers into adopting the technology confidently and effortlessly. Finally, the training should be designed to balance the utilitarian and hedonic components of using the technology [55,56], so that teachers maintain a positive perception of the technology and continue to adopt it in class despite the extra time and resources involved.

5. Limitations and Future Directions

The current study values the context in which education professionals deliberate over and balance the different TPACK domains in arriving at pedagogical designs and decisions [50]. Nonetheless, the contextual components in the themes extracted from the qualitative inquiry were mainly at the micro level. Meso-level context, such as the logistical constraints faced by large undergraduate classes, and macro-level context, such as reform policies in tertiary education, also mediate the impact of TPACK. Future research could further delineate how the interrelations of meso- and macro-level factors shape TPACK orientations.
Teaching knowledge in the age of the information technology explosion is continuously evolving, and this is particularly true for knowledge involving emerging technology. The TPACK framework is widely adopted for delineating technology integration in the classroom; nonetheless, the subconstructs used in the framework are static in nature. To dissect the problem with better precision, some researchers have proposed the concept of TPACKing, a radical constructivist framing that addresses the aggregative nature of knowledge building [57]. In investigating the various domains of knowledge, the roles of contextual influences and the past experiences of participating teachers should be taken into consideration. Future studies of technology adoption should underscore the continuous interplay between teachers' knowledge and their environment.
In our focus group study, no themes could be aligned with technological content knowledge, which was statistically significant in the quantitative study. The items for this construct in the closed-ended survey should be reviewed and refined against the findings of the qualitative study to improve the sensitivity and face validity of the construct. In addition, technology adoption is a multifaceted problem; a structural equation model illustrating the interrelationships among the various constructs would be essential for a holistic account of the problems presented.
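As a sketch of how such a model might be specified (purely illustrative; the study names no SEM tool, and the package choice and variable names here are our assumptions), the proposed path structure could look like this:

```python
# Illustrative only: one possible specification of the proposed structural
# equation model, using the semopy package's lavaan-style model syntax.
# The package choice and variable names are assumptions, not the authors'.
import pandas as pd
from semopy import Model

MODEL_DESC = """
performance_expectancy ~ technology_knowledge + tpack_knowledge
effort_expectancy ~ technology_knowledge + tpack_knowledge
behavioral_intention ~ performance_expectancy + effort_expectancy
"""

def fit_sem(scores: pd.DataFrame):
    """Fit the path model on per-teacher construct scores and return estimates."""
    model = Model(MODEL_DESC)
    model.fit(scores)       # estimates the path coefficients
    return model.inspect()  # parameter estimates with standard errors and p-values
```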

Acknowledgments

The research was funded by the Hong Kong SAR UGC's Additional Funding for Teaching and Learning Related Initiatives for the 2012-15 Triennium (UGC-89BM).

Author Contributions

All authors contributed equally to study design, data analysis, and manuscript drafting.

Conflicts of Interest

The funding sponsors had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

Appendix A

Table A1. Items in the self-report questionnaire.

No. | Subconstruct | Item
1 | Technology knowledge | I know how to solve my own technical problems.
2 | Technology knowledge | I can learn technology easily.
3 | Technology knowledge | I keep up with important new technologies.
4 | Technology knowledge | I frequently play around with the technology.
5 | Technology knowledge | I know about a lot of different technologies.
6 | Technology knowledge | I have the technical skills I need to use technology.
7 | Technology knowledge | I have had sufficient opportunities to work with different technologies.
8 | Content knowledge | I have sufficient knowledge about my discipline.
9 | Content knowledge | I can use a way of thinking adopted in my discipline.
10 | Content knowledge | I have various ways and strategies of developing my understanding of my discipline.
11 | Pedagogical knowledge | I know how to assess student performance in a classroom.
12 | Pedagogical knowledge | I can adapt my teaching based upon what students currently understand or do not understand.
13 | Pedagogical knowledge | I can adapt my teaching style to different learners.
14 | Pedagogical knowledge | I can assess student learning in multiple ways.
15 | Pedagogical knowledge | I can use a wide range of teaching approaches in a classroom setting.
16 | Pedagogical knowledge | I am familiar with common student understandings and misconceptions.
17 | Pedagogical knowledge | I know how to organize and maintain a class.
18 | Pedagogical content knowledge | I know how to select effective teaching approaches to guide student thinking and learning in my discipline.
19 | Technological content knowledge | I know about technologies that I can use for understanding and doing my discipline.
20 | Technological pedagogical knowledge | I can choose technologies that enhance the teaching approaches for a lesson.
21 | Technological pedagogical knowledge | I can choose technologies that enhance students' learning for a lesson.
22 | Technological pedagogical knowledge | My teacher education program has caused me to think more deeply about how technology could influence the teaching approaches I use in my classroom.
23 | Technological pedagogical knowledge | I am thinking critically about how to use technology in my classroom.
24 | Technological pedagogical knowledge | I can adapt the use of the technologies that I am learning about to different teaching activities.
25 | TPACK | I can teach lessons that appropriately combine subject contents of my discipline, technologies, and teaching approaches.
26 | TPACK | I can use strategies that combine content, technologies, and teaching approaches that I learned about in my coursework in my classroom.
27 | TPACK | I can choose technologies that enhance the content for a lesson.
28 | TPACK | I can select technologies to use in my classroom that enhance what I teach, how I teach, and what students learn.
29 | TPACK | I can provide leadership in helping others to coordinate the use of content, technologies, and teaching approaches in my discipline.
30 | Performance expectancy | I would find the Student Response System ("Clickers") useful in my teaching.
31 | Performance expectancy | Using the Student Response System ("Clickers") enables me to accomplish teaching tasks more quickly.
32 | Performance expectancy | Using the Student Response System ("Clickers") increases my teaching productivity.
33 | Performance expectancy | If I use the Student Response System ("Clickers"), I will increase my chances of becoming more competent in teaching.
34 | Effort expectancy | The procedures of using the Student Response System ("Clickers") would be clear and understandable.
35 | Effort expectancy | It would be easy for me to become skillful at using the Student Response System ("Clickers").
36 | Effort expectancy | I would find the Student Response System ("Clickers") easy to use.
37 | Effort expectancy | Learning to operate the Student Response System ("Clickers") is easy for me.
38 | Social influence | People who influence my teaching behavior think that I should use the Student Response System ("Clickers").
39 | Social influence | People who are important to me think that I should use the Student Response System ("Clickers").
40 | Social influence | My faculty/department/school has been helpful in the use of the Student Response System ("Clickers").
41 | Social influence | In general, the members of my teaching community support the use of the Student Response System ("Clickers").
42 | Facilitating conditions | I have the tangible resources necessary (e.g., equipment, accessories) to use the Student Response System ("Clickers").
43 | Facilitating conditions | I have the knowledge necessary to use the Student Response System ("Clickers").
44 | Facilitating conditions | The Student Response System ("Clickers") is compatible with other e-learning systems I use.
45 | Facilitating conditions | A specific person or group is available for assistance with using the Student Response System ("Clickers").
46 | Behavioral intention | I am a keen user of the Student Response System ("Clickers").
47 | Behavioral intention | All things considered, I think it is positive to keep using the Student Response System ("Clickers") in my class.
48 | Behavioral intention | All things considered, I think it is good to keep using the Student Response System ("Clickers") in my class.
49 | Behavioral intention | All things considered, I think it is beneficial to keep using the Student Response System ("Clickers") in my class.
Table A2. Questions used in focus group discussion.

A. The Relationship between Technology and Pedagogy
1. How could clickers enhance pedagogy?
2. What did you need to consider in integrating clickers with your teaching?
3. Please share a remarkable event/experience in using clickers.
4. How could you use clickers more efficiently to facilitate teaching?
B. Technology and Content
1. How could clickers help students in understanding subject contents?
2. What are the subject-specific factors in implementing clickers?
3. How could you use clickers more efficiently to facilitate students in understanding subject contents?
C. Technology, Content and Pedagogy
1. What is the relationship among technology, content, and pedagogy?
D. Clickers and Clicker Questions
1. Please describe the types of clicker questions used in your class.
2. Which types of questions are more beneficial to students?
3. Which types of questions can enhance learning motivation/interest?
4. Which types of questions can improve academic performance?
5. Which types of questions can help in understanding concepts?
E. Implementation of Clickers
1. Do you think clickers are easy to use?
2. Will you continue to use clickers in class?
3. In which classes will you use clickers in the future?
4. Can you use clickers in other areas? Any new types of questions?

References

  1. Şad, S.N.; Göktaş, Ö. Preservice teachers’ perceptions about using mobile phones and laptops in education as mobile learning tools. Br. J. Educ. Technol. 2014, 45, 606–618. [Google Scholar] [CrossRef]
  2. Park, S.Y.; Nam, M.-W.; Cha, S.-B. University students’ behavioral intention to use mobile learning: Evaluating the technology acceptance model. Br. J. Educ. Technol. 2012, 43, 592–605. [Google Scholar] [CrossRef]
  3. Lindquist, D.; Denning, T.; Kelly, M.; Malani, R.; Griswold, W.G.; Simon, B. Exploring the potential of mobile phones for active learning in the classroom. In Proceedings of the ACM SIGCSE Bulletin, Covington, KY, USA, 7–11 March 2007; pp. 384–388. [Google Scholar]
  4. Cain, J.; Black, E.P.; Rohr, J. An audience response system strategy to improve student motivation, attention, and feedback. Am. J. Pharm. Educ. 2009, 73, 21. [Google Scholar] [CrossRef] [PubMed]
  5. Gok, T. An Evaluation of Student Response Systems from the Viewpoint of Instructors and Students. Turk. Online J. Educ. Technol. 2011, 10, 67–83. [Google Scholar]
  6. Hepplestone, S.; Holden, G.; Irwin, B.; Parkin, H.J.; Thorpe, L. Using Technology to Encourage Student Engagement with Feedback: A Literature Review. Res. Learn. Technol. 2011, 19, 117–127. [Google Scholar] [CrossRef]
  7. Kay, R.H.; LeSage, A. Examining the benefits and challenges of using audience response systems: A review of the literature. Comput. Educ. 2009, 53, 819–827. [Google Scholar] [CrossRef]
  8. Draper, S.W.; Brown, M.I. Increasing interactivity in lectures using an electronic voting system. J. Comput. Assist. Learn. 2004, 20, 81–94. [Google Scholar] [CrossRef]
  9. Cleary, A.M. Using Wireless Response Systems to Replicate Behavioral Research Findings in the Classroom. Teach. Psychol. 2008, 35, 42–44. [Google Scholar] [CrossRef]
  10. Novak, G.M. Just-in-time teaching. New Dir. Teach. Learn. 2011, 2011, 63–73. [Google Scholar] [CrossRef]
  11. Stowell, J.R.; Oldham, T.; Bennett, D. Using Student Response Systems (“Clickers”) to Combat Conformity and Shyness. Teach. Psychol. 2010, 37, 135–140. [Google Scholar] [CrossRef]
  12. Caldwell, J.E. Clickers in the Large Classroom: Current Research and Best-Practice Tips. CBE-Life Sci. Educ. 2007, 6, 9–20. [Google Scholar] [CrossRef] [PubMed]
  13. Campbell, J.; Mayer, R.E. Questioning as an instructional method: Does it affect learning from lectures? Appl. Cogn. Psychol. 2009, 23, 747–759. [Google Scholar] [CrossRef]
  14. Agarwal, R.; Karahanna, E. Time Flies When You’re Having Fun: Cognitive Absorption and Beliefs about Information Technology Usage. MIS Q. 2000, 24, 665–694. [Google Scholar] [CrossRef]
  15. Ajzen, I.; Fishbein, M. Understanding Attitudes and Predicting Social Behavior; Prentice-Hall: Englewood Cliffs, NJ, USA, 1980. [Google Scholar]
  16. Davis, F.D. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 1989, 13, 319–340. [Google Scholar] [CrossRef]
  17. Davis, F.D.; Bagozzi, R.P.; Warshaw, P.R. User Acceptance of Computer Technology: A Comparison of Two Theoretical Models. Manag. Sci. 1989, 35, 982–1003. [Google Scholar] [CrossRef]
  18. Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User Acceptance of Information Technology: Toward a Unified View. MIS Q. 2003, 27, 425–478. [Google Scholar] [CrossRef]
  19. Fishbein, M.; Ajzen, I. Belief, Attitude, Intention, and Behavior: An Introduction to Theory and Research; Addison-Wesley: Boston, MA, USA, 1977. [Google Scholar]
  20. Davis, F.D.; Bagozzi, R.P.; Warshaw, P.R. Extrinsic and Intrinsic Motivation to Use Computers in the Workplace. J. Appl. Soc. Psychol. 1992, 22, 1111–1132. [Google Scholar] [CrossRef]
  21. Ajzen, I. The Theory of Planned Behavior. Organ. Behav. Hum. Decis. Process. 1991, 50, 179–211. [Google Scholar] [CrossRef]
  22. Taylor, S.; Todd, P. Assessing IT usage: The role of prior experience. MIS Q. 1995, 19, 561–570. [Google Scholar] [CrossRef]
  23. Thompson, R.L.; Higgins, C.A.; Howell, J.M. Personal computing: Toward a conceptual model of utilization. MIS Q. 1991, 15, 125–143. [Google Scholar] [CrossRef]
  24. Moore, G.C.; Benbasat, I. Development of an instrument to measure the perceptions of adopting an information technology innovation. Inf. Syst. Res. 1991, 2, 192–222. [Google Scholar] [CrossRef]
  25. Bandura, A. Social Foundations of Thought and Action: A Social Cognitive Theory; Prentice-Hall: Englewood Cliffs, NJ, USA, 1986. [Google Scholar]
  26. Dwivedi, Y.K.; Rana, N.P.; Chen, H.; Williams, M.D. A Meta-analysis of the Unified Theory of Acceptance and Use of Technology (UTAUT). In Governance and Sustainability in Information Systems. Managing the Transfer and Diffusion of IT; Springer: Berlin, Germany, 2011; pp. 155–170. [Google Scholar]
  27. Taiwo, A.A.; Downe, A.G. The Theory of User Acceptance and Use of Technology (UTAUT): A Meta-Analytic Review of Empirical Findings. J. Theor. Appl. Inf. Technol. 2013, 49, 48–58. [Google Scholar]
  28. Compeau, D.R.; Higgins, C.A. Computer self-efficacy: Development of a measure and initial test. MIS Q. 1995, 19, 189–211. [Google Scholar] [CrossRef]
  29. Hsu, M.K.; Wang, S.W.; Chiu, K.K. Computer attitude, statistics anxiety and self-efficacy on statistical software adoption behavior: An empirical study of online MBA learners. Comput. Hum. Behav. 2009, 25, 412–420. [Google Scholar] [CrossRef]
  30. Macharia, J.K.; Pelser, T.G. Key factors that influence the diffusion and infusion of information and communication technologies in Kenyan higher education. Stud. High. Educ. 2014, 39, 695–709. [Google Scholar] [CrossRef]
  31. Padilla-Meléndez, A.; Garrido-Moreno, A.; Del Aguila-Obra, A.R. Factors affecting e-collaboration technology use among management students. Comput. Educ. 2008, 51, 609–623. [Google Scholar] [CrossRef]
  32. Mishra, P.; Koehler, M.J.; Kereluik, K. Looking back to the future of educational technology. TechTrends 2009, 53, 49. [Google Scholar]
  33. Mishra, P.; Koehler, M.J. Technological pedagogical content knowledge: A framework for teacher knowledge. Teach. Coll. Rec. 2006, 108, 1017. [Google Scholar] [CrossRef]
  34. Shulman, L.S. Those who understand: Knowledge growth in teaching. Educ. Res. 1986, 15, 4–14. [Google Scholar] [CrossRef]
  35. Anderson, A.; Barham, N.; Northcote, M.T. Using the TPACK framework to unite disciplines in online learning. Australas. J. Educ. Technol. 2013, 29, 549–565. [Google Scholar] [CrossRef]
  36. Tournaki, N.; Lyublinskaya, I. Preparing special education teachers for teaching mathematics and science with technology by integrating the TPACK framework into the curriculum: A study of teachers’ perceptions. J. Technol. Teach. Educ. 2014, 22, 243–259. [Google Scholar]
  37. Voogt, J.; Fisser, P.; Pareja Roblin, N.; Tondeur, J.; van Braak, J. Technological pedagogical content knowledge—A review of the literature. J. Comput. Assist. Learn. 2013, 29, 109–121. [Google Scholar] [CrossRef]
  38. Rosenberg, J.M.; Koehler, M.J. Context and technological pedagogical content knowledge (TPACK): A systematic review. J. Res. Technol. Educ. 2015, 47, 186–210. [Google Scholar] [CrossRef]
  39. Creswell, J.W.; Plano-Clark, V.L. Choosing a Mixed Methods Design. In Designing and Conducting Mixed Methods Research, 2nd ed.; Creswell, J.W., Plano-Clark, V.L., Eds.; SAGE Publications: Thousand Oaks, CA, USA, 2011; pp. 53–106. [Google Scholar]
  40. Schmidt, D.A.; Baran, E.; Thompson, A.D.; Mishra, P.; Koehler, M.J.; Shin, T.S. Technological Pedagogical Content Knowledge (TPACK): The Development and Validation of an Assessment Instrument for Preservice Teachers. J. Res. Technol. Educ. 2009, 42, 123–149. [Google Scholar] [CrossRef]
  41. Emmer, E.T.; Stough, L.M. Classroom management: A critical part of educational psychology, with implications for teacher education. Educ. Psychol. 2001, 36, 103–112. [Google Scholar] [CrossRef]
  42. Dicke, T.; Elling, J.; Schmeck, A.; Leutner, D. Reducing reality shock: The effects of classroom management skills training on beginning teachers. Teach. Teach. Educ. 2015, 48, 1–12. [Google Scholar] [CrossRef]
  43. Reyes, M.R.; Brackett, M.A.; Rivers, S.E.; White, M.; Salovey, P. Classroom emotional climate, student engagement, and academic achievement. J. Educ. Psychol. 2012, 104, 700. [Google Scholar] [CrossRef]
  44. Martin, A.J.; Anderson, J.; Bobis, J.; Way, J.; Vellar, R. Switching on and switching off in mathematics: An ecological study of future intent and disengagement among middle school students. J. Educ. Psychol. 2012, 104, 1. [Google Scholar] [CrossRef]
  45. Decristan, J.; Klieme, E.; Kunter, M.; Hochweber, J.; Büttner, G.; Fauth, B.; Hondrich, A.L.; Rieser, S.; Hertel, S.; Hardy, I. Embedded Formative Assessment and Classroom Process Quality How Do They Interact in Promoting Science Understanding? Am. Educ. Res. J. 2015, 52, 1133–1159. [Google Scholar] [CrossRef]
  46. Abu-Shanab, E.; Pearson, J.M. Internet banking in Jordan: An Arabic instrument validation process. Int. Arab J. Inf. Technol. 2009, 6, 235–244. [Google Scholar]
  47. Aggelidis, V.P.; Chatzoglou, P.D. Using a modified technology acceptance model in hospitals. Int. J. Med. Inform. 2009, 78, 115–126. [Google Scholar] [CrossRef] [PubMed]
  48. Dadayan, L.; Ferro, E. When technology meets the mind: A comparative study of the technology acceptance model. In Proceedings of the EGOV, Copenhagen, Denmark, 22–26 August 2005; pp. 137–144. [Google Scholar]
  49. Nov, O.; Ye, C. Resistance to change and the adoption of digital libraries: An integrative model. J. Assoc. Inf. Sci. Technol. 2009, 60, 1702–1708. [Google Scholar] [CrossRef]
  50. Venkatesh, V. Determinants of perceived ease of use: Integrating control, intrinsic motivation, and emotion into the technology acceptance model. Inf. Syst. Res. 2000, 11, 342–365. [Google Scholar] [CrossRef]
  51. Sun, H.; Zhang, P. Causal relationships between perceived enjoyment and perceived ease of use: An alternative approach. J. Assoc. Inf. Syst. 2006, 7, 24. [Google Scholar] [CrossRef]
  52. Teo, T.; Noyes, J. An assessment of the influence of perceived enjoyment and attitude on the intention to use technology among pre-service teachers: A structural equation modeling approach. Comput. Educ. 2011, 57, 1645–1653. [Google Scholar] [CrossRef]
  53. Archambault, L.M.; Barnett, J.H. Revisiting technological pedagogical content knowledge: Exploring the TPACK framework. Comput. Educ. 2010, 55, 1656–1662. [Google Scholar] [CrossRef]
  54. Cheung, G.; Chan, K.; Brown, I.; Wan, K. Teachers’ Knowledge and Technology Acceptance: A Study on the Adoption of Clickers. In Proceedings of the 11th International Conference on e-Learning: ICEl2016, Kuala Lumpur, Malaysia, 2–3 June 2016; p. 46. [Google Scholar]
  55. Bere, A. Exploring determinants for mobile learning user acceptance and use: An application of UTAUT. In Proceedings of the 11th International Conference on Information Technology: New Generations (ITNG), Las Vegas, NV, USA, 7–9 April 2014; pp. 84–90. [Google Scholar]
  56. Raman, A.; Don, Y. Preservice teachers’ acceptance of learning management software: An application of the UTAUT2 model. Int. Educ. Stud. 2013, 6, 157–164. [Google Scholar] [CrossRef]
  57. Olofson, M.W.; Swallow, M.J.; Neumann, M.D. TPACKing: A constructivist framing of TPACK to analyze teachers’ construction of knowledge. Comput. Educ. 2016, 95, 188–201. [Google Scholar] [CrossRef]
Table 1. Demographic characteristics of the teachers participating in the survey.

 | N | %
Gender
 Female | 26 | 50
 Male | 26 | 50
Disciplines
 Business | 11 | 21.15
 Health & Social Sciences | 14 | 26.92
 Humanities & Design | 7 | 13.46
 Sciences, Technology & Engineering | 5 | 9.62
 Tourism & Hospitality | 15 | 28.85
Professional Ranks
 Professor | 4 | 7.70
 Assistant professor | 23 | 44.23
 Teaching fellow | 25 | 48.08
Table 2. Demographic characteristics of the teachers participating in the focus group.

 | N | %
Gender
 Female | 7 | 41
 Male | 10 | 59
Disciplines
 Business | 3 | 17.65
 Health & Social Sciences | 7 | 41.18
 Humanities & Design | 1 | 5.88
 Sciences, Technology & Engineering | 4 | 23.53
 Tourism & Hospitality | 2 | 11.76
Table 3. Descriptive statistics of various dimensions of Technological Pedagogical Content Knowledge (TPACK) and Unified Theory of Acceptance and Use of Technology (UTAUT).

 | Minimum | Maximum | Mean | Std. Deviation
TPACK model
 Technology knowledge | 1.00 | 5.00 | 3.43 | 0.78
 Content knowledge | 1.00 | 5.00 | 4.15 | 0.74
 Pedagogical knowledge | 1.29 | 5.00 | 4.10 | 0.65
 Technological content | 2.00 | 5.00 | 3.81 | 0.72
 Technological-pedagogical | 2.00 | 5.00 | 3.81 | 0.56
 TPACK knowledge | 1.40 | 5.00 | 3.75 | 0.72
UTAUT model
 Performance expectancy | 2.00 | 5.00 | 3.78 | 0.77
 Effort expectancy | 2.25 | 5.00 | 3.76 | 0.66
 Social influence | 1.25 | 5.00 | 3.30 | 0.78
 Facilitating condition | 2.50 | 5.00 | 3.88 | 0.58
 Behavioral intention | 1.75 | 5.00 | 3.88 | 0.77
Table 4. Correlations of the dimensions of teachers' knowledge, as measured by Technological Pedagogical Content Knowledge (TPACK), with technology acceptance, as measured by the Unified Theory of Acceptance and Use of Technology (UTAUT).

 | Performance Expectancy | Effort Expectancy | Social Influence | Facilitating Condition | Behavioral Intention
Technology | 0.320 * | 0.405 ** | 0.172 | 0.168 | 0.300 *
Content | 0.067 | 0.199 | −0.024 | 0.213 | 0.243
Pedagogical | 0.203 | 0.251 | 0.123 | 0.422 ** | 0.183
Pedagogical Content | 0.027 | 0.210 | −0.157 | 0.055 | 0.031
Technological Content | 0.267 | 0.347 * | −0.027 | 0.194 | 0.199
Technological Pedagogical | 0.232 | 0.223 | −0.030 | 0.335 * | 0.225
TPACK | 0.383 ** | 0.366 ** | 0.101 | 0.512 ** | 0.282 *
* Correlation is significant at the 0.05 level (2-tailed); ** Correlation is significant at the 0.01 level (2-tailed).
Table 5. Themes extracted from the focus group studies.

Technology Knowledge
 Performance expectancy: Facilitating class management; Promoting teacher–student interaction; Fostering formative assessment; Keeping records of students' performance for further analysis; Better utilizing built-in functions for particular subjects
 Effort expectancy: Proficiency in information technology
TPACK
 Performance expectancy: Facilitating innovative pedagogy with SRS in particular subjects
 Effort expectancy: Openness to incorporate innovative pedagogy and technology in particular subjects; Higher perceived enjoyment
