Article

Tales from the Exam Room: Trialing an E-Exam System for Computer Education and Design and Technology Students

Jeremy Pagram, Martin Cooper, Huifen Jin and Alistair Campbell
1 School of Education, Edith Cowan University, Mt Lawley 6050, Australia
2 School of Education, Curtin University, Bentley 6102, Australia
* Author to whom correspondence should be addressed.
Educ. Sci. 2018, 8(4), 188; https://doi.org/10.3390/educsci8040188
Submission received: 4 September 2018 / Revised: 22 October 2018 / Accepted: 23 October 2018 / Published: 28 October 2018

Abstract

The Centre for Schooling and Learning Technologies (CSaLT) at Edith Cowan University (ECU) was asked in 2016 to be the Western Australian arm of a national e-exam project. This project used a bespoke exam system installed on a USB drive to deliver what would have been traditional paper-based exams in an enclosed computer-based environment, isolated from the internet and from any resources other than those provided by the lecturer. This paper examines the two exams chosen by the Western Australian group for the trial: a programming exam for pre-service computing teachers and an occupational health and safety exam for pre-service design and technology teachers. Both groups were drawn from the Graduate Diploma in Education course at ECU. The paper describes the nature of the exam environment and the procedure for creating e-exams, outlines the exam procedures used, and examines the feedback provided by both the lecturers and the students involved. Conclusions are drawn about the suitability of the e-exam system, improvements are recommended, and e-exams and digital assessment are discussed more generally.

1. Introduction

In a world that is saturated with digital devices of all kinds and where student ownership of laptop computers is above 90% [1,2], it is odd that universities around Australia cling to the paper-based, hand-written exam. This paper reports on an e-exam system that was developed to redress this imbalance. Specifically, it reports upon the Western Australian part of a national research project that trialed an e-exam system in a variety of universities and faculties. In Western Australia the faculty chosen was Education at Edith Cowan University (ECU), one of the largest teacher education schools in Western Australia (WA).
The national project had the following expected outcomes:
  • To ascertain students’ preferences regarding Information and Communication Technology (ICT) use in supervised assessment.
  • To provide insight into student experiences of the use of an e-exam system in a supervised assessment.
  • To identify any gaps between student expectations, experiences, expected technological capabilities of the e-exam system and real-life practice.
  • To provide insight into student performance by comparing/contrasting typed versus handwritten forms of high stakes assessments.
  • To contribute to guidance for academics in regard to the deployment of ICT-enhanced supervised assessments.
  • To contribute to identification of support needed by staff to develop their use of ICT in the supervised assessment process.
The project was designed to explore and contrast student expectations and experiences of handwritten versus typed responses within supervised assessments, current practice, and best practice with respect to ICT use in supervised assessment. It is anticipated that the findings will be used as part of a dialogue with lecturers, students, administrators, and curriculum designers to inform the development and potential deployment of technology-enhanced supervised assessments, the aim being to allow for greater pedagogical richness in supervised assessments that can take full advantage of the affordances of ICTs, with a view to increasing student motivation and engagement.
This paper will begin with a review of the literature around e-examinations followed by a description of the context of the ECU study. Following this are a number of sections explaining the method and findings of different stages of the ECU implementation. The paper is brought to a close with conclusions and recommendations.

2. Literature

The expansion of ICT has given rise to electronic exams (e-exams). For the purposes of this paper, e-exams are defined as timed, computer-based summative assessments conducted using a computer running a standardized operating system. An advantage of this is that students already possess familiarity with the technologies used for conducting these e-exams [3]. A survey of students at a large Australian university conducted during 2012 indicated 98% ownership of mobile WiFi-enabled devices, with laptop ownership the highest at 91% [4], so there is little doubt that students have the digital fluency and access needed to undertake examinations using a computer.
E-exams are having a significant impact on assessment and have been widely used in higher education around the world [5,6]. They are considered to have many important advantages over traditional paper-based methods, such as decreased cost [7], marking automation [8,9], adaptive testing [10], increased assessment frequency [11], and the ability to test greater numbers of learners [12]. Additional benefits of e-exams over paper-based examinations are that these systems allow the use of text, images, audio, video, and interactive virtual environments [13]. However, there are continuing challenges related to the use of e-exams, including security and human interference [14], the incapacity to evaluate higher-order thinking competencies [10], inadequate technological infrastructure [7], and the complexity of the systems, which means that significant training may be needed [15,16].
Several studies have been conducted on the perceptions of students and lecturers with regard to the application of e-exams in higher education. Often students held positive attitudes toward e-exams because of advantages such as time efficiency, low cost and perceived improvement of assessment quality [9]. However, students were not unanimously in favor and gave reasons such as problems logging on, speed of typing and unfamiliarity with the exam system and software [17]. Lecturers also held a variety of opinions. Some research found that the majority of lecturers preferred e-exams to traditional methods of exams [15,16,18], others were resistant to adopting e-exams because they were reticent to change established examination habits and norms [18,19,20].
Research undertaken by Al-Qdah and Ababneh [21] demonstrated that different question types and different disciplines affect the results of e-exams. For example, Computer Science and Information Technology students have been found to perform better on multiple-choice and true/false questions. Other studies, using both computing and English majors, found that many students completed e-exams in 30% less time than paper-based exams [18], while Santoso et al. [9] believed that students from a non-IT background would have difficulties with e-exams [17]. Therefore, the type of questions and the nature of the discipline need to be taken into consideration when designing e-exams.

3. Context of the Western Australian Trials

The university where the research took place is situated in the metropolitan area of Perth, Western Australia, and is a large university with approximately 30,000 students, 17,546 of whom are female. These students are spread over three campuses. Historically, the University has its foundations in teacher education and training, and its School of Education is the largest in Western Australia, with 5617 students and 104 academic staff [22].
It was decided to undertake the research using students from the university’s one-year Graduate Diploma in Education (secondary) course, with the two specializations chosen being design and technology and computing. These were chosen because their content is more challenging for an e-exam system, while other variables such as computer literacy were minimized (students in these subjects tend to have a high level of computer literacy).
The e-exam environment used in the National Teaching and Learning project was a bespoke USB-based system that allowed a closed (no internet access) exam to be administered on a Windows or Macintosh computer. The computer simply boots from the USB drive instead of its normal hard drive, which locks out any network, hard-drive, or wireless access and confines students within the exam environment.
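The paper does not describe the lock-down mechanism beyond booting from the USB drive, but a simple reachability check illustrates the kind of isolation being relied upon. The following is a minimal Python sketch, not part of the actual e-exam system; the probe hosts, ports, and timeout are illustrative assumptions.

import socket

# Probe targets used purely to test reachability; inside a correctly
# locked-down exam boot, none of these connections should succeed.
# The hosts, port numbers, and timeout are illustrative assumptions.
PROBE_HOSTS = [("8.8.8.8", 53), ("www.example.com", 80)]

def network_is_locked_down(timeout: float = 2.0) -> bool:
    """Return True if no probe host is reachable, i.e., the exam
    environment appears isolated from the internet."""
    for host, port in PROBE_HOSTS:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return False  # a connection succeeded, so we are not isolated
        except OSError:
            continue  # unreachable, as expected inside the exam environment
    return True

if __name__ == "__main__":
    print("Exam environment isolated:", network_is_locked_down())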
The topics for the WA trial were specifically chosen to push the limits of the e-exam system: safe operating procedures (in design and technology), a highly graphical exam; and computer programming using the Python language (in computing), which required a modified e-exam environment with a Python editor.

4. Method and Findings: Production and Management

The process of creating and administering the e-exam can be seen in Figure 1. One of the researchers prepared the exam materials and retrieved the student responses for marking by the lecturers.
In each class there were between 9 and 15 students, and the lecturers provided a Word version of their exam to the researchers. From these, two master copies of each exam were created. The first master was a test version that did not contain the actual questions that would be in the exam and was used in a pre-trial test given to each class one week before the actual exam. The idea of this pre-trial was to train the students in the use of the e-exam environment. The second master was identical but included the examination questions themselves. Both the trial and the actual exam masters were tested by the researchers and the lecturers.
A USB duplicator was used to produce the examination USB drives for student use. These were then manually checked to ensure all files had been copied correctly. In each case the exams and trials were invigilated by the research team.
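In the trial the duplicated drives were checked by hand, but this kind of verification lends itself to automation. The following is a minimal sketch, assuming a master exam folder and a mounted duplicated USB at hypothetical paths; it simply compares SHA-256 checksums of every file on the master against the copy.

import hashlib
from pathlib import Path

def checksum(path: Path) -> str:
    """SHA-256 digest of a single file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_copy(master: Path, copy: Path) -> list:
    """Return the relative paths of files that are missing from,
    or differ on, the duplicated drive."""
    problems = []
    for source in master.rglob("*"):
        if source.is_file():
            relative = source.relative_to(master)
            duplicate = copy / relative
            if not duplicate.is_file() or checksum(source) != checksum(duplicate):
                problems.append(str(relative))
    return problems

if __name__ == "__main__":
    # Hypothetical mount points for the master exam image and one duplicated USB.
    issues = verify_copy(Path("/media/exam_master"), Path("/media/exam_usb01"))
    print("Copy OK" if not issues else "Problem files: " + ", ".join(issues))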

4.1. Design and Technology (D&T) Exam

For the design and technology exam, a paper exam with content based upon safe operating procedures was created. The original paper-based exam consisted of short-answer questions responded to on the exam paper itself, often in response to a photographic clue. In transferring this Microsoft Word-based exam to the e-exam format, considerable editing was required in order to allow the graphics to display correctly while providing a text box for responses. The trial of this exam took place in the students’ normal classroom (a D&T workshop area) and students were asked to bring along their own laptop computers, with the researchers providing a few university-owned ones (adapted from the normal university standard operating environment to allow booting from the USB) as a backup.
The trial proceeded with only a few technical hitches (outlined in the overall findings section), although students were resistant to using their own technology. The exam itself took place in a computer lab and went well, but the limitations of the word processor provided, or rather its differences from Microsoft Word, were evident. For example, drawing was difficult, and students found navigating the USB-based Linux operating system problematic.

4.2. Computing Exam

For the computing exam, a programming exam based on the Python language was used. As the Python language had been blocked as a security measure in the normal e-exam system, a special version was developed to allow it to run. This then allowed the students to program and save their work to the USB. The trial and exam took place in the students’ normal computer lab classroom, with some students using their own computers for the trial; however, all elected to use the lab computers in the exam.
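The actual examination questions are not reproduced in this paper; purely for illustration, a task of the following kind (a hypothetical example, not taken from the exam) shows the sort of short Python exercise that students could write and save to the USB drive within the modified environment.

# Hypothetical exam-style task (not taken from the actual exam paper):
# "Write a function that returns the average of a list of marks,
#  ignoring any negative (invalid) values, and demonstrate it."

def average_valid_marks(marks):
    """Average of the non-negative marks; 0.0 if there are none."""
    valid = [mark for mark in marks if mark >= 0]
    return sum(valid) / len(valid) if valid else 0.0

if __name__ == "__main__":
    # The student would run this and save the file to the exam USB drive.
    print(average_valid_marks([72, 85, -1, 64]))  # prints 73.666...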

5. Production and Management Findings

Production of the exams as exam papers took some time, as they had to be adapted, and in the case of the computing exam the system itself needed to be changed. Production of the USBs was a straightforward but time-consuming process, with all USBs needing to be tested as there were occasional failures in the copying process. Invigilation needed to be carried out by staff with some technical knowledge so that they were ready to deal with any problems that might arise (although there were very few). Technical staff were also needed to recover the completed exams from the USBs for the lecturers to mark. Thus, overall, the e-exam needed more time both pre- and post-exam, and additional technical support, when compared to the paper equivalent.
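The recovery step was performed manually by technical staff; the sketch below illustrates how it might be scripted, assuming hypothetical mount points for the exam USBs and a top-level responses folder on each drive (neither of which is described in the paper).

import shutil
from pathlib import Path

# Hypothetical layout: each exam USB is mounted under /media/exam_usbs/<id>
# and each drive holds the student's answers in a top-level "responses" folder.
USB_ROOT = Path("/media/exam_usbs")
COLLECTED = Path("/home/marker/collected_responses")

def collect_responses():
    """Copy every drive's responses folder into a per-drive subfolder so the
    lecturer can mark all submissions from a single location."""
    COLLECTED.mkdir(parents=True, exist_ok=True)
    for drive in sorted(USB_ROOT.iterdir()):
        responses = drive / "responses"
        if responses.is_dir():
            shutil.copytree(responses, COLLECTED / drive.name, dirs_exist_ok=True)
        else:
            print("Warning: no responses folder found on", drive.name)

if __name__ == "__main__":
    collect_responses()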

6. Method and Findings: Student Interviews

Immediately following each examination a focus-group style interview was held with four students to investigate the students’ reactions and feelings toward the exam. The interviews were semi-structured, broadly covering issues around handwriting vs computer use, personal vs university technology, exam software and Linux OS ease of use, student enjoyment of approach, and suggestions for improving the system.

6.1. D&T Exam

All interviewed students were very definite about preferring to type exam answers as opposed to using pen and paper. They gave the expected reasons for this related to editing, spelling, legibility, and speed.
Although it was initially thought that the students would use their own technology, university desktops were used for the exam. Students were not in favor of using their own machines, mostly due to a distrust around booting up their machine with a different operating system and a concomitant fear of losing data. Additionally, while all students had laptops, some of these were corporate machines (for example, from the Education Department) and students were even less keen for these to be booted from the USB. In the final analysis, there were no problems with booting the desktop machines in preparation for the actual examination.
The Linux operating system, software, and general environment did result in some problems for the students. Students expressed that they felt a lack of familiarity with the menus and shortcuts in the software and had some difficulties navigating the file manager associated with the operating system. Despite expressing concerns about the software, when asked directly, students did not indicate a strong preference for Microsoft Office over LibreOffice (the word processor provided). At least one student was unable to locate where the files were being saved, and this was indicative of students’ difficulties with the file manager. Students also felt that it would have been better if the system auto-saved their progress.
Finally, students were concerned that parts of the examination were not ‘locked down’ as they could edit the actual questions or delete them accidentally.

6.2. Computing Exam

All interviewed students preferred typing answers to handwriting them. The reasons given were speed and ease of editing.
Students were not in favor of using their own machines for the exam and gave the following reasons:
  • Inconsistency between machine specifications leading to inequity
  • Different hardware may not work
  • One student had older hardware with a noisy fan and was worried about disturbing others
  • Fear of losing data
As computing students, this group did not express difficulties with the operating system or the file manager. However, they were critical of LibreOffice, particularly the drawing tools that had to be used for flowcharting. They felt dedicated flowcharting software should have been provided, as time was wasted figuring out how to use the LibreOffice drawing tools. Students felt that the programming environment (Python) was well integrated into the system and functioned as expected. Students also commented on the way that windowing works differently in Linux but, if anything, saw this as an advantage.
While the students navigated the file manager well, they still believed that the instructions regarding exactly where files should be saved needed to be clearer.

6.3. Student Interview Findings Overall

The following were the major findings from the focus-group interviews across the two groups:
  • The machines used for the actual examination booted from the USB and no technical difficulties were experienced
  • Students preferred to type their exams, with reasons given as legibility, spelling, editability, and speed
  • Students did not want to use their own laptops, with reasons given as fear of losing data, inequity, and laptop ownership (in the case of corporate laptops)
  • Students were unfamiliar with the operating system, particularly the Linux file manager
  • Students were unfamiliar with the office software (in this case LibreOffice)
  • The Python programming language was integrated well into the system
  • Specialized software may be needed for some tasks (e.g., flowcharting)
  • Aspects of the exam needed to be locked down so as to be un-editable
  • An auto-save function should be included with the OS for the exam

7. Method and Findings: Staff Interviews

The two staff members responsible for the examinations were individually interviewed on the day of the exam. The combined results from these interviews are presented here.
Both staff members expressed that their students would prefer to complete an exam by typing rather than handwriting, giving the same reasons as the students, with the following additions:
  • Less paper
  • Digital work easier to mark
The staff members also agreed with the students in believing it was better to use the university machines than those owned by the students, and again they gave the same reasons as the students. Despite the students expressing concern, the staff members did not feel that the differences between the e-exam system and the software the students were accustomed to were enough to warrant concern.
Overall, the staff members felt that the system was best suited to multiple-choice questions and written responses. Disadvantages of the system were the risk of a computer crash and consequent loss of work, the slightly unfamiliar file system, having to change the BIOS settings of computers to allow booting from USB, and the system’s unsuitability for subjects with a large practical component. The staff also expressed that the system should run online and include an auto-save function.

8. Method and Findings: Student Survey

The small population sizes in these two exam trials mean that comparisons and/or generalizations to larger populations are impossible. However, this does not make the data meaningless. In a system such as this, designed for high-stakes examination situations, one would hope for strong agreement with positively stated questions even in a small group, and this is how the data have been interpreted. That is, in interpreting these charts the authors’ expectation was that, if the examination system is viable, there would be high levels of agreement with the survey questions.
Due to space limitations, the results from the selected pre- and post-survey questions most useful to the current discussion are given here. These were Likert-style questions showing general approval or disapproval of the e-exam system. The x-axes in the following charts are as follows: SD = strongly disagree, D = disagree, N = neutral, A = agree, and SA = strongly agree.

8.1. D&T Exam

The following charts (Figure 2) are taken from question 10 in the pre-implementation student survey, hence the numbering from 10.1 to 10.6 below. The results indicate that prior to the e-exam the six D&T students felt reasonably confident about the upcoming e-exam (10.4). However, they had reservations about the instructions, the technical steps, the USB boot, and the e-exam system and software. While this is concerning, it is understandable that, leading into an assessment that will affect their marks, they should feel a little anxious. The ‘disagree’ responses are still problematic, as with an exam system one would like the students to be feeling very positive.
Subsequent to the D&T e-exam, students participated in a post-implementation survey, and some relevant charts from question 5 of this survey are shown in Figure 3. The charts suggest that after the e-exam the nine D&T students were mostly positive regarding their experience. The main concerns were around the possibility of technical failures and cheating. The neutral responses may be considered concerning, as it would be desirable to have mostly positive responses for a system designed for a high-stakes situation. Of the nine, only three agreed with the statement ‘I would recommend the e-exam system to others’. The reasons for this may be garnered from questions 5.3 and 5.4, which show that the students were concerned about technical failure and the possibility of cheating.

8.2. Computing Exam

Figure 4 shows that prior to the computing e-exam the nine computing students felt very confident about the upcoming e-exam, with 7 out of 9 either agreeing or strongly agreeing (10.4). They felt confident about the instructions, the technical steps, the USB boot, and the e-exam system and software. This is understandable, both due to the nature of the students (studying computing) and the fact that they knew the exam would be electronic regardless of whether the e-exam system was used. Additionally, the Python environment was the same as the one they were used to, in contrast to the word processor (LibreOffice), which was a little unfamiliar.
Figure 5 suggests that after the computing e-exam the eight computing students were mostly positive regarding their experience and felt that the e-exam suited the use of computers (5.1). However, being computing students may have made them less trusting about the reliability of the exam and the possibility of cheating (5.3 and 5.4). It is possible that this factored into their reluctance to recommend the e-exam system to others, with only three of them agreeing with this statement (5.6). Again, neutral responses to these statements may be concerning to the designers of the e-exam system.

9. Conclusions

Combining the results from the interviews and surveys for the two groups (computing and D&T), the following overall conclusions may be made. These conclusions should be read with an appreciation of the small sample sizes and the atypical nature of the two e-exams. For examinations such as those in this trial, there appears to be little advantage in using the e-exam system.
  • The exams required extensive preparation, beyond what is standard for both D&T and computing, before production. In addition, subsequent to the e-exams, data needed to be copied from the USBs. This pre- and post-exam work is additional to that required for a traditional examination.
  • Students did not wish to use their own computers for the e-exam principally because of data-security fears.
  • The software and environment also presented some difficulties for students. Understanding the Linux file system and using the LibreOffice word processor (particularly for drawing) were the main issues.
  • Overall students were reasonably positive toward the e-exam system but were fearful of losing data.

Recommendations

  • The system should have an auto-save function or save to the cloud (a minimal sketch follows this list). This may lead to improvements in post-exam efficiency and would also address student problems with the Linux file system.
  • Specific software for certain tasks such as drawing would be a good inclusion.
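To illustrate the auto-save recommendation above, the following is a minimal sketch, assuming a simple periodic background save of the student’s current response text; the interval, file name, and get_text callable are hypothetical and not features of the trialed system.

import threading
import time
from pathlib import Path

def start_autosave(get_text, save_path, interval_seconds=60):
    """Write the current answer text to save_path every interval_seconds.
    `get_text` is a callable returning the student's current response.
    Returns an Event; set it to stop the background auto-save thread."""
    stop = threading.Event()

    def worker():
        while not stop.wait(interval_seconds):
            Path(save_path).write_text(get_text(), encoding="utf-8")

    threading.Thread(target=worker, daemon=True).start()
    return stop

if __name__ == "__main__":
    # Hypothetical usage: auto-save a response buffer every two seconds.
    buffer = {"text": "Draft answer..."}
    stop = start_autosave(lambda: buffer["text"], "autosave_answer.txt", interval_seconds=2)
    time.sleep(5)   # the exam session would run here
    stop.set()      # stop auto-saving at submission time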

10. Post-Script

The above findings represent the situation at the conclusion of the ECU trials. However, development of the e-exam system has been ongoing. Since this paper was written the following enhancements to the software have been achieved:
  • It is now possible to run the e-exam system in a fully online mode. However, a completely reliable network connection is required for this.
  • Auto saving has been added to both online and offline versions of the e-exam system to protect student data in the event of a ‘crash’.

Author Contributions

J.P., M.C. and A.C. were involved in project set-up at Edith Cowan University and data collection. M.C., J.P., A.C. and H.J. were involved in data analysis, presentation of results, and writing of the final paper.

Funding

The Western Australian component of this research, reported on in this paper, received no funding.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Pagram, J.; Cooper, M. Cross-cultural probing: An examination of university student ICT ownership and use of e-learning materials in Thai and Australian contexts Research Online. In Proceedings of the International Conference: Innovative Research in a Changing and Challenging World, Phuket, Thailand, 16–18 May 2012; Australian Multicultural Interaction Institute: Phuket, Thailand, 2012. [Google Scholar]
  2. Pagram, J.; Jin, H.; Cooper, M. Pre-service Primary Teachers’ Preparedness to Teach Design and Technology: A Western Australian Perspective. In Technology: An Holistic Approach to Education; Technology Environmental Mathematics and Science (TEMS) Education Research Centre: Christchurch, New Zealand, 2017; pp. 244–253. [Google Scholar]
  3. Hillier, M.; Fluck, A. Arguing again for e-exams in high stakes examinations. In Proceedings of the 30th Ascilite Conference, Sydney, Australia, 1–4 December 2013. [Google Scholar]
  4. McCue, R. UVic Law Student Technology Survey. Faculty of Law, University of Victoria, 2012. Available online: https://docs.google.com/file/d/0BxJ017RDW9tGWlljLXpmZ3Y1b00/edit?pli=1 (accessed on 16 August 2018).
  5. Thomas, P.; Rice, B.; Paine, C.; Richards, M. Remote electronic examinations: Student experiences. Br. J. Educ. Technol. 2002, 33, 537–549. [Google Scholar] [CrossRef]
  6. Walker, R.; Voce, J.; Ahmed, J. 2012 Survey of Technology Enhanced Learning for Higher Education in the UK. Universities and Colleges Information Systems Association, 2012. Available online: http://www.ucisa.ac.uk/_/media/groups/ssg/surveys/TEL_survey_ 2012_final_ex_apps (accessed on 16 August 2018).
  7. James, R. Tertiary student attitudes to invigilated, online summative examinations. Int. J. Educ. Technol. High. Educ. 2016, 13, 1–13. [Google Scholar] [CrossRef]
  8. Ras, E.; Whitelock, D.; Kalz, M. The promise and potential of e-assessment for learning. In Measuring and Visualizing Learning in the Information-Rich Classroom; Reimann, P., Bull, S., Kickmeier-Rust, M., Vatrapu, R., Wasson, B., Eds.; Routledge: New York, NY, USA, 2015; pp. 21–40. [Google Scholar]
  9. Snodgrass, S.J.; Ashby, S.E.; Rivett, D.A.; Russell, T. Implementation of an electronic objective structured clinical exam for assessing practical skills in preprofessional physiotherapy and occupational therapy programs: Examiner and course coordinator perspectives. Australas. J. Educ. Technol. 2014, 30, 152–166. [Google Scholar] [CrossRef]
  10. Fluck, A.; Pullen, D.; Harper, C. Case study of a computer based examination system. Australas. J. Educ. Technol. 2009, 25, 509–523. [Google Scholar] [CrossRef]
  11. Sclater, N. The Demise of eAssessment Interoperability? In Proceedings of the WWWrong, Sissi, Lassithi, Greece, 17 September 2007; Volume 317 of CEUR Workshop Proceedings. Davis, H.C., Duval, E., Muramatsu, B., White, S., van Assche, F., Eds.; CEUR-WS: Sissi, Lassithi, Greece, 2007. [Google Scholar]
  12. Jordan, S. Assessment for learning: Pushing the boundaries of computer-based assessment. Pract. Res. High. Educ. 2009, 3, 11–19. [Google Scholar]
  13. Llamas-Nistal, M.; Fernandez-Iglesias, M.J.; Gonzalez-Tato, J.; Mikic-Fonte, F.A. Blended e-assessment: Migrating classical exams to the digital world. Comput. Educ. 2013, 62, 72–87. [Google Scholar] [CrossRef]
  14. Miguel, J.; Caballé, S.; Xhafa, F.; Prieto, J. Security in online learning assessment towards an effective trustworthiness approach to support E-learning teams. In Proceedings of the IEEE 28th International Conference on Advanced Information Networking and Applications, Victoria, BC, Canada, 13–16 May 2014. [Google Scholar]
  15. Adebayo, O.; Abdulhamid, S.M. E-exams system for Nigerian universities with emphasis on security and result integrity. Int. J. Comput. Internet Manag. 2010, 18, 1–12. [Google Scholar]
  16. Adegbija, M.V.; Fakomogbon, M.A.; Daramola, F.O. The new technologies and the conduct of e-examinations: A case study of National Open University of Nigeria. Br. J. Sci. 2012, 3, 59–66. [Google Scholar]
  17. Wibowo, S.; Grandhi, S.; Chugh, R.; Sawir, E. A pilot study of an electronic exam system at an Australian university. J. Educ. Technol. Syst. 2016, 45, 5–33. [Google Scholar] [CrossRef]
  18. Betlej, P. E-examinations from student’s perspective—The future of knowledge evaluation. Cognit. Creat. Support Syst. 2013, 9, 12–21. [Google Scholar]
  19. Dwivedi, Y.K.; Wade, M.R.; Schneberger, S.L. Information Systems Theory: Explaining and Predicting Our Digital Society; Springer: Dordrecht, The Netherlands, 2012. [Google Scholar]
  20. Kuikka, M.; Markus, K.; Laakso, M.J. Challenges when introducing electronic exam. Res. Learn. Technol. 2014, 22, 217–226. [Google Scholar] [CrossRef] [Green Version]
  21. Al-Qdah, M.; Ababneh, I. Comparing online and paper exams: Performances and perceptions of Saudi students. Int. J. Inf. Educ. Technol. 2017, 7, 107. [Google Scholar] [CrossRef]
  22. Edith Cowan University. About ECU. 2017. Available online: http://www.ecu.edu.au/about-ecu/welcome-to-ecu/about-ecu (accessed on 16 August 2018).
  23. Hillier, M. To type or handwrite: Student’s experience across six e-Exam trials. In Globally Connected, Digitally Enabled, Proceedings of the ASCILITE Conference, Perth, WA, Australia, 29 November–2 December 2015; Reiners, T., von Konsky, B.R., Gibson, D., Chang, V., Irving, L., Clarke, K., Eds.; Australasian Society for Computers in Learning in Tertiary Education: Perth, Australia, 2015; pp. 131–142. [Google Scholar]
Figure 1. E-exam work flow [23].
Figure 2. D&T students’ responses to selected pre-examination survey questions.
Figure 3. D&T students’ responses to selected post-examination survey questions.
Figure 4. Computing students’ responses to selected pre-examination survey questions.
Figure 5. Computing students’ responses to selected post-examination survey questions.
