A Phenomenological Inquiry into Lecturers’ Acceptance of Computer-Based Testing in Higher Education Through the Lens of the Technology Acceptance Model
Abstract
1. Introduction
- What is the perceived usefulness of CBT among university lecturers?
- What is the perceived ease of use of CBT from the perspective of university lecturers?
- How do university lecturers express their attitudes toward the use of CBT?
- What factors shape university lecturers’ behavioural intentions to adopt CBT in their assessment practices?
2. The Technology Acceptance Model
3. Methods
3.1. Research Design
3.2. Sample of the Study
3.3. Data Collection
3.4. Data Analysis
4. Results
4.1. Perceived Usefulness of CBT Among University Lecturers
4.1.1. Theme 1: Adoption Challenges
“Initially, we only used multiple-choice questions (MCQs) for first-year students. For second-year students, we gradually introduced essay questions alongside MCQs. The process becomes easier with familiarity. If you are new to CBT, it may seem difficult at first, but once you get used to the system, it is quite straightforward.”
“Well, the experience has been mixed. The process requires you to be very fast and follow specific guidelines. For instance, if you want students to answer 60 questions, you are expected to set at least three times that number, meaning you must prepare 180 questions. This can be quite challenging. Additionally, you have to write to the CBT centre to request availability of the exam venue, and sometimes you need to follow up to ensure your request has been approved.”
4.1.2. Theme 2: Perceived Usefulness in Managing Large Classes
“CBT significantly increases the speed of grading compared to traditional methods. In the traditional system, marking and compiling results could take two to three weeks, and some lecturers even delay submissions. However, with CBT, students can receive their results instantly or soon after the exam. If additional compilation is needed, the process is still much faster and easier for exam officers and lecturers.”
“If I have a class of 100 students, grading paper-based exams manually could take me hours. With CBT, the grading process is automated and can be completed in just a few minutes. I only need to review the results, eliminating the need for repetitive marking.”
“With CBT, grading is automated, making it easier for students to receive their scores promptly.”
“With CBT, it is easier to assess a large number of students efficiently. The system allows for instant feedback and automated result compilation, saving time and resources.”
4.1.3. Theme 3: Time Efficiency and Reduction in Workload
“With CBT, there is no need for marking scripts manually. The only task is setting the questions, and the system handles the rest. There is no need for printing, marking, or manually compiling results. The system automates these processes, making the workload significantly lighter for lecturers.”
“It allows academic staff to allocate time to other important academic activities instead of spending long hours marking papers. This improves efficiency and productivity.”
“This saves time and effort compared to paper-based exams.”
4.1.4. Theme 4: Concerns About Question Types and Assessment Quality
“CBT exams are often too simple, primarily consisting of multiple-choice questions, where the answers are already provided. In our department, we don’t even encourage multiple-choice questions, whether in paper-based or computer-based formats. We believe other forms of assessment are more effective.”
“But simulations require more than just CBT. Some types of assessment, like practical-based tasks, are better suited for other methods.”
4.1.5. Theme 5: Security and Cheating Prevention
“On the one hand, CBT is beneficial, especially in this digital era. However, the way it is implemented at our university is not entirely effective. Students often sit too close to each other, which increases the chances of communication and cheating. Some students even write exams on behalf of others due to lax identity verification. There is still much room for improvement.”
“In many cases, the level of cheating in CBT is very high.”
4.2. Perceived Ease of Use of CBT Among University Lecturers
4.2.1. Theme 6: Usability of CBT Varies Based on Digital Literacy
“The platform is user-friendly. Once you select an answer, you can easily move to the next question. There is no confusion, as the system is straightforward.”
“It is very easy, especially if you are computer literate. The system guides you through each step, similar to an ATM, with clear prompts and instructions.”
“The platform is somewhat flexible, especially for someone who is computer-literate and familiar with online systems. However, for those who are not very conversant with computers, some training is needed before they can use them effectively.”
“Honestly, as lecturers, we don’t interact much with the platform. Most administrative functions, like uploading questions and managing the exams, are handled by the CBT administrators. I believe it would be better if academic staff were given more control over the process, including setting up the exams and monitoring attendance.”
4.2.2. Theme 7: Persistent Technical Barriers Affecting CBT Implementation
“Technical issues do arise, especially when the server is down. If there is an internet failure, it can disrupt the entire process, which is a significant disadvantage.”
“Technical issues do occur occasionally. When they happen, we report them to the technical staff at the CBT centre. These staff members have a computer science background and are responsible for resolving such problems. However, technical issues can sometimes cause unnecessary difficulties for students.”
“For example, there was a time we were conducting a CBT exam, and the network went completely down. We had to wait for a long period, about four to five hours, for it to be restored. Some students were already exhausted by then, making it difficult to conduct the exam effectively. Eventually, we had to postpone the exam to allow students to prepare again.”
“The only technical issue I have experienced is related to the electricity supply. Sometimes, there are power outages when we are about to start the exam.”
“Network issues are the main challenge. Sometimes, the LMS may go down, which can delay exams. System shutdowns and connectivity problems can also affect the assessment process.”
4.3. Lecturers’ Attitudes Towards CBT
4.3.1. Theme 8: Positive Attitudes Toward CBT
“Shifting to CBT reduces paperwork, which is a major advantage. Traditional methods involve extensive writing, printing, and manual marking, which are all stressful. CBT is easier, more efficient, and provides instant feedback.”
“It is beneficial in reducing paperwork and streamlining assessments.”
“Since CBT requires a comprehensive approach, lecturers must cover the entire syllabus rather than focusing on just a few areas. This makes lecturers more serious about ensuring that students receive a well-rounded education.”
“It has encouraged me to cover the curriculum more comprehensively. Since I need a wide range of questions, I ensure that students are exposed to all necessary topics.”
“It is beneficial because it aligns with the digital era and helps students develop ICT skills.”
“There is a strong relationship because both teaching and assessments are moving toward digital formats. The transition aligns well with modern learning approaches.”
4.4. Lecturers’ Behavioural Intention to Use CBT
4.4.1. Theme 9: Strong Behavioural Intention to Use CBT in the Future
“Yes, I will continue using CBT, especially for lower-level courses. However, improvements need to be made, particularly in areas like seating arrangements and exam security.”
“Yes, of course. It makes the process easier and more efficient for both students and lecturers.”
“Definitely! With the increasing number of students, manual marking is not feasible. CBT makes setting, marking, and recording results much easier.”
“The simplicity and efficiency of CBT make it worthwhile. It saves time and allows me to focus on other academic responsibilities.”
4.4.2. Theme 10: Preference for Hybrid Assessment Models and Recommendations
“Yes, but it should be combined with paper-based exams, depending on the subject. While CBT is efficient, some courses require written responses.”
“CBT is ideal for large courses and multiple-choice assessments. However, for essay-based exams, traditional paper methods are preferable.”
“We need a large CBT centre with at least 1000 well-equipped computer stations. This would not only accommodate students comfortably but could also serve as a revenue-generating facility.”
“Training should be provided to all lecturers to ensure that they can effectively use the system. Even those who are already familiar with CBT need continuous training because technology is constantly evolving.”
“The system should be regularly updated for better compatibility. Additionally, more training should be provided for lecturers.”
5. Discussion
6. Conclusions
7. Limitations of the Study
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References

| SN | Name | Gender | Department | Rank |
|---|---|---|---|---|
| 1 | Mrs. Oke | Female | Computer Science | Lecturer II |
| 2 | Dr. Aliyu | Male | Biochemistry | Senior Lecturer |
| 3 | Mr. Abiodun | Male | Animal Science | Lecturer II |
| 4 | Mr. Sadiq | Male | History | Assistant Lecturer |
| 5 | Mr. Mutiu | Male | Electrical Engineering | Lecturer II |
| 6 | Mr. Usman | Male | Computer Science | Lecturer II |
| 7 | Mr. Salami | Male | Mass Communication | Lecturer II |
| 8 | Dr. Abdullah | Male | Science Education | Reader |
| TAM Construct | Theme | Core Meaning/Description | Illustrative Implications |
|---|---|---|---|
| PU | Theme 1: Adoption Challenges | Initial transition to CBT is difficult due to learning curves, question-setting demands, and administrative procedures. | Early difficulty may temporarily reduce PU but improves with familiarity. |
| PU | Theme 2: Managing Large Classes | CBT enables fast grading, automated results, fairness, and scalability for large enrolments. | Strong functional advantage increases PU for high-enrolment courses. |
| PU | Theme 3: Time Efficiency & Workload Reduction | CBT reduces marking, printing, and result compilation workload. | Frees time for teaching, research, and administration. |
| PU | Theme 4: Assessment Quality Concerns | Limited capacity of CBT (mainly MCQs) to assess higher-order thinking and practical skills. | PU is constrained for disciplines requiring subjective assessment. |
| PU | Theme 5: Security & Cheating Risks | Concerns about identity verification, seating proximity, and supervision. | Perceived security weaknesses reduce trust in CBT outcomes. |
| PEOU | Theme 6: Digital Literacy Dependence | Ease of use varies based on lecturers’ ICT competence and system control. | Training and lecturer autonomy enhance PEOU. |
| PEOU | Theme 7: Technical Barriers | Network failures, power outages, and system downtime disrupt exams. | Technical instability reduces confidence and perceived ease. |
| AtU | Theme 8: Positive Attitudes | CBT aligns with digital education, reduces stress, and enhances coverage and efficiency. | Positive attitudes are driven by efficiency and modern relevance. |
| BI | Theme 9: Intention to Continue Use | Strong willingness to keep using CBT, especially for large and lower-level courses. | High likelihood of sustained CBT adoption. |
| BI | Theme 10: Hybrid Preference & Recommendations | Preference for combining CBT with paper-based exams; calls for infrastructure and training. | Context-sensitive adoption strengthens long-term use. |
© 2026 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Share and Cite
Zakariya, Y.F. A Phenomenological Inquiry into Lecturers’ Acceptance of Computer-Based Testing in Higher Education Through the Lens of the Technology Acceptance Model. Trends High. Educ. 2026, 5, 23. https://doi.org/10.3390/higheredu5010023
