Article

Obstacles to Applying Electronic Exams amidst the COVID-19 Pandemic: An Exploratory Study in the Palestinian Universities in Gaza

by
Raed Bashitialshaaer
1,
Mohammed Alhendawi
2 and
Helen Avery
3,*
1
Centre for Advanced Middle Eastern Studies, Lund University, Box 201, 221 00 Lund, Sweden
2
Education and Psychology, Al-Azhar University, Gaza Box 1277, Palestine
3
Centre for Advanced Middle Eastern Studies, Centre for Environmental and Climate Science Lund University, 221 00 Lund, Sweden
*
Author to whom correspondence should be addressed.
Information 2021, 12(6), 256; https://doi.org/10.3390/info12060256
Submission received: 4 May 2021 / Revised: 15 June 2021 / Accepted: 16 June 2021 / Published: 20 June 2021

Abstract:
In the context of the COVID-19 pandemic, we aim to identify and understand the obstacles and barriers to applying electronic exams successfully in the process of distance education. We followed an exploratory descriptive approach through a questionnaire (one general, open question) with a sample of university teachers and students in four of the largest Palestinian universities in Gaza. A total of 152 of the 300 distributed questionnaires were returned. The results indicate that the university teachers and students faced 13 obstacles, of which 9 were shown to be shared between teachers and students, with a significant agreement in the regression analysis. Several of the obstacles perceived by respondents are in line with the literature and can be addressed by improved examination design, training, and preparation, or the use of suitable software. Other obstacles related to infrastructure issues, leading to frequent power outages and unreliable internet access. Difficult living conditions in students’ homes and disparities in access to suitable devices or the internet make social equity in connection with high-stakes examinations a major concern. Some recommendations and suggestions are listed at the end of this study, considering local conditions in the Gaza governorates.

1. Introduction

This study investigates and identifies the obstacles and barriers to the use of electronic exams in the distance education system in Gaza during the COVID-19 pandemic and compares the obstacles identified by university teachers and students. Today, education systems and particularly higher education institutions face multiple challenges imposed on them by factors such as funding constraints, massification, successive scientific and technological developments, and rapid societal changes. We are also witnessing the emergence of new vulnerabilities and disruptions, with consequences for educational systems. After the World Health Organization declared that COVID-19 was a pandemic [1], governments worldwide were forced to close educational institutions. As of April 2020, approximately 1.5 billion learners were affected [2]. Thus, in many countries, educators were abruptly forced to switch to online teaching, through electronic learning and assessment systems. This sudden shift to a distance education system in an emergency has caused shock and tension among students and faculty members, both personally and professionally, as this process requires additional efforts and has revealed unexpected obstacles [3,4].
Gewin [5] notes that preparing online teaching can take three times longer than face-to-face teaching, and this extra workload has been a contributing factor to burnout among academics during the pandemic. However, the current crisis [6] also offers an opportunity to explore options to overcome some of the obstacles and barriers to the use of electronic exams and other challenges in the learning and assessment system. In the Arab world, the Palestinian Al-Quds Open University was the first to establish a distance learning university, see Abdelhai [7], but many other Arab universities are currently attempting to revitalize the distance education system to continue to provide educational activities for students.
An electronic examination can be conducted in a wide range of forms, just as it is inserted into varying educational contexts. Earlier research in the field of information and communication technologies (ICTs) has documented their adaptation and impact in higher education over the last decades and speculated on future trends [8,9,10,11]. However, remote teaching and learning do not have to be ICT-mediated, while on-campus education can use a variety of ICTs. Much of the literature concerns the intersection of these areas, and electronic assessment is therefore also often understood as an element of ICT-mediated remote teaching and learning. Terms such as ‘remote’ or ‘distance’ education are often used interchangeably with the terms ‘online’, ‘digital’, ‘electronic’ or ‘ICT-mediated’ teaching and learning, but while it is true that digital technologies play an increasing role, remote education also uses other approaches. Inversely, before the pandemic, ICTs were frequently used in various blended learning approaches as a complement to face-to-face modes, while higher education during the pandemic has been forced to resort to emergency teaching [12] and rely almost exclusively on these technologies. In designing programs, it is nevertheless important to make the distinction, since in certain contexts and for certain types of exams possible automation of correcting and documenting exam results can be seen as a major advantage, and thus motivate electronic assessment for courses that are otherwise given on-campus. In other cases, ICT-mediated remote teaching and learning may be considered advantageous for various reasons (such as reducing the cost of commuting to campus or offering greater flexibility to working students), but the characteristics of the type of assessment may be difficult or expensive to transpose to an electronic format.
ICTs have proved crucial to maintaining educational programs during the ongoing crisis [13]. Nevertheless, it is clear that many students and teachers need additional training in online teaching and learning to conduct it effectively (see Gonçalves and Capucha [14]), as well as with respect to digitally mediated assessment. Reliability and sufficient availability of technological infrastructures, such as learning tools and digital learning resources in the form of online courses, e-books, and e-notes, are of utmost importance [15]. Access to adequate internet is crucial [16,17,18,19,20]. A study of emergency online education in India, for instance, suggested that unreliable internet connectivity was the major source of student anxiety [16]. Under optimal conditions, adequate connectivity notably enables immediate student feedback [21], so that online teaching and digitally mediated assessment can become a viable alternative or complement to face-to-face interaction or paper exams [22], particularly for formative assessment.
Digital equity is a major barrier to access to higher education globally. Roughly four billion people worldwide have no access to the internet [23], while one billion live in areas that are not covered by mobile broadband networks [24]. Even where ICT services are available, the high cost of connectivity relative to income presents a problem of affordability [25]. Approximately half of all learners worldwide do not have access to a computer, and 40% do not have internet access at home [2]. Although mobile learning [26,27], in particular, could offer low-income countries a cheaper option for providing higher education, financial constraints in these countries are coupled with poor infrastructure, while individual learners may not be able to afford mobile devices compatible with university education delivery systems. For inclusive education systems, the situation of students with a disability also needs to be considered [24], while using digital delivery of education also has implications for gender equity [28]. Issues noted in a study by Khlaif and Salha [29] of emergency remote teaching during the pandemic in Palestine, Libya, and Afghanistan thus included internet access, as well as the availability of devices (phones or computers), but also the fact that several members of the same household might be sharing the same device. The learning environment at home was often not suitable. Respondents also mentioned the violation of privacy, connected to sharing account information, and to using webcams in students’ homes.
Designing adequate educational evaluation is a key element of ICT-mediated distance education, both with respect to the credibility and recognition of the qualifications that are awarded, and to offer feedback to students, teachers, and institutions on the learning processes [30,31]. Educational evaluation is conducted using many methods and tools and is frequently designed to indicate the individual differences and levels of students’ abilities [32]. The concept of educational evaluation in its simplest sense is to collect and analyze data aimed at determining the degree of achievement of goals, in order to make decisions and address deficiencies. This can be done in several ways, including remote evaluation (e-assessment) [33,34]. Many studies suggest that electronic exams allow teachers to accurately measure the educational content over time [35], while Ranne [36] argues that these exams can increase learner motivation through appropriately chosen test items, although low-achieving students tend to prefer the structured paper test over electronic exams [36]. The use of advanced devices makes the electronic delivery process easy and fast, with adequate protection software to avoid leakage of questions [32].

1.1. Objectives

The present study is based on data from a larger study conducted in April, May, September, and October 2020 with university teachers and students in Algeria, Egypt, Iraq, and Palestine [37,38]. Considering that conditions for implementing emergency distance teaching and examination vary greatly across contexts, the present study aims to investigate teacher and student experiences of electronic examination for the specific context of the Gaza governorate. The area is subject to several major infrastructure constraints, notably concerning power outages and internet connectivity, which impact opportunities for future development.
The present study examines and compares the obstacles to the use of electronic exams from the specific perspectives of university teachers and students amidst the COVID-19 pandemic in the Palestinian universities in the governorates of Gaza. An intended outcome is to help address these obstacles, contribute to the success of electronic exams, and, thus, to the success of the local distance education system. Therefore, the main objectives of this study are to:
  • Identify the obstacles faced by university teachers and students at the Palestinian universities in the governorates of Gaza in the use of electronic examinations during the COVID-19 pandemic.
  • Clarify agreement and differences in teacher and student perceptions of the identified obstacles.
  • Establish the significance of the obstacles in the local context.
  • Consider possible approaches to addressing these obstacles, suited to the local context.

1.2. Major Questions of This Study

This study was conducted during the global COVID-19 pandemic, which has become a barrier to education systems around the world, including for universities in the Palestinian governorates. To limit physical contact and the spread of the virus, the Palestinian universities have introduced a system of remote emergency teaching, and electronic exams were used as an alternative to traditional proctored on-campus written exams until the end of the academic year. The main questions in this study are as follows:
  • How were remote unproctored electronic written exams during the COVID-19 pandemic perceived by teachers and students at Gaza universities?
  • Are there any differences in the obstacles identified by teachers and students at Gaza universities concerning the electronic exams during the COVID-19 pandemic?

2. Electronic Exams Theoretical Framework

2.1. The Concept of Electronic Exams

Electronic exams are a product of information and communication technology (ICT), intended to overcome barriers faced in conducting physical examinations. They use electronic means (e.g., a computer or other digital devices) to approximate the conduct of a physical exam, but offer options other than those of the physical examination for restricting the time and space available [39]. They can be administered in a common classroom with fixed computers, or on students’ own devices to complete online exams and assignments.
Exams may include several elements, including the demonstration of practical or professional skills. In professional contexts, in particular, problems may not be well defined, and multiple approaches to solutions may be possible. Exams can be distinguished along several independent dimensions, including paper versus electronic exams (medium), and on-campus versus remote exams (location). With respect to the risk for fraud (see Section 2.3 below), a further distinction is proctored versus unproctored exams (proctoring). The latter may or may not be monitored automatically through various systems, including webcams. For this study, we will mainly be comparing on-campus proctored paper exams (that students and teachers of the study were used to before the pandemic), with the remote unproctored electronic exams that were used in Gaza during the pandemic.
As for the wider field of remote and ICT-mediated teaching and learning in higher education, the terminology used in the literature is not consistent. We will here use the term “electronic assessment” to cover both formative and summative assessment, while the terms “electronic exams” and “electronic examination” will be used for summative assessment that counts towards students’ qualifications. According to Al-Khayyat [40], electronic exams are an objective and computerized system that may contain different models given to students in which randomization is used when selecting the questions for each exam. The aim is mostly to identify individual differences between students. However, as for other types of assessment [41,42], electronic exams can also be applied to measure performance differences between educational institutions, as well as to research the academic achievement of specific demographics or geographical areas. To the extent that ICTs are used as a proxy indicator for different types of development in Agenda 2030, it is conceivable that the use of electronic exams may also serve the purpose of signaling progress with respect to SDG targets.
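The randomized selection of questions from a question bank, as described by Al-Khayyat, can be illustrated with a minimal sketch. All names here, and the question-bank structure itself, are hypothetical assumptions for illustration, not taken from any system discussed in this study:

```python
import random

# Hypothetical question bank: each topic maps to a pool of question IDs.
QUESTION_BANK = {
    "topic_a": ["a1", "a2", "a3", "a4"],
    "topic_b": ["b1", "b2", "b3"],
    "topic_c": ["c1", "c2", "c3", "c4", "c5"],
}

def build_exam(student_id: str, per_topic: int = 2, seed: int = 0) -> list:
    """Draw a per-student random sample of questions from each topic pool.

    Seeding the generator with the student ID makes the selection
    reproducible for later review, while still differing between students.
    """
    rng = random.Random(f"{seed}-{student_id}")
    exam = []
    for topic, pool in QUESTION_BANK.items():
        exam.extend(rng.sample(pool, min(per_topic, len(pool))))
    return exam

# Two students receive different (but individually reproducible) question sets.
exam1 = build_exam("student-001")
exam2 = build_exam("student-002")
```

Sampling per topic rather than from the whole bank keeps each randomized exam balanced across the syllabus, which is one way such systems aim to preserve comparability between students despite the randomization.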
Assessments are a particularly critical element of higher education since they are used to guarantee the quality of knowledge and competences of degrees and certificates. For students, assessments are the gateway to obtaining these certificates, and therefore determine their future employment opportunities. Both to ensure quality in higher education and for educational equity, careful consideration of forms, content, and modalities of assessment is therefore essential, since any other gains made through the development of teaching, learning, or program design will be lost if the assessment is not adequate.

2.2. Electronic Exams—Characteristics and Advantages

There are several differences between proctored classroom paper exams given on-campus and electronic exams that are given remotely. Classroom exams require a specific timetable and location, and classrooms and supervisors must be reserved well in advance each time the exam is given [43]. With electronic exams, teachers and students can access the exam from a wider range of locations and schedules [43], thus becoming less dependent on the space available at the educational institution. Electronic exams also potentially reduce the time and cost spent by students commuting to the location of the exam, which can be a significant factor in reducing stress if distances are great or transport is not reliable. If students do not have to travel to a specific location, students from remote areas can participate more easily [19,44,45], and this consideration can also be significant in contexts of armed conflicts, unpredictable checkpoints, high insecurity, or as a result of natural disasters. However, under extreme conditions ICT infrastructure may be affected, and online communication can also be restricted for political reasons.
Additional potential advantages with electronic exams—regardless of whether they are given remotely or on-campus—are reductions in printing and paper costs [44]. Bennett [46] mentions similar advantages, such as ease of reviewing the test items (saving time), saving the cost of printing, reducing time to read and review, and showing exam results without the cost of printing. Nevertheless, many arguments concerning the reduction in time for correcting exams and for administrating results suppose fully automated systems, with question banks and keys of acceptable answers (i.e., some type of multiple-choice where answers are limited) [47].
According to Knowly [44], standardized exams with automated correction save time because distributing the exam is nearly instantaneous: teachers and students only need to be contacted online, after which students receive their exams from a question bank and obtain their results instantly. Ismail et al. also point to time potentially saved in the administration of exam results [47].
Ryan [48] summarizes six characteristics of electronic exams compared to paper exams: interactivity, flexibility, time-saving, instant feedback, reduced resource use, and record-keeping. Online exam software can include a variety of features such as automation of scheduling, customization of test-taking options, proctoring, multi-language support or transcripts generation [49].

2.3. Electronic Exams—Disadvantages

Electronic exams require skills and experience in information technology by the student and university teachers, which requires adequate training. An unproctored remote online exam system is more susceptible to fraud, so university teachers have to keep in mind that students are allowed to take the exam on their own devices in their own time with nobody to check on them [44]. Numerous studies highlight the problem of academic fraud [50,51,52,53], which can lead to reduced trust in the fairness of e-examination [54]. Various technical solutions have been proposed to these issues, such as secure communication systems based on cryptography, biometric authentication, and face recognition [54,55,56]. Luck et al. [51] instead stress a social and educational approach to supporting academic integrity and combating fraud. Other strategies include giving exams that require students to give arguments or show the process for arriving at solutions. The age and maturity of students can further play a role in supporting integrity [57]. For proctored electronic exams taken on-campus, there is no clear indication that fraud may be easier than for on-campus paper exams, although there may be added threats if students can bring their own devices [58]. Proctored electronic exams taken on-campus additionally allow increased technology-enabled countermeasures against cheating [59,60].
Bennett [46] states that one of the most important disadvantages of electronic exams is that preparing them takes a long time, as they require skills and training for the preparation and implementation processes. He adds that measuring higher skills is one of the difficulties faced in electronic assessment. According to Guàrdia et al. [30], electronic assessment is predominantly carried out with traditional forced-choice measures, reducing students’ opportunities for learning and developing higher-order skills. If electronic assessment is used to replace paper exams that only consider whether answers are right or wrong, automated correction can speed up a process that is already mechanical. However, if the perceived advantages of automated correction push teachers to design questions with simple answers, this will reduce the educational potential of teacher feedback. Iglesias Pradas et al. [61] note that measures of student performance may be influenced by the form of assessment used in the emergency remote situation. In their study, continuous assessment assignments had been delayed, simplified, or removed. The typical final exam in their programs would normally include problems presented as cases, where students are required to apply theoretical concepts as a sequential process. Both the description of the process and the final result are included in the assessment. By contrast, during the pandemic, some courses instead used multiple-choice questions, which tend to measure knowledge rather than skills. Comparing student achievement under emergency remote electronic examination with results from pre-pandemic paper exams is therefore problematic.
While electronic assessment and exams can reduce certain costs for HEIs, they entail considerable investments in electricity and internet infrastructure. Further costs are connected to electricity for servers, as well as to paying for licenses and acquiring, maintaining, and updating devices and software. The generalization of digital technologies has negative sustainability impacts (see Kuntsman and Rattle [62]), while the complex interconnected systems that support digital communication and the storing of information required for digitally mediated examination generate multiple vulnerabilities, including issues of cybersecurity and integrity of personal data. Specifically for high-stakes examinations, any technical glitches have potentially serious consequences for students and are a factor of stress. Finally, it should be noted that practical skills are difficult or impossible to assess in ICT-mediated modes, although for certain subject areas simulation or virtual reality approaches have been attempted.
This study does not include discussions of challenges concerning the electronic assessment of practical skills, or the impact on learning outcomes and assessment of reduced opportunities for lab work, in-service training periods, etc.

2.4. Earlier Research on Electronic Examination and Emergency Remote Teaching

A large number of studies have been published in 2020 and 2021 on the general topic of online distance education. However, not all of these concern the situation of emergency remote teaching (ERT) connected to the COVID-19 pandemic. A relatively small proportion focus on remote electronic examination and on barriers typical of resource-constrained contexts.
To obtain an indication of the focus and distribution of research conducted during the pandemic, a Scopus search (title-abstract-keywords) for articles in English published (or under publication) in 2020 and 2021 was conducted on 26 May 2021 using the search terms ‘distance education’, ‘distance learning’, ‘remote education’, ‘remote learning’, ‘online course’, ‘online program’, ‘e-learning’, ‘higher education’, ‘university’, and ‘universities’, and yielded 2157 publications. A total of 232 publications were eliminated through keywords that related to medical or public health aspects of the COVID-19 pandemic (e.g., ‘clinical study’, ‘viral pneumonia’), while publications with keywords such as ‘COVID-19’ or ‘pandemic’ were retained, since these keywords are often used to signal the reason for emergency remote teaching. This left 1925 results. Of these 1925 publications, 328 were from the US, followed by China (137), Spain (117), the UK (105), and Saudi Arabia (99). Five countries thus accounted for 41% of publications. Only 12 publications were from low-income countries (as defined by the World Bank): Yemen (4), Uganda (3), Afghanistan (2), Sudan (2), and Malawi (1), or 0.6% of total publications. The West Bank and Gaza are categorized as lower middle income according to the World Bank classification, and five publications were found (included in Table 1 below). Of the 1925 publications examined more closely, none had keywords referring to digital or educational equity, income, or infrastructure issues. No keywords referred to disability, and only 14 referred to gender.
A search was made within the 1925 publications using the search terms “electronic assessment” and “online exams”, yielding 16 and 30 items respectively. Abstracts were scanned, and where necessary articles were read in full text. Items from the search on publications concerning the West Bank and Gaza were added and duplicates were removed, while items concerning general experiences of online education prior to the pandemic, learning outcomes, or the development of online tools generally were excluded. The topics of the remaining 36 items are summarized in Table 1 below. A total of 20 items dealt with emergency remote education and 12 focused on electronic examination forms, tools, or experiences, but only 4 related to the intersection between ERT and remote electronic examination. Most of the studies on exams concerned academic fraud, and only one [63] concerned an automated tool to offer feedback to students. Other topics found in this search included libraries offering access to digital content [64], socioemotional support and teachers’ use of data on students [65], teacher academic training needs [66], rebuilding the community of practice online [67], obstacles to ERT [37,38], student satisfaction [68], and the need for funding digital learning environments [69].
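The keyword-based screening described above can be sketched as a simple filter. The record structure and the keyword lists below are illustrative assumptions standing in for the actual Scopus export, not the data used in this study:

```python
# Illustrative records: (title, keyword set) pairs standing in for Scopus results.
records = [
    ("Emergency remote teaching in HE", {"COVID-19", "e-learning"}),
    ("Viral pneumonia outcomes", {"clinical study", "viral pneumonia"}),
    ("Online exams and academic fraud", {"pandemic", "online exams"}),
]

# Keywords marking medical/public-health papers to exclude. Keywords such as
# 'COVID-19' or 'pandemic' alone are retained, as in the search described above.
EXCLUDE = {"clinical study", "viral pneumonia"}

def screen(records, exclude=EXCLUDE):
    """Keep records whose keyword set shares nothing with the exclusion list."""
    return [(title, kw) for title, kw in records if not (kw & exclude)]

kept = screen(records)  # drops the medical item, keeps the other two
```

Filtering on exclusion keywords rather than on ‘COVID-19’ itself reflects the rationale given above: pandemic-related keywords often only signal the reason for emergency remote teaching, and discarding them would remove relevant educational studies.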
The two literature reviews included in this search only marginally touch on the focus of the present study. Abu Talib et al. [70] provide an analytical systematic review of 47 Scopus-listed publications in 2020 on online distance education in higher education. Their review does not specifically mention issues connected to remote electronic assessment. Of the articles they reviewed, 15 found that online education was overall satisfactory and effective, while 8 expressed concerns or reported feeling overwhelmed. With respect to educational equity, Abu Talib et al. note that student access to online distance education is usually related to family income and that transitioning to online learning therefore increases the gap between privileged and underprivileged groups. Students from lower-income regions have limited access to devices and the internet, as well as lower technical abilities. Butler-Henderson and Crawford [71] provide a systematic review of 36 publications from the period 2009–2018 concerning electronic assessment. Their search string did not include the terms ‘remote’ or ‘distance’; the reviewed publications mostly come from high-income countries, with only five from middle-income countries and none from low-income countries (as defined by the World Bank). Issues typical of resource-constrained contexts, such as internet connectivity or access to devices, were not mentioned in their analysis.
Although many Scopus-listed publications are likely to have been missed due to differing terminology, while others would be found in other databases, the search does suggest that relatively little of the research conducted on ICT-mediated remote education touches on the concerns of resource-constrained contexts encountered during the pandemic in connection with electronic examination. To gain an overview of experiences of emergency remote teaching, learning, and examination in resource-constrained contexts, a wider search focusing on these contexts was therefore made in Google Scholar and other databases such as ERIC, and through snowball searching. Our search was also extended to include publications before 2020, notably to find suggestions of practices used to overcome obstacles related to student income, the cost of higher education, or inadequate infrastructure.

2.5. Earlier Research on Attitudes and Barriers for Remote Electronic Examination

Contexts where electronic examination is used differ widely, as do the specific forms it takes. Results concerning student perceptions, attitudes, and barriers therefore show considerable variation, depending on the context. For instance, Gonçalves and Capucha [14] analyzed how teachers and students from Portugal adapted to the digital environment using information and communication technology (ICT) models and identified the changes imposed by COVID-19 on the members of educational institutions (teachers and students). They found that teachers and students felt the worst change brought by COVID-19 was the cancellation of normal (classroom) sessions, disturbing the students’ education during the pandemic [14]. By contrast, studies in Saudi Arabia [68] and Bahrain [76] show student satisfaction with electronic examinations, as well as technical capacity to adapt to constraints imposed by the pandemic. Similarly, results from emergency remote teaching in a Spanish HEI [61] suggest that factors enabling a successful sudden transition to online modes include suitable pre-existing infrastructure and software, an organization that allows individual teachers the freedom to make choices, good technical and digital skills among teachers, and good digital skills among students. This is more likely to be the case for programs with a technical orientation. In the investigated Spanish case, the usual commuting time had been one to two hours.
In a study of students’ attitudes towards the use of electronic exams at the University of New England in Australia, the main challenges identified were related to internet speed and security systems, the test system itself, and the technical problems associated with it [77]. In a Taiwan study, Liu et al. [54] investigated the acceptance and perception of students at community colleges regarding the introduction of electronic assessment using biometric authentication and webcams to monitor students during exams, to minimize academic fraud. The results showed that students responded positively and had confidence in the assessment procedures. However, both Liu et al. [54] and Shraim [21] note that such technical measures do not preclude all forms of cheating. In the context of emergency remote teaching in Mexico, Gudino Paredes et al. [73] found that remotely proctored exams reduced fraud, but the lack of privacy and anxiety emerged as aspects of concern for students. In an Indian study, Arora et al. [74] similarly found that students experienced more anxiety with online examinations.
Thomas et al. [78] examined the barriers to the use of electronic assessment in higher education, and in particular, differences in the student experience of conducting assessment over the internet, in a synchronized manner, in which the student is in contact throughout the exam period, compared to an asynchronous manner, in which the student downloads the exam and then re-sends it within a certain period. Unreliable internet connectivity during exams caused stress, and the results showed the effectiveness of asynchronous electronic assessments in reducing the students’ anxiety. In the Philippines, Baticulon et al. [20] also recommend asynchronous delivery. In a West Bank study [21], more than two-thirds of students appreciated the possibility of instant feedback but expressed that electronic examination was more suitable for formative assessment than for summative assessment, while three-quarters felt that it increased anxiety. Gaza students surveyed in a study by Shehab et al. [18] also preferred reports and assignments and believed the electronic mode was more suitable for formative assessment.
In their study of open distance learning in Malawi, Zozie and Chawinga [45] mention a variety of affordable options used in resource-constrained contexts with poor internet connectivity and unreliable electricity access. These include the use of print materials, supplemented by radio/television broadcasting and face-to-face delivery. Students in their study preferred print media because they were not dependent on electricity and offered more flexibility. The open distance learning approach used in Malawi gave students greater control over their learning processes, fostered independence, and relied to a greater extent on self-directed learning. In Malawi, universities offering distance courses avoided the problems associated with e-examination by assessing students through written continuous assignments, combined with end-of-semester examinations on campus [45]. Baticulon et al. [20] recommend more frequent formative assessment rather than a single, high-stakes examination, and additionally point to the role of sufficient communication with students (cf. Bojovic et al. [6], Shehab et al. [18]), clear instructions, as well as adequate feedback on assignments or examinations [20].
The diverging results and conclusions in the examples above suggest that both perceptions and factors enabling or preventing the use of electronic assessment can vary considerably. Besides possible social and cultural differences in acceptance of biometric authentication and webcams in exam situations (see Khlaif and Salha [29]), access to technology and internet connectivity appear to be significant factors. Any measures to facilitate the use of electronic assessment must therefore take into account local conditions, as well as requirements that may vary between academic programs or student demographics. Also, not only do the financial, infrastructure and organizational characteristics of learning and teaching contexts vary considerably across studies in the literature, but there is a difference between emergency remote teaching and examination, and contexts where online and ICT-facilitated methods have been introduced deliberately [12]. Notably, this has implications for issues such as the ability to devote careful attention to exam design [30], or the choice of students to take online courses. It can also have implications for the relative importance placed in programs on validating learning through examination, or on developing student independence through a more self-directed learning approach [18,79].
Due to differences in constraints and needs in different contexts, it is difficult to directly transfer conclusions from the general literature to the conditions of any specific new context where the introduction of electronic assessment is being contemplated. It can be assumed that higher education contexts that already have a high proportion of online provision do so because their conditions and aims make it both feasible and suitable. Notably, in resource-constrained contexts where electronic assessment has recently been introduced due to the pandemic, the difficulties encountered cannot be dismissed simply as "growing pains" that will be resolved with time.
In those evaluations of student or teacher experiences that have been conducted, it is additionally difficult to disentangle the various factors at play, since in practice these factors interrelate and may combine to render studies and examination possible or not. More research is therefore called for, taking into account contextual characteristics as well as implications for students’ future studies and career paths.

2.6. Higher Education and Socioeconomic Conditions in Gaza

For 2014–2015, the total population of the West Bank and Gaza was 4,550,368, with 668,657 in the age group 18–24, and 221,395 students enrolled in HEIs. The population of Gaza was 1,819,982; within the Gaza governorates, both the majority of the population and university enrolments are concentrated in Gaza City, which for the same years had a population density of 8457 per square kilometer. The percentage of the population of the Gaza governorates enrolled in HEI on-campus education was approximately 4%, compared to 3% in the West Bank. Including the Al-Quds Open University, the total average HEI enrolment percentage was 4.8% for the West Bank and Gaza combined. The Al-Quds Open University had five centers in Gaza in 2016 [80].
The estimated population of the West Bank and Gaza had risen to 5,101,000 in 2020 [81]. The total number of students enrolled in HEIs (including university colleges and community colleges) in the Gaza governorates was 81,470 in the academic year 2019/2020 [82], while the total number of students enrolled at Al-Quds Open University (West Bank and Gaza) was 47,730 in 2020 [75].
While conditions for higher education in Gaza are similar to those in the West Bank in certain respects, there are also significant differences. Thus, mobility for students in the West Bank can be a greater issue due to checkpoints, but economic conditions in Gaza are more difficult. Unemployment rates in the Gaza governorates in 2011 ranged between 26–33%, compared to 13–22% in the West Bank for the same year [83]. By 2018, unemployment rates in Gaza had reached 52% [83], and by November 2020 they had soared to 82% as a result of the pandemic, according to the Palestinian General Federation of Trade Unions [84]. An UNCTAD report notes that the pre-pandemic poverty rate in Gaza was already 55.4% (for 2017) [85]. In 2018, 68% of households in Gaza were food insecure, compared to 11.6% in the West Bank [86].
Economic conditions affect students’ ability to conduct their studies and pay fees. All universities in Gaza charge fees, but public universities have somewhat lower fees than private universities, because they are supported by the government and have a special budget covering the salaries of university employees. Fees vary according to specialization, but there is not much variation between universities. Fees are not calculated per semester or year, but per hour, in Jordanian Dinars (1 JD = 1.4 US $); a student can take 10 to 20 hours per semester. As an example, at Al-Azhar University the current hourly fee is highest in the College of Medicine (100 JD) and lowest in the Faculty of Education (18 JD) [87]. An increasing number of students are unable to pay their fees and drop out [88], although universities have asked for government support to offer leniency on fees. Such difficulties are compounded by the overall insecurity in Gaza, trauma, and mental health issues [89], which also affect university staff [90].
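Concretely, the per-hour fee structure implies a wide range of possible semester costs. The following sketch is our own illustration, using only the figures cited above (the 18–100 JD hourly fee range, the 10–20 hour course load, and the reported exchange rate):

```python
# Illustrative semester-cost range based on the fee figures cited above.
# Assumptions: hourly fees of 18-100 JD, course loads of 10-20 hours per
# semester, and the reported exchange rate of 1 JD = 1.4 USD.
JD_TO_USD = 1.4

def semester_cost_usd(fee_per_hour_jd: float, hours: int) -> float:
    """Semester cost in US dollars for a given hourly fee and course load."""
    return round(fee_per_hour_jd * hours * JD_TO_USD, 2)

# Cheapest case: Faculty of Education fee at the minimum load.
low = semester_cost_usd(18, 10)    # 252.0 USD
# Most expensive case: College of Medicine fee at the maximum load.
high = semester_cost_usd(100, 20)  # 2800.0 USD
print(low, high)
```

In a context where unemployment exceeds 50%, even the lower end of this range represents a substantial household expense, which helps explain the dropout figures cited above.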

3. Data Collection and Procedures

3.1. Study Approach and Method

This is an exploratory descriptive study that attempts to explore and describe faculty and student perceptions concerning electronic examination in the context of Gaza universities. The study population was university teachers and students at Palestinian universities located in the governorates of Gaza. The study was conducted in September and October 2020 and comprised four of the eight main Palestinian universities in the Gaza governorates: Al-Azhar University, the Islamic University, Al-Aqsa University, and Al-Quds Open University (see Table 2). The Gaza universities included in this study are all public. By comparison, the main private universities, Palestine University (established in 2008) and Gaza University (established in 2007), are smaller, but with somewhat better resources per student: in 2013–2014 they had 3156 and 624 students, respectively, with 9.5 and 63 square meters of available campus space per student. During the pandemic, education at all Gaza universities remained face-to-face in a few scientific disciplines, but most teaching was remote, including electronic examination, due to the assessment of the public health situation issued by the Ministry of Health.
We chose an online survey to reach a maximum number of respondents in a limited time and in compliance with social distancing measures. More than 300 questionnaires were sent, across the various subjects and programs taught at the universities, with a return of 152 responses: 97 from university teachers and 55 from students (Table 3). The questionnaire (made through Google Forms) was sent to university teachers and students mostly by email, without any restriction on the subject they were teaching or studying. Some respondents were contacted personally because they were from the same universities and areas as one of the authors. The difference in the number of responses between university teachers and students was likely due to the availability of internet access for faculty members at their workplaces, which resulted in close to double the number of teacher responses compared to student responses.
The questionnaire was distributed shortly after the remote exams. All electronic exams taken were written and synchronous. Students had to log in to the university website to start the exam at a specific time and, when finished, log out of the system within the time limit. The minimum exam time was 40 min and the maximum was 3 h. After completion, the teachers could access all students’ university pages to mark the exams and post the results. Most students used a computer, while a smaller number used iPhones and iPads. Pre-pandemic, students would normally sit for the exam in their classroom for most courses, with a maximum of about 50 students, moving to the auditorium or a large hall if more than one class was taking the same course. Almost all university students had some prior experience of electronic exams on computers on campus, but only for courses that included programming and calculation practices.

3.2. Study Tool

To identify the obstacles to conducting electronic exams in the emergency remote teaching system during the COVID-19 pandemic, the authors distributed an electronic questionnaire with a single open question to university teachers and students in the Palestinian governorates of Gaza. A single open question was preferred for this study, to allow respondents to formulate their own answers and to avoid imposing our pre-existing ideas. Although possible obstacles have been identified in the literature (such as the lack of a suitable place to sit, distractions in the home environment, lacking or poor computing equipment, unstable internet, unreliable power supply, and cheating due to lack of invigilation), conditions in Gaza differ in several respects from other resource-constrained contexts. Since very little research has been conducted on perceptions of electronic exams at Gaza universities, and to serve future research in this context, for this exploratory study we wished to avoid directing respondents’ answers in any particular direction, which could be the case if open questions were placed after a series of closed questions on expected issues. The question used was: “In your opinion, what are the obstacles to conducting electronic exams in remote education?”
(برأيك، ما هي معوقات تطبيق الاختبارات الإلكترونية في ظل التعلم عن بعد؟)
Since the study is exploratory and aims to identify the obstacles, the responses were subjected to regression analysis. We used the regression analysis to examine the relationship between the two sets of response variables, i.e., between the student and teacher responses.
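As a minimal illustration of this kind of analysis, the sketch below fits a least-squares line and computes the correlation between the student and teacher mention percentages for the obstacles reported by both groups (the figures are those given in the Discussion section; the exact procedure used in the study may have differed):

```python
from statistics import mean

# Percentages of students and teachers mentioning the obstacles shared
# between the two groups, as reported in Sections 5.1-5.4 of this paper.
students = [40.0, 23.6, 21.8, 16.4, 47.3, 36.4, 52.7]
teachers = [67.0, 57.7, 47.8, 28.9, 77.3, 70.1, 83.5]

def linreg(x, y):
    """Ordinary least squares: return (slope, intercept, pearson_r)."""
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / (sxx * syy) ** 0.5
    return slope, intercept, r

slope, intercept, r = linreg(students, teachers)
print(f"slope={slope:.2f}, intercept={intercept:.2f}, r={r:.2f}")
```

For these figures the correlation is strongly positive (r above 0.9), indicating that although teachers mention each obstacle more often than students, the two groups broadly agree on the relative importance of the shared obstacles.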

3.3. Categorization and Coding

A data subset concerning the Gaza governorates was excerpted from a larger comparative study across several Arab countries, looking both at remote teaching and learning and at electronic exams [38]. The data for Gaza have been reexamined in more detail in the present study. However, the categories used in the earlier study were retained to facilitate future comparisons or follow-up studies. We used a comparative approach between the obstacles, to reach the final form for each obstacle separately for the teacher group and the student group, limiting them to 13 obstacles. Both the broader categories (personal obstacles, pedagogical obstacles, technical obstacles, financial and organizational obstacles) and the clusters of specific obstacles within each category were agreed on in the larger group of researchers involved in the comparative study. In the first step, we identified the obstacles after collecting the Arabic-language answers from teachers and students, and then translated them to English, to reach the final form for each sample separately. We used both open and selective coding: breaking the textual data up into discrete parts, and then selecting one central category that connects and links all the codes from the collected data. The designations used to name obstacles remained close to the wording used in the responses. Responses mainly consisted of a series of short sentences or points, listing various obstacles with little explanation.
Examples of responses and coding:
“widespread instances of cheating among students”
(student)
“there was no real surveillance during the exams”
(student)
“cheating was sometimes allowed”
(teacher)
“the students may resort to a variety of sources to obtain answers”
(teacher)
The responses above were coded as belonging to the obstacle: ‘There was no real surveillance, widespread cases of fraud’.
“weak internet for all and absence [of internet] in some houses”
(student)
“weak internet connectivity”
(student)
“internet problems”
(teacher)
“malfunctioning of the internet network leads to an interruption”
(teacher)
The responses above were coded as belonging to the obstacle: ‘Internet unavailability and poor internet quality’.
“several members of the same family sharing internet, with ensuing slow internet speed”
(teacher)
This response was coded as belonging to the obstacle: ‘Weak internet flow (speed)’.
We attempted to distinguish between power cuts, connectivity, and internet speed, to obtain an indication of which technical solutions should be prioritized. However, the wording did not always give enough information to make this distinction (for example, “internet problems”). An additional obstacle, ‘The lack of capabilities to communicate remotely (devices, internet, Apps, etc.)’, also comprised internet issues, but from the perspective of financial resources. Items were coded as belonging to this obstacle if the financial hardship of the family was explicitly referred to. However, it can be assumed that some of the internet problems categorized under the heading ‘Technical obstacles’ might also be indirectly related to the financial situation of the family.
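The logic of mapping response wordings to obstacle codes can be sketched as a keyword lookup. This is a toy illustration only: the actual coding was performed manually by the researchers on the Arabic originals, and the keyword lists and shortened obstacle labels below are our own examples:

```python
from collections import Counter

# Hypothetical keyword lists approximating the manual coding described
# above; labels are abbreviated versions of the obstacle designations.
OBSTACLE_KEYWORDS = {
    "No real surveillance, widespread fraud": [
        "cheating", "surveillance", "fraud", "sources to obtain answers"],
    "Internet unavailability and poor quality": [
        "weak internet", "internet problems", "malfunctioning", "interruption"],
    "Weak internet flow (speed)": [
        "slow internet speed", "sharing internet"],
}

def code_response(text: str) -> list[str]:
    """Return the obstacle codes whose keywords appear in a response."""
    text = text.lower()
    return [obstacle for obstacle, keywords in OBSTACLE_KEYWORDS.items()
            if any(k in text for k in keywords)]

responses = [
    "widespread instances of cheating among students",
    "weak internet for all and absence in some houses",
    "several members of the same family sharing internet, "
    "with ensuing slow internet speed",
]
# Tally mentions per obstacle, counting each response at most once per code.
tally = Counter(code for r in responses for code in code_response(r))
print(tally)
```

A purely automatic scheme like this would fail on the misspellings and contextual allusions discussed in the Limitations section, which is why manual coding was necessary.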

3.4. Limitations

Using an open question has the advantage of avoiding imposing a particular framing, but also comes with the disadvantage that respondents are not prompted to reflect more closely on the issues, and answers may be superficial and unreflective. A further disadvantage of a single open question is that free-text responses are much more difficult to compare, quantify, and synthesize than answers to more specific questions. The question used in our study is biased inasmuch as it only concerns obstacles: the study is delimited to investigating obstacles experienced by students and teachers, and thus does not cover possible advantages of the electronic exams that students or teachers may have experienced. It would be useful to conduct future studies including both obstacles and advantages, as well as questions to better determine the relative importance of the various issues. The phrasing of the question in terms of barriers in our study may also involve a generally negative framing that could have influenced the respondents. By comparison, a survey by Alsadoon [92] on student perceptions of electronic assessment in Saudi Arabia resulted in strongly positive replies, which may have been influenced by the positive wording of the questions and a list of questions mentioning potential advantages.
The phrasing of the question used in the present study did not make it clear whether respondents should answer based on their personal circumstances or give a more general answer from a communal perspective. Nor did the question specify whether it concerned electronic exams in general, or those that had been given shortly before the questionnaire was distributed. Most answers did not allow the researchers to determine how the respondents had interpreted the question (example of an answer: “power outages”). However, in those cases where the wording shows the interpretation, the communal perspective appears to be more frequent (examples of answers with a communal perspective: “lack of material and technical possibilities for some students”; “weak internet for all and absence [of internet] in some houses”; examples of answers with an individual perspective: “I constantly work with my students during electronic teaching”; “half of the questions were displayed with scrambled letters. This is an experience that has happened to me personally”).
The issues mentioned above may affect the validity of the findings, and above all have implications for how the results should be interpreted. Examining the wording used in respondents’ answers suggests that a communal perspective was more common. This is helpful for the purpose of the study, particularly in view of the relatively small number of respondents: in the context of the close-knit communities of Gaza, respondents likely have fairly detailed and accurate knowledge of the conditions other students and teachers were experiencing. However, it also makes findings from the study less suitable for drawing conclusions about differences among specific student or teacher groups and individuals.
Considering that the questionnaire was given shortly after electronic exams in the context of emergency remote teaching, it is likely that respondents over-emphasized their most recent exam experience and may have forgotten other issues. Since some students and teachers may have had prior experience with electronic exams administered under circumstances that allowed better design and preparation, it would be interesting for future research, and for grasping the potential for developing electronic exams in the context of Gaza, to explore to what extent the emergency circumstances affected experiences. An additional issue is that answers frequently mentioned issues connected to remote learning and teaching during the pandemic, but the design of the questionnaire did not make it possible to systematically investigate correlations between obstacles to teaching and learning, on the one hand, and obstacles connected to the electronic exams, on the other. Furthermore, the questionnaire did not make it possible to disentangle to what extent certain issues, such as the poor design of exam questions, were specifically connected to the electronic mode of delivery, were problems also encountered with on-campus paper exams, or were pre-existing issues aggravated by the additional stress caused by the circumstances.
Coding and grouping the data into larger categories entailed two major problems. Since the categories chosen are interconnected, there is possible overlap between some elements of the categories, while the interconnections are not mapped. The limited number of categories also meant that certain interesting details were left aside. For instance, the scrambling of Arabic letters by ICTs is a well-known problem that can in principle be addressed if adequate software and systems are used both at the university and among students. However, streamlining software and devices to ensure compatibility is a problem even in high-income contexts, and in the context of Gaza it also becomes a problem of resources. In some cases, coding was not clear-cut, as the implications could fit into more than one category. An example is the following student response, which was coded as belonging to the obstacle ‘There was no real surveillance, widespread cases of fraud’: “there might be a problem during the submission of exams, and there was nobody who could take care [of the problem]”. While the response does refer to the general issue that there was no invigilation of the electronic exams (which in turn was connected to a perception that fraud was widespread), it also points to the distress of students who could not get help from a physical person invigilating the exam. Several such issues noted in the data but incompletely captured by the coding and categorization have therefore been discussed throughout the paper. It should also be noted that counting each response mentioning an obstacle as one unit for the quantitative analysis does not necessarily reflect the importance this obstacle had for the individual respondent.
The study was conducted without funding and by a small group of researchers. Coding software would have allowed quicker and easier processing of the data. However, coding software for Arabic is designed for well-edited texts in Standard Arabic and does not perform well on this type of data, which contains a number of spelling mistakes and incomplete sentences, as well as expressions and allusions that can only be fully understood with knowledge of the local culture and context, and familiarity with local language usage.
Finally, it can be assumed that student perceptions of what constitutes an “obstacle” will depend on their beliefs concerning what are normal or acceptable challenges associated with university studies. This, in turn, will depend on their own prior exam experiences and the orientation of their studies (i.e., whether investing in devices and ICT skills is central to their future career path or merely a means to the end of obtaining the degree). Beliefs are likely to be influenced by discussions with older role models or peers concerning what to expect from their studies. However, to what extent an obstacle is considered serious is likely to depend on students’ overall situation, and the multiple obstacles they may already be facing for other reasons. In the context of Gaza, the time and money invested in academic studies already constitute a major burden for many households, and failing an exam represents the loss of that investment. Lack of resources limits the opportunities for students to extend their studies and retake exams later, while failing also means losing hope for future livelihoods in a local context with very high unemployment. Aspects that in other contexts might not be perceived as serious obstacles might therefore be mentioned by students in this study.

4. Results

4.1. Classification of Constraints

Table 4 presents the 13 barriers and obstacles that were identified in the analysis of the answers to the questionnaire. The thirteen obstacles were divided into four different groups or categories; these obstacles are the factors that could prevent or reduce the quality of applying electronic remote examination under the COVID-19 pandemic.
The numbers listed in the student repetition column are the number of times each obstacle was mentioned by the students, out of a total of 55; the next column gives the same for the university teachers, out of a total of 97. The overall total column is the combined number of mentions of each obstacle by university teachers and students together, followed by its percentage of the total of 152 respondents.
The university teachers and their students attributed the highest percentages among the obstacles to achieving the quality of applying electronic exams remotely under the COVID-19 pandemic to the lack of financial and technical capabilities of some students (72.4%) and power cuts (lack of electricity, 66.4%), followed by unavailability of the internet and poor internet quality (57.9%), difficulty and poor study in light of difficult conditions (57.2%), and the perception that electronic exams will not show the true level of students and will not distinguish students from others (45.4%).
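These overall percentages can be reconciled with the per-group percentages reported in the Discussion. A sketch of the calculation, assuming the overall figure is the combined number of mentions divided by the 152 respondents:

```python
def overall_pct(student_pct, teacher_pct, n_students=55, n_teachers=97):
    """Combine per-group mention percentages into the overall percentage.

    Per-group percentages are converted back to (rounded) mention counts,
    summed, and divided by the total sample of 152 respondents.
    """
    mentions = (round(student_pct / 100 * n_students)
                + round(teacher_pct / 100 * n_teachers))
    return round(mentions / (n_students + n_teachers) * 100, 1)

# Power cuts: 47.3% of 55 students (26) and 77.3% of 97 teachers (75)
# give 101 of 152 mentions overall.
print(overall_pct(47.3, 77.3))
```

For example, combining 47.3% of the students with 77.3% of the teachers reproduces the 66.4% reported for power cuts, and the same calculation reproduces the 72.4% and 57.9% figures for the two internet-related obstacles above.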

4.2. Obstacles Arrangement (Ordering)

Table 5 presents the obstacles and their percentages found in this study for the university teachers and students, respectively, as well as the overall percentage. These percentages show the differences between the obstacles, including their ranking and the extent of agreement or divergence between teacher and student perceptions.
From the collected answers, we find that the sample as a whole considered the lack of financial and technical capabilities of some students (such as devices, internet access, and applications) to be one of the biggest obstacles to achieving the quality of applying electronic exams remotely under the COVID-19 pandemic; it was the main concern for both university teachers and students. These financial and technical obstacles also affected students’ opportunities for learning, since students had difficulties understanding some subjects in the absence of interaction in the classroom. While there is a gap between university teachers and students in the breakdown percentages for most obstacles, as well as in the overall percentages, there is no significant difference between them when looking at the order of obstacles 1 to 9.

4.3. Significance of Diverging Perceptions

While there was substantial agreement between the two groups of respondents concerning the relative importance of most obstacles, certain divergences could also be observed. According to the results in Table 6, four out of 13 specified obstacles differ from the viewpoint of university teachers compared to students.
For example, the lack of experience of parents and students in technology received the highest percentage among the university teachers (58.8%) but did not appear in the students’ responses. Another example is that 27.3% of the students complained about lacking time, too many questions, and their own lack of experience. This obstacle did not appear in the teachers’ list, presumably because such issues are perceived as falling mostly under the students’ own responsibility and as a source of student anxiety.

5. Discussion

In this section, we discuss the obstacles that affect quality in applying electronic exams, according to the points of view of the study sample members, and then compare them in different forms. The challenges of teachers and students expressed in the survey are largely consistent with what has been found in previous studies, concerning the lack of financial and technical capabilities of some students, the need for reliable electricity to complete the exam [18], and an uninterrupted internet connection throughout the exam period [78]. The survey further showed that teachers and students feel that there was no real monitoring and possibly widespread fraud, and that electronic exams will not show the true level of students or distinguish students from each other. Academic fraud is another major issue with electronic exams that has been highlighted in the literature (see Ngqondi et al. [52], Liu et al. [54]). Findings from our survey are also in line with James [77], who points to issues of confidentiality.
The results of the present study indicate that the obstacles to implementing electronic exams adequate for distance education during the COVID-19 pandemic fall under the following four categories.

5.1. Personal Obstacles

According to the viewpoint of the university teachers and the students, four obstacles out of 13 were categorized as personal obstacles. In this group, two obstacles were reported by both teachers and students. The first, “difficulty and poor study under difficult conditions”, was reported by a considerable percentage of students (40%) and a high percentage of university teachers (67%). The second, that “electronic exams will not show the students’ real level and will not distinguish between students”, was reported by a moderate percentage of students (23.6%), but by more than twice that percentage among the university teachers (57.7%). Two obstacles were not shared between the two groups. The teachers perceived parents’ lack of conviction about electronic exams (parents not able to cooperate) as an obstacle (41.2%), which was not mentioned by students. The students instead mentioned lack of time, too many questions, and a lack of experience (27.3%).
Informal follow-up conversations with the students in connection with the study suggest that they did not like to change their exam habits and had difficulty in comprehending some academic subjects in the absence of class interaction, due to their familiarity with traditional (classroom) learning. The sudden transition to distance education also appeared to result in random browsing, wasted time, and students sitting for long periods in front of the computer [61], which may affect their health and increase nervousness, as well as leading to feelings of isolation due to the absence of face-to-face interaction with others. Both the results from the questionnaire and the informal follow-up conversations suggest that remote education and electronic exams can face resistance. These findings are in line with other studies in the field [54,61].

5.2. Pedagogical Obstacles

Three obstacles out of 13 were categorized as pedagogical obstacles. Both groups of respondents (teachers and students) agree that the exams did not have a high quality of design and preparation (21.8% of students and 47.8% of teachers) and that answers were very similar among the students (16.4% of students and 28.9% of teachers). By contrast, only the university teachers noted the lack of sufficient experience in preparing and applying electronic exams (36.1%).
The difficulties experienced in managing some pedagogical activities, such as conducting exams and assignments within the e-learning environment, in addition to the difficulty for teachers of obtaining feedback to identify the strengths and weaknesses of students, could be due to the lack of use of e-mail, digital learning platforms, or social media by university teachers in their communication with students (see Falta and Sadrata [93]).

5.3. Technical Obstacles

Three obstacles out of 13 were technical obstacles. All three were mentioned by both groups of respondents, but at different levels; e.g., power cuts (no electricity) were noted by almost half the students (47.3%) and by three-quarters of the teachers (77.3%). University teachers and students also indicated that one of the most important obstacles to achieving the quality of applying electronic exams remotely during the COVID-19 pandemic was weak or unavailable internet (36.4% of students and 70.1% of teachers) in many areas, with the consequent interruptions in broadcasting, obstruction of the follow-up of lessons, and possibly students’ distraction from communication. This was in addition to concerns about the security and confidentiality of data and information, and everything related to protection from piracy on the internet, which affects online courses and exams and their results, as confirmed by the literature [53,73]. Issues with power cuts and internet connectivity have been observed consistently as a serious problem in studies from low- and lower-middle-income contexts [16,17,18,19,20,45], but the internet can also be a concern in high-income countries [54,77,78].
While earlier work on e-learning focused on an asynchronous approach, where internet connectivity did not pose major problems, Ismail et al. [47] argue that more recent work presupposes a synchronous approach. The latter approach increases vulnerability to issues such as power outages or poor internet connectivity, which becomes particularly noticeable in the situation of high-stakes summative e-examination. By contrast, when e-exams are taken in centers with monitoring staff, students can get help with technical problems from the staff, who are also able to document the encountered issues. Liu et al. [54] note that during monitored paper exams, students suffer less stress since they do not have to worry about system crashes or network disconnections.

5.4. Financial and Organizational Obstacles

Three of the 13 obstacles were financial and organizational. Two of these were mentioned by both teachers and students: the lack of financial and technical capabilities of some students was noted by more than half of the students (52.7%) and by more than four-fifths of the university teachers (83.5%). The impact of this type of issue is in line with the study by Jadee [94] in an Iranian context, which describes the establishment of electronic exam centers provided with technical cadres, electronic devices, programs, and the necessary equipment to address such problems. In a southern African context, similar centers are discussed by Zozie and Chawinga [45] and Mbatha and Naidoo [95], who also stress the importance of functioning internet.
Among the details in the responses concerning lack of financial and technical capabilities, besides internet connectivity, a high percentage of respondents noted the lack of remote communication equipment. This is consistent with the findings in Al-Umari et al. [96], who observed that some students in an Iraqi context had difficulty obtaining computers or similar devices. Lack of training in the use of information technology is a problem for both university teachers and students, which is consistent with the findings in the literature (see, for instance, Jawida et al. [97] and Jadee [94]).
Most home environments in the Gaza governorates are not suitable for distance learning. Some family situations are chaotic due to a large number of children in the same house, the narrow space of the home, and the presence of a significant number of learners in the same family sharing one computer. All these circumstances would impact outcomes of remote emergency teaching during the COVID-19 pandemic or any similar crisis [37].
In a study of changes in student achievement at Al-Quds Open University between mixed remote and blended education and fully remote teaching and examination during the pandemic, Jawad and Shalash [75] found that student achievement increased with fully remote delivery, particularly among male students. The positive effect was statistically significant for all programs except the Arabic Language, Social Studies, and Community Service programs. The study does not distinguish between the West Bank and Gaza; students in their first academic year, students who withdrew from one of the two semesters, students who changed their academic major during the study period, and preparatory-year students were excluded from the sample.
In comparison with the other universities present in Gaza, Al-Quds Open University has long experience with distance and blended teaching. Its students also have experience of these modes of teaching, and it is likely that students enrolled in the open university programs chose them because they best suit their personal needs and constraints. Since the other Gaza universities do not have this capacity and experience, our study, as well as the study by Shehab et al. [18], reveals equity and capacity issues that may be more representative of the general university student and teacher population in Gaza.

5.5. Extent of Agreement between Student and Teacher Responses

This study aimed to identify and compare different obstacles to applying electronic exams during the COVID-19 pandemic, with the results listed in Table 3 and Table 4. We also used regression analysis to compare the answers of university teachers and students, as seen in Figure 1 and Table 7. This section examines the differences between the percentages that teachers and students assigned to each obstacle, referred to here as the breakdown percentage.
Table 7 summarizes the statistical breakdown of the percentages for the nine obstacles identified by both university teachers and students, used to specify and compare the relationship between their opinions. The R-squared value of 0.892 indicates that the model accounts for about 89.2% of the variance in the dependent variable (the two variables used for this analysis being the teachers' and the students' responses on the common obstacles). Higher R-squared values generally indicate a better regression fit between the two sides of the comparison; the high value found here means there is strong agreement between the two samples, teachers and students.
The standard error of the estimate (the error of the regression) represents the average distance by which the observed values fall from the regression line and indicates the typical size of the residuals [98]. In this study, this standard distance is 4.64 percentage points. Such a low value signifies that the distances between the sample points and the fitted values are small, which again indicates strong agreement between university teachers and students concerning perceived obstacles.
The ANOVA output contains several statistics, but we focus on the most important one, Significance F, which is also the p-value for the F-test of overall significance. A p-value is the probability of obtaining an effect at least as large as the one observed in the data, calculated under the assumption that the null hypothesis is true; lower p-values indicate greater evidence against the null hypothesis [98].
The Significance F (p-value) for the overall F-test is 0.0001 × 10⁻⁴, as seen in Table 5. This value is far below any conventional significance level, so we can conclude that the regression model in this study, taken as a whole, is statistically significant.
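The three statistics discussed above (R-squared, the standard error of the estimate, and Significance F) can be reproduced with a simple linear regression of student percentages on teacher percentages. The sketch below uses hypothetical obstacle percentages, not the study's actual data; only the 47.3%/77.3%, 36.4%/70.1%, and 52.7%/83.5% pairs reported in Sections 5.3 and 5.4 are taken from the text, and the remaining six pairs are invented for illustration.

```python
import numpy as np
from scipy import stats

# Hypothetical data: share of teachers and students citing each of the
# nine common obstacles. Only the first three pairs come from the paper;
# the rest are invented to make the example runnable.
teachers = np.array([77.3, 70.1, 83.5, 65.0, 58.2, 72.4, 61.7, 55.9, 68.3])
students = np.array([47.3, 36.4, 52.7, 38.1, 33.5, 44.0, 35.2, 30.8, 41.6])

# Simple linear regression of student on teacher percentages.
res = stats.linregress(teachers, students)
r_squared = res.rvalue ** 2  # proportion of variance explained

# Standard error of the estimate: typical residual size, in the same
# units as the data (percentage points); n - 2 degrees of freedom.
predicted = res.intercept + res.slope * teachers
n = len(teachers)
see = np.sqrt(np.sum((students - predicted) ** 2) / (n - 2))

# For simple regression with one predictor, the overall F-test p-value
# (Significance F) equals the two-sided p-value of the slope's t-test.
print(f"R-squared        = {r_squared:.3f}")
print(f"Std. error est.  = {see:.2f} percentage points")
print(f"Significance F   = {res.pvalue:.2e}")
```

A high R-squared and a small standard error together indicate that the two respondent groups rank and weight the obstacles very similarly, which is the interpretation the study draws from Table 7.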

5.6. Future Research

Both the literature and findings from the present study point to general issues with electronic assessment in higher education linked to infrastructure and income. Such issues need additional attention to prevent large segments of the world’s population from being excluded from higher education. The Scopus search conducted (see Section 2.4 above) suggests that very little attention is devoted to gender, and disability is rarely considered in research design or findings. Disability (see e.g., Al-Rantisi [99]) is of particular concern in regions affected by armed conflict, while such regions are also exposed to specific constraints that both affect students’ ability to safely move between home, work, and campus, and the physical infrastructure of electricity production, grids, and internet. Both existing conditions in regions affected by armed conflict and strategies to address these challenges, therefore, require further research.
The present exploratory study was based on an electronically distributed questionnaire with a single open-ended question. To interpret the results, and in particular to gain a fine-grained picture of how electronic exams affect specific student groups and of how the different factors that enable or constrain studies combine and interact, the questionnaire would need to be followed up and complemented with in-depth qualitative interviews capturing students' lived experiences of electronic exams.
For low- and lower-middle-income countries, but also for those contexts in high-income countries with large proportions of disadvantaged students, the question of educational equity, across and within countries, becomes central. In such settings, we cannot treat the difficulties encountered by students as merely an individual issue, or normalize the equation of weak socio-economic background with poor academic achievement. Electronic assessment can improve access to higher education for certain groups (such as working students) under some conditions, while barring low-income students from education under others. The specific constraints experienced by various student groups, and how these constraints interrelate with various assessment options in resource-constrained settings, therefore require additional research, covering both the obstacles and the advantages that different student groups may experience.

6. Conclusions

Whereas prior to the pandemic, the choice of delivery mode for any element of a program was made based on criteria of suitability for the context or purpose, COVID-19 has led to a generalized situation of emergency remote delivery. Electronic assessment has therefore been widely used in contexts where it might be less suitable for a variety of reasons (such as lack of technical equipment and infrastructure, nature of content or skills being assessed, insufficient pedagogical competence, or lack of time to design exams). This study concerns the current situation during the COVID-19 pandemic, particularly in the Palestinian universities in the governorates of Gaza. Most of the obstacles experienced by students and teachers correspond to issues identified in studies in other contexts. Some of these are general, while others are more specific to low-income countries, and contexts affected by social inequities.
With respect to several of the obstacles identified in this study, strategies successfully employed in other contexts could also be applied in the context of the Gaza governorates. Thus, to mitigate some of these obstacles to applying electronic exams, university teachers can, for instance, alter questions to reduce fraud, such as by asking questions that are not easy to answer from books or the internet, or by adding a timer to each question to limit the time available to search for the answer [44]. For certain courses, replacing written exams with remote oral exams (by videoconferencing or by phone) might be an option [72,100]. Problems connected to stress and unfamiliarity with e-exam formats can be addressed, particularly if asynchronous exam formats are used [78] and sufficient time is allowed to complete them. Higher-order skills and learning outcomes can be supported using improved exam design as suggested by Guàrdia et al. [30], to avoid the reduction in quality that standardized assessment and automated correction can otherwise entail [21]. However, this presupposes considerable investment in teacher training and discipline-specific pedagogical development, as well as allocating sufficient time for exam preparation [46] and well-considered design with respect to the skills and outcomes desired. Mandour [101] also stresses the importance of adequate planning and training for the introduction of electronic exams.
Most of the Gaza universities adopted distance learning with little knowledge of the obstacles that would hinder the successful application of electronic exams. Similar problems have been encountered in other settings that introduced electronic teaching provision and examination as an emergency measure and are discussed in the literature. A major issue in the context of the governorates of Gaza is the underlying vulnerability of the electrical power supply and grid, with long daily power cuts [18,102,103]. While in other resource-constrained contexts issues with access to electricity are often related to remote locations and can be remedied in the longer term through the use of learning centers with decentralized renewable power units [104], in Gaza, due to restrictions on imports of equipment and systematic attacks on basic infrastructure [105,106,107], this situation is unlikely to improve in the near future. Due to the impacts of the deteriorating economic situation in Gaza on ICT infrastructure, the internet connection speed in 2020 was only 8 megabytes/second, compared to 14 in 2018 [108]. Future prospects for improving the internet are somewhat better than for electricity infrastructure, but investments in this domain would place a heavy strain on already constrained resources. As in other countries with limited internet speeds, the number of people using the internet has increased since the coronavirus (COVID-19) was declared a pandemic, which may further affect the capacity to deliver distance education.
Access to suitable devices and students' living conditions at home are difficult to address in the context of Gaza. Although three-quarters of households have access to mobile phones, this does not ensure sufficient access to online resources [18,109]. According to Shehab et al., only 9% of families own a computer, and two-thirds of families cannot afford to buy one [18]. Additionally, while in other settings the literature suggests that it can be advantageous for HEIs to transfer certain costs connected to classroom examinations to the students, any solutions involving imports of technical equipment and devices will strain the balance of payments of the local economy. Similar considerations apply to the increase in the cost of licenses for virtual platforms and electronic administrative systems. For individual students, in the high-stakes situation of exams, this also becomes a question of equity. At the same time, although distances are small in Gaza, the cost and time of travelling between home and university can still be significant for low-income students.
While there are relatively weak arguments for generalizing e-exams beyond the COVID-19 crisis, access to digitally mediated content could raise the quality of education and facilitate collaboration with HEIs outside the Gaza strip. In this perspective, decentralized learning centers [45,94,95] with stable power supplies, up-to-date devices, and the possibility to retrieve uploaded content asynchronously would offer considerable advantages (see also Baticulon et al. [20]). For the context of Gaza, Shehab et al. [18] also recommend making materials available asynchronously. Additionally, access to online free courses and open education resources could improve educational equity [20], while Dutta and Smita [17] recommend offering free or reduced-cost internet packages, to reduce digital inequity and the financial burden on students.
The present study has focused on issues directly connected to electronic examination in higher education. It is nevertheless important to note that digital provision in the course of the year is likely to affect study conditions, and that inequalities related to the socioeconomic background of students are therefore likely to be aggravated. Besides issues such as lack of access to suitable devices or internet connectivity [18], this includes unsuitable study environments, lack of access to libraries for students who cannot afford expensive books, and the time needed to work and contribute to family livelihoods in a context of economic crisis [17,20]. If the situation is protracted, students from lower-income families will therefore be more likely to fail exams and abandon their studies [110], as well as being less likely to commence higher education pathways.
Remote electronic provision of higher education can support educational equity when it is a complement to face-to-face on-campus teaching since it will be chosen by those students who find it to their advantage. This is particularly the case for asynchronous delivery, which allows working students and students taking care of children to take higher education courses. By contrast, when remote electronic education is offered as the dominant mode, it will tend to exclude students with the least resources. Dependence on expensive technology will also tend to increase the gap between education institutions [103].
To conclude, the sudden shift to a remote provision in connection with the COVID-19 pandemic has clearly placed exceptional strains on both faculty and students in the Gaza governorates. Nevertheless, the current crisis provides an opportunity to reflect on the potentials of using ICTs more widely in higher education, as well as valuable information on the issues that need to be resolved with respect to electronic exams.

Author Contributions

Conceptualization, R.B. and M.A.; methodology, R.B. and M.A.; validation, R.B.; formal analysis, R.B.; investigation, M.A.; resources, R.B., H.A. and M.A.; data curation, R.B.; writing—original draft preparation, R.B., M.A. and H.A.; writing—review and editing, R.B. and H.A.; visualization, R.B. and M.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research was partly funded by FORMAS, project number (2017-01375).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

Thanks to the Centre for Middle Eastern Studies and for the support from the FORMAS Swedish research council for sustainable development.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Scopus listed publications (summarized in Table 1).
Abu Talib M., Bettayeb A.M., Omer R.I. Analytical study on the impact of technology in higher education during the age of COVID-19: Systematic literature review. Education and Information Technologies 2021. Doi: 10.1007/s10639-021-10507-1
Adanir et al. (a): Afacan Adanir G., Muhametjanova G., Celikbag M.A., Omuraliev A., Ismailova R. Learners’ preferences for online resources, activities, and communication tools: A comparative study of Turkey and Kyrgyzstan. E-Learning and Digital Media 2020, 17(2), 148 – 166.
Adanir et al. (b): Adanir G.A., Ismailova R., Omuraliev A., Muhametjanova G. Learners perceptions of online exams: A comparative study in Turkey and Kyrgyzstan. International Review of Research in Open and Distance Learning 2020, 21(3), 1 – 17.
Adzima K. Examining online cheating in higher education using traditional classroom cheating as a guide. Electronic Journal of e-Learning 2020, 18(6), 476 – 493.
Alfiras M., Bojiah J., Yassin A.A. COVID-19 pandemic and the changing paradigms of higher education: A Gulf university perspective. Asian EFL Journal 2020, 27(5.1), 339 – 347.
Almusharraf N.M., Khahro S.H. Students’ Satisfaction with Online Learning Experiences during the COVID-19 Pandemic. International Journal of Emerging Technologies in Learning 2020, 15(21), 246 – 267.
Alshurideh M.T., Al Kurdi B., AlHamad A.Q., Salloum S.A., Alkurdi S., Dehghan A., Abuhashesh M., Masa’deh R., Factors affecting the use of smart mobile examination platforms by universities’ postgraduate students during the COVID-19 pandemic: An empirical study. Informatics 2021, 8(2), 32.
Ana A. Trends in expert system development: A practicum content analysis in vocational education for over grow pandemic learning problems. Indonesian Journal of Science and Technology 2020, 5(2), 246-260.
Arora S., Chaudhary P., Singh R.K. Impact of coronavirus and online exam anxiety on self-efficacy: the moderating role of coping strategy. Interactive Technology and Smart Education 2021. Doi: 10.1108/ITSE-08-2020-0158
Ayyash M.M., Herzallah F.A.T., Ahmad W. Towards social network sites acceptance in e-learning system: Students’ perspective at Palestine Technical University-Kadoorie. International Journal of Advanced Computer Science and Applications 2020, 2, 312 – 320.
Bashitialshaaer, R.; Alhendawi, M.; Lassoued, Z. Obstacle Comparisons to Achieving Distance Learning and Applying Electronic Exams during COVID-19 Pandemic. Symmetry 2021, 13, 99.
Butler-Henderson, K., Crawford, J. A systematic review of online examinations: A pedagogical innovation for scalable authentication and integrity. Computers & Education, 2020, 159, 104024
Cicek I., Bernik A., Tomicic I. Student thoughts on virtual reality in higher education—a survey questionnaire. Information 2021, 12(4), 151.
Coskun-Setirek A., Tanrikulu Z. M-Universities: Critical Sustainability Factors. SAGE Open 2021, 11(1), DOI 10.1177/2158244021999388
Goff D., Johnston J., Bouboulis B.S. Maintaining academic standards and integrity in online business courses. International Journal of Higher Education 2020, 9(2), 248 – 257.
Goldberg D.M. Programming in a pandemic: Attaining academic integrity in online coding courses. Communications of the Association for Information Systems 2021, 48, 47 – 54.
Gormaz-Lobos D., Galarce-Miranda C., Hortsch H., Teacher Training’s Needs in University Context: A Case Study of a Chilean University of Applied Sciences. International Journal of Emerging Technologies in Learning 2021, 16(9), 119 – 132.
Gottipati S., Shankararaman V. Rapid transition of a technical course from face-to-face to online. Communications of the Association for Information Systems 2021, 48, 7 – 14.
Gudino Paredes S., Jasso Pena F.D.J., de La Fuente Alcazar J.M. Remote proctored exams: Integrity assurance in online education? Distance Education 2021. Doi: 10.1080/01587919.2021.1910495
Harris L., Harrison D., McNally D., Ford C. Academic Integrity in an Online Culture: Do McCabe’s Findings Hold True for Online, Adult Learners? Journal of Academic Ethics 2020, 18(4), 419-434.
Jamieson M.V. Keeping a learning community and academic integrity intact after a mid-Term shift to online learning in chemical engineering design during the COVID-19 Pandemic. Journal of Chemical Education 2020, 97(9), 2768 – 2772.
Jia J., He Y. The design, implementation and pilot application of an intelligent online proctoring system for online exams. Interactive Technology and Smart Education 2021. Doi: 10.1108/ITSE-12-2020-0246
Jaramillo-Morillo D., Ruiperez-Valiente J., Sarasty M.F., Ramirez-Gonzalez G. Identifying and characterizing students suspected of academic dishonesty in SPOCs for credit through learning analytics. International Journal of Educational Technology in Higher Education 2020, 17(11), 45.
Jawad Y.A.L.A., Shalash B. The impact of e-learning strategy on students’ academic achievement case study: Al-Quds Open University. International Journal of Higher Education 2020, 9(6), 44-53.
Kamal A.A.,Shaipullah N.M.,Truna L., Sabri M., Junaini S.N. Transitioning to online learning during COVID-19 Pandemic: Case study of a Pre-University Centre in Malaysia. International Journal of Advanced Computer Science and Applications 2020, 11(6), 217-223.
Kamber D.N. Personalized Distance-Learning Experience through Virtual Oral Examinations in an Undergraduate Biochemistry Course. Journal of Chemical Education 2021 98 (2), 395 – 399.
Khan Z.F., Alotaibi S.R. Design and implementation of a computerized user authentication system for e-learning. International Journal of Emerging Technologies in Learning 2020, 15(9), 4-18.
Khan M.A., Vivek, Nabi M.K., Khojah M., Tahir M. Students’ perception towards e-learning during covid-19 pandemic in India: An empirical study. Sustainability 2021, 13(1), 1-14.
Klyachko T.L., Novoseltsev A.V., Odoevskaya E.V., Sinelnikov-Murylev S.G. Lessons Learned from the Coronavirus Pandemic and Possible Changes to Funding Mechanisms in Higher Education. Voprosy Obrazovaniya 2021,1, 1 – 30.
Kuhlmeier V.A., Karasewich T.A., Olmstead M.C. Teaching Animal Learning and Cognition: Adapting to the Online Environment. Comparative Cognition and Behavior Reviews 2020, 15, 1-29.
Lassoued, Z.; Alhendawi, M.; Bashitialshaaer, R. An exploratory study of the obstacles for achieving quality in distance learning during the COVID-19 pandemic. Educ. Sci. 2020, 10, 232, doi:10.3390/educsci10090232.
Novikov P. Impact of COVID-19 emergency transition to on-line learning on international students’ perceptions of educational process at Russian University. Journal of Social Studies Education Research 2020, 11(3) 270-302.
Qashou A. Influencing factors in M-learning adoption in higher education. Education and Information Technologies 2021, 26(2), 1755 – 1785.
Tsekea, S., Chigwada, J.P. COVID-19: strategies for positioning the university library in support of e-learning. Digital Library Perspectives 2021, 37(1), 54-64.
Usher M., Hershkovitz A. Forkosh-Baruch A. From data to actions: Instructors decision making based on learners data in online emergency remote teaching. British Journal of Educational Technology 2021. doi: 10.1111/bjet.13108
Wu W., Berestova A., Lobuteva A., Stroiteleva N. An Intelligent Computer System for Assessing Student Performance. International Journal of Emerging Technologies in Learning 2021, 16(2), 31-45.

References

  1. World Health Organization. Corona Virus Disease (COVID-19): Question and Answer. 2020. Available online: https://www.who.int/ar/emergencies/diseases/novel-coronavirus-2019/advice-for-public/q-a-coronaviruses (accessed on 17 May 2020).
  2. UNESCO. Startling Disparities in Digital Learning Emerge as COVID-19 Spreads: UN Education Agency. UN News, 21 April 2020. Available online: https://news.un.org/en/story/2020/04/1062232 (accessed on 21 March 2021).
  3. Khalaf, Z.N. Corona Virus and Digital Equality in Tele-Teaching in Emergency Situations. New Education Blog. Available online: https://www.new-educ.com// (accessed on 17 May 2020).
  4. García-Alberti, M.; Suárez, F.; Chiyón, I.; Mosquera Feijoo, C. Challenges and Experiences of Online Evaluation in Courses of Civil Engineering during the Lockdown Learning Due to the COVID-19 Pandemic. Educ. Sci. 2021, 11, 59. [Google Scholar] [CrossRef]
  5. Gewin, V. Pandemic burnout is rampant in academia. Nature 2021, 591, 489–491. [Google Scholar] [CrossRef]
  6. Bojovic, Z.; Bojovic, P.D.; Vujosevic, D.; Suh, J. Education in times of crisis: Rapid transition to distance learning. Comput. Appl. Eng. Educ. 2020, in press. [Google Scholar]
  7. Abdelhai, R.A. Distance Education in the Arab World and the Challenges of the Twenty-First Century; The Anglo-Egyptian Bookshop: Cairo, Egypt, 2010. [Google Scholar]
  8. Galvis, Á.H. Supporting decision-making processes on blended learning in higher education: Literature and good practices review. Int. J. Educ. Technol. High. Educ. 2018, 15, 1–38. [Google Scholar] [CrossRef]
  9. Castro, R. Blended learning in higher education: Trends and capabilities. Educ. Inf. Technol. 2019, 24, 2523–2546. [Google Scholar] [CrossRef]
  10. Bond, M.; Buntins, K.; Bedenlier, S.; Zawacki-Richter, O.; Kerres, M. Mapping research in student engagement and educational technology in higher education: A systematic evidence map. Int. J. Educ. Technol. High. Educ. 2020, 17, 2. [Google Scholar] [CrossRef]
  11. Shen, C.W.; Ho, J.T. Technology-enhanced learning in higher education: A bibliometric analysis with latent semantic approach. Comput. Hum. Behav. 2020, 104, 106177. [Google Scholar] [CrossRef]
  12. Hodges, C.; Moore, S.; Lockee, B.; Trust, T.; Bond, A. The difference between emergency remote teaching and online learning. Educ. Rev. 2020, 27, 1–12. [Google Scholar]
  13. Liguori, E.W.; Winkler, C. From offline to online: Challenges and opportunities for entrepreneurship education following the COVID-19 pandemic. Entrep. Educ. Pedagog. 2020. [Google Scholar] [CrossRef] [Green Version]
  14. Gonçalves, E.; Capucha, L. Student-Centered and ICT-Enabled Learning Models in Veterinarian Programs: What Changed with COVID-19? Educ. Sci. 2020, 10, 343. [Google Scholar] [CrossRef]
  15. Huang, R.H.; Liu, D.J.; Tlili, A.; Yang, J.F. Handbook on Facilitating Flexible Learning during Educational Disruption: The Chinese Experience in Maintaining Undisrupted Learning in COVID-19 Outbreak; Smart Learning Institute of Beijing Normal University: Beijing, China, 2020. [Google Scholar]
  16. Chakraborty, P.; Mittal, P.; Gupta, M.S.; Yadav, S.; Arora, A. Opinion of students on online education during the COVID-19 pandemic. Hum. Behav. Emerg. Tech. 2020, 3, 1–9. [Google Scholar] [CrossRef]
  17. Dutta, S.; Smita, M.K. The impact of COVID-19 pandemic on tertiary education in Bangladesh: Students’ perspectives. Open J. Soc. Sci. 2020, 8, 53–68. [Google Scholar]
  18. Shehab, A.; Alnajar, T.M.; Hamdia, M.H. A study of the effectiveness of E-learning in Gaza Strip during COVID-19 pandemic: The Islamic University of Gaza “case study”. In Proceedings of the 3rd Scientia Academia International Conference (SAICon-2020), Kuala Lumpur, Malaysia, 26–27 December 2020. [Google Scholar]
  19. Subedi, S.; Nayaju, S.; Subedi, S.; Shah, S.K.; Shah, J.M. Impact of E-learning during COVID-19 pandemic among nursing students and teachers of Nepal. Int. J. Sci. Healthc. Res. 2020, 5, 68–76. [Google Scholar]
  20. Baticulon, R.E.; Sy, J.J.; Alberto, N.R.I.; Baron, M.B.C.; Mabulay, R.E.C.; Rizada, L.G.T.; Tia, C.J.S.; Clarion, C.A.; Reyes, J.C.B. Barriers to online learning in the time of COVID-19: A national survey of medical students in the Philippines. Med. Sci. Educ. 2021, 31, 615–626. [Google Scholar] [CrossRef]
  21. Shraim, K. Online examination practices in higher education institutions: Learners’ perspectives. Turk. Online J. Distance Educ. 2019, 20, 185–196. [Google Scholar] [CrossRef]
  22. Basilaia, G.; Dgebuadze, M.; Kantaria, M.; Chokhonelidze, G. Replacing the classic learning form at universities as an immediate response to the COVID-19 virus infection in Georgia. Int. J. Res. Appl. Sci. Eng. Technol. 2020, 8, 101–108. [Google Scholar] [CrossRef]
  23. International Telecommunication Union ITU. The Last-mile Internet Connectivity Solutions Guide: Sustainable Connectivity Options for Unconnected Sites; International Telecommunication Union: Geneva, Switzerland, 2020. [Google Scholar]
  24. Kaliisa, R.; Michelle, P. Mobile learning policy and practice in Africa: Towards inclusive and equitable access to higher education. Australas. J. Educ. Technol. 2019, 35, 1–14. [Google Scholar] [CrossRef]
  25. International Telecommunication Union ITU. The affordability of ICT services 2020; ITU Policy Brief; ITU: Geneva, Switzerland, 2021; 8p. [Google Scholar]
  26. Alshurideh, M.T.; Al Kurdi, B.; AlHamad, A.Q.; Salloum, S.A.; Alkurdi, S.; Dehghan, A.; Abuhashesh, M.; Masa’deh, R. Factors affecting the use of smart mobile examination platforms by universities’ postgraduate students during the COVID-19 pandemic: An empirical study. Informatics 2021, 8, 32. [Google Scholar] [CrossRef]
27. Qashou, A. Influencing factors in M-learning adoption in higher education. Educ. Inf. Technol. 2021, 26, 1755–1785.
28. OECD. Bridging the Digital Gender Divide: Include, Upskill, Innovate; OECD: Paris, France, 2018.
29. Khlaif, Z.N.; Salha, S. The unanticipated educational challenges of developing countries in Covid-19 crisis: A brief report. Interdiscip. J. Virtual Learn. Med. Sci. 2020, 11, 130–134.
30. Guàrdia, L.; Crisp, G.; Alsina, I. Trends and challenges of e-assessment to enhance student learning in Higher Education. In Innovative Practices for Higher Education Assessment and Measurement; Cano, E., Ion, G., Eds.; IGI Global: Hershey, PA, USA, 2017; pp. 36–56.
31. Torres-Madroñero, E.M.; Torres-Madroñero, M.C.; Ruiz Botero, L.D. Challenges and Possibilities of ICT-Mediated Assessment in Virtual Teaching and Learning Processes. Future Internet 2020, 12, 232.
32. Bennett, R.E. Online Assessment and the Comparability of Score Meaning; Educational Testing Service: Princeton, NJ, USA, 2003.
33. Appiah, M.; Van Tonder, F. E-Assessment in Higher Education: A Review. Int. J. Bus. Manag. Econ. Res. 2018, 9, 1454–1460.
34. Bahmawi, H.; Lamir, N. The Impact of Educational Evaluation Methods in the Context of Comparative Pedagogy with Competencies on the Pattern of Learner Achievement. Master's Thesis, University of Adrar, Adrar, Algeria, 2014.
35. Fluck, A.; Pullen, D.; Harper, C. Case study of a computer based examination. J. Educ. Technol. 2009, 25, 509–523.
36. Ranne, S.C.; Lunden, O.; Kolari, S. An alternative teaching method for electrical engineering course. IEEE Trans. Educ. 2008, 51, 423–431.
37. Lassoued, Z.; Alhendawi, M.; Bashitialshaaer, R. An exploratory study of the obstacles for achieving quality in distance learning during the COVID-19 pandemic. Educ. Sci. 2020, 10, 232.
38. Bashitialshaaer, R.; Alhendawi, M.; Lassoued, Z. Obstacle Comparisons to Achieving Distance Learning and Applying Electronic Exams during COVID-19 Pandemic. Symmetry 2021, 13, 99.
39. Rytkönen, A.; Myyry, L. Student Experiences on Taking Electronic Exams at the University of Helsinki. In Proceedings of the World Conference on Educational Multimedia, Hypermedia and Telecommunications, Tampere, Finland, 24 June 2014; pp. 2114–2121.
40. Al-Khayyat, M. Students and instructor attitudes toward computerized tests in Business Faculty at the main campus of Al-Balqa Applied University. An-Najah Univ. J. Res. Hum. Sci. 2017, 31, 2041–2072.
41. Tawafak, R.M.; Mohammed, M.N.; Arshah, R.B.A.; Romli, A. Review on the effect of student learning outcome and teaching technology in Omani's higher education institution's academic accreditation process. In Proceedings of the 2018 7th International Conference on Software and Computer Applications, CSCA 2018, Kuantan, Malaysia, 8–10 February 2018; pp. 243–247.
42. Tsiligiris, V.; Hill, C. A prospective model for aligning educational quality and student experience in international higher education. Stud. High. Educ. 2021, 46, 228–244.
43. Ahokas, K. Electronic Exams Save Teachers' Time—and Nature. Available online: https://www.valamis.com/stories/csc-exam (accessed on 2 January 2021).
44. Knowly. Advantages and Disadvantages of Online Examination System. 22 July 2020. Available online: https://www.onlineexambuilder.com/knowledge-center/exam-knowledge-center/advantages-and-disadvantages-of-online-examination-system/item10240 (accessed on 2 December 2020).
45. Zozie, P.; Chawinga, W.D. Mapping an open digital university in Malawi: Implications for Africa. Res. Comp. Int. Educ. 2018, 13, 211–226.
46. Bennett, R.E. How the internet will help large-scale assessments reinvent itself. Educ. Policy Anal. Arch. 2001, 9, 1–23.
47. Ismail, R.; Safieddine, F.; Jaradat, A. E-university delivery model: Handling the evaluation process. Bus. Process. Manag. J. 2019, 25, 1633–1646.
48. Ryan, S.; Scott, B.; Freeman, H.; Patel, D. The Virtual University: The Internet and Resource-Based Learning; Kogan Page: London, UK, 2000.
49. Clotilda, M. Features of Online Examination Software. 24 May 2020. Available online: https://www.creatrixcampus.com/blog/top-15-features-online-examination-software (accessed on 5 January 2021).
50. Harrison, N.; Bergen, C. Some design strategies for developing an online course. Educ. Technol. 2000, 40, 57–60.
51. Luck, J.A.; Chugh, R.; Turnbull, D.; Rytas Pember, E. Glitches and hitches: Sessional academic staff viewpoints on academic integrity and academic misconduct. High. Educ. Res. Dev. 2021, 41, 1–16.
52. Ngqondi, T.; Maoneke, P.B.; Mauwa, H. A secure online exams conceptual framework for South African universities. Soc. Sci. Humanit. Open 2021, 3, 100132.
53. Jaramillo-Morillo, D.; Ruiperez-Valiente, J.; Sarasty, M.F.; Ramirez-Gonzalez, G. Identifying and characterizing students suspected of academic dishonesty in SPOCs for credit through learning analytics. Int. J. Educ. Technol. High. Educ. 2020, 17, 45.
54. Liu, I.F.; Chen, R.S.; Lu, H.C. An exploration into improving examinees' acceptance of participation in online exam. Educ. Technol. Soc. 2015, 18, 153–165.
55. Sarrayrih, M.A.; Ilyas, M. Challenges of Online Exam, Performances and problems for Online University Exam. IJCSI Int. J. Comput. Sci. Issues 2013, 10, 439–443.
56. Sarrayrih, M.A. Implementation and security development online exam, performances and problems for online university exam. Int. J. Comput. Sci. Inf. Secur. 2016, 14, 24.
57. Harris, L.; Harrison, D.; McNally, D.; Ford, C. Academic Integrity in an Online Culture: Do McCabe's Findings Hold True for Online, Adult Learners? J. Acad. Ethics 2020, 18, 419–434.
58. Dawson, P. Five ways to hack and cheat with bring-your-own-device electronic examinations. Br. J. Educ. Technol. 2016, 47, 592–600.
59. Chirumamilla, A.; Sindre, G.; Nguyen-Duc, A. Cheating in e-exams and paper exams: The perceptions of engineering students and teachers in Norway. Assess. Eval. High. Educ. 2020, 45, 940–957.
60. Küppers, B.; Eifert, T.; Zameitat, R.; Schroeder, U. EA and BYOD: Threat Model and Comparison to Paper-based Examinations. In Proceedings of the CSEDU 2020—12th International Conference on Computer Supported Education, Prague, Czech Republic, 2–4 May 2020; pp. 495–502.
61. Iglesias-Pradas, S.; Hernández-García, Á.; Chaparro-Peláez, J.; Prieto, J.L. Emergency remote teaching and students' academic performance in higher education during the COVID-19 pandemic: A case study. Comput. Hum. Behav. 2021, 119, 106713.
62. Kuntsman, A.; Rattle, I. Towards a paradigmatic shift in sustainability studies: A systematic review of peer reviewed literature and future agenda setting to consider environmental (Un)sustainability of digital communication. Environ. Commun. 2019, 13, 567–581.
63. Wu, W.; Berestova, A.; Lobuteva, A.; Stroiteleva, N. An Intelligent Computer System for Assessing Student Performance. Int. J. Emerg. Technol. Learn. 2021, 16, 31–45.
64. Tsekea, S.; Chigwada, J.P. COVID-19: Strategies for positioning the university library in support of e-learning. Digit. Libr. Perspect. 2021, 37, 54–64.
65. Usher, M.; Hershkovitz, A.; Forkosh-Baruch, A. From data to actions: Instructors' decision making based on learners' data in online emergency remote teaching. Br. J. Educ. Technol. 2021.
66. Gormaz-Lobos, D.; Galarce-Miranda, C.; Hortsch, H. Teacher Training's Needs in University Context: A Case Study of a Chilean University of Applied Sciences. Int. J. Emerg. Technol. Learn. 2021, 16, 119–132.
67. Jamieson, M.V. Keeping a learning community and academic integrity intact after a mid-term shift to online learning in chemical engineering design during the COVID-19 Pandemic. J. Chem. Educ. 2020, 97, 2768–2772.
68. Almusharraf, N.M.; Khahro, S.H. Students' Satisfaction with Online Learning Experiences during the COVID-19 Pandemic. Int. J. Emerg. Technol. Learn. 2020, 15, 246–267.
69. Klyachko, T.L.; Novoseltsev, A.V.; Odoevskaya, E.V.; Sinelnikov-Murylev, S.G. Lessons Learned from the Coronavirus Pandemic and Possible Changes to Funding Mechanisms in Higher Education. Vopr. Obraz. 2021, 1, 1–30.
70. Abu Talib, M.; Bettayeb, A.M.; Omer, R.I. Analytical study on the impact of technology in higher education during the age of COVID-19: Systematic literature review. Educ. Inf. Technol. 2021.
71. Butler-Henderson, K.; Crawford, J. A systematic review of online examinations: A pedagogical innovation for scalable authentication and integrity. Comput. Educ. 2020, 159, 104024.
72. Kamber, D.N. Personalized Distance-Learning Experience through Virtual Oral Examinations in an Undergraduate Biochemistry Course. J. Chem. Educ. 2021, 98, 395–399.
73. Gudino Paredes, S.; Jasso Pena, F.D.J.; de La Fuente Alcazar, J.M. Remote proctored exams: Integrity assurance in online education? Distance Educ. 2021.
74. Arora, S.; Chaudhary, P.; Singh, R.K. Impact of coronavirus and online exam anxiety on self-efficacy: The moderating role of coping strategy. Interact. Technol. Smart Educ. 2021.
75. Jawad, Y.A.L.A.; Shalash, B. The impact of e-learning strategy on students' academic achievement case study: Al-Quds Open University. Int. J. High. Educ. 2020, 9, 44–53.
76. Alfiras, M.; Bojiah, J.; Yassin, A.A. COVID-19 pandemic and the changing paradigms of higher education: A Gulf university perspective. Asian EFL J. 2020, 27, 339–347.
77. James, R. Tertiary student attitudes to invigilated, online summative examinations. Int. J. Educ. Technol. High. Educ. 2016, 13, 2–13.
78. Thomas, P.; Price, B.; Paine, C.; Richards, M. Remote electronic examinations: Student experiences. Br. J. Educ. Technol. 2002, 33, 537–549.
79. Ossiannilsson, E. Quality Models for Open, Flexible, and Online Learning. J. Comput. Sci. Res. 2020, 2, 19–31.
80. Elayyat, J. Assessment of the Higher Education Sector Needs in Palestine, in the Context of the National Spatial Plan 2025. Ph.D. Thesis, Birzeit University, Birzeit, Palestine, 2016.
81. UN Population Division. World Population Prospects. Available online: https://population.un.org/wpp/ (accessed on 10 June 2021).
82. General Administration of Planning, Ministry of Education and Higher Education. Statistical Yearbook of Education in the Governorates of Gaza 2019/2020, Gaza, Palestine. Available online: www.mohe.ps (accessed on 10 June 2021).
83. Palestinian Central Bureau of Statistics (PCBS). Available online: http://www.pcbs.gov.ps/default.aspx (accessed on 10 June 2021).
84. Middle East Monitor. Official: Unemployment Rate Reaches 82% in Gaza Strip. Available online: https://www.middleeastmonitor.com/20201124-official-unemployment-rate-reaches-82-in-gaza-strip/ (accessed on 10 June 2021).
85. UNCTAD. Economic Costs of the Israeli Occupation for the Palestinian People: The Gaza Strip under Closure and Restrictions; Report prepared by the UNCTAD secretariat; A/75/310; UNCTAD: Geneva, Switzerland, 13 August 2020.
86. OCHAOPT. Food Insecurity in the oPt: 1.3 Million Palestinians in the Gaza Strip Are Food Insecure. 14 December 2018. United Nations Office for the Coordination of Humanitarian Affairs—Occupied Palestinian Territory. Available online: ochaopt.org (accessed on 10 June 2021).
87. Al-Azhar University. Available online: http://www.alazhar.edu.ps/eng/index.asp (accessed on 10 June 2021).
88. Jawabreh, A. Gaza's University Students Drop Out at an Accelerating Rate Due to the Pandemic. 5 October 2020. Available online: www.al-fanarmedia.org (accessed on 10 June 2021).
89. Elessi, K.; Aljamal, A.; Albaraqouni, L. Effects of the 10-year siege coupled with repeated wars on the psychological health and quality of life of university students in the Gaza Strip: A descriptive study. Lancet 2019, 393.
90. Radwan, A.S. Anxiety among University Staff during Covid-19 Pandemic: A Cross-Sectional Study. IOSR J. Nurs. Health Sci. 2021, 10, 1–8.
91. Al-Quds Open University. Available online: www.qou.edu (accessed on 10 June 2021).
92. Alsadoon, H. Students' Perceptions of E-Assessment at Saudi Electronic University. Turk. Online J. Educ. Technol. 2017, 16, 147–153.
93. Falta, E.; Sadrata, F. Barriers to using e-learning to teach masters students at the Algerian University. Arab J. Media Child. Cult. 2019, 6, 17–48.
94. Jadee, M. Attitudes of faculty members towards conducting electronic tests and the obstacles to their application at the University of Tabuk. Spec. Int. Educ. J. 2017, 6, 77–87.
95. Mbatha, B.; Naidoo, L. Bridging the transactional gap in open distance learning (ODL): The case of the University of South Africa (Unisa). Inkanyiso J. Hum. Soc. Sci. 2010, 2, 64–69.
96. Alumari, M.M.; Alkhatib, I.M.; Alrufiai, I.M.M. The reality and requirements of modern education methods—Electronic Learning. Al-Dananeer Mag. 2016, 9, 37–56.
97. Jawida, A.; Tarshun, O.; Alyane, A. Characteristics and objectives of distance education and e-learning—A comparative study on the experiences of some Arab countries. Arab J. Lit. Hum. 2019, 6, 285–298.
98. Frost, J. Statistics by Jim. Available online: https://statisticsbyjim.com/regression/interpret-r-squared-regression/ (accessed on 7 January 2021).
99. Al-Rantisi, A.M. Blind Students' Attitudes towards the Effectiveness of Services of Disability Services Center at Islamic University of Gaza. Asian J. Hum. Soc. Stud. 2019, 7.
100. Akimov, A.; Malin, M. When old becomes new: A case study of oral examination as an online assessment tool. Assess. Eval. High. Educ. 2020, 45, 1205–1221.
101. Mandour, E.M. The effect of a training program for postgraduate students at the College of Education on designing electronic tests according to the proposed quality standards. J. Educ. Soc. Stud. 2013, 19, 391–460.
102. Hamed, T.A.; Peric, K. The role of renewable energy resources in alleviating energy poverty in Palestine. Renew. Energy Focus 2020, 35, 97–107.
103. Moghli, M.A.; Shuayb, M. Education under Covid-19 Lockdown: Reflections from Teachers, Students and Parents; Center for Lebanese Studies, Lebanese American University: Beirut, Lebanon, 2020.
104. Pastor, R.; Tobarra, L.; Robles-Gómez, A.; Cano, J.; Hammad, B.; Al-Zoubi, A.; Hernández, R.; Castro, M. Renewable energy remote online laboratories in Jordan universities: Tools for training students in Jordan. Renew. Energy 2020, 149, 749–759.
105. Barakat, S.; Samson, M.; Elkahlout, G. The Gaza reconstruction mechanism: Old wine in new bottlenecks. J. Interv. Statebuild. 2018, 12, 208–227.
106. Barakat, S.; Samson, M.; Elkahlout, G. Reconstruction under siege: The Gaza Strip since 2007. Disasters 2020, 44, 477–498.
107. Weinthal, E.; Sowers, J. Targeting infrastructure and livelihoods in the West Bank and Gaza. Int. Aff. 2019, 95, 319–340.
108. Al Mezan Center for Human Rights. News Brief: Annual Report on Access to Economic, Social, and Cultural Rights in Gaza Shows Decline due to COVID-19 Pandemic and Continued Unlawful Restrictions by Occupying Power. 18 March 2021. Available online: https://mezan.org/en/post/23940 (accessed on 10 June 2021).
109. Aradeh, O.; Van Belle, J.P.; Budree, A. ICT-Based Participation in Support of Palestinian Refugees' Sustainable Livelihoods: A Local Authority Perspective. In Information and Communication Technologies for Development. ICT4D 2020. IFIP Advances in Information and Communication Technology; Bass, J.M., Wall, P.J., Eds.; Springer: Cham, Switzerland, 2020; Volume 587.
110. UNESCO. How Many Students Are at Risk of Not Returning to School? Advocacy Paper, 30 July 2020; UNESCO COVID-19 Education Response, Section of Education Policy; UNESCO: Paris, France, 2020.
Figure 1. Comparing breakdown percentage between obstacles identified by university teachers and students.
Table 1. Summary of Scopus search results. (See Scopus listed publications in Appendix A).
| Overall Focus | Publications | Countries |
|---|---|---|
| Literature reviews | Abu Talib et al. [70]; Butler-Henderson & Crawford [71] | – |
| Prospects of introducing ICT-mediated learning, technical acceptance | Cicek et al.; Alshurideh et al. [26]; Coskun-Setirek & Tanrikulu; Adanir et al. (a); Qashou [27]; Ayyash et al. | Croatia, United Arab Emirates, Turkey, Kyrgyzstan, Palestine |
| ERT | Novikov; Klyachko et al. [69]; Kamal et al.; Ana; Kuhlmeier et al.; Jamieson [67]; Kamber [72]; Gudino Paredes et al. [73]; Gormaz-Lobos et al. [66]; Khan et al.; Arora et al. [74]; Gottipati & Shankararaman; Alshurideh et al. [26]; Jawad & Shalash [75]; Lassoued et al. [37]; Bashitialshaaer et al. [38]; Almusharraf & Khahro [68]; Alfiras et al. [76]; Usher et al. [65]; Tsekea & Chigwada [64] | Russia, Malaysia, Indonesia, Canada, USA, Mexico, Chile, India, Singapore, United Arab Emirates, Palestine, Arab states, Saudi Arabia, Bahrain, Israel, Zimbabwe |
| General experiences of electronic examination | Adanir et al. (b) | Turkey, Kyrgyzstan |
| Academic fraud and tools to prevent fraud | Goldberg; Harris et al. [57]; Goff et al.; Adzima; Jia & He; Khan & Alotaibi; Gudino Paredes et al. [73]; Jaramillo-Morillo et al. [53] | USA, China, Mexico, Colombia |
| Other tools for remote electronic examination | Wu et al. [63] | Russia, China |
| Intersection of ERT and remote electronic examination | Alshurideh et al. [26]; Kamber [72]; Gudino Paredes et al. [73]; Arora et al. [74] | United Arab Emirates, USA, Mexico, India |
Table 2. Overview universities included in the study.
| University | Type | Established Year | Area per Student (m²) * | Students (2013–2014) * | Students (2015–2016) ** | Students (2019–2020) *** | Academic Staff (2019–2020) *** |
|---|---|---|---|---|---|---|---|
| Al-Azhar | Public | 1991 | 5.5 | 14,453 | – | 12,473 | 226 |
| Islamic University | Public | 1978 | 4.2 | 19,273 | – | 15,521 | 400 |
| Al-Aqsa | Government | 1991 | 13.9 | 18,727 | – | 14,681 | 450 |
| Al-Quds Open University | Public | 1975 | – | – | 60,230 (West Bank & Gaza) | 12,828 (five Gaza branches) | – |

Sources: * Elayyat [80], ** Al-Quds Open University [91], *** Ministry of Education and Higher Education [82].
Table 3. Distribution of respondents answering the questionnaire.
| Targeted Members | Total Number | Percentage (%) | Percentage M/F (%) |
|---|---|---|---|
| University teachers | 97 | 63.8 | 70/30 |
| University students | 55 | 36.2 | 60/40 |
| Total | 152 | 100 | 66/34 |
Table 4. Obstacles for the electronic exam under the coronavirus pandemic (a subset of data excerpted from [38]).
| Obstacle Category (Groups) | Obstacles | Students Repetition (n = 55) | Teachers Repetition (n = 97) | Overall Repetition (n = 152) | Overall Percentage (%) |
|---|---|---|---|---|---|
| Personal obstacles | Difficulty and poor study in light of critical conditions. | 22 | 65 | 87 | 57.2 |
| | Parents are not convinced of electronic exams (parents are not cooperating). | – | 40 | 40 | 26.3 |
| | Lack of time, many questions, and lack of experience. | 15 | – | 15 | 9.9 |
| | Electronic exams will not show the students' real level and will not distinguish students from each other. | 13 | 56 | 69 | 45.4 |
| Pedagogical obstacles | The answers are very similar among students. | 9 | 28 | 37 | 24.3 |
| | Some teachers do not have sufficient experience to prepare and apply the exams. | – | 35 | 35 | 23.0 |
| | The exams did not have a high quality of design and preparation. | 12 | 46 | 58 | 38.2 |
| Technical obstacles | Power cuts (lack of electricity). | 26 | 75 | 101 | 66.4 |
| | Internet unavailability and poor internet quality. | 20 | 68 | 88 | 57.9 |
| Financial and organizational obstacles | There was no real surveillance, widespread cases of fraud. | 14 | 48 | 62 | 40.8 |
| | Not communicating well between students and lecturers. | 10 | 29 | 39 | 25.7 |
| | Parents and students lack experience with technology. | – | 57 | 57 | 37.5 |
| | Lack of financial and technical capabilities of some students. | 29 | 81 | 110 | 72.4 |
Table 5. Relative significance of electronic exam obstacles for respondents showing overall and breakdown between university teachers and students.
| Type of Obstacles | Overall (%) (n = 152) | Overall Order | Students (%) (n = 55) | Students Order | Teachers (%) (n = 97) | Teachers Order |
|---|---|---|---|---|---|---|
| Lack of financial and technical capabilities of some students. | 72.4 | 1 | 52.7 | 1 | 83.5 | 1 |
| Power cuts (lack of electricity). | 66.4 | 2 | 47.3 | 2 | 77.3 | 2 |
| Internet unavailability and poor internet quality. | 57.9 | 3 | 36.4 | 4 | 70.1 | 3 |
| Difficulty and poor study in light of critical conditions. | 57.2 | 4 | 40.0 | 3 | 67.0 | 4 |
| Electronic exams will not show the students' real level and will not distinguish students from each other. | 45.4 | 5 | 23.6 | 7 | 57.7 | 6 |
| There was no real surveillance, widespread cases of fraud. | 40.8 | 6 | 25.5 | 6 | 49.5 | 7 |
| The exams did not have a high quality of design and preparation. | 38.2 | 7 | 21.8 | 8 | 47.4 | 8 |
| Parents and students lack experience with technology. | 37.5 | 8 | – | – | 58.8 | 5 |
| Parents are not convinced of electronic exams (parents are not cooperating). | 26.3 | 9 | – | – | 41.2 | 9 |
| Not communicating well between students and lecturers. | 25.7 | 10 | 18.2 | 9 | 29.9 | 11 |
| The answers are very similar among students. | 24.3 | 11 | 16.4 | 10 | 28.9 | 12 |
| Some teachers do not have sufficient experience to prepare and apply the exams. | 23.0 | 12 | – | – | 36.1 | 10 |
| Lack of time, many questions, and lack of experience. | 9.9 | 13 | 27.3 | 5 | – | – |
Table 6. The significant obstacles not shared between university teachers and students.
| Type of Obstacles | Overall (%) (n = 152) | Overall Order | Students (%) (n = 55) | Students Order | Teachers (%) (n = 97) | Teachers Order |
|---|---|---|---|---|---|---|
| Lack of time, many questions, and lack of experience. | 9.9 | 13 | 27.3 | 5 | – | – |
| Parents and students lack experience with technology. | 37.5 | 8 | – | – | 58.8 | 5 |
| Parents are not convinced of electronic exams (parents are not cooperating). | 26.3 | 9 | – | – | 41.2 | 9 |
| Some teachers do not have sufficient experience to prepare and implement the exams. | 23.0 | 12 | – | – | 36.1 | 10 |
Table 7. Summary output regression statistics analysis comparing breakdown between obstacles identified by university teachers and students.
Regression Statistics

| Statistic | Value |
|---|---|
| Multiple R | 0.94 |
| R Square | 0.89 |
| Adjusted R Square | 0.88 |
| Standard Error | 4.64 |
| Observations | 9 |

ANOVA

| | df | SS | MS | F | Significance F |
|---|---|---|---|---|---|
| Regression | 1 | 1243.0 | 1243.0 | 57.8 | 0.0001 |
| Residual | 7 | 150.6 | 21.5 | | |
| Total | 8 | 1393.6 | | | |
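The regression summary in Table 7 can be sanity-checked directly from the data in Table 5. The sketch below (plain Python, no external libraries) regresses the students' percentages on the teachers' percentages for the nine obstacles reported by both groups; the percentage values are copied from Table 5, and small discrepancies with the published figures are expected from rounding.

```python
# Reproduce the Table 7 regression from the nine shared-obstacle
# percentages in Table 5 (obstacles reported by both groups).
teachers = [83.5, 77.3, 70.1, 67.0, 57.7, 49.5, 47.4, 29.9, 28.9]
students = [52.7, 47.3, 36.4, 40.0, 23.6, 25.5, 21.8, 18.2, 16.4]

n = len(teachers)
mx = sum(teachers) / n
my = sum(students) / n

# Sums of squares and cross-products about the means
sxx = sum((x - mx) ** 2 for x in teachers)
syy = sum((y - my) ** 2 for y in students)          # Total SS, ~1394
sxy = sum((x - mx) * (y - my) for x, y in zip(teachers, students))

slope = sxy / sxx
r2 = sxy ** 2 / (sxx * syy)                         # R Square, ~0.89
ss_reg = slope * sxy                                # Regression SS, ~1243
ss_res = syy - ss_reg                               # Residual SS, ~151
f_stat = ss_reg / (ss_res / (n - 2))                # F statistic, ~58

print(f"R2 = {r2:.2f}, F = {f_stat:.1f}")
```

The high R² confirms the strong agreement between the two groups' rankings noted in the text: obstacles that teachers report frequently are, to a large extent, the same ones students report frequently.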
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
