of digital exams, as both were shown to be vulnerable and easy to circumvent in numerous ways. We have
not found further research into the efficacy of such
methods, besides these prior experiments and the current paper. Other online proctoring tools, which record examinees during their tests, face criticism over privacy issues and increased anxiety levels among test takers (Hylton et al., 2016). Privacy concerns are also raised in (Krak and Diesvelt, 2016; Krak and Diesvelt, 2017).
Regarding online proctoring, we consider two related lines of research: one that addresses the acceptance of these systems by examinees, and another that examines how they impact performance on a given test.
In 2009, using the Software Secure Remote Proctoring (SSRP) system, researchers conducted an experiment with 31 students from 6 different faculties at a small regional university to evaluate students' acceptance of online proctoring tools. The results showed
that slightly less than half the students expressed their
support for online proctoring tools, whilst a quarter of the students rejected such techniques (Bedford et al., 2009). Lilley et al. investigated
the acceptance of online proctoring with a group of 21
bachelor students from 7 different countries. Using
ProctorU, the subjects participated in one online formative and two online summative assessments. Nine of the 21 participants shared their experiences with online proctoring, eight of whom expressed support for using online proctors in further modules (Lilley et al., 2016). A later experiment, conducted by Milone et al. at the University of Minnesota in 2017, concerned a larger pool of 344 students and showed that 89% of
the students were satisfied with their experience using
an online proctoring tool, ProctorU, for their online
exams, while 62% agreed that setting up the proctoring tool took less than 10 minutes (Milone et al.,
2017).
Another direction in proctoring research concerns
the impact of proctoring tools on test scores. A study
by Weiner and Hurtz contrasted on-site proctoring with online proctoring. The experiment involved more than 14,000 participants and concluded that there is a high overlap between the scores of examinees in online and on-site settings. Furthermore, the examinees did not associate their test scores with the type of proctoring in place (Weiner and Hurtz, 2017). In a
different setting, Alessio et al. compared the scores of
students in proctored and unproctored settings. The
study concerned 147 students enrolled in an online course on medical terminology. The experimental setup divided the students into 9 sections according to their majors, 4 of which took an online-proctored test, whilst the remaining 5 took an unproctored test. The results show that students in the unproctored setting scored significantly higher (14% more) than their proctored counterparts and spent twice as much time on the tests, which the investigators attributed to unproctored tests leaving more room for cheating (Alessio et al., 2017). A
similar result was achieved by Karim et al., whose
experiment setup involved 295 participants who were administered two cognitive ability tests, one searchable online and one that was not. The experiment saw 30% of the participants withdraw from the proctored test, compared to 19% from the unproctored one, and it also confirmed that unproctored examinees scored higher than proctored ones. In contrast to (Alessio
et al., 2017), Hylton et al. conducted an experiment with two groups of participants, in which the first group took an unproctored exam while the other was proctored online. Though the results show that unproctored examinees scored 3% higher than their proctored peers and spent 30% more time on the test, the researchers offer a different interpretation, linking the slightly lower results in proctored settings to higher anxiety levels (Hylton et al., 2016). Results from a study conducted
at the University of Minnesota differ slightly from those of (Alessio et al., 2017) and (Karim et al., 2014). In this setup, students taking a psychology minor were afforded the freedom to choose between on-site and online proctored exams. The study spanned three semesters and found that the scores of online examinees were 8% lower than those of their on-site counterparts for two semesters; this difference disappeared in the third semester, with both types of examinees scoring similar results (Brothen and Klimes-Dougan, 2015).
A more recent study by Neftali and Bic compared the
performance of students taking an online and an on-
site version of the same discrete math course. The
study found that while online students scored higher on online homework, their results on the online proctored exams were 2% lower than those of their on-site peers.
Dendir and Maxwell (Dendir and Maxwell, 2020)
report on a study run between 2014 and 2019,
in which the scores of students in two online
courses, Principles of Microeconomics and Geography of North America, were compared before and after
the adoption of a web-based proctoring tool in 2018,
Respondus Monitor. The experiment showed that, after the adoption of online proctoring, scores dropped on average by 10 to 20%. This suggests that, prior to the adoption of proctoring, cheating on online exams was a common occurrence, and it confirms that online proctoring has a preventive effect, as was also suggested in our own student survey.
Vazquez et al. (Vazquez et al., 2021) ran a study
with 974 students enrolled in two sections —online
CSEDU 2021 - 13th International Conference on Computer Supported Education