
9 CONCLUSION AND FUTURE WORK
This paper presents the concept development of a support tool for oral exams. Requirements were collected through a requirements analysis based on interviews with examiners and observers. An initial design prototype was developed to streamline the concept development process. Based on the results, a concept was developed that covers all phases of the exam (preparation, execution, and review) and focuses on the execution phase to support observers in STEM exams; it can likely be extended to other disciplines with little effort. Based on this concept, a proof-of-concept prototype was implemented and evaluated. The results indicate that the concept is generally well received, does not lack significant features, and provides a good basis for further research.
Participants in our studies suggested many possible extensions to the concept, of which only an essential subset could be implemented in the initial proof-of-concept prototype. In particular, a few minor bugs need to be fixed and placeholders need to be replaced before the tool can be considered a fully usable product. Furthermore, a separate, distinct examiner view that displays the current log, the elapsed time, and stored questions should be added.
The complete digital, structured logging of an oral exam is fundamental to conducting assessment analytics. By providing a machine-readable database, the prototype lays the foundation for promising analytics features that support grading (in terms of reliability and validity) and help optimize exams and questions in the review phase. This also opens up possible extensions for an examiner view, which could use the stored data to display covered topics and to recommend questions (e.g., rarely used questions or good follow-up questions). Such features would not be possible without the structured recording of exam data and the use of a specialized tool, and they may increase the reliability of oral exams.
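To illustrate the kind of analytics such a machine-readable log enables, the following minimal Python sketch assumes a hypothetical log-entry structure (the field and function names are illustrative and do not reflect the actual EasyProtocol data model) and derives two of the features mentioned above: the topics already covered in a running exam and rarely used questions that an examiner view could recommend.

    from collections import Counter
    from dataclasses import dataclass

    # Hypothetical log entry; field names are illustrative only and
    # do not reflect the actual EasyProtocol data model.
    @dataclass(frozen=True)
    class LogEntry:
        exam_id: str
        question_id: str
        topic: str
        rating: int           # observer rating of the answer, e.g. 1-5
        elapsed_seconds: int  # time into the exam when the question was asked

    def covered_topics(entries: list[LogEntry], exam_id: str) -> set[str]:
        """Topics already touched in a given exam, e.g. for an examiner view."""
        return {e.topic for e in entries if e.exam_id == exam_id}

    def rarely_used_questions(entries: list[LogEntry], threshold: int = 2) -> list[str]:
        """Question ids asked in at most `threshold` exams, as recommendation candidates."""
        usage = Counter(
            question_id
            for exam_id, question_id in {(e.exam_id, e.question_id) for e in entries}
        )
        return [q for q, n in usage.items() if n <= threshold]

Analogous aggregations over ratings and elapsed time could underpin the grading-support and review-phase features outlined above.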
Next steps include fixing the remaining bugs, conducting case studies in authentic oral examinations, making the features configurable, and researching additional features with a focus on usability and assessment analytics. Further research should also focus on examiners as stakeholders, for example, by investigating different types of log visualizations that support them in asking the “right” questions and in grading.
More information about the current state of the
EasyProtocol project and the software is available at
https://www.tel.ifi.lmu.de/software/easyprotocol/.
ACKNOWLEDGMENT
The authors thank the study participants for their valu-
able input. This research was supported by the German
Federal Ministry of Education and Research (BMBF),
grant number [16DHBKI013] (AIM@LMU).