
All participants indicated an excellent level of usability (SUS score > 80.5): the application was user-friendly and easy to use. The Agent Persona Instrument (API) results showed that the PA encouraged participants to reflect on and engage with the complexity of healthcare processes, the central topic of their discussion. In addition, most participants found the PA interesting and knowledgeable. As expected from the PA design, the perceived level of humanness was low; indeed, participants recognized that the PA did not exhibit particular emotions and was neither friendly nor entertaining.
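For context, the SUS scores reported above follow the standard scoring procedure from Brooke (1996): odd-numbered (positively worded) items contribute their rating minus one, even-numbered (negatively worded) items contribute five minus their rating, and the raw sum is scaled to 0–100. The following Python sketch (illustrative only, not part of the study's materials) shows that computation:

```python
def sus_score(responses):
    """Compute the System Usability Scale score (Brooke, 1996).

    responses: list of 10 Likert ratings (1-5), in questionnaire order.
    Odd-numbered items are positively worded, even-numbered negatively.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Positive items: rating - 1; negative items: 5 - rating.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0-40 raw sum to 0-100
```

A mean score above 80.5, as observed here, is conventionally interpreted as excellent usability.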
These results aligned with participants’ feedback, which was generally positive, particularly regarding the agent’s credibility. For instance, one participant stated: “Although the agent did not have a realistic appearance, its behavior was in line with its role”. Another participant highlighted the usefulness of the PA in instructional activities: “Very valuable tool that could provide good support to the student in studying and developing a simulation model for the management of healthcare organizational exam”.
To our knowledge, this is the first study to explore the use of a virtual PA for training BME students. Consistent with previous research (Petersen et al., 2021; Zhang et al., 2024; Kyrlitsias and Michael-Grigoriou, 2022), our findings highlight the potential of VR-based PAs in creating dynamic and interactive learning environments. BME students traditionally have limited opportunities to develop the communication skills needed for interaction with healthcare professionals (Montesinos et al., 2023). PAs not only provide a valuable platform for communication skill training but also offer a unique avenue for students to explore and acquire new knowledge through interactive experiences. In line with previous studies (Chheang et al., 2024; Grivokostopoulou et al., 2020), participants responded positively to the application, emphasizing its engaging nature, real-time feedback capabilities, and the flexibility to practice interviews at their convenience. Another important aspect of the proposed system is its integration with Moodle, which makes it particularly valuable for supporting open-source initiatives and open educational resources.
Despite the promising results, some limitations should be acknowledged. First, the study involved a small sample size, limiting the generalizability of the findings; future research should conduct larger-scale evaluations to validate the effectiveness of the PA. Second, while the PA provided accurate and structured responses, its limited nonverbal cues (e.g., body gestures, lip synchronization) affected perceived engagement. Future improvements could enhance the agent’s interactive cues and fidelity to improve the learning experience (Núñez et al., 2023). Moreover, as a future development, we aim to integrate the application into a fully immersive virtual environment, further increasing engagement and realism in the learning experience. Furthermore, recognizing the current limitations of the agent’s knowledge base, we plan to expand its available data to improve the accuracy and comprehensiveness of its responses.
REFERENCES
Alenezi, M. (2023). Digital learning and digital institution
in higher education. Education Sciences, 13(1):88.
Apoki, U. C., Hussein, A. M. A., Al-Chalabi, H. K. M.,
Badica, C., and Mocanu, M. L. (2022). The role of
pedagogical agents in personalised adaptive learning:
A review. Sustainability, 14(11):6442.
Baylor, A. and Ryu, J. (2003). The API (Agent Persona Instrument) for assessing pedagogical agent persona. In EdMedia + Innovate Learning, pages 448–451. Association for the Advancement of Computing in Education (AACE).
Bergmann, K., Branigan, H. P., and Kopp, S. (2015).
Exploring the alignment space–lexical and gestural
alignment with real and virtual humans. Frontiers in
ICT, 2:7.
Black, J. and Abrams, M. (2018). Remote usability testing.
The Wiley Handbook of Human Computer Interaction,
1:277–297.
Brooke, J. et al. (1996). SUS – A quick and dirty usability scale. Usability Evaluation in Industry, 189(194):4–7.
Chheang, V., Sharmin, S., Márquez-Hernández, R., Patel, M., Rajasekaran, D., Caulfield, G., Kiafar, B., Li, J., Kullu, P., and Barmaki, R. L. (2024). Towards anatomy education with generative AI-based virtual assistants in immersive virtual reality environments. In 2024 IEEE International Conference on Artificial Intelligence and eXtended and Virtual Reality (AIxVR), pages 21–30. IEEE.
Chiou, E. K., Schroeder, N. L., and Craig, S. D. (2020).
How we trust, perceive, and learn from virtual hu-
mans: The influence of voice quality. Computers &
Education, 146:103756.
Convai (2024). https://www.convai.com. Accessed Jan. 25, 2025.
Dai, C.-P., Ke, F., Zhang, N., Barrett, A., West, L.,
Bhowmik, S., Southerland, S. A., and Yuan, X.
(2024). Designing conversational agents to support
student teacher learning in virtual reality simulation: a
case study. In Extended Abstracts of the CHI Confer-
ence on Human Factors in Computing Systems, pages
1–8.
Dai, L., Jung, M. M., Postma, M., and Louwerse, M. M.
(2022). A systematic review of pedagogical agent re-
search: Similarities, differences and unexplored as-
pects. Computers & Education, 190:104607.
Davis, R. O., Park, T., and Vincent, J. (2023). A meta-
analytic review on embodied pedagogical agent de-
Pedagogical Agents in Virtual Reality for Training of Biomedical Engineering Students
919