A Problem Analysis in Game-Based Student Response System from UX
Elements Perspective
Brendo Campos 1 a, Jose Carlos Duarte 1,2 b, Genildo Gomes 1,2 c, Leonardo Marques 1,2 d, Bruno Gadelha 1,2 e and Tayana Conte 1,2 f

1 Federal University of Amazonas (UFAM), Manaus, Amazonas, 69067-005, Brazil
2 USES Research Group - Institute of Computing (IComp), Manaus, Amazonas, Brazil

a https://orcid.org/0000-0002-7246-9292
b https://orcid.org/0000-0001-5732-9729
c https://orcid.org/0000-0002-2901-3994
d https://orcid.org/0000-0002-3645-7606
e https://orcid.org/0000-0001-7007-5209
f https://orcid.org/0000-0001-6436-3773

The first two authors contributed equally to this work.
Keywords:
Game Based Student Response System, User Experience, Elements of User Experience, Rapid Review.
Abstract:
Game-Based Student Response Systems (GBSRS) are tools for improving learning through student interaction
and participation. Promoting a good user experience in GBSRS is essential in adopting such tools in the
educational context. In this sense, it is necessary to design GBSRS by thinking about how to provide the best
experience for users. This paper presents an investigation of UX problems in two GBSRS tools, Kahoot! and
Quizizz, to verify whether we could avoid UX problems even in the initial stages of product design. For that,
we performed a rapid review, and from the selected articles, we cataloged and classified general problems in
the tools from the perspective of the UX elements defined by Garrett's framework. Our results showed that the problems identified in our analysis could have been avoided if UX principles had been applied in the tool design phase.
1 INTRODUCTION
Due to the COVID-19 pandemic, it was necessary to
adopt social isolation measures to reduce the spread
of the new virus (WHO, 2021). Isolation measures
have heavily impacted how classes are delivered, with
most countries opting for remote learning systems.
Thus, school activities were predominantly carried
out virtually to minimize the adverse effects of iso-
lation during the school year (Misirli and Ergulec,
2021).
The COVID-19 pandemic has presented the need
to innovate and develop new educational systems and
assessment strategies. Thus, the use of teaching and learning platforms by education providers worldwide has increased significantly (Hassan et al., 2022).
In the context of virtual learning, many tools that
existed even before the pandemic became great allies
of teachers during classes. Among the various types
of tools, one category gained prominence by promoting engagement through participation and competition among students: the Game-Based Student Response System (GBSRS) (Rodrigues et al., 2022).
GBSRS incorporate aspects of gameplay, such as sounds and background music, timed questions, scores for correct answers, and player rankings (Ranieri et al., 2021). In the educational context,
these tools emerge as alternatives to promote motiva-
tion, engagement, and participation (Rodrigues et al.,
2022). The effectiveness of these tools depends on
many factors, such as the context of use, the applica-
tion, and the aspects that influence the User Experi-
ence (UX).
UX is the term used to describe a user's experience with a product or service. It is a concept that encompasses all aspects of the user's interaction with the product, including design, usability, ease of use, emotion, and satisfaction. The goal of UX is to create products that are intuitive, efficient, and pleasant to use (Hassenzahl, 2003).
Many studies address the impact of UX on the use of GBSRS. For example, Nieto García and Sit (2022) explore the benefits of UX for students when using the Kahoot tool. Their study focused on the desirability factor, which refers to the extent to which a digital interface is "desired" by users. To do so, they adopted a quantitative deductive approach to test hypotheses
developed from a critical review of the GBSRS and UX literature. The goal was to measure students' recalled desire to use Kahoot in the classroom and the subsequent effects on the perceived utility of GBSRS and the motivation to attend class (Nieto García and Sit, 2022).
Rodrigues et al. (2022), through usability assessments of Kahoot and interviews, compared the experience of the platform's two user profiles, teachers and students. The results indicated that the platform promoted a better experience for students but presented more usability and UX problems for teachers.
The study by Degirmenci (2021) presents a systematic review of the literature on the effectiveness and role of the Quizizz tool and the perspectives of teachers and students regarding its use. The investigation indicated that Quizizz was effective in the learning processes in which it was applied.
With the growth of GBSRS in the educational context, developers must prioritize the user experience to ensure the efficiency of these systems. To this end, they must dedicate themselves to improving and perfecting the UX of these resources. An alternative is to use the conceptual model for UX design proposed by Garrett (2011). The model aims to assist the decisions made during the product design stage, decomposing the UX into five planes: Strategy, Scope, Structure, Skeleton, and Surface. We chose this model due to these advantages: it works in the system design phase, focuses on the user experience, and presents an easy-to-follow structure.
Considering the need to improve the UX during the design phase, this paper seeks to answer the following research question: could the problems present in GBSRS have been avoided during the design phase by using the UX elements proposed by Garrett? To answer the research question, we performed an exploratory literature search through a rapid review to identify problems in two GBSRS tools, Quizizz and Kahoot. Then, we classified the problems according to the UX elements of Garrett (2011). Our goal is to find out whether, by applying these principles to the GBSRS development process, the final system would provide a better user experience, with fewer errors and less frustration.
We hope this study will allow developers to better understand the main concerns when developing GBSRS, considering the UX. In addition, we expect our study to contribute to understanding the best practices to be followed in developing GBSRS that provide positive user experiences.
2 BACKGROUND
This section presents the main concepts discussed in this study. The next subsections present information about Game-Based Student Response Systems, Kahoot, Quizizz, and the elements of Garrett's user experience.
2.1 Game Based Student Response
System
GBSRS are interactive learning tools that allow students to answer questions in real-time, provide class performance statistics, and are generally designed to help teachers assess students (Owen and Licorish, 2020).
GBSRS aim to increase students' interest and motivate them to learn while allowing teachers to track performance and provide immediate feedback to students (Ranieri et al., 2021). These systems can be used to teach specific content and to evaluate student performance.
Among the tools in this category, Kahoot and Quizizz stand out. In the study of Basuki and Hidayati (2019), students point to Kahoot and Quizizz as must-have applications for organizing online quizzes.
We used Kahoot and Quizizz as objects of study because they are popular, free tools in academia and because they stand out in studies in the literature that compare the two tools with each other. After cataloging the problems associated with the two tools in the literature, we categorized them according to the framework proposed by Garrett (2011) to support the design of GBSRS that provide a better user experience. Next, we present information about Kahoot, Quizizz, and the framework proposed by Garrett.
2.2 Kahoot
Kahoot, shown in Figure 1, is a GBSRS that helps
teachers improve students’ learning experience by
providing them with a platform to create and share
interactive quizzes (Wang, 2015). Games are cre-
ated with custom content and can be used for any-
thing from math to history (Aktekin et al., 2018). Stu-
dents can participate in the games using smartphones,
tablets, or any other device connected to the internet.
As a result, students are engaged and motivated to
learn, as they can compete against each other to ac-
cumulate points and win prizes (Wang, 2015).
Figure 1: Kahoot Screen.
2.3 Quizizz
The Quizizz platform, shown in Figure 2, offers gam-
ification features where users can earn points for
answering questions correctly and level up (Unesa,
2022). Quizizz also allows instructors to create their
unique games and share them with other platform
users. Instructors can also review and customize
games created by other users (Pham, 2022).
Figure 2: Quizizz Screen.
2.4 The Elements of User Experience of
Garrett
According to Agusdin et al. (2021), the UX elements are interdependent planes, or layers, that support design-phase decisions in order to design the entire user experience. The approach consists of five planes: the lower the plane, the more abstract its concerns; likewise, the higher the plane, the more concrete they become. Each plane is highly dependent on the plane below it, so the surface depends on the skeleton, which in turn depends on the structure, the scope, and the strategy, respectively (Garrett, 2011).
Figure 3 presents the five levels of user experience
proposed by Garrett (2011), organized in ascending
order. Next, we present each one in detail:
Strategy Plane: At this point, user needs and stakeholder interests (business scope) related to the developed product must be taken into account (Garrett, 2011). According to Agusdin et al. (2021), techniques such as interviews, questionnaires, and focus groups can be used at this stage to acquire data. The result will be an integration between the product objectives and the users' needs.
Figure 3: The theoretical model of UX elements. Source:
Garrett (2011).
Scope Plane: The scope defines the set of features and functions the software must offer and the way they fit together. This phase transforms the strategy into functional specifications that describe the product's functionalities. Considering the product as a means of conveying information, the output of the strategy plane can also be converted into content requirements (Agusdin et al., 2021).
Structure Plane: This phase transforms the functional specifications and content requirements into the arrangement of the interaction between the user and the system (Agusdin et al., 2021). Such interaction is represented by Interaction Design, which defines the flows through which the user interacts with the product functionalities. This plane also defines the structural design that facilitates access to content (Garrett, 2011).
Skeleton Plane: In the skeleton plane, the output of the structure plane is refined by other factors: Information Design, related to the presentation of information to the user; Interface Design, responsible for the design and positioning of elements in the interface; and Navigation Design, responsible for facilitating user navigation through the product interface (Garrett, 2011).
Surface Plane: In this phase, the combination of all the previous planes must converge into the concrete visual and sensory design of the finished product (Agusdin et al., 2021). This phase validates how the design will manifest itself to the users' senses and which of the five senses (sight, hearing, touch, smell, and taste) are engaged (Garrett, 2011).
3 EMPIRICAL DESIGN
We executed a rapid review to select the papers used
in our analysis. A rapid review is a condensed form
of systematic review in which components of the sys-
tematic review process are simplified or omitted to
produce information in a timely manner (Tricco et al.,
2015). It is often used to quickly identify information
relevant to research, policy development, or decision-
making. The rapid review can be used in many con-
texts, from medical research to evaluating the effec-
tiveness of a new program (Garritty et al., 2021).
3.1 Rapid Review Procedures
We searched for papers in IEEE Xplore, a digital library that provides access to scientific and technical content from the IEEE (Institute of Electrical and Electronics Engineers), the IET (Institution of Engineering and Technology), and other publishing partners (Wilde, 2016).
We established five inclusion criteria to select papers: (i) the paper was published between 2020 and 2022 (we defined this period because our research focused on the main game-based student response system tools that gained prominence during the pandemic); (ii) the paper is about Kahoot and/or Quizizz; (iii) the paper presents results on students' or teachers' perceptions of the use of the tools; or (iv) the paper presents problems identified in the tools; or (v) the paper presents strengths and weaknesses of the use of the tools. Papers that did not meet the inclusion criteria were excluded.
In the first search, we used the following search string: (“All Metadata”: game-based student response system) OR (“All Metadata”: kahoot) OR (“All Metadata”: quizz) OR (“All Metadata”: quizzes), with the filter 2020-2022 applied. The search returned 314 papers. We read the titles, abstracts, and keywords according to the inclusion criteria. After applying the inclusion criteria to all returned papers, we selected 17 papers for full reading to identify problems or perceptions about the studied tools.
We performed a second search with the following search string: (“All Metadata”: game-based student response system) AND (“All Metadata”: Evaluating) OR (“All Metadata”: kahoot) OR (“All Metadata”: quizz) OR (“All Metadata”: quizzez), with the filter 2020-2022 applied. The second search returned 28 papers. After applying the inclusion criteria, we selected 17 papers for full reading, of which 13 were repeated (already returned in the first search).
Before performing the searches, we established the papers of Rodrigues et al. (2022) and Figueiredo et al. (2021) as control papers; both meet all inclusion criteria. At the end of the process, we analyzed 23 papers: 16 from the first search, 5 from the second search, and the two control papers.
3.2 Analysis and Categorization of the
Problems in the Five Planes of User
Experience
After the selection process, we read the selected papers in full. Reading the papers thoroughly makes it possible to verify whether the criteria used in the Rapid Review were adequate for selecting papers and to assess whether other relevant factors may have influenced the decision to include or exclude a paper. In addition, the complete reading of the papers also allows the extraction of data relevant to the study, such as the type of methodology used, the results obtained, and the authors' conclusions (Kitchenham et al., 2010).
From the UX problems extracted from the literature regarding the Kahoot and Quizizz tools, we categorized each problem into one of the five planes presented by Garrett (2011). In order to establish an acceptable level of agreement in the categorization of problems through consensus among the authors, we computed Cohen's Kappa, a coefficient of agreement for nominal scales (Cohen, 1960).
Cohen's Kappa is an agreement index that measures the consistency between two evaluators when rating the same observations. It is used to measure evaluators' agreement on specific judgment categories, discounting the agreement expected by chance. The index ranges from 0, which indicates agreement no better than chance, to 1, which means perfect agreement (Landis and Koch, 1977).
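For reference, the coefficient follows the standard definition from Cohen (1960), not reproduced in the original text:

$$\kappa = \frac{p_o - p_e}{1 - p_e}$$

where $p_o$ is the observed proportion of agreement between the two evaluators and $p_e$ is the proportion of agreement expected by chance given each evaluator's label frequencies.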
Two authors of this paper, using the concepts established in Section 2.4 on the UX planes, classified the problems independently (without access to each other's classification). Then we calculated the Kappa. The result was 0.81132, considered Almost Perfect on the Landis and Koch scale, which means that the evaluators are “aligned” in their interpretation of the defined criteria (Landis and Koch, 1977).
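To illustrate how such an agreement check can be reproduced, the sketch below computes Cohen's Kappa for two hypothetical, independent plane classifications using scikit-learn's cohen_kappa_score; the labels and values are illustrative placeholders only and do not correspond to the paper's actual ratings.

```python
# Minimal sketch (not the authors' script): inter-rater agreement between two
# independent classifications of problems into Garrett's planes.
from sklearn.metrics import cohen_kappa_score

PLANES = ["strategy", "scope", "structure", "skeleton", "surface", "external"]

# Hypothetical plane assigned to each cataloged problem by each evaluator.
rater_a = ["strategy", "scope", "scope", "skeleton", "surface",
           "structure", "strategy", "surface", "skeleton", "external"]
rater_b = ["strategy", "scope", "strategy", "skeleton", "surface",
           "structure", "strategy", "surface", "skeleton", "external"]

kappa = cohen_kappa_score(rater_a, rater_b, labels=PLANES)
print(f"Cohen's kappa: {kappa:.3f}")
# Values in the 0.81-1.00 range fall in the "almost perfect" band
# of the Landis and Koch (1977) scale.
```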
4 RESULTS
We present, as supplementary material, a table with the 23 papers selected and read in full (https://doi.org/10.6084/m9.figshare.22213675.v3).
Regarding the tool addressed by the papers, 16 are studies on Kahoot ([01-05][07-11][13-14][16-18][21-22]); 2 on Kahoot and Quizizz ([06][20]); 1 on Quizizz ([23]); 1 paper did not specify the tool ([03]); 1 on Kahoot and Nearpod ([12]); 1 on Kahoot, Quizizz, and Quizlet ([15]); and one on Kahoot, Quizizz, Socrative, and Nearpod ([19]); all are GBSRS.
However, only seven papers described problems
concerning the tools ([01, 05, 07, 10, 21, 22, and 23]),
which resulted in 21 problems, as shown in Table 1.
We classified the identified problems within the five
UX planes presented in Section 2.4. Below we will
present and discuss the classification of problems.
Table 1: Problems Found.
Paper | Problems Found
[01] | Internet connection
[01] | Fast pace of the game
[01] | Competitive nature of the tool
[05] | Students found tool sounds annoying
[07] | View questions and answers in class
[07] | Time stress
[07] | Fear of losing
[07] | Difficulty keeping up with the score
[10] | Absence of any indication on the screen when a wrong answer is marked
[21] | Short timeout to reply
[21] | Confusion in multiple-choice questions
[21] | Students who cannot read fast end up falling behind, which makes them feel embarrassed in front of other students
[22] | Lack of language standardization
[22] | Confusing navigation/interface
[22] | Teachers rated it as technical, confusing, and common
[22] | An interface that is not visually pleasing and has solid colors is tiring on the eyes
[22] | Confusing and difficult-to-understand symbols
[22] | Functions difficult to access
[22] | Difficulty using some features, like adding media to the created Kahoot
[23] | Menu “More” when clicked is below the “Help” button, making it difficult to access two submenus
[23] | Created activities appear on very similar white cards, which can make them difficult to distinguish
4.1 The Strategy Plane
Given the classification of problems, we associated
six problems with the strategy plane:
1. Students who cannot read fast end up falling be-
hind, which makes them feel embarrassed in front
of the other students;
2. Time stress;
3. Competitive character of the tool;
4. Fear of losing;
5. View questions and answers in class;
6. Teachers classified the tool as technical, confus-
ing, and common.
We can observe that Problems 1 and 2 are related to the time available to answer each question. We identified these issues in Kahoot. The identified problems inform the direction the development team should take when creating and improving their products and services.
However, some tools have features that address the time factor when creating a match. In this sense, it is possible to adjust the time limit for questions, add a countdown, and even create games in real-time.
Problems 3 and 4 are related to each other. The competitive character represents the desire to be the best among the competitors, which can lead to the fear of losing. These are usually factors used to encourage users to improve their performance level. In addition, Problem 5 motivates competitors to fight for their goals and overcome challenges, i.e., it can affect engagement positively or negatively. Problem 6 is important because the use of these tools is often the teacher's decision; if teachers do not like the tool, it is not used.
The strategy plane should be developed based on a deep understanding of the needs and preferences of the users (Garrett, 2006). User needs are objectives of external origin identified through user research. We can obtain user information through interviews, questionnaires, and direct observation, in addition to data analysis and user feedback (Garrett, 2011).
The development team can use user needs to drive the creation of new features, improve the usability and accessibility of systems, enhance the user experience, and increase service quality. These needs can also reveal opportunities to create new products or services.
4.2 The Scope Plane
The scope plane specifies the functionality and con-
tent elements that the system must include to sat-
isfy user needs. Functional specifications are a set
of detailed functionality that the system must have.
Content requirements are the content elements neces-
sary for the system to meet the user’s needs (Garrett,
2006).
The classification revealed four problems associ-
ated with the scope plane:
1. Short timeout to reply;
2. Fast pace of the game;
3. Absence of any indication on the screen when a
wrong answer is marked;
4. Difficulty keeping up with the score.
Scope Problem 1 is related to the problems of the previous plane and could also have been categorized in the Strategy or Structure planes. However, we categorized it as Scope because it can be considered a requirement to be implemented in the tools.
Another factor that could have been considered a
requirement is Scope Problem 2, related to the infor-
mation scrolling too quickly on the screen, which sig-
nificantly affects the user experience. Problems 3 and
4 point to the lack of more appropriate feedback. The
information must be presented clearly and objectively,
so the user can absorb it without rereading or going
back to be sure of what s/he has read.
4.3 The Structure Plane
The structure plane establishes the guidelines for im-
plementing elements in the system. Interaction design
defines how users will interact with the product, while
information architecture provides the structure for the
organization, accessibility, and location of product in-
formation. In this sense, two main problems were as-
sociated:
1. Difficulty using some features, like adding media
to the created kahoot;
2. Functions that are difficult to access.
The two Structure problems show difficulties encountered by users in functionalities that should be easy to access. These difficulties characterize poor design and affect the flow of information. The struc-
ture should be designed to give the user access to the
desired information as quickly as possible, providing
a direct path to the necessary data. In addition, the
structure needs to be intuitive and straightforward so
that the user does not feel lost or disoriented. Finally,
navigation must be organized clearly and logically, so
the user can easily find the information.
4.4 The Skeleton Plane
The skeleton should help organize information into
logical sections, allowing users to access the content
they want quickly. In interface design, the skeleton
establishes the visual and interactive components nec-
essary for users to interact with the system. The nav-
igation design provides users with clear paths to the
desired content, allowing them to navigate the prod-
uct easily. Problems with the skeleton’s plane focus
on the confusion and doubts that some interface ele-
ments caused the users. In this sense, we found four
problems:
1. Confusing and difficult-to-understand symbols;
2. Confusion in multiple-choice questions;
3. Confusing navigation/interface;
4. Menu “More” when clicked is below the “Help” button, making it difficult to access two submenus.
Skeleton Problems 1, 2, and 3 show a lack of consistency within the interface and difficulty in understanding how functions and interface elements work. Regarding Skeleton Problem 4, navigation menus must be well positioned so that users can easily find what they are looking for in the interface.
4.5 The Surface Plane
The surface plane is responsible for creating a pleas-
ant and beautiful environment. It is the first impres-
sion that users will have of the system. If the surface
plane is developed correctly, it can help establish trust
and create a foundation for a good user experience.
Therefore, we associate four problems with the sur-
face plane:
1. Created activities appear on similar white cards,
which can be difficult to distinguish;
2. An interface that is not visually pleasing and has
solid colors is tiring on the eyes;
3. Students found the tool’s sound effects annoying;
4. Lack of language standardization.
Problems 1, 2, and 3 presented for the Surface plane show that visual and aesthetic elements can be unpleasant or tiring, diminishing the positive experience of using the tool. For Surface Problem 4, it was reported that even after changing the tool's language, some information still appears in the original language, thus breaking the language consistency of the interface.
The surface plane must be designed in such a way as to meet the functional and aesthetic needs of the product. Visual elements such as colors, fonts, images, and icons contribute to the user experience. The design should make the product more intuitive, attractive, and easy to use. In this sense, an unpleasant visual design can harm the user experience.
From our analysis, we classified the problems according to the planes presented by Garrett (2011). However, the “Internet connection” problem lies outside the context of the application. Therefore, we classified it in the “External problems” category, which represents problems arising from external factors that affect the use of the tool.
5 DISCUSSION
The UX planes were initially framed in the practice of web design. However, the same principles apply to products with both functional and informational aspects. In these products, UX design focuses either on user tasks or on the product as a source of information, where the design considers the information the product offers (Garrett, 2006); that is, the principles apply to any system that helps users perform tasks and promotes communication between users. GBSRS fit both views: they allow teachers to provide immediate feedback to students through multiple response options (Ranieri et al., 2021), in addition to providing students with interactive assignments.
In the elements presented by Garrett (2011), the experience is built from the bottom up. The lowest elements of the framework represent the product design phases, up to the interface that will be presented to the end user. We analyzed the problems classified in each plane and can infer that some of them could have been prevented by adequately solving the problems of the previous plane, since each plane depends on the decisions made in the planes below it.
The problems classified in the lower planes are quite abstract. We suppose that their possible solutions only become applicable in the higher planes, since design decisions become perceptible and visual as we approach the top.
The development team must ensure the GBSRS is correctly implemented to satisfy the user, so that the GBSRS can support the proposed task quickly and intuitively.
Applying the UX elements in the GBSRS development process can allow designers to plan all aspects of the UX. Consequently, it becomes possible to ensure that no aspect of the user's experience with the software happens without the designers' conscious and explicit intention.
In general, Garrett's model offers a guide to developing well-designed experiences, involving all the elements necessary to create deeper emotional connections with the target audience. The model does not result in standard designs but in well-crafted systems tailored to the specific user from the ground up (Garrett, 2006).
To get the best out of Garrett's planes, we must continually verify each plane to ensure it is consistent with the goals established in the current and previous planes. We need to create a test plan for each plane, analyze the test results, and identify the points that need improvement. We also need to check the performance of the GBSRS continuously and carry out periodic updates and improvements to ensure the system keeps meeting users' needs over time and follows what we established in the planning. Finally, the verification of the UX planes must be carried out in a systematic and documented way to ensure the quality of the system, meet user expectations, and optimize the planes for usability, accessibility, and overall user satisfaction.
6 CONCLUSION
This paper aimed to investigate whether applying the principles of Garrett's UX planes could help improve the UX of Game-Based Student Response Systems during the development process. In this sense, we conducted a Rapid Review to gather studies on GBSRS according to the established criteria. In this way, we identified problems, which we classified and
analyzed within each plane. We can assume that some UX problems could have been avoided if the principles of Garrett's elements had been applied correctly and appropriately.
Classifying the identified problems according to Garrett's UX elements can help identify possible solutions to the problems, or at least prevent such problems from happening, thus helping to develop more effective and more satisfactory systems for users. In future work, we intend to propose guidelines for verifying the UX planes of GBSRS to guarantee the quality of the systems we develop.
ACKNOWLEDGMENT
The present work is the result of the Research and
Development (R&D) project 001/2020, signed with
Federal University of Amazonas and FAEPI, Brazil,
which has funding from Samsung, using resources
from the Informatics Law for the Western Ama-
zon (Federal Law No 8.387/1991), and its disclo-
sure is in accordance with article 39 of Decree No.
10.521/2020. Also supported by CAPES - Financing
Code 001, CNPq process 314174/2020-6, FAPEAM
process 062.00150/2020, and grant #2020/05191-2, São Paulo Research Foundation (FAPESP). We thank USES Research Group members for their support.
REFERENCES
Agusdin, R. P., Salsabila, A., and Putri, D. A. K. (2021).
Designing user experience design of the healthy diet
mobile application using the fives planes framework.
Jurnal Buana Informatika, 12(1):11–20.
Aktekin, N., Çelebi, H., and Aktekin, M. (2018). Let's kahoot! anatomy / Utilicemos kahoot! anatomía. International Journal of Morphology, 36(2).
Basuki, Y. and Hidayati, Y. (2019). Kahoot! or quizizz:
The students’ perspectives. In Proceedings of the 3rd
English Language and Literature International Con-
ference (ELLiC), pages 202–211.
Cohen, J. (1960). A coefficient of agreement for nominal
scales. Educational and psychological measurement,
20(1):37–46.
Degirmenci, R. (2021). The use of quizizz in language
learning and teaching from the teachers’ and students’
perspectives: A literature review. Language Educa-
tion and Technology, 1(1):1–11.
Figueiredo, J. V. M., de Carvalho, S. M., da Silva, I. M. L., and Furtado, C. (2021). Tecnologias educacionais: análise da interface da plataforma quizizz com base nos princípios de design da informação. DAT Journal, 6(3):297–311.
Garrett, J. (2011). The Elements of User Experience: User-
Centered Design for the Web and Beyond. New Riders
Publishing.
Garrett, J. J. (2006). Customer loyalty and the elements
of user experience. Design management review,
17(1):35–39.
Garritty, C., Gartlehner, G., Nussbaumer-Streit, B., King,
V. J., Hamel, C., Kamel, C., Affengruber, L., and
Stevens, A. (2021). Cochrane rapid reviews meth-
ods group offers evidence-informed guidance to con-
duct rapid reviews. Journal of clinical epidemiology,
130:13–22.
Hassan, R., Murad, D. F., Wahi, W., Wijanarko, B. D.,
Ismail, N. H. A., and Awwad, S. A. (2022). On-
line learning experience assessment survey during the
covid-19 pandemic. In 2022 International Confer-
ence on Information Management and Technology
(ICIMTech), pages 133–138. IEEE.
Hassenzahl, M. (2003). The thing and i: understanding the
relationship between user and product. In Blythe, M.
and Monk, A., editors, Funology 2: From Usability to
Enjoyment, pages 31–42. Springer International Pub-
lishing.
Kitchenham, B., Pretorius, R., Budgen, D., Brereton, O. P.,
Turner, M., Niazi, M., and Linkman, S. (2010). Sys-
tematic literature reviews in software engineering–a
tertiary study. Information and software technology,
52(8):792–805.
Landis, J. R. and Koch, G. G. (1977). The measurement of
observer agreement for categorical data. biometrics,
33(1):159–174.
Misirli, O. and Ergulec, F. (2021). Emergency remote teach-
ing during the covid-19 pandemic: Parents experi-
ences and perspectives. Education and information
technologies, 26(6):6699–6718.
Nieto García, M. and Sit, J. (2022). Students' recalled desirability of using game-based student response systems (gsrss): a user experience (ux) perspective. Marketing Education Review, pages 1–13.
Owen, H. E. and Licorish, S. A. (2020). Game-based stu-
dent response system: The effectiveness of kahoot! on
junior and senior information science students’ learn-
ing. Journal of Information Technology Education:
Research, 19:511–553.
Pham, A. T. (2022). University students’ attitudes towards
the application of quizizz in learning english as a for-
eign language. International Journal of Emerging
Technologies in Learning, 17(19):278–290.
Ranieri, M., Raffaghelli, J. E., and Bruni, I. (2021). Game-
based student response system: Revisiting its poten-
tials and criticalities in large-size classes. Active
Learning in Higher Education, 22(2):129–142.
Rodrigues, M., Nery, B., Castro, M., Klisman, V., Duarte,
J. C., Gadelha, B., and Conte, T. (2022). Let’s play!
or don’t? the impact of ux and usability on the adop-
tion of a game-based student response system. In
Proceedings of the 14th International Conference on
Computer Supported Education - Volume 1: CSEDU,,
pages 273–280. INSTICC, SciTePress.
Tricco, A. C., Antony, J., Zarin, W., Strifler, L., Ghassemi,
M., Ivory, J., Perrier, L., Hutton, B., Moher, D., and
Straus, S. E. (2015). A scoping review of rapid review
methods. BMC medicine, 13(1):1–15.
Unesa, A. A. A. B. (2022). The implementation of quizizz
in vocabulary learning activities.
Wang, A. I. (2015). The wear out effect of a game-based
student response system. Computers & Education,
82:217–227.
WHO, World Health Organization (2021). Coronavirus disease (covid-19): How is it transmitted? https://www.who.int/news-room/q-a-detail/q-a-coronaviruses.
Wilde, M. (2016). Ieee xplore digital library. The
Charleston Advisor, 17(4):24–30.