Our main objective in this study was to investigate whether UXNator met its primary goal of recommending UX evaluation methods. In addition, we obtained valuable feedback on improvements that can be implemented in the next version of UXNator. As a main result, we observed that most participants had a positive perception of UXNator's usefulness, especially in the process of selecting which UX method to use in a specific evaluation.
In future work, we intend to improve UXNator and to build our own repository of UX evaluation methods. We also intend to carry out a study with different participant profiles.
ACKNOWLEDGEMENTS
We thank all the participants in the empirical study.
The present work is the result of the Research and Development (R&D) project 001/2020, signed with the Federal University of Amazonas and FAEPI, Brazil, with funding from Samsung, using resources from the Informatics Law for the Western Amazon (Federal Law No. 8.387/1991); its disclosure is in accordance with Article 39 of Decree No. 10.521/2020. This work was also supported by CAPES (Finance Code 001), CNPq process 314174/2020-6, FAPEAM process 062.00150/2020, and São Paulo Research Foundation (FAPESP) grant #2020/05191-2.