5.4 Limitations
This paper presents the process of reconstructing and validating the Trust factor within the User Experience Questionnaire Plus (UEQ+). Our findings lend credibility to the four selected items for the Trust factor, indicating their validity within the chosen context. Nevertheless, continued validity must be ensured in future UEQ+ applications that incorporate the Trust factor. A confirmatory factor analysis typically serves as a reliable method for verifying this.
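As a lightweight complement to a full confirmatory factor analysis, the internal consistency of the four trust items can be checked with Cronbach's alpha. The following is a minimal sketch using NumPy with synthetic ratings (the data and all variable names are illustrative, not the study's data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) rating matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Synthetic answers for four trust items, all driven by one shared
# latent trust score plus item-specific noise (illustrative only).
rng = np.random.default_rng(0)
latent = rng.normal(size=500)
ratings = latent[:, None] + 0.3 * rng.normal(size=(500, 4))

alpha = cronbach_alpha(ratings)
print(f"Cronbach's alpha: {alpha:.2f}")
```

Because the synthetic items share one latent driver, alpha comes out high here; with real questionnaire data, values above roughly 0.7 are conventionally read as acceptable internal consistency.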
Furthermore, the factor analyses discussed in this article reveal a specific nuance: items with identical phrasing that originate from distinct factors cannot always be attributed unequivocally to a single factor. This observation calls for heightened attention and scrutiny in future applications, especially when similarly worded items from different factors are used together. Future research could further illuminate these findings and help refine methodologies for assigning such items more clearly.
6 CONCLUSIONS AND FUTURE
WORK
This paper outlines the construction and validation process for the Trust factor of the User Experience Questionnaire Plus (UEQ+). The initial stage, termed ‘preconstruction’, encompassed the collation of potential items for this factor. These items were subsequently evaluated in a study involving four distinct test objects and 405 participants. The ensuing exploratory factor analysis narrowed the Trust factor down to the following four items:
• insecure-secure
• untrustworthy-trustworthy
• unreliable-reliable
• non-transparent-transparent
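Under the UEQ+'s standard scoring scheme, each bipolar item is rated on a seven-point scale coded from -3 to +3, and a factor score is the mean of its item ratings. A minimal sketch of computing a Trust score from the four items (the ratings are made-up example answers from one participant):

```python
from statistics import mean

# The four bipolar trust items, rated on the UEQ+'s 7-point scale
# (coded -3 .. +3); the ratings below are invented for illustration.
ratings = {
    "insecure-secure": 2,
    "untrustworthy-trustworthy": 3,
    "unreliable-reliable": 2,
    "non-transparent-transparent": 1,
}

# A UEQ+ factor score is the mean of its item ratings.
trust_score = mean(ratings.values())
print(trust_score)  # 2.0
```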
A further analysis of these four items was conducted in the next stage, referred to as the ‘First Group of Validation’. During this phase, a study with 443 participants evaluated Facebook and YouTube. The subsequent confirmatory factor analysis substantiated the four items for the Trust factor.
An additional validation study, referred to as the ‘Second Group of Validation’, was carried out with five test objects and 454 participants. The confirmatory factor analysis from this phase once again corroborated the validity of the four trust items.
Thus, the primary objective of this manuscript –
to construct and validate a new Trust factor for the
UEQ+ – has been fulfilled.
Given the broad applicability of the UEQ+, it is important to note that not all product categories could be covered within the scope of our studies. Subsequent studies or applications deploying the UEQ+ with the Trust factor should therefore aim to confirm its validity.
WEBIST 2023 - 19th International Conference on Web Information Systems and Technologies