• Required prerequisites for the tutorial are ex-
plained at the start of the tutorial.
These items refer to the way the tutorial communicates the underlying learning goals, prerequisites, and concepts, and how these can be applied in practice. In essence, they describe whether the tutorial makes its purpose transparent to the learner. We therefore call this scale Transparency.
The Cronbach's alpha (Cronbach, 1951) values are 0.95 for the Structural Clarity scale and 0.89 for the Transparency scale, indicating a high internal consistency of both scales.
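For readers who want to reproduce such reliability estimates, the following is a minimal Python sketch of Cronbach's alpha; the function name and the (participants x items) array layout are our own illustrative assumptions, not part of the questionnaire material.

import numpy as np

def cronbach_alpha(scores):
    # scores: array of shape (participants, items) for one scale
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items in the scale
    item_vars = scores.var(axis=0, ddof=1)       # per-item sample variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the sum score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)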
The first four items form the scale Structural Clarity, while the last four items form the scale Transparency. The answers are coded from -3 (fully disagree) to +3 (fully agree); a 0 thus represents a neutral evaluation. The scale scores for Structural Clarity and Transparency are each calculated as the mean over all items in the scale and over all participants of a study. The overall tutorial quality score is the mean of the scale scores for Structural Clarity and Transparency. The questionnaire thus reports an overall score and two scores for the sub-scales.
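As an illustration of this scoring scheme, a minimal Python sketch is given below; the response matrix is invented solely to show the computation and does not stem from the studies reported here.

import numpy as np

# Hypothetical responses: rows = participants, columns = the eight items,
# each coded from -3 (fully disagree) to +3 (fully agree).
responses = np.array([
    [2, 3, 1, 2,  1, 0, 2, 1],
    [3, 2, 2, 3,  2, 1, 1, 2],
    [1, 1, 0, 2,  0, 1, 1, 0],
])

structural_clarity = responses[:, :4].mean()  # mean over first four items, all participants
transparency = responses[:, 4:].mean()        # mean over last four items, all participants
overall_quality = (structural_clarity + transparency) / 2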
5 LIMITATIONS AND OUTLOOK
We described the development of a standard questionnaire to measure tutorial quality. Of course, a sound validation of the questionnaire concerning reliability and validity, as well as concerning its ability to differentiate between tutorials of different quality, must be carried out in further studies.
Data collection for the construction was done in Indonesia. Of course, it must be checked whether culture-specific influences had an impact on the selection of items. This is not very likely; we know, for example, that the impact of cultural background on the importance of UX quality aspects is relatively low (Santoso and Schrepp, 2019). In addition, the investigated tutorials naturally do not cover the full range of tutorials concerning length, complexity, or topic. Thus, the scale structure must be confirmed by repeating the study in different countries and with different types of tutorials. The results presented in this paper are just the first step in the construction process.
Currently, the questionnaire is available in English and Indonesian. It is planned to develop further translations and to make the questionnaire and its translations available to researchers via a dedicated website that allows the material to be viewed and downloaded.
REFERENCES
Beaudry, A. and Pinsonneault, A. (2005). Understanding user responses to information technology: A coping model of user adaptation. MIS Quarterly, pages 493–524.
Brill, J. and Park, Y. (2011). Evaluating online tutorials for
university faculty, staff, and students: The contribu-
tion of just-in-time online resources to learning and
performance. International Journal on E-learning,
10(1):5–26.
Cattell, R. B. (1966). The scree test for the number of
factors. Multivariate Behavioral Research, 1(2):245–
276.
Comrey, A. L. and Lee, H. B. (2013). A first course in factor analysis. Psychology Press.
Cronbach, L. J. (1951). Coefficient alpha and the internal
structure of tests. Psychometrika, 16(3):297–334.
Guttman, L. (1954). Some necessary conditions for
common-factor analysis. Psychometrika, 19(2):149–
161.
Hassenzahl, M., Burmester, M., and Koller, F. (2003). AttrakDiff: Ein Fragebogen zur Messung wahrgenommener hedonischer und pragmatischer Qualität. Mensch & Computer 2003: Interaktion in Bewegung, pages 187–196.
Head, A., Jiang, J., Smith, J., Hearst, M. A., and Hartmann,
B. (2020). Composing flexibly-organized step-by-step
tutorials from linked source code, snippets, and out-
puts. In Proceedings of the 2020 CHI Conference on
Human Factors in Computing Systems, pages 1–12.
Hotelling, H. (1933). Analysis of a complex of statistical
variables into principal components. Journal of Edu-
cational Psychology, 24(6):417–441.
Hsieh, C. (2005). Implementing self-service technology to
gain competitive advantages. Communications of the
IIMA, 5(1):77–83.
Husseniy, N., Abdellatif, T., and Nakhil, R. (2021). Improving the websites user experience (UX) through the human-centered design approach (an analytical study targeting universities websites in Egypt). Journal of Design Sciences and Applied Arts, 2(2):24–31.
Jolliffe, I. T. and Cadima, J. (2016). Principal component analysis: a review and recent developments. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 374(2065).
Kim, A. S. and Ko, A. J. (2017). A pedagogical analy-
sis of online coding tutorials. In Proceedings of the
2017 ACM SIGCSE Technical Symposium on Com-
puter Science Education, pages 321–326.
Kirakowski, J. and Corbett, M. (1993). The software us-
ability measurement inventory. British Journal of Ed-
ucational Technology, 24(3):210–212.
Lamontagne, C., Sénécal, S., Fredette, M., Labonté-LeMoyne, É., and Léger, P.-M. (2021). The effect of the segmentation of video tutorials on user's training experience and performance. Computers in Human Behavior Reports, 3.