problem-solving and worked examples is taken by students who assessed the direct route of untutored problem-solving to be, as yet, impassable, which explains the relationship with prior knowledge.
Our study is based on creating a taxonomy of learning behaviours from trace data generated by student activity in e-tutorials. That taxonomy corroborates the concept of ‘help abuse’ developed by Shih et al. (2008). Rather than trying to solve problems with the aid of hints, some students bypass these hints and directly call for complete solutions. Table 1 makes clear that the ratio of hints called for to solutions called for differs substantially between the categories generated by the quartile splits. That finding is in line with the help abuse hypothesis. However, we cannot easily characterize the extreme categories of few hints and many solutions versus many hints and few solutions in terms of the learning dispositions included in this study. That is: although we find categories that might represent help abuse, they are not easily connected with the notions of good and bad help use introduced in Shih et al. (2008).
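The categorization itself derives from a simple procedure on the trace data: count, per student, the numbers of hints and solutions called for, then split each count at its quartiles and cross the two splits. The following is a minimal sketch of that procedure in Python with pandas; the event log, column names, and toy data are hypothetical illustrations, not the study's actual data format.

```python
import pandas as pd

# Hypothetical trace log: one row per e-tutorial event; the event type
# distinguishes hint requests from full-solution requests.
events = pd.DataFrame({
    "student_id": [1, 1, 1, 2, 2, 3, 3, 3, 3],
    "event": ["hint", "hint", "solution", "solution", "solution",
              "hint", "hint", "hint", "solution"],
})

# Per-student counts of hints and solutions called for.
counts = events.groupby(["student_id", "event"]).size().unstack(fill_value=0)

# Quartile split on each count; crossing the two splits yields ordered
# categories of the 'few hints / many solutions' type compared in Table 1.
# duplicates="drop" merges bins when many students have tied counts.
counts["hint_q"] = pd.qcut(counts["hint"], 4, labels=False, duplicates="drop")
counts["solution_q"] = pd.qcut(counts["solution"], 4,
                               labels=False, duplicates="drop")
print(counts)
```

A ratio of the two counts could be binned in the same way; splitting the counts separately, as sketched here, keeps students with few events of either kind distinguishable from heavy tool users.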
We also corroborate the findings of, e.g., Amo et al. (2018) and Amo-Filvà et al. (2019) that traces of learning processes represent useful sources of data for profiling learning behaviour. At the same time, these data capture only part of the learning process. That is, the main limitation of this research approach is that all learning that takes place outside the traced e-tutorials remains unobserved.
The current study has focussed on individual differences between students in their preference for learning strategies, and on the relationship of that preference with learning dispositions. In future research, we intend to add the task dimension by investigating student preference for learning strategies as a function of both individual differences in learning dispositions and task characteristics.
REFERENCES
Amo, D., Alier, M., García-Peñalvo, F. J., Fonseca, D. and
Casañ, M. J. (2018). Learning Analytics to Assess
Students’ Behavior With Scratch Through Clickstream.
In M. Á. Conde, C. Fernández-Llamas, Á. M. Guerrero-
Higueras, F. J. Rodríguez-Sedano, Á. Hernández-
García and F. J. García-Peñalvo (Eds.), Proceedings of
the Learning Analytics Summer Institute Spain 2018 –
LASI-SPAIN 2018, pp. 74-82. Aachen, Germany:
CEUR-WS.org.
Amo-Filvà, D. A., Alier Forment, M., García-Peñalvo, F.
J., Fonseca-Escudero, D. and Casañ, M. J. (2019).
Clickstream for learning analytics to assess students’
behaviour with Scratch. Future Generation Computer
Systems, 93, 673-686. DOI: 10.1016/j.future.2018.10.057.
Azevedo, R., Harley, J., Trevors, G., Duffy, M., Feyzi-
Behnagh, R., Bouchet, F., et al. (2013). Using trace data
to examine the complex roles of cognitive,
metacognitive, and emotional self-regulatory processes
during learning with multi-agent systems. In R. Azevedo and V. Aleven (Eds.), International handbook of metacognition and learning technologies, pp. 427–449. Amsterdam, the Netherlands: Springer.
Buckingham Shum, S. and Deakin Crick, R. (2012).
Learning Dispositions and Transferable Competencies:
Pedagogy, Modelling and Learning Analytics.
In S. Buckingham Shum, D. Gašević and R. Ferguson
(Eds.). Proceedings of the 2nd International
Conference on Learning Analytics and Knowledge, 92-
101. ACM, New York, NY, USA. DOI:
10.1145/2330601.2330629.
Gašević, D., Jovanović, J., Pardo, A. and Dawson, S.
(2017a). Detecting learning strategies with analytics:
Links with self-reported measures and academic
performance. Journal of Learning Analytics, 4(1), 113–
128. DOI: 10.18608/jla.2017.42.10.
Gašević, D., Mirriahi, N., Dawson, S. and Joksimović, S.
(2017b). Effects of instructional conditions and
experience on the adoption of a learning tool. Computers in Human Behavior, 67, 207–220. DOI: 10.1016/j.chb.2016.10.026.
McLaren, B. M., van Gog, T., Ganoe, C., Karabinos, M. and
Yaron, D. (2016). The efficiency of worked examples
compared to erroneous examples, tutored problem
solving, and problem-solving in classroom
experiments. Computers in Human Behavior, 55, 87-
99. DOI: 10.1016/j.chb.2015.08.038.
McLaren, B. M., van Gog, T., Ganoe, C., Yaron, D. and Karabinos, M. (2014). Exploring the assistance
dilemma: Comparing instructional support in examples
and problems. In S. Trausan-Matu et al. (Eds.)
Proceedings of the Twelfth International Conference on
Intelligent Tutoring Systems (ITS-2014). LNCS 8474.
(pp. 354-361). Springer International Publishing
Switzerland.
Nguyen, Q., Tempelaar, D.T., Rienties, B. and Giesbers, B.
(2016). What learning analytics based prediction
models tell us about feedback preferences of students.
In Amirault, R. and Visser, Y., (Eds.), e-Learners and
Their Data, Part 1: Conceptual, Research, and
Exploratory Perspectives. Quarterly Review of Distance Education, 17(3).
Papamitsiou, Z. and Economides, A. (2014). Learning
Analytics and Educational Data Mining in Practice: A
Systematic Literature Review of Empirical Evidence.
Educational Technology & Society, 17(4), 49–64.
Pekrun, R., Vogl, E., Muis, K.R. and Sinatra, G.M. (2017).
Measuring emotions during epistemic activities: the
Epistemically-Related Emotion Scales. Cognition and Emotion, 31(6), 1268-1276. DOI: 10.1080/02699931.2016.1204989.
Renkl, A. (2014). The Worked Examples Principle in