learners’ decision to provide their learning progress
data. The solution of using an MD5 hash of the
students’ university accounts at the front end and a
double-hashed value in the core application ensures
a satisfactory level of privacy for the project’s pilot
phase (Slade and Prinsloo, 2013; Pardo and Siemens,
2014). We are able to compute an anonymous,
complete, and coherent dataset at the end of the semester
without having to store critical personal data during
the semester.
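To make the two-stage hashing more concrete, the following TypeScript sketch illustrates how the front end might replace the account name with its MD5 hash and how the core application might hash the received value a second time before storing it. The function names and the example account are our own illustrative choices, Node’s crypto module stands in for whatever MD5 library the front end actually uses, and the sketch does not reproduce the project’s real code.

import { createHash } from 'crypto';

// First stage (front end): the university account name is replaced by its MD5
// hash before any tracking event leaves the legacy system.
function pseudonymizeAccount(accountName: string): string {
  return createHash('md5').update(accountName).digest('hex');
}

// Second stage (core application): the received hash is hashed once more, so the
// stored key cannot be looked up directly from a list of plain account names.
function storageKey(frontEndHash: string): string {
  return createHash('md5').update(frontEndHash).digest('hex');
}

// Only the double hash is persisted together with the tracking data.
const key = storageKey(pseudonymizeAccount('jdoe42'));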
However, as an MD5 hash is not unique, there is a
minuscule possibility of diluting our dataset: in theory,
two different university accounts could be hashed to
the same value, and the current system would not be able
to separate them. Nonetheless, this probability is very
low. The hashing and merging of the different data
sources is therefore a topic of ongoing research in our
project.
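For intuition, the standard birthday-bound approximation indicates how small this collision probability is for a 128-bit hash such as MD5; the figure of 20,000 accounts below is purely an illustrative assumption, not a number from our project:

\[
p_{\text{collision}} \approx 1 - e^{-\frac{n(n-1)}{2 \cdot 2^{128}}} \approx \frac{n^{2}}{2^{129}},
\qquad
\frac{(2 \cdot 10^{4})^{2}}{2^{129}} \approx 6 \cdot 10^{-31}
\quad \text{for } n = 2 \cdot 10^{4}.
\]

At university scale, accidental collisions are therefore practically negligible.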
The students appreciate the option to include or
exclude themselves from the data tracking but mostly
ignore this possibility and remain in the ‘anonymous’
status. To what extent this is based on an active decision
or on passive laziness is a topic of further investigation
and depends on their individual privacy calculus for
disclosing personal data (Ifenthaler and Schumacher,
2016).
4.2 Impact of Prompts on the Learning
Progress
As this part of the project only started at the beginning
of the fall semester 2017, we are not yet able to
provide convincing insights regarding the impact on
the students’ learning progress. We are currently
conducting a research study whose results will be
available at the end of the semester. Besides the
prompts within the productive learning environment,
we set up a dedicated copy of the university’s learning
platform and used this laboratory system to
investigate the impact of different prompting types on
the students’ learning progress under laboratory
conditions with various sample groups. The first
insights might be presented at the conference.
5 CONCLUSIONS
We integrated a tracking and prompting solution
into the existing e-learning infrastructure of our
university by injecting the respective functionality
into the legacy systems through separate JavaScript
libraries. By tracking the students via a
pseudonymous hash, we are able to collect student
data across various systems without the necessity
of collecting further personal data (Pardo and Siemens,
2014). At the end of the semester, we are further able
to merge these data with other data known to the
university, such as demographic data and grades, into
a complete, anonymous dataset for further investigation.
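As a rough sketch of how such an end-of-semester merge could look, the following TypeScript fragment keys the university records with the same double hash and joins them with the tracking aggregates; the record shapes, field names, and the helper storageKey are assumptions for illustration rather than the project’s actual implementation.

import { createHash } from 'crypto';

// Hypothetical helper reproducing the double hash used for the stored tracking data.
function storageKey(accountName: string): string {
  const frontEndHash = createHash('md5').update(accountName).digest('hex');
  return createHash('md5').update(frontEndHash).digest('hex');
}

interface TrackingAggregate { key: string; events: number; promptsSeen: number; }
interface UniversityRecord { accountName: string; grade: number; }

// End-of-semester merge: university data is keyed by the same double hash and
// joined with the tracking aggregates; the account name itself is never copied
// into the resulting anonymous dataset.
function mergeDatasets(tracking: TrackingAggregate[], university: UniversityRecord[]) {
  const byKey = new Map(tracking.map(t => [t.key, t] as const));
  return university.flatMap(u => {
    const t = byKey.get(storageKey(u.accountName));
    return t ? [{ events: t.events, promptsSeen: t.promptsSeen, grade: u.grade }] : [];
  });
}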
The solution is used to perform various
educational research studies focusing on the effects of
prompting for self-regulated learning (Bannert,
2009). We are also planning to extend the various
LA features. The next step is the extension of the
students’ direct feedback: the students will receive
more transparent feedback on the amount and type of
data that is collected and on how these data relate
to their current learning processes.
Furthermore, we will steadily improve the application
and plan to extend the area of research to more
courses in the following semester.
REFERENCES
Bannert, M., 2009. Promoting self-regulated learning
through prompts. Zeitschrift für Pädagogische
Psychologie, 23, 139–145.
Davis, E., 2003. Prompting middle school science students
for productive reflection: generic and directed prompts.
Journal of the Learning Sciences, 12(1), 91–142.
Gašević, D., Dawson, S. and Siemens, G., 2015. Let’s not
forget: Learning analytics are about learning.
TechTrends, 59(1), 64–71.
Ifenthaler, D., 2012. Determining the effectiveness of
prompts for self-regulated learning in problem-solving
scenarios. Journal of Educational Technology &
Society, 15(1), 38–52.
Ifenthaler, D. and Schumacher, C., 2016. Student
perceptions of privacy principles for learning analytics.
Educational Technology Research and Development,
64(5), 923–938.
Ifenthaler, D. and Widanapathirana, C., 2014. Development
and validation of a learning analytics framework: Two
case studies using support vector machines.
Technology, Knowledge and Learning, 19(1–2), 221–
240.
McLoughlin, C. and Lee, M. J. W., 2010. Personalized and
self regulated learning in the Web 2.0 era:
International exemplars of innovative pedagogy using
social software. Australasian Journal of Educational
Technology, 26(1), 28–43.
Pardo, A. and Siemens, G., 2014. Ethical and privacy
principles for learning analytics. British Journal of
Educational Technology, 45(3), 438–450.
Slade, S. and Prinsloo, P., 2013. Learning analytics: Ethical
issues and dilemmas. American Behavioral Scientist,
57(10), 1510–1529.
Spring Boot Project, 2017. https://projects.spring.io/spring-boot/. Accessed: 2017-10-02.