Authors:
Malte Neugebauer 1; Ralf Erlebach 2; Christof Kaufmann 1; Janis Mohr 1 and Jörg Frochte 1
Affiliations:
1 Bochum University of Applied Sciences, 42579 Heiligenhaus, Germany
2 University of Wuppertal, 42119 Wuppertal, Germany
Keyword(s):
Learning Analytics, Learning Management System, Gamification, Pedagogical Agent, A/B Testing, Self-Regulated Learning, Higher Education.
Abstract:
The relevance of e-learning for higher education has produced a wide variety of online self-learning materials over the last decade, such as pedagogical agents (PAs) and learning games. Despite this variety, educators wonder whether they can use these tools for their goals and, if so, which tool to choose and in which context a specific tool performs best. Answering these questions requires the collection and analysis of learning data, referred to as Learning Analytics (LA). As digital learning environments spread, so do the possibilities for applying LA. Often, however, LA focuses on data that can easily be quantified: drop-out rates, time spent, or grade performance. To facilitate learning in a more procedural sense, a deeper understanding of learners' behavior in specific contexts with specific exercise designs is needed. This study therefore focuses on usage patterns. Learners' movements through three different designs of mathematical exercises, (i) plain exercises, (ii) PA-supported exercises and (iii) a fantasy game design, are analyzed with Markov chains. The results of an experiment with 503 students show which design facilitates which kind of learning: while the PA design leads learners to enter more partial solutions, the fantasy game design facilitates exercise repetition.
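The Markov-chain analysis mentioned above can be illustrated with a minimal sketch: first-order transition probabilities are estimated from logged sequences of learner events by counting state-to-state transitions and normalizing each row. The event labels and log data below are hypothetical placeholders, not the study's actual states or dataset.

```python
from collections import defaultdict

def transition_matrix(sequences):
    """Maximum-likelihood estimate of first-order Markov transition
    probabilities from observed state sequences."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        # Count each consecutive pair (state a followed by state b).
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    probs = {}
    for a, row in counts.items():
        total = sum(row.values())
        # Normalize counts so each row sums to 1.
        probs[a] = {b: n / total for b, n in row.items()}
    return probs

# Hypothetical learner event logs, for illustration only.
logs = [
    ["start", "attempt", "partial", "attempt", "solved"],
    ["start", "attempt", "quit"],
    ["start", "attempt", "partial", "solved"],
]
P = transition_matrix(logs)
# P["attempt"] gives the estimated probabilities of what follows an attempt.
```

Comparing such matrices across exercise designs is one way to see, for example, whether one design makes the transition into a "partial solution" state more likely than another.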