Building on our results, we are currently developing a course recommender system to support, in particular, students at risk of dropping out (Wagner et al., 2022). Indeed, if students enroll properly and pass more courses, they may be less likely to drop out, as Tables 2 and 8 suggest. Students may need guidance on enrolling in the appropriate courses as well as in the appropriate number of courses. Our recommender system supports both aspects.
REFERENCES
Aulck, L., Nambi, D., Velagapudi, N., Blumenstock, J., and
West, J. (2019). Mining university registrar records to
predict first-year undergraduate attrition. In Proceed-
ings of the 12th International Conference on Educa-
tional Data Mining, pages 9–18.
Berens, J., Schneider, K., Görtz, S., Oster, S., and Burghoff, J. (2019). Early detection of students at risk - predicting student dropouts using administrative student data from German universities and machine learning methods. Journal of Educational Data Mining, 11(3):1–41.
Cohausz, L. (2022). When probabilities are not enough -
a framework for causal explanations of student suc-
cess models. Journal of Educational Data Mining,
14(3):52–75.
Dekker, G., Pechenizkiy, M., and Vleeshouwers, J. (2009).
Predicting students drop out: A case study. In Pro-
ceedings of the 2nd International Conference on Edu-
cational Data Mining, pages 41–50.
DZHW (2020). Veröffentlichungen, Hochschul-IT, Hochschulforschung, Hochschulentwicklung.
Gardner, J., Brooks, C., and Baker, R. (2019). Evaluat-
ing the fairness of predictive student models through
slicing analysis. In Proceedings of the 9th Interna-
tional Conference on Learning Analytics & Knowl-
edge, pages 225–234.
Han, H., Wang, W.-Y., and Mao, B.-H. (2005). Borderline-SMOTE: A new over-sampling method in imbalanced
data sets learning. In Advances in Intelligent Comput-
ing, pages 878–887.
Han, J., Kamber, M., and Pei, J. (2012). Data Mining: Concepts and Techniques. Morgan Kaufmann.
Hardt, M., Price, E., and Srebro, N. (2016). Equality
of opportunity in supervised learning. In 30th Con-
ference on Neural Information Processing Systems (NIPS 2016).
Kemper, L., Vorhoff, G., and Wigger, B. U. (2020). Predict-
ing student dropout: A machine learning approach.
European Journal of Higher Education, pages 28–47.
Lemaître, G., Nogueira, F., and Aridas, C. K. (2017).
Imbalanced-learn: A python toolbox to tackle the
curse of imbalanced datasets in machine learning.
Journal of Machine Learning Research, 18(17):1–5.
Manrique, R., Nunes, B. P., Marino, O., Casanova, M. A.,
and Nurmikko-Fuller, T. (2019). An analysis of stu-
dent representation, representative features and clas-
sification algorithms to predict degree dropout. In
Proceedings of the 9th International Conference on
Learning Analytics & Knowledge, pages 401–410.
McNemar, Q. (1947). Note on the sampling error of the
difference between correlated proportions or percent-
ages. Psychometrika, 12:154–166.
Molnar, C. (2022). Interpretable machine learning. https://christophm.github.io/interpretable-ml-book/. Last checked on Dec 07, 2022.
Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V.,
Thirion, B., Grisel, O., Blondel, M., Prettenhofer,
P., Weiss, R., Dubourg, V., Vanderplas, J., Passos,
A., Cournapeau, D., Brucher, M., Perrot, M., and
Duchesnay, E. (2011). Scikit-learn: Machine learning
in Python. Journal of Machine Learning Research,
12:2825–2830.
Swamy, V., Radmehr, B., Krco, N., Marras, M., and Käser,
T. (2022). Evaluating the explainers: Black-box ex-
plainable machine learning for student success predic-
tion in MOOCs. In Proceedings of the 15th Interna-
tional Conference on Educational Data Mining, pages
98–109.
Wagner, K., Hilliger, I., Merceron, A., and Sauer, P. (2021).
Eliciting students’ needs and concerns about a novel
course enrollment support system. In Companion
Proceedings of the 11th International Conference on
Learning Analytics & Knowledge (LAK 2021), pages 294–
304.
Wagner, K., Merceron, A., and Sauer, P. (2020). Accuracy
of a cross-program model for dropout prediction in
higher education. In Companion Proceedings of the
10th International Learning Analytics & Knowledge
Conference (LAK 2020), pages 744–749.
Wagner, K., Merceron, A., Sauer, P., and Pinkwart, N.
(2022). Personalized and explainable course recom-
mendations for students at risk of dropping out. In
Proceedings of the 15th International Conference on
Educational Data Mining, pages 657–661.
Williamson, K. and Kizilcec, R. (2021). Effects of algorith-
mic transparency in Bayesian knowledge tracing on
trust and perceived accuracy. In Proceedings of the
14th International Conference on Educational Data
Mining, pages 338–344.
Zhang, J., Andres, J. M. A. L., Hutt, S., Baker, R. S.,
Ocumpaugh, J., Mills, C., Brooks, J., Sethuraman,
S., and Young, T. (2022). Detecting SMART model
cognitive operations in mathematical problem-solving
process. In Proceedings of the 15th International Con-
ference on Educational Data Mining, pages 75–85.