Authors:
Hassan Tagharobi and Katharina Simbeck
Affiliation:
University of Applied Sciences HTW Berlin, Berlin, Germany
Keyword(s):
Code Audit, Algorithmic Fairness, Moodle, Learning Analytics.
Abstract:
Machine learning-based predictive systems are increasingly used in many areas, including learning analytics (LA). LA systems provide educators with an analysis of students' progress and predictions about their success. Although predictive systems offer new opportunities and convenience, studies show that they carry the risk of biased or even discriminatory outcomes. Different approaches have been introduced to detect and resolve such discriminatory issues and to examine algorithmic fairness. The majority of proposed approaches study the behavior of predictive systems using sample data. However, if the source code is available, e.g., for open-source projects, auditing it can further improve the examination of algorithmic fairness. In this paper, we introduce a framework for an independent audit of algorithmic fairness using all publicly available resources. We applied our framework to Moodle's learning analytics component and examined its fairness against a defined set of criteria. Our fairness audit shows that Moodle does not use protected attributes, e.g., gender or ethnicity, in its predictive process. However, we detected some issues in data distribution and processing that could potentially affect the fairness of the system. Furthermore, we believe that the system should provide users with more detailed evaluation metrics to enable a proper assessment of the quality of learning analytics models.
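To illustrate the kind of protected-attribute check a source-level fairness audit can include, the following minimal Python sketch scans a model's feature names for protected attributes. It is our own illustration, not code from Moodle or from the framework described in the paper; the attribute list and feature names are invented for the example.

# Hypothetical illustration of one elementary audit check: verifying that
# no protected attribute appears among the features (indicators) a
# predictive model is trained on. All names below are invented for this
# sketch; they are not taken from Moodle's codebase.

PROTECTED_ATTRIBUTES = {"gender", "ethnicity", "age", "disability", "religion"}

def find_protected_features(feature_names):
    """Return the feature names that mention a protected attribute
    (crude substring match, sufficient for illustration)."""
    flagged = []
    for name in feature_names:
        lowered = name.lower()
        if any(attr in lowered for attr in PROTECTED_ATTRIBUTES):
            flagged.append(name)
    return flagged

if __name__ == "__main__":
    # Invented feature list for a student-success model.
    features = ["forum_posts", "assignment_submissions",
                "login_frequency", "user_gender"]
    flagged = find_protected_features(features)
    if flagged:
        print(f"Protected attributes found in feature set: {flagged}")
    else:
        print("No protected attributes detected in the feature set.")

In this example the check would flag "user_gender", whereas a feature set built only from activity indicators would pass; a real audit would of course also have to trace how each indicator is computed from the underlying data.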