of 50%, the mean scores in cases Q, G and QG were 30, 37.5 and 37.5, respectively. He concludes that a 60-question four-choice test is rather unreliable, and one of the main reasons is that G and QG allow guessing, which is not the case for Q.
The authors in (Oliveira Neto and Nascimento, 2012) adapted the Learning Management System (LMS) Moodle to perform formative assessment during the teaching-learning process, with high-quality feedback, for a distance learning course of 40h per week in Mathematical Finance. Such evaluations can better guide the student's performance if the feedback is quick and precise in pointing out their difficulties. Moreover, the feedback can inform the teacher about the adopted teaching process, so that the students' understanding of topics that have not been assimilated yet can be reinforced. By analysing the students' answers in previous classes, the authors improved the QB with additional test rules, error messages and links to either theoretical topics or extra exercises.
In the elaboration of multiple-choice questions, it is also important to consider suitable wrong options among the alternatives, also called distractors. Unsuitable distractors enable the examinee to guess the correct answer by elimination, as discussed in (Moser et al., 2012), where the authors present a text-processing algorithm for the automatic selection of distractors. A more recent work is (Susanti et al., 2018), devoted to the automatic generation of distractors for English vocabulary. In (Ali and Ruit, 2015) the authors present an empirical study on flawed alternatives and low distractor functioning. They conclude that removing or replacing such defective distractors, together with increasing the cognitive level of the questions, improves the discrimination between high- and low-ability examinees.
Our present work introduces an automatic generator and corrector devoted to exams that consist of multiple-choice questions with weighted alternatives. It is adapted from the open-source system MCTest, available on GitHub. For such exams, MCTest stores the correction in a CSV file and emails it to the professor. This file contains each student's responses compared with the individual answer key of the exam variation received by that student. Common programs such as Excel and LibreOffice open the file as a spreadsheet with built-in formulas that compute each student's final mark according to the weights, as we shall detail in this paper.
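For concreteness, the sketch below shows one way such a correction file could be post-processed in Python. The column names (student_id, W1, W2, ...) are hypothetical stand-ins, since the formulas shipped in the actual CSV already perform this computation inside the spreadsheet.

import csv

def final_marks(path, num_questions):
    # Hypothetical layout: one row per student; column Wi holds the
    # weight (in [0, 1]) earned on question i, i.e. the weight of the
    # alternative that student marked.
    marks = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            earned = sum(float(row[f"W{i}"])
                         for i in range(1, num_questions + 1))
            marks[row["student_id"]] = earned / num_questions
    return marks

For instance, final_marks("exam01.csv", 10) would map each student id to a mark in [0, 1], mirroring what the spreadsheet formulas compute.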
As a related work we cite (Presedo et al., 2015), in which the authors use Moodle to create multiple-choice questions with weighted alternatives. Their system also enables the user to give an exam in hardcopy, but with neither the student's id nor variations of the exam. Moreover, it requires the plugin Offline Quiz (moodle.org/plugins/mod_offlinequiz). Moodle offers the Calculated question type, which we call a parametric question in MCTest, where the statement and the alternatives accept wildcards; in Moodle, however, such wildcards support only simple mathematical operations. By contrast, MCTest enables nominal exams, numerous variations and wildcards that accept complex formulas written in Python and its libraries. Details on parametric questions with MCTest can be found in (Zampirolli et al., 2021; Zampirolli et al., 2020; Zampirolli et al., 2019).
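To illustrate the gap in expressiveness, the fragment below computes per-variation values that go beyond simple arithmetic by calling a Python library; it is only a sketch of what such a wildcard may evaluate, not MCTest's actual markup, which is documented in the references above.

import numpy as np

# Draw a fresh 2x2 integer matrix for each exam variation and use its
# determinant as the correct alternative; Moodle's Calculated type
# cannot express this kind of library call.
A = np.random.randint(1, 10, size=(2, 2))
answer = round(float(np.linalg.det(A)), 2)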
The paper is organized as follows. Section 2 describes the adapted MCTest for automated assessment with multiple-choice questions using weighted answers; Section 3 presents and discusses the obtained results; finally, Section 4 presents our main conclusions and opportunities for future work.
2 USING ADAPTED MCTest:
MATERIALS AND STEPS
This work applies the open-source Information and Communication Technology (ICT) MCTest, available on GitHub (https://github.com/fzampirolli/mctest). We have extended MCTest to enable weighted answers for multiple-choice questions. In this section, we explain how to create exams that include such questions.
2.1 Creating Multiple-choice Questions
After downloading MCTest from GitHub, the system administrator must install it on a server. Before creating a question, they have to register an Institution, a Course and a Discipline, and also associate a professor as the Discipline Coordinator. The coordinator can create discipline Topics and also add more professors. See vision.ufabc.edu.br for details. Afterwards, any of these professors can add a Class and also questions associated with one of its Topics. An example would be setting [ED]<template-figure> at Choose Topic in Figure 1. Namely, this topic belongs to a discipline called ED, a mnemonic for Example Discipline.
In that figure we have Short Description: template-fig-tiger-en, which is optional but makes it easier to locate questions in Question Banks (QB), as we shall explain in Subsection 2.2. The field Group is also optional and allows the user to define a group of questions, so that in each exam MCTest will always draw only one question from that group. The most relevant field is Description, where we can insert paragraphs in LaTeX and also combine them with Python code, as explained later in another example.
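As a rough preview of what the Description field can hold, the sketch below splices Python-computed values into a LaTeX statement. The function and its structure are hypothetical illustrations; the actual MCTest markup is presented with the paper's own example later on.

import random

# Hypothetical sketch: build a per-variation LaTeX statement and its
# correct answer by substituting Python values into the text.
def render_statement():
    a = random.randint(2, 9)
    b = random.randint(2, 9)
    statement = rf"Compute $\displaystyle {a}^{{{b}}}$."
    return statement, a ** b

stmt, ans = render_statement()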