the students could judge throughout the whole 90-minute session whenever they had the feeling that the professor was going too fast or too slow, or that the volume was not adequate. The data suggested that live feedback is a tool that is used only when problems occur. In the two sessions of the economics course, students used the possibility to give feedback regarding the volume only three times: at the beginning of the first lecture, during the first lecture when the professor was showing a video with poor sound quality, and at the beginning of the second lecture. At each of these three points, around 20% of the students judged the volume as too low. Concerning the professor's speed, there was significant feedback activity only in the first session (with more than 5% of the registered and active students voting). Students used the speed buttons to signal to the professor whether they needed more time to work on the learning questions. The professor had prepared six learning questions, which were distributed in blocks of two. During these breaks, around 20% of the students voted for the professor to go on faster (if they had already finished working on the learning questions) or to go slower (if they needed some more time). The evaluation activity diminished in the second session. One explanation is that students noticed that the professor did not immediately change his teaching. This points to the need for an interface that feeds the relevant information back to the lecturer (see Section 5).
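As an illustration of how such feedback activity could be reported to the lecturer, the following minimal Python sketch aggregates votes per minute and channel and flags time windows in which more than 5% of the registered and active students voted. The Vote record, the channel names, and the vote values are assumptions made for this example and do not describe the actual AMCS data model.

from collections import Counter
from dataclasses import dataclass

@dataclass
class Vote:
    minute: int   # minute of the lecture in which the vote was cast
    channel: str  # hypothetical channels: "speed" or "volume"
    value: str    # e.g. "faster", "slower", "too_low"

def summarize_feedback(votes, active_students, threshold=0.05):
    """Group votes per minute and channel; report only windows in which more
    than `threshold` of the registered and active students gave feedback."""
    windows = {}
    for v in votes:
        windows.setdefault((v.minute, v.channel), Counter())[v.value] += 1
    summary = []
    for (minute, channel), counts in sorted(windows.items()):
        share = sum(counts.values()) / active_students
        if share >= threshold:  # only "significant" activity is shown
            summary.append((minute, channel, dict(counts), round(share, 2)))
    return summary

# Example: 20% of 50 active students judge the volume as too low in minute 3,
# while only 4% ask for a faster pace, which stays below the threshold.
votes = [Vote(3, "volume", "too_low")] * 10 + [Vote(3, "speed", "faster")] * 2
print(summarize_feedback(votes, active_students=50))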
According to the professor, the results of the learning questions gave him a useful overview of the knowledge state of his audience. Students also appreciated the possibility to work on the learning questions; they even judged them to be more useful than the live-feedback tool.
5 CONCLUSIONS AND FURTHER
DEVELOPMENT
AMCS provides opportunities to support students in evaluating the lecturer and their teaching. The presented features mainly aim at fostering the students' regulation and mastering of the demands of self-regulated learning, but they can also be used for formative evaluation during and after the lecture. Lecturers can receive instant feedback, i.e., real-time information during each lecture, which they can react to directly, e.g., by adapting their presentation; they can also receive evaluative feedback, i.e., a summary of comments and opinions after the conclusion of each lecture, in order to make changes after the course or semester has ended. The first pilot tests have shown that learning questions, cognitive and metacognitive prompts, and instant feedback can be used in university lectures to support students in mastering the demands of this learning situation and to help lecturers improve their teaching. At the end of the semester, lecturers will be provided with an overview of all events, allowing an overall evaluation of the entire lecture or tutorial series.
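As a rough illustration of these two kinds of feedback, the following Python sketch derives both an instant view (the events of the running lecture) and an end-of-semester overview (all events grouped per lecture) from one shared event log. The event tuples and the function names instant_view and semester_overview are hypothetical and only serve to illustrate the distinction; they are not part of AMCS.

from collections import defaultdict

# Hypothetical event log as (lecture_id, kind, payload) tuples; the real
# data model of the system is not described here.
events = [
    (1, "speed", "slower"),
    (1, "learning_question", {"id": 2, "correct": True}),
    (2, "volume", "too_low"),
    (2, "comment", "Please repeat the last example."),
]

def instant_view(events, current_lecture):
    """Instant feedback: only the events of the running lecture,
    shown to the lecturer in real time."""
    return [(kind, payload) for lecture, kind, payload in events
            if lecture == current_lecture]

def semester_overview(events):
    """Evaluative feedback: all events grouped per lecture, for an
    overall evaluation after the course has ended."""
    per_lecture = defaultdict(list)
    for lecture, kind, payload in events:
        per_lecture[lecture].append((kind, payload))
    return dict(per_lecture)

print(instant_view(events, current_lecture=2))
print(semester_overview(events))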
In the next development steps, we will focus on the representation of the evaluation data to the lecturer and will provide more features for the aggregation and visualization of the evaluation results. In order to provide the real-time feedback without interrupting the presentation too much, a second device, i.e., a second screen, would be helpful; a smartwatch could also be utilized.
ACKNOWLEDGEMENTS
We wish to thank our hard-working team for implementing the prototypes and providing valuable contributions to our concept, namely Patrick Buchholz, Markus Heider, Tommy Kubica and Martin Weissbach.