helped the discussion about the quality of the proposed solutions.
Since the students were the same for both lessons, the exercises and questions were different in the two lessons to avoid boring them. In the first lesson, we used the system described in this paper to let the students answer the questions and to record data about their interaction. In the second lesson, the students had to raise their hands to indicate agreement with one of the possible answers, and the teacher counted the raised hands to record the data.
During the first lesson, 96 users registered with the system, 55 of them as “anonymous”. The system recorded 215 answers to the eight questions, i.e., nearly 27 answers per question on average. This number may seem low given the number of users following the lesson, but the questions were not easy for the students, and they knew the system was recording their interactions, so they wanted to avoid giving wrong answers. Nevertheless, considering only the authenticated students, the number of answers per user was, on average, 2.68.
The teacher counted 94 students attending the second lesson. In this case, the number of answers to each question was, on average, 4.5, so the use of the system during the first test increased the number of answers nearly sixfold. This result may be due to many factors, such as the age and gender of the participants, but it is clear that using a device to answer, instead of a raised hand, can increase the number of received answers very significantly.
Finally, we collected data in 37 commercial presentations, where the system was used by different companies aiming to increase user engagement and attention during the presentation of their products. Five of these 37 events were on-line presentations; the others were on-site. During these events, 3753 users accessed the slideshow using our system (2372 during the on-line events and 1381 on-site). Unfortunately, we were not always able to record the number of people attending the events; however, in the 11 events for which the attendance is known, on average 75% of the audience followed the slideshows on their smartphones or tablets, with a peak of 100% at one event.
During the 37 slideshows, the users liked 4842 slides in total; the average number of “like” operations per user was 1.66, with one event reaching 8.59 “likes” per user.
The users filled out 2266 questionnaires, shared
38 slides on Facebook and sent 175 slides by email.
Moreover, the system collected 920 comments.
All these data show that the system is able to persuade users to interact with the presentation, which is an interesting result. When we could collect the number of participants, we recorded that the system involved, on average, 75% of the audience. Moreover, by interacting with the presentation, the users also gave some information about themselves to the server, e.g., their email addresses; this is particularly useful during commercial presentations, since such data can be used to re-contact people after the event. Similarly, information such as the time spent viewing each slide can be used for commercial purposes, since it helps to understand what most interested each user.
Since the number of interactions with the system is only a rough measure of its ability to engage users, at the end of the last three events we asked the audience to answer a question about their satisfaction with the system. Of the 1090 users who answered (85% of the total audience), 63% gave a very positive evaluation, 36% gave a positive evaluation, and only 1% gave a negative evaluation; we can therefore state that the users liked the proposed system.
Finally, these tests have shown that the system is not simply a prototype: its coverage of PowerPoint® features is sufficient to make it usable in practice. Moreover, during all these tests, neither the users nor the speakers ever reported problems with the synchronization of the slides.
5.3 Tests with Other Formats
During the whole test phase, we also considered other formats, i.e., presentations created with Apple Keynote® or OpenOffice. In particular, we studied the ODF format (Organization for the Advancement of Structured Information Standards (OASIS), 2005), but it turned out to be completely different from the OOXML format (ECMA International, 2012) used by PowerPoint®, so our converter is currently not able to support it. However, our system can still be used, with some limitations, to distribute presentations created with Keynote® or OpenOffice, by first converting the output of these applications to the OOXML format.
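As an example of such a preliminary conversion step, the following minimal sketch invokes LibreOffice in headless mode to convert an ODF presentation to OOXML. The soffice command is a standard LibreOffice facility, but its availability on the PATH and the file names are assumptions; this is one possible converter, not necessarily the one used in our tests.

    import subprocess
    from pathlib import Path

    def convert_to_pptx(source: Path, out_dir: Path) -> Path:
        """Convert an ODF presentation (.odp) to OOXML (.pptx) via LibreOffice.
        Assumes the soffice binary is on the PATH; this is an illustrative
        converter choice, not the paper's actual tool chain."""
        subprocess.run(
            ["soffice", "--headless", "--convert-to", "pptx",
             "--outdir", str(out_dir), str(source)],
            check=True,
        )
        return out_dir / (source.stem + ".pptx")

    # Illustrative usage (file names are hypothetical):
    # pptx = convert_to_pptx(Path("talk.odp"), Path("converted"))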
We tested three sets of slides generated by other software and then converted to the OOXML format. The results of the conversion performed by our system were evaluated with the same method described above: each slide was graded 0, 1, or 2 according to the quality of the conversion, by the same people mentioned in Section 5.1.
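As a minimal sketch of how this score aggregates (the per-slide grades below are purely hypothetical, not the actual evaluation data), the final percentage is simply the sum of the grades over twice the number of slides:

    # Hypothetical per-slide grades (0, 1 or 2, as in the evaluation above);
    # the values are illustrative, not the paper's evaluation data.
    grades = [2, 2, 1, 2, 0, 2, 1]

    max_points = 2 * len(grades)           # each slide is worth at most 2 points
    score_pct = 100 * sum(grades) / max_points
    print(f"{score_pct:.0f}% of the total available points")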
The results show that, on average, the presentations obtained 89% of the total available points. In partic-