undertaking the assessment did not have possession
of their own individual handsets, so an assessment
sign-in sheet was created. When students signed
in at the assessment room, they showed their student
identification to the invigilator who could then
locate and give them their individual handsets. At
the end of the assessment, the handsets were
collected from the students before they left the assessment room.
Each test consisted of 20 MCQs, and each
question was allocated a time limit of 2 minutes.
Traditionally, when the system is used for instant formative feedback, it is common to display a graph showing the percentage of responses for each answer option, and to display the correct answer for each question.
As the system was going to be used for summative
assessment, the author felt that displaying this data
would be a distraction for the students, and could
impact negatively on their confidence if they saw
that they were getting a number of questions wrong.
Therefore, after each question was polled for 2
minutes, signalled by a countdown indicator, no data
was displayed in the assessment room and the
lecturer would move on to the next question.
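For illustration only, the short sketch below (in Python, with hypothetical names) captures the test structure described above: 20 MCQs, each with a 2-minute polling window that closes automatically and displays no results. It is not the TurningPoint™ software, merely a representation of the parameters used.

from dataclasses import dataclass

@dataclass
class MCQItem:
    question: str
    options: list            # e.g. ["A", "B", "C", "D"]
    correct: str             # correct option, withheld from students during the test
    poll_seconds: int = 120  # 2-minute polling window per question

def total_test_minutes(items):
    """Upper bound on test length if every polling window runs to its limit."""
    return sum(item.poll_seconds for item in items) / 60

# A 20-question test at 2 minutes per question gives at most 40 minutes of polling.
test = [MCQItem(f"Question {n}", ["A", "B", "C", "D"], "A") for n in range(1, 21)]
assert total_test_minutes(test) == 40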
Following the completion of each MCQ test, the
PRS software was used to generate reports of
individual student marks, and also generic feedback
for each question with pie charts showing the spread
of responses and identifying the correct answers. For
each of the tests, the students’ individual marks and generic feedback were made available through the module website on the University’s managed learning environment (MLE) within a 24-hour period.
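As an illustration of the two report types described above, the sketch below (Python, using an assumed data layout rather than the PRS software’s actual export format) computes individual marks and the per-question spread of responses from which such feedback could be produced.

from collections import Counter

def build_reports(responses, answer_key):
    """responses: {student_id: {question_no: chosen_option}}
    answer_key: {question_no: correct_option}"""
    # Individual mark: number of questions where the chosen option matches the key.
    marks = {
        student: sum(1 for q, correct in answer_key.items() if picked.get(q) == correct)
        for student, picked in responses.items()
    }
    # Per-question spread of responses (the basis for the pie-chart feedback).
    spread = {q: Counter(picked.get(q) for picked in responses.values())
              for q in answer_key}
    return marks, spread

responses = {"s001": {1: "B", 2: "C"}, "s002": {1: "B", 2: "D"}}
answer_key = {1: "B", 2: "C"}
marks, spread = build_reports(responses, answer_key)
# marks -> {'s001': 2, 's002': 1}; spread[1] -> Counter({'B': 2})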
3 CHALLENGES
One of the primary considerations in adopting the PRS as an assessment method in the module was the overall contribution of the assessment towards the students’ final module mark. The three PRS assessments were equally weighted at 20% so, collectively, the PRS assessments
contributed 60% towards the students’ final module
mark. The other 40% contribution towards the
students’ final module mark came from the submission of an individually written assignment. As the PRS
assessments contributed the greater portion of the
students’ final module mark, it was important to
ensure that the assessment method and process were
robust and reliable. Prior to the implementation of
the PRS for summative assessment, a number of key
challenges were identified:
i. Reliability of hardware/software.
ii. Student trust in the PRS and assessment
process.
iii. Lecturer confidence in the PRS and the
assessment process.
Similar challenges have been identified by other
researchers (Roe and Robinson, 2010).
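As a brief worked illustration of the weighting described earlier (using hypothetical marks), the final module mark combines the three PRS tests at 20% each with the written assignment at 40%:

def final_module_mark(prs_marks, assignment_mark):
    """prs_marks: three PRS test marks out of 100; assignment_mark: out of 100."""
    return 0.20 * sum(prs_marks) + 0.40 * assignment_mark

# e.g. PRS tests of 70, 60 and 80 with an assignment mark of 65:
# 0.2*(70 + 60 + 80) + 0.4*65 = 42 + 26 = 68
assert abs(final_module_mark([70, 60, 80], 65) - 68.0) < 1e-9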
The PRS had been used extensively for formative feedback since 2005 and, during that time, had shown itself to be reliable. In order to be assured of its reliability for summative assessment, a number of checks were put in place. Once the MCQ
test had been created, it was given a trial run using a
small number of handsets (approximately 6). This
allowed a visual check of the questions for legibility and clarity of format. It also allowed the countdown indicator on each question slide to be checked for visibility and consistency, since it was used to show the open polling window and to close the question poll automatically once the allotted time had elapsed. Finally, a report was generated to
demonstrate that the question data had been captured
and could be displayed as both individual marks and
generic feedback.
A key challenge to the assessment method would
be the need to gain the students’ trust and confidence
in the process, particularly as this would be the first
time that they had encountered this method of
summative assessment. This was approached in a
number of ways. Firstly, the assessment process was
described in detail during the module induction
session, which helped to demonstrate to the students
that the structure and process of the assessment had
been fully considered prior to its introduction.
Secondly, early in the module, following the first topic, a formative session was arranged with a “mock” of an actual summative assessment. By this point, handsets had been assigned to individual student names and were distributed to the students. The first
slide asked students to “press button 1” to check that
the handsets were working. During the polling
timeframe, the “show response grid” button was
selected on the TurningPoint™ toolbar. This
projected a grid of all the handsets, and students
could see their name change colour as they pressed
their handset buttons (see figure 1 below – the
student names have been removed to ensure
confidentiality). This helped to increase the students’
confidence and trust in the reliability of the
technology.
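The handset check described above can be pictured with the following minimal sketch (Python, illustrative names only; it is not the TurningPoint™ response-grid feature itself): each registered handset is marked as seen once a response arrives, so any handset that has not responded can be identified immediately.

def handset_check(registered_handsets, received_responses):
    """registered_handsets: handset IDs issued to students.
    received_responses: {handset_id: button_pressed} collected during polling."""
    seen = {h: h in received_responses for h in registered_handsets}
    missing = [h for h, ok in seen.items() if not ok]
    return seen, missing

seen, missing = handset_check(["H01", "H02", "H03"], {"H01": 1, "H03": 1})
# seen -> {'H01': True, 'H02': False, 'H03': True}; missing -> ['H02']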
Following the “mock” PRS assessment, the
process was discussed and any questions or concerns
were addressed. One issue that the students raised
was the concern that their handset wouldn’t work on