Researching Student Perceptions of and Experiences with Alternative
Learning Technologies
Replacing Traditional Tutorials with i>clicker Tutorials and Online Tutorials
Barry Cartwright and Sheri Fabian
Simon Fraser University, Burnaby, Canada
Keywords: Alternative Learning Technologies, Student Response Systems, Traditional Tutorials, i>clicker Tutorials,
Online Tutorials, Online Learning, Online Surveys, Learning Outcomes, Blended Learning, Hybrid
Learning.
Abstract: The researchers invited university students to participate in an online survey about their experiences with traditional tutorials, fully online tutorials, and tutorials that employed student response systems. Participants were drawn from two offerings of a large introductory course that had recently transitioned from traditional tutorials to student response system (i>clicker) tutorials, from four offerings of two courses that had recently transitioned from traditional tutorials to online tutorials, and from two upper division courses that continued to employ traditional tutorials. The purpose of this study was to evaluate student
perceptions of and experiences with alternative learning technologies, and to determine whether these
alternative technologies improved learning outcomes when compared to more traditional teaching methods.
This paper reports on the design and implementation of the i>clicker and online tutorials, the design and
administration of the online survey, and strategies employed to enhance student participation in the survey.
While there was no measurable difference in terms of learning outcomes, the survey results indicate that
students prefer online tutorials over i>clicker and traditional tutorials, and that there is generally a high level
of student satisfaction when it comes to alternative learning technologies. The researchers were able to
identify which facets of traditional, i>clicker and online tutorials the students found most appealing (and/or
useful), and which facets they did not find appealing and/or useful.
1 INTRODUCTION
This paper reports on the results of a research study
into student perceptions of and experiences with
tutorials involving student response systems, online
tutorials, and traditional (in-person) tutorials. It
further describes the design and implementation of
the online tutorials and the tutorials involving
student response systems, the design and
administration of the online survey used to collect
the data, and the strategies employed to encourage
student participation in the survey. The research
findings identify which facets of the three tutorial
formats the students found most appealing and/or
useful, and which facets they did not find appealing
and/or useful. It was also possible to confirm the
degree to which student experiences with and
perceptions of emerging alternative technologies
actually correlated with learning outcomes.
2 BACKGROUND
At Simon Fraser University (SFU), three-credit
courses require three hours of classroom instruction per week.
In the past, delivery of first and second year
Criminology courses invariably consisted of weekly
two hour lectures by the course instructor (in a large
lecture theatre, with all of the students present), plus
weekly 50 minute tutorials, led by a teaching
assistant (in a small classroom, with 15-17 students,
at a time other than that of the lecture).
In 2009, the School replaced traditional tutorials
in Introduction to Criminology (CRIM 101) with
tutorials involving student response systems,
referred to variously as digital voting systems, or
audience response systems (Comer and Lenaghan,
2012; Mathiasen, 2015). The School chose i>clicker
technology, known for its relative simplicity,
comparatively low cost, and compatibility with
PowerPoint, Excel and Word (cf. Barber and Njus,
2007; Heaslip, Donovan and Cullen, 2014). Unlike
traditional tutorials, conducted by teaching assistants
in small classrooms with small groups of students,
these 50 minute CRIM 101 i>clicker tutorials are
conducted in a large lecture theatre immediately
following the weekly two hour lecture, with the
entire class in attendance.
Encouraged by the apparent success of the
i>clicker tutorials, the School decided in 2011 to
replace traditional tutorials in CRIM 104 and CRIM
131 with online tutorials. These 50 minute online
tutorials can be taken at any time and from any
computer with an Internet connection, during the
one-week period they are open.
This shift in tutorial format was occasioned by
increasing student enrolment in Criminology
courses, and by budgetary issues caused by reduced
government funding (cf. Heaslip et al., 2014).
Among the appeals of these new learning
technologies are that they decrease costs and
accommodate growing student demand, while
maintaining the impression that universities are on
the cutting edge of technological and educational
innovation (Kirkwood and Price, 2014;
Larreamendy-Joerns and Leinhardt, 2006).
CRIM 101, 104 and 131 are high enrolment
courses, compulsory for intending Criminology
majors and minors, of whom there are sometimes
more than two thousand. These are also popular
‘general interest’ or ‘breadth’ courses for
undergraduate students from other departments.
Thus, these three courses have lengthy waiting lists,
with many intending Criminology majors and
minors complaining about the courses acting as a
bottleneck.
To appreciate the logistics, for a single course
such as CRIM 101, using traditional tutorials would
require 22 different tutorial times, sufficient
classroom space (over a period of several days), and
six graduate students/teaching assistants.
Following the transition, CRIM 101 came to
involve a series of ten i>clicker tutorials (SFU has a
thirteen week semester, with no tutorial in the first
week, no tutorial during the mid-term exam week,
and no tutorial in the last week). The clicker tutorials
were structured around a customized course reader
designed in conjunction with the tutorials, with each
of the tutorials focusing on a selection from the
reader. The tutorials began with an i>clicker quiz to
assess student familiarity with the assigned reading,
followed by further instruction regarding the
reading, introduction of supplementary course
content, and interactive class activities and
discussion (facilitated by clickers). Attendance,
participation and quiz performance for clicker
tutorials accounted for between 12 and 20 percent of
the overall grade for the course, depending upon the
course instructor.
CRIM 104 and 131 have enrolments ranging
from 110 to 180 students. In their new
configurations, both involve a weekly, two hour
face-to-face lecture, supplemented by a series of ten
online tutorials, again premised upon a thirteen week
teaching semester. This combination of face-to-face
lectures and online tutorials could be described as a
‘hybrid’ or ‘blended’ approach to education
(Alammary, Sheard and Carbone, 2014; Means,
Toyama, Murphy, Bakia and Jones, 2010; Nguyen,
2015).
Each online tutorial for CRIM 104 consists of a
20-30 minute audio-visual presentation, an
interactive preliminary assessment exercise (an
educational video game), 10 interactive flash cards
that flip from term to definition, and a timed, 10
minute (5 question) quiz at the end (cf. MacKenzie
and Ballard, 2015). Students can earn one point per
tutorial for attendance and participation, by spending
a minimum of 30 minutes going through all four of
the required elements, and up to one point for their
performance on a five question quiz at the end. The
CRIM 104 tutorials are worth two percent each, or
20 percent of the overall grade for the course.
CRIM 131 is broken down at the beginning of
the semester into groups (or tutorials) consisting of
19 students. Each online tutorial for CRIM 131
consists of two to three online readings plus a 20
minute (10 question) quiz at the end. Over the
course of the semester, each CRIM 131 student is
required to provide an online presentation on the
assigned reading to their group, along with
discussion questions. Twice per semester, assigned
discussants from the group respond to these
questions, with the presenter facilitating discussion
(monitored by the teaching assistants and instructor)
(cf. Alammary et al., 2014). The CRIM 131 tutorials
are worth 25 percent of the overall grade for the
course—the quizzes, 10 percent, the presentation, 10
percent, and discussion, 5 percent.
The implementation of i>clicker tutorials in
CRIM 101 and online tutorials in CRIM 104 and
131 was influenced as much by pedagogical
considerations as by lengthy waiting lists and
budgetary constraints. While there remains a degree
of suspicion amongst some faculty with respect to
these new learning technologies (cf. Comer and
Lenaghan, 2012; Kirkwood and Price, 2014), there
is ample evidence to suggest that they have
justifiably earned their place in higher education.
The jury may still be out regarding the capacity of
student response systems to improve learning
outcomes and/or final grades, but there seems to be
little question that such systems are useful for
stimulating discussion and increasing attendance and
participation in first and second year lecture-style
courses with large enrolments (FitzPatrick, Finn and
Campisi, 2011; Steer and Gray, 2012; Ulbig, 2016).
Research indicates that asynchronous discussion
groups like those used in CRIM 131 foster more
meaningful interaction, and thus promote ‘reflective
learning,’ because students have more time to think
about (reflect upon) what they want to say than in
face-to-face classroom discussions (Comer and
Lenaghan, 2012; Turney, Robinson, Lee and Soutar,
2009). The type of interactive preliminary
assessment exercises (educational video games)
employed in CRIM 104 have been shown to have a
positive effect on test results, and also, have proven
to be useful to students when it comes time to re-
visit course content in preparation for mid-term and
final examinations (Grimley, Green, Nilsen,
Thompson and Tomes, 2011; Hood, 2013; Means et
al., 2010). It could be said that i>clicker technology,
online discussion groups and interactive video
games are consistent with the ‘active learning’
paradigm—the notion that students learn better
when they assume a greater degree of responsibility
for their own education, through participation and
engagement in problem-solving, self-assessment,
and interaction (online or otherwise) with other
students (Handelsman, Miller and Pfund, 2007;
Heaslip et al., 2014).
3 METHODOLOGY
A decision was made to employ an online survey,
rather than an in-class survey. There are distinct
advantages to online surveys, including efficiency,
cost savings and the ease with which data can be
collected and analyzed (Anderson and Kanuka,
2003; Evans et al., 2009). A primary consideration
was the time required to complete a paper
questionnaire during class, especially when students
might have to sift through numerous questions that
did not directly pertain to them (e.g., questions about
tutorial formats with which they had no firsthand
experience, or questions about their proficiency in
English when English was their first language).
Moreover, to increase the breadth of the sample
population, it was deemed necessary to survey
courses that had been offered in the previous
semester. Given that classes were already finished
when the survey was conducted, students who took
one or more of the courses previously would have
been unable to complete an in-class survey. There
are, however, a number of reported problems with
online surveys, including low response rates and
survey abandonment (Adams and Umbach, 2012;
Webber, Lynch, and Oluku, 2013). Measures taken
to encourage participation and maximize completion
rates included six modest cash prizes (drawn
randomly) for students who completed the survey,
plus a series of three carefully timed reminders,
including one ‘personalized’ reminder from the
course instructor’s own email account (Best and
Krueger, 2004; Joinson and Reips, 2007).
Participation in the online survey was voluntary
and anonymous. The research was categorized as
“minimal risk” and was approved by SFU’s
Research Ethics Board in April 2013.
3.1 Population
The researchers invited students enrolled in their
Fall 2012 and Spring 2013 offerings of CRIM 101
(i>clicker tutorials), Fall 2012 and Spring 2013
offerings of CRIM 104 and CRIM 131 (online
tutorials) and the Spring 2013 offerings of CRIM
300 and CRIM 321 (traditional tutorials) to
participate in this online survey regarding their
experiences with traditional tutorials, i>clicker
tutorials and online tutorials. Students enrolled in the
Spring 2013 offerings of CRIM 300 and CRIM 321
were asked to participate because they by definition
had firsthand experience with traditional tutorials in
these two classes, and many had personal experience
with i>clicker tutorials or online tutorials, or both.
3.2 Response Rates
The response rate was 50 percent, with a completion
rate of 94 percent (N = 629), considered high for
online surveys of university classes (cf. Sax,
Gilmartin, and Bryant, 2003; Sue and Ritter, 2007).
The cash prizes likely had some effect on the
comparatively high response rates. However, most
participants completed the survey once started, and
many took the time to type in additional comments,
indicating a degree of personal commitment to the
outcome and integrity of the survey. In addition,
most prospective participants were personally
invited to take the survey by the course instructors
during lecture, which may have influenced the high
response and completion rates (Pan, Woodside and
Meng, 2013).
3.3 Survey Design
The online survey consisted of 22 questions
regarding student experiences with and perceptions
of traditional tutorials, student response system
(i>clicker) tutorials, and online tutorials. There were
also 21 ‘demographic’ questions pertaining to age,
gender, citizenship, fluency in the English language,
credit hours accumulated and grade point average.
These ‘quantitative’ questions were designed in a
manner that facilitated direct data transfer to SPSS.
There were also three areas in the survey where
students were invited to offer as much ‘qualitative’
commentary as they liked on the three different
types of tutorials.
The survey employed software from
fluidsurveys.com, which hosts online surveys, stores
survey data, and offers a number of pre-designed
survey templates (Evans, Burnett, Kendrick,
MacRina, Snyder, Roy and Stephens, 2009). The
software permits researchers to send personal email
invitations to prospective participants, keep track of
overall response rates, and send out reminders. As
importantly, fluidsurveys.com software offers ‘skip
logic,’ whereby required questions are answered by
all participants, while topic-specific questions are
presented only to participants who trigger them
through their previous responses (Evans et al., 2009;
Rademacher and Lippke, 2007). To illustrate,
students who indicated that they had never taken a
traditional tutorial would not be required to answer
questions on this subject, and instead, would
automatically be redirected to the next set of
questions regarding i>clicker tutorials (assuming
that they had previously indicated experience with
i>clicker tutorials). This ‘skip logic’ kept
participants from answering questions that were not
intended for them, thereby resulting in an average
survey completion time of 11 minutes and 22
seconds. Simplifying the format and reducing
length are crucial factors in minimizing “survey
fatigue” and enhancing completion rates for online
surveys (Anderson and Kanuka, 2003; Kaplowitz,
Lupi, Couper, and Thorp, 2012; Maloshonok and
Terentev, 2016).
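To make the routing concrete, the sketch below models this kind of skip logic in Python. The question-block names and the triggering rules are hypothetical illustrations only; they are not taken from the fluidsurveys.com platform or from the actual survey instrument.

# Minimal sketch of survey skip logic: required blocks are shown to all
# participants, while topic-specific blocks are shown only to those whose
# earlier answers trigger them. Block names and rules are hypothetical.

def blocks_to_present(answers):
    """Return the ordered list of question blocks for one participant."""
    blocks = ["consent", "tutorial_experience"]          # required for everyone
    experience = answers.get("tutorial_experience", set())
    if "traditional" in experience:
        blocks.append("traditional_tutorial_questions")
    if "iclicker" in experience:
        blocks.append("iclicker_tutorial_questions")
    if "online" in experience:
        blocks.append("online_tutorial_questions")
    blocks.append("demographic_questions")               # required for everyone
    return blocks

# Example: a participant who has never taken a traditional tutorial is routed
# straight past that block to the i>clicker questions.
print(blocks_to_present({"tutorial_experience": {"iclicker", "online"}}))
# ['consent', 'tutorial_experience', 'iclicker_tutorial_questions',
#  'online_tutorial_questions', 'demographic_questions']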
4 RESEARCH FINDINGS
Of the 1500 students eligible to participate, 663
started the online survey, and 629 completed it. Of
the 663 who started the online survey, 135 (20.4%)
were declared Criminology majors or minors, while
another 276 (41.6%) were intending Criminology
majors or minors. That said, 252 of the respondents
came from a wide range of disciplines, such as
business administration, psychology, health
sciences, economics, linguistics, communications,
kinesiology, biology, chemistry, and mathematics.
As expected when surveying students in first, second
and third year university courses, ages ranged from
18 to 46, with the average age being 21 (mode = 19,
median = 20). Of the 630 who answered the question
on gender, 385 (58.1%) were females, 244 (36.8%)
were males, while one identified as transgendered.
The seeming overrepresentation of females is
consistent with known enrolment patterns in the
courses being surveyed. Moreover, other researchers
have reported that females are more likely than
males to respond to online surveys (Laguilles,
Williams, and Saunders, 2011; Sax et al., 2003).
4.1 Student Perceptions of and
Experiences with Traditional
Tutorials
Traditional tutorials are the predominant way in
which the requisite third hour of weekly instruction
is delivered to first and second year students at SFU.
As noted above, these tutorials are conducted by
teaching assistants in small classrooms, with 15-17
students in attendance. Tutorial activities typically
consist of student presentations, discussion of the
weekly readings and lecture content, and/or
supplementary instruction by the teaching assistant.
Student perceptions of and experiences with
traditional tutorials were generally positive (see
Figure 1 below). Of the 172 students who reported
firsthand experience with these tutorials, 64.5% said
that they enjoyed the opportunity to meet and
interact with other students, 62% felt that they
acquired a better understanding of the course
content, and 56.4% that they received a better
quality of instruction (multiple responses were
permitted).
On the other hand, 51.4% said that they disliked
doing student presentations, 51.4% that there was a
disparity in the quality of instruction between the
different teaching assistants, and 44.7% that they did
not enjoy having to speak in class. By far the most
common complaint—by 59.2% of those with
firsthand experience with traditional tutorials—was
that tutorial times conflicted with other courses they
wanted to take, or conflicted with their work
schedules (cf. Bolliger and Erichsen, 2013).
Figure 1: Positive Experiences with Traditional Tutorials.
4.2 Student Perceptions of and
Experiences with i>clicker
Tutorials
The student response system (i>clicker) tutorials for
CRIM 101 are conducted in a large lecture theatre
immediately following the weekly two hour lecture,
with the entire class in attendance. These tutorials
begin with an i>clicker quiz on the assigned reading,
followed by further instruction or clarification
regarding the reading, supplementary course content,
and interactive class activities and discussion.
Although a marked departure from the traditional
tutorial format, i>clicker tutorials seem to have
generally been well received by students (see Figure
2 below). Of the 319 students who reported firsthand
experience with clicker tutorials, 70.5% said that
they appreciated the opportunity to practice exam-
type questions during the quizzes and interactive
activities (cf. Hwang, Wong, Lam and Lam, 2015;
Ulbig, 2016), 51.5% that clicker technology allowed
them to participate actively in class without having
to speak (cf. Heaslip et al., 2014), 53.1% that they
liked being able to gauge knowledge of the course
content through the clicker quizzes, and 48.1% that
they liked having the tutorial scheduled for the one
hour period immediately following the lecture.
More than half the students (52.7%) said they
found the three hour session (a two hour lecture
followed by a 50 minute tutorial) too long. This
could not be resolved without returning to the
traditional tutorial format or turning to the online
tutorial format. The second most common complaint
(47.0%) was the $40 cost of the i>clicker (cf. Ulbig,
2016). This second problem is resolving itself over
time: more and more courses at SFU are employing
i>clicker technology, growing numbers of used
clickers are available at low prices, and clickers are
being shared between friends or family members who
are not registered in the same course.
Figure 2: Positive Experiences with Clicker Tutorials.
4.3 Student Perceptions of and
Experiences with Online Tutorials
The online tutorials for CRIM 104 and 131 can be
taken at any time and from any computer with
Internet connectivity, during the one week that they
are open. The tutorials for these two courses are
quite different from each other. The CRIM 104
tutorials consist of an audio-visual presentation, an
interactive preliminary assessment exercise/video
game, interactive flash cards and a 10 minute (5
question) quiz. Online tutorials for CRIM 131
consist of weekly online readings and a 20 minute
(10 question) quiz at the end, plus online
presentations and discussions.
Of the 303 students who reported firsthand
experience with online tutorials, 63.5% appreciated
that the tutorial structure allowed them to participate
without having to speak in class, 60.7% said they
liked being able to gauge their knowledge of the
course content through the weekly quizzes, and
57.8% that they liked the opportunity to practice
exam-type questions during quizzes and interactive
activities (see Figure 3 below). In contrast to
i>clicker tutorials, where only 36.5% felt they
developed a better understanding of the course
content and the assigned readings, 53.8% reported a
better understanding of course content and readings
with online tutorials. What students overwhelmingly
appreciated about online tutorials (82.5%) was that
they could attend them at a time of their own
choosing (cf. Cole, Shelley and Swartz, 2014).
The most common complaint (72.6%) was the
$40 cost of accessing the online tutorials. This
problem likely cannot be resolved, as the tutorials
for both courses were designed in conjunction with
publishing companies, and involved copyright
issues, proprietary templates, royalties, and
professional design teams external to the university
(cf. MacKenzie and Ballard, 2015).
Figure 3: Positive Experiences with Online Tutorials.
4.4 Student Ratings of Tutorial
Formats
When students were asked to rate the three different
tutorial formats, by assigning the type of letter
grade that they would themselves receive for their
own coursework (with A being the highest grade,
and F the lowest), the online tutorials proved to be
the most well received of the three formats (41.3%
As, 41% Bs, and 11% Cs), followed by i>clicker
tutorials (28.5% As, 51.5% Bs and 14.7% Cs), and
then traditional tutorials (11.2% As, 63% Bs and
21.3% Cs) (see Figure 4).

Figure 4: Student Ratings for the Three Tutorial Formats.

However, when asked in a follow-up question what
type of tutorial format they would prefer if given a
choice, the order between
traditional tutorials and i>clicker tutorials was
reversed, with 20.2 percent stating a preference for
online tutorials, 15.7 percent saying they would
prefer traditional tutorials, and 14.3 percent saying
they would prefer clicker tutorials. Student support
also existed for a combination of traditional and
i>clicker tutorials (10.6%), or traditional and online
tutorials (8.4%), or clicker and online tutorials
(5.7%), suggesting that students appreciate ‘blended’
course delivery methods (Bolliger and Erichsen,
2013; Cole et al., 2014; Hood, 2013).
5 DISCUSSION
The researchers were concerned that students might
have rated i>clicker and online tutorials more
favorably because they appealed to a large segment
of the SFU student body that speaks English as an
additional language. Of the 626 students who
completed the survey, 467 (74.6%) spoke a language
other than English. Of those, 356 (76.2%) learned to
speak another language before they learned to speak
English. Approximately 35% spoke a Chinese
dialect (e.g., Mandarin, Cantonese), 9.6% spoke
Punjabi, 7.7% spoke Korean, with the remainder
speaking a wide variety of languages.
A series of chi-square tests was performed,
approaching the question from a number of different
angles, including student-reported difficulty in
reading, writing or speaking English, and residency
status in Canada. Most showed negligible results.
Moreover, directional measures, including lambda
and Goodman and Kruskal's tau, indicated no
predictive relationship or association between
language proficiency and overall ratings of the
i>clicker and/or online tutorials.
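To illustrate the kind of tests being described, the sketch below runs a chi-square test of independence and computes Goodman and Kruskal's lambda and tau in Python. The cross-tabulation (language background by overall tutorial rating) and all of its counts are hypothetical placeholders, not the study data.

# Hypothetical cross-tabulation (rows: English first language yes/no;
# columns: overall tutorial rating A / B / C or lower). Counts are
# illustrative only and are not the study data.
import numpy as np
from scipy.stats import chi2_contingency

table = np.array([[45, 60, 25],       # English as first language
                  [130, 180, 85]])    # English as an additional language

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")

def goodman_kruskal_lambda(table):
    """Lambda, predicting the column variable from the row variable."""
    n = table.sum()
    max_col_marginal = table.sum(axis=0).max()
    return (table.max(axis=1).sum() - max_col_marginal) / (n - max_col_marginal)

def goodman_kruskal_tau(table):
    """Tau, predicting the column variable from the row variable
    (proportional reduction in Gini variation)."""
    n = table.sum()
    col_p = table.sum(axis=0) / n
    row_p = table.sum(axis=1) / n
    v_y = 1 - np.sum(col_p ** 2)                        # variation in Y alone
    cond = table / table.sum(axis=1, keepdims=True)     # P(Y | X = row)
    v_y_given_x = 1 - np.sum(row_p * np.sum(cond ** 2, axis=1))
    return (v_y - v_y_given_x) / v_y

print(f"lambda = {goodman_kruskal_lambda(table):.4f}")
print(f"tau = {goodman_kruskal_tau(table):.4f}")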
There was also a concern that i>clicker and/or
online tutorials might be rated more favorably
because they were perceived as being less
scholastically challenging than traditional tutorials.
However, there were no statistically significant
findings to report when it came to the relationship
(or lack thereof) between scholastic achievement (as
measured by student-reported GPA) and overall
student ratings of the i>clicker and/or online
tutorials.
Finally, there was a concern that between-
instructor teaching methods might have influenced
how the students rated the three tutorial delivery
methods. Again, however, there were no statistically
significant findings to report.
For all introductory classes, tutorial grades were
subjected to an independent samples t-test. The
mean tutorial grade for the traditional format was
78.41 (s = 10.75), compared with 82.58 (s = 11.09)
for the non-traditional formats. As shown in Table 1,
this is a statistically significant mean difference of
4.17 percentage points (t = 13.68, df = 3769,
p < 0.001) between the traditional and non-traditional
formats. The tutorial grades for each separate class
displayed the same pattern of a significant mean
difference between traditional and non-traditional
tutorial grades, with the exception of CRIM 131. Despite the mean
differences between grades for traditional and non-
traditional tutorials for CRIM 101 and 104, there
were no substantive differences in the overall final
grades for the different versions of the courses.
Table 1: Independent Samples t-test of Differences in Tutorial Grades.

                  All courses     CRIM 101        CRIM 104        CRIM 131
                  (mean)          (mean)          (mean)          (mean)
Traditional       78.41 (10.75)   77.79 (10.23)   79.91 (12.57)   78.61 (13.57)
Non-Traditional   82.58 (11.09)   84.25 (10.20)   87.11 (11.03)   78.15 (11.45)
Mean Difference   4.17 (0.30)     4.66 (0.35)     7.20 (0.69)     0.46 (1.25)
t-value           13.68***        13.45***        10.46***        0.37

***p<0.0001
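For readers who want to see how such a comparison is typically computed, the following Python sketch runs an independent samples t-test with scipy. The two grade vectors are simulated placeholders drawn from normal distributions with the means and standard deviations reported in Table 1, and the group sizes are assumptions chosen only so that the degrees of freedom match the reported df of 3769; none of this is the actual grade data.

# Sketch of an independent samples t-test comparing tutorial grades between
# the traditional and non-traditional formats. The grade vectors below are
# simulated placeholders, not the study data; group sizes (1500 and 2271)
# are assumptions chosen so that df = n1 + n2 - 2 = 3769.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(2013)
traditional = rng.normal(loc=78.41, scale=10.75, size=1500)
non_traditional = rng.normal(loc=82.58, scale=11.09, size=2271)

t_stat, p_value = ttest_ind(non_traditional, traditional, equal_var=True)
print(f"mean difference = {non_traditional.mean() - traditional.mean():.2f}")
print(f"t = {t_stat:.2f}, "
      f"df = {len(traditional) + len(non_traditional) - 2}, p = {p_value:.2g}")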
6 CONCLUSIONS
The fact that students liked the online tutorials more
than traditional tutorials, and/or that they rated
clicker tutorials on roughly the same plane as
traditional tutorials, does not necessarily imply that
online tutorials and i>clicker tutorials should be
regarded as superior to—or the equivalent of—
traditional tutorials. Indeed, the findings of this
present research study run contrary to the findings of
a number of other studies (cf. MacKenzie and
Ballard, 2015; Nguyen, 2015), which suggest that
greater use of online content and student response
systems in large classroom environments may lead
to improved learning outcomes. The data analysis
from the present study suggests that implementation
of these alternative learning technologies had a
minimal effect on learning outcomes, as measured
by final grades and grades on midterm and final
examinations (cf. Ulbig, 2016). There also remains
the salient issue of whether or not educators should
be catering to student preferences for “anonymity”
and not having to speak in front of a class (Bolliger
and Erichsen, 2013; Heaslip et al., 2014; Mathiasen,
2015).
Nevertheless, if students are open to these
emerging learning technologies, feel more engaged
in the learning process as a consequence, and feel
that they learn—and perform—better on
examinations, then the argument can be made that
these learning technologies deserve consideration for
wider deployment in higher education (cf. Cole et al.,
2014). In fact, since this research study was
conducted, one version of CRIM 101 has shifted to
the use of online tutorials similar to those already in
use in CRIM 104 (the other version of CRIM 101 is
still using i>clicker tutorials). Moreover, a new
special topics course on cybercrime, CRIM 218, has
been designed using i>clicker tutorials similar to
those still used in one of the versions of CRIM 101.
While these newly-designed courses were not
included in this present study, students at SFU are
asked to complete formal written evaluations for
every course that they take, rating the presentation of
course materials and the performance of the
instructor. Course evaluations completed by students
at the end of each term continue to indicate that
these alternative learning technologies have been
well received by students enrolled in the two new
courses.
ACKNOWLEDGEMENTS
This study was funded in part by a Teaching and
Learning Development Grant from the Institute for
the Study of Teaching and Learning in the
Discipline at Simon Fraser University in Burnaby,
Canada. We would also like to thank the Centre for
Online and Distance Education at SFU for allowing
us to conduct this study. In addition, we value the
work of our research assistants, Aynsley Pescitelli
and Rahul Sharma, and the assistance of the
Teaching and Learning Centre at SFU.
REFERENCES
Adams, J.D. and Umbach, P.D., 2012. Nonresponse and
online student evaluations of teaching: Understanding
the influence of salience, fatigue and academic
environments. Research in Higher Education, 53(4),
pp. 576-591.
Alammary, A., Sheard, J. and Carbone, A., 2014. Blended
learning in higher education: Three different design
approaches. Australasian Journal of Educational
Technology, 30(4), pp. 440-454.
Anderson, T. and Kanuka, H., 2003. e-Research: Methods,
Strategies, and Issues. Pearson Education, Inc.,
Boston.
Barber, M. and Njus, D., 2007. Clicker evolution:
Seeking intelligent design. Life Sciences Education,
6(1), pp. 1-20.
Best, S.J. and Krueger, B.S., 2004. Internet Data
Collection. Sage Publications, Thousand Oaks, CA.
Bolliger, D.U. and Erichsen, E.A., 2013. Student
Satisfaction with Blended and Online Courses Based
on Personality Type. Canadian Journal of Learning
and Technology, 39(10), pp. 1-23.
Cole, M.T., Shelley, D.J. and Swartz, L.B., 2014. Online
Instruction, E-Learning, and Student Satisfaction: A
Three Year Study. The International Review of
Research in Open and Distance Learning,
15(6), pp. 111-131.
Comer, D.R. and Lenaghan, J.A., 2012. Enhancing
Discussions in the Asynchronous Classroom: The
Lack of Face-to-Face Interaction Does not Lessen the
Lesson. Journal of Management Education, 37(2), pp.
261-294.
Evans, R.R., Burnett, D.O., Kendrick, O.W., MacRina,
D.M., Snyder, S.W., Roy, J.P.L. and Stephens, B.C.,
2009. Developing Valid and Reliable Online Survey
Instruments Using Commercial Software Programs.
Journal of Consumer Health on the Internet, 13(1), pp.
42-52.
FitzPatrick, K.A., Finn, K.E. and Campisi, J., 2011. Effect
of Personal Response Systems on Student Perception
and Academic Performance in Courses in a Health
Sciences Curriculum. Advances in Physiology
Education, 35(3), pp. 280-289.
Grimley, M., Green, R., Nilsen, T., Thompson, D. and
Tomes, R., 2011. Using Computer Games for
Instruction: The Student Experience. Active Learning
In Higher Education, 12(1), pp. 45-56.
Handelsman, J., Miller, S. and Pfund, C., 2007. Scientific
Teaching. Roberts and Company, Englewood, CO.
Heaslip, G., Donovan, P. and Cullen, J.G., 2014. Student
response systems and learner engagement in large
classes. Active Learning in Higher Education, 15(1),
pp. 11-24.
Hood, M., 2013. Bricks or clicks? Predicting student
intentions in a blended learning buffet. Australasian
Journal of Educational Technology, 29(6), pp. 762-
776.
Hwang, I., Wong, K., Lam, S.L. and Lam, P., 2015.
Student Response (clicker) Systems: Preferences of
Biomedical Physiology Students in Asian Classes. The
Electronic Journal of e-Learning, 13(5), pp. 319-330.
Joinson, A.N. and Reips, U., 2007. Personalized
Salutation, Power of Sender, and Response Rates to
Web-Based Surveys. Computers in Human Behavior,
23(3), pp. 1372-1383.
Kaplowitz, M.D., Lupi, F., Couper, M.P. and Thorp, L.,
2012. The Effect of Invitation Design on Web Survey
Responses. Social Science Computer Review, 30(3),
pp. 339-349.
Kirkwood, A. and Price, L., 2014. Technology-enhanced
learning and teaching in higher education: what is
‘enhanced’ and how do we know? A critical literature
review. Learning, Media and Technology, 39(1), pp.
6-36.
Laguilles, J.S., Williams, E.A. and Saunders, D.B., 2011.
Can Lottery Incentives Boost Web Survey Response
Rates? Findings from Four Experiments. Research in
Higher Education, 52(2), pp. 537-553.
Larreamendy-Joerns, J. and Leinhardt, G., 2006. Going
the Distance with Online Education. Review of
Educational Research, 76(4), pp. 567-605.
MacKenzie, L. and Ballard, K., 2015. Can Using
Individual Online Interactive Activities Enhance Exam
Results? MERLOT Journal of Online Learning and
Teaching, 11(2), pp. 262-266.
Mathiasen, H., 2015. Digital Voting Systems and
Communication in Classroom Lectures: an empirical
study based around physics teaching at bachelor level
at two Danish universities. Journal of Interactive
Media in Education, 1(1), pp. 1-8.
Means, B., Toyama, Y., Murphy, R., Bakia, M. and Jones,
K., 2010. Evaluation of Evidence-Based Practices in
Online Learning: A Meta-Analysis and Review of
Online Learning Studies. U.S. Department of
Education, Washington, D.C.
Nguyen, T., 2015. The Effectiveness of Online Learning:
Beyond No Significant Difference and Future
Horizons. MERLOT Journal of Online Learning and
Teaching, 11(2), pp. 309-319.
Pan, B., Woodside, A.G. and Meng, F., 2013. How
Contextual Cues Impact Response and Conversion
Rates of Online Surveys. Journal of Travel
Research, 53(1), pp. 58-68.
Rademacher, J.D. and Lippke, S., 2007. Dynamic Online
Surveys and Experiments with the Free Open-Source
Software dynaQuest. Behavior Research
Methods, 39(3), pp. 415-426.
Sax, L.J., Gilmartin, S.K. and Bryant, A.N., 2003.
Assessing Response Rates and Nonresponse Bias in
Web and Paper Surveys. Research in Higher
Education, 44(4), pp. 409-432.
Steer, D.N. and Gray, K., 2012. Personal Response
Systems and Learning: It is the Pedagogy that Matters,
Not the Technology. Journal of College Science
Teaching, 41(5), pp. 80-89.
Sue, V.M. and Ritter, L.A., 2007. Conducting Online
Surveys. Sage Publications, Thousand Oaks, CA.
Turney, C.S.M., Robinson, D., Lee, M. and Soutar, A.,
2009. Using Technology in Higher Education: The
Way Forward? Active Learning In Higher
Education, 10(1), pp. 71-83.
Ulbig, S.G., 2016. I Like the Way this Feels: Using
Classroom Response System Technology to Enhance
Tactile Learners’ Introductory American Government
Experience. Journal of Political Science Education,
12(1), pp. 41-57.
Webber, M., Lynch, S. and Oluku, J., 2013. Enhancing
Student Engagement in Student Experience
Surveys. Educational Research, 1, pp. 71-86.