and addressed this issue in follow-up student focus
groups. He discovered that “students expressed the
belief that if they were content with their teachers’
performance, there was no reason to complete the
survey. Therefore, the lack of student participation
may be an indication that the teacher was doing a
good job, not the reverse.”
Given that public Web sites such as
RateMyProfessors.com already exist, we had thought
that the mere existence of our system would trigger
a Field of Dreams reaction among students: if we
built it, they would come. Obviously, we were wrong.
The reasons why we were wrong are not clear at
this time. A presentation on the system was made to
the Student Government Association, an article was
printed in the campus newspaper, and detailed
directions were posted on the first page of the
application’s Web site. (Notice the Further Information
links in the First Page Choice Point shown
previously in Figure 4.) At least some professors
strongly encouraged their students to use the system
and were deeply disappointed that they didn’t, as
demonstrated by the e-mail shown in Figure 8.
I checked my online evaluations and discovered
that no student filled one out. Can you check this out
and confirm [it]? If true, I am surprised and
dismayed, since I devoted 15 minutes of one class
encouraging students to fill them out -- not just for
me -- but for all of their classes.
Two possibilities exist:
1) You toggled [Web accessibility to OFF
on December 22, the last day of the final exam
period] per the Provost’s memo before my students
had a chance to use the system. (I toggled it back
[ON] just yesterday [January 2].)
2) Students are students.
[I think the reason is] more likely to be the
second possibility. Based upon this data (0 for 36
students), I am inclined to use a paper version
distributed in class [in] the future.
Figure 8: Faculty E-Mail re Response Rates
There has been discussion about offering some
sort of lottery prize for which students would
automatically become eligible if they filled out at
least one course evaluation form, but we haven’t been
able to figure out how to make that possible without
compromising the system’s anonymity.
We suspect that a combination of several factors
explains why students did not use the system more
heavily.
• Students may not trust the system’s anonymity
and fear that professors may “get back at them” if
they submit less-than-flattering evaluations. We
certainly stressed the system’s anonymity and the
fact that professors can’t see the results until after
the deadline to turn in final grades. Still, when
we replaced the “pick a card from the hat”
system with the “copy the key from one Web
screen to another” system, we removed the physical
manifestation of the system’s anonymity and
ultimately asked the students to trust our assertion
that the system kept their identities secret (see
the sketch following this list).
• Despite our efforts to “get the word out,” many
students still may not have known that the system
existed.
• The process of retrieving course survey keys and
then logging in to fill out an evaluation for each
course may be more confusing or difficult for
students than we imagine.
• As implied by our colleague in his e-mail shown
in Figure 8, students may simply be too lazy or
apathetic to bother with the system.
• The end of the semester and final examinations
are traumatic for many students, and evaluating a
course honestly and usefully can take real effort.
When students are asked to add course evaluations
to their list of duties at this time, it may be
inevitable that doing so ends up with a lower
priority than studying, relaxing, worrying, and
recuperating.
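As a concrete illustration of the first concern above,
the sketch below shows, in highly simplified form, how
a key-based scheme can decouple responses from
identities. It is not our production implementation;
the schema, function names, and choice of Python are
illustrative assumptions only. The essential idea is
that the survey keys handed to students are random
tokens, and the server stores only which course a key
belongs to, never which student retrieved it.

import secrets
import sqlite3

def setup(db):
    # Hypothetical schema: keys are tied to courses, never to students.
    db.executescript("""
        CREATE TABLE survey_key (key TEXT PRIMARY KEY,
                                 course_id TEXT,
                                 used INTEGER DEFAULT 0);
        CREATE TABLE evaluation (course_id TEXT, responses TEXT);
    """)

def issue_keys(db, course_id, enrollment_count):
    # Mint one random key per enrolled student. The keys are displayed
    # on the key-retrieval screen; which student copied which key is
    # deliberately never recorded.
    keys = [secrets.token_urlsafe(8) for _ in range(enrollment_count)]
    db.executemany("INSERT INTO survey_key VALUES (?, ?, 0)",
                   [(k, course_id) for k in keys])
    return keys

def submit_evaluation(db, key, responses):
    # Accept an evaluation only for a valid, unused key, then mark the
    # key as used. The stored evaluation references the course, not the key.
    row = db.execute("SELECT course_id FROM survey_key "
                     "WHERE key = ? AND used = 0", (key,)).fetchone()
    if row is None:
        raise ValueError("invalid or already-used survey key")
    db.execute("UPDATE survey_key SET used = 1 WHERE key = ?", (key,))
    db.execute("INSERT INTO evaluation VALUES (?, ?)", (row[0], responses))

Even in this sketch, however, the guarantee is
procedural rather than physical: nothing stops a
server from quietly logging which student retrieved
which key. That is precisely the leap of faith
students are now asked to make, whereas the old hat
full of cards made the decoupling visible.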
We intend to sponsor focus groups or at least talk to
groups of students early next semester to try to
understand why the system wasn’t used more
extensively. We consider this a critical task, as it is clear
that we need to increase student response rates
significantly to make our online course evaluation
system successful.
3.2 Faculty Information Distribution
It was rather disheartening to realize that numerous
faculty don’t read official university announcements
sent to their university e-mail accounts.
A memo from the Provost and system developers
was sent to all faculty e-mail accounts on December
5, 2004. This message contained general information
about the system’s use as well as each faculty
member’s individual username and password.
Further information was posted on the site via
a link that was accessible without logging in (see the
First Page Choice Point shown previously in Figure
4). Many faculty claimed never to have received
this e-mail, for a variety of reasons such as those
illustrated in the two e-mail excerpts in Figure 9.
These problems were exacerbated by the fact
that the system was not officially announced until
the final two weeks of the semester. The reason for
the delay was that the Provost wanted the Dean’s
Council to approve the system before announcing it
to the general faculty, and that Council only met at
the beginning of each month. Therefore, we weren’t
able to get approval until early December.