THE EVALUATION OF AN E-LEARNING
WEB-BASED PLATFORM
Michail N. Giannakos
Department of Informatics, Ionian University, Ntinou Theotoki 2B, Corfu, Greece
Keywords: Evaluation, Usefulness, Usability, e-Learning, Effectiveness, Efficiency, Pedagogical usability,
Presentation, Hypermediality, User’s activity, Application proactivity, Learning effectiveness calculator, e-
Learning platform.
Abstract: In recent years, designing useful learning diagnosis systems has become a hot research topic. To help teachers and designers create useful e-learning environments, we sought an evaluation method that assesses both an application's usefulness and its pedagogical qualities. Because a single evaluator (typically a teacher, designer or planner) can hardly be an expert in all relevant fields of science, a multidisciplinary evaluation framework has been created to help evaluators address the critical quality factors of e-learning. The purpose of this paper is to describe an evaluation system based on the usability and pedagogical usability evaluation of e-learning. The evaluation framework and the prototype have been tested at the Department of Informatics at Ionian University, in the Mathematical Modelling course.
1 INTRODUCTION
New information and communication technologies
allow learning “far away” from the teaching source.
One challenge for e-learning educators is to design a useful learning diagnosis system (Ssemugabi, De Villiers, 2007). Such an e-learning system rests on two cores: usability and pedagogical usability. The International Organization for Standardization defines usability as (ISO-9241, 1998):
“The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context.”
There are various evaluation methods such as
analytical, expert heuristic evaluation, survey,
observation and experimental methods (Hartson,
Andre, & Williges, 2003, Quintana, Carra, Krajcik,
Elliot, 2001).
Pedagogical usability evaluation should address
aspects of pedagogy and learning from education
domains as well as human-computer interaction
factors (Ravden, Johnson, 1989), such as the
effectiveness of interfaces and the quality of
usability and interaction (Silius, Tervakari,
Pohjolainen, 2003).
The paper is organized as follows: Section 2 describes the e-learning platform used in the evaluation procedure. Section 3 describes the evaluation procedure in both the usability and the pedagogical usability fields. Section 4 reports the early results and draws conclusions.
2 THE E-LEARNING PLATFORM
Our e-learning platform (Figure 1) consists of a web page whose focal point is an enhanced webcast, complemented by other capabilities such as Java applications and links to selected Web applications and services. The interface (Figure 1) shows the format of our e-learning platform. The system is user friendly and requires no special computer skills from the user. We also suggest that the educational material be studied linearly, exactly as its creator designed it.
Of course, the system is not restricted to the material the webcast provides, thanks to the other media that are utilised, mainly through the Web.
Giannakos, M. N. (2010). THE EVALUATION OF AN E-LEARNING WEB-BASED PLATFORM. In Proceedings of the 2nd International Conference on Computer Supported Education, pages 433-438. DOI: 10.5220/0002799504330438. Copyright © SciTePress.
Figure 1: The e-Learning Platform.
3 E-LEARNING EVALUATION
The e-learning evaluation was based on earlier research on human-computer interaction (Quintana, Carra, Krajcik, Elliot, 2001), psychology and pedagogy, as well as on evaluation research which has its roots in the theory of the usefulness of computer systems (Silius, Tervakari & Pohjolainen, 2003).
Usefulness of e-learning environments is divided
into two main issues: usability and pedagogical
usability (Figure 2).
Figure 2: Usefulness of an e-learning environment as a combination of its usability and pedagogical usability, based on Nielsen (1993).
3.1 Usability Evaluation
Ensuring usability is one of the main challenges for e-learning system developers. An appropriate set of 10 criteria (Table 1), based on an earlier study by the author (Giannakos, 2009) and combined with a 5-point rating scale similar to Nielsen's (Nielsen, 1993), is used to assess problems and assign severities (Table 2).
Table 1: Set of usability criteria.
General Usability Criteria for e-Learning Context
1 Avoid unnecessary elements. Avoid, where possible, chattiness, complex graphics, etc. Extra information distracts the user from the target.
2 Use comprehensible language. Avoid computer and system terms; prefer phrases the user can easily understand.
3 Minimize the user's mnemonic load. Help the user to recognize rather than recall.
4 Maintain consistency throughout the interface.
5 Provide feedback. Inform users about what is happening in the system: if the wait exceeds 10 seconds, show a sign of work progress; for waits of 1-10 seconds, change the cursor shape.
6 Give easy and clear ways of escape, so the user can exit quickly from a wrong situation. Provide the ability to Cancel, Undo and Redo.
7 Provide shortcuts for quick access by experienced users (e.g. keyboard shortcuts, predictive typing, repetition of last commands, recent documents, macros). Shortcuts should be obvious to users.
8 Provide clear error messages. Avoid encoded error messages; use precise, constructive, non-aggressive language, with indications and links to help.
9 Design to prevent user errors. For example, let the user select a name instead of typing it, ask for confirmation before a dangerous action, and avoid using the same command with different meanings in different situations.
10 Provide efficient help and manuals. Searching the manuals should be easy and structured according to the user's tasks, with extensive use of examples.
CSEDU 2010 - 2nd International Conference on Computer Supported Education
434
Table 2: Five-point rating scale.
1 - Cosmetic Problem: will not affect the use of the system. Fix it if possible.
2 - Minor Problem: users can easily work around the problem. Fixing it should be given a low priority.
3 - Medium Problem: users are likely to encounter this problem but will quickly adapt. Fixing it should be a medium priority.
4 - Major Problem: users will find this problem difficult but may work around it. Fixing it should have a high priority.
5 - Catastrophic Problem: users will be unable to do their work because of this problem. Fixing it is mandatory.
N - Not Applicable: I don't consider this to be a problem.
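As a sketch of how this scale might be applied in practice, the snippet below averages several evaluators' severity ratings for one criterion, skipping "N" (not applicable) entries. The function name and sample data are illustrative, not taken from the paper.

```python
# Sketch: aggregate evaluators' severity ratings for one usability criterion.
# Ratings follow the five-point scale above; "N" (not applicable) is skipped.

def mean_severity(ratings):
    """Average the numeric severity ratings, ignoring 'N' entries."""
    numeric = [r for r in ratings if r != "N"]
    return sum(numeric) / len(numeric) if numeric else None

# Three evaluators rating one criterion; one considers it not applicable:
print(mean_severity([3, 4, "N"]))  # -> 3.5
```

A criterion rated "N" by every evaluator yields `None`, i.e. no severity is assigned.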
Based on previous research (Ardito, Costabile, De Marsico, Lanzilotti, Levialdi, Roselli, Rossano, 2006), usability evaluation analysis can be divided into four equally weighted dimensions:
Presentation:
All aspects bound to the visualization of the services and elements of e-learning platforms.
Hypermediality:
Hypermediality allows communicating through different channels and even organizing lessons in a non-sequential way, possibly allowing a student to choose a logical path different from the one suggested.
Application Proactivity:
E-learning platform services not strictly related to reading the content. Ease of use of such services is even more important in Learner-Centred Design (LCD) systems, where the user's only effort should be learning, which is the primary goal.
User's Activity:
User needs that may arise during the interaction.
Each dimension is analysed according to two general principles (Ardito, Costabile, De Marsico, Lanzilotti, Levialdi, Roselli, Rossano, 2006): effectiveness and efficiency.
Effectiveness:
How well the tools provided by the platform allow learning and preparing lessons in an effective way. How well the provided services satisfy these needs greatly influences the learning effectiveness.
Efficiency:
How efficiently the activities the user usually performs are structured and visualised, and how well the platform adapts to the technology used by the learner to access it.
A deeper analysis (Ardito, Costabile, De Marsico, Lanzilotti, Levialdi, Roselli, Rossano, 2006) results in the usability evaluation model of Figure 3.
Figure 3: Usability evaluation model.
Combining the 10 criteria of our study with the four dimensions described above yields Table 3. This table, together with the 5-point rating scale and the following equation (1), is our guide for the usability evaluation procedure.
U = 0.5\,U_1 + 0.5\,U_2 = 0.5\,\frac{P_1 + H_1 + UA_1 + AP_1}{4} + 0.5\,\frac{P_2 + H_2 + UA_2 + AP_2}{4}    (1)
where P denotes Presentation, H Hypermediality, UA User's Activity, AP Application Proactivity and U Usability; subscript 1 denotes the usability effectiveness indicator and subscript 2 the usability efficiency indicator.
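Equation (1) can be sketched in code as follows; the function name and the assumption that all dimension scores are normalised to [0, 1] are illustrative, not part of the paper.

```python
# Sketch of Eq. (1): usability U as the equally weighted mean of the
# effectiveness scores (subscript 1) and efficiency scores (subscript 2)
# over the four dimensions: Presentation (P), Hypermediality (H),
# User's Activity (UA), Application Proactivity (AP).

def usability(p1, h1, ua1, ap1, p2, h2, ua2, ap2):
    u1 = (p1 + h1 + ua1 + ap1) / 4  # usability effectiveness
    u2 = (p2 + h2 + ua2 + ap2) / 4  # usability efficiency
    return 0.5 * u1 + 0.5 * u2

print(usability(0.8, 0.7, 0.9, 0.6, 0.7, 0.8, 0.6, 0.9))  # -> 0.75
```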
3.2 Pedagogical Usability Evaluation
In its different forms, e-learning offers a set of considerable advantages over traditional teaching (Valcheva, Todorova, 2005):
Individual instruction
Reduced costs
Opportunity for team work
Flexibility of the learning material, etc.
e-Learning takes the place of face-to-face learning. As a result, the pedagogical theories (Bruner, 1960, Quintana, Carra, Krajcik, Elliot, 2001) that were applied in face-to-face learning must be carried over into e-learning. That is why a pedagogical evaluation of e-learning is needed.
Table 3: The combination of the Table 1 criteria with the four-dimension analysis composes this questionnaire, which is the base of the usability evaluation of e-learning platforms. Based on (Ardito, Costabile, De Marsico, Lanzilotti, Levialdi, Roselli, Rossano, 2006, Vlamos, 2003, Gillham, 2000).
General Principles / Guidelines
Effectiveness
Presentation
For interface graphical matters the same UCD attributes hold
Errors and points to avoid are marked
Possibility to personalize interface graphics
Hypermediality
The lecturer is supported in preparing multimedia material
Easy navigation between subjects is allowed by highlighting cross-reference
through state and course maps
Through different media channels communication is possible
You can have a personalized access to learning subjects
Application
Proactivity
Lecturers are able to access scaffolding libraries and propose winning models
Ability to administer user profile
The platform automatically updates students’ progress tracking
It is possible to put in learning domain tools
Possibility to put in assessment test in different forms
User’s Activity
Authoring tools are easy-to-use
Ability to learn learning domain tools even when it is not on the schedule
Possibility to eliminate scaffolding or personalize its reduction
Asynchronous and synchronous tools are available
Possibility to communicate with lecturers and also with students
Possibility to make annotation
Integration of the given material is possible
Efficiency
Presentation
System condition is clearly and continually shown
Progress tracking is clearly visualized
Possibilities and commands offered are obvious
Course form is clearly visualized
Alteration of the graphical aspect to the context of use is supplied
Hypermediality
The repository can be accessed by the lecturer and the student also
Available creation of contextualized bookmarks
Off-line platform access, without losing tools or learning content
Application
Proactivity
There are mechanisms in order to prevent usage mistakes
There are mechanisms in order to teach-through-mistakes
Easy to use platform tools
Possibility to automatically and correctly reduce scaffolding
There are different modes to access the repository by the lecturer and the students
Possibility to adapt technology into the content of use
Registration of the date of the last modification so updating is possible
User’s Activity
There are mechanisms for search by key or natural language
Pedagogical usability evaluation is divided into learning effectiveness and learning efficiency. First we explain the calculation method of learning efficiency. If learning is defined as knowledge or skills acquired by instruction or study, learning efficiency can be defined as the sum of knowledge and skills gained that improve performance, divided by the sum of all the information delivered during the learning process (Valcheva, Todorova, 2005).
Perfect learning efficiency, where all the information delivered leads to learning that improves performance, is achieved at a rate of 1.0.
Figure 4: Efficiency calculation algorithm.
The efficiency score of an e-learning course can be computed with special tests. These tests will contain all the delivered knowledge. The average result of
Figure 5: Flow diagram of each learner’s e-learning effectiveness calculation (Huang, Chu, Guan, 2007).
these tests over all the members of the group is the efficiency percentage of the platform, PU_2.
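A minimal sketch of this calculation, assuming each test score is already expressed as the fraction of the delivered knowledge the learner retained:

```python
# Sketch: learning efficiency PU_2 as the average test result of the group.
# A score of 1.0 corresponds to perfect learning efficiency, where all the
# delivered information led to learning that improves performance.

def learning_efficiency(test_scores):
    return sum(test_scores) / len(test_scores)

print(learning_efficiency([0.5, 0.75, 1.0]))  # -> 0.75
```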
To calculate the learning effectiveness we follow an approved method (Huang, Chu, Guan, 2007). Because our e-learning platform is a web page application, we follow a web-page learning-effectiveness calculation algorithm.
According to this method (Huang, Chu, Guan, 2007), the input T_i, shown on the upper left of Figure 5, represents learner X's browsing time of the ith web page during his/her online learning activities. Notably, the browsing time measured is a single trip to the web page, not a sum of trips to the page over time.
In this work we first compute learner X's average browsing time of each web page,

\bar{T} = \frac{T_1 + T_2 + \cdots + T_n}{n} = \frac{1}{n}\sum_{i=1}^{n} T_i,    (2)

where n represents the total number of web pages that learner X browsed.
We then compute the deviation of the effective learning time for browsing the ith web page,

d_i = T_i - \bar{T}.    (3)
The bias of the effective learning time period for browsing the ith web page is defined as

b_i = \frac{d_i}{\bar{T}}.    (4)
Next we compute the weight value of the ith web page, which represents the learning effectiveness when learner X browsed the ith web page,

le_i = \frac{1}{b_i^2 + 1}.    (5)
Notably, the integer one is added to the denominator to resolve the infinity problem when the bias is zero. Accordingly, le_i becomes one when the bias b_i is zero. This is also consistent with the definition of learning effectiveness in this work, since the learner spent a regular learning time in browsing the ith web page when the bias b_i is zero. Furthermore, all the web pages organized for the learning materials on the e-learning platform are assumed to have similar complexities and difficulty levels in this work. If different pages have varied inherent complexities and difficulty levels, the instructor should specify a difficulty level for each web page that is proportional to the estimated browsing time, and the rectified average browsing time of each web page is then given by:
\bar{T} = \frac{\frac{T_1}{w_1} + \frac{T_2}{w_2} + \cdots + \frac{T_n}{w_n}}{n},    (6)

where w_i denotes the complexity and difficulty level of the ith web page.
The deviation of the effective learning time for browsing the ith web page, as given by Eq. (3), is updated accordingly:

d_i = \frac{T_i}{w_i} - \bar{T}.    (7)
The learning effectiveness that learner X achieved after browsing n web pages can be accumulated as follows:

le_{total} = \frac{1}{n}\sum_{i=1}^{n} le_i, \qquad PU_1 = \frac{1}{m}\sum_{\text{m learners}} le_{total}.    (8)
As a result, the effectiveness of the e-learning platform (PU_1) can be calculated as the average learning effectiveness over all the members of the experimental group that we examined.
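The pipeline of Eqs. (2)-(8) can be sketched in code as follows. The function names are illustrative; equal page difficulty (all w_i = 1) is assumed by default, and the final averaging over learners follows the description of PU_1 above.

```python
# Sketch of the learning-effectiveness calculation (Eqs. 2-8), following
# the summary of Huang, Chu & Guan (2007) given in the text.

def learner_effectiveness(times, weights=None):
    """Cumulative learning effectiveness of one learner over n pages."""
    n = len(times)
    w = weights or [1.0] * n                              # equal difficulty by default
    t_bar = sum(t / wi for t, wi in zip(times, w)) / n    # Eq. (2) / (6)
    les = []
    for t, wi in zip(times, w):
        d = t / wi - t_bar                                # Eq. (3) / (7)
        b = d / t_bar                                     # Eq. (4)
        les.append(1.0 / (b * b + 1.0))                   # Eq. (5)
    return sum(les) / n                                   # Eq. (8), per learner

def platform_effectiveness(all_times):
    """PU_1: average learning effectiveness over all m learners."""
    return sum(learner_effectiveness(t) for t in all_times) / len(all_times)

# A learner with perfectly regular browsing times (zero bias on every page)
# achieves the maximum effectiveness of 1.0:
print(learner_effectiveness([30.0, 30.0, 30.0]))  # -> 1.0
```

Irregular browsing times raise the bias b_i and pull the per-page values below one, so the cumulative score penalises pages skimmed too quickly or dwelt on too long.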
4 CONCLUSIONS
Finally, the goal of this research is a general algorithm that gives the usefulness of our e-learning system (Eq. 9). Defining Usefulness Use, Usability U, Pedagogical Usability PU, with subscript 1 the effectiveness indicator and subscript 2 the efficiency indicator, we arrive at the following general algorithm:
Use = \frac{U + PU}{2} = \frac{\frac{U_1 + U_2}{2} + \frac{PU_1 + PU_2}{2}}{2} = \frac{U_1 + U_2 + PU_1 + PU_2}{4}.    (9)
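Equation (9) reduces to a simple mean of the four indicators, as the following sketch illustrates; the function name and input values are illustrative assumptions, not the paper's measured results.

```python
# Sketch of Eq. (9): usefulness as the mean of the four indicators,
# usability effectiveness/efficiency (U1, U2) and pedagogical usability
# effectiveness/efficiency (PU1, PU2), assumed normalised to [0, 1].

def usefulness(u1, u2, pu1, pu2):
    return (u1 + u2 + pu1 + pu2) / 4

print(round(usefulness(0.75, 0.75, 0.86, 0.88), 2))  # -> 0.81
```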
Our effort towards an early credibility verification of this e-learning evaluation system consists of evaluating the e-learning application with the method mentioned above and conducting a between-groups evaluation case study. In this case study the traditional teaching method is considered a useful way of learning. More specifically, our e-learning platform was examined according to the effectiveness, efficiency and 5-point rating criteria mentioned above. After that, a class of 40 students at the Department of Informatics, Ionian University, was divided into two equal groups. The first group took the e-learning courses in the laboratory and the second group took the courses
with the traditional way. Then, applying the method mentioned above, and defining E_1 the e-learning efficiency and E_2 the traditional-learning efficiency, we obtain:

E_1 = \frac{\text{Students' Score}}{Use_{platform}} = \frac{0.6}{0.7} = 0.86

E_2 = \frac{\text{Students' Score}}{Use_{traditional}} = \frac{0.75}{0.85} = 0.88
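The two ratios can be checked with a few lines of arithmetic (values taken from the comparison above):

```python
# Check of the case-study efficiencies: each group's efficiency is the
# ratio of the students' average score to the usefulness attributed to
# that teaching mode.

e1 = 0.6 / 0.7    # e-learning group
e2 = 0.75 / 0.85  # traditional group

print(round(e1, 2), round(e2, 2))  # -> 0.86 0.88
```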
Since the two efficiencies are comparable, we can to some extent accept that the evaluation method we suggest is valid.
REFERENCES
Ardito, C., Costabile, M. F., De Marsico, M., Lanzilotti, R., Levialdi, S., Roselli, T., Rossano, V. 2006. An approach to usability evaluation of e-learning applications. Universal Access in the Information Society, 4(3), 270-283. doi:10.1007/s10209-005-0008-6
Bruner, J. 1960. Toward a Theory of Instruction. Belknap Press, Cambridge.
Giannakos, M. 2009. A combinational evaluation method
of computer applications, MASAUM Journal Of Basic
and Applied Sciences (MJBAS), 1(2) 240-242.
Giannakos, M. 2009. Combination of Education
Technologies for the Enhancement of an
Asynchronous System, Proceedings of ICICTE 2009,
897-905.
Gillham, B. 2000. Developing a Questionnaire. London:
Bill Gillham.
Hartson, H.R., Andre, T.S. & Williges, R.C. 2003. Criteria
for Evaluating Usability Evaluation Methods,
International Journal of Human-Computer
Interaction, 15(1) 145-181.
Huang, C-J, Chu, S-S, Guan, C-T. 2007. Implementation
and performance evaluation of parameter
improvement mechanisms for intelligent e-learning
systems, Computers & Education, 49(3), 597-614.
ISO. 1998. ISO-9241: Guidance on Usability Standards.
http://www.iso.ch/iso/en/CatalogueListPage.Catalogue
List?ICS1=13&ICS2=180 retrieved 20/10/09.
Nielsen, J., 1993. Usability Engineering, Academic Press.
Quintana, C., Carra, A., Krajcik, J., Elliot, S. 2001.
Learner-centred design: reflections and new directions.
In: Carroll (ed.) Human-computer interaction in the
new millennium. ACM Press, New York, USA.
Ravden, SJ., Johnson, GI. 1989. Evaluating usability of
human computer interface: a practical method. Wiley,
Chichester.
Silius, K., Tervakari, A-M. & Pohjolainen, S. 2003. A
Multidisciplinary Tool for the Evaluation of Usability,
Pedagogical Usability, Accessibility and Informational
Quality of Web-based Courses. The Eleventh
International PEG Conference: Powerful ICT for
Teaching and Learning, Proceedings of PEG2003.
Ssemugabi, S., De Villiers, R., 2007. A Comparative
Study of Two usability Evaluation Methods using a
web-based e-Learning Application, In Proceedings of
the 2007 Annual research Conference of the South
African Institute of Computer Scientists and
information technologists on IT research in
developing countries, ACM Press, New York, USA.
Valcheva, D., Todorova, M., 2005. Defining a system of
indicators for evaluation the effectiveness of e-
learning, International Conference on Computer
Systems and Technologies - CompSysTech’2005.
Vlamos, P. 2003. Criteria for textbook evaluation. 3rd Mediterranean Conference on Mathematics.