DEFINING ADAPTIVE ASSESSMENTS
Construction of Adaptive Assessments Based on the Learning Style of the Students
Héctor Barbosa Leon, Francisco García Peñalvo
Departamento de Informática y Automática, Universidad de Salamanca, Plaza de los Caidos s/n, Salamanca, España
Maria José Rodríguez Conde
Departamento de Didáctica, Organización y Métodos de Investigación, Facultad de Educación
Universidad de Salamanca, España
Keywords: Learning Objects, Assessment, IMS specifications, Authoring tool, Learning Styles, Adaptive systems.
Abstract: This article presents a proposal to define learning objects for adaptive assessments. Two adaptation
processes are described: the first adapts the level of complexity of the questions, and the second, our
proposal, adapts the final presentation according to the user's learning style. To define the items and exams,
a model to construct adaptive assessments and a proposal of three levels of integration are detailed.
1 INTRODUCTION
Educational content is evolving from a static view
to an adaptive one, in which the content is adapted to
the needs and/or preferences of the users. This
content, developed in the form of units of learning,
aims to cover a specific learning objective and includes
one or several learning objects and their related material.
Among those objects we can find objects
describing assessment activities to evaluate the
knowledge of the students. The assessment activity
inside a unit of learning can be seen as an element
that closes and completes a circular activity, being an
integral part of the learning process (Barbosa &
García, 2005).
Nowadays, it is necessary to produce educational
Internet-based systems that permit the dissemination
of education, covering the needs of diverse
learner group profiles. To achieve this, it is desirable
that such systems perform automatic tasks to adapt
themselves to each user, decoupling the content from its
presentation by using a semantic approach rather
than a syntactic one, thus defining a meaningful Web. In
consequence, learning systems must be flexible and
efficient, and one way to accomplish that is to build them
as open and standardized systems (Barbosa & García,
2006).
In addition, the assessment activity inside the e-
learning process can be used to adapt the system:
setting a new user knowledge level, evaluating and
setting new learning profiles, assigning user grades and,
in consequence, re-adapting the content presented to the
user (Barbosa & García, 2005b). According to
the Australian Flexible Learning Framework
(Backroad Connections, 2003), assessment,
especially when it is included within a real learning
task or exercise, can be an essential part of the
learning experience, giving the entire Web site the
ability to adapt itself to the needs and the
acquired knowledge of its users.
The rest of the paper is structured as follows: in
section two, we present the model overview,
organized in levels of abstraction and sections for
each stage of activity. In section three we briefly
describe the construction of simple items using open
standards and the definition of complete exams
containing groups of simple items. In the fourth
section we present the integration levels, going from
simple items (first level) to complete exams with
adaptation rules. In the fifth section we describe the
adaptation processes in the final presentation
(adapted to the learning style) and in the
complexity level. Finally, we present our
conclusions and future work.
2 MODEL OVERVIEW
We propose a model to construct assessment items
with characteristics of adaptability. This model has
four main sections (figure 1); some of them have
three levels of conceptualization, with activity
definitions going from an abstract level (lower layers)
to a more concrete level (upper layers).
Figure 1: Model to construct adaptive assessment items.
We structured the model starting with the
definition of the basic core elements, so that we can
evolve them into more concrete definitions while
identifying their requirements and the interactions
between them (Barbosa & García, 2006b).
2.1 Levels: From an Abstract to a
Concrete View
In the first level (dark colour), we identify the core
elements: learning objects, management, test
construction, and LMS (Learning Management
System) interaction. In the second level (grey
colour), we describe the main sections, identifying
four phases: authoring, repository management,
visualization, and interaction with the LMS. In the third
level (white colour), we categorize each activity into
a subsystem (authoring, item management,
publishing, and interaction with the LMS).
2.2 Sections: From the Creation to the
Interaction With the LMS
Section one: Learning Objects Definition (ASI:
Assessment, Section, and Items). We focused our
research on the technologies and specifications for the
definition of this section; we use the authoring use
cases of the IMS QTI specification (IMS QTI,
2006). In addition, we want to use the IMS Learning
Design specification (IMS LD, 2006) to define a
resource-learning object at three different levels of
integration.
Section two: Item Management. We define this
section with the aim of giving the developer a tool
to organize, manage, and import ASI from other
authoring tools. We propose a native XML
(Extensible Mark-up Language) (XML, 2006)
database management system to manage the items.
Section three: Test Construction. In this section we
consider the activities performed by the assessor to
view the items by rendering them in a visualization
system. We suggest using widely accepted software plug-
ins to show the final rendering to the user. In this
section, the assessor selects the ASI to construct the
test that will be delivered to the LMS in an XML
format. In addition, we want to integrate data fields
so that the assessor can construct adaptive tests by
defining trees of related questions that depend on the
responses and the feedback presented to the user.
We propose the development of a user interface for
the students or candidates so they can access and
respond to the exam, sending the results to the LMS
to create an assessment record.
Section four: Test Delivery. In this phase, the
candidate or student activates the test by accessing
it through an LMS, from which we obtain the
learning style definition needed to make the adaptation.
3 DEFINING ASSESSMENT
ITEMS AND EXAMS USING
IMS SPECIFICATIONS
3.1 Defining Items Using IMS QTI and
IMS CP Specifications
The IMS CP specification is used when it is necessary
to transfer learning objects (lessons, exams or other
material) between Learning Management
Systems. In this case, we can use this specification to
package and transfer assessment items between
LMSs. In the case of a simple element, the package
will contain (a) the manifest (an XML file called
imsmanifest.xml), (b) the item (a QTI XML format
file) and (c) any auxiliary files required by the item
(see the IMS QTI integration guide).
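As an illustration, a minimal manifest for such a single-item package could be sketched as follows. This is only a sketch: the identifiers and file names are hypothetical, and the resource type string follows the QTI packaging conventions as we understand them.

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Minimal IMS CP manifest packaging one QTI item.
         Identifiers and file names are hypothetical. -->
    <manifest identifier="MANIFEST-ITEM-001"
              xmlns="http://www.imsglobal.org/xsd/imscp_v1p1">
      <organizations/>
      <resources>
        <!-- The QTI item description. -->
        <resource identifier="RES-ITEM-001" type="imsqti_item_xmlv2p0"
                  href="item001.xml">
          <file href="item001.xml"/>
          <!-- Auxiliary multimedia file required by the item. -->
          <file href="media/diagram001.png"/>
        </resource>
      </resources>
    </manifest>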
3.2 Defining Exams Items Using IMS
QTI, IMS LD and IMS CP
Specifications
The IMS QTI and IMS LD specifications allow their
integration to define a learning object that evaluates the
knowledge acquired by students when they interact
with a unit of learning (UoL). IMS LD
includes activities of course instruction, modules,
lessons and exams in a generic context (those
considered formative assessment) to reinforce
recently acquired knowledge or to give immediate feedback to
the student.
Moreover, the IMS specifications can be used to
define learning objects with extra characteristics, such as
adaptation rules for the final presentation and the
sequence of the questions.
The main structure defined in an IMS LD object is
the manifest, containing the organizations and
resources. Inside the organizations section, elements
such as roles, properties, activities, environments and
methods are described. The integration of the
specifications can be achieved by defining tags and
instructions in imsld:properties to control the visibility
and order of the elements, and in imsld:conditions to
define decision structures.
The environments section is a container for
environment elements, each of which can be used to
describe assessment items for a particular learning
style. These environment structures can be
executed by the LMS in parallel, allowing
multiple students with different learning styles to
access their own adaptable elements (or an adaptable
exam).
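For instance, the environments section for a visual and a verbal learning style could be sketched as follows. All identifiers are hypothetical, and the exact content model should be checked against the IMS LD binding.

    <imsld:environments>
      <!-- One environment per learning style; the LMS exposes to each
           student the one matching his/her profile. -->
      <imsld:environment identifier="ENV-visual">
        <imsld:title>Assessment items for visual learners</imsld:title>
        <imsld:learning-object identifier="LO-exam-visual">
          <imsld:item identifierref="RES-exam-visual"/>
        </imsld:learning-object>
      </imsld:environment>
      <imsld:environment identifier="ENV-verbal">
        <imsld:title>Assessment items for verbal learners</imsld:title>
        <imsld:learning-object identifier="LO-exam-verbal">
          <imsld:item identifierref="RES-exam-verbal"/>
        </imsld:learning-object>
      </imsld:environment>
    </imsld:environments>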
4 PACKAGE CONFIGURATION
FOR ITEMS AND EXAMS
4.1 First Level of Integration: Simple
Items
This is a package (figure 2) that contains a simple
item or question and the multimedia files, or
references to them. The purpose is that this package
can be exported to third-party authoring tools or
referenced by learning objects, such as those
constructed with the IMS LD specification for
formative exams. This object includes
the question description in IMS QTI format and the
reference to the multimedia file that will be used by
the LMS when the item is displayed to the student.
Figure 2: First level of integration for items.
4.2 Second Level: Many Items, One
Learning Style
This package (figure 3) contains many question items,
and hence a complete exam, for a single learning
style. It can be constructed by
selecting a group of items with the same educational
objective (to evaluate a unit of learning) and
categorized for the same learning style. At this level,
IMS LD metadata is defined, and sequencing
instructions for the final presentation are defined as
well, using rules in the imsld:method
section.
Figure 3: Second level of integration for exams.
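A minimal sketch of this method section could look as follows, assuming a single student role and one learning activity that walks through the selected items; the identifiers are hypothetical.

    <imsld:method>
      <imsld:play identifier="PLAY-exam">
        <imsld:act identifier="ACT-exam">
          <imsld:role-part identifier="RP-student">
            <imsld:role-ref ref="ROLE-student"/>
            <!-- Activity presenting the sequenced items of this
                 learning style to the student. -->
            <imsld:learning-activity-ref ref="LA-answer-items"/>
          </imsld:role-part>
        </imsld:act>
      </imsld:play>
    </imsld:method>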
4.3 Third Level: Many Items, Many
Learning Styles
This package (figure 4) describes a complete exam
with several components: (a) an environment for
each learning style, (b) a method section describing
the adaptation rules for the sequencing, and (c) the
resources section, containing the description of the
resources for each learning style and the files (or
references to them) that support the adaptation of the
final presentation to the student.
Figure 4: Third level of integration for adaptive exams.
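To fix ideas, the overall skeleton of such a third-level package could look as follows. This is a minimal sketch with hypothetical identifiers; XML comments stand in for the fragments sketched in the surrounding sections, and the namespaces and attributes should be verified against the IMS CP and IMS LD bindings.

    <manifest identifier="MANIFEST-EXAM-ADAPTIVE"
              xmlns="http://www.imsglobal.org/xsd/imscp_v1p1"
              xmlns:imsld="http://www.imsglobal.org/xsd/imsld_v1p0">
      <organizations>
        <imsld:learning-design identifier="LD-adaptive-exam" level="B">
          <imsld:title>Adaptive exam</imsld:title>
          <imsld:components>
            <imsld:roles>
              <imsld:learner identifier="ROLE-student"/>
            </imsld:roles>
            <!-- Properties holding the learning style and ability level. -->
            <!-- Activities presenting the question items. -->
            <!-- One environment per learning style (see section 3.2). -->
          </imsld:components>
          <imsld:method>
            <!-- Plays and acts, plus the conditions implementing the
                 adaptation rules (see section 5.3). -->
          </imsld:method>
        </imsld:learning-design>
      </organizations>
      <resources>
        <!-- QTI item files and multimedia files per learning style. -->
      </resources>
    </manifest>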
5 ADAPTATION PROCESSES
5.1 Adaptation in the Final
Presentation to the User
The learning object can adapt its final
presentation taking into account the needs or
preferences of the users. For this, the learning object
containing an adaptable exam includes content for
each learning style through the use of a specific
context or environment. The LMS accesses the
appropriate environment, the one that fits the user's
learning style or preference, and shows the
corresponding multimedia material. For example, the
rules shown in figure 5 consider the ability level and the
learning style of the student in order to show the next
question with the right multimedia material.
5.2 Adaptation in the Complexity Level
Another adaptation process concerns the level of
complexity of the questions that are presented to the
student (figure 5). The questions are selected by
their level of complexity, taking into account the
student's response to the last question answered.
If he/she answers correctly, the next question is
of the same or higher complexity; if not, the
next question is of lower complexity. This is the
traditional adaptation process used by some
developments in this area (Gouli et al., 2001;
Guzmán et al., 2005).
Figure 5: Adaptation processes.
5.3 Adaptation Rules
One of the main characteristics of the IMS LD
specification is its potential to define adaptive
behaviour covering the student's preferences,
prior knowledge and/or learning needs. To do this, it
is necessary to use the specification at Level B to
define individual interactions, because at this level
we can use elements such as <properties> and
<conditions>.
The learning style values can be set from
values stored in the user model in the LMS and
kept in property elements (<locpers-property>,
<globpers-property>) to perform the adaptation.
Finally, the <on-completion> element can be used
to set the actions that will be executed once a certain
activity is completed. To adapt to the
learning style, the LMS accesses the environment
according to the needs or selection of the user. From
this, the question set is presented to the student,
applying the adaptation algorithm proposed by
Stern and Woolf (1998).
Adaptation rule RUL1 performs the adaptation
according to the ability level, and RUL2 adapts
the presentation to the user's learning style.
RUL1 = IF <student>::(response, true)
       THEN newLevel = oldLevel + oldLevel(oldLevel/10)
       ELSE newLevel = oldLevel + oldLevel(5 - oldLevel/10)

RUL2 = IF <student>::(LS_visual)
       THEN show item(newLevel, visual)
       ELSE show item(newLevel, verbal)
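As a sketch of how RUL2 could be expressed with these Level B elements, assume a learner property PROP-learning-style and the per-style environments ENV-visual and ENV-verbal introduced earlier. All identifiers are hypothetical; the property would be declared under imsld:components, the conditions inside the imsld:method section, and the exact content models should be verified against the IMS LD Level B binding.

    <!-- Property storing the learning style taken from the user model. -->
    <imsld:locpers-property identifier="PROP-learning-style">
      <imsld:datatype datatype="text"/>
      <imsld:initial-value>visual</imsld:initial-value>
    </imsld:locpers-property>

    <imsld:conditions>
      <imsld:if>
        <imsld:is>
          <imsld:property-ref ref="PROP-learning-style"/>
          <imsld:property-value>visual</imsld:property-value>
        </imsld:is>
      </imsld:if>
      <imsld:then>
        <!-- Visual learners see the environment with visual material. -->
        <imsld:show>
          <imsld:environment-ref ref="ENV-visual"/>
        </imsld:show>
      </imsld:then>
      <imsld:else>
        <imsld:show>
          <imsld:environment-ref ref="ENV-verbal"/>
        </imsld:show>
      </imsld:else>
    </imsld:conditions>

RUL1 could be handled analogously by storing the ability level in a numeric property and updating it with <change-property-value> after each response.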
6 CONCLUSIONS
Online assessment is an important step in the e-
learning process because it gives convenient feedback
to all participants in the process, helping to improve
the learning and teaching experience.
Given that assessment is an important
element of the e-learning process and that this
process aims to be interoperable, the assessment
tool could be used by different educational content
administrators with different conceptualizations and
ways of designing and applying a test for their
students. To face this situation, it is necessary to
develop an assessment tool that offers several ways
to design a test, with different types of resources,
kinds of assessments, groups of students, kinds of
questions, schedule management, etc.
Under this conceptualization, we propose to
construct assessment items incorporating adaptive
characteristics that take into account the learning
style of the user, in addition to the adaptation of the
level of complexity of the questions already in use
by some assessment tools. We propose to
incorporate adaptation rules into an exam using the
IMS LD specification, constructing a learning
object that can be used by the LMS.
ACKNOWLEDGEMENTS
We want to thank the KEOPS group (ref.
TSI2005-00960) of the University of Salamanca for
their contributions and ideas for the development of
this work.
Héctor Barbosa thanks the National System of
Technological Institutes (SNIT, Mexico) for its
financial support.
REFERENCES
Backroad Connections Pty Ltd. 2003. Assessment and
Online Teaching, Australian Flexible Learning
Framework Quick Guides series, Australian National
Training Authority, Version 1.00.
Barbosa, H., García, F., 2005. Importance of the online
assessment in the e-learning process. In 6th International
Conference on Information Technology-based Higher
Education and Training (ITHET), Santo Domingo,
Dominican Republic. IEEE CD version.
Barbosa, H., García, F., 2005b. A model for online
assessment in adaptive e-learning platforms. In
Proceedings of the 3rd International Conference on
Multimedia and ICTs in Education (m-ICTE), Cáceres,
Spain, Vol. 1, pp. 16-20.
Barbosa, H., García, F., 2006. An Authoring Tool to
Develop Adaptive Assessments. In Proceedings of the
International Conference on Web Information Systems
and Technologies (WEBIST 06), Setúbal, Portugal.
Barbosa, H., García, F., 2006b. Setting and Sharing
Adaptive Assessment Assets. In Proceedings of the 8th
International Symposium of Educative Informatics
(SIIE 06), León, Spain.
Gouli, E., Kornilakis, H., Papanicolau, K., Grigoriadou, M.,
2001. Adaptive Assessment Improving Interaction in
an Educational Hypermedia System. In Proceedings of
the Panhellenic Conference on Human-Computer
Interaction, Patras, Greece.
Guzmán, E., Machuca, E., Conejo, R., Libbrecht, P. 2005.
LeActiveMath, Integrated Adaptive Assessment Tool.
http://polux.lcc.uma.es/siette/doc.
IMS LD, Learning Design, 2006.
http://www.imsproject.org/learningdesign/index.html
IMS QTI, Question and Test Interoperability. 2006.
http://www.imsproject.org/question/index.html.
Stern, M., Woolf, B., 1998. Curriculum Sequencing in a
Web-based Tutor. In Proceedings of Intelligent Tutoring
Systems, LNCS, Vol. 1452, pp. 574-578.
XML, Extensible Mark-up Language. http://www.w3.org.