Design of a Simulation Framework for Model-based Learning
Sarah Zribi¹, Antonello Calabrò², Francesca Lonetti², Eda Marchetti², Tom Jorquera¹ and Jean-Pierre Lorré¹
¹Linagora, 75 route de Revel, 31400 Toulouse, France
²Istituto di Scienza e Tecnologie dell'Informazione "A. Faedo", CNR, Pisa, Italy
Keywords: Model-based Learning, Simulation, Business Process.
Abstract: In model-driven learning, the simulation of Business Processes is a key step for improving learners' skills and enhancing teaching performance. In this paper, we provide the architectural design and the main functionalities of a model-based learning simulation framework. The main objectives of the proposed framework are: i) to enable user-friendly learning simulation with strong support for collaboration and social interactions; ii) to provide a process-driven learning environment that allows the learner behavior to be emulated when no learners are available to take part in the simulation; iii) to include event-based monitoring aimed at providing feedback for learner assessment.
1 INTRODUCTION
In recent years, in many industrial contexts and application domains, Business Process (BP) models have gained importance, mainly because they allow formal specification, provide accepted and concise definitions and taxonomies, and support the development of an executable framework for the overall management of the process itself.
Specifically, in learning environments the availability of such models eases the adoption of Business Process simulation for teaching purposes. Usually, a simulation attempts to mimic real-life or hypothetical behavior to see how processes or systems can be improved and to predict their performance under different circumstances. In learning environments, BP simulation commonly enhances student learning and problem-solving, thereby improving students' knowledge. Thus, models and simulations become a foundation for improving learners' skills, enhancing teaching performance and providing a comprehensive framework. Different conceptual and mathematical models have been proposed for model-based learning, and several types of simulations, including discrete event and continuous process simulations, have been considered (Blumschein et al., 2009). However, the main challenges for existing learning simulation proposals concern collaborative simulation and the learning benefits derived from it.
In this paper, we present a new simulation framework providing a flexible simulation environment with strong support for collaboration and social interactions, as well as process visualization and monitoring. Model-based learning simulation can be compared to a collaborative game in which a team of players, composed of one coach and any number of learners, works together to achieve a common goal. The main objective is consequently to provide an easy-to-use, user-friendly environment that lets learners take part in the process when their turn comes, assuming different roles according to the content they have to learn. The principal contribution of this paper is the design of an infrastructure for the simulation of model-based learning. The simulation environment includes an event-based monitoring framework aimed at providing feedback for learner evaluation, and it supports multi-session as well as collaborative BP simulation. In the rest of the paper, we first briefly introduce some background concepts (Section 2); then, in Section 3, we present the main components of the simulation framework architecture, whereas in Section 4 we describe its main functionalities. Finally, Section 5 concludes the paper.
2 BACKGROUND
The proposal of a simulation framework for model-
based learning originated in the context of the Model-
Based Social Learning for Public Administrations
(Learn PAd) project (LearnPAd, 2016). The devel-
oped Learn PAd platform will support an informa-
tive learning approach based on enriched BP mod-
els, as well as a procedural learning approach based
on simulation and monitoring that will allow users to learn by doing. In learning contexts, BP simulation approaches are very popular, since learners prefer simulation exercises to either lectures or discussions (Anderson and Lawton, 2008). Simulations have been used to teach procedural skills, for training in software applications and industrial control operations, and for learning domain-specific concepts and knowledge, such as business management strategies (Clark and Mayer, 2011). Nowadays, more attention is given to business process oriented analysis and simulation (Jansen-Vullers and Netjes, 2006). Studies have shown that the global purpose of these existing business process simulation platforms is to evaluate BPs and redesign them, whereas in recent years simulation/gaming has been establishing itself as a discipline (Crookall, 2010). However, these platforms present several shortcomings regarding their applicability to a collaborative learning approach. Namely, no existing platform combines all of the main functionalities of a learning simulation solution, such as facilities for providing a controlled and flexible simulated environment (for example, allowing users to switch between the possible outcomes of a task in order to explore the different paths of a process) and good visualization and monitoring of the process execution flow (in order both to assist and to evaluate the learners) (Crookall, 2010). The main challenges of a learning simulation concern collaborative simulation and the learning benefits derived from it. To address these concerns, a new learning simulation framework is designed in this paper, providing a flexible simulation environment with strong support for collaboration and social interactions, as well as process visualization and monitoring. The main objective is consequently to provide an easy-to-use, user-friendly environment that lets learners take part in the learning process while assuming different roles.
3 SIMULATION FRAMEWORK
ARCHITECTURE
In this section, we describe the high-level architecture of the proposed simulation framework, its main components, their purpose, the interfaces they expose, and how they interact with each other. In particular, as depicted in Figure 1, each component is exposed as
a service and provides an API as a unique point of
access. Inside the Learn PAd infrastructure, the pro-
posed simulation framework interacts with the Learn
PAd components by means of the Learn PAd Core
Platform and specifically through the Bridge and the
Core Facade interfaces. Moreover, in the Learn PAd
vision, two levels of learners have been considered: the civil servant, who is the standard learner, and the civil servant coordinator, a generalization of the civil servant who is in charge of activating and managing a simulation session.
The simulation framework components are:
SimulationGUI: it is in charge of the interactions between learners and the simulator's components.
PersistenceLayer: it stores the status of the simulation at each step in order to give the civil servant the ability to stop the simulation and restart it when needed.
RobotFramework: it allows the behavior of civil servants to be simulated by means of robots.
SimulationEngine: this is the core component of the simulation framework. It enacts business processes and links activities with the corresponding civil servants or robots.
Monitoring: it collects the events that occur during the simulation and infers rules related to the business process execution.
Communication middleware: it provides event-based communication facilities between the simulation components according to the publish/subscribe paradigm (a minimal sketch of this pattern is given below).
UserFacade: it is in charge of encapsulating real or simulated civil servants (i.e., robots) in order to make the learner interaction transparent to the other components of the architecture.
In the following, more details are provided.
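To make the publish/subscribe communication between components more concrete, the following is a minimal Python sketch of such a middleware. It is an illustrative approximation under our own assumptions, not the Learn PAd implementation: the SimulationEvent structure and the topic name used in the usage example are hypothetical.

```python
from collections import defaultdict
from dataclasses import dataclass, field
from typing import Callable, Dict, List
import time


@dataclass
class SimulationEvent:
    """Hypothetical event record exchanged between simulation components."""
    topic: str                                    # e.g. "bp.task.completed" (illustrative name)
    payload: dict = field(default_factory=dict)
    timestamp: float = field(default_factory=time.time)


class CommunicationMiddleware:
    """Minimal publish/subscribe bus decoupling event producers from consumers."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[SimulationEvent], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[SimulationEvent], None]) -> None:
        # Components (e.g. the Monitoring facility) register interest in a topic.
        self._subscribers[topic].append(handler)

    def publish(self, event: SimulationEvent) -> None:
        # The Simulation Engine publishes events; every subscriber of the topic is notified.
        for handler in self._subscribers[event.topic]:
            handler(event)


# Usage example: a monitor subscribes to task-completion events.
if __name__ == "__main__":
    bus = CommunicationMiddleware()
    bus.subscribe("bp.task.completed", lambda e: print("monitored:", e.payload))
    bus.publish(SimulationEvent("bp.task.completed", {"task": "FillForm", "user": "cs1"}))
```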
3.1 GUI
As depicted in Figure 2, the main features of the simulation GUI are: chat and notification facilities; mutual interaction through an input/output panel; the visualization of the simulation members; the possibility of reading and searching for documents and additional information useful during the simulation activities (Context Area); and standard facilities such as Play, Save, Pause and Stop (Simulation lifecycle menu).
During a simulation, the civil servant coordinator selects the difficulty level of the simulation, which can be elementary, intermediate or advanced. For each task of the BP, until the end of the process, the simulation framework notifies the civil servant when his/her turn to learn comes, so that he/she can fill in the required forms and submit them. Moreover, when the civil servant needs help, the simulator offers the possibility to chat with online experts or to consult recommended resources and FAQs that fit the current context. Finally, when all the tasks of the BP have been simulated, the simulator provides an evaluation of the simulation.
Figure 1: Simulation Framework Architecture.
Figure 2: Main User Interface.
3.2 Persistency Layer
The Persistency Layer component is in charge of storing simulation logs and business process states. Its main
sub-components are:
Logger: It is in charge of storing time-stamped
event data coming from the simulation engine. These
event data are produced each time a treatment is trig-
gered by the engine: BP activity invocation, user in-
put/output, etc.
BPStateStorage: This component allows storing, retrieving, deleting and updating the state of a given simulation associated with a BP. This is a key feature that allows the user to freeze a simulation, log out and come back to it later on. The component manages all the necessary data and stores them in a database.
TestDataRepository: This component collects the historical data related to simulation executions. Historical data specify the input and output information/values corresponding to paths (or sub-paths) and activities of the BP and are inserted and validated by experts. These data can be used for verifying the correctness of the information provided by a civil servant during a simulation session, as well as for inferring the robot behavior.

Figure 3: Simulation Environment Functionalities (Civil Servant Coordinator).

Figure 4: Simulation Environment Functionalities (Civil Servant).
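As an illustration of the persistency operations described above, the sketch below outlines a possible in-memory realization of the BPStateStorage and TestDataRepository interfaces in Python. The method names and data shapes are assumptions made for the example; the real components persist their data into a database.

```python
from typing import Dict, Optional


class BPStateStorage:
    """Stores, retrieves, deletes and updates the state of a simulation associated with a BP.
    An in-memory dict stands in for the database used by the real component."""

    def __init__(self) -> None:
        self._states: Dict[str, dict] = {}

    def store(self, simulation_id: str, state: dict) -> None:
        self._states[simulation_id] = state            # freeze the whole simulation state

    def retrieve(self, simulation_id: str) -> Optional[dict]:
        return self._states.get(simulation_id)         # resume later from this state

    def update(self, simulation_id: str, changes: dict) -> None:
        self._states.setdefault(simulation_id, {}).update(changes)

    def delete(self, simulation_id: str) -> None:
        self._states.pop(simulation_id, None)


class TestDataRepository:
    """Holds expert-validated input/output data for BP activities (historical data)."""

    def __init__(self) -> None:
        # (bp_id, activity_id) -> expert-certified expected output
        self._expected: Dict[tuple, dict] = {}

    def add_certified_data(self, bp_id: str, activity_id: str, output: dict) -> None:
        self._expected[(bp_id, activity_id)] = output

    def expected_output(self, bp_id: str, activity_id: str) -> Optional[dict]:
        return self._expected.get((bp_id, activity_id))
```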
3.3 Robots Framework
During a simulation, some of the required civil servants may not be present; therefore, robots mimic the behavior of the missing learners. They aim to provide the outputs that a human would have produced with the same input in the same context. The robots are implemented on the basis of the available historical data, i.e., the data saved in the TestDataRepository during a previous simulation session and provided by an expert who takes the role of the civil servant. If these data are unavailable or incomplete, either an expert may be asked to provide the historical data manually, or a knowledge-based approach is applied that deduces the output from answers previously collected from learners.
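A possible realization of this robot behavior is sketched below: the robot first looks up expert-certified historical data in the TestDataRepository (reusing the sketch from Section 3.2) and, if none is available, falls back to a simple knowledge-based guess. The fallback heuristic (the most frequent answer previously given by learners) is an assumption made for illustration only.

```python
from collections import Counter
from typing import List, Optional


class Robot:
    """Emulates a missing civil servant for a given BP activity."""

    def __init__(self, repository: "TestDataRepository", past_learner_answers: List[dict]):
        self._repo = repository
        self._past_answers = past_learner_answers  # answers previously collected from learners

    def answer(self, bp_id: str, activity_id: str) -> Optional[dict]:
        # 1) Prefer expert-certified historical data from the TestDataRepository.
        certified = self._repo.expected_output(bp_id, activity_id)
        if certified is not None:
            return certified
        # 2) Otherwise apply a knowledge-based fallback: here, the most frequent
        #    answer previously given by learners for this activity (illustrative heuristic).
        candidates = [a["output"] for a in self._past_answers
                      if a.get("bp_id") == bp_id and a.get("activity_id") == activity_id]
        if not candidates:
            return None  # no data at all: an expert must provide the output manually
        most_common, _ = Counter(map(str, candidates)).most_common(1)[0]
        return next(c for c in candidates if str(c) == most_common)
```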
3.4 Simulation Engine
The Simulation Engine is responsible for the simulation of a given business process instance. It takes the form of an orchestration engine that invokes the treatments associated with each activity of the current process. Such a workflow may involve multiple civil servants taking different roles, who may or may not be present. For those who are not available, robots are used to mimic their behavior. A simulation manager is provided in order to manage the BP lifecycle according to the current context (create, stop, resume, kill, etc.). Business processes are made of two kinds of activities: i) Human activities involve civil servants who should provide information in order to complete the task; the concept of human activity is used to specify work that has to be accomplished by people. ii) Mocked activities involve robots that compute the treatment associated with the activity. When the simulation engine invokes a human activity, the corresponding civil servant is asked to provide input through a form. These forms are managed by a form engine that delegates the task to a robot if necessary. All the state information necessary to restart a specific simulation is stored "on the fly". The civil servant may decide to freeze a running simulation, store it, backtrack to a previously stored state and log out. He/she will be able to resume the simulation later.
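The distinction between human and mocked activities can be illustrated with the following sketch of a single engine step: a human activity is routed to the form engine so that the civil servant fills in a form, while a mocked activity is delegated to a robot, and the resulting state is stored on the fly. The class and method names (for example, request_input) are illustrative assumptions, not the actual engine API.

```python
from dataclasses import dataclass


@dataclass
class Activity:
    activity_id: str
    kind: str            # "human" or "mocked"
    assignee: str        # civil servant id, or robot id for mocked activities


class SimulationEngineSketch:
    """Orchestrates one BP instance, invoking the treatment of each activity."""

    def __init__(self, form_engine, robot, state_storage, simulation_id: str):
        self._forms = form_engine          # hypothetical form engine with request_input()
        self._robot = robot                # robot sketched in Section 3.3
        self._storage = state_storage      # BPStateStorage sketched in Section 3.2
        self._sim_id = simulation_id

    def execute(self, bp_id: str, activity: Activity) -> dict:
        if activity.kind == "human":
            # Ask the civil servant to provide input through a form.
            output = self._forms.request_input(activity.assignee, activity.activity_id)
        else:
            # Mocked activity: the robot computes the treatment.
            output = self._robot.answer(bp_id, activity.activity_id)
        # Store the state "on the fly" so the simulation can be frozen and resumed.
        self._storage.update(self._sim_id, {activity.activity_id: output})
        return output
```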
The Business Process orchestrator is responsible for the step-by-step execution of a given BP instance. Such a BP instance is made of a BPMN description enriched with the necessary run-time information, such as the endpoints of software application mocks, the user id, etc. The BP engine is connected with the Forms Engine in order to handle user and robot input/output. As reported in Figure 5, during the simulation process the input generated by the civil servant (messages 1-4) can be validated online by means of the data stored in the TestDataRepository component.
In particular, the validation process is divided into three main sub-cases:
generic case: it occurs when the simulation engine has to validate the learner input. In this case, the simulation engine checks through the TestDataRepository whether the input can be considered correct or not and provides feedback to the learner (messages 5, 6 of Figure 5).
extra-evaluation required: this sub-case is activated when the learner (civil servant) input and the validation obtained by the simulation framework do not match. If the learner is sure of the correctness of his/her reply, he/she has the possibility to open a dispute requesting an extra evaluation from one or more experts (message 7 of Figure 5).
validation missed: in this last case, the simulation framework has insufficient information to evaluate the learner's input. This could happen because the TestDataRepository does not contain valid, expert-certified data (messages 8, 9 of Figure 5).
In both the second and the third case, the simulation engine raises an alert to an expert and lets the learner continue the simulation (if no expert is available online); the result of that validation is provided to the learner as soon as possible.
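The three validation sub-cases described above can be summarized by the following sketch, which compares the learner input against the TestDataRepository and either returns direct feedback, opens a dispute, or flags the missing validation for later expert review. The return values and the alert_expert hook are assumptions made for illustration.

```python
from enum import Enum, auto
from typing import Optional


class ValidationResult(Enum):
    CORRECT = auto()            # generic case: input matches certified data
    INCORRECT = auto()          # generic case: input does not match
    DISPUTE_OPENED = auto()     # extra-evaluation required
    PENDING_EXPERT = auto()     # validation missed: no certified data available


def validate_input(repo, bp_id: str, activity_id: str, learner_input: dict,
                   learner_disputes: bool, alert_expert) -> ValidationResult:
    expected: Optional[dict] = repo.expected_output(bp_id, activity_id)

    if expected is None:
        # Validation missed: no valid, expert-certified data in the TestDataRepository.
        alert_expert(bp_id, activity_id, learner_input)
        return ValidationResult.PENDING_EXPERT

    if learner_input == expected:
        # Generic case: feedback is returned directly to the learner.
        return ValidationResult.CORRECT

    if learner_disputes:
        # Extra-evaluation required: the learner is sure of the answer and opens a dispute.
        alert_expert(bp_id, activity_id, learner_input)
        return ValidationResult.DISPUTE_OPENED

    return ValidationResult.INCORRECT
```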
3.5 Monitoring
The simulation framework is equipped with a monitoring facility that provides feedback on the business process execution and the learning activities. This monitoring collects data of interest during the run-time business process execution and is based on an event-based monitoring infrastructure (Bertolino et al., 2011), which has the peculiarity of decoupling the event specification from event collection and processing. This monitoring facility can be independent of any specific business process modeling notation and execution engine. The data collected during the monitoring of the business process execution are used for deriving key performance indicators (KPIs) that allow continuous tracking of the process behavior and measurement of learning-specific goals. We refer to (Calabrò et al., 2015b) for a detailed description of the monitoring components.
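As an illustration of how indicators could be derived from monitored events, the sketch below subscribes to simulation events on the communication middleware sketched in Section 3 and computes two simple KPIs (number of completed tasks and average task duration). Both the indicators and the event payload fields are examples of our own, not the KPIs actually defined in the project.

```python
class KPIMonitor:
    """Consumes simulation events and derives simple key performance indicators."""

    def __init__(self, bus: "CommunicationMiddleware") -> None:
        self._durations = []          # seconds spent on each completed task
        self._completed = 0
        # Decoupling: the monitor only subscribes to topics; it never calls the engine directly.
        bus.subscribe("bp.task.completed", self.on_task_completed)

    def on_task_completed(self, event: "SimulationEvent") -> None:
        self._completed += 1
        self._durations.append(event.payload.get("duration_s", 0.0))

    def kpis(self) -> dict:
        avg = sum(self._durations) / len(self._durations) if self._durations else 0.0
        return {"completed_tasks": self._completed, "avg_task_duration_s": avg}
```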
4 FUNCTIONAL SPECIFICATION
OF THE LEARNING
SIMULATION FRAMEWORK
The simulation framework provides the subsystem where learners can simulate BPs interactively; it is used by one or multiple civil servants in order to learn processes. Figures 3 and 4 show the main functionalities that the simulation framework provides to the stakeholders. As mentioned in Section 3, the simulation framework distinguishes between the two following actors: the civil servant coordinator, who is in charge of starting a simulation session, and the civil servant, who represents a generic participant in a simulation session.

Figure 5: Simulation activity diagram.

In particular, as shown in Figure 3, the civil
servant coordinator can request to start a new simu-
lation execution of a Public Administration business
process, or he/she can manage an ongoing one, for instance by inviting or removing other civil servants. The civil servant coordinator can also restart/stop a current simulation session and designate a new coordinator. In turn, each civil servant has different possibilities (as listed in Figure 4), such as joining, disconnecting from or pausing a simulation session, chatting, asking for evaluation/help, or managing his/her own profile.
The simulation framework functionalities have
been split into three different phases: i) Initialization, in which the simulation environment is set up; ii) Activation, in which the participants in the simulation are invited; iii) Execution, in which the participants effectively collaborate with each other during a learning session. During the Activation phase, the civil servant
can select the type of simulation he/she wants to exe-
cute. Specifically, three different types of simulation
are provided:
Individual Simulation. The civil servant decides to
execute the simulation without interacting with other
human participants. In this case, the other participants are emulated by means of robots (see Section 3.3 for more details). The creation of the robot instances is performed before the simulation execution.
Collaborative Simulation. This type of simulation involves the collaboration of several human participants (no robot instances are involved). During the collaborative simulation, users can interact with each other using chat tools. This improves the performance of the overall learning session thanks to the possibility of rapidly sharing experience among human participants. This kind of simulation can be considered the most interesting from the learning point of view, because cooperation can make learning procedures more intensive and productive. Diverse viewpoints will emerge, and the opportunity to reflect upon the issues encountered will help learners improve their knowledge and better understand the problem.

Figure 6: Simulation execution.

To activate a simulation, the system requires that all the civil servants involved have joined the session, in order to provide an online collaborative environment. Moreover, the simulation framework also supports some asynchronous task execution among the simulation participants. If a civil servant does not satisfy the simulation requirements or time constraints, the civil servant coordinator may decide either to kick the civil servant, to swap him/her with another one among those available, or to replace him/her with a robot.
Mixed Simulation. This type of simulation requires the participation of both humans and robots. This usually happens when there are not enough civil servants to cover all the roles necessary to execute a BP, or when one or more civil servants leave the ongoing simulation (disconnection or kick). A mixed simulation can be activated only if the following two constraints are met: i) the required robot instances are ready; ii) all the invited civil servants have completed the connection procedures.
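The two activation constraints of a mixed simulation can be expressed as a small check, sketched below under our own assumptions about how participants and robot readiness are recorded.

```python
from dataclasses import dataclass
from typing import Dict


@dataclass
class Participant:
    role: str
    connected: bool = False   # True once the civil servant has completed the connection procedure


def can_activate_mixed_simulation(participants: Dict[str, Participant],
                                  required_roles: set,
                                  ready_robots: Dict[str, bool]) -> bool:
    # i) every role not covered by a connected civil servant must have a ready robot instance
    covered = {p.role for p in participants.values() if p.connected}
    missing_roles = required_roles - covered
    robots_ready = all(ready_robots.get(role, False) for role in missing_roles)
    # ii) all the invited civil servants must have completed the connection procedures
    all_connected = all(p.connected for p in participants.values())
    return robots_ready and all_connected
```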
The sequence diagram presented in Figure 6 is one
of the realizations of the use case “Manage own sim-
ulation session” of Figure 3 and shows the main ac-
tivities of the simulation execution. This sequence
diagram assumes the simulation activation sequence diagram as a precondition; the latter is not presented here due to space limitations. Once the simulation activation phase is completed, the Simulation Engine generates the events related to the execution of the tasks of the BP (message 1) and sends them to the Monitor for their evaluation (message 2). The simulation execution continues managing the BP tasks and sending notifications to the civil servants by means of the SimulationGuiAPI until a special kind of event is received. Four types of special events are managed: END is raised when the BP execution reaches the natural end of the model; STOP is raised when the civil servant coordinator decides to terminate a simulation execution early; PAUSE is raised when a civil servant decides to pause the simulation execution; PLAY is raised when a civil servant decides to resume a paused simulation. During the execution, the simulation engine continuously generates events and forwards them to the CepAPI. According to the event received, the civil servants participating in the simulation receive, through the SimulationGUI, the documents and suggestions related to each task. When the BP execution has reached its end, an end event is sent to the CepAPI in order to stop the monitoring session and release resources.
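The handling of the four special events can be sketched as a simple dispatch, shown below. The event names follow the paper, while the handler structure and the notify_end/notify_participants stubs standing in for the CepAPI and SimulationGuiAPI calls are illustrative assumptions.

```python
from enum import Enum


class SpecialEvent(Enum):
    END = "END"      # the BP execution reaches the natural end of the model
    STOP = "STOP"    # the coordinator terminates the simulation early
    PAUSE = "PAUSE"  # a civil servant pauses the simulation
    PLAY = "PLAY"    # a civil servant resumes a paused simulation


class SimulationExecution:
    def __init__(self, cep_api, gui_api) -> None:
        self._cep = cep_api      # monitoring endpoint (stands in for the CepAPI)
        self._gui = gui_api      # notification endpoint (stands in for the SimulationGuiAPI)
        self.paused = False
        self.running = True

    def handle(self, event: SpecialEvent) -> None:
        if event is SpecialEvent.PAUSE:
            self.paused = True
        elif event is SpecialEvent.PLAY:
            self.paused = False
        elif event in (SpecialEvent.END, SpecialEvent.STOP):
            # Notify the monitor so it can close the monitoring session and release resources.
            self._cep.notify_end()
            self.running = False
        if self.running:
            # Keep the participating civil servants informed through the GUI.
            self._gui.notify_participants(event.value)
```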
5 CONCLUSIONS
In this paper, the architectural design of the simula-
tion and monitoring framework is presented with a
particular focus on the definition of the functionalities
and interactions among its components. The frame-
work is currently under evaluation within the Learn PAd project. However, the preliminary results provided in (Calabrò et al., 2015b; Calabrò et al., 2015a; Calabrò et al., 2016) showed positive feedback, especially concerning the possibility of executing collaborative simulations and providing learner assessment. Moreover, the design of some parts of the architecture, such as the Test Data Repository, will be refined and improved through the comments and hints that will be collected over the project duration.
ACKNOWLEDGEMENTS
This work has been partially funded by the Model-
Based Social Learning for Public Administrations
project (EU FP7-ICT-2013-11/619583).
REFERENCES
Anderson, P. H. and Lawton, L. (2008). Business simula-
tions and cognitive learning: Developments, desires,
and future directions. Simulation & Gaming.
Bertolino, A., Calabrò, A., Lonetti, F., and Sabetta, A. (2011). Glimpse: A generic and flexible monitoring infrastructure. In Proc. of the 13th EWDC, pages 73–78.
Blumschein, P., Hung, W., and Jonassen, D. H. (2009).
Model-based approaches to learning: Using systems
models and simulations to improve understanding and
problem solving in complex domains. Sense Publish-
ers.
Calabrò, A., Lonetti, F., and Marchetti, E. (2015a). KPI evaluation of the business process execution through event monitoring activity. In Proc. of Third International Conference on Enterprise Systems.
Calabrò, A., Lonetti, F., and Marchetti, E. (2015b). Monitoring of business process execution based on performance indicators. In Proc. of Euromicro-SEAA.
Calabrò, A., Lonetti, F., Marchetti, E., Zribi, S., and Jorquera, T. (2016). Model-based learning assessment management. In Proc. of MODELSWARD.
Clark, R. C. and Mayer, R. E. (2011). E-learning and
the science of instruction: Proven guidelines for con-
sumers and designers of multimedia learning. John
Wiley & Sons.
Crookall, D. (2010). Serious games, debriefing, and simu-
lation/gaming as a discipline. Simulation & gaming,
41(6):898–920.
Jansen-Vullers, M. and Netjes, M. (2006). Business process
simulation–a tool survey. In Workshop and Tutorial
on Practical Use of Coloured Petri Nets and the CPN
Tools, Aarhus, Denmark, volume 38.
LearnPAd (2014-2016). Model-Based Social Learning for
Public Administrations European Project (EU FP7-
ICT-2013-11/619583). http://www.learnpad.eu/.