time the task and corrected the errors at the same time, using a locked system and the DataRecord concept (MI Kabbaj, A Bétari, Z Bakkoury, A Rharbi, 2015). However, that work did not handle loops: the ad hoc system detected message-sending problems only in linear models or in models with Xor branches. Here, the same approach is applied after adding loop modeling to the linear model with an Xor split, in order to detect data-flow modeling anomalies. Indeed, the Xor split is used to feed back existing message errors during modeling; these error messages are returned to the source activity where they were created, so that correction can proceed. The approach therefore employs active help (MI Kabbaj, A Bétari, Z Bakkoury, A Rharbi, 2015), together with verification rules in the model that are triggered when an issue occurs at modeling time. However, the loop could not assimilate this approach to detect anomalies, not because the active help is insufficient, but because the rules of this approach could only create and update. Consequently, we propose to enrich this approach with enhancements to its rules and its model so that it can be adapted to loops. A decision node is proposed as a connector that has a data connection on its input. It requires a Boolean predicate (Yes = true, No = false), as in a deterministic finite-state automaton, so tasks are guarded (i.e., blocked) solely on the DataState (N Trcka, W van der Aalst, N Sidorova, 2008). In this context, the DataState is used to verify the last recorded state of the data for each input and output of an activity.
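The DataState check described above can be sketched as follows. This is a minimal illustration under our own naming (DataState, check_activity, and the read/write/destroy states follow the DataRecord idea from the cited work, but the identifiers and logic here are assumptions, not the authors' implementation): the last recorded state of each data item is kept, and an activity's inputs are verified against it before its outputs are recorded.

```python
# Minimal sketch (identifiers are illustrative, not from the cited works):
# a DataState keeps the last recorded state of each data item, so an
# activity's inputs and outputs can be checked at modeling time.

READ, WRITE, DESTROY = "read", "write", "destroy"

class DataState:
    def __init__(self):
        self._last = {}  # data item name -> last recorded state

    def record(self, item, state):
        assert state in (READ, WRITE, DESTROY)
        self._last[item] = state

    def last(self, item):
        return self._last.get(item)  # None if the item was never recorded

def check_activity(state, inputs, outputs):
    """Flag inputs that were never written (missing-data anomaly),
    then record the activity's outputs as written."""
    anomalies = []
    for item in inputs:
        if state.last(item) not in (READ, WRITE):
            anomalies.append(f"missing data: '{item}' was never written")
    for item in outputs:
        state.record(item, WRITE)  # an activity writes its outputs on termination
    return anomalies

state = DataState()
state.record("order", WRITE)
print(check_activity(state, inputs=["order", "invoice"], outputs=["receipt"]))
# flags 'invoice' as missing
```

In this sketch a missing-data anomaly surfaces immediately at the activity that reads the item, which is what allows the error message to be fed back to its source activity.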
In this manner, the data connection acts as a decision variable: a routing decision can be made based on a set of data items input to the decision node, and each data item involved in a routing decision is called a decision variable (SX Sun, JL Zhao, JF Nunamaker, 2006). This decision variable is also allowed to change the state of the DataState, which can be initialized at each iteration of the connection. There is no problem in the first iteration; however, when the number of iterations grows, the DataState must be reinitialized.
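The loop routing described above can be illustrated with a short sketch. All identifiers here are assumptions for illustration (decision_node, run_loop, errors_remaining are not from the cited works): the decision node evaluates a Boolean predicate over its decision variables and routes either back into the loop or out through the Xor split, and the per-iteration state is reinitialized on every pass so one iteration's records do not leak into the next.

```python
# Illustrative sketch (identifiers are assumptions, not from the cited works):
# a decision node evaluates a Boolean predicate over its input data items
# (the decision variables) and routes back into the loop or out of it.

def decision_node(decision_variables):
    """Boolean predicate: True -> take the loop branch, False -> exit branch."""
    return decision_variables.get("errors_remaining", 0) > 0

def run_loop(errors, max_iterations=10):
    history = []
    for _ in range(max_iterations):
        iteration_state = {}                       # DataState reinitialized per iteration
        iteration_state["errors_remaining"] = errors
        history.append(errors)
        if not decision_node(iteration_state):
            break                                  # Xor split: exit branch
        errors -= 1                                # one error corrected per pass
    return history

print(run_loop(3))   # [3, 2, 1, 0]
```

Reinitializing `iteration_state` at the top of each pass mirrors the reinitialization of the DataState at each iteration of the connection.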
The remainder of the paper is organized as follows. Section 2 presents the approaches and concepts used in this paper. Section 3 shows that loop modeling cannot assimilate the approach with active help. Section 4 presents the new visualization of the approach. Section 5 concludes the paper and discusses perspectives.
2 RELATED WORK
Modeling of business processes has become very important in recent years, with data-flow modeling and verification being two important challenges in workflow system management. Many works have addressed the problem of data-flow and control-flow anomalies in workflows. Recently, data-flow formalization in process modeling has been investigated by many researchers. In most organisations, it is particularly important that those responsible for key processes feel their interests are represented during the latter phase. To achieve this, the main stakeholders, such as the heads of key functions intersected by the process, the managers with operational responsibility for the process, suppliers of important change resources (e.g., the IT, human resource, and financial functions), and process customers and suppliers, both internal and external, should participate in the team during the design
phase (TH Davenport, 1993). In graph-based approaches to business process modeling, data dependencies are represented by data flow between activities. Each process activity is given a set of input parameters and a set of output parameters. Upon its start, an activity reads its input parameters, and upon its termination, it writes the data it generated to its output parameters. These parameters can be used by follow-up activities in the business process (M Weske, 2012, p. 100). The importance of data-flow verification in workflow processes was first mentioned in (S Sadiq, M Orlowska, W Sadiq, C Foulger, 2004). The information perspective defines what data are consumed and produced with respect to each activity in a business process, while the operational perspective defines what tools and applications are used to execute a particular task (SX Sun, JL Zhao, JF Nunamaker, 2006). Many approaches have been proposed for data-flow verification; they enable systematic and automatic elimination of data-flow errors, as in (SX Sun, JL Zhao, JF Nunamaker, 2006). An ad hoc approach that treats data-flow anomalies for each activity through active help, using a DataRecord concept that stores data together with their states (read, write, destroy), is presented in (MI Kabbaj, A Bétari, Z Bakkoury, A Rharbi, 2015). Indeed, the data-flow perspective approach formally establishes the correctness criteria for data-flow modeling. A Petri-net-based approach has been proposed for modeling the control flow of a workflow; this model was extended to include the inputs and outputs of the data flow, together with an algorithm for detecting data-flow anomalies, as in (LIU C, Q Z, D H, 2014). Our approach extends and generalizes data-flow verification methods that have been recently