requirements desired by different stakeholders.
Since these patterns are in the early stages of
development and validation, we prefer to call
them proto-patterns. Due to space limitations, we
discuss these patterns informally:
Direct & Local (DL): this pattern accesses the
required functionalities across enterprise systems by
direct invocation using their native APIs, while each
BP locally implements the activities that have no
corresponding functionality. This pattern is
applicable when the majority of BP activities have
corresponding functionalities in existing enterprise
systems. It is also applicable when performance and
reliability are the major quality concerns and when
the redevelopment cost of existing systems'
functionalities is quite high.
Direct & Shared (DS): this pattern also accesses the
required functionalities across enterprise systems by
direct invocation through their native APIs, but
implements all BP activities that have no
corresponding implementation in any of the existing
systems as shared service-based interfaces. Its
applicability is similar to that of DL, with the
exception that the majority of BP activities are not
implemented in existing systems.
Wrapper & Shared (WS): this pattern provides a
unified service-based interface to all functionalities
embedded in the enterprise systems, and implements
all BP activities that have no corresponding
implementation as shared service-based interfaces.
This pattern is applicable when the majority of BP
activities are not implemented in existing systems. It
is also applicable when ease of installation and
maintenance cost are the primary quality concerns
and when the redevelopment cost of existing
systems' functionalities is quite high.
Wrapper & Local (WL): this pattern provides a
unified service-based interface to all functionalities
across the enterprise systems, while each BP locally
implements the activities that have no corresponding
implementation. Its applicability is similar to that of
WS, with the exception that the majority of BP
activities do have corresponding implementations.
Migrate (MG): this pattern replaces existing systems
with new ones. This involves migrating the
implementation of the required functionalities into
shared service-based interfaces through a
re-engineering process. It is applicable when
existing systems are likely to become obsolete in the
near future, and also when maintenance cost is
expected to be high due to the significant changes
required. However, the development cost of this
pattern will be far higher than that of the other
patterns.
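To make the Wrapper patterns (WS/WL) concrete, the sketch below shows a unified service-based interface hiding two systems' native APIs behind one common entry point. This is an illustrative assumption, not code from the paper: the system classes, method names, and operations are all hypothetical.

```python
# Hypothetical sketch of the Wrapper patterns (WS/WL): a unified
# service-based interface routes generic calls to each enterprise
# system's native API. All names here are illustrative.

class LegacyBillingSystem:
    # existing enterprise system with its own native API
    def bill_native(self, customer_id):
        return f"billed:{customer_id}"

class LegacyCRMSystem:
    # another existing system, with a different native API
    def fetch_record(self, key):
        return f"record:{key}"

class UnifiedServiceInterface:
    """Wrapper exposing one service-based entry point over both systems."""
    def __init__(self):
        self._billing = LegacyBillingSystem()
        self._crm = LegacyCRMSystem()

    def invoke(self, operation, argument):
        # route a generic service call to the appropriate native API
        if operation == "bill":
            return self._billing.bill_native(argument)
        if operation == "customer":
            return self._crm.fetch_record(argument)
        raise ValueError(f"unknown operation: {operation}")

service = UnifiedServiceInterface()
print(service.invoke("bill", "C42"))      # billed:C42
print(service.invoke("customer", "C42"))  # record:C42
```

Under WS, the activities with no corresponding implementation would also be added behind this shared interface; under WL, each BP would implement them locally instead.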
3 QUANTITATIVE EVALUATION
Our discussion in the previous section shows that
each alternative pattern impacts a number of quality
attributes differently. Therefore, in order to evaluate
and rank these alternatives, we need to employ
quantitative measures that score them according to
how well they satisfy stakeholders' preferences on
the relevant quality attributes. To this end, we
borrow existing methods from the literature on
Multiple-Attribute Decision Making (MADM)
(Yoon and Hwang, 1995). In particular, we employ
the Analytical Hierarchy Process (AHP) method,
which relies on pair-wise comparison, thus making it
less sensitive to the judgmental errors common to
other MADM methods.
The application of the AHP method comprises four
main steps, as shown in Figure 1. We now discuss
each step formally:
Preparation: this step articulates the different
elements involved in the process of deciding about
design decision D_j (1 ≤ j ≤ m). It involves
identifying the stakeholders involved in this
decision, S_1, S_2, …, S_u, the potential design
alternatives to select from, A_1, A_2, …, A_n, and
the quality attributes used in the evaluation process,
Q_1, Q_2, …, Q_k.
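The Preparation step can be encoded as plain data. The sketch below is an illustrative assumption: the three stakeholders and four quality attributes are invented for the example, while the five alternatives are the patterns from the previous section.

```python
# Illustrative encoding of the Preparation step: stakeholders S_1..S_u,
# alternatives A_1..A_n, and quality attributes Q_1..Q_k for one
# design decision D_j. The concrete values are hypothetical.

stakeholders = ["S1", "S2", "S3"]                # u = 3
alternatives = ["DL", "DS", "WS", "WL", "MG"]    # n = 5 proto-patterns
quality_attributes = ["performance", "reliability",
                      "maintenance cost", "development cost"]  # k = 4

u, n, k = len(stakeholders), len(alternatives), len(quality_attributes)
print(u, n, k)  # 3 5 4
```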
Weighting Quality Attributes: the aim of this step
is to determine the relative weight of every quality
attribute Q_z (1 ≤ z ≤ k). Each stakeholder S_h
(1 ≤ h ≤ u) provides their preferences on the
considered quality attributes by comparing every
pair of quality attributes (Q_a, Q_b) using a 9-point
weighting scale, with 1 representing equality and 9
representing extreme difference. This is used to
determine how important Q_a is in comparison to
Q_b (1 ≤ a, b ≤ k). For example, if Q_a is
considered "extremely more important" than Q_b,
then we have the entry (a,b) = 9 and, conversely,
(b,a) = 1/9.
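The reciprocal structure of these comparisons can be sketched as follows. The helper name and the particular scores are illustrative assumptions; the reciprocal rule ((b,a) = 1/(a,b)) and the unit diagonal come from the text.

```python
# Sketch of one stakeholder's pairwise comparison matrix on the
# 9-point scale: entry (a, b) records how much more important Q_a
# is than Q_b, and (b, a) is forced to the reciprocal.

from fractions import Fraction

k = 3  # three quality attributes, illustrative
P = [[Fraction(1)] * k for _ in range(k)]  # diagonal = 1 (equality)

def set_preference(P, a, b, score):
    """Record that Q_a is `score` times more important than Q_b."""
    P[a][b] = Fraction(score)
    P[b][a] = Fraction(1, score)  # reciprocal entry

set_preference(P, 0, 1, 9)  # Q_0 "extremely more important" than Q_1
set_preference(P, 0, 2, 3)
set_preference(P, 1, 2, 2)

print(P[0][1], P[1][0])  # 9 1/9
```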
This means that for k quality attributes, k(k−1)/2
pair-wise comparisons will need to be made by each
stakeholder. At the end, each stakeholder S_h will
have built up a k × k matrix P^h = (P^h_ab),
1 ≤ a, b ≤ k, representing their preferences on
quality attributes.
Having gathered all stakeholders' quality
preferences P^1, P^2, …, P^u, we now aggregate
them into one k × k matrix P = (P_ab),
1 ≤ a, b ≤ k, by computing the geometric mean of
each individual entry (a,b) using the following
formula:

P_ab = ( ∏_{h=1}^{u} P^h_ab )^{1/u}    (1)
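Equation (1) amounts to taking an element-wise geometric mean across the u stakeholder matrices. A minimal sketch, with two invented stakeholder matrices as input:

```python
# Element-wise geometric mean (Equation 1): aggregate the u
# stakeholder matrices P^1..P^u into one matrix P, where
# P_ab = (prod over h of P^h_ab) ** (1/u).

import math

def aggregate(matrices):
    u = len(matrices)
    k = len(matrices[0])
    return [[math.prod(m[a][b] for m in matrices) ** (1.0 / u)
             for b in range(k)]
            for a in range(k)]

# two hypothetical stakeholders' 2x2 preference matrices
P1 = [[1, 4], [1/4, 1]]
P2 = [[1, 9], [1/9, 1]]
P = aggregate([P1, P2])
print(round(P[0][1], 3))  # geometric mean of 4 and 9 -> 6.0
```

Note that the geometric mean preserves the reciprocal property: since each input satisfies (b,a) = 1/(a,b), so does the aggregate.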
After that, we compute the geometric mean G_a for
every quality attribute Q_a:
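The formula for G_a itself falls outside this excerpt. The standard AHP definition is the geometric mean of row a of the aggregated matrix, G_a = (∏_{b=1}^{k} P_ab)^{1/k}, normalised into relative weights; the sketch below works under that assumption.

```python
# Assumed standard AHP step (the excerpt cuts off before the formula):
# G_a is the geometric mean of row a of the aggregated matrix P,
# and the weights are the G_a values normalised to sum to 1.

import math

def row_geometric_means(P):
    k = len(P)
    return [math.prod(P[a]) ** (1.0 / k) for a in range(k)]

def normalised_weights(P):
    G = row_geometric_means(P)
    total = sum(G)
    return [g / total for g in G]

# hypothetical 2x2 aggregated preference matrix
P = [[1, 3], [1/3, 1]]
w = normalised_weights(P)
print(round(w[0], 2), round(w[1], 2))  # 0.75 0.25
```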
QUANTITATIVE EVALUATION OF ENTERPRISE INTEGRATION PATTERNS