innovative times”.
Evolvability and agility are in fact exactly the goal
and purpose of Normalized Systems theory, since
Normalized Systems are highly evolvable and stable
systems based on structured elements that minimize
combinatorial effects. For this reason the component
of artifact mutability, which is formally defined as
“the changes in state of the artifact anticipated in the
theory” (Gregor & Jones, 2007, p. 322), can be clearly
recognized as the anticipated changes of NS.
The next component Gregor & Jones (2007) define
is testable propositions. These are claims or pre-
dictions about the quality of a system or tool when the
design theory is applied. As such, the testable propo-
sition of NS theory can be formulated as the elim-
ination of combinatorial effects when the principles
of form and function are pursued consistently. Al-
though this component seems clearly defined within
NS theory, the definition of the component also re-
quires the propositions to be testable. According to
Walls et al. (1992), the assertion that applying a set of
design principles will result in an artifact that achieves
its goal can be verified by building (an instance of)
the artifact and testing it. Applying this verification
method on Normalized Systems, the proposition of
NS theory (the elimination of combinatorial effects)
can be verified by building a Normalized System
according to the principles of form and function
and demonstrating that the system is free of
combinatorial effects.
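To make this verification concrete, consider the following minimal sketch (a hypothetical Java illustration of our own, not code from the NS literature). It contrasts a change whose ripple effect grows with the number of consumers in the system (a combinatorial effect) with a version-transparent alternative, in the spirit of the NS theorems, whose change impact remains bounded:

public class CombinatorialEffectSketch {

    // Variant 1: consumers depend on a bare parameter list. Adding
    // a parameter later (say, a currency code) forces a change in
    // every caller, so the impact of that single change grows with
    // the number of consumers in the system: a combinatorial effect.
    static double grossPriceV1(double amount, double vatRate) {
        return amount * (1 + vatRate);
    }

    // Variant 2: consumers depend on an encapsulated data element.
    // New fields can be added to Order without touching existing
    // callers, so the impact of the same anticipated change stays
    // bounded and independent of system size.
    static final class Order {
        double amount;
        double vatRate;
        // e.g. a currency field could be added here later without
        // breaking grossPriceV2 or any of its call sites
    }

    static double grossPriceV2(Order order) {
        return order.amount * (1 + order.vatRate);
    }

    public static void main(String[] args) {
        Order order = new Order();
        order.amount = 100.0;
        order.vatRate = 0.25;
        System.out.println(grossPriceV1(100.0, 0.25)); // 125.0
        System.out.println(grossPriceV2(order));       // 125.0
    }
}

In the first variant, the impact of one anticipated change is proportional to the number of call sites; in the second it is constant, which is what the proposition predicts when the principles are applied consistently.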
Walls et al. (1992) formulated the idea that kernel
theories should be part of an Information System De-
sign Theory (ISDT). According to these authors, ker-
nel theories govern both the design requirements and
the design process. Walls et al. (1992) believe that
these two aspects should be separated and therefore
define both product kernel theories and process ker-
nel theories. Walls et al. (2004) elucidate the impor-
tance of kernel theories for design science by stat-
ing that the design science process uses these theo-
ries and combines them with existing artifacts to for-
mulate new design theories. Gregor & Jones (2007,
p. 327), however, argue that the two types of kernel the-
ories (i.e. process and product kernel theories) are “a
linking mechanism for a number, or all, of the other
aspects of the design theory” and should be consid-
ered as one component, the justificatory knowledge
that explains why a design works. This merging is
substantiated by the argument that the design process
and the design product are usually founded on a single
kernel theory (i.e. the same justificatory knowledge). They
define this component as “the underlying knowledge
or theory from the natural or social or design sci-
ences that gives a basis and explanation for the de-
sign” (Gregor & Jones, 2007, p. 322). According to
this definition, the underlying justification for Nor-
malized Systems theory is twofold. First, the central
idea of Normalized Systems theory is systems sta-
bility, as formulated in systems stability theory,
which states that a bounded input should always result
in a bounded output (the BIBO principle). In Normalized
Systems theory, this is interpreted as the transforma-
tion of functional requirements into software primi-
tives (Mannaert et al., 2011). Secondly, Normalized
Systems theory shows compatibility with the concept
of entropy, as discussed in Section 2. Initial
research efforts largely validate the Normalized
Systems principles when NS theory is studied from
the perspective of entropy (Mannaert,
De Bruyn & Verelst, 2012b).
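For reference, the standard control-theoretic formulation of BIBO stability (a general definition, not one specific to NS theory) can be stated as follows: a system mapping an input u(t) to an output y(t) is BIBO stable if

\[ \exists M_u < \infty : |u(t)| \leq M_u \;\forall t \;\implies\; \exists M_y < \infty : |y(t)| \leq M_y \;\forall t. \]

In the NS reading sketched above, the input can be taken as a bounded set of changes to the functional requirements, and the output as the resulting set of changes to the software primitives, which must then remain bounded as well, i.e. independent of the overall size of the system.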
The next component of a design theory is its prin-
ciples of implementation. Gregor & Jones (2007) con-
sider this and the next component as additional
components that are not an essential part of a design
theory, but that should be formulated if the credibility of
the theory is to be enhanced. Concerning this compo-
nent, we can refer to the model taxonomy of Winter,
Gericke & Bucher (2009). According to this taxon-
omy, Normalized Systems theory should be classified
as a prescriptive model with result recommen-
dations (“a model”) rather than a model with activity recom-
mendations (“a method”), referring to the clear prin-
ciples of form and function of NS mentioned earlier.
Although the emphasis within NS theory is on the “re-
sult view” rather than the “activity view”, Winter et al.
(2009) argue that both are views on the same “prob-
lem solving artifact”. This similarity of methods and
models is also apparent in the classification used in
this paper, in the form of the similarities between the
principles of form and function and the principles of
implementation. The form or architecture of an arti-
fact can in itself be used as an underlying principle and
target on which a method and guidelines for construc-
tion of the artifact are based. In contrast to the clearly
defined principles of form and function, however, a for-
mal methodology, in the form of a procedure that ex-
plicitly articulates the steps that need to be followed
to construct Normalized elements, does not exist. The
formulation of the Normalized elements simply hap-
pens while keeping the theorems at the back of one’s
mind, helped by some supporting applications
(e.g. “Prime Radiant”). These applications are more
than mere tools, as they provide guidelines for constructing
Normalized elements. For this reason, they could be
considered the principles of implementation of Nor-
malized Systems theory.
The final component, expository instantiation, has
two functions: it shows the applicability of a design