2 PHILOSOPHICAL FOUNDATIONS OF MODELLING AND SIMULATION
From a philosophical point of view, the systems-software dualism can be traced back to the 1950s and early 1960s, when AI (Artificial Intelligence) was emerging as a new, unpredicted and unpredictable discipline, historically identified as a branch of the cognitive sciences paradigm. AI was founded at a conference held during the summer of 1956 at Dartmouth College in Hanover, New Hampshire, where John McCarthy, Marvin Minsky, Nathaniel Rochester and Claude Shannon presented a highly innovative proposal that merged a considerable number of philosophical theories developed in the early 20th century, including linguistic investigations and epistemological approaches, with the most advanced experimental engineering work (McCarthy, Minsky, Rochester and Shannon, 1955).
Since the Dartmouth conference, the international scientific community's interest in AI has increased rapidly, and most subsequent investigations revealed an epistemological lack of effectiveness, especially with respect to the representation of human knowledge. This, in turn, led the scientific community to search for common ground on which a productive dialogue between different disciplines, methodologies and approaches would be possible, in order to establish a new shared paradigm serving as a scientific and academic theoretical bridge: the Cognitive Science paradigm. It has recently been defined as a contemporary, empirically based effort to answer long-standing epistemological questions – particularly those concerned with the nature of knowledge, its components, its development, and its deployment (Gardner, 1986).
It is relevant that there was a close and constant overlap between the results and the assumptions of most of those theories, similar to a chain reaction, even though they were developed at different times and in different places, sometimes very distant from one another. It has also been shown that all of them were founded on the assumed validity of Frege's Principle (Frege, 1893), known as the Principle of Compositionality, which states that the meaning of a complex expression is determined by the meanings of its constituent expressions and the rules used to combine them (Brucato, 2003).
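The Principle of Compositionality can be given a minimal computational reading: the "meaning" of a complex expression is computed from the meanings of its parts plus the rule that combines them. The following sketch is purely illustrative (the lexicon, rule names and `meaning` function are our own assumptions, not part of the cited theories):

```python
# Illustrative sketch of Frege's Principle of Compositionality:
# the meaning (here: the numeric value) of a complex expression is
# determined by the meanings of its constituents and the combination rule.

# Meanings of the atomic expressions (a toy lexicon, assumed for this example).
LEXICON = {"two": 2, "three": 3, "five": 5}

# Rules used to combine sub-meanings.
RULES = {"plus": lambda a, b: a + b, "times": lambda a, b: a * b}

def meaning(expr):
    """Atoms look up the lexicon; complex expressions apply a rule
    to the meanings of their constituent sub-expressions."""
    if isinstance(expr, str):
        return LEXICON[expr]
    rule, left, right = expr
    return RULES[rule](meaning(left), meaning(right))

# "(two plus three) times five" -> (2 + 3) * 5 = 25
print(meaning(("times", ("plus", "two", "three"), "five")))
```

The point of the sketch is only that nothing beyond the parts and the rules is consulted: change a constituent's meaning or a combination rule, and the meaning of every expression containing it changes accordingly.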
Early Cognitive Science scopes and assumptions, first among them Frege's Principle of Compositionality, which we take as its main epistemological pillar, continue to play a central role in many contemporary disciplines, including SWE and SE modelling and simulation. More specifically, they are the philosophical foundations of the OPM-based holistic modelling and simulation approach to SWE and SE, which we identify with, but do not limit to:
Wittgenstein's Tractatus Logico-Philosophicus (1921). It contains the distinction between the World and the language(s) used to describe it (to give a picture of it). This is the main assumption of the well-known Picture Theory of Meaning, which Wittgenstein developed to state that language is a picture of the world, obtained by combining the language's building blocks (the signs, later called symbols) into propositions according to the predetermined set of syntactic rules pertaining to the adopted language. He often compared the process of composing a syntactically correct proposition to the work of an architect who designs and constructs a new building: if something has been designed wrongly, the building cannot be used for its intended purposes and is therefore essentially useless; likewise, with respect to language, if the syntactic rules are not properly followed, the proposition will be nonsensical and, in some cases, not even understandable;
The Hierarchy of Languages (Russell, B., 1905). Here Bertrand Russell illustrated the necessity of adopting a higher-level language to completely and consistently describe a lower-level language. This theory was later formalized as Type Theory;
The Mathematical Theory of Communication developed by Shannon and Weaver (Shannon, 1948). Communication is assumed to be the result of an information transmission process: using a physical channel, and a predetermined quantity of information that the sender has previously compressed through a code shared with the receiver, it is possible to reproduce at one point (the destination), either exactly or approximately, a message selected at another point (the source);
Von Neumann's ballistics research (1945), which led him to the definition of a stable machine structure, known as the Von Neumann Architecture, which has served as the basis of all modern computers and computational machines;
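Shannon's sender-channel-receiver scheme can be sketched in a few lines: a sender encodes a message with a code it shares with the receiver, the bits cross a (here noiseless) channel, and the receiver reproduces the message exactly. The codebook and function names below are our own illustrative assumptions; a simple prefix-free code stands in for the compression step:

```python
# Minimal sketch of Shannon's communication model, under the assumption
# of a noiseless channel and a shared prefix-free code (illustrative only).

SHARED_CODE = {"A": "0", "B": "10", "C": "110", "D": "111"}  # prefix-free
DECODE = {bits: sym for sym, bits in SHARED_CODE.items()}

def send(message):
    """Source + encoder: compress the message into a bit string."""
    return "".join(SHARED_CODE[sym] for sym in message)

def receive(bits):
    """Decoder + destination: rebuild the message from the bit string."""
    message, buffer = [], ""
    for b in bits:
        buffer += b
        if buffer in DECODE:          # a complete codeword has arrived
            message.append(DECODE[buffer])
            buffer = ""
    return "".join(message)

original = "ABACADA"
channel_bits = send(original)         # bits cross the channel unchanged
assert receive(channel_bits) == original   # exact reproduction at the destination
print(channel_bits)                   # -> 010011001110
```

Because the code is prefix-free and the channel is assumed noiseless, the reproduction at the destination is exact; Shannon's theory is precisely about what remains possible when the channel is noisy or the code is lossy.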
KEOD 2014 - International Conference on Knowledge Engineering and Ontology Development