by Shannon (1949) justifies the statistical attributes
of information. With regard to the semantic aspect, Dretske’s (1991) Semantic Theory of Information is of fundamental significance to the study of the content of information. Barwise and Seligman (1997) developed the Information Flow Channel Theory, which enables one to identify information flow between systems using the notion of ‘distributed systems’.
Despite these well-established theories of information, the debates around information have never stopped. In particular, what is the true nature of information, and is it possible to give it a single, universally accepted definition?
Information has been referred to as processed data,
the propositional content of a sign, data plus
meaning, and many more. Moreover, various natures have been attributed to information, including objective, subjective, and combinations of both.
Therefore, finding a clear, justifiable, and applicable
concept of information becomes increasingly vital
for academic researchers and society as a whole.
The study of information can be traced back
many centuries. According to Harper (Lyytinen, Klein and Hirschheim, 1991), the notion of “information” originated in 1387 with the meaning “act of informing”; it came to denote “knowledge communicated” a century later. The
development of modern technology has inevitably
multiplied the number of definitions for information
with varying degrees of complexity. Among them, a
common view is that information is data that has
been processed in some way to make it useful for
decision makers, as revealed by Lewis’s (1993) survey of 39 IS texts. On this assumption, information has an objective nature, because data is objective and independent of its observer in terms of its existence and structure. Dretske argues that “information is the propositional content of a sign” (Dretske, 1991, p. 65; Mingers, 1995, p. 6); on Dretske’s account, information is generated by a reduction in uncertainty about what might have happened.
Bateson suggests that information is “a difference that makes a difference” (Bateson, 2000, p. 286), which can be interpreted as saying that it is the difference that generates an event, a sign, a symbol, or an utterance.
Subjectivists such as Lewis and Checkland believe that information exists within human cognition. As Lewis argues, “different observers will generate different information from the same data since they have differing values, beliefs, and expectations” (Lewis, 1993). Moreover, Checkland formulates this view as “information equals data plus meaning” (Checkland, 1990, p. 303); that is, by attributing meaning to data, we create information.
It is hardly surprising that there is such fierce controversy over the nature of information. Some philosophers have sensed the powerful, elusive nature of information and advanced an impartial view: the definition of information depends on the requirements of different fields. As Shannon points out, “It is hardly to be expected that a single concept
of information would satisfactorily account for the
numerous possible applications of this general field”
(Shannon, 1993, p. 180). Floridi further emphasises,
“It [information] can be associated with several
explanations, depending on the cluster of
requirements and desiderata that orientate a theory.”
Some philosophers direct their attention to other attributes of information. Shannon is the founder of the Mathematical Theory of Communication (Shannon, 1949), which focuses on the statistical perspective of information. The basic idea of this theory is that information can be accurately quantified as long as the probability of the random event is known: the less likely the event, the more information its occurrence carries.
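As a minimal illustration of this idea (the function name and the sample probabilities below are ours, for illustration only, not Shannon’s), the self-information of an event with probability p is -log2(p) bits, which can be computed directly:

import math

def self_information(p: float) -> float:
    """Shannon self-information in bits: I(x) = -log2(p).
    The less likely the event, the more bits its occurrence carries."""
    if not 0.0 < p <= 1.0:
        raise ValueError("probability must lie in (0, 1]")
    return -math.log2(p)

print(self_information(0.5))    # a fair coin toss carries 1.0 bit
print(self_information(0.125))  # a rarer event (p = 1/8) carries 3.0 bits

A certain event (p = 1) accordingly carries 0 bits, matching the intuition above that information corresponds to a reduction in uncertainty.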
Philosophers and mathematicians such as
Barwise and Seligman (1997) and Devlin (1995)
developed and formulated the Information Flow
Channel theory and the Infon theory. Their
motivating idea is that information flow is made
possible by regularities in distributed systems.
Constraints capture what (information) flows, and
channels reveal why such flow takes place. For
example, a constraint concerning a tree trunk could be ‘Number of rings’ → ‘Age of tree’.
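A deliberately simplified sketch can make this constraint concrete; note that channel theory proper is formulated in terms of classifications and infomorphisms, and the one-ring-per-year regularity assumed below is merely illustrative:

# Sketch only: the constraint 'Number of rings' -> 'Age of tree' is reduced
# here to the regularity that licenses it (assumed: one growth ring per year).
def age_of_tree(number_of_rings: int) -> int:
    """Observing the trunk (one system) yields information about the
    tree's age (another system) because of a lawlike regularity."""
    return number_of_rings

print(age_of_tree(42))  # observing 42 rings carries the information: 42 years old

It is the regularity itself, not the observation alone, that makes the information flow possible.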
Meaning is most commonly used in the field of
linguistics, e.g., Semantics, although it plays equally
important roles in non-linguistic fields like
Semiotics. The notion of ‘meaning’ may seem simple, but in reality it is highly ambiguous and hard to define. Furthermore, correctly understanding the relationship between information and meaning is crucial, since it determines how IS and meaning systems are related.
The study of meaning has as long a history as that of information. In the past, meaning was
referred to as tenor, gist, drift, trend, purport, sense,
significance, intention, etc. Grice (1957, pp. 377-388) divides meaning into two categories: natural and non-natural meaning.
Natural meaning is close (if not equivalent) to the ordinary sense of “information”; for example, a blown fuse means that the circuit has been overloaded, and wet grass means that it has been raining. Non-natural meaning relates to language and