Entropy as a Quality Measure of Correlations between n Information
Sources in Multi-agent Systems
G. Enee¹ and J. Collonge²
¹ISEA - EA 7484, UNC, Campus de Nouville, Noumea, New Caledonia
²Atout Plus Groupe, Noumea, New Caledonia
Keywords: Entropy, Multi-agent Systems, Agent Communication Languages.
Abstract:
Shannon's entropy has been widely used across different fields of science, for example to measure the quantity of information found in a message coming from a source. In real-world applications, we need to measure the quality of several crossed information sources. In the specific case of language creation within multi-agent systems, we need to measure the correlation between words and their meanings to evaluate the quality of that language. When sources of information are numerous, we wish to compute correlations between those different sources. Considering that those $n$ sources of information are gathered in a matrix with $n$ dimensions, we propose in this paper to extend Shannon's entropy to measure information quality in $\mathbb{R}^{2+}$ and then in $\mathbb{R}^{n+}$.
1 INTRODUCTION
Entropy, introduced by (Shannon et al., 1949), can be used to measure uncertainty or randomness in a flux coming from a source. The more uncertain a source is, the more novelty it brings, and thus the higher its entropy. Conversely, the more a source repeats the same pattern, the less new information it brings, and the lower its entropy. The main focus of our article is to adapt entropy to multiple sources of information. We will first describe the measure itself as it was presented by (Shannon et al., 1949). Then we will show how to deal with two sources of information. To ease interpretation, we propose a new measure of information quality, illustrated with an example of the emergence of a language within a multi-agent system. In the fourth part, we generalize the entropy measure to n sources of information, and finally we conclude our work.
2 ENTROPY TO MEASURE
UNCERTAINTY
Information theory, and thus entropy, has been used in computer science mainly to optimize the transmission of data through a communication medium. Entropy gives the precise bit size to use to transmit a particular series of data. It also measures the uncertainty in a flux coming from a source, giving an evaluation of transmission error. We will focus here on the description of the measure itself applied to information transmission, and we will briefly describe how the measure behaves depending on whether the data bring uncertainty or not.
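As a quick worked example of this bit-size interpretation (ours, anticipating the entropy formula defined in Section 2.1 below), a source emitting four equiprobable symbols needs

$$H = -\sum_{i=1}^{4} \frac{1}{4} \log_2 \frac{1}{4} = 2 \text{ bits per symbol,}$$

while a source that always repeats the same symbol has $H = 0$ and brings no new information.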
2.1 Entropy for T, a Transmitted Message
Let $M$ be the one-dimensional matrix describing the transmitted message $T$. The message contains $n$ different symbols. Each of the $n$ cells of matrix $M$ is filled with the number of times the corresponding symbol of $T$ appears. Since the transmitted message is a single information source, the matrix $M$ is mono-dimensional too. Thus $p_i$ represents the probability of observing the $i$-th symbol among the $n$ possible ones and is computed as follows:
$$p_i = \frac{M_i}{\sum_j M_j} \qquad (2.1)$$
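As a concrete illustration (not part of the original paper), the frequency estimate of equation (2.1) can be sketched in a few lines of Python; the function name and the representation of T as a string are our own assumptions:

from collections import Counter

def probabilities(T):
    # Build the count matrix M: M[s] is the number of times symbol s appears in T.
    M = Counter(T)
    # Normalize each count by the total number of symbols, i.e. sum_j M_j (eq. 2.1).
    total = sum(M.values())
    return {s: count / total for s, count in M.items()}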
Thanks to $p_i$, we are now able to measure the quantity of uncertainty in the transmitted message $T$:

$$H = -\sum_i p_i \times \log_2(p_i) \qquad (2.2)$$
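Equation (2.2) then translates directly into code; a minimal sketch, assuming the hypothetical probabilities() helper defined above:

import math

def entropy(T):
    # H = -sum_i p_i * log2(p_i), measured in bits (eq. 2.2).
    return sum(-p * math.log2(p) for p in probabilities(T).values())

print(entropy("abcd"))  # 2.0: four equiprobable symbols, maximal uncertainty
print(entropy("aaaa"))  # 0.0: a repeated pattern brings no new information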
Finally, that measure brings useful information about the transmitted message: