deepest concept subsuming both concepts, then the
distance between c1 and c2 is

sim(c_1, c_2) = e^{-k_1 l} \cdot \frac{e^{k_2 h} - e^{-k_2 h}}{e^{k_2 h} + e^{-k_2 h}}    (15)
(k1 and k2 are scaling parameters for the shortest path vs. the depth) (Debenham and Sierra, 2008). This type of measure can be computed for asserted ontologies.
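As a reading aid only, the following minimal sketch evaluates relation (15) in Python; the argument names l (shortest path length between the two concepts) and h (depth of the deepest concept subsuming both), as well as the default values of k1 and k2, are illustrative assumptions rather than part of the original formulation.

```python
import math

def path_depth_similarity(l, h, k1=0.2, k2=0.6):
    """Relation (15): similarity of two concepts from the length l of the
    shortest path between them and the depth h of their deepest common
    subsumer; k1 and k2 scale the two contributions."""
    depth_factor = (math.exp(k2 * h) - math.exp(-k2 * h)) / \
                   (math.exp(k2 * h) + math.exp(-k2 * h))  # equals tanh(k2 * h)
    return math.exp(-k1 * l) * depth_factor

# e.g. concepts three edges apart whose deepest common subsumer is at depth 2
print(path_depth_similarity(l=3, h=2))  # ≈ 0.46 with the illustrative k1, k2
```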
Another class of metrics is based on set-theoretic principles, counting intersections and unions of ontological concepts. The best known is Tversky's similarity measure between two objects a and b with properties (feature sets) A and B:
s(a, b) = \frac{|A \cap B|}{|A \cap B| + \alpha(a, b)\,|A - B| + (1 - \alpha(a, b))\,|B - A|}    (16)
where |·| is the cardinality of a set, '−' is the set difference, and α(a, b) ∈ [0, 1] is a tuning factor that weights the contribution of the distinct features of the first object (Tversky's similarity measure is not symmetrical).
Besides Tversky's measure, similarity functions available in most scientific mathematical libraries include, for example, Cosine, Dice, Euclidean, Manhattan and Tanimoto.
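As a minimal sketch (assumed helper names, standard Python set operations), relation (16) and the Dice coefficient can be evaluated on two feature sets as follows; here alpha is a constant, whereas in (16) it may depend on the pair (a, b):

```python
def tversky_similarity(A, B, alpha=0.5):
    """Relation (16): |A ∩ B| / (|A ∩ B| + alpha·|A − B| + (1 − alpha)·|B − A|).
    alpha in [0, 1] weights the distinct features of the first object,
    so the measure is not symmetrical unless alpha = 0.5."""
    inter = len(A & B)
    denom = inter + alpha * len(A - B) + (1 - alpha) * len(B - A)
    return inter / denom if denom else 0.0

def dice_similarity(A, B):
    """Dice coefficient: 2·|A ∩ B| / (|A| + |B|)."""
    return 2 * len(A & B) / (len(A) + len(B)) if (A or B) else 0.0
```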
The algorithm for using any of the similarity
measures above to determine the weight between
two agents is the following:
1. Generate the set of all ontological properties (both
numerical and non-numerical) of agent i as a union
of all ontological properties valid at iteration k. Let
this set be A and the cardinality of the set be |A|;
2. Generate in the same manner the set of all
ontological properties for agent j; let this set be B
with cardinality |B|;
3. Compute the cardinality of the intersection, union,
differences, symmetric difference, as needed by the
selected similarity formula;
4. Compute the similarity index. The resulting value is the weight w_i^j(k) (a code sketch of these steps is given below).
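A minimal sketch of steps 1-4, assuming a hypothetical agent method ontological_properties(k) that returns the set of ontological properties valid at iteration k; the chosen similarity function (e.g. dice_similarity or tversky_similarity from the sketch above) is passed in as a parameter:

```python
def agent_weight(agent_i, agent_j, k, similarity):
    """Steps 1-4: build the property sets A and B valid at iteration k and
    return the similarity index, used as the weight w_i^j(k)."""
    A = set(agent_i.ontological_properties(k))  # step 1: |A| = len(A)
    B = set(agent_j.ontological_properties(k))  # step 2: |B| = len(B)
    # step 3: intersections/differences are taken inside the similarity function
    return similarity(A, B)                     # step 4: the weight w_i^j(k)

# e.g. w = agent_weight(robot1, robot2, k=5, similarity=dice_similarity)
```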
5 EXAMPLE
To demonstrate the weighted method, we present an example with a cooperative multi-robot system used in supermarket supervision. Two of the robots in the system have both common and distinct sensors.
Robot 1, Ro1, has sensors (see relation (1)):
Distance = Se(1,1); Shape (cube, cylinder, sphere) =
Se(1,2); Dimensions (length, width, height,
diameter) = Se(1,3); Temperature = Se(1,4); Colour
= Se(1,5).
The second robot, Ro2, has sensors to obtain information for: Distance = Se(2,1); Shape = Se(2,2); Dimensions = Se(2,3); Weight = Se(2,4).
Combining Ro1's sensor information, we obtain possible perception relations (see relation (2)), e.g. Re([Se(1,1), Se(1,2), Se(1,3)], 1) for a combination of Distance, Shape and Dimensions, and Re([Se(1,1), Se(1,4), Se(1,5)], 5) for a combination of Distance, Temperature and Colour.
These perception relations have associated
symbolic perceptions (see relation (3)), as shown in
Table 1.
Table 1: Symbolic perceptions Ro1.
Index Re | Perception Relation Re | Tv (symbol)
1 | Distance < 10 m AND Shape = Parallelepiped AND Dimension > 10 m x 2 m x 2 m | shelf
2 | Distance < 5 m AND Shape = (Parallelepiped OR Cube) AND Dimension < 1 m x 1 m x 0.5 m | box
3 | Distance < 5 m AND Shape = Sphere AND Dimension < 1 m | ball
4 | Distance < 5 m AND Shape = Sphere AND Dimension < 0.5 m | balloon
5 | Distance < 50 m AND Temperature > 200 °C AND Colour = (Red OR Orange) | fire
6 | Distance < 10 m AND Temperature > 50 °C AND Colour = (White OR Yellow) | lamp
A similar computation for robot 2 gives the
results in Table 2.
Table 2: Symbolic perceptions Ro2.
Index Re | Perception Relation Re | Tv (symbol)
1 | Distance < 10 m AND Shape = Parallelepiped AND Dimension > 10 m x 2 m x 2 m | shelf
2 | Distance < 5 m AND Shape = (Parallelepiped OR Cube) AND Dimension < 1 m x 1 m x 0.5 m AND Weight > 3 kg | box
3 | Distance < 5 m AND Shape = Parallelepiped AND Dimension < 1.5 m x 0.5 m x 1 m AND Weight < 3 kg | cart
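Purely as an illustration of how rows of Tables 1 and 2 could be encoded (the representation below is an assumption, not the implementation used in the paper), each perception relation can be written as a predicate over sensor readings that, when satisfied, yields its symbol Tv:

```python
# Hypothetical encoding of two rows of Table 1 as (predicate, symbol) pairs.
RO1_PERCEPTIONS = [
    (lambda s: s["distance"] < 10 and s["shape"] == "parallelepiped"
               and all(d > t for d, t in zip(s["dimensions"], (10, 2, 2))),
     "shelf"),                                                       # row 1
    (lambda s: s["distance"] < 50 and s["temperature"] > 200
               and s["colour"] in ("red", "orange"),
     "fire"),                                                        # row 5
]

def perceive(readings, perceptions):
    """Return the symbols Tv of all perception relations satisfied by readings."""
    return [symbol for predicate, symbol in perceptions if predicate(readings)]
```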
We now have two sets of symbols for the two robots (agents): P = {shelf, box, ball, balloon, fire, lamp} and Q = {shelf, box, cart}. We compute the required set operations, e.g. using the Dice similarity measure:
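A minimal sketch of that computation, assuming the standard Dice coefficient 2|P ∩ Q| / (|P| + |Q|) over the symbol sets above:

```python
P = {"shelf", "box", "ball", "balloon", "fire", "lamp"}
Q = {"shelf", "box", "cart"}
# shared symbols: P & Q = {"shelf", "box"}
weight = 2 * len(P & Q) / (len(P) + len(Q))  # 2*2 / (6+3) = 4/9 ≈ 0.44
```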