Table 1: Symmetry feature vectors of the four wallpapers shown in Figure 1. Bold values indicate symmetries that must be present for the vector to be assigned to its group, while the remaining values indicate the absence of the other symmetries.
Sample SFV = (R2, R3, R4, R6, T1, T2, D1, D2, T1G, T2G, D1G, D2G) PSG
1 (0.62, 0.47, 0.69, 0.34, 0.65, 0.67, 0.80, 0.59, 0.37, 0.43, 0.80, 0.59) P1
2 (0.82, 0.09, 0.20, 0.09, 0.88, 0.83, 0.20, 0.19, 0.27, 0.26, 0.20, 0.19) PMM
3 (0.95, 0.42, 0.33, 0.46, 0.39, 0.45, 0.31, 0.48, 0.98, 0.99, 0.31, 0.48) PGG
4 (0.46, 0.69, 0.28, 0.49, 0.74, 0.65, 0.48, 0.72, 0.74, 0.65, 0.48, 0.72) P31M
Table 2: Binary prototypes for the 17 PSG classes.
Classes  Prototype feature vectors
P1       (0,0,0,0,0,0,0,0,0,0,0,0)
P2       (1,0,0,0,0,0,0,0,0,0,0,0)
PM1      (0,0,0,0,1,0,0,0,0,0,0,0)
PM2      (0,0,0,0,0,1,0,0,0,0,0,0)
PG1      (0,0,0,0,0,0,0,0,1,0,0,0)
PG2      (0,0,0,0,0,0,0,0,0,1,0,0)
CM1      (0,0,0,0,0,0,1,0,0,0,1,0)
CM2      (0,0,0,0,0,0,0,1,0,0,0,0)
PMM      (1,0,0,0,1,1,0,0,0,0,0,0)
PMG1     (1,0,0,0,1,0,0,0,0,1,0,0)
PMG2     (1,0,0,0,0,1,0,0,1,0,0,0)
PGG      (1,0,0,0,0,0,0,0,1,1,0,0)
CMM      (1,0,0,0,0,0,1,1,0,0,1,1)
P4       (1,0,1,0,0,0,0,0,0,0,0,0)
P4M      (1,0,1,0,1,1,1,1,0,0,1,0)
P4G      (1,0,1,0,0,0,1,1,1,1,1,0)
P3       (0,1,0,0,0,0,0,0,0,0,0,0)
P31M1    (0,1,0,0,1,1,1,0,1,1,1,0)
P31M2    (0,1,0,0,1,1,0,1,1,1,0,0)
P3M11    (0,1,0,0,0,0,1,0,0,0,1,0)
P3M12    (0,1,0,0,0,0,0,1,0,0,0,0)
P6       (1,1,0,1,0,0,0,0,0,0,0,0)
P6M      (1,1,0,1,1,1,1,1,1,1,1,0)
start by proposing the use of binary prototypes representing each one of the 17 classes. Each prototype has an SFV whose components are set to '1' if the symmetry holds, and '0' otherwise. Table 2 shows the resulting 23 prototypes. Some classes have two prototypes because there are two possibilities where reflection symmetry can appear (the two UL sides and the two diagonals). We then use a Nearest Neighbour Classifier (NNC) to perform the task. The Euclidean distance to the class prototype can be used as a measure of confidence.
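The classification step above can be sketched as follows, using the SFVs from Table 1 and a subset of the binary prototypes from Table 2 (the full set has 23 entries); the ranking function and names are our illustration, not code from the paper:

```python
import math

# Binary prototypes from Table 2 (subset shown for brevity).
PROTOTYPES = {
    "P1":   (0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0),
    "P2":   (1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0),
    "PMM":  (1, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0),
    "PGG":  (1, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0),
    "PMG1": (1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0),
    "PMG2": (1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0),
}

def euclidean(a, b):
    """Euclidean distance between two 12-component SFVs."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def nnc_rank(sfv):
    """Rank PSG classes by distance to their prototype, nearest first.

    The distance itself serves as the confidence measure: a smaller
    distance means stronger membership in that class.
    """
    return sorted(((cls, euclidean(sfv, p)) for cls, p in PROTOTYPES.items()),
                  key=lambda t: t[1])

# Sample 2 from Table 1, labelled PMM.
sample2 = (0.82, 0.09, 0.20, 0.09, 0.88, 0.83,
           0.20, 0.19, 0.27, 0.26, 0.20, 0.19)
print(nnc_rank(sample2)[0][0])  # nearest class: PMM, matching the Table 1 label
```

Note that the NNC returns the whole ordered list, not just the winner; this ordered output is what makes the adaptive refinement of the next section possible.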
After applying the NNC to several image collections we did not find significant improvements over the RBC (see the Experiments section). This is probably due to the bias of the feature values. The minimum values (non-symmetry) are higher than expected because the SAD between image patches always produces non-null values. Moreover, the maximum values (symmetry) rarely approach the value of the translational symmetry. In this situation, the use of binary prototypes, with inter-class boundaries equidistant to each class, does not fit the problem. However, some advantage has been achieved: the NNC produces an ordered set of outputs describing the class membership of each sample. This latter property can enable an automatic adjustment of the prototypes in order to adapt them to the image variability.
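One simple way to turn the NNC's ordered distance list into class-membership scores is inverse-distance weighting; this particular normalization is our illustrative choice, not a scheme prescribed by the paper:

```python
def membership_scores(ranked):
    """Convert an ordered (class, distance) list into normalized
    membership scores in [0, 1]: nearer prototypes score higher.

    Inverse-distance weighting is one illustrative choice; any
    monotonically decreasing mapping of distance would serve.
    """
    eps = 1e-9  # guard against a zero distance (exact prototype match)
    weights = [(cls, 1.0 / (d + eps)) for cls, d in ranked]
    total = sum(w for _, w in weights)
    return [(cls, w / total) for cls, w in weights]

# Hypothetical ranked output for a sample near the PMM prototype.
ranked = [("PMM", 0.65), ("PMG1", 1.25), ("PMG2", 1.28)]
scores = membership_scores(ranked)
```

Soft scores of this kind give each sample a graded membership in every class, which is the information an adaptive scheme can exploit when shifting prototypes toward the observed feature distribution.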
3.3 Adaptive NNC (ANNC)
Figure 3: Original image (top) with the unit lattice (UL) and score map (bottom) for 2-fold rotation symmetry.

Recent works on NN classifiers have shown that adaptive schemes (Wang et al., 2007) outperform the results of the classic NNC in many applications. We adopt this approach for several reasons. Firstly, the discrete nature of the image domain introduces errors in the computation of image transforms. However, when
COMPUTATIONAL SYMMETRY VIA PROTOTYPE DISTANCES FOR SYMMETRY GROUPS CLASSIFICATION