centers of frequency bands in their zones. This means that there is a memory vector $M_s$ whose input can simultaneously fire $S$ CCNs. As $M_s$ is stored in all the $S$ CCNs, their centers of frequency bands will be assigned the corresponding frequencies of $M_s$ (each component of $M_s$ is represented by a frequency), and no other vector $M_t$ ($M_s \neq M_t$) can be stored in any of the $S$ CCNs.
So we have only $N - S$ CCNs left to store the remaining $N - 1$ vectors (only the one memory vector $M_s$ has been stored so far). If $S > 1$, then $N - S < N - 1$, and we cannot store all of the remaining vectors in the memory. Therefore, this case does not allow us to store and recall $N$ vectors. In the worst case, if all $N$ CCNs have the same centers of frequency bands, then we can store only one vector in the whole network instead of $N$ vectors. Thus the performance degrades to a fraction $1/N$ of the vectors, which is quite poor for large
values of $N$. In general, if $S_1$ is the number of CCNs having the same centers of frequency bands $f^1_1, \cdots, f^1_R$, $S_2$ is the number of CCNs with the same centers of frequency bands $f^2_1, \cdots, f^2_R$, and so on up to $S_p$, the number of CCNs with the same centers of frequency bands $f^p_1, \cdots, f^p_R$, then only a fraction $p/N$ of the vectors can be stored and recalled correctly, where $S_1 + S_2 + \cdots + S_p = N$. The following example demonstrates such a case.
Suppose the network is required to store the input vectors {0, 1, 0, 0}, {1, 1, 0, 0}, and {1, 0, 1, 0} and to recall each one when it is presented to the network. Assume we have a network consisting of three CCNs, each with four zones. Let the band centers of the first, second, and third CCNs be 0.1, 0.2, 0.1, 0.1; 0.1, 0.2, 0.1, 0.1; and 0.2, 0.2, 0.1, 0.1, respectively. Let 0 and 1 be encoded by the frequencies 0.1 and 0.2, respectively. We therefore obtain the equivalent representations of the three input vectors as {0.1, 0.2, 0.1, 0.1}, {0.2, 0.2, 0.1, 0.1}, and {0.2, 0.1, 0.2, 0.1}, respectively. When we apply the input {0.1, 0.2, 0.1, 0.1}, all the zones of CCNs 1 and 2 become active and both CCNs fire. Thus, the input vector {0, 1, 0, 0} is stored in both of them: the two CCNs are stimulated and their centers of frequency bands are assigned the frequencies 0.1, 0.2, 0.1, 0.1. When the second input pattern {0.2, 0.2, 0.1, 0.1} is presented, it stimulates the third CCN and causes it to fire, storing the input frequencies in the corresponding centers of frequency bands. For the last input there is no CCN that can be activated, since all the CCNs are already attracted to the two previous input vectors. Although we have three CCNs with which to store and recall three vectors according to the algorithm presented in [1], we cannot store more than two input patterns in the associative memory for this particular example. Thus the performance of the proposed technique degrades in this case.
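A minimal sketch (not from the paper) of this failure case, assuming a CCN fires only when every zone's band center exactly equals the corresponding input frequency, and that every CCN that fires becomes attracted to the stored pattern; the names and the exact-match rule are illustrative:

```python
TOL = 1e-9  # tolerance for comparing frequencies

def fires(band_centers, pattern):
    # A CCN is stimulated only if all of its zones match the input frequencies.
    return all(abs(c - f) < TOL for c, f in zip(band_centers, pattern))

# Band centers of the three CCNs in the example above.
ccns = [
    [0.1, 0.2, 0.1, 0.1],  # CCN 1
    [0.1, 0.2, 0.1, 0.1],  # CCN 2 (identical centers to CCN 1)
    [0.2, 0.2, 0.1, 0.1],  # CCN 3
]

# The three input vectors, encoded with 0 -> 0.1 and 1 -> 0.2.
patterns = [
    [0.1, 0.2, 0.1, 0.1],  # {0, 1, 0, 0}
    [0.2, 0.2, 0.1, 0.1],  # {1, 1, 0, 0}
    [0.2, 0.1, 0.2, 0.1],  # {1, 0, 1, 0}
]

free = [True] * len(ccns)  # CCNs not yet attracted to a stored pattern
stored = 0
for p in patterns:
    hits = [i for i, c in enumerate(ccns) if free[i] and fires(c, p)]
    for i in hits:         # every matching CCN is attracted simultaneously
        free[i] = False
    stored += 1 if hits else 0
    print(p, "-> fires CCN(s)", [i + 1 for i in hits])
print("stored", stored, "of", len(patterns), "patterns")
```

The first pattern consumes both CCN 1 and CCN 2, the second consumes CCN 3, and the third finds no free matching CCN, so only two of the three patterns are stored. This matches the bound above with $p = 2$ (CCNs 1 and 2 form one group, CCN 3 another) and $N = 3$, giving a fraction of $2/3$.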
5 Solution Proposed for CCNs
In this section, we outline an improvement to the architecture of the CCN neural network that remedies the situation illustrated above, ensuring that at most one CCN can be stimulated (attracted) by a single input pattern. The modified network is shown in Fig. 3.
The main idea is to connect every CCN to all the other CCNs and to assign indices to them. These indices, running from 1 to the number of CCNs, are assigned arbitrarily among the CCNs. It is assumed that the CCN with the lowest index has the highest priority.
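A minimal sketch of this index-based arbitration, under the added assumption that among all matching CCNs only the lowest-indexed free one is permitted to fire and store the pattern, with the mutual connections suppressing the rest; the matching rule and names are illustrative:

```python
def fires(band_centers, pattern, tol=1e-9):
    # Same matching rule as in the earlier sketch: all zones must match.
    return all(abs(c - f) < tol for c, f in zip(band_centers, pattern))

def store_with_priority(ccns, free, pattern):
    """Store the pattern only in the lowest-indexed matching free CCN;
    the lateral connections suppress the higher-indexed candidates."""
    candidates = [i for i, c in enumerate(ccns) if free[i] and fires(c, pattern)]
    if not candidates:
        return None               # no CCN is attracted by this pattern
    winner = min(candidates)      # the lowest index has the highest priority
    ccns[winner] = list(pattern)  # its centers take the stored frequencies
    free[winner] = False          # the winner is now reserved for this pattern
    return winner
```

Under this rule, the first pattern of the earlier example would attract only CCN 1, leaving the identical CCN 2 free instead of being consumed redundantly.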