3 DISCUSSION AND CONCLUSIONS
The findings of the previous section allow us to control the ground state of the net to a considerable extent. Consider a $p$-dimensional hyper-cube with edge length 2 centered at the origin of coordinates. Configurations $\mathbf{s}$ are located at the cube vertexes. Symmetric directions in the hyper-cube must be chosen as the vector $\mathbf{u}$. For each such $\mathbf{u}$ the $2^p$ $\mathbf{s}$-configurations are distributed over symmetric sets with the vector $\mathbf{u}$ as the axis of symmetry. Each such set forms one of the $k$ classes, and any of them can be turned into the ground state by the approach offered. In particular, it is possible to create a ground state out of a very large number of configurations: for example, the number of configurations in the $k$-th class (7) is equal to $\binom{p}{k} = \frac{p!}{k!\,(p-k)!}$.
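As a sanity check of the counting, a short script (Python assumed; the dimension $p = 20$ is illustrative) confirms that the class sizes $p!/(k!\,(p-k)!)$ partition all $2^p$ cube vertexes:

```python
from math import comb

p = 20  # hypothetical dimension of the hyper-cube

# Size of the k-th class: the number of +/-1 configurations sharing
# the same distance rank to the symmetry axis u, i.e. p!/(k!(p-k)!).
class_sizes = [comb(p, k) for k in range(p + 1)]

# The classes partition all 2^p cube vertexes.
assert sum(class_sizes) == 2 ** p

print(class_sizes[p // 2])  # → 184756, the largest class C(20, 10)
```

Even in this modest dimension the largest class already contains hundreds of thousands of configurations, which illustrates the "very large number" the text refers to.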
Some coordinates of the vector $\mathbf{u}$ can be zero. Let $u_1 = 0$. Then the same class comprises not only the configuration $\mathbf{s} = (s_1, s_2, \ldots, s_p)$, but also the configuration $\mathbf{s}' = (-s_1, s_2, \ldots, s_p)$. In other words, a zero coordinate of the vector $\mathbf{u}$ doubles the number of configurations in each class $k$.
In this event the concluding statement of the Theorem is more general and should read: if the non-zero coordinates of the vector $\mathbf{u}$ are equal to each other, the ground-state configurations are the only fixed points of the net.
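The doubling is easy to verify numerically: flipping a configuration coordinate at which $\mathbf{u}$ vanishes leaves the distance to $\mathbf{u}$ unchanged. A minimal sketch with hypothetical vectors:

```python
import numpy as np

# Hypothetical example: the first coordinate of u is zero.
u = np.array([0.0, 1.0, 1.0, 1.0])
s = np.array([1.0, 1.0, -1.0, 1.0])

# s' differs from s only in the coordinate where u vanishes.
s_prime = s.copy()
s_prime[0] = -s_prime[0]

# Both configurations lie at the same distance from u and therefore
# belong to the same class, so the class size doubles.
assert np.isclose(np.linalg.norm(s - u), np.linalg.norm(s_prime - u))
```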
The possible consequences of this approach are not yet known. It is necessary to look through all symmetric directions $\mathbf{u}$ in the hyper-cube and, in each case, to arrange the cube vertexes with respect to their distance from the vector $\mathbf{u}$. Here one has to turn to the methods of group theory (Davis, 2007).
The disadvantage of the whole approach is that the configurations comprising the ground state cannot be arbitrary. They lie at the same distance from the vector $\mathbf{u}$ and, therefore, form a symmetric set. We hope that the following tricks (or their combinations) can help us to avoid total symmetry of the ground state. First, we can build a few vectors like $\mathbf{u}$ into the connection matrix and the thresholds rather than just one vector. For example, let there be a vector $\mathbf{v} = (v_1, v_2, \ldots, v_p)$, $\mathbf{v}^2 = p$, and let us consider a neural net similar to (1):
$$J_{ij} = f(x)\,(1-\delta_{ij})\,(u_i u_j + v_i v_j), \qquad T_i = g(x)\,(u_i + v_i),$$
$$s_i(t+1) = \operatorname{sgn}\!\Big(\sum_{j=1}^{p} J_{ij}\, s_j - T_i\Big). \qquad (12)$$
If the vectors $\mathbf{u}$ and $\mathbf{v}$ are configurations, it can be shown that as long as $x$ does not exceed the first transition point $x_1$, the initial configurations $\mathbf{u}$ and $\mathbf{v}$ themselves constitute the ground state. If $x > x_1$, the ground state is formed by a set of configurations equally distant from both $\mathbf{u}$ and $\mathbf{v}$. The net (12) has no other fixed points. Supported by computer simulation, this result arouses cautious optimism.
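A toy run of the dynamics of net (12) can illustrate the fixed-point behaviour. The vectors below and the constants standing in for $f(x)$ and $g(x)$ are illustrative assumptions, not the paper's concrete functions or its transition point $x_1$:

```python
import numpy as np

p = 8
# Hypothetical +/-1 configurations standing in for u and v.
u = np.ones(p)
v = np.array([1, 1, 1, 1, -1, -1, -1, -1], dtype=float)

# Illustrative constants in place of f(x) and g(x).
f_x, g_x = 1.0, 0.5

# Connection matrix and thresholds of net (12).
J = f_x * (np.outer(u, u) + np.outer(v, v))
np.fill_diagonal(J, 0.0)               # the (1 - delta_ij) factor
T = g_x * (u + v)

def step(s):
    """One synchronous update: s_i(t+1) = sgn(sum_j J_ij s_j - T_i)."""
    return np.where(J @ s - T >= 0, 1.0, -1.0)

# Start from a corrupted copy of u and iterate to a fixed point.
s = u.copy()
s[0] = -1.0                            # flip one spin
for _ in range(20):
    s_next = step(s)
    if np.array_equal(s_next, s):
        break
    s = s_next

print(int(s @ u))  # → 8: the net restores the stored configuration u
```

In this small-$x$ regime the corrupted start is pulled back to $\mathbf{u}$ itself, in line with the statement that below $x_1$ the initial configurations are the ground state.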
Second, it is possible to "separate" in (1) the thresholds $T_i$ from the numbers $u_i$ used to build the multiplicative matrix $M_{ij}$. Let us use the earlier-introduced vector $\mathbf{v}$ and consider the neural net

$$J_{ij} = f(x)\,(1-\delta_{ij})\,u_i u_j, \qquad T_i = g(x)\,v_i,$$
$$s_i(t+1) = \operatorname{sgn}\!\Big(\sum_{j=1}^{p} J_{ij}\, s_j - T_i\Big).$$
Tentative considerations show that its ground state is formed by the $k$-class configurations nearest to the vector difference $\mathbf{u} - \mathbf{v}$. In other words, the trick allows us to avoid the total symmetry of the ground state. Of course, these results need closer research.
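The separated-threshold net admits the same kind of sketch; again, the vectors and the stand-ins for $f(x)$ and $g(x)$ are illustrative assumptions:

```python
import numpy as np

p = 8
u = np.ones(p)                         # hypothetical vector u
v = np.array([1, 1, 1, 1, -1, -1, -1, -1], dtype=float)  # and v

f_x, g_x = 1.0, 0.5                    # illustrative stand-ins for f(x), g(x)

# "Separated" net: u builds the multiplicative matrix, v the thresholds.
J = f_x * np.outer(u, u)
np.fill_diagonal(J, 0.0)               # the (1 - delta_ij) factor
T = g_x * v

def step(s):
    """One synchronous update: s_i(t+1) = sgn(sum_j J_ij s_j - T_i)."""
    return np.where(J @ s - T >= 0, 1.0, -1.0)

s = u.copy()
s[7] = -1.0                            # corrupt one spin
for _ in range(20):
    s_next = step(s)
    if np.array_equal(s_next, s):
        break
    s = s_next

print(int(s @ u))  # → 8: the fixed point coincides with u here
```

With this particular choice the fixed point $\mathbf{u}$ does belong to the class of configurations nearest to $\mathbf{u} - \mathbf{v}$, consistent with the tentative statement above, though a single toy run proves nothing by itself.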
The memory of the standard Hopfield model with the Hebbian connection matrix and random, independent patterns $\mathbf{s}^{(\mu)}$ is well understood. However, if the connection matrix is of a general form, the memory of such a network is practically unknown. At the same time, an arbitrary connection matrix $\mathbf{J}$ can be presented in a quasi-Hebbian form using: i) orthogonal vectors $\mathbf{u}^{(\mu)}$ related to the eigenvectors of the matrix $\mathbf{J}$,

$$J_{ij} = \sum_{\mu=1}^{p} \lambda_\mu\,(1-\delta_{ij})\, u_i^{(\mu)} u_j^{(\mu)}, \qquad \mathbf{J} \sim \sum_\mu \lambda_\mu\, \mathbf{u}^{(\mu)+} \mathbf{u}^{(\mu)},$$

where $\mathbf{u}^{(\mu)} = (u_1^{(\mu)}, \ldots, u_p^{(\mu)})$, $u_i^{(\mu)} \in \mathbf{R}^1$, and $\mathbf{u}^{(\mu)} \mathbf{u}^{(\nu)} = 0$ for $\mu \neq \nu$;
ii) or configuration vectors $\mathbf{s}^{(\mu)}$ with weights $r_\mu$ (Kryzhanovsky, 2007):

$$\mathbf{J} \sim \sum_{\mu=1} r_\mu\, \mathbf{s}^{(\mu)+} \mathbf{s}^{(\mu)}, \qquad s_i^{(\mu)} = \pm 1, \quad r_\mu \in \mathbf{R}^1.$$
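Variant i) of the expansion can be checked numerically: for a symmetric matrix with zero diagonal, the eigendecomposition supplies mutually orthogonal vectors whose weighted outer products rebuild $\mathbf{J}$. A small sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
p = 6

# An arbitrary symmetric connection matrix with zero diagonal.
J = rng.normal(size=(p, p))
J = (J + J.T) / 2.0
np.fill_diagonal(J, 0.0)

# Eigendecomposition J = sum_mu lam_mu * outer(e_mu, e_mu): the
# eigenvectors play the role of the orthogonal vectors u^(mu).
lam, E = np.linalg.eigh(J)
J_rebuilt = sum(lam[m] * np.outer(E[:, m], E[:, m]) for m in range(p))

# The weighted outer products of the eigenvectors rebuild J exactly
# (its diagonal is already zero, so no extra correction is needed).
assert np.allclose(J, J_rebuilt)
# The vectors are mutually orthogonal: u^(mu) . u^(nu) = 0 for mu != nu.
assert np.allclose(E.T @ E, np.eye(p))
```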
Our multiplicative matrix $\mathbf{M}$ is only one term of the quasi-Hebbian expansion. We hope that a detailed analysis of the network with the connection matrix $\mathbf{M}$ will allow us to make headway in investigating the more general case.
Multiplicative Neural Network with Thresholds