Robust Stability Analysis of a Class of Delayed Neural Networks

Neyir Ozcan^1 and Sabri Arik^2

^1 Istanbul University, Department of Electrical and Electronics Engineering, 34320 Avcilar, Istanbul, Turkey
^2 Isik University, Department of Electrical and Electronics Engineering, 34980 Sile, Istanbul, Turkey
Keywords:
Neural Networks, Delayed Systems, Lyapunov Functionals, Stability Analysis.
Abstract:
This paper studies the global robust stability of delayed neural networks. A new sufficient condition that
ensures the existence, uniqueness and global robust asymptotic stability of the equilibrium point is presented.
The condition is derived by using the Lyapunov stability and homeomorphic mapping theorems and by
employing Lipschitz activation functions. The result establishes a relationship between the network
parameters of the neural system that is independent of the time delays. We show that our result is new and
improves some of the previous global robust stability results reported for delayed neural networks.
1 INTRODUCTION
In recent years, neural networks have proved to be useful systems that have been successfully applied to
various practical engineering problems such as optimization, image and signal processing, and associative
memory design. In the design of neural networks for solving practical problems, the key factor associated
with the dynamical behavior of neural networks is the characterization of the equilibrium point in terms of
the network parameters and activation functions. In some special applications of neural networks, such as
designing neural networks for solving optimization problems, the equilibrium point of the designed neural
network must be unique and globally asymptotically stable. On the other hand, when an electronically
implemented neural network is used in real-time applications, we may be faced with two undesired physical
events that affect the dynamics of neural networks. The first is the time delays that occur during signal
transmission between the neurons; the other is the parameter deviations due to the tolerances of the
electronic components used in the implementation of neural networks. The reader can find detailed robust
stability analyses of delayed neural networks under various assumptions on the activation functions, and
various robust stability conditions for different classes of neural networks, in (Arik and Tavsanoglu, 2000);
(Cao and Wang, 2003); (Cao and Wang, 2005); (Ensari and Arik, 2010); (Forti and Tesi, 1995); (Li et al.,
2003); (Liao and Wang, 2000); (Liao et al., 2002); (Liao and Yu, 1998); (Liao et al., 2001); (Mohamad,
2001); (Ozcan and Arik, 2006); (Singh, 2007); (Sun and Feng, 2003); (Wang and Michel, 1996); (Yi and
Tan, 2002).
The neural network model we consider in this pa-
per is described by the following equations:
dx_i(t)/dt = -c_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij} f_j(x_j(t - \tau_j)) + u_i,  i = 1, 2, ..., n   (1)

where n is the number of neurons, x_i(t) denotes the state of neuron i at time t, f_i(\cdot) denote the
activation functions, a_{ij} and b_{ij} are the weight coefficients, \tau_j are the delay parameters, u_i is the
constant input to neuron i, and c_i is the charging rate of neuron i.
Neural network model (1) can be written in the vector-matrix form as follows:

\dot{x}(t) = -Cx(t) + Af(x(t)) + Bf(x(t - \tau)) + u   (2)

where x(t) = (x_1(t), x_2(t), ..., x_n(t))^T \in R^n, C = diag(c_i > 0) is a positive diagonal matrix,
A = (a_{ij})_{n \times n}, B = (b_{ij})_{n \times n}, u = (u_1, u_2, ..., u_n)^T,
f(x(t)) = (f_1(x_1(t)), f_2(x_2(t)), ..., f_n(x_n(t)))^T and
f(x(t - \tau)) = (f_1(x_1(t - \tau_1)), f_2(x_2(t - \tau_2)), ..., f_n(x_n(t - \tau_n)))^T.
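To make the dynamics of model (2) concrete, the system can be integrated numerically. The sketch below is a minimal forward-Euler simulation with a constant initial history; all parameter values, the tanh activation and the common delay are illustrative assumptions, not data from the paper.

```python
import numpy as np

# Hypothetical parameters for a 2-neuron instance of model (2);
# all values are illustrative, not taken from the paper.
C = np.diag([2.0, 2.5])                  # charging rates c_i > 0
A = np.array([[0.2, -0.1], [0.1, 0.3]])  # instantaneous weights a_ij
B = np.array([[0.1, 0.05], [-0.1, 0.2]]) # delayed weights b_ij
u = np.array([0.5, -0.3])                # constant inputs u_i
tau = 0.5                                # common delay tau_j = tau
f = np.tanh                              # Lipschitz activation with mu_i = 1

def simulate(T=20.0, h=0.01):
    """Forward-Euler integration of x'(t) = -Cx(t) + Af(x(t)) + Bf(x(t-tau)) + u."""
    steps = int(T / h)
    d = int(tau / h)                     # delay measured in integration steps
    x = np.zeros((steps + 1, 2))         # constant zero initial history
    for k in range(steps):
        x_del = x[k - d] if k >= d else x[0]
        x[k + 1] = x[k] + h * (-C @ x[k] + A @ f(x[k]) + B @ f(x_del) + u)
    return x

traj = simulate()
print("final state:", traj[-1])
```

With these dominant charging rates the trajectory settles to a unique fixed point, consistent with the delay-independent character of the stability conditions studied in this paper.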
[Ozcan, N. and Arik, S. Robust Stability Analysis of a Class of Delayed Neural Networks.
DOI: 10.5220/0004090506030606. In Proceedings of the 4th International Joint Conference on Computational
Intelligence (NCTA-2012), pages 603-606. ISBN: 978-989-8565-33-4. Copyright (c) 2012 SCITEPRESS
(Science and Technology Publications, Lda.)]

It will be assumed that the matrices C, A and B in (2) are uncertain, but that their elements have known
lower and upper bounds. That is to say, C, A and B are assumed to have the parameter ranges defined as
follows:
C_I := \{C = diag(c_i) : 0 < \underline{C} \le C \le \overline{C}, \text{ i.e., } 0 < \underline{c}_i \le c_i \le \overline{c}_i\}
A_I := \{A = (a_{ij}) : \underline{A} \le A \le \overline{A}, \text{ i.e., } \underline{a}_{ij} \le a_{ij} \le \overline{a}_{ij}\}   (3)
B_I := \{B = (b_{ij}) : \underline{B} \le B \le \overline{B}, \text{ i.e., } \underline{b}_{ij} \le b_{ij} \le \overline{b}_{ij}\}
We also assume that the f_i(\cdot) are Lipschitz continuous, i.e., there exist positive constants \mu_i > 0
such that

|f_i(x) - f_i(y)| \le \mu_i |x - y|,  i = 1, 2, ..., n,  \forall x, y \in R, x \ne y.

The class of Lipschitz activation functions is denoted by f \in L.
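For instance, the commonly used activation f_i(x) = tanh(x) belongs to L with \mu_i = 1, since |tanh(x) - tanh(y)| \le |x - y|. A quick numerical sanity check of this bound (a Python sketch, not part of the paper):

```python
import numpy as np

# Verify empirically that tanh has Lipschitz constant mu = 1: sample random
# pairs and check that every difference quotient stays at or below 1.
rng = np.random.default_rng(1)
x, y = rng.normal(size=10000), rng.normal(size=10000)
mask = np.abs(x - y) > 1e-9               # avoid division by (near-)zero
q = np.abs(np.tanh(x) - np.tanh(y))[mask] / np.abs(x - y)[mask]
print("largest sampled difference quotient:", q.max())
```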
The following lemmas will play an important role in determining the sufficient condition for the global
robust asymptotic stability of the equilibria of neural networks (1) and (2):
Lemma 1 (Cao and Wang, 2005). Let the matrices A and B in (3) be defined in the intervals
A \in [\underline{A}, \overline{A}] and B \in [\underline{B}, \overline{B}]. Then, the following inequalities hold:

||A||_2 \le ||A^*||_2 + ||A_*||_2,  ||B||_2 \le ||B^*||_2 + ||B_*||_2

where A^* = \frac{1}{2}(\overline{A} + \underline{A}), A_* = \frac{1}{2}(\overline{A} - \underline{A}),
B^* = \frac{1}{2}(\overline{B} + \underline{B}) and B_* = \frac{1}{2}(\overline{B} - \underline{B}).
Lemma 2 (Ensari and Arik, 2010). Let the matrices A and B in (3) be defined in the intervals
A \in [\underline{A}, \overline{A}] and B \in [\underline{B}, \overline{B}]. Then, the following inequalities hold:

||A||_2 \le \sqrt{||A^*||_2^2 + ||A_*||_2^2 + 2||A_*^T |A^*| \, ||_2}
||B||_2 \le \sqrt{||B^*||_2^2 + ||B_*||_2^2 + 2||B_*^T |B^*| \, ||_2}

where A^* = \frac{1}{2}(\overline{A} + \underline{A}), A_* = \frac{1}{2}(\overline{A} - \underline{A}),
B^* = \frac{1}{2}(\overline{B} + \underline{B}) and B_* = \frac{1}{2}(\overline{B} - \underline{B}).
Lemma 3 (Singh, 2007). Let the matrices A and B in (3) be defined in the intervals
A \in [\underline{A}, \overline{A}] and B \in [\underline{B}, \overline{B}]. Then, the following inequalities hold:

||A||_2 \le ||\hat{A}||_2,  ||B||_2 \le ||\hat{B}||_2

where \hat{A} = (\hat{a}_{ij})_{n \times n} with \hat{a}_{ij} = max\{|\underline{a}_{ij}|, |\overline{a}_{ij}|\} and
\hat{B} = (\hat{b}_{ij})_{n \times n} with \hat{b}_{ij} = max\{|\underline{b}_{ij}|, |\overline{b}_{ij}|\}.
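The three bounds of Lemmas 1-3 can be checked numerically by sampling matrices from an interval family. In the Python sketch below, the interval bounds are illustrative assumptions (not from the paper); the spectral norm of every sampled matrix should stay below each lemma bound.

```python
import numpy as np

rng = np.random.default_rng(0)
norm2 = lambda M: np.linalg.norm(M, 2)   # spectral norm ||.||_2

# Illustrative interval bounds (assumptions): A_low <= A <= A_up entrywise.
A_low = np.array([[-1.0, 0.2], [0.0, -0.5]])
A_up  = np.array([[ 0.5, 0.8], [0.7,  0.4]])

A_star  = 0.5 * (A_up + A_low)           # midpoint A^* of the interval
A_delta = 0.5 * (A_up - A_low)           # radius   A_* of the interval

bound1 = norm2(A_star) + norm2(A_delta)                      # Lemma 1
bound2 = np.sqrt(norm2(A_star)**2 + norm2(A_delta)**2
                 + 2 * norm2(A_delta.T @ np.abs(A_star)))    # Lemma 2
bound3 = norm2(np.maximum(np.abs(A_low), np.abs(A_up)))      # Lemma 3

# Sample matrices from the interval; each bound must dominate ||A||_2.
worst = max(norm2(A_low + rng.random(A_low.shape) * (A_up - A_low))
            for _ in range(1000))
print(f"bounds: {bound1:.4f} {bound2:.4f} {bound3:.4f}, largest sampled norm: {worst:.4f}")
```

None of the three bounds dominates the others in general, which is why the main result below takes their minimum.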
Lemma 4 (Forti and Tesi, 1995). If H(x) \in C^0 satisfies the following conditions:

(i) H(x) \ne H(y) for all x \ne y,
(ii) ||H(x)|| \to \infty as ||x|| \to \infty,

then H(x) is a homeomorphism of R^n.
We also make use of the following vector and matrix norms in the proof of our main result. Let
v = (v_1, v_2, ..., v_n)^T \in R^n and W = (w_{ij})_{n \times n}. Then, we have

||v||_2 = \left( \sum_{i=1}^{n} |v_i|^2 \right)^{1/2},  ||W||_2 = [\lambda_{max}(W^T W)]^{1/2}

Throughout this paper, for v = (v_1, v_2, ..., v_n)^T \in R^n, |v| will denote |v| = (|v_1|, |v_2|, ..., |v_n|)^T.
For any matrix W = (w_{ij})_{n \times n}, |W| = (|w_{ij}|)_{n \times n}. If W is positive definite, then
\lambda_m(W) and \lambda_M(W) will denote the minimum and maximum eigenvalues of W, respectively.
2 GLOBAL ASYMPTOTIC
ROBUST STABILITY ANALYSIS
In this section, we present a new sufficient condition for the existence, uniqueness and global robust
stability of the equilibrium point of the neural system (1). We proceed with the following result:
Theorem 1: Let f \in L. Then, the neural network model (2) is globally asymptotically robust stable if the
following condition holds:

\Delta = r - ||P||_2 - ||Q||_2 > 0

where r = c_m / \mu_M with c_m = min(\underline{c}_i) and \mu_M = max(\mu_i), and

||P||_2 = min\{ ||A^*||_2 + ||A_*||_2, \sqrt{||A^*||_2^2 + ||A_*||_2^2 + 2||A_*^T |A^*| \, ||_2}, ||\hat{A}||_2 \}
||Q||_2 = min\{ ||B^*||_2 + ||B_*||_2, \sqrt{||B^*||_2^2 + ||B_*||_2^2 + 2||B_*^T |B^*| \, ||_2}, ||\hat{B}||_2 \}
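The condition of Theorem 1 is directly computable from the interval data. The following Python sketch (all interval bounds, charging-rate bounds and Lipschitz constants are hypothetical values chosen for illustration) evaluates \Delta using the smallest of the three lemma bounds for ||P||_2 and ||Q||_2.

```python
import numpy as np

norm2 = lambda M: np.linalg.norm(M, 2)   # spectral norm ||.||_2

def min_norm_bound(M_low, M_up):
    """Smallest of the Lemma 1-3 upper bounds on ||M||_2 over the interval [M_low, M_up]."""
    M_star, M_delta = 0.5 * (M_up + M_low), 0.5 * (M_up - M_low)
    b1 = norm2(M_star) + norm2(M_delta)
    b2 = np.sqrt(norm2(M_star)**2 + norm2(M_delta)**2
                 + 2 * norm2(M_delta.T @ np.abs(M_star)))
    b3 = norm2(np.maximum(np.abs(M_low), np.abs(M_up)))
    return min(b1, b2, b3)

# Illustrative interval data (assumptions, not from the paper).
A_low, A_up = np.array([[-0.3, 0.1], [0.0, -0.2]]), np.array([[0.2, 0.3], [0.2, 0.1]])
B_low, B_up = np.array([[-0.2, 0.0], [0.1, -0.1]]), np.array([[0.1, 0.2], [0.2, 0.2]])
c_low = np.array([2.0, 2.5])   # lower bounds on the charging rates c_i
mu = np.array([1.0, 1.0])      # Lipschitz constants of f_i (e.g. tanh)

r = c_low.min() / mu.max()
P, Q = min_norm_bound(A_low, A_up), min_norm_bound(B_low, B_up)
delta = r - P - Q
print("Delta =", delta, "-> condition satisfied" if delta > 0 else "-> test inconclusive")
```

For this particular interval family \Delta > 0, so Theorem 1 guarantees a unique, globally asymptotically stable equilibrium for every admissible choice of C, A and B, whatever the delays.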
Proof: For the map

H(x) = -Cx + Af(x) + Bf(x) + u

we have

H(x) - H(y) = -C(x - y) + A(f(x) - f(y)) + B(f(x) - f(y))

If we multiply both sides of this equality by (x - y)^T, then we get:

(x - y)^T (H(x) - H(y)) = -(x - y)^T C(x - y) + (x - y)^T A(f(x) - f(y)) + (x - y)^T B(f(x) - f(y))
  \le -c_m ||x - y||_2^2 + (||A||_2 + ||B||_2) ||x - y||_2 ||f(x) - f(y)||_2
IJCCI2012-InternationalJointConferenceonComputationalIntelligence
604
The fact that ||f(x) - f(y)||_2 \le \mu_M ||x - y||_2 implies

(x - y)^T (H(x) - H(y)) \le -c_m ||x - y||_2^2 + \mu_M (||A||_2 + ||B||_2) ||x - y||_2^2
Since ||A||_2 \le ||P||_2 and ||B||_2 \le ||Q||_2, we obtain

(x - y)^T (H(x) - H(y)) \le -c_m ||x - y||_2^2 + \mu_M (||P||_2 + ||Q||_2) ||x - y||_2^2

which is equivalent to

\frac{1}{\mu_M} (x - y)^T (H(x) - H(y)) \le -(r - (||P||_2 + ||Q||_2)) ||x - y||_2^2 = -\Delta ||x - y||_2^2   (4)

implying that

(x - y)^T (H(x) - H(y)) < 0,  \forall x \ne y

from which it can be directly concluded that H(x) \ne H(y) when x \ne y.
In order to show that ||H(x)|| \to \infty as ||x|| \to \infty, we let y = 0 in (4), in which case we can write

\frac{1}{\mu_M} x^T (H(x) - H(0)) \le -\Delta ||x||_2^2

from which one can derive that

||x||_\infty ||H(x) - H(0)||_1 \ge \Delta \mu_M ||x||_2^2

Using ||x||_\infty \le ||x||_2 and ||H(x) - H(0)||_1 \le ||H(x)||_1 + ||H(0)||_1, it follows that
||H(x)||_1 \ge \Delta \mu_M ||x||_2 - ||H(0)||_1. Since ||H(0)||_1 is finite, we conclude that
||H(x)|| \to \infty as ||x|| \to \infty. Hence, under the condition of Theorem 1, neural network (1) has a
unique equilibrium point x^*.
We will now simplify system (1) as follows: we let z_i(\cdot) = x_i(\cdot) - x_i^*, i = 1, 2, ..., n, and note
that the z_i(\cdot) are governed by:

\dot{z}_i(t) = -c_i z_i(t) + \sum_{j=1}^{n} a_{ij} g_j(z_j(t)) + \sum_{j=1}^{n} b_{ij} g_j(z_j(t - \tau_{ij})),  i = 1, 2, ..., n   (5)

where g_i(z_i(\cdot)) = f_i(z_i(\cdot) + x_i^*) - f_i(x_i^*), i = 1, 2, ..., n. It can easily be verified that the
functions g_i satisfy the assumptions on the f_i, i.e., f \in L implies that g \in L. We also note that
g_i(0) = 0, i = 1, 2, ..., n. It is thus sufficient to prove the stability of the origin of the transformed
system (5) instead of considering the stability of x^* of system (1).
For \tau_{ij} = \tau_j, (5) can be expressed in the matrix-vector form as follows:

\dot{z}(t) = -Cz(t) + Ag(z(t)) + Bg(z(t - \tau))   (6)

where z(t) = (z_1(t), z_2(t), ..., z_n(t))^T \in R^n is the state vector of the transformed neural system,
g(z(t)) = (g_1(z_1(t)), g_2(z_2(t)), ..., g_n(z_n(t)))^T and
g(z(t - \tau)) = (g_1(z_1(t - \tau_1)), g_2(z_2(t - \tau_2)), ..., g_n(z_n(t - \tau_n)))^T.
Now construct the following positive definite Lyapunov functional:

V(z(t)) = z^T(t) z(t) + k \sum_{i=1}^{n} \int_{t - \tau_i}^{t} z_i^2(\zeta) d\zeta

where k is a positive constant to be determined later. The time derivative of the functional along the
trajectories of system (6) is obtained as follows:
\dot{V}(z(t)) = -2z^T(t) C z(t) + 2z^T(t) A g(z(t)) + 2z^T(t) B g(z(t - \tau)) + k ||z(t)||_2^2 - k ||z(t - \tau)||_2^2
  \le -2c_m ||z(t)||_2^2 + 2||A||_2 ||z(t)||_2 ||g(z(t))||_2 + 2||B||_2 ||z(t)||_2 ||g(z(t - \tau))||_2 + k ||z(t)||_2^2 - k ||z(t - \tau)||_2^2
  \le -2c_m ||z(t)||_2^2 + 2\mu_M ||A||_2 ||z(t)||_2^2 + 2\mu_M ||B||_2 ||z(t)||_2 ||z(t - \tau)||_2 + k ||z(t)||_2^2 - k ||z(t - \tau)||_2^2
  \le -2c_m ||z(t)||_2^2 + 2\mu_M ||A||_2 ||z(t)||_2^2 + \mu_M ||B||_2 ||z(t)||_2^2 + \mu_M ||B||_2 ||z(t - \tau)||_2^2 + k ||z(t)||_2^2 - k ||z(t - \tau)||_2^2
  \le -2c_m ||z(t)||_2^2 + 2\mu_M ||P||_2 ||z(t)||_2^2 + \mu_M ||Q||_2 ||z(t)||_2^2 + \mu_M ||Q||_2 ||z(t - \tau)||_2^2 + k ||z(t)||_2^2 - k ||z(t - \tau)||_2^2
Letting k = \mu_M ||Q||_2 results in

\dot{V}(z(t)) \le -2(c_m - \mu_M ||P||_2 - \mu_M ||Q||_2) ||z(t)||_2^2
             = -2\mu_M (r - ||P||_2 - ||Q||_2) ||z(t)||_2^2
             = -2\mu_M \Delta ||z(t)||_2^2
It is easy to see that \dot{V}(z(t)) < 0 for all z(t) \ne 0, and \dot{V}(z(t)) = 0 if and only if z(t) = 0.
In addition, V(z(t)) is radially unbounded since V(z(t)) \to \infty as ||z(t)|| \to \infty. Thus, it follows
that the origin of system (6), or equivalently the equilibrium point of system (2), is globally asymptotically
stable.
RobustStabilityAnalysisofaClassofDelayedNeuralNetworks
605
We will now compare the result obtained in Theorem 1 with a previously reported corresponding stability
result, which is given in the following:

Theorem 2 (Ozcan and Arik, 2006). Let f \in L. Then, the neural network model (2) is globally
asymptotically robust stable if

\sigma = r - (||A^*||_2 + ||A_*||_2 + ||B^*||_2 + ||B_*||_2) > 0

where r = c_m / \mu_M with c_m = min(\underline{c}_i) and \mu_M = max(\mu_i).

Since ||P||_2 \le ||A^*||_2 + ||A_*||_2 and ||Q||_2 \le ||B^*||_2 + ||B_*||_2, the condition of Theorem 2
implies the condition of Theorem 1; that is, the result of Theorem 2 can be considered a special case of the
result of Theorem 1.
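This relationship can also be observed numerically: because ||P||_2 and ||Q||_2 take the minimum over the three lemma bounds, \Delta \ge \sigma always holds, so Theorem 1 is never weaker than Theorem 2. A Python sketch with hypothetical interval data (not taken from either paper):

```python
import numpy as np

norm2 = lambda M: np.linalg.norm(M, 2)   # spectral norm ||.||_2

# Illustrative interval bounds (assumptions, not from the paper).
A_low, A_up = np.array([[-0.8, 0.0], [0.1, -0.6]]), np.array([[0.4, 0.9], [0.8, 0.5]])
B_low, B_up = np.array([[-0.5, 0.1], [0.0, -0.4]]), np.array([[0.6, 0.7], [0.5, 0.3]])
r = 2.0  # hypothetical value of c_m / mu_M

def bounds(M_low, M_up):
    """Return (min of the Lemma 1-3 bounds, the Lemma 1 bound used by Theorem 2)."""
    M_star, M_delta = 0.5 * (M_up + M_low), 0.5 * (M_up - M_low)
    b1 = norm2(M_star) + norm2(M_delta)
    b2 = np.sqrt(norm2(M_star)**2 + norm2(M_delta)**2
                 + 2 * norm2(M_delta.T @ np.abs(M_star)))
    b3 = norm2(np.maximum(np.abs(M_low), np.abs(M_up)))
    return min(b1, b2, b3), b1

P, sumA = bounds(A_low, A_up)
Q, sumB = bounds(B_low, B_up)
delta = r - P - Q               # Theorem 1 condition
sigma = r - (sumA + sumB)       # Theorem 2 condition
print(f"delta = {delta:.4f}, sigma = {sigma:.4f}")
```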
3 CONCLUSIONS
By using a proper Lyapunov functional, we have obtained an easily verifiable, delay-independent sufficient
condition for the global robust stability of the equilibrium point. We have also compared our result with
corresponding robust stability results published in the previous literature, showing that our condition is new
and generalizes previously reported results.
REFERENCES
Arik, S. and Tavsanoglu, V. (2000). On the global asymp-
totic stability of delayed cellular neural networks.
IEEE Trans. Circuits and Syst.I, 47(5):571–574.
Cao, J. and Wang, J. (2003). Global asymptotic stability
of a general class of recurrent neural networks with
time-varying delays. IEEE Trans. Circuits and Syst.I,
50:34–44.
Cao, J. and Wang, J. (2005). Global asymptotic and ro-
bust stability of recurrent neural networks with time
delays. IEEE Trans. Circuits and Syst.I, 52:417–426.
Ensari, T. and Arik, S. (2010). New results for robust stability of dynamical neural networks with discrete
time delays. Expert Systems with Applications, 27:5925–5930.
Forti, M. and Tesi, A. (1995). New conditions for global
stability of neural networks with applications to linear
and quadratic programming problems. IEEE Trans.
Circuits Syst., 42(7):354–365.
Li, X. M., Huang, L. H., and Zhu, H. (2003). Global stability of cellular neural networks with constant and
variable delays. Nonlinear Analysis, 53:319–333.
Liao, T.-L. and Wang, F. C. (2000). Global stability for
cellular neural networks with time delay. IEEE Trans.
on Neural Networks, 11:1481–1485.
Liao, X., Chen, G., and Sanchez, E. N. (2002). LMI-based approach for asymptotic stability analysis of
delayed neural networks. IEEE Trans. Circuits and Syst. I, 49:1033–1039.
Liao, X. F., Wong, K. W., Wu, Z., and Chen, G. (2001). Novel robust stability criteria for interval-delayed
Hopfield neural networks. IEEE Trans. Circuits and Syst. I, 48:1355–1359.
Liao, X. F. and Yu, J. (1998). Robust stability for interval Hopfield neural networks with time delay. IEEE
Trans. Neural Networks, 9:1042–1045.
Mohamad, S. (2001). Global exponential stability in
continuous-time and discrete-time delayed bidirec-
tional neural networks. Physica D, 159:233–251.
Ozcan, N. and Arik, S. (2006). Global robust stability
analysis of neural networks with multiple time delays.
IEEE Trans. Circuits and Syst.I, 53(1):166–176.
Singh, V. (2007). Global robust stability of delayed neural networks: Estimating upper limit of norm of
delayed connection weight matrix. Chaos, Solitons and Fractals, 32:259–263.
Sun, C. and Feng, C. B. (2003). Global robust exponen-
tial stability of interval neural networks with delays.
Neural Processing Letters, 17:107–115.
Wang, K. and Michel, A. N. (1996). On the stability of a family of nonlinear time-varying systems. IEEE
Trans. Circuits Syst., 43(7):517–531.
Yi, Z. and Tan, K. (2002). Dynamic stability conditions for Lotka-Volterra recurrent neural networks with
delays. Physical Review E, 66:011910.
IJCCI2012-InternationalJointConferenceonComputationalIntelligence
606