[Figure 4: Local search process of $F_H$.]
AS2s in 100 trials for different initial states. Table 2 shows the SR/#ITE of the local search after a successful global search. We can see that the NDPSO can find all the AS2s speedily.
Table 1: SR of global search of $F_H$ for $t_{max1} = 50$.

            $C_1=3$  $C_1=5$  $C_1=10$  $C_1=20$  $C_1=30$
$m_1=32$       20       36       59        79        84
$m_1=64$       32       53       74        92        94
$m_1=128$      46       61       80        91        95
Table 2: SR/#ITE of local search of $F_H$ for $t_{max1} = 50$.

            $C_1=3$    $C_1=5$    $C_1=10$   $C_1=20$   $C_1=30$
$m_1=32$   100/5.30   100/5.60   100/5.46   100/5.55   100/5.57
$m_1=64$   100/5.01   100/5.34   100/5.41   100/5.43   100/5.49
$m_1=128$  100/5.08   100/5.20   100/5.36   100/5.42   100/5.55
Table 3: SR of global search of $F_H$ for $C_1 = 5$.

                 $m_1=32$  $m_1=64$  $m_1=128$
$t_{max1}=10$        8         8          8
$t_{max1}=30$       31        40         47
$t_{max1}=50$       36        53         61
$t_{max1}=100$       -        57         70
$t_{max1}=200$       -        60         71
$t_{max1}=400$       -         -         71
$t_{max1}=800$       -         -         72
Table 3 shows the SR of the global search for various $t_{max1}$ and $m_1$. The SR increases as $t_{max1}$ increases. For $m_1 = 64$ and $128$, the SR saturates, and $t_{max1} = 50$ (or 100) is sufficient for reasonable results. The parameter $t_{max1}$ thus controls the trade-off between the SR and the computation cost.
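For concreteness, the following is a minimal sketch (ours, not the authors' code) of how SR and SR/#ITE statistics of this kind could be collected over 100 trials for a given $t_{max1}$. The routines toy_global_search, toy_local_search and run_trials are hypothetical stand-ins that illustrate the measurement protocol only, not the NDPSO itself.

```python
# Hypothetical measurement harness, assuming a two-stage search:
# a global search that should discover all local-search regions (LSRs),
# followed by a local search in each LSR. The toy_* routines are
# stand-ins for illustration only; they are NOT the NDPSO.
import random

def toy_global_search(t_max1, n_regions=4):
    """Stand-in global search: each iteration may 'discover' one LSR."""
    found = set()
    for _ in range(t_max1):
        found.add(random.randrange(n_regions))
    return found

def toy_local_search(region, max_ite=20):
    """Stand-in local search: returns (success, number of iterations used)."""
    ite = random.randint(3, 8)
    return ite <= max_ite, ite

def run_trials(n_trials=100, t_max1=50, n_regions=4):
    """Returns (global SR, local SR, average #ITE) in the style of Tables 1-3."""
    g_ok = l_ok = 0
    ite_sum = 0.0
    for _ in range(n_trials):
        lsrs = toy_global_search(t_max1, n_regions)
        if len(lsrs) < n_regions:
            continue                      # global search missed an LSR
        g_ok += 1
        results = [toy_local_search(r) for r in lsrs]
        if all(ok for ok, _ in results):  # all target solutions found
            l_ok += 1
            ite_sum += sum(it for _, it in results) / len(results)
    global_sr = 100.0 * g_ok / n_trials
    local_sr = 100.0 * l_ok / g_ok if g_ok else float("nan")
    avg_ite = ite_sum / l_ok if l_ok else float("nan")
    return global_sr, local_sr, avg_ite

if __name__ == "__main__":
    for t in (10, 30, 50, 100):
        print(t, run_trials(t_max1=t))
```

In such a harness, raising $t_{max1}$ gives the global search more chances to cover every LSR at the cost of more iterations, which is the trade-off reported in Table 3.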
4 CONCLUSIONS
This paper has presented the NDPSO and investigated its capability. Basic numerical experiments have been performed, and the results suggest the following.
1. The parameters $m_1$ and $C_1$ can control the roughness of the global search, which is important for finding all the LSRs successfully. Higher resolution encourages trapping, and a suitable degree of roughness seems to exist.
2. Parallel processing of the local search in the LSRs is fundamental to an efficient search. If the LSRs can be constructed, the AS2s can be found speedily and reliably (a minimal illustrative sketch is given after this list).
3. The discretization is fundamental to realizing a reliable and robust search in both software and hardware.
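As a minimal illustration of item 2 (our sketch, not code from the paper): once the LSRs are known, the local searches are mutually independent and can be dispatched in parallel; local_search and refine_all below are hypothetical placeholders for the actual discrete local search.

```python
# Hypothetical parallel dispatch of per-LSR local searches.
# local_search() is a placeholder, not the NDPSO's discrete local search.
from concurrent.futures import ProcessPoolExecutor

def local_search(lsr_center):
    """Refine one solution inside a single LSR (placeholder)."""
    return lsr_center  # a real implementation would iterate the swarm here

def refine_all(lsr_centers, workers=4):
    """Run one local search per LSR concurrently and collect the results."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(local_search, lsr_centers))

if __name__ == "__main__":
    # e.g. four LSR centers found by the global search
    print(refine_all([0.1, 0.4, 0.6, 0.9]))
```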
Future problems are many, including analysis of the search process, analysis of the role of the parameters, comparison with various PSOs (Engelbrecht, 2005; Miyagawa and Saito, 2009), and application to practical problems (Valle et al., 2008; Kawamura and Saito, 2010).
REFERENCES
Engelbrecht, A. P. (2005). Fundamentals of computational swarm intelligence. Wiley.
Garro, B. A., Sossa, H., and Vazquez, R. A. (2009). Design
of artificial neural networks using a modified particle
swarm optimization algorithm. In Proc. IEEE-INNS
Joint Conf. Neural Netw., pages 938–945.
Kawamura, K. and Saito, T. (2010). Design of switching
circuits based on particle swarm optimizer and hy-
brid fitness function. In Proc. Annual Conf. IEEE Ind.
Electron. Soc., pages 1099–1103.
Miyagawa, E. and Saito, T. (2009). Particle swarm optimiz-
ers with growing tree topology. IEICE Trans. Funda-
mentals, E92-A:2275–2282.
Parsopoulos, K. E. and Vrahatis, M. N. (2004). On the
computation of all global minimizers through parti-
cle swarm optimization. IEEE Trans. Evol. Comput.,
8(3):211–224.
Sevkli, Z. and Sevilgen, F. E. (2010). Discrete particle
swarm optimization for the orienteering problem. In
Proc. IEEE Congress Evol. Comput., pages 1973–
1944.
Valle, Y., Venayagamoorthy, G. K., Mohagheghi, S., Her-
nandez, J.-C., and Harley, R. G. (2008). Particle
swarm optimization: basic concepts, variants and ap-
plications in power systems. IEEE Trans. Evol. Com-
put., 12(2):171–195.
Wachowiak, M. P., Smolikova, R., Zheng, Y., and Zurada,
J. M. (2004). An approach to multimodal biomedical
image registration utilizing particle swarm optimiza-
tion. IEEE Trans. Evol. Comput., 8(3):289–301.
Yang, S. and Li, C. (2010). A clustering particle swarm
optimizer for locating and tracking multiple optima in
dynamic environments. IEEE Trans. Evol. Comput.,
14(6):959–974.
Yang, X.-S. and Deb, S. (2010). Eagle strategy using levy
walk and firefly algorithms for stochastic optimiza-
tion. Nature Inspired Cooperative Strategies for Opti-
mization, 284:101–111.