parents). Alternatively, the parameters of genetic operators might be included in a chromosome, evolve over the iterations, and be applied to generate the next candidate solution (Pellerin et al., 2004).
However, the booming interest in self-adaptation has produced many proposed techniques, which again raises the problem of choice. Moreover, in some approaches, self-adaptive operators use a number of thresholds to switch between different types of a genetic operator, and these thresholds must also be selected properly. On the other hand, given the impressive computing power available nowadays, it has become possible to test various settings of an algorithm in parallel, which might be an alternative approach to self-adaptation.
Nevertheless, some studies have shown that different types of genetic operators are beneficial for the search at different stages of optimization (Tanabe and Fukunaga, 2013). Self-adaptive EAs support this replacement of operators across generations, whereas EAs with diverse settings run in parallel do not provide this option. At the same time, incorporating a migration process, i.e. the exchange of solutions, into parallel EAs and thereby creating a co-operation of EAs with different settings allows candidate solutions generated by differently configured operators to enter each population.
Therefore, in this study, we compare several self-adaptation techniques with parallel EAs and with their co-operation under three variants of topology, using the example of the Differential Evolution (DE) algorithm, which requires tuning of the CR and F parameters (Storn and Price, 1997). Since DE is one of the most effective and widely used heuristics, it is essential to investigate different approaches to tuning its key parameters.
2 METHODS COMPARED
The general DE scheme for a minimization problem
contains the following steps:
Randomly initialize the population of size M: X = {x_1, …, x_M};
Repeat the next operations until the stopping criterion is satisfied:
- For each individual x_i in the population do:
1. Randomly select three different individuals a, b, c from the population (which are also different from x_i);
2. Randomly initialize an index R ∈ {1, …, n}, where n is the problem dimensionality;
3. Generate a mutant vector: for each j = 1, …, n, define u_j = a_j + F·(b_j − c_j). Next, if rand(0, 1) < CR or j = R, then y_j = u_j, otherwise y_j = x_i,j. CR and F are the DE parameters;
4. If f(y) ≤ f(x_i), then replace x_i with y.
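The scheme above can be sketched in plain Python as follows. This is a minimal illustrative DE/rand/1/bin implementation, not the paper's experimental code; the function name, the population size M, the defaults for F, CR, and the generation count are all illustrative choices.

```python
import random

def de_minimize(f, bounds, M=20, F=0.8, CR=0.9, generations=200, seed=0):
    """Minimal DE/rand/1/bin sketch for minimization.
    bounds is a list of (low, high) pairs, one per dimension."""
    rng = random.Random(seed)
    n = len(bounds)
    # Randomly initialize the population X = {x_1, ..., x_M}
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(M)]
    fit = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(M):
            # Step 1: three distinct individuals a, b, c, all different from x_i
            a, b, c = rng.sample([k for k in range(M) if k != i], 3)
            # Step 2: index R forcing at least one mutant coordinate
            R = rng.randrange(n)
            # Step 3: binomial crossover with the mutant u = a + F*(b - c)
            y = list(pop[i])
            for j in range(n):
                if rng.random() < CR or j == R:
                    y[j] = pop[a][j] + F * (pop[b][j] - pop[c][j])
            # Step 4: greedy selection
            fy = f(y)
            if fy <= fit[i]:
                pop[i], fit[i] = y, fy
    best = min(range(M), key=lambda i: fit[i])
    return pop[best], fit[best]
```

On a simple 2-D sphere function, this sketch converges to values close to zero within the default budget.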
As a basis of this work, we used algorithms
implemented in the PyGMO library (Biscani et al.,
2018). It provides two self-adaptive versions of DE, called SaDE and DE1220, in which two variants of CR and F control and adaptation are available, namely jDE (Brest et al., 2006) and iDE (Elsayed et al., 2011). In SaDE, a mutant vector is produced using the DE/rand/1/exp strategy (by default, and in our experiments too), whereas in DE1220 the mutation type is encoded in the chromosome and adapted as well.
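For intuition, the jDE variant of parameter control can be sketched as follows: each individual carries its own F and CR, which are occasionally resampled before producing a trial vector. The probabilities tau1 and tau2 and the sampling ranges below follow the jDE rule of Brest et al. (2006); the function name and defaults are illustrative.

```python
import random

def jde_update(F, CR, rng, tau1=0.1, tau2=0.1):
    """One jDE-style resampling step: with probability tau1 draw a new
    F uniformly from [0.1, 1.0], with probability tau2 draw a new CR
    uniformly from [0, 1]. The returned values are used to build the
    trial vector and are inherited only if the trial survives selection."""
    if rng.random() < tau1:
        F = 0.1 + 0.9 * rng.random()
    if rng.random() < tau2:
        CR = rng.random()
    return F, CR
```

Under this rule, F always stays in [0.1, 1.0] and CR in [0, 1], so no extra thresholds are needed to keep the parameters in valid ranges.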
In addition to the self-adaptive algorithms, we applied a conventional DE with several fixed combinations of the CR and F parameter values. Using the island class of PyGMO, we could run DEs with different settings in parallel threads to save computational time.
Next, we extended the PyGMO library with a set
of functions implementing the migration process
among the parallel islands. In this study, three
topologies of the island co-operation are investigated:
Ring, Random, and Fully Connected. After each T_m generations, the N_best individuals with the highest fitness from every population are sent to other islands to substitute the N_worst solutions having the lowest fitness there.
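The migration step described above can be sketched in plain Python. This is an illustrative stand-in for the PyGMO extension mentioned in the text, not its actual code; the function name and signature are assumptions, and lower fitness is treated as better, matching the minimization setting.

```python
def migrate(source_pop, dest_pop, n_best, n_worst, fitness):
    """Copy the n_best best individuals of source_pop into dest_pop,
    replacing up to n_worst of its worst individuals.
    fitness maps an individual to a value; lower is better."""
    k = min(n_best, n_worst)
    # best migrants from the source population
    migrants = sorted(source_pop, key=fitness)[:k]
    # drop the k worst individuals of the destination, then add migrants
    survivors = sorted(dest_pop, key=fitness)[:len(dest_pop) - k]
    return survivors + migrants
```

The min(n_best, n_worst) bound mirrors the acceptance rule used by the topologies below: an island never discards more of its own solutions than it receives.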
In the Ring topology (Figure 1), at every
migration stage, solutions are sent along the same
route, i.e. from the i-th island to the (i+1)-th one.
Island numbers remain constant during the search. Every (i+1)-th island accepts min(N_best^i, N_worst^(i+1)) solutions to replace the worst individuals in its population.
Figure 1: Ring topology.
In the Random topology (Figure 2), at each migration stage, for every j-th island, where j = 1, …, K and K is the total number of islands in the co-operation, the i-th island sending the best individuals to it is chosen randomly so that i ≠ j. The j-th island accepts min(N_best^i, N_worst^j) solutions.
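The difference between the three topologies reduces to how the sending island(s) for a given receiver are chosen. A minimal sketch, assuming zero-based island indices and a hypothetical helper name:

```python
import random

def migration_sources(j, n, topology, rng=None):
    """Return the indices of islands that send migrants to island j.
    Ring: the fixed neighbour (j - 1) mod n. Random: one uniformly
    chosen island i != j. Fully Connected: every other island."""
    rng = rng or random.Random()
    if topology == "ring":
        return [(j - 1) % n]
    if topology == "random":
        i = rng.randrange(n - 1)          # uniform over {0..n-1} \ {j}
        return [i if i < j else i + 1]
    if topology == "fully_connected":
        return [i for i in range(n) if i != j]
    raise ValueError(f"unknown topology: {topology}")
```

For example, in a ring of four islands, island 0 always receives from island 3, while in the Fully Connected case it receives from islands 1, 2, and 3.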
ECTA 2019 - 11th International Conference on Evolutionary Computation Theory and Applications