(Ag’s) and B cells. The input is the Ag (i.e., the prob-
lem to tackle, the function to optimize); the output
is basically the candidate solutions, the B cells, that
have solved/recognized the Ag. All IAs based on the
clonal selection theory are population based. Each
individual of the population is a candidate solution
belonging to the combinatorial fitness landscape of
a given computational problem. Using the cloning
operator, an immune algorithm produces individuals
with higher affinities (higher fitness function values),
introducing blind perturbation (by means of a hyper-
mutation operator) and selecting their improved ma-
ture progenies. We will describe two different exam-
ples of Clonal Selection Algorithms. We start with
the algorithm CLONALG (de Castro L. N., 2002a),
which uses fitness values for proportional cloning and inversely proportional hypermutation, together with a birth operator that introduces diversity into the current population and a mutation rate used to flip bits of the B cell memory. Extended versions of the algorithm also use threshold values to clone the best cells in the present population. We will then describe an immune algorithm that uses a static cloning operator, hypermutation and hypermacromutation operators, no memory cells, and an aging phase (a deterministic elimination process); we will refer to this algorithm by the acronym opt-IA.
CLONALG. CLONALG (de Castro L. N., 2002a)
is characterized by two populations: a population of antigens Ag and a population of antibodies Ab (denoted with P^(t)). The individual antibody Ab and antigen Ag are represented by string attributes m = m_L, ..., m_1, that is, a point in an L-dimensional real-valued shape space S, with m ∈ S^L ⊆ ℝ^L. The Ab population is the set of
current candidate solutions, and the Ag is the environment to be recognized. After a random initialization of the first population P^(0), the algorithm loops for a predefined maximum number of generations (N_gen). In the first step, it determines the fitness function values of all Abs in relation to the Ag. Next, it selects the n Abs that will be cloned independently and proportionally to their antigenic affinities, generating the clone population P^clo. Hence, the higher the affinity-fitness, the higher the number of clones generated for each of the n Abs, according to the following function: N_c = Σ_{i=1}^{n} (β ∗ n)/i, where β is a multiplying factor to be experimentally determined. Each term of the sum corresponds to the clone size of each Ab. The hypermutation operator performs an affinity maturation process inversely proportional to the fitness values, generating the matured clone population P^hyp. After computing the antigenic affinity (i.e., the fitness function) of the population P^hyp, CLONALG randomly creates d new antibodies that will replace the d lowest-fit Abs in the current population (for the pseudo-code of CLONALG see (de Castro L. N., 2002a)).
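To make the loop concrete, a minimal Python sketch of one CLONALG-style generation over bit-string antibodies follows. The names clonalg_step, affinity and rho, the exponential per-bit mutation rate, and the rounding of β ∗ n/i to an integer clone size are choices of this sketch, not details fixed by (de Castro L. N., 2002a).

import math
import random

def clonalg_step(pop, affinity, n, beta, d, rho=1.0):
    # One CLONALG-style generation over bit-string antibodies (lists of 0/1).
    # affinity is the fitness function to maximize (the antigen Ag).
    L = len(pop[0])
    ranked = sorted(pop, key=affinity, reverse=True)

    # Proportional cloning: the i-th best Ab receives round(beta * n / i)
    # clones (the rounding to an integer is an assumption of this sketch).
    clones = []
    for i, ab in enumerate(ranked[:n], start=1):
        clones.extend(ab[:] for _ in range(round(beta * n / i)))

    # Inversely proportional hypermutation: the per-bit flip rate decays
    # exponentially with the normalized affinity of the clone.
    best = affinity(ranked[0]) or 1
    for clone in clones:
        rate = math.exp(-rho * affinity(clone) / best)
        for j in range(L):
            if random.random() < rate:
                clone[j] ^= 1

    # Keep the best individuals, then the birth operator replaces the d
    # lowest-fit Abs with randomly generated newcomers.
    pool = sorted(pop + clones, key=affinity, reverse=True)[:len(pop)]
    pool[-d:] = [[random.randint(0, 1) for _ in range(L)] for _ in range(d)]
    return pool

Repeated calls of clonalg_step for N_gen generations reproduce the loop described above.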
opt-IA. Like CLONALG, the opt-IA algorithm uses only two entities: antigens (Ag) and B cells.
At each time step t, we have a population P^(t) of size d. The initial population of candidate solutions, at time t = 0, is generated randomly. The function Evaluate(P) computes the affinity (fitness) function value of each B cell x ∈ P. The designed IA, like
all immune algorithms based on the clonal selection
principle, is characterized by clonal expansion, the
cloning of B cells with higher antigenic affinity.
The implemented IA uses three immune operators: cloning, hypermutation, and aging. The cloning operator simply clones each B cell dup times, producing an intermediate population P^clo of size d × dup. The hypermutation operator acts on the B cell receptors of P^clo. The number of mutations M is determined by a mutation potential. It is possible to define various mutation potentials. We tested our IA using the static and inversely proportional hypermutation operators, the hypermacromutation operator, and combinations of the hypermutation operators with hypermacromutation. The two hypermutation operators and the hypermacromutation operator perturb the receptors using different mutation potentials, depending upon a parameter c. In particular, it is worthwhile to note here that all the implemented operators try to mutate each B cell receptor M times without using a mutation probability.
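As an illustration, a minimal Python sketch of the Evaluate step and of the static cloning step just described is given below; the representation of B cells as lists of bits and the helper names evaluate and clone_population are assumptions of this sketch.

def evaluate(pop, fitness):
    # Evaluate(P): compute the affinity (fitness) value of every B cell x in P.
    return [fitness(x) for x in pop]

def clone_population(pop, dup):
    # Static cloning: each of the d B cells is copied dup times, producing
    # the intermediate population P_clo of size d * dup.
    return [x[:] for x in pop for _ in range(dup)]

With d = len(pop), the list returned by clone_population has exactly d × dup elements, the size of P^clo described above.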
The mutation potentials used in this research work are the following. Static Hypermutation (H1): the number of mutations is independent of the fitness function f, so each B cell receptor at each time step will undergo at most M_s(x) = c × ℓ mutations, where ℓ is the length of the receptor. Inversely Proportional Hypermutation (H2): the number of mutations is inversely proportional to the fitness value, that is, it decreases as the affinity function of the current B cell increases. So at each time step t, the operator will perform at most M_i(f(x)) = ((1 − E∗/f(x)) × (c × ℓ)) + (c × ℓ) mutations. In this case, M_i(f(x)) has the shape of a hyperbola branch. Hypermacromutation (M): the number of mutations is independent of the fitness function f and the parameter c. In this case, we choose at random two integers i and j such that (i + 1) ≤ j ≤ ℓ; the operator mutates at most M_m(x) = j − i + 1 directions, in the range [i, j].
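The three mutation potentials can be sketched in Python as follows; the argument names (length for ℓ, E_star for E∗), the assumption f(x) > 0, and the clamping of M_i to a non-negative integer are choices of this sketch.

import random

def M_static(c, length):
    # H1: at most c * ell mutations, independent of the fitness function f.
    return int(c * length)

def M_inv_proportional(f_x, E_star, c, length):
    # H2: M_i(f(x)) = ((1 - E*/f(x)) * (c * ell)) + (c * ell), a hyperbola
    # branch in f(x); f(x) > 0 is assumed and the result is clamped to a
    # non-negative integer (a choice of this sketch).
    return max(0, int(((1.0 - E_star / f_x) * (c * length)) + (c * length)))

def M_macro(length):
    # M: choose two random integers with (i + 1) <= j <= ell; the operator
    # mutates the M_m(x) = j - i + 1 positions in the range [i, j].
    i = random.randint(1, length - 1)
    j = random.randint(i + 1, length)
    return i, j, j - i + 1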
The aging operator eliminates old B cells in the populations P^(t), P^(hyp) and/or P^(macro), to avoid premature convergence. To increase the population diversity, new B cells are added by the Elitist Merge function. The parameter τ_B sets the maximum number of generations B cells are allowed to remain in the population. When a B cell is τ_B + 1 generations old, it is erased.
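Putting the operators together, a hedged Python sketch of one opt-IA generation, with cloning, static hypermutation, the aging phase and a selection back to size d, is given below; the age bookkeeping as a parallel list, the bit-flip realization of a mutation, and the random newcomers used to refill an over-aged population are assumptions of this sketch, not the original pseudo-code of opt-IA.

import random

def opt_ia_step(pop, ages, fitness, dup, c, tau_B):
    # One opt-IA generation: cloning, static hypermutation, aging, selection.
    d, L = len(pop), len(pop[0])

    # Cloning: each B cell is copied dup times -> P_clo of size d * dup.
    clones = [x[:] for x in pop for _ in range(dup)]

    # Static hypermutation (H1): try c * L mutations per receptor, realized
    # here as bit flips at random positions (an assumption of this sketch).
    hyp = []
    for x in clones:
        y = x[:]
        for _ in range(int(c * L)):
            k = random.randrange(L)
            y[k] ^= 1
        hyp.append(y)

    # Aging: parents grow one generation older, matured clones start at age 0;
    # a B cell that is tau_B + 1 generations old is erased.
    merged = [(x, a + 1) for x, a in zip(pop, ages)] + [(y, 0) for y in hyp]
    survivors = [(x, a) for x, a in merged if a <= tau_B]

    # Keep the best d survivors; refill with random newcomers (age 0) if the
    # aging phase erased too many cells.
    survivors.sort(key=lambda xa: fitness(xa[0]), reverse=True)
    survivors = survivors[:d]
    while len(survivors) < d:
        survivors.append(([random.randint(0, 1) for _ in range(L)], 0))

    new_pop = [x for x, _ in survivors]
    new_ages = [a for _, a in survivors]
    return new_pop, new_ages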