
 
 
are to be used as inputs to the variation operators), as well as a survival scheme (to decide how the next generation is to be created from the current one and the outputs of the variation operators). Additionally, the real-valued parameters of the chosen settings (the probability of recombination, the level of mutation, etc.) have to be tuned (Eiben et al., 1999).
The process of determining settings and tuning parameters is known to be a time-consuming and complicated task, and much research has tried to deal with this problem. Some approaches try to determine appropriate settings by experimenting over a set of well-defined functions or by theoretical analysis. Another group of approaches, usually described with terms like "self-adaptation" or "self-tuning", eliminates the setting process by adapting the settings during the algorithm's execution.
There exists much research devoted to "self-adapted" or "self-tuned" GAs, and the authors of the corresponding papers implement similar ideas in very different ways, all of them aimed at reducing the role of the human expert in algorithm design.
The main idea of the approach used in this paper relies on the automated selection and use of existing algorithmic components. That is why our algorithms may be called self-configuring.
To specify our algorithms more precisely, one can say that, according to the classification of (Angeline, 1995), we use dynamic adaptation on the level of the population (Meyer-Nieberg and Beyer, 2007). The probabilities of applying the genetic operators are changed "on the fly" during the algorithm's execution. According to the classification given in (Gomez, 2004), we use centralized control techniques (a central learning rule) for parameter setting, with some differences from the usual approaches. Operator rates (the probabilities of being chosen for generating offspring) are adapted according to the relative success of each operator during the last generation, independently of earlier results. This is why our algorithm avoids the problem of high memory consumption that is typical of centralized control techniques (Gomez, 2004). Operator rates are not included in the individual chromosomes and they are not subject to the evolutionary process. All operators can be used within one generation, producing offspring one by one.
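To make this mechanism concrete, the sketch below gives one possible implementation of such a scheme; the function names, the success measure (the share of offspring that improve on their parents) and the proportional update are our own illustrative assumptions, since the text states only that rates are adapted from the operators' relative success in the last generation.

import random

# Illustrative sketch only: the success measure and the proportional update
# are assumptions, not the exact adaptation rule used in the paper.

def choose_operator(rates):
    """Roulette-wheel choice of the variation operator for one offspring."""
    r, acc = random.random(), 0.0
    for op, rate in rates.items():
        acc += rate
        if r <= acc:
            return op
    return op  # floating-point slack: fall back to the last operator

def adapt_rates(rates, successes, uses):
    """Redistribute operator rates from their relative success during the
    last generation only; earlier statistics are deliberately not kept,
    which keeps the memory cost of the adaptation constant."""
    scores = {op: successes[op] / uses[op] if uses[op] else 0.0 for op in rates}
    total = sum(scores.values())
    if total == 0.0:
        return dict(rates)            # nothing improved: keep the old rates
    return {op: scores[op] / total for op in rates}

Within one generation the offspring are produced one by one, each time drawing an operator from the current distribution and recording whether the resulting offspring improved on its parents; after the generation is complete the rates are recomputed and the per-generation counters are reset.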
Bearing in mind the need to solve hard optimization problems and our intention to organize GA self-adaptation to these problems, we must first improve the flexibility of the GA before it can be adapted. For this reason we have tried to modify the most important GA operator, i.e., crossover.
The uniform crossover operator is known as one of the most effective crossover operators in conventional genetic algorithms (Syswerda, 1989; De Jong and Spears, 1991). Moreover, almost from the beginning it was suggested to use a parameterized uniform crossover operator, and it was shown that by tuning this parameter (the probability of a parental gene being included in the offspring chromosome) one can essentially improve "the virtues" of this operator (De Jong and Spears, 1991). Nevertheless, in the majority of cases where the uniform crossover operator is used, this possibility is not exploited and the probability of a parental gene being included in the offspring chromosome is set equal to 0.5 (Eiben and Smith, 2003; Haupt and Haupt, 2004).
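For concreteness, a minimal sketch of the parameterized operator is given below; the gene-inclusion probability p is the single tunable parameter mentioned above, and p = 0.5 recovers the conventional equiprobable variant (the function name is ours).

import random

def parameterized_uniform_crossover(parent1, parent2, p=0.5):
    """Each offspring gene is taken from parent1 with probability p and
    from parent2 otherwise; p = 0.5 gives the usual uniform crossover."""
    return [g1 if random.random() < p else g2
            for g1, g2 in zip(parent1, parent2)]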
Thus it seems interesting to modify the uniform crossover operator with the intention of improving its performance. Wishing to avoid the tuning of a real-valued parameter, we suggested introducing selective pressure at the recombination stage (Semenkin and Semenkina, 2007), making the probability of a parental gene being taken for the offspring dependent on the parents' fitness values. By analogy with the usual GA selection operators, fitness-proportional, rank-based and tournament-based uniform crossover operators have been added to the conventional operator, called here the equiprobable uniform crossover.
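The sketch below illustrates one plausible reading of this idea for the fitness-proportional and tournament-based variants (fitness maximization is assumed); the exact operators of (Semenkin and Semenkina, 2007) may be defined differently, and the function names are ours.

import random

# Hedged sketch of selective pressure at the recombination stage: the donor
# of each offspring gene is chosen among the parents with a probability that
# depends on the parents' fitness values (maximization is assumed).

def fitness_proportional_uniform_crossover(parents, fitnesses):
    """Each gene is taken from a parent with probability proportional to
    that parent's fitness value."""
    total = sum(fitnesses)
    weights = [f / total for f in fitnesses] if total > 0 else None
    length = len(parents[0])
    return [random.choices([p[i] for p in parents], weights=weights)[0]
            for i in range(length)]

def tournament_uniform_crossover(parents, fitnesses, tournament_size=2):
    """For each gene, a small tournament among the parents decides which
    parent donates that gene."""
    child = []
    for i in range(len(parents[0])):
        contestants = random.sample(range(len(parents)), tournament_size)
        winner = max(contestants, key=lambda k: fitnesses[k])
        child.append(parents[winner][i])
    return child

Written this way, the operators accept an arbitrary number of parents, which matches the later use of two- and seven-parent versions of the crossovers.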
Although the proposed new operators will hopefully give higher performance than the conventional ones, they also increase the number of algorithm setting variants, which complicates the adjustment of the algorithm for the end user. That is why we need to suggest a way to avoid this extra adjustment effort.
With this aim, we apply dynamic adaptation of the operators' probabilistic rates on the level of the population with centralized control techniques. To avoid the precise tuning of real-valued parameters, we use setting variants, namely the types of selection, crossover and population control, and the level of mutation (medium, low, high). Each of these has its own probability distribution. For example, there are 5 settings of selection: fitness-proportional, rank-based, and tournament-based with three tournament sizes. During the initialization all probabilities are equal to 0.2, and they are then changed according to a special rule during the algorithm's execution in such a way that the sum of the probabilities remains equal to 1 and no probability can fall below a preconditioned minimum balance.
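The sketch below shows one possible way to maintain such a distribution; the concrete redistribution step (a proportional update followed by clamping to the minimum balance) and the default minimum value are our own assumptions, since the text refers only to "a special rule" with the two stated constraints.

def init_distribution(variants):
    """Uniform initialization: with 5 selection settings every probability is 0.2."""
    return {v: 1.0 / len(variants) for v in variants}

def redistribute(probs, scores, min_balance=0.05):
    """Shift probability towards the better-scoring variants while keeping
    every probability at or above min_balance and the total equal to 1.
    The value of min_balance is illustrative, not taken from the paper."""
    total = sum(scores.values())
    raw = {v: scores[v] / total for v in probs} if total > 0 else dict(probs)
    low = [v for v in probs if raw[v] < min_balance]
    high = [v for v in probs if raw[v] >= min_balance]
    spare = 1.0 - min_balance * len(low)          # mass left for the others
    high_sum = sum(raw[v] for v in high)
    new = {v: min_balance for v in low}
    new.update({v: raw[v] * spare / high_sum for v in high})
    return new

# the three tournament sizes are placeholders, not values from the paper
selection_variants = ["fitness_proportional", "rank_based",
                      "tournament_small", "tournament_medium", "tournament_large"]
probs = init_distribution(selection_variants)     # all equal to 0.2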
The list of crossover operators includes 11 items, i.e., 1-point, 2-point and four uniform crossovers, all with two numbers of parents (2 and 7). The "idle crossover" is included in the list of